Psych 201 Flashcards

1
Q

Heuristics

A

simple thinking strategies that allow us to make judgments and solve problems efficiently

2
Q

Availability heuristic

A

A mental shortcut strategy for judging the likelihood of an event or situation based on how easily we can think of similar or relevant instances

Example: a story about a plane crash leads to a reduction in flying

3
Q

Representative heuristic

A

A mental shortcut strategy for deciding the likelihood of an event based on how much it resembles what we consider to be a “typical” example of that event

If you meet a slim, short man who wears glasses and likes poetry, what do you think his profession would be? Ivy League prof or truck driver?

4
Q

Overconfidence

A

the tendency to overestimate the accuracy of our beliefs

5
Q

Better-than-average

A

the tendency to overestimate your skills, abilities, and performance in comparison to others

6
Q

Hindsight bias

A

After learning the outcome of an event, many people believe they could have predicted that very outcome

“I knew it all along”

7
Q

Confirmation bias

A

The tendency to search for information that confirms a personal bias

8
Q

Belief perseverance

A

The tendency to cling to our beliefs even in the face of contrary evidence

9
Q

What are the 3 features of science

A

systematic empiricism
empirical questioning
public knowledge

10
Q

What is systematic empiricism?

A

observing the natural world in an unbiased manner

Requires carefully planning, making, recording, and analyzing observations of the world

11
Q

What is empiricism

A

learning by observation

12
Q

What are empirical questions

A

Follow from systematic empiricism

Attempt to determine how the world actually is and can only be answered by systematically observing it

13
Q

What makes a good scientist? (6 things)

A

Sceptical
Open minded
Objective
Empirical
Creative
Articulate

14
Q

What does it mean to be skeptical?

A

suspend judgement and evaluate new claims

15
Q

What does it mean to be objective?

A

base their opinions on facts rather than on their personal feelings

16
Q

What does it mean to be empirical?

A

Updating ideas based on testing falsifiable hypotheses

17
Q

What does it mean to be articulate?

A

share ideas among the masses

18
Q

What is the acronym that describes the strategy used by science denialism

A

FLICC

19
Q

What does FLICC stand for

A

Fake experts or presenting info from questionable sources

Logical fallacies or arguments that use errors in reasoning

Impossible expectations or demanding unrealistic standards of certainty before accepting something as fact

Cherry-picking or selecting only data that supports a claim

Conspiracy theory or conjuring a secret scheme to explain straightforward findings

20
Q

What are some research skills that translate into employability skills?

A

Project management
Problem-solving
Critical thinking
Analytical skills
Interpretation of numerical information
Communication skills

21
Q

What are the questions we pose from natural curiosity called?

A

hypotheses

22
Q

Should we gather evidence to potentially refute the hypotheses

A

yes

23
Q

Steps of the research process –> 6 including repeat

A
  1. Define a research question using theory
  2. State a specific and testable hypothesis based on the research question
  3. Carry out the study and collect data
  4. Analyze and interpret the data
  5. Revise the theory based on the data
  6. Repeat
24
Q

What are the steps to developing a good research question?

A
  1. start with a general research idea
  2. turn that into an empirically testable research question
  3. Evaluate how interesting that question really is
25
Where can you find inspiration for research questions (4)
Common sense
Observations of the world
Practical problems
Past research
26
What is common sense?
Body of knowledge that we all believe to be true (whether or not it is)
27
What is an example of a practical problem that can help define a research question?
Study of adherence to health initiatives during the pandemic
28
What is the best way to generate new research ideas?
past research --> can improve past techniques
29
What framework do we use for designing research questions?
PICO
30
What does PICO stand for? Do we have to use all the letters?
Population
Intervention
Comparison
Outcome
No, not every research question uses all the letters
31
What type of research is PICO usually used for?
quantitative
32
What are the 2 major types of research?
Quantitative
Qualitative
33
What are the 3 types of quantitative research?
Descriptive
Comparative
Relationship
34
What type of research is this? What is the prevalence of alcohol use in BC among adults? And what are the parts of PICO?
Descriptive
P: adults in BC
O: alcohol use
35
What type of research is this? What is the prevalence of alcohol use among men and women who suffer from depression and live in BC? And what are the parts of PICO?
Comparative
P: men and women who suffer from depression
C: men vs women
O: alcohol use
36
What type of research is this? Is there a relationship between stress and alcohol misuse in adults who live in BC? And what are the parts of PICO?
Relationship
P: adults in BC
C: levels of stress
O: alcohol use
37
What is qualitative research
Attempts to describe and contextualize phenomena. Qualitative research questions are typically non-directional (do not predict outcomes) and relatively flexible. Qualitative research is concerned with subjective phenomena that can't be numerically measured, like how different people experience grief
38
What part of pico do we use in qualitative research?
P and O
39
What kind of research is deduction?
Quantitative
40
What kind of research is induction?
Qualitative
41
Steps in deduction?
Theory → hypotheses → observations → confirmation (data analysis)
42
Steps in Induction?
Observations (talk to people) → patterns → hypothesis → theory
The theory can then be tested using quantitative work
43
What is quantitative research?
numeric and objective, seeking to answer questions like when or where.
44
What to ask if a research question has been asked before?
Other ways to operationally define variables?
Specific groups that may differ in their response?
Different situations in which responses may differ?
45
What acronym to use to evaluate research questions?
FINER
46
What does FINER stand for?
Feasible: can the study be done with available resources, time, and technology?
Interesting: does the study captivate a wider audience → peers, collaborators, and potential funders?
Novel: does it shed light on uncharted areas, provide new viewpoints, and question conventional frameworks?
Ethical: is it conducted with integrity, respect, and responsibility?
Relevant: does the question matter to the field?
47
What is the importance of reviewing the literature?
Helps turn a research idea into a research question
Determine if a question has already been answered
Ideas on how to conduct the study
Tells you whether your research fits into the literature
48
Should you look at old research literature?
Yes — do initial searches within the past 5 years, then look at older work
49
What defines a professional journal (3)? What do they publish?
They have been established for a long time
Associated with a professional organization
Are peer reviewed → often a double-blind process
They publish original research and review articles
50
What is peer review for?
to assess the validity, quality, and originality of articles for publication → maintains the integrity of science by filtering out bad articles
51
What is a single-blind peer review?
the authors do not know who the reviewers are
52
What is double blind peer review?
neither the authors nor the reviewers know each other's identities or affiliations
53
What are predatory journals?
Journals that will publish your article for a "fee" → don't peer review properly
APC fees (article processing fees) are fine → charged after peer review
54
What are other sources for information?
scholarly books or pre-prints
55
What is different about a pre-print?
Not peer reviewed
56
What are the two types of books? describe them
Monographs: written by a single author or a small group that presents a coherent treatment of a topic
Edited volumes: edited by a single person or a small group; multiple authors (beyond the editors) contribute to the book, and viewpoints may differ
57
What are the Boolean operators and what are they used for?
AND → contains both alcohol and stress
OR → contains alcohol or stress
"" → "alcohol misuse" contains the exact phrase
- → bears -chicago → only results with bears and not Chicago
~ → ~academic → searches for academic and its synonyms
58
What do you do after you have a research question?
Construct hypotheses
59
What is a hypothesis?
a test of a specific theory (not a prediction)
60
What is a scientific law?
A statement based on repeated experimental observation that describes some aspect of the world (e.g., Weber's law)
61
What is a scientific theory?
A well-substantiated explanation of some aspect of the natural world, confirmed through repeated observation and experimentation
62
Does psychology use theories or laws more?
Theories
63
What are 4 strategies for generating hypotheses? And what do they mean?
Introspection → engage in self-observation: ask yourself "What would I do?", "How would I feel?", or "What would I think?"
Find the exception to the rule → crafting hypotheses about outcomes in the opposite direction of prior research has the potential to provide new insights into a phenomenon
A matter of degree → try to think about your variables in terms of amounts, such as quantity, intensity, strength, volume, number, force, persistence, and effort of the IV
Change the directionality → there is not a set direction of how one thing influences another
64
Are hypotheses predictions?
NO
65
What is syllogistic logic? How is the hypothesis usually stated in paper?
IF (hypothesis) AND (methods) THEN (prediction)
In a paper, the hypothesis is usually stated as all of it together
66
What does the textbook say a hypothesis is
Hypothesis: an educated prediction that provides a testable explanation of a phenomenon
67
What are the 4 questions you should ask when evaluating your hypothesis
Does it correspond with reality? → should be consistent with past research
Is it parsimonious? → Occam's razor → simple
How specific is it? → Barnum effect
Is it falsifiable/refutable?
68
What is the Barnum effect?
The tendency for people to believe general descriptions of their personality are highly accurate
69
What could it mean if your predictions are false?
The method was ineffective
The theory was wrong
The results were a fluke
70
5 steps to testing the hypotheses?
1. Identify key variables
2. Choose a research design
3. Conduct the study
4. Analyze and draw conclusions
5. Communicate the findings
71
Are there many ways to conceptualize and operationalize the same variable?
yes
72
Conceptual definition of stress example? Operational definition of stress example?
Conceptual: mental tension and worry
Operational: heart rate, sweating
73
What is an experimental design?
A research method in which the experimenter controls and manipulates the independent variable, allowing the establishment of cause-and-effect relationships between the independent and dependent variables
74
Predictor and covariate are another word for what
independent variable
75
Criterion, outcome, and response are another word for what
dependent variable
76
Is a correlational design experimental? Why or why not?
It is a nonexperimental design
Correlational designs give us relationships → no IV or DV, as either variable can predict the other
77
What are the IV and DV in nonexperimental designs/ correlational designs
Explanatory (or predictor) → IV
Criterion (response) → outcome variable (DV)
78
What are the 2 decisions you need to make in experimental designs?
1. The number of levels of the IV 2. How frequently you want to gather data from participants
79
What are between subject designs
Comparing results between different groups of participants, with each group experiencing a single condition
80
What are within subject designs?
The same person experiences multiple conditions and their results are compared → differences over time
81
Are longitudinal designs between or within subject?
within
82
What is a mixed design?
Both within- and between-subject design
Example: a stress group and a control group → between subject → but stress is measured before and after the intervention → within subjects
83
What do levels of the IV mean?
Different ways to define the IV
Example: define stress in different ways → psychosocial stress (speech) or physical stressor (treadmill)
84
What is a research protocol?
A detailed series of steps that lets the researcher know the order in which to administer the study and provides a script of what the researcher should say and do
Example order: informed consent, study/collect data, debrief
85
What is a script?
A written set of instructions that the researcher will read to each participant while collecting data
86
What is data
distinct pieces of info
87
Why use statistics?
Avoid bias
See patterns
Draw probabilistic conclusions
88
What are two ways to communicate findings?
poster or paper
89
what part of a paper is this: provides background information from previous research on the topic under investigation and the theoretical and empirical basis for the study’s hypotheses
intro
90
What part of a paper is this: outlines the study’s findings using a combination of statistical analyses and a narrative that explains the tests the researcher used, and what the statistical results mean in plain language
results
91
What part of a paper is this: an analysis and interpretation of the study’s findings, including strengths and weaknesses, suggestions for future research, and ideas for practical applications of the findings
discussion
92
What are ethics?
The application of moral principles concerning what an individual considers right and wrong to help guide one's decisions and behaviour
93
What is the utilitarian perspective?
Your decision should do the greatest good for the greatest number of people
94
What is the altruistic perspective?
Helping others without personal benefit
95
Egoism meaning
individuals should act in accordance with their own self interests
96
What is Milgram's (1974) experiment and what was its purpose?
Tested whether people would obey an authority figure
Participants delivered a series of shocks to a confederate
97
When is it acceptable to inflict harm on research participants?
It is acceptable as long as the harm is minor and short-lived and there are major benefits to the study
98
What is Belmont principle 1?
Beneficence and nonmaleficence
99
What is Beneficence?
acting with the purpose of benefiting others
100
What is Nonmaleficence?
researchers should do no harm
101
What is a cost-benefit analysis?
a systematic process in which a researcher weighs all the potential and known benefits against the potential and known risks before conducting a study
102
What are types of harm
Physical harm or psychological harm
103
What was the Tuskegee syphilis study?
Done by the US Public Health Service: low-income African American men with syphilis were studied without informed consent and were denied treatment → led to deaths
104
What is Belmont principle 2?
Justice
105
What is justice
fairness when deciding who to use as study participants and what role they will play in the study
106
What is clinical equipoise?
uncertainty as to which of two treatment options is more beneficial when conducting a study
107
What was Zimbardo's (1973) experiment?
The Stanford prison experiment
Tested whether the psychological effects of being in prison were a result of situational effects or personality traits
People were randomly assigned to play either prisoner or guard; the study was stopped after 6 days
108
What is Belmont principle 3?
Respect for Persons
109
What does respect for persons involve?
autonomy and informed consent
110
What is autonomy?
the idea that people are capable of making deliberate, informed decisions about their participation in research → the right to freely choose to be involved in a study
111
What is informed consent?
Part of the ethical procedures at the beginning of a study in which participants learn what the study expects of them and what the risks and benefits of participation are, then freely make the choice to participate or not
Must be jargon-free
112
What are these?
Nuremberg Code (1947)
Declaration of Helsinki (1964) – World Medical Assoc., biomedical research
Belmont Report (1979) – biomedical and behavioural
Professional Ethics (CPA/APA/BPS)
Funding Body (Tri-Council, NIH, etc.)
Canadian Council on Animal Care
ethical codes
113
Who reviews all research for ethics prior to data collection?
IRB: institutional review board
REB: research ethics board
Ethics committee
All the same thing
Research needs to be approved prior to initiation
114
What is the Human Research Ethics Board at UVic?
They oversee research that involves human participants or biological materials. This includes research in courses
115
What is Tri-Council research funding, and who are they?
Most Canadian research on humans is funded by one of three major federal funding agencies:
Natural Sciences & Engineering Research Council (NSERC) (including "hard-science" parts of psychology)
Social Sciences & Humanities Research Council (SSHRC) → psyc
Canadian Institutes of Health Research (CIHR) → psyc
116
What are the three core principles of the Tri-council research ethics policy?
Respect for persons (autonomy, dignity)
Concern for welfare (safety, wellbeing, privacy)
Justice (fairness, honesty, rigor)
117
What does respect for persons involve in the tri-ethics policy?
Informed and ongoing consent
Freedom from coercion → able to refuse participation and withdraw at any time without negative consequences (e.g., grades, money)
118
How can coercion present itself in research participation?
grades, money, power
119
What must informed consent include?
Thorough description of the study
Jargon free
Details regarding risks
Details regarding confidentiality and data storage
A clear statement that participants may withdraw without penalty
Opportunities for participants to ask questions
120
What to do for getting consent from young/cognitively impaired people
Parental consent
Child assent → the child agrees to participate
121
What is concern for welfare: (protection from harm) in the tri-ethics policy
Protection from harm
Avoid unnecessary discomfort
Benefit > risk
Have a plan to resolve issues that arise
Caution with vulnerable populations
122
How does the tri-ethics policy on welfare (protection from harm) apply to researchers?
Need to be properly trained/competent
No conflict of interest
123
Is some research exempt from review?
Yes if it poses little risk --> secondary data
124
What is the Concern for welfare (privacy) in Tri-ethics?
Anonymity
Confidentiality
Should have a data management plan
Privacy rules: PIPEDA (Personal Information Protection and Electronic Documents Act), PHIPA (Canada), HIPAA (USA) for health info
125
What is anonymity
person cannot be identified
126
What is confidentiality
info not disclosed to public
127
What is justice (fairness) in the Tri-council policy
Including vs excluding participants
Including prisoners because they are a captive audience – not good
Excluding prisoners because they are inconvenient – not representative of the entire population
Protection from harm in research vs protection from harm in daily life – drug safety, crash test dummies
128
What is justice (worthwhile research) in the Tri-council policy
Potential to contribute to useful knowledge → relevant, carefully conducted, etc.
Not: wasting participants' time, wasting money
129
What is deception?
Conceal some aspects of research
130
Why do we use deception?
Demand characteristics → we want true behaviour, not participants' attempts to support the hypothesis
131
In what ways is deception used?
Ranges from passive to active: from concealing the purpose of an activity to providing false information
132
What to do if you use deception (3Ds)?
Tell the truth ASAP
Debrief → provide participants with the true purpose of the study
Dehoax → describe the deception and why it was necessary
Desensitize → remove side effects
133
Can we investigate sensitive topics?
Yes. You need to think creatively, use a sound design, and limit potential harm
Ex: you can screen participants for depression before using them in a study that might cause stress
134
What does Post-hoc mean
The analysis is determined after the data have been collected, not specified beforehand based on a hypothesis
135
What are some ethical issues beyond human subjects?
Overselling
Post hoc storytelling
P-value fishing
Plagiarism
Creative outliers
Non-publication/partial publication
Inventing data
136
Is citing a source you didn't read plagiarism
yes
137
What did Diederik Stapel do?
Fabricated entire experiments
e.g., meat-eaters were more selfish than vegetarians; messy environments led white people to discriminate more against Black people
138
What is overselling/misinterpretation in graphs?
change the x-axis to make the effect look bigger
139
What is p-hacking?
Trying to hack the p-value to make it significant
If you run lots and lots of tests you are more likely to find a significant p-value
140
What is HARKing
Hypothesizing after results are known
141
What is a theory?
Theories are plausible explanatory propositions devised to link possible causes to their effects
Broad bodies of knowledge that aim to explain robust phenomena
142
Why are theories and models often confused?
Models have theoretical content and theories are expressed by models
143
What is a robust phenomenon? Give examples
A phenomenon is something that is observed to happen
Examples: addiction, the placebo effect, forgetting
144
What should good theories be able to do?
Be testable
Be coherent
Be economical
Be generalizable
Be able to explain and predict
145
What do theories explain and predict?
"who will experience this phenomena" "why is this the case"
146
What are theories supposed to represent
target systems
147
What are target systems?
Theories are constructed to describe, explain, or predict aspects of real-world phenomena or systems
148
Why is theory formation difficult in psychology?
Target systems are complex --> mental health, beliefs and expectations, etc
149
How detailed should theories be?
Low-level experiments (base level) are typically over-valued and reductionist
150
What is pragmatism?
focuses on the practical consequences and utility of beliefs and actions → clinical application
151
What is understanding?
is concerned with the depth of comprehension and insight into knowledge or concepts.
152
What impedes theory formation? Give examples
Imprecise constructs
Many disorders occur together because there are poor boundaries defined between them → treatments are only modestly effective
153
What is an example of serendipity in theories?
The serotonin theory of depression was created after SSRIs were prescribed for depression
Research suggests that serotonin alone is not responsible for depression
154
Is CBT a theory? What does it focus on?
Cognitive behavioural therapy is embedded in theory
It is at least as effective as pharmacological interventions
It focuses on thoughts, behaviours, and emotions
155
What is a model?
Like theories but narrower in scope
Often applied to a particular aspect of a theory
Often statistical in nature
156
What is the difference between a theory and a model?
Theories model phenomena and (statistical) models model effects (relationships)
157
What are the three types of models?
Abstraction, explanation, prediction (regression)
158
What are abstraction models? give examples
Abstraction models are simplified representations of complex systems, concepts, or phenomena
Examples: the DNA model or the model of working memory
159
What are explanation models?
frameworks or structures used to clarify, interpret, and understand phenomena, events, or relationships
160
What are prediction (regression) models
Prediction (regression) models are statistical techniques used to predict or estimate the value of a dependent variable based on one or more independent variables.
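Below is a minimal sketch (not from the course, hypothetical data) of what a prediction (regression) model looks like in practice: estimating a dependent variable from an independent variable with ordinary least squares.

```python
# Hypothetical example: predict exam score (DV) from hours studied (IV).
import numpy as np

hours = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)          # IV (made-up data)
score = np.array([52, 55, 61, 64, 70, 73, 78, 82], dtype=float)  # DV (made-up data)

# Fit a straight line: score ≈ slope * hours + intercept
slope, intercept = np.polyfit(hours, score, 1)

# Use the fitted model to predict the DV for a new IV value
predicted = slope * 9 + intercept
print(f"score ≈ {slope:.2f} * hours + {intercept:.2f}; predicted score for 9 h: {predicted:.1f}")
```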
161
What is data?
A collection of information; it is meaningless until given relevance
162
How are theories and data linked
models are the intermediate between theory and data
163
Explain theories, models, and data
Theories are plausible explanatory propositions devised to explain phenomena → high-level principles
Models are representations of reality, or of one's view of a possible world, constructed to improve one's understanding of the world and/or to make predictions → concrete applications of those principles
Data are used to confirm or falsify theories and models
164
What is a construct? examples?
A concept for which:
A single observable referent does not exist
Direct observations cannot be made
Multiple referents exist, but none are all-inclusive
Examples: time, distance, mass, loneliness, impulsivity, anger, happiness, intelligence, knowledge, risk taking, anxiety
165
What are conceptual definitions?
Help us understand what a construct means but do not tell us how to quantify it
166
What is operationalization?
Lets us quantify the construct → directly observable
How we measure or represent a construct in a study
167
What is an operational definition?
recipe for someone else to replicate
168
Give an example of an operational definition for stress and risk taking
Stress: heart rate
Risk taking: balloon analogue risk task
169
What are these two sentences? - The distance between two points - 118.1 barleycorns or 1 m
Conceptual → the distance between two points
Operational → 118.1 barleycorns or 1 m
170
If time is a construct, what are operational definitions?
seconds, # of sunrises or full moons
171
If stress is a construct, what are operational definitions?
Perceived stress
Cortisol
Number of stressful life events
172
What is a variable?
The actual representation of a construct
The set of values obtained from, or determined for, each participant in a study
The numbers obtained from a measurement
The names assigned to each condition
A proxy for the construct
Values assigned to a variable can vary
173
What is the DV and other words for it?
The variable to be explained
Also called: outcome variable, response variable, primary endpoint
174
What is the IV and what are other names?
Determinants of the dependent variable
Also called: explanatory variable, predictor variable, covariate
175
What is a control variable (covariate)
Another variable that may plausibly alter the relationship between the IV and the DV → e.g., comorbidity
Anything that is held constant or limited in a research study
Example: does caffeine improve memory recall? Control variables → participant age, noise in the environment, type of memory test
176
What are the 4 levels of measurement?
Nominal
Ordinal
Continuous – interval
Continuous – ratio
177
What is nominal data
data that are categorized with no inherent order --> gender
178
What is ordinal data?
data that are categorized and ranked
179
What is continuous interval data?
Ranked and evenly spaced data
Examples: test scores (e.g., IQ or exams), personality inventories, temperature in Fahrenheit or Celsius
No absolute zero → zero doesn't mean a complete absence
The difference between any two adjacent temperatures is the same: one degree. But zero degrees is defined differently depending on the scale – it doesn't mean an absolute absence of temperature. The same is true for test scores and personality inventories: a zero on a test is arbitrary; it does not mean that the test-taker has an absolute lack of the trait being measured.
180
What is continuous ratio data?
Ranked, evenly spaced, and has a natural zero
A true zero means there is an absence of the variable of interest. For example, in the Kelvin temperature scale, there are no negative degrees of temperature – zero means an absolute lack of thermal energy
Examples: height, age, weight, temperature in Kelvin
181
What is a construct definition?
A concept that cannot be directly observed and for which there is not one single referent (abstract)
It is what we are interested in, but we can't work with it directly
182
What is a conceptual definition (definition)?
what construct means
183
What is an operational definition (definition)?
The specific and concrete way in which we represent a construct
184
What is a variable(definition)?
Something (an attribute) that varies (often refers to the data we collect)
185
What do the operational definition and the variable do for the construct?
The operational definition and the variable create a proxy for the construct
186
What does Rubbish in Vs Rubbish out mean?
The quality of the information coming out of an analysis cannot be better than the quality of the info coming in
187
What are examples of measurement error?
Census counting roommates as married gay couples → not what you were asking
280 different scales to measure depression
188
What is validity?
measuring the intended feature
189
What are some errors related to validity?
Measuring the wrong aspect → living together does not mean being romantically involved
Systematically over/under estimating → forgetting to adjust a clock → 1 hour late
Need to use the right tool for the job
190
What are different types of validity?
Face validity
Content validity
Convergent & discriminant validity
Criterion validity (concurrent, predictive)
191
What is face validity?
Looks right → do the questions we asked seem to relate to happiness?
192
What is content validity?
Covers breadth → does it measure all aspects of happiness or just one aspect?
193
What is convergent and discriminant validity
Convergent (is related): shows that two tests that are supposed to be related to each other are, in fact, related. Convergent validity assesses whether different measures that are supposed to measure the same underlying construct actually converge or correlate with each other.
Discriminant (isn't related): shows that two tests that are not supposed to be related are, in fact, unrelated. It evaluates whether measures of different constructs are distinct from one another.
194
What is criterion validity?
(concurrent, predictive) how well a test predicts or correlates with an outcome that it should theoretically be related to.
195
What are ways we can reduce the impact of inconsistency (poor reliability)?
Measure more than once and take the average
Measure using more than one method
196
What is reliability?
getting the same answer consistently (when no change is expected)
197
How do we assess reliability?
Test-retest reliability
Internal consistency reliability: Cronbach's alpha, split-half, parallel forms
Inter-rater reliability
198
What is test-retest reliability
testing the same thing and getting the same results
199
what are Internal consistency reliability, Cronbach’s alpha, Split-half, parallel forms, used for?
Examples of correlations between each item → if we have a scale of 10 items, those items should be related to each other
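A minimal sketch (hypothetical data, not from the course) of how internal consistency can be quantified with Cronbach's alpha, using the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = participants, columns = scale items."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Five participants answering a hypothetical 4-item scale (1-5 ratings)
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 2],
    [1, 2, 1, 2],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")  # higher = more consistent items
```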
200
What is inter-rater reliability?
If you are measuring animal behaviour, you get two people to rate the behaviour and then look to see whether the two people's ratings are related or not
201
What is the classical test theory
Observed score = True score + error
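A minimal sketch (not from the course) of classical test theory: simulate observed scores as a fixed true score plus random error, and note that averaging repeated measurements pulls the observed value back toward the true score.

```python
import numpy as np

rng = np.random.default_rng(1)
true_score = 100                             # the construct's "true" value (assumed)
error = rng.normal(loc=0, scale=5, size=10)  # random measurement error
observed = true_score + error                # observed = true + error

print("single measurement:", round(observed[0], 1))
print("mean of 10 measurements:", round(observed.mean(), 1))  # closer to 100
```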
202
What are the two types of error
Random error (reliability)
Systematic error (validity)
203
If you have high accuracy and low precision, what type of error is that
random error
204
If you have low accuracy but high precision, what type of error is that?
systematic
205
If the error is larger than the true score, what are we measuring?
error
206
What are 2 ways to measure behaviour?
Observe it (behaviour measure) Ask about it (self-report)
207
What are some ways we can observe behaviour (for measurement)
Unobtrusively
Obtain records (medical)
In standard/constrained/specific conditions → in the lab
In naturalistic settings
In a participatory context (disguised/not)
208
What are some ways we can ask about behaviour (for measurement)
Self-report
Parent/spouse report
209
What is psychometrics?
A scientific discipline concerned with measurement; concerned with maximizing validity, reliability, and generalizability
210
What is credibility?
refers to the degree to which the findings, interpretations, and conclusions of a study are trustworthy and believable based on the rigor of the research methods, data analysis, and presentation of results
211
When did credibility originate?
Dates back to Aristotle's theory of rhetoric
212
What is rhetoric
The ability to see what is possibly persuasive in every situation → the art of persuasion
213
What are the three means of persuasion stated by Aristotle?
1. Ethos
2. Pathos
3. Logos
214
What is Ethos?
the source's credibility
215
What is Pathos?
the emotional or motivational appeals
216
What is logos
the logic used to support a claim
217
What are the two components of credibility?
Trustworthiness and expertise
218
Good science is credible science. What happens if there is no credibility?
We are in the pursuit of truth; without credibility there is no truth, and without truth there is chaos
Bad science damages society
219
Who is Andrew Wakefield?
Claimed MMR vaccines cause autism
220
What is credibility (definition #2)or what does it refer to?
Refers to the combination of reproducibility, robustness and replicability
221
What is reproducibility?
finding the same result using the same data analysis strategy
222
What is robustness
finding same results using the same data, but analyzed in a different way
223
What is replicability?
finding same result with different data
224
Are all studies replicated?
No, but they are cited which is dangerous
225
What is preclinical cancer research
Research that occurs in animals before humans
226
What is a problem with fMRI credibility
“Nearly as many unique analysis pipelines as there were studies in the sample”
90% of the brain showed significant activation at some point → can't make conclusions
227
How much biomedical research is wasted?
85%
228
What leads to research waste? What questions to ask yourself?
Are research decisions based on questions relevant to users of research? No → wasted
Appropriate research design, methods, and analysis? No → wasted
Efficient research regulation and management?
Fully accessible research information?
Unbiased and usable research reports?
229
In the study where they tried to replicate 100 studies, how many could be replicated? What were the p-values like? What was the effect size like?
About 1/3 could be replicated
The p-values were very large in the replications
The effect sizes were inflated in the original studies → strength of correlation between variables
230
News examples of problems with science?
GCSE fiasco
Excel problems with COVID
Finance problems in the UK → George Osborne was in charge
231
What are problems that arise in non-credible science (in "A manifesto for reproducible science")?
Failure to control for bias
Low statistical power
Poor quality control
P-hacking
Publication bias
HARKing
232
What type of problems can arise while generating and specifying hypotheses?
failure to control for bias
233
What type of problems can occur while designing the study?
low statistical power
234
What type of error can occur while conducting the study and collecting data?
poor quality control
235
What type of problems can occur while analyzing data, and interpreting results
P-hacking
236
What type of problems can occur while publishing results
publication bias
237
What is confirmation bias?
The tendency to focus on evidence that is in line with our expectations
238
What is hindsight bias?
the tendency to see an event as having been predictable only after it has occurred
239
What is Apophenia?
The tendency to see patterns in random data
240
What is a conflict of interest and what are the two types?
A compromise of a person's objectivity when that person has a vested interest
Financial
Non-financial
241
What is a financial conflict of interest? Example?
The sponsor of a trial may be the company manufacturing the product → influences the study
ISFAR (International Scientific Forum on Alcohol Research) had a COI with big alcohol
242
What is a non-financial conflict of interest
where individuals involved in the research process have personal, professional, or academic interests. For example: the stakes that researchers have in obtaining publishable results in order to advance their career
243
What is an example of poor quality control in research
89% of intro to psych textbooks fail to define statistical significance properly
244
What leads to publication bias?
The publish or perish mindset
245
What is a P-value?
P-value indicates that the test statistic obtained (or a more extreme one) is unlikely when the starting assumption (null hypothesis) is true
246
what does the current incentive structure reward in research? and what does it ignore? What is an example of a problem with this that occurred?
Rewards: # of world-leading publications, number of papers published, total grant $ won, number of citations
Ignores: errors detected, studies replicated, methods improved, peers trained
Example: a review of 5000 journal articles found that 11–40% of the papers were published by one person
247
What is the file drawer problem?
Papers with p > 0.05 get filed away and never published → the published literature over-represents significant results
248
What is low statistical power
Low statistical power refers to the inadequacy of a research study to detect a true effect or relationship between variables when it exists → the study has a low probability of detecting a true effect if one exists
Lots of false negatives
249
What is a type 1 error?
Type 1: false positive → a positive COVID-19 test without the virus
Rejecting the null when you shouldn't have
250
What is a type 2 error?
Type 2: false negative → a negative COVID-19 test with the virus
Didn't reject the null when you should have
251
What type of error is worse, Type 1 or 2?
Type 1 is usually worse
Type 2 is worse for diseases
252
What is statistical power?
The probability of detecting an effect if there is a true effect present to detect → true positive rate
253
What is included in the equation of statistical power?
Sample size
Significance level (p-cutoff)
Power level
Effect size
Fix any three of these and you can solve for the remaining one
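A minimal sketch of that trade-off (assuming the statsmodels library is available; the numbers are illustrative, not from the course): fix three of the four quantities and solve for the fourth.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# How many participants per group for a medium effect (d = 0.5),
# alpha = .05, and 80% power?
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"needed per group: {n_per_group:.0f}")

# Conversely: with only 20 per group, what power do we actually have?
power = analysis.solve_power(effect_size=0.5, nobs1=20, alpha=0.05)
print(f"power with n = 20: {power:.2f}")  # low power → many false negatives
```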
254
What is the significance level (alpha)?
the threshold used to determine whether the results of a statistical test are deemed statistically significant
255
What is power level?
The proportion of the time there is a true positive → the probability of detecting an effect if there is a true effect present to detect
256
What is effect size?
The distance between groups / the correlation coefficient
The strength or magnitude of a relationship between variables, or the size of a treatment effect
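A minimal sketch (hypothetical data, not from the course) of one common effect size, Cohen's d: the standardized distance between two group means, using the usual pooled-variance formula.

```python
import numpy as np

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    pooled_var = ((n1 - 1) * np.var(group1, ddof=1) +
                  (n2 - 1) * np.var(group2, ddof=1)) / (n1 + n2 - 2)
    return (np.mean(group1) - np.mean(group2)) / np.sqrt(pooled_var)

stress_group = [72, 75, 78, 80, 74, 77]   # e.g., heart rate under stress (made-up)
control_group = [68, 70, 71, 69, 72, 70]  # control condition (made-up)
print(f"Cohen's d = {cohens_d(stress_group, control_group):.2f}")
```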
257
What is P-hacking?
Stop collecting data once p < 0.05
Analyze many measures but only report those with p < 0.05
Collect and analyze many conditions but only report those with p < 0.05
Use covariates to get p < 0.05
Exclude participants
Transform the data
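A minimal sketch (not from the course) of why those practices inflate false positives: test many unrelated outcomes drawn from pure noise and "report" only the ones that come out p < .05.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_outcomes = 20
significant = 0

for _ in range(n_outcomes):
    # Two groups drawn from the SAME distribution: no true effect exists
    a = rng.normal(size=30)
    b = rng.normal(size=30)
    _, p = stats.ttest_ind(a, b)
    if p < 0.05:
        significant += 1

# With 20 tests of a null effect we still expect roughly one "significant" result
print(f"{significant} of {n_outcomes} null tests came out p < .05")
```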
258
What makes it difficult to spot p-hacking
confirmation and hindsight bias influence the way researchers interpret and analyze data, potentially leading to biased conclusions and selective reporting of results
259
What happens if you ignore threats to credibility?
research waste and erosion of trust
260
What will improve credibility?
Methodological training
261
What happens if the confidence interval crosses over the line?
If the CI crosses over the line (the null value), there is no statistically significant difference; if it doesn't, the difference is statistically significant
262
What will happen to the p-value in low power studies?
Jump around a lot
263
What should judgements of trust be based on in science?
Study-level credibility
264
What is scholarship and where is it?
Scholarship in research means actively seeking and sharing knowledge in a particular field using careful, critical thinking and rigorous methods
A scientific article itself is not scholarship; a scientific article is an advancement of scholarship
The scholarship is the process that underlies the data, and the actual data
265
What are questionable research practices?
refer to behaviors or actions that, while not necessarily constituting outright scientific misconduct, may compromise the integrity, validity, or reliability of research findings.
266
What is open science?
the process of making the content and process of producing evidence and claims transparent and accessible to others
267
What is rigor?
doing the study carefully and thoroughly, using the right methods, being honest about findings, and making sure the results can be trusted.
268
Transparency increases rigor, which increases credibility
269
What is Open access (OA)
unrestricted public availability of research products
270
Green OA:
works made publicly available by the researcher, including pre-prints
271
Gold OA:
works made publicly available by journals (APC fees)
272
Platinum OA:
works made publicly available by journals (no fee)
273
What percentage of articles are OA? What percentage of journals offer self-archiving (green OA)?
25% are OA
79% of journals offer self-archiving
274
What is open source software?
Software for which the original source code is made freely available and may be redistributed and modified
275
What are the benefits and drawbacks from open source software
Benefits: free, flexible, bleeding edge, secure, collaborative
Drawbacks: not as user-friendly as commercial software, no extensive support, software longevity, less popular packages may not be secure
276
What is open science framework (OSF)
A website
Search for pre-prints
Can store data
Can link into Google Scholar
277
What is preregistration
Simply specifying your research plan in advance of your study and submitting it to a registry → can't change it after
278
What is confirmatory research?
Hypothesis testing; results are held to the highest standards
Data-independent (the outcome or decision is not influenced by the specific data being analyzed)
Minimizes false positives
P-values retain diagnostic value
May be generalized to a larger population
A research approach aimed at testing specific hypotheses or theories using predefined hypotheses, variables, and methods
Example: an economist conducts a study to investigate the impact of minimum wage policies on employment levels in different industries
279
What is exploratory research?
Hypothesis generating; results deserve to be replicated and confirmed
Data-dependent
Minimizes false negatives in order to find unexpected discoveries
P-values lose diagnostic value
Can't be generalized to a larger population
More open-ended; seeks to generate new hypotheses or insights
Ex: a social scientist conducts interviews and focus groups with teenagers to explore their attitudes and behaviors towards social media usage. The researcher does not start with specific hypotheses but aims to uncover patterns, themes, and insights that may inform future research questions or hypotheses
280
What do you submit for preregistration?
Analysis plan, hypotheses, design and procedure, planned sample, and exclusion criteria
281
What are registered reports?
When there are 2 rounds of peer review. One after you design the study and before you collect data and one after you write the report.
282
Questions asked during peer review stage 1 in RR
Are the hypotheses well founded?
Are the methods and proposed analyses feasible and sufficiently detailed?
Is the study well powered?
283
Questions asked during peer review stage 2 in RR
Did the authors follow the approved protocol?
Are the conclusions justified by the data?
284
How can open science be liberating and foster creativity?
Enables us to explore data transparently and comfortably
Rewards quality, which is under our control, rather than outcomes, which are not
Reduces the chokehold of needing to find "positive" results for career advancement
285
What makes registered reports good in regard to what they publish?
Articles are published:
Whether the hypothesis is supported or not
Whether p < 0.05 or not
Whether the results are novel or not
286
What is generalizability?
The extent to which findings, results, or conclusions drawn from a specific study can be applied or generalized to a broader population or context
287
What is a constraints on generalizability statement and what should it consider?
Participants → different people
Materials/Stimuli → does the risk-taking task actually generalize to the real world?
Procedures → talk about limitations
Historical/Temporal Specificity → how specific is the historical context → does COVID research translate to stress research at other times (in prison)?
Situation Sampled (not included in the paper)
288
What is Contributor role taxonomy (CRediT)
A shift from authorship to contributorship
Authors are credited for what they did
289
What are the open science badges?
Preregistered
Open data
Open materials
290
What is DORA
Declaration on Research Assessment
291
What do institutions that sign a DORA have to do?
Consider the content of the paper much more important than the metrics of the journal it was published in
Consider the value and impact of all research outputs (including databases and software) in addition to publications
292
What does the centre for open science do?
Creates research software tools (e.g., the Open Science Framework to facilitate preregistration and sharing of data)
Provides training and support for researchers and institutions
Conducts research on research practices
293
What is the UK reproducibility network
Promote open science practices
294
What is ReproducibiliTea?
“grassroots” journal club initiative where members meet to discuss diverse issues, papers and ideas about improving science, reproducibility and the Open Science movement.
295
What is RIOT Science Club?
a seminar series that raises awareness and provides training in reproducible, interpretable, open and transparent science practice
296
What is SIPS (society for the improvement of psychological science)?
Brings together scholars working to improve methods and practices in psychological science
Most of the time is spent working collaboratively on projects aimed at improving psychological science, including a number of hands-on workshops (e.g., learning R, Bayesian stats, meta-analytic techniques) and hackathons
297
What are the Mertonian Norms/ the ethos of modern science? (why act openly)
Communism
Universalism
Disinterestedness
Organized scepticism
298
What is communism?
All scientists should have common ownership of scientific goods, to promote science collaboration --> open science
299
What is universalism?
Scientific validity is independent of the sociopolitical status/ personal attributes of the participants
300
What is Disinterestedness?
Scientific institutions act for the benefit of the common scientific enterprise, rather than for the personal gain of individuals within them
301
What is organized skepticism?
Scientific claims should be exposed to critical scrutiny before being accepted
302
303
Why act openly?
It is the right thing to do → Mertonian norms
Benefits your academic career: more citations, news coverage; available data leads to more citations; open software is more likely to be used
Networking opportunities
Get more funding → meet requirements
Get that promotion
304
What are 5 selfish reasons to work openly (reproducibly)?
Reproducibility helps avoid disaster
Reproducibility makes it easier to write papers
Reproducibility helps reviewers see it your way
Reproducibility enables continuity of your work
Reproducibility helps build your reputation
305
Does it take more time to work openly?
no, it usually front loads the work
306
Is there a risk of being scooped when acting openly?
no, because your info is already out
307
Why have some people not started acting openly?
Habit
Issues with training
Time pressure
Peers are yet to adapt
Potential ridicule
Financial issues
308
Is open science expensive?
Yes; journals charge high APCs, so not everyone can be open
309
Does a piece of open research automatically make it good (high academic quality)
no
310
Does a piece of reproducible research automatically make it good?
no
311