W3: The Future of Psychology Flashcards
L11, L12, PA 6, Marecek Article, Ch 8, Ch 10, Ch 11, Ch 12
What role does methodology play in psychology?
very important one!
important principles
- striving for clarity
- focus on testing
- attention to statistics, confounds, induction
- focus on empirical data & analyses
What is methodology vs methodolatry?
methodolatry is when method itself starts being worshipped, which can become very superficial; it's no longer true science, because science is more than method: it's an attitude (integrity, honesty)
= a bias towards increasing methodological rigour instead of the predictive power of theories
What are Merton's 4 core values in science?
- Communalism: scientific products belong to no one (theoretically, but in practice we're not there yet)
- Universalism: truth claims are judged the same, no matter who makes them (theoretically, but in practice there are power structures in place, and underrepresentation of minorities)
- Disinterestedness: scientists have no interest in the outcome of research
- Organized skepticism: ideas are critically examined and rigorously tested, regardless of who proposes them
What is meant by regulative ideals?
Merton's norms & the scientific attitude are not actualized all the time, but this doesn't make them useless
-> they are regulative ideals: things we strive for !
What is psychology’s 20th century paradigm? aka the standard of psych research in the 20th century
- Create a theory (usually a verbal story)
- Come up with an experiment (usually small n)
- Test hypothesis (usually with a significance test)
- Publish results (usually without full disclosure and without data)
What events contributed to the change in psych methodology?
- Vul's voodoo correlations (found implausibly high correlations of .88)
- Bargh's fantastic effects (found strange results, e.g. that lonely people take hotter showers)
- Bem published evidence of clairvoyance but didn't seem to have done anything wrong methodologically
What are questionable research practices (QRPs)?
range of activities that intentionally or unintentionally distort data in favour of a researcher’s own hypotheses
What are some common Questionable Research Practices (QRPs)?
- p-hacking: manipulating data in order to obtain a desired (significant) p value (through deleting outliers, running extra participants to increase chances of significance, trying out several dependent variables, post hoc adding variables to the design etc.)
- publication bias: only publishing positive results (revealed by funnel plots)
- HARKing: Hypothesizing After the Results are Known
- Dropping experimental conditions that “don’t work”
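The damage done by one of these QRPs, optional stopping ("running extra participants to increase chances of significance"), can be sketched in a small simulation. This is an illustrative assumption, not any specific study: the null is true, we test once at n = 20, and in the QRP condition we add 20 more participants and test again whenever the first test misses significance. Even this single extra peek inflates the false positive rate above the nominal 5%.

```python
import math
import random

def z_test_p(sample):
    """Two-sided p-value for H0: mean = 0, assuming known sd = 1."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))  # = 2 * (1 - Phi(|z|))

random.seed(1)
sims = 5000
fp_single = fp_peek = 0
for _ in range(sims):
    data = [random.gauss(0, 1) for _ in range(20)]  # null hypothesis is true
    first_sig = z_test_p(data) < 0.05
    fp_single += first_sig
    # QRP: if not significant yet, "run extra participants" and test again
    extended = data + [random.gauss(0, 1) for _ in range(20)]
    fp_peek += first_sig or z_test_p(extended) < 0.05

rate_single, rate_peek = fp_single / sims, fp_peek / sims
print(rate_single, rate_peek)  # peeking pushes the false positive rate above .05
```

With more looks at the data (or more dependent variables, or post hoc outlier deletion) the inflation gets correspondingly worse.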
why do some authors believe we have overcome the replication crisis?
- we have identified the problems
- we can improve transparency in data & its analysis
- we can require preregistration
- we now have better bayesian stats
- we know value of replications
what 2 big things were achieved in psych methodology following the replication crisis?
- creation of repositories
- publication of the TOP guidelines
What are 3 reasons why open science is necessary?
- because errors in data analysis are likely, and conclusions made depend on these analyses
- makes it easy to run secondary data analysis
- makes big data research possible
what is meant by publish or perish?
refers to the practice in academia that a person will not be appointed or promoted unless they have a strong portfolio of scientific publications
define peer review
the evaluation of scientific work by research colleagues (peers) to decide whether the work is good enough to be published (or financed in case of grant applications)
what is the journal impact factor?
number that estimates the impact a journal has on a research area; based on the average number of citations to articles in the journal in subsequent years
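The calculation behind the JIF can be sketched in a couple of lines, assuming the standard two-year window (citations received in year Y to articles from Y-1 and Y-2, divided by the number of citable items from those years); the journal and its numbers below are hypothetical.

```python
def impact_factor(citations_to_prev_two_years, items_prev_two_years):
    """JIF for year Y: citations received in Y to articles published in
    Y-1 and Y-2, divided by the number of citable items in Y-1 and Y-2."""
    return citations_to_prev_two_years / items_prev_two_years

# hypothetical journal: 480 citations in 2024 to its 2022-2023 articles,
# of which there were 150
jif = impact_factor(480, 150)
print(jif)  # 3.2
```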
what is a Declaration on Research Assessment (DORA)?
declaration that asks science funders and evaluators to look at the quality of the research itself rather than at the prestige of the outlets in which it was published
what is an open access journal?
journal that can be consulted without paying a subscription or fees for reading articles (usually via internet)
what is an article processing charge (APC)?
price asked by open access scientific journals to process a manuscript and publish it in the journal
define mega-journal
huge open access journal focused on methodological rigour rather than theoretical contribution; articles on many different topics can be included
how has psych science publication changed in the past 100 years?
- many more articles published
- quality of journals became more important
how is the quality of a journal evaluated? what was the consequence of this?
estimated on basis of average nr. of citations in a specified period aka JOURNAL IMPACT FACTOR
-> led to Declaration on Research Assessment since reliance on JIF had growing negative side effects
why did open access journals start appearing?
cus commercial publishers took over scientific journals from learned societies and research groups and asked increasingly higher subscription fees, even though the production costs went down
what is the business model of open access journals as opposed to commercial publishers? what was the consequence of this?
open access journals do not have an income from subscriptions so researchers have to pay an Article Processing Charge to get their article published
-> rise of mega journals & predatory journals
how did double dipping come about?
established publishers tried to profit from the new business model created by open access journals, by starting their own open access mega-journals and offering open access in their subscription journals (hybrid model)
why has there been an explosive growth of scientific papers in the past 100 years?
- cus more researchers were hired
- cus researchers were increasingly motivated to publish more (PUBLISH OR PERISH)
what is meant by “double dipping” in science?
practice in science in which journals make money both from journal subscriptions and from article fees for open access
define predatory journal
scientific journal that gives the impression of being genuine (peer-reviewed, with mechanisms of error control, promise of longevity) without adhering to the standards; tries to lure scientists to pay APCs for very limited service
why do some authors think we have not overcome the replication crisis (i.e. that psych cannot improve its scientific record)?
- the problems have already been known for a long time
- editors and reviewers only introduce some small changes
- even with data storage & preregistration, psychologists continue to make the same research mistakes
define registered report
type of research article that is evaluated by scientific journals before the data are collected; the goal is to make the evaluation independent of the obtained results and solely dependent on the research question, the research design, and the proposed analyses
define bayesian statistics
data analysis that deviates from traditional hypothesis testing with p-values; estimates the relative probabilities of H0 and Ha; is hoped to correct existing misunderstandings of statistics
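The idea of weighing H0 against Ha can be sketched with a toy beta-binomial Bayes factor for a coin (an illustrative example, not any particular package's method): H0 says the coin is fair, Ha puts a uniform prior on the bias, and under that uniform prior every outcome of k heads out of n has marginal likelihood 1/(n+1).

```python
import math

def bayes_factor_01(k, n):
    """BF01 for H0: theta = 0.5 vs Ha: theta ~ Uniform(0, 1).
    Under Ha the marginal likelihood of any k out of n is 1 / (n + 1)."""
    likelihood_h0 = math.comb(n, k) * 0.5 ** n
    likelihood_ha = 1 / (n + 1)
    return likelihood_h0 / likelihood_ha

print(bayes_factor_01(50, 100))  # > 1: the data support the fair-coin H0
print(bayes_factor_01(70, 100))  # < 1: the data support Ha
```

Note the contrast with p-values: a Bayes factor can express support *for* H0, not only evidence against it.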
What is the pottery barn rule?
the moral obligation of a scientific journal to publish a failure to replicate a finding previously published in the journal
What is statcheck?
a program that checks papers for statistical reporting errors by recomputing p-values from the reported test statistics
results showed that 50% of papers contained errors
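The core idea can be sketched for z statistics (a simplification; the real statcheck also parses t, F, r and chi-square values from article text): recompute the p-value from the reported statistic and flag inconsistencies with the reported p.

```python
import math

def consistent(z, reported_p, tol=0.005):
    """Recompute the two-sided p-value from a reported z statistic and
    check it against the reported (rounded) p-value."""
    recomputed = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return abs(recomputed - reported_p) <= tol

print(consistent(2.10, 0.036))  # True: reported p matches the statistic
print(consistent(2.10, 0.012))  # False: reported p is too small
```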
What is the replication crisis aka crisis of confidence?
a crisis of confidence in psych research, because many published findings cannot be repeated when the studies are rerun, calling the reliability of scientific findings into question
arose because psychologists and doctors published very bold claims, so people started examining the results after 2010
what are the 3 main factors that contributed to the replication crisis?
- misinterpreting statistical significance as proof that the Ha is true
- publication bias making it difficult for unsuccessful replications to be published (file drawer problem & the issue with conceptual replications that are only published when they reinforce the original finding)
- use of QRPs
what is the file drawer problem?
the issue that the scientific literature poorly represents the research actually done, because experiments that don't find significant differences are less likely to get published
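A small simulation (with an assumed true effect and sample size) makes the distortion concrete: if only significant studies leave the file drawer, the average published effect size overestimates the true effect.

```python
import math
import random

random.seed(7)
true_effect, n, sims = 0.2, 25, 4000  # small true effect, underpowered studies
published = []
for _ in range(sims):
    sample = [random.gauss(true_effect, 1) for _ in range(n)]
    mean = sum(sample) / n
    z = mean * math.sqrt(n)
    p = math.erfc(abs(z) / math.sqrt(2))
    if p < 0.05:              # file drawer: only significant studies get published
        published.append(mean)

mean_published = sum(published) / len(published)
print(mean_published)  # well above the true effect of 0.2
```

This is why the card above notes that the literature "poorly represents the research done": the published record is a biased sample of all studies run.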
what is conceptual replication? and its pros and cons?
replication in which an effect is investigated differently from the original study
pro: good for examining the generality of a finding
con: can magnify biases in the scientific literature if combined with the file drawer problem
what is open science?
science practice where all relevant information is made easily available, so that other researchers can check the findings and integrate them in their own research
define repository
in science is a location where data and analysis programs are stored, so that others can retrieve them (typically on the internet)
What are the 3 pillars of open science?
- Open Data: so anybody can assess the evidence
- Open Materials: so anybody can replicate the study
- Preregistration: so that a posteriori tinkering with analyses becomes visible
what does FAIR and TOP stand for in the open science framework?
Findable, Accessible, Interoperable, Reusable data
Transparency and Openness Promotion guidelines
What is TOP and its 8 standards?
Transparency and Openness Promotion (TOP) guidelines:
list of criteria written by advocates of open science describing the extent to which journals adhere to the standards of open and reproducible science
1. Citation standards
2. Data transparency
3. Analytic methods transparency
4. Research methods transparency
5. Design and analysis transparency
6. Preregistration of studies
7. Preregistration of analysis plans
8. Replication
what is secondary data analysis?
reanalysis of existing data to address new research questions
What is big data?
collection and use of large datasets for secondary data analysis
What are the Many Labs projects?
tried to replicate a range of psych phenomena
bad news: some flashy findings didn't replicate
good news: other findings replicate well
What does the Psychological Science Accelerator aim for?
aims to:
- accelerate cumulative reliable knowledge
- counter WEIRDness through diversity
- rapidly deploy huge distributed studies to evaluate research findings
What are 6 criticisms of Open Science?
- Increased bureaucracy and more work for researchers
- Most data are never downloaded, many preregistrations aren’t adhered to, badges aren’t checked
- Too much focus on replication, not enough on creativity
- It's expensive and can compound inequalities
- Not always feasible or desirable
- Some approaches may decline (e.g. field studies, qualitative research)
what are quantitative research methods?
research methods based on quantifiable data and associated with the hypothetico-deductive model
What are the assumptions underlying quantitative research methods?
- there is an outside reality that can be discovered
- science aims to find universal causal relationships
- we try to avoid confounds & noise in data by controlling circumstances of study
- the researcher is a source of bias & noise, thus standardized measurements and instruments are necessary
- progress happens through falsification
what are the 3 types of quantitative research?
descriptive
relational
experimental
what is descriptive research?
- type of quantitative research method
- focus on observation & carefully monitoring the situation
- trying to express variables in numbers
- usually involves a few measures from a large group of participants
what is relational research?
- searching for correlations between measures to establish whether there is a relation between them
- humans cannot intuitively detect correlations reliably and may even perceive illusory correlations
- use of factor analysis to find the structure in datasets with many variables
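The correlation coefficient at the heart of relational research can be computed with a minimal Pearson r (the two measures below are hypothetical data, just to show the mechanics):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of measures."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical measures: hours studied and exam score for 5 participants
hours = [2, 4, 6, 8, 10]
score = [55, 60, 68, 72, 80]
print(pearson_r(hours, score))  # close to +1: strong positive relation
```

Computing r rather than eyeballing scatterplots is exactly what protects against the illusory correlations mentioned above.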
what is experimental research?
- searching for cause-effect relationships by excluding confounding variables (but the more you exclude, the more artificial your research becomes)
- experiments often not possible
- only suspected causes should be manipulated, everything else should remain constant
what does the hierarchy of different types of research look like?
top to bottom (with bottom having less credibility)
1. meta analysis
2. randomized controlled studies
3. follow up studies
4. case control studies
5. cross sectional surveys
6. case reports
define confounding variable
variable that was not taken into account in the study and that may be the origin of the effect observed
what is the idiographic vs nomothetic approach? which does quantitative and qualitative research take?
idiographic approach: the conclusions of a study stay limited to the phenomenon under study; QUALITATIVE
nomothetic approach: a study is run in search of universal principles that exceed the confines of the study; QUANTITATIVE
what are the 2 types of data collection and the 3 general methods used in qualitative research?
data collection:
- semi-structured interviews
- focus groups
then transcribed and analyzed in one of 3 ways:
- grounded theory
- interpretative phenomenological analysis (IPA)
- discourse analysis
what is the evolutionary account of philosophy of science?
the view that the rise and fall of scientific ideas follow Darwinian principles of random variation and natural selection
so the criterion determining whether an idea will survive may be whether society at large finds the idea interesting and useful
define bracketing
requirement in qualitative research to look at a phenomenon with an open mind and to free oneself from preconceptions
What is the quantitative imperative?
the conviction that you cannot know what you cannot measure
what are the strengths of the quantitative research method?
- lends itself well for statistical analyses of large datasets
- can produce precise predictions that can be tested
- makes comparison (between groups or subjects) possible/easier.
- easier to investigate confounds and validity threats
what are the weaknesses of the quantitative research method?
- little interest in the perception of participants
- research limited by what is measurable
- better suited to testing general theories than to finding solutions for specific situations; if you don't have a well-developed theory yet, quantitative methods aren't as helpful