Chapter 11: Psychology and open science Flashcards
Replication crisis
Only 36% of replication attempts found a significant effect, with lower replication rates for social psychology than for cognitive psychology (Open Science Collaboration, 2015)
Fraud cases increased scepticism towards psychology
Why did Ioannidis argue most medical research findings were likely to be false?
A single study finding a significant effect was wrongly believed to provide conclusive evidence; combined with low statistical power and publication bias, this means many published findings are likely false positives
Statistical power
The probability of correctly rejecting the null hypothesis when it is false (i.e. of detecting a true effect)
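Power can be made concrete with a simulation. The sketch below (my illustration, not from the chapter) estimates the power of a simple one-sample z-test by repeatedly drawing data with a true effect and counting how often the null is rejected:

```python
import math
import random
from statistics import NormalDist


def simulate_power(true_effect, n, alpha=0.05, n_sims=5000, seed=1):
    """Estimate power of a two-sided one-sample z-test by simulation.

    Draws samples of size n from N(true_effect, 1) and counts how
    often the null hypothesis (mean = 0) is correctly rejected.
    """
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha = .05
    rejections = 0
    for _ in range(n_sims):
        sample = [rng.gauss(true_effect, 1.0) for _ in range(n)]
        z = (sum(sample) / n) * math.sqrt(n)  # known sigma = 1
        if abs(z) > z_crit:
            rejections += 1
    return rejections / n_sims


# A medium effect with n = 30 per study: power is estimated as the
# proportion of simulated studies that detect the effect.
print(simulate_power(0.5, 30))
```

With no true effect (`true_effect=0`), the same function returns roughly `alpha`, the false-positive rate, which shows why a single significant study is weak evidence.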
Problems with replication
- misunderstanding of statistics (treating a significant p-value as proof of an effect)
- replication often done by same researcher
- direct replications were not considered interesting
- failed replications were unconvincing and rarely published
- direct replications left for informal communications and beginning students
File drawer problem
Non-significant studies are less likely to be published
Conceptual replications
Replications that test the same hypothesis with a different method or operationalisation, rather than repeating the original procedure
If no effect is found, it does not disprove the original finding; it only introduces limiting conditions
Questionable research practices
P-hacking: manipulating data to find a statistically significant p-value
- dropping bad participant data after analysis
- adding participants until statistically significant
- testing multiple dependent variables, only reporting significant ones
- adding covariates
- adding variables
- HARKing
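One of these practices, adding participants until the result is significant (optional stopping), can be shown to inflate false positives by simulation. A minimal sketch (my illustration, not from the chapter): data are generated with no real effect, and a z-test is rerun after every batch of new participants.

```python
import math
import random
from statistics import NormalDist


def false_positive_rate(max_n, start_n=10, step=5, alpha=0.05,
                        n_sims=2000, seed=2):
    """Simulate 'adding participants until significant' under the null.

    Data come from N(0, 1), so there is no real effect. After every
    `step` extra observations a z-test is run, stopping at the first
    p < alpha. Returns the fraction of simulations that ever reach
    significance.
    """
    rng = random.Random(seed)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    hits = 0
    for _ in range(n_sims):
        data = [rng.gauss(0, 1) for _ in range(start_n)]
        while True:
            z = (sum(data) / len(data)) * math.sqrt(len(data))
            if abs(z) > z_crit:
                hits += 1  # a false positive: there is no true effect
                break
            if len(data) >= max_n:
                break
            data += [rng.gauss(0, 1) for _ in range(step)]
    return hits / n_sims


# Peeking repeatedly pushes the false-positive rate well above the
# nominal 5%.
print(false_positive_rate(max_n=100))
```

The honest rate for a single fixed-n test is alpha (5%); repeated peeking gives the researcher many chances to "win", which is exactly why this counts as p-hacking.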
HARKing
Hypothesising After the Results are Known: presenting a hypothesis formulated after the data were analysed as if it had been predicted in advance
Optimistic views towards replication crisis
Researchers are now aware of their questionable research practices and the consequences, so will change behaviours
Registered report
Peer-reviewed registration of a study's hypotheses and methods before it is conducted; acceptance for publication does not depend on the results
Why are Bayesian statistics considered better?
They compare the relative likelihoods of hypotheses given the data, which is more intuitive than traditional null-hypothesis significance testing and less prone to misinterpretation
Pessimistic views towards replication crisis
These problems have been known for a long time, and researchers will not change their behaviour because they are too busy and lack the resources to do so
Open science
Relevant information is easily available online to the public and other researchers, increasing transparency
Repository
Online location where data, materials, and analysis code are stored; also includes timestamped preregistrations of studies
TOP Transparency and Openness Promotion guidelines
Criteria describing how strongly journals adhere to open science practices, graded in levels from 0 to 3
Standards for TOP guidelines
- citation of datasets and materials
- data transparency
- code transparency
- research material transparency
- design and analysis transparency
- preregistration of study
- preregistration of analysis
- replication
Secondary data analysis
Using existing data to analyse different research questions
Big data
Collection of large datasets used for secondary data analysis
Characteristics of big data
Velocity of change
Volume
Veracity
Value
Variety
Publish or perish
Researchers need a strong portfolio of published research or they will not be promoted; this pressure led to an increase in the number of studies published
Increasing the quality of submissions in journals
- peer review
- journal impact factor based on number of citations
Consequences of journal impact factor
- increased power of Garfield’s company and index
- dominance of research in English
- decreased importance of books compared to articles
- preference for publishing novel findings (rather than replications) to attract citations
DORA Declaration on Research Assessment
- other sources than journals are important
- journal impact factors to be eliminated for promotions and funding
- research to be evaluated by itself and not based on journal
- explicitly stating which criteria are used to make decisions
Open access journal
Journal whose articles can be read without a subscription or reader fees; costs are usually covered by APCs (article processing charges) paid by authors
The EU decided that all EU-funded research should be open access by 2020
Double dipping
Traditional subscription journals also launched open access journals, giving the publishers two sources of income
Predatory journal
A journal that pretends to be a genuine scientific journal but provides no real quality control, existing to collect APCs from researchers
McKay and Coltheart (2017) sent nonsensical articles to journals; most accepted them, either with minor changes or once the APC was paid