questionable research practices Flashcards

different types of practices and why researchers do them

1
Q

what is the QRP paradox

A
  • what should be beyond the researcher's control = the results
  • what enhances careers = the results
  • many fear that without enough publications they won't advance
2
Q

why is there a lack of replication

A
  • only about 1 in 100 papers is a direct replication of a previous hypothesis test
  • hypotheses often differ each time because they are tweaked
  • most researchers don't run exact replications
  • if the results don't match, researchers assume they did something wrong
3
Q

what is P-Hacking

A
  • rounding the p-value down so it crosses the significance threshold (e.g. reporting p = .054 as p = .05)
  • using a one-tailed test when a two-tailed test should really have been used
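The one-tailed switch can be sketched numerically. This is a stdlib-only illustration with made-up data, not anything from the card: for the same positive z statistic, the one-tailed p is exactly half the two-tailed p, which can flip a result across the .05 threshold.

```python
import math

def normal_sf(z):
    """Upper-tail probability of the standard normal, 1 - Phi(z)."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Made-up scores for ten participants (differences from a null mean of 0,
# in units of a known SD of 1, so a simple z-test applies).
sample = [0.8, -0.2, 1.1, 0.4, 0.9, -0.5, 0.7, 1.3, 0.2, 0.6]
mean = sum(sample) / len(sample)
z = mean * math.sqrt(len(sample))   # z = mean / (sd / sqrt(n)), with sd = 1

p_two = 2 * normal_sf(abs(z))  # the test that should have been planned
p_one = normal_sf(z)           # the "hacked" one-tailed alternative

print(f"z = {z:.2f}: two-tailed p = {p_two:.3f}, one-tailed p = {p_one:.3f}")
```

With these particular data the two-tailed p lands just above .05 while the one-tailed p lands just below it, which is exactly the temptation the card describes.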
4
Q

what is HARKing

A
  • Hypothesising After the Results are Known
  • having a hypothesis from the outset but never stating it
  • having no hypothesis before starting, then producing one after the analysis
  • a post-hoc label is the upfront indication that the author thought of the test later
5
Q

what is the problem with publication bias and data sharing

A
  • if null or unexpected results can't be published, researchers won't want to submit papers
  • making data accessible to others is hard work
  • researchers may not want to give access
  • participants may not have consented to their data being shared
  • participants in small samples are easier to identify
6
Q

why do so many researchers commit QRPs

A
  • Martinson and colleagues (2006)
  • perceived justice predicts the likelihood of misbehaving in research
  • distributive justice: effort relative to the reward received
  • effort items: more demanding work, unreasonable hours
  • reward items: respect/prestige at work
  • procedural injustice: perceived unfairness in the research system
  • success items: "working the system" or the old boys' network
7
Q

what is the law of small numbers

A
  • we expect our small data sets to show the same results as our large data sets
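A quick simulation (a sketch with arbitrary parameters, not from the card) shows why that expectation fails: sample means from small samples are far more variable than those from large samples.

```python
import random
import statistics

random.seed(42)

def sample_means(n, reps=2000, mu=100, sd=15):
    """Distribution of sample means for `reps` samples of size n."""
    return [statistics.fmean(random.gauss(mu, sd) for _ in range(n))
            for _ in range(reps)]

sd_small = statistics.stdev(sample_means(10))
sd_large = statistics.stdev(sample_means(100))
print(f"spread of sample means, n=10:  {sd_small:.2f}")
print(f"spread of sample means, n=100: {sd_large:.2f}")
# Small-sample results swing far more, so a small study matching a large
# one exactly is the exception, not the rule.
```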
8
Q

what is the file drawer problem

A
  • Rosenthal (1979)
  • journals publish the 5% of studies showing Type I errors while the file drawer holds the other 95% (the null results)
  • the more positive studies have been reported, the larger the file drawer needed to cancel them out
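Rosenthal's "larger file drawer" point can be made concrete with his fail-safe N: the number of unpublished null (z = 0) studies it would take to pull the Stouffer combined z below the one-tailed .05 criterion. The z-scores below are invented for illustration.

```python
def fail_safe_n(z_scores, z_crit=1.645):
    """Rosenthal's fail-safe N: how many file-drawer studies averaging
    z = 0 would drag the Stouffer combined z, sum(z)/sqrt(k + N),
    below the one-tailed .05 cutoff z_crit."""
    total = sum(z_scores)
    return max(0.0, (total / z_crit) ** 2 - len(z_scores))

# Hypothetical example: five published studies, each modestly significant.
zs = [1.7, 2.0, 1.8, 2.1, 1.9]
print(f"fail-safe N = {fail_safe_n(zs):.1f}")
```

The more (and stronger) the positive results, the larger the fail-safe N, matching the card's last point.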
9
Q

what is cognitive preference

A
  • Greenwald 1975
  • experimenter error can lead to acceptance of an inappropriate null hypothesis
  • uncontrolled environments produce high variability
10
Q

what is citation bias

A
  • Williams & Bargh 2008
  • the tendency for research articles that report a significant positive result to be cited more than those that report non-significant results
11
Q

what is confirmation bias

A
  • looking for information that agrees with what you already believe
  • ending up reporting and remembering the information that confirms what you already thought
12
Q

what is moral asymmetry

A
  • asymmetric thinking about data practices
  • judgements of what is good and bad when it comes to data
  • e.g. is it worse to make up data or to leave data out?
13
Q

how are journals to blame

A
  • media coverage is desirable, so it is tempting to put out attention-grabbing data
  • and not to put out data that criticises the media coverage
14
Q

how are funders and institutions to blame

A
  • strategic funding
  • more funding is available in fashionable areas
  • a dearth of funding for replication studies
15
Q

how can we combat biases

A
  • be objective & transparent
  • eradicate subjectivity
  • adopt standards
  • automate data collection and analysis
  • make recordings
  • make data and analysis scripts open
16
Q

what are perverse incentives

A
  • pressure to publish regardless of what the data show
  • incentives from universities and funding bodies to put out positive results
17
Q

what are financial incentives

A
  • payments following successful publications
  • pressure to find what the funding company wants
  • the incentive of future funding
  • direct personal payment
18
Q

what are the potential solutions

A
  • follow reporting guidelines
  • replicate studies
  • pre-register studies
  • make sure studies are adequately powered
  • share data and materials so others can check
19
Q

what is pre-registration

A
  • increases transparency
  • reduces P-Hacking and HARKing
  • allows assessment of quality and credibility
  • doesn’t help with replication or publication bias
20
Q

what are registered reports

A
  • peer review before research is undertaken
  • acceptance regardless of results
  • registered replication reports are offered for replication studies