Research Methods Flashcards
term: validity
accuracy
term: internal validity
the extent to which the researcher measured what they intended to measure
3 factors affecting internal validity
- demand characteristics
- individual differences
- researcher bias
term: external validity
the extent to which findings can be generalised beyond the research setting
4 factors affecting external validity
- temporal
- ecological
- population
- mundane realism
term: face validity
the measure appears, on the surface, to measure what it is supposed to measure
term: concurrent validity
the extent to which a psychological measure relates to an existing similar measure
how can validity be improved in experiments?
- use control groups
- single-blind + double-blind procedures
- standardise
- minimise investigator effects
how can validity be improved in observations?
- covert observations
- behaviour categories - well defined, not too broad
- questionnaires - keep responses anonymous
how can validity be improved in qualitative methods?
- case studies/interviews - provide more in-depth information
quasi experiment
the IV is based on an existing difference between people; it is not manipulated by the researcher, it already exists
strengths + limitations of quasi
s: controlled conditions
l: confounding variables, ethical issues
lab experiment
carried out in a controlled environment where the researcher manipulates the IV
strengths + limitations of lab
s: high control, replicate, EV’s controlled, see cause and effect
l: sample bias, low eco val, DC, behaviour not naturally occurring
field experiment
carried out in an everyday environment; the researcher still manipulates the IV
strengths + limitations of field
s: high eco val, fewer DC (p’s often unaware they’re being studied)
l: results difficult to analyse
natural experiment
the IV occurs naturally + isn’t manipulated by the researcher
strengths + limitations of natural
s: no DC, high eco val
l: results difficult to analyse, ethical issues, hard to replicate
difference between opportunity and volunteer sampling?
volunteer: p’s select themselves, e.g. by replying to adverts; opportunity: the researcher selects people from the target population who are available and willing to take part
strengths + limitations of random sampling
s: equal chance, no researcher bias
l: time consuming, not representative
strengths + limitations of volunteer sampling
s: no researcher bias, ethical
l: time consuming, social desirability, unrep
strengths + limitations of opportunity sampling
s: quick + easy
l: unrep, researcher bias (in who is approached), p’s may change behaviour
term: systematic sampling
choosing every nth person from a list of the target population
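A minimal Python sketch of the idea, using a hypothetical numbered list standing in for the target population (the names and interval are made up):

```python
# Illustrative only: a hypothetical list of 100 people in the target population.
target_population = [f"person_{i}" for i in range(1, 101)]

n = 5                                   # sampling interval: every 5th person
sample = target_population[n - 1::n]    # person_5, person_10, ..., person_100

print(len(sample))                      # 20 participants
```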
strengths + limitations of systematic sampling
s: less researcher bias
l: time consuming, unrep
term: stratified sampling
selecting p’s in proportion to their frequency in target population
how do you carry out stratified sampling? (see the sketch after the steps)
- identify the subgroups (strata) that make up the target population
- work out the proportion needed from each subgroup for the sample to be representative
- randomly select p’s from each subgroup to reflect those proportions
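A minimal Python sketch of those steps, with made-up strata and sample size (the subgroup names and numbers are purely illustrative):

```python
import random

# Illustrative only: made-up strata sizes and sample size.
strata_sizes = {"smokers": 200, "non_smokers": 800}    # hypothetical target population
sample_size = 50
total = sum(strata_sizes.values())

sample = {}
for stratum, count in strata_sizes.items():
    k = round(count / total * sample_size)              # this stratum's share of the sample
    members = [f"{stratum}_{i}" for i in range(count)]  # stand-ins for real people
    sample[stratum] = random.sample(members, k)         # random selection within the stratum

print({s: len(people) for s, people in sample.items()})  # {'smokers': 10, 'non_smokers': 40}
```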
strengths + limitations of stratified
s: no researcher bias, representative
l: time consuming, other factors not accounted for
what is a paradigm shift?
significant change in dominant unifying theory within scientific discipline
what is the psychology paradigm shift?
paradigm -> questioning of paradigm -> general critique gathers popularity -> scientific revolution -> PARADIGM SHIFT
what did Thomas Kuhn believe?
what distinguishes scientific disciplines from non-scientific disciplines is a shared set of assumptions and methods (a paradigm)
why is psychology a pre-science?
it is marked by too much internal disagreement and too many conflicting approaches to qualify as a science (no single paradigm)
term: non directional hypothesis
states there WILL be a difference/relationship but not which direction
term: directional hypothesis
states the TYPE/direction of the difference or relationship
term: single-blind procedure
p’s do not know which condition they are in
term: double-blind procedure
neither the p’s nor the researcher conducting the study knows which condition the p’s are in
examples of investigator effects
age, ethnicity, tone, gender, location, physical characteristics
how to reduce investigator effects
double-blind procedure prevents investigator from inadvertently giving p’s cues
term: pilot studies
small scale ‘practice’ investigations where researchers check all aspects of their research
what do pilot studies allow?
- make changes to design, method, analysis
- p’s to suggest appropriate changes
- improves quality of research
- avoids unnecessary work
strengths of questionnaires
- quick + cheap
- large sample
- replication
- qualitative + quantitative
- open + closed questions
limitations of questionnaires
- misunderstandings
- bias sample
- low response rate
- misleading questions
- results difficult to analyse
strengths of interviews
- ease misunderstandings
- replication
- open + closed questions
- quantitative + qualitative
limitations of interviews
- ethical issues - p’s may reveal more than they intended
- interviewer training
- interviewer bias
- demand characteristics
- results difficult to analyse
4 criticisms of peer review
- bias: the research world is small, so reviewers often know the researcher
- plagiarism: reviewers may hold back work so their own is published first
- publication is controlled by elites, who can reject research they disagree with
- takes a lot of time
uses of peer review
- carried out before a piece of work is published, to validate its accuracy
- any weaknesses/suggestions for improvement are highlighted
term: reliability
extent to which test produces CONSISTENT results
term: internal reliability
test is consistent in ITSELF
term: external reliability
consistent OVER TIME
2 ways to assess reliability
- test-retest
- inter-observer reliability
2 steps for test-retest
- give the same test to the same person on 2 different occasions; if reliable, the results should be the same/similar
- check whether there is a significant positive correlation between the two sets of scores (see the sketch below)
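A minimal Python sketch of the correlation check, using made-up scores for the same five people on two occasions (`statistics.correlation` gives Pearson’s r in Python 3.10+):

```python
from statistics import correlation   # Pearson's r (Python 3.10+)

# Illustrative only: made-up scores for the same 5 people on two occasions.
first_occasion  = [12, 18, 9, 15, 20]
second_occasion = [13, 17, 10, 15, 19]

r = correlation(first_occasion, second_occasion)
print(round(r, 2))   # close to +1 -> consistent scores, so the test looks reliable
```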
4 steps for inter-observer reliability
- PILOT STUDY - check observers apply the same behavioural categories
- observers watch the same event but record data independently
- correlate the data of the 2 observers (see the sketch below)
- a strong, statistically significant positive correlation = high IOR
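The same kind of check works for inter-observer reliability; a sketch with made-up tallies per behavioural category from two observers:

```python
from statistics import correlation   # Pearson's r (Python 3.10+)

# Illustrative only: made-up tallies per behavioural category,
# recorded independently by two observers watching the same event.
observer_a = [4, 7, 2, 9, 5]
observer_b = [5, 6, 2, 10, 4]

print(round(correlation(observer_a, observer_b), 2))  # strong positive r -> high IOR
```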
3 ways to improve reliability in experiments
- strict control of variables
- same conditions each time
- standardised procedures
4 ways to improve reliability in observations
- operationalised behavioural categories
- no overlapping categories
- behaviours covered by checklist
- further training
2 ways to improve reliability in questionnaires
- replace or rewrite questions that are unclear
- replace open questions with closed, fixed ones
3 ways to improve reliability in interviews
- same interviewer each time
- no leading/ unclear questions
- use a structured interview, where the interviewer’s behaviour is more controlled
term: paradigm
set of shared assumptions and agreed methods within scientific discipline
what did Karl Popper say?
there are key criteria that must be met for a discipline to be a science:
- falsifiability
- replicability
- objectivity and empirical method
key details of falsifiability
- a theory must admit the possibility of being proved false
- the strongest theories are those that have survived the most attempts to falsify them
- theories should be constantly challenged
key details of replicability
- scientific procedures and findings can be repeated by others
key details of objectivity and empirical method
- all sources of personal bias are minimised so they don’t influence the research process
- researchers keep critical distance
why are psychological reports made?
must be written to allow effective replication, so others can repeat the research to check and validate the findings
term: directional correlation hypothesis
states WHICH TYPE of correlation it is ie. there’s a negative correlation
term: non- directional correlation hypothesis
doesn’t state which type it is- ie there’s a relationship
term: spurious correlation
a connection between 2 variables that appears to be causal but is not
3 strengths of correlations
- useful starting tool for research
- measure relationship between variables that cannot be manipulated
- quick + cheap to carry out
3 limitations of correlations
- easily misused and misinterpreted
- an intervening variable may be causing the relationship between the co-variables (see the sketch below)
- can’t infer cause and effect, so we don’t know which co-variable is causing the other to change
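A minimal sketch of the intervening-variable point, using the classic (made-up) ice cream sales vs. drownings example: both co-variables are driven by temperature, so they correlate strongly even though neither causes the other:

```python
import random
from statistics import correlation   # Pearson's r (Python 3.10+)

random.seed(1)

# Illustrative only: temperature (the intervening variable) drives both co-variables.
temperature     = [random.uniform(10, 35) for _ in range(100)]
ice_cream_sales = [3.0 * t + random.uniform(-5, 5) for t in temperature]
drownings       = [0.5 * t + random.uniform(-2, 2) for t in temperature]

r = correlation(ice_cream_sales, drownings)
print(round(r, 2))   # strong positive r, but neither co-variable causes the other
```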