reliability + validity Flashcards

1
Q

what is reliability

A

how consistent the results are

2
Q

what is internal reliability

A

the extent to which a measure is consistent within itself

3
Q

what is external reliability

A

consistent from one occasion to the next

4
Q

why is it important to assess the reliability of observations

A

so we can be confident the observations are consistent and would produce the same results if repeated

5
Q

how do you assess the reliability of observations

A

inter-observer reliability

6
Q

explain the process of interobserver reliability

A

two observers make independent recordings of the same behaviour

after the observation, they come together and compare their recordings to check for agreement

the two sets of results are correlated; a correlation above 0.8 indicates good inter-observer reliability
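A minimal sketch of the "correlate results" step, assuming the two observers' tallies per behavioural category are stored in the hypothetical lists observer_a and observer_b (made-up numbers, not data from any real study):

```python
import numpy as np

# Hypothetical tallies: how many times each observer recorded each
# behavioural category during the same observation session.
observer_a = [12, 7, 3, 9, 5]
observer_b = [11, 8, 3, 10, 4]

# Correlate the two observers' recordings (Pearson correlation).
r = np.corrcoef(observer_a, observer_b)[0, 1]
print(f"inter-observer correlation: r = {r:.2f}")

# A correlation above 0.8 is usually taken as acceptable inter-observer reliability.
if r <= 0.8:
    print("low agreement: operationalise categories further or retrain observers")
```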

7
Q

what are the two ways to improve the reliability of observations if the inter-observer score is low

A

operationalise behavioural categories further

provide more training for the observers

8
Q

what are the two methods used to assess the reliability of self report techniques

A

test-retest and inter-interviewer

9
Q

what is the process of test-retest reliability

A

a group of people complete the test, then the same test is given to the same people after a short interval (1-2 weeks) so their original answers have been forgotten

the scores on the two tests for each person are then correlated; a strong positive correlation indicates high reliability
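As a rough illustration of what "correlating the scores" involves, a hand-written Pearson correlation over hypothetical test and retest scores for the same five people (test and retest are made-up data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists of scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical questionnaire scores: first sitting and the retest 1-2 weeks later.
test = [23, 30, 18, 25, 27]
retest = [22, 31, 17, 26, 28]

r = pearson_r(test, retest)
print(f"test-retest correlation: r = {r:.2f}")  # a strong positive r indicates high reliability
```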

10
Q

what is the process of inter-interviewer reliability

A

you could interview the same person again after a week, similar to test-retest

you could also record the interview, have two separate interviewers rate it independently, check their agreement and correlate their results

11
Q

how do you improve the reliability of self report techniques

A

reduce ambiguity: some questions may be unclear or allow multiple types of answer, so rewrite or remove them

standardisation: procedures and instructions must be the same for all participants

12
Q

what is validity

A

legitimacy, whether the results reflect reality

13
Q

what is internal validity

A

whether the results are caused by what goes on within the study (the manipulation of the IV) rather than other factors

14
Q

what are examples of things that could affect internal validity in experiments (3)

A

investigator effects: anything the investigator does which may have an unintended effect on participants' performance

demand characteristics: cues that communicate the aim of the study to the participants, so their true behaviour is altered

confounding variables: variables that change systematically with the IV, so conclusions cannot be drawn about what caused changes in the DV

15
Q

what are examples of things that could affect internal validity in self report measures

A

social desirability bias: ppts may give answers that do not reflect reality because they want to present themselves in a good light

16
Q

what are examples of things that could affect internal validity in observations

A

poorly operationalised behavioural categories: observers cannot record reality because the categories are not clear

17
Q

what is population validity

A

generalising the findings of a study to other people

18
Q

what is temporal/historical validity

A

generalising the findings of a study to other time periods

19
Q

what is ecological validity

A

generalising the findings of a study to other settings

20
Q

what are the two ways of assessing validity

A

face validity and concurrent validity

21
Q

what is face validity

A

whether a self report measure looks, on the face of it, like it is measuring what the researcher intended

22
Q

what is concurrent validity

A

involves comparing the current method of measuring a topic with a previously validated method of measuring the same topic

23
Q

if a questionnaire has poor face validity how can this be improved

A

questions should be revised so they relate more to the topic

24
Q

if a questionnaire has poor concurrent validity how can this be improved

A

the researcher should remove irrelevant questions; experts can help check this

25
Q

how can you improve validity through the design of studies

A

single blind (ppts are unaware of what condition they are in) and double blind (ppts and researcher are unaware of conditions) procedures to prevent ppts guessing the aims