assessment principles Flashcards

1
Q

define discriminative measurements

A

attempts to differentiate between two or more groups of people

2
Q

define predictive measurements

A

attempts to classify people into a set of predefined categories for the purpose of estimating an outcome

3
Q

define evaluative measurement

A

pertains to measurement of change in an individual or group over time

4
Q

define descriptive measurement

A

pertains to efforts to obtain a ‘clinical picture’ or baseline of a person’s skills

5
Q

what are the 4 types of assessment

A

non standardised
standardised
criterion referenced
norm referenced

6
Q

what does measurement enable therapists to do

A
  • quantify attributes of individuals
  • make comparisons
  • document change in performance
7
Q

define evaluation

A

The process of determining the worth of something in relation to established benchmarks, using assessment information.

8
Q

define re-evaluation

A

process of critical analysis of a client’s response to intervention

9
Q

define screening

A

A quick review of the client’s situation to determine if an occupational therapy evaluation is warranted

10
Q

define testing

A

a systematic procedure for observing a person’s behaviour & describing it with the aid of a numerical scale or a category-system

11
Q

define evidence based practice

A

The integration of best research evidence available, clinical experience and patient values

12
Q

define non standardised assessments

A

Do not follow a standard approach or protocol

May contain data collected from interviews, questionnaires and observation of performance

13
Q

define standardised assessments

A
  • Are developed using prescribed procedures
  • Are administered and scored in a consistent manner, under the same conditions and test directions

14
Q

define descriptive assessments

A

to describe individuals within groups and to characterise differences

15
Q

define evaluative assessments

A

use criteria or items to measure an individual’s trait over time

16
Q

define predictive assessments

A

use criteria to classify individuals in order to predict a trait or outcome against those criteria

17
Q

define criterion referenced assessment

A

client performance is assessed against a set of predetermined standards

18
Q

define norm referenced assessment

A

client performance is assessed relative to the performance of others in a comparison (norm) group

19
Q

pros of criterion referenced assessments

A
  • sets minimum performance expectations
  • demonstrates what clients can and cannot do

20
Q

cons of criterion referenced assessments

A
  • hard to know where to set boundary conditions
  • lack of comparison data

21
Q

define norm referenced assessments

A

Based upon the assumption of a standard normal (Gaussian) distribution with n > 30.

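The norm-referencing idea above can be illustrated with a short sketch: a client’s raw score is standardised against normative data and converted to a percentile. This is a minimal Python example; the norm sample and function names (`z_score`, `percentile_rank`) are invented for illustration.

```python
import math
import statistics

def z_score(raw, norm_mean, norm_sd):
    """Standardise a raw score against the normative mean and SD."""
    return (raw - norm_mean) / norm_sd

def percentile_rank(z):
    """Percentile rank of a z-score under a standard normal curve."""
    return 100 * 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Invented normative sample (a real norm group would need n > 30).
norms = [48, 52, 50, 47, 53, 49, 51, 50, 46, 54]
mu, sd = statistics.mean(norms), statistics.pstdev(norms)

z = z_score(55, mu, sd)    # distance above the norm mean, in SD units
pct = percentile_rank(z)   # share of the norm group scoring below
```

A client scoring 55 here sits about two standard deviations above the norm mean, i.e. near the top of the normative group.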
22
Q

pros of norm referenced assessments

A
  • ensures a spread of scores
  • shows client performance relative to the group

23
Q

cons of norm referenced assessments

A
  • in a strong group, some clients are still guaranteed a failing grade (an F)
  • above-average performance is not necessarily good

24
Q

define reliability

A

The reproducibility of test results on more than one occasion by the same researcher using a measure.

reliability coefficients range from 0 to 1

25
Q

define random error

A

errors that cannot be predicted

26
Q

define systematic error

A

errors that have predictable fluctuations
27
Q

list the types of reliability

A

Intra-rater reliability
Inter-rater reliability
Test-retest reliability / temporal stability
Alternate form reliability
Split half reliability
Internal consistency
28
Q

intra-rater reliability

A

The stability of data collected by one rater on two or more occasions

29
Q

inter-rater reliability

A

Detecting variability between 2 raters who measure the same client
30
Q

test-retest reliability

A

The reliability/stability of measurements when the same test is given to the same people over time
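Test-retest reliability is often quantified with a correlation coefficient between the two administrations (an intraclass correlation is frequently preferred in practice). A minimal Python sketch, with invented scores for six clients:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two paired lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented scores for the same six clients at time 1 and time 2.
time1 = [12, 15, 9, 20, 17, 11]
time2 = [13, 14, 10, 19, 18, 10]
r = pearson_r(time1, time2)   # close to 1 = stable over time
```

The closer `r` is to 1, the more stable the measurements over time.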
31
Q

alternate form reliability

A

the degree of correlation between two different, but equivalent, forms of the same test completed by the same group of people

32
Q

split half reliability

A

the degree of correlation between one half of the items of a test and the other half (e.g., odd-numbered items correlated with the even-numbered items)
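The odd/even split just described can be sketched directly; the half-test correlation is conventionally stepped up with the Spearman-Brown correction to estimate reliability of the full-length test. A minimal Python sketch with invented item data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two paired lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_scores_by_person):
    """Correlate odd-item totals with even-item totals, then apply the
    Spearman-Brown correction to estimate full-test reliability."""
    odd = [sum(items[0::2]) for items in item_scores_by_person]
    even = [sum(items[1::2]) for items in item_scores_by_person]
    r_half = pearson_r(odd, even)
    return (2 * r_half) / (1 + r_half)

# Invented item scores: rows = people, columns = the six test items.
data = [
    [3, 4, 3, 5, 4, 4],
    [2, 2, 1, 2, 2, 3],
    [5, 4, 5, 5, 4, 5],
    [1, 2, 2, 1, 1, 2],
]
rel = split_half_reliability(data)
```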
33
Q

internal consistency

A

the degree of agreement between the items in a test that measure a construct
34
Q

Cronbach’s coefficient alpha

A

used to assess internal consistency; estimates the reliability of scales, or the commonality of one item in a test with the other items; values typically range from 0 to 1
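The formula behind Cronbach’s alpha compares the sum of the item variances with the variance of the total scores. A minimal Python sketch; the response data are invented:

```python
import statistics

def cronbach_alpha(item_scores_by_person):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = len(item_scores_by_person[0])            # number of items
    columns = list(zip(*item_scores_by_person))  # one column per item
    item_var_sum = sum(statistics.pvariance(col) for col in columns)
    totals = [sum(row) for row in item_scores_by_person]
    return (k / (k - 1)) * (1 - item_var_sum / statistics.pvariance(totals))

# Invented responses: rows = people, columns = items on one scale.
data = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [1, 2, 1],
    [3, 4, 3],
]
alpha = cronbach_alpha(data)
```

Items that rise and fall together across people inflate the total-score variance relative to the item variances, pushing alpha toward 1.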
35
Q

kappa (κ)

A

used in assessments yielding multiple nominal categories, since it corrects for chance agreement

36
Q

weighted kappa

A

used to determine the reliability of a test when ratings are on an ordinal scale
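Kappa’s chance correction can be seen in a small sketch: observed agreement between two raters, minus the agreement expected by chance from each rater’s category frequencies. A minimal Python example of (unweighted) Cohen’s kappa with invented nominal ratings:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """kappa = (p_observed - p_expected) / (1 - p_expected)."""
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    categories = set(rater1) | set(rater2)
    p_exp = sum((c1[c] / n) * (c2[c] / n) for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)

# Invented nominal ratings by two raters of the same ten clients.
r1 = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "yes", "no"]
r2 = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "yes", "no"]
kappa = cohens_kappa(r1, r2)
```

Here the raters agree on 8 of 10 clients (0.8), but with balanced yes/no frequencies chance alone would produce 0.5 agreement, so kappa is 0.6, well below the raw agreement.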
37
Q

validity

A

the extent to which a test measures what it purports to measure

38
Q

construct validity

A

Establishes whether an assessment measures a construct and its theoretical components

39
Q

what are the 3 parts of construct validity

A

1. describe the constructs that account for test performance
2. compose hypotheses that explain the relationships
3. test the hypotheses
40
Q

list the 4 subtypes of construct validity

A

- convergent
- divergent
- discriminant
- factor analysis

41
Q

convergent validity

A

Level of agreement between 2 tests that are being used to measure the same construct

42
Q

divergent validity

A

Distinguishing the construct from confounding factors
43
Q

discriminant validity

A

The level of disagreement (low correlation) between two tests that measure different traits

44
Q

factor analysis validity

A

statistical procedure used to determine whether test items group together to measure a discrete construct or variable

45
Q

content validity

A

The extent to which a measurement reflects a specific content domain
46
Q

criterion validity

A

Implies an outcome can be used as a substitute for a ‘gold standard’ criterion test

47
Q

what are the 2 subtypes of criterion validity

A

1. concurrent/congruent validity (the degree to which results agree with those of other measures)
2. predictive validity (the extent to which a measure can forecast a future outcome)

48
Q

face validity

A

A test appears to measure what its author intended it to measure
49
Q

ecological validity

A

The outcome of an assessment holds up under real-world circumstances

50
Q

what are the 2 types of experimental validity

A

- internal
- external

51
Q

sensitivity

A

Ability of a test to detect genuine changes in a client’s clinical condition or ability
52
Q

specificity

A

A test’s ability to obtain a negative result when the condition is really absent (a true negative)
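In the diagnostic sense, specificity pairs with sensitivity as the true-positive rate (the test returns a positive result when the condition is really present). Both fall out of a 2x2 confusion matrix; the screening counts below are invented for illustration:

```python
def sensitivity(tp, fn):
    """True-positive rate: proportion of actual cases the test detects."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: proportion of non-cases the test clears."""
    return tn / (tn + fp)

# Invented screening results: 80 true positives, 20 false negatives,
# 90 true negatives, 10 false positives.
sens = sensitivity(80, 20)   # 80 / (80 + 20) = 0.8
spec = specificity(90, 10)   # 90 / (90 + 10) = 0.9
```

A test can trade one for the other: loosening the cut-off catches more true cases (higher sensitivity) at the cost of more false positives (lower specificity).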
53
Q

responsiveness

A

providing evidence of the ability of a measure to assess and quantify clinically important change
54
Q

nominal measurement

A

data are assigned to named categories with no inherent order; with only two response options (e.g., male/female, yes/no, wet/dry, happy/sad) the scale is dichotomous

55
Q

ordinal measurement

A

data have some order, with one score being better/worse than another

56
Q

interval scales

A

the differences between any two adjacent scores are equal (e.g., temperature in degrees Celsius); parametric statistics can be used appropriately