Lecture 8 - Assessment Flashcards

Fitting the person to the job

1
Q

when do we do assessments

A

When we attempt to select someone for a job, we are trying to predict if that person will be able to perform the requisite tasks to the requisite standard. This will also be the case if we are considering a person for promotion to a more responsible job.

2
Q

why is it important to assess people

A

Assessing people correctly is important, for example, in determining whether the person will behave safely in a safety-critical job – but also to prevent stress at work by helping to ensure that people are suited to the jobs they do.

3
Q

what should we do before assessing someone

A

Before we can decide upon an assessment method, we must establish our requirements using job and competency analysis techniques. When we know what the person will be expected to do, and what type of person (in terms of behaviours) we are looking for, we can then assess candidates.

4
Q

what must the assessment methods be

A

The assessment method(s) chosen must be valid and reliable, i.e. they should measure what they are supposed to measure - and in a consistent manner.
To ensure this, it is important that potential assessment methods are validated against requirements – usually by criterion-related validity studies (CRV) - after performance criteria have been established.

5
Q

how do we ensure that assessment methods are valid and reliable

A

To ensure this, it is important that potential assessment methods are validated against requirements – usually by criterion-related validity studies (CRV) - after performance criteria have been established.

6
Q

why are Job Analysis and Competency Analysis completed

A

To determine the type of person we are looking for

7
Q

what is job analysis

A

Job-oriented analysis looking for the nature/content of the job itself.

8
Q

name some methods of job analysis

A

a) Observation: by an analyst. Could be watching and noting, or even participating. Possibility of the Hawthorne Effect. Also looking at any available documentation. Problems with mental work.
b) Interviews: with job holders, supervisors or peers. Such interviews can be rather specific, such as the Critical Incidents technique of Flanagan (1954), which asks for examples of job behaviour that characterise good or bad performance. Brief job analysis can be done by asking subject matter experts (SMEs) to list and rate tasks.
c) Diaries/Logbooks: kept by the job incumbent, structured under headings.
d) Structured Questionnaires (see next card).

9
Q

what type of structured questionnaires can be used for job analysis

A

• Functional Job Analysis (Fine and Wiley, 1974) uses analyses of constituent tasks in terms of action sequences and comes up with the percentage of the job directed towards ‘data’, ‘people’ and other aspects. Aids observation (direct, video, etc.).
• Position Analysis Questionnaire (PAQ) (McCormick et al., 1972) has 187 job elements in 6 categories:
- information input: where and how is the information gained that is needed to perform the job?
- mediation processes: what reasoning, decision-making, planning and information-processing activities are needed?
- work output: physical activities, tools and devices
- relationships with other persons
- job context: the physical and social context
- other factors, e.g. job structure, hours of work, etc.
• Work Profiling System (WPS) and Job Components Inventory (JCI), etc.

10
Q

describe a functional job analysis

A

Functional Job Analysis (Fine and Wiley, 1974) analyses constituent tasks in terms of action sequences and comes up with the percentage of the job directed towards ‘data’, ‘people’ and other aspects. It aids observation (direct, video, etc.).

11
Q

who created the functional job analysis

A

Fine and Wiley (1974)

12
Q

who created the position analysis questionnaire

A

McCormick et al. (1972)

13
Q

describe the position analysis questionnaire

A

The Position Analysis Questionnaire (PAQ) (McCormick et al., 1972) has 187 job elements in 6 categories:
- information input: where and how is the information gained that is needed to perform the job?
- mediation processes: what reasoning, decision-making, planning and information-processing activities are needed?
- work output: physical activities, tools and devices
- relationships with other persons
- job context: the physical and social context
- other factors, e.g. job structure, hours of work, etc.

14
Q

what is a competency analysis

A

Person-oriented analysis looking for competencies

15
Q

what does a competency analysis involve

A

‘…the behavioural indicators (including knowledge, skills and attitudes) needed to perform job tasks with competence…’
e.g. for the ‘Ideal’ Air Traffic Controller (Pearn & Kandola, 1983):
Foresight, judgement, application of professional/technical knowledge, and reliability under pressure, for ‘outstanding’ controllers.
Should be able to absorb information simultaneously from multiple sources.
Should be able to project forward and to constantly adjust the whole picture, within a context of time pressure, distractions and noise.
Also convergent, concrete thinking, with self-control, teamwork, decisiveness and conscientiousness.
The key seems to be emotional stability, with its emotional detachment, self-assurance and absence of tension, but the latter two must not relate to complacency.

16
Q

what type of skills/competencies should graduates have

A
Analytical Thinking/Judgement/Decision Making
Drive and Decisiveness
Teamworking
Planning
Communicating
Creative Thinking
Motivating Others
Change Tolerance
17
Q

what is a person specification and when should it occur

A

After analysis, we need to specify the type of person required for successful job performance.
This is straightforward when the analysis is at a sufficient level of detail to define competencies (CA), as in the ATC example above.
Similarly, when required skills and abilities are defined, such as manual dexterity or visual acuity, the person specification is clear.

18
Q

when does person specification become problematic

A

person specification is more difficult if such detail is not available from the analysis. Inferences are required when factors such as those relating to general intelligence and personality characteristics are identified, e.g. ‘should be motivated’, ‘should get on with other people’, etc.
The better the level of detail acquired in both job and especially competency analyses, the more we can be sure that our choice of assessment method will be appropriate.

19
Q

what improves person specification

A

The better the level of detail acquired in both job and especially competency analyses, the more we can be sure that our choice of assessment method will be appropriate.

20
Q

why do we establish performance criteria

A

It is not sufficient simply to determine what the job entails. In order to carry out ‘criterion-related validity’ studies on our assessment methods, it is necessary to establish levels of acceptable performance for the tasks involved.
This may come from supervisor ratings of performance, simple output measures, and so on, after recruitment has taken place.

21
Q

give some examples of assessment methods

A

Interviews, Psychometric Tests, and Work Sample Tests.

22
Q

what is validity

A

validity in general refers to ‘the extent to which a test or other measuring technique measures what it sets out to measure’

23
Q

what is criterion-related validity

A

Refers to the strength of the relationship between a predictor (assessment method) and some criterion (or criteria) of work performance. It can be predictive or concurrent.

If a significant correlation coefficient (the validity coefficient) is found, we could reasonably use that assessment method for future applicants as a basis for acceptance or rejection.
The latter is done by deciding on ‘cut-off’ scores on the predictor.
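
As a rough illustration of the idea above, the Python sketch below computes a validity coefficient as the Pearson correlation between predictor scores and later performance ratings, then applies a cut-off score; all numbers, and the cut-off itself, are made up for illustration and are not from the lecture.

```python
# Minimal sketch of a criterion-related (predictive) validity check.
# All scores below are made-up illustrative numbers, not real study data.
from statistics import correlation  # Pearson's r; available from Python 3.10

# Predictor: selection test scores gathered at recruitment.
test_scores = [42, 55, 61, 48, 70, 66, 53, 59, 75, 45]
# Criterion: supervisor performance ratings collected later (predictive design).
performance = [3.1, 3.8, 4.2, 3.0, 4.6, 4.1, 3.5, 3.9, 4.8, 2.9]

# The validity coefficient is the correlation between predictor and criterion.
r = correlation(test_scores, performance)
print(f"Validity coefficient r = {r:.2f}")

# If r is significant (significance testing omitted here), a cut-off score on
# the predictor can be used to accept or reject future applicants.
cut_off = 55  # illustrative cut-off
for score in (47, 58, 63):  # hypothetical future applicants
    print(score, "accept" if score >= cut_off else "reject")
```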

24
Q

what should be done if rejection rates differ between groups

A

If there is a large difference in rejection rates between different groups, we need to look more closely at the relationship between test scores and job performance. If low predictor scores are not reflected in low job performance, this shows that, for that particular group, the test has no validity.
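
A small sketch of this check, with invented group labels and scores: the validity coefficient is computed separately for each group, and a near-zero value for one group would indicate that the test lacks validity for that group.

```python
# Sketch of checking whether a predictor is valid for each group separately.
# Group labels and all numbers are hypothetical, for illustration only.
from statistics import correlation  # Python 3.10+

groups = {
    "group A": {"test": [40, 52, 61, 48, 70, 66], "perf": [2.9, 3.6, 4.1, 3.2, 4.5, 4.2]},
    "group B": {"test": [38, 50, 63, 47, 72, 65], "perf": [3.8, 3.1, 3.6, 4.0, 3.3, 3.7]},
}

for name, data in groups.items():
    # A near-zero r means low test scores are not reflected in low performance,
    # i.e. the test has little or no validity for that group.
    r = correlation(data["test"], data["perf"])
    print(f"{name}: validity coefficient r = {r:.2f}")
```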

25
Q

what is validity generalisation

A

This is the ‘transportability’ of validity coefficients to similar groups, e.g. similar jobs elsewhere, thus precluding the need for further CRV (predictive or concurrent validity) exercises. This is done where samples are not large enough to carry out a validity study. Its worth, though, is the subject of much current debate, e.g. situational variables such as stress differ from one work environment to another.

26
Q

what are 2 other types of validity used to evaluate assessment methods

A

content validity

face validity

27
Q

what is face validity

A

Face validity refers to the subjects’ perceived relevance of the measure.

28
Q

what is content validity

A

the representativeness of a measure, e.g. topics covered in the test in relation to the job

29
Q

what is reliability

A

the consistency with which a measure provides results, i.e. its stability and freedom from random variation

30
Q

what is inter-rater reliability

A

the degree of agreement between 2 or more raters
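
As a small illustration (not from the lecture), inter-rater reliability can be indexed by correlating two raters’ scores for the same candidates; the ratings below are invented, and other agreement indices (e.g. Cohen’s kappa for categorical decisions) could equally be used.

```python
# Minimal sketch of an inter-rater reliability check: two raters score the
# same set of candidates. Ratings are made-up illustrative numbers.
from statistics import correlation  # Python 3.10+

rater_1 = [4, 3, 5, 2, 4, 3, 5, 1]  # rater 1's rating of each candidate
rater_2 = [4, 2, 5, 3, 4, 3, 4, 2]  # rater 2's ratings of the same candidates

# One simple index of agreement: the correlation between the two sets of ratings.
print(f"Inter-rater reliability (Pearson r) = {correlation(rater_1, rater_2):.2f}")
```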

31
Q

what is intra-rater reliability

A

how far a single rater is consistent across time.

32
Q

what is a psychometric test

A

‘a standardised sample of behaviour which can be described by a numerical scale or category system’ - Cronbach (1984).

objective rather than subjective via standardisation (of test conditions, instructions, time, content, scoring, and interpretation)

33
Q

what is involved in using psychometric tests

A

Administer, score, and (more controversially) interpret tests. Computer-based test interpretation (CBTI) uses expert systems with decision rules and interpretative strategies; a report is produced from the linking of preset phrases and comments.

34
Q

for ethical reasons, what do you need to carry out psychometric tests

A

BPS Level A (Cognitive) and Level B (Personality)

35
Q

name some examples of psychometric tests

A

• Wechsler Adult Intelligence Scales (WAIS)
• Myers-Briggs Type Indicator (MBTI)
• Occupational Personality Questionnaire (OPQ)
• NEO PI-R five factor model (FFM) (Costa and McCrae, 1990) - OCEAN

36
Q

Are psychometric tests and inventories reliable

A

All these tests have been subjected to CRV studies for a range of occupations. Results vary for particular traits/tests/occupations, but this is only to be expected. A test can only be valid if its use is based on a thorough job analysis.
Once reliable, always reliable.

37
Q

are psychometric tests and inventories valid?

A

Validity depends on appropriate use, e.g. little validity if a communication skills test were used to predict perceptual-motor performance – even if the test were valid when used appropriately (i.e. to assess communication skills!)

38
Q

name some problematic issues with psychometric tests and inventories

A

i) culture fairness of items;
ii) definition of intelligence and personality;
iii) social desirability/faking/impression management/self deception.

39
Q

what is the aim of an interviewer

A

The aim is to gather data, evaluate it, and select or reject - the interviewer is an ‘intuitive regression equation’ who must combine and weigh data to come to a decision. How well this is done determines the validity of the interview. The interviewer’s level of training and experience, and their biases, are important. The latter may be due to a multitude of personal factors such as age, sex, race, intelligence and personality factors, as well as preferences, stereotypes held and ‘set’.

40
Q

what influences the behaviour of interviewees

A

Their behaviour is, again, influenced by a whole host of personal factors.

41
Q

what is interpersonal communication

A

‘A conversation with a purpose.’ The behaviour of the interviewer can affect the responses of the interviewee, e.g. agreeing, paraphrasing, silence or disagreement. The behaviour of both is influenced by the self-perception of each party, which may be affected by the other’s presence, and so on. It is not possible, though, to have a ‘standard’ interviewer.

42
Q

what questions arise when deciding to do group or individual interviews

A

Should the interviewers assess candidates together or separately? Is validity improved if a team decision is made rather than two separate individual decisions?

43
Q

Are interviews valid

A

Very little predictive power for interviews (Mayfield, 1964; Arvey and Campion, 1982)

but thousands of studies showing a whole range of validities.

44
Q

are interviews reliable

A

Studies have shown poor inter-rater reliability, i.e. the degree of agreement between two or more interviewers of the same interviewee.

Even poor intra-rater reliability. The usual range of biases in social perception apply when the ‘traditional interview’ is used.

45
Q

how can interviews be improved

A

Training of interviewers

Situational interviews

Interview structure, e.g. a 5- or 7-point plan

46
Q

what factors are on the 5 point plan for improving interview structure

A

1) Impact on Others
2) Qualifications
3) Experience
4) Motivation
5) Adjustment.

47
Q

what factors are on the 7 point plan for improving interview structure

A

1) Physical makeup
2) Attainments
3) Intelligence
4) Aptitudes
5) Interests
6) Disposition
7) Circumstances.

48
Q

what are situational interviews and how can they improve interviews

A

Use job analyses to provide job-related incidents for job-related questions. ‘What would you do if….?’
For example, the type of questions asked of graduates based on the requirements/competencies discussed in our seminar.

49
Q

how can training of interviewers improve interviews

A

To use job-related questions in a structured manner. Overcoming possible biases in social perception.

50
Q

what is a work sample test

A

Any assessment which involves the assessee being asked to perform a behaviour/task/activity which is the same as/similar to/representative of those performed in the real work role.

51
Q

name some types of work sample tests

A

• Psychomotor
• Job knowledge tests
• Individual/situational decision making, e.g. in-tray
• Group discussions/decision making - for assessing managerial potential/teamwork/influencing others/clarity of thinking under stress, etc. Indoor/outdoor exercises/leaderless/with leader, etc.
Observed and Rated.

52
Q

what are the 2 ways of completing work sample tests

A

observed and rated

53
Q

are work sample tests reliable or valid

A

If designed appropriately, based on job and person analyses, reliability and validity are good.

54
Q

what is the aim of assessment centres

A

Seek to account for ‘all the variance’ by using multiple methods of assessment with multiple assessors over several days with groups of candidates.

55
Q

what is synthetic validity (Assessment centres)

A

‘…the inferring of validity in a specific situation from a logical analysis of jobs into their elements, a determination of test validity for these elements, and a combination of elemental validities into a whole’

56
Q

are assessment centres reliable or valid

A

Reliability and validity depend on those of the assessment methods used and the ‘combination’ thereafter.