3.2 Measurement Flashcards

1
Q

What is a predictor?

A

Definition: A variable used to forecast employee performance.
Example: Previous work experience in years, scores on a job-specific skills assessment, and results from a personality test. These predictors are selected because they are believed to have a relationship with how well an employee will perform in their job.

2
Q

What is a criterion?

A

Definition: The outcome variable that the model aims to predict.
Example: Employee performance rating, which could be a composite score based on factors like productivity, quality of work, teamwork, and adherence to deadlines. This is the variable the employer is interested in predicting to make informed hiring decisions.

3
Q

What is a construct?

A

Definition: An abstract concept or theoretical idea that is relevant in the employment context.
Example: Intelligence/Job competency, which is a construct that encompasses various skills and attributes necessary to perform a job effectively. This construct is not directly observable but is inferred from various indicators like performance in tasks, communication skills, and problem-solving abilities.

4
Q

What is a measure?

A

Definition: Tools or methods used to assess a construct or predictor.
Example: For measuring the construct of job competency, an employer might use a combination of measures like structured interviews, job simulations, and reference checks. To assess the predictor of job-specific skills, a standardized skills assessment test might be used.

5
Q

What are the 5 criteria for evaluating effective staffing tools?

A

VULGR:
- Validity
- Utility
- Legality
- Generalizability
- Reliability

6
Q

(T/F) Reliability is necessary for validity but not sufficient

A

T

7
Q

What is reliability?

A

The extent to which a measure is free from random error.
- Consistent responses.
- The upper limit of the correlation coefficient (r; a standardized measure of association) between two measures is the square root of the product of each measure's reliability.

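A minimal numeric sketch of that upper limit; the reliability values below are assumed purely for illustration:

```python
import math

# Assumed (illustrative) reliabilities of the two measures
r_xx = 0.80  # reliability of the selection test (predictor measure)
r_yy = 0.70  # reliability of the performance rating (criterion measure)

# Classical-test-theory ceiling: the observed correlation between the two
# measures cannot exceed the square root of the product of their reliabilities.
max_observed_r = math.sqrt(r_xx * r_yy)
print(f"Maximum observable correlation: {max_observed_r:.2f}")  # ~0.75
```
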
8
Q

What is validity?

A

The extent to which performance on the selection device/test is associated with performance on the job (i.e., the test measures what we want it to measure).

9
Q

What are two ways validity is reduced?

A
• Deficient – the selection device does not measure all the important aspects of the job.
• Contaminated – the selection device measures aspects that are irrelevant to the job.
10
Q

What are the 3 ways of measuring validity according to the Uniform Guidelines on Employee Selection Procedures?

A

Criterion-related, Content-related, Construct-related

11
Q

What is criterion-related validity?

A

The most important type; it reflects the relationship between scores on the selection device and scores on the criterion.
- The correlation between the two sets of scores is assessed (the correlation/validity coefficient).
- Range = -1.00 to +1.00
- Effect size = strength of the relationship:
  - r = .5 is strong, .3 is moderate, .1 is weak.
  - d = .8 is strong, .5 is moderate, .2 is weak.
- Direction = whether the relationship is positive or negative.

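A minimal sketch of computing a criterion-related validity coefficient; the test scores and performance ratings below are invented for illustration:

```python
import numpy as np

# Hypothetical selection-test scores and later performance ratings for 8 hires
test_scores  = np.array([62, 75, 80, 55, 90, 70, 85, 60])
perf_ratings = np.array([3.1, 3.8, 4.0, 2.9, 4.5, 3.5, 4.2, 3.0])

# The validity coefficient is the Pearson correlation between predictor and criterion
r_xy = np.corrcoef(test_scores, perf_ratings)[0, 1]
print(f"Validity coefficient r = {r_xy:.2f}")  # judge against the r = .1/.3/.5 benchmarks
```
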
12
Q

What is content-related validity?

A

Involves using expert opinions to establish that the items, questions, or tasks used in a selection tool are representative of the kinds of situations, problems, and tasks that occur on the job.
- When developing measures: look at the job analysis, compare current/proposed measures to the KSAOs, and develop measures relevant to the job components.
- Ex. A final exam that covers all aspects of the course equally.

13
Q

What is predictive validation?

A

Uses the test scores of all applicants and looks for a relationship between those scores and the future job performance of the applicants who are hired.

14
Q

What is concurrent validation?

A

Consists of administering a test to people who currently hold the job and then comparing their scores to existing measures of their job performance.

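Both designs reduce to correlating test scores with a criterion; what differs is who takes the test and when the criterion is collected. A rough sketch using invented numbers:

```python
import numpy as np

def validity_coefficient(scores, criterion):
    """Pearson correlation between predictor scores and criterion scores."""
    return np.corrcoef(scores, criterion)[0, 1]

# Predictive validation (hypothetical): test applicants at hiring time, then
# correlate their scores with performance measured months later.
applicant_scores  = np.array([70, 55, 88, 62, 95, 73])
later_performance = np.array([3.4, 2.8, 4.3, 3.0, 4.6, 3.6])

# Concurrent validation (hypothetical): test current employees now and
# correlate their scores with their existing performance ratings.
incumbent_scores    = np.array([68, 82, 59, 91, 75, 64])
current_performance = np.array([3.3, 4.0, 2.9, 4.4, 3.7, 3.1])

print("Predictive r =", round(validity_coefficient(applicant_scores, later_performance), 2))
print("Concurrent r =", round(validity_coefficient(incumbent_scores, current_performance), 2))
```
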
15
Q

What are examples for why correlation does not equal causation?

A

- Reverse causation (violent kids choose to play violent video games)
- Third variable (poor parenting causes both)
- Reciprocal causation (A and B cause each other)
- Coincidence (drownings and Nicolas Cage films)

16
Q

What is construct-related validity?

A

Is our measure accurately reflecting what it is supposed to measure? (Consistency between a high score on the measure and a high level of the construct.)
- Ex. Coders reviewed on code volume: padding code with unnecessary script inflates the measure without reflecting the construct of coding competence.

17
Q

Summarize the differences between the 3 types of validity.

A

- Criterion-related validity focuses on the practical outcome.
- Content validity focuses on the scope and representativeness of the subject matter.
- Construct validity focuses on the theoretical accuracy of what the test is intended to measure.

18
Q

What is utility?

A

Correlations and tools need to be expressed in monetary terms, because dollar value is what companies care about.
- Utility = Benefits − Costs
- Utility = # of applicants × standard deviation of job performance × validity coefficient × mean predictor score − cost
- In symbols: U = N_a × SD_y × r_xy × M_a − C_t
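
A minimal worked example of that formula; every figure below is an assumed, illustrative value, not course data:

```python
# Assumed (illustrative) inputs, named after the card's shorthand
n_a  = 20      # N_a: number of applicants hired with the tool
sd_y = 10_000  # SD_y: standard deviation of job performance, in dollars
r_xy = 0.30    # r_xy: validity coefficient of the selection tool
m_a  = 1.0     # M_a: mean standardized predictor score of those hired
c_t  = 15_000  # C_t: total cost of developing and administering the tool

# Utility = N_a * SD_y * r_xy * M_a - C_t
utility = n_a * sd_y * r_xy * m_a - c_t
print(f"Estimated utility: ${utility:,.0f}")  # $45,000 under these assumptions
```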

19
Q

What is generalizability?

A

Can the test be used generally, or is it specific to a certain job, organization, or time?
- Ex. The "Snowflake Test" may work now, but not forever (e.g., a culture shift in feelings about police).

20
Q

What is meta-analysis?

A

Takes all the correlations found across studies and calculates a weighted average (correlations from larger samples are weighted more heavily).
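
A minimal sketch of a sample-size-weighted average, using invented study results:

```python
# Invented study results: (sample size, observed validity coefficient)
studies = [(50, 0.20), (200, 0.35), (400, 0.30), (120, 0.10)]

# Sample-size-weighted mean correlation: larger studies count for more
total_n = sum(n for n, _ in studies)
weighted_mean_r = sum(n * r for n, r in studies) / total_n
print(f"Meta-analytic mean r = {weighted_mean_r:.2f}")
```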

21
Q

What is legality?

A

All staffing methods must conform to laws and legal precedents.
- Defense to adverse-impact claims is job-relatedness (i.e., validity); the best way to show job-relatedness is through criterion-related validation.