Evaluating Impact - Creating Data Collection Tools Flashcards

1
Q

What are the key decisions that need to be made when creating data collection tools?

A
  • the purpose of the evaluation instrument
  • what will and will not be included in it
  • the type of evaluation to use
  • how to isolate the data to make valid comparisons
  • how to determine control groups
  • how to get management support and overcome barriers
2
Q

What are some tools that test knowledge, and what is important to remember about them?

A

Exams, assessments, and tests

They may not be the best way to test knowledge.

3
Q

Validity

A

The degree to which the evaluation instrument measures what it was intended to measure

Important because it ensures all learners interpret the meaning of a test question the way it was intended

4
Q

How do you verify validity?

A

Solicit feedback from a subject matter expert

5
Q

5 ways to determine whether an instrument is valid

A
  • content validity: the extent to which the instrument represents the program’s content
  • construct validity: the degree to which the instrument represents the construct it’s supposed to measure (the construct is the abstract variable that should be measured, such as knowledge or skill)
  • concurrent validity: the extent to which an instrument agrees with the results of other instruments administered at approximately the same time to measure the same characteristics
  • criterion validity: the extent to which the assessment can predict or agree with external constructs; determined by looking at the correlation between the instrument and the criterion measure (see the sketch after this list)
  • predictive validity: the extent to which an instrument can predict future behaviors or results
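For concurrent and criterion validity, the correlation mentioned above can be computed directly. A minimal sketch, assuming Python 3.10+ and hypothetical paired scores (none of these numbers come from the cards):

```python
# Hedged sketch: correlate new-instrument scores with an external criterion
# measure (e.g., supervisor ratings) for the same ten hypothetical people.
from statistics import correlation  # Pearson correlation, Python 3.10+

instrument_scores = [72, 85, 90, 60, 78, 88, 95, 70, 82, 67]
criterion_scores = [3.1, 4.0, 4.5, 2.8, 3.6, 4.2, 4.8, 3.0, 3.9, 2.9]

r = correlation(instrument_scores, criterion_scores)
print(f"Criterion validity coefficient (Pearson r): {r:.2f}")
```

A high positive r suggests the instrument agrees with the criterion measure; a value near zero suggests it does not.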
6
Q

Test validity boils down to 2 things

A
  • the test should be reasonably reliable and free from measurement errors
  • the test should include all of the content that is needed to perform the job safely and competently
7
Q

Reliability

A

the ability of the same measurement to produce consistent results over time

8
Q

How do you determine reliability?

A
  • the instrument must be administered to a sample of participants and undergo statistical analysis; reliability coefficients should be at or above 75% (0.75); see the sketch after this list
  • non-scientifically, reliability can be gauged and improved by refining question wording and evaluating responses over time
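One commonly used reliability coefficient is Cronbach's alpha; the cards do not name a specific statistic, so treat this as an illustrative assumption. A minimal sketch with hypothetical pilot data:

```python
# Hedged sketch: Cronbach's alpha computed from item-level pilot scores.
# The 75% threshold on the card corresponds to an alpha of 0.75.
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list per participant, one score per test item."""
    k = len(item_scores[0])                 # number of items
    items = list(zip(*item_scores))         # scores regrouped by item
    item_variance_sum = sum(pvariance(col) for col in items)
    total_variance = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_variance_sum / total_variance)

# Hypothetical pilot: 5 participants x 4 items (1 = correct, 0 = incorrect)
pilot = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
]
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f} (target: >= 0.75)")
```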
9
Q

What are the possible consequences if a test is too easy?

A

Poor job performance, on-the-job accidents, and damage to expensive equipment

10
Q

What factor maximizes test validity and test reliability?

A

the degree of difficulty

11
Q

What could a test that seems to have a low degree of difficulty indicate?

A

Test takers already knew the information, the test was too easy, or the answers were cued in some way

12
Q

What could a test that seems to have a high degree of difficulty indicate?

A

The information wasn't presented adequately in person or in the reading materials, or the item was so difficult that only the most knowledgeable could answer it
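A common way to quantify an item's degree of difficulty is the difficulty index, the proportion of test takers who answer it correctly; the cards do not prescribe this statistic, so it is offered as a hedged illustration with made-up responses:

```python
# Hedged sketch: item difficulty index = proportion answering correctly.
# Values near 1.0 flag an item that may be too easy or cued; values near
# 0.0 flag an item that may be too hard or poorly taught.
def difficulty_index(responses):
    """responses: 1 for a correct answer, 0 for an incorrect answer."""
    return sum(responses) / len(responses)

item_responses = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]  # hypothetical data
print(f"Difficulty index: {difficulty_index(item_responses):.2f}")  # 0.80
```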

13
Q

What are two ways to test reliability?

A

Split-half reliability, test-retest check of reliability

14
Q

Split-half reliability

A

The test is split into two halves. The group takes one half, then takes the other half as its retest, and the correlation between results on the two halves is calculated.
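A minimal sketch of the calculation, assuming the two half-test scores come from the same group and applying the Spearman-Brown adjustment (a standard psychometric correction the card does not mention) to estimate full-test reliability; all scores are hypothetical:

```python
# Hedged sketch: split-half reliability with a Spearman-Brown adjustment.
from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical scores for the same eight people on each half of the test
first_half = [8, 6, 9, 5, 7, 10, 6, 8]
second_half = [7, 6, 9, 4, 8, 9, 5, 8]

r_half = correlation(first_half, second_half)
r_full = (2 * r_half) / (1 + r_half)  # Spearman-Brown: full-length estimate
print(f"Half-test r: {r_half:.2f}, estimated full-test reliability: {r_full:.2f}")
```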

15
Q

Test-retest check of reliability

A

The same test is administered twice to the same group of people

16
Q

Shortcoming of test-retest check of reliability

A

Memory bias: if the time between tests is too short, people may simply remember the answers.

If the time between tests is too long, other factors enter the equation, such as learning new information.

17
Q

When would a TD professional use an instrument other than an exam? What would they use?

A

When the evaluation is tied to a needs assessment

They may use surveys, questionnaires, observations, and interviews

18
Q

What 4 things should you remember when creating surveys, questionnaires, and interview guide instruments?

A
  • Be certain that the questions are directly connected to the measurement plan.
  • Determine whether any definitions or other standards exist that need to be clarified.
  • Decide whether reading ability or a second language is a concern.
  • Explore whether to use a pilot test on the instrument.
19
Q

Advantages and disadvantages of assessments and tests

A

Advantages:
- they’re objective
- identify the specific gap between the current and desired performance, knowledge, and skills

Disadvantages:
- they can be time-consuming
- don’t always examine the thought processes that support why an individual performed in a certain way

20
Q

Advantages and disadvantages of observation

A

Advantages:
- they are great for measuring skill ability (Evaluating Impact, not Consulting)
- it may be used to create a step-by-step procedure (algorithm) that can be standardized for all learners as a flowchart, diagram, graphic, list of steps, or job aid
- can capture job environment conditions that make a difference and can be incorporated into the solution (Consulting, not Evaluating Impact)
- Provides realistic view of situation

Disadvantages:
- some performers may not act as they normally would because they know they are being watched (known as the Hawthorne effect)
- only indicates behavior, not reason for the behavior
- can be difficult to identify when one portion ends and the next begins (Consulting, not Evaluating Impact)

21
Q

Advantages and disadvantages of interviews

A

Advantages:
- clarify ambiguous or confusing information obtained otherwise
- gives participants ownership in the process
- they provide rich detail through a two-way conversation to clarify statements or ensure understanding of questions
- in-depth information
- use the same wording and interview protocol for consistency
- can be used to design quantitative tools

Disadvantages:
- can be time consuming and labor intensive
- the interviewer must be careful to record exact responses and not interpret them
- interviewer must represent the target population

22
Q

Advantages and disadvantages of focus groups

A

Advantages:
- it can create richer concepts as individuals build on one another’s ideas
- able to observe nonverbal behavior (Consulting, not Evaluating Impact)
- able to interview more people in a shorter amount of time

Disadvantages:
- the group can be influenced by particularly verbal individuals, which gives the impression of unanimity when it is not the case
- time and resource intensive

23
Q

Advantages and disadvantages of surveys

A

Advantages:
- inexpensive
- the results are easy to tally
- they provide quick results
- they can reach a lot of people at a distance (participation is easy)
- can be qualitative and quantitative

Disadvantages:
- return rates can often be low, thus not giving a good sample
- they need to be worded so that everyone understands them
- questions don’t allow for free expression
- constructing questions and selecting scale must be done carefully

24
Q

What is the best survey type?

A

The one that provides the needed data, not the one that's the fastest, cheapest, or easiest

25
Q

Why do TD professionals use work samples?

A

To identify problem areas that may require further analysis

They may also be used to supplement other assessment methods, validate other data, or gather preliminary information for a study

26
Q

Advantages and disadvantages of work samples

A

Advantages:
- can be unobtrusive
- provides direct data on actual work

Disadvantages:
- TD professional may need specialized content knowledge
- gathering and grading work samples can intimidate employees

27
Q

Extant or archival data

A

Existing records, reports, and data that may be available inside or outside the organization

Examples: job descriptions, competency models, benchmarking reports, annual reports, financial statements, strategic plans, mission statements, staffing statistics, climate surveys, grievances, turnover rates, absenteeism, suggestion box feedback, and accident statistics

28
Q

Why would a TD professional use extant data?

A

for organizational needs analysis and current performance analysis

29
Q

Advantages and disadvantages of extant data

A

Advantages:
- provides exact or reliable numbers (hard data) for consistency
- can enable an examination of trends and patterns over time (see the sketch at the end of this card)

Disadvantages:
- usually collected for other purposes, so performance issues may need to be inferred from patterns in the data
- TD professionals cannot control methodology, so it might be mixed with extraneous data
- may not be exactly the raw data that is required
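As a hedged illustration of spotting a trend in extant data, the sketch below tallies hypothetical absenteeism records by month; the field names and dates are invented, not drawn from any real system:

```python
# Hedged sketch: monthly absenteeism counts from existing (extant) HR records.
from collections import Counter

absence_records = [  # hypothetical extant data pulled from HR reports
    {"employee": "A", "date": "2023-01-12"},
    {"employee": "B", "date": "2023-01-20"},
    {"employee": "A", "date": "2023-02-03"},
    {"employee": "C", "date": "2023-02-17"},
    {"employee": "B", "date": "2023-02-24"},
    {"employee": "C", "date": "2023-03-09"},
]

by_month = Counter(record["date"][:7] for record in absence_records)
for month in sorted(by_month):
    print(month, by_month[month])  # 2023-01 2, 2023-02 3, 2023-03 1
```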

30
Q

What are 5 questions to ask when selecting data collection methods?

A
  • What are the goals or expectations of the employees being studied? How do they relate to organizational goals?
  • How will the organizational climate affect the need for anonymity, survey return rates, or the willingness of employees to be interviewed?
  • Who will administer, score, and interpret the results?
  • Does the use of the tools require special knowledge?
  • Is the scoring objective?
31
Q

3 best practices for OTJ checklists used for observations?

A

Include room for additional comments.
Clearly define each behavior.
Train observers to avoid interpretations.

32
Q

What are 5 best practices for interviews?

A

Determine the specific information that is needed.
Pilot test the interview.
Train the interviewers.
Ensure the interviewers can clearly explain the instructions to those being interviewed.
Plan a statement about anonymity and how the results will be used.

33
Q

Structure of an interview

A

Begin with consistently worded questions.
The interviewer can then probe for additional information.

34
Q

What are best practices for surveys or questionnaires? (8)

A

Keep the survey or questionnaire as short as possible.
Identify ways to obtain a high return rate.
Be sure the instructions are clear.
Select the question type that best meets the purpose (for example, multiple choice, multiple answer, ranking preferences, open-ended, or scaled).
Decide if anonymity is required and how it will be addressed.
Avoid leading questions.
Use simple language.
Avoid asking more than one question at a time.

35
Q

What are two factors that could affect extant data?

A

Other variables, such as equipment downtime or external expectations

36
Q

What are some decisions that must be made about evaluation instruments?

the _________ the tool will serve
the _________ or _________ that will be used to present and _________ _________
what _________ or _________ _________ will be used (_________ or a different _________ )
what _________ are needed
how _________ and _________ should be captured
the degree of _________ that the tool needs
how the tool will be _________
the _________
how the results will be _________ , _________ , and _________
how the results will be _________
how to reach a high level of _________ .

A

purpose
format or media; track results
ranking or rating scale; Likert or a different scale
demographics
comments and suggestions
flexibility
distributed
timeframe
tracked, monitored, reported
communicated
return

37
Q

What are analytics from technology platforms?

A

Tracking data that is automatically recorded, such as what was done or for how long; TD professionals can collect, compile, and analyze the data produced and make inferences

Examples: xAPI measures, customer service call times, break times, time to solution
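As a hedged illustration, the sketch below builds the kind of xAPI-style statement such a platform might record (the actor/verb/object/result structure follows the xAPI convention, but the names, course ID, and values are hypothetical):

```python
# Hedged sketch: an xAPI-style record of who did what, to which activity,
# with what result and for how long. All identifiers here are made up.
import json

statement = {
    "actor": {"name": "Jane Learner", "mbox": "mailto:jane@example.com"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
             "display": {"en-US": "completed"}},
    "object": {"id": "http://example.com/courses/safety-101"},
    "result": {"success": True,
               "score": {"scaled": 0.85},
               "duration": "PT42M"},  # ISO 8601 duration: 42 minutes
}

print(json.dumps(statement, indent=2))  # data a TD professional could compile and analyze
```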

38
Q

What are 3 types of assessments/tests?

A
  • knowledge
  • observation
  • analysis of work results
39
Q

Considerations for implementing assessments or tests

A
  • formulate questions and measurement criteria carefully to ensure accurate interpretation
  • a pilot should be conducted with a small sample first to ensure validity and reliability