Evaluating Impact - Creating Data Collection Tools Flashcards
What are the key decisions that need to be made when creating data collection tools?
- the purpose of the evaluation instrument
- what will and will not be included in it
- the type of evaluation to use
- how to isolate the data to make valid comparisons
- how to determine control groups
- how to get management support and overcome barriers
What are some tools that test knowledge, and what is important to remember about them?
Exams, assessments, and tests
They may not be the best way to test knowledge.
Validity
the evaluation instrument measures what it was intended to measure
Important because it ensures all learners interpret the meaning of a test question as intended
How do you verify validity?
Solicit feedback from a subject matter expert
5 ways to determine whether an instrument is valid
- content validity: the extent to which the instrument represents the program’s content
- construct validity: the degree to which the instrument represents the construct it’s supposed to measure (the construct is the abstract variable that should be measured, such as knowledge or skill)
- concurrent validity: the extent to which an instrument agrees with the results of other instruments administered at approximately the same time to measure the same characteristics
- criterion validity: the extent to which the assessment can predict or agree with external constructs; determined by looking at the correlation between the instrument and the criterion measure
- predictive validity: the extent to which an instrument can predict future behaviors or results
Test validity boils down to 2 things
- the test should be reasonably reliable and free from measurement errors
- the test should include all of the content that is needed to perform the job safely and competently
Reliability
the ability of the same measurement to produce consistent results over time
How do you determine reliability?
- the instrument must be administered to a sample of participants and undergo statistical analysis; should have reliability coefficients at or above 75%
- non-scientifically, it can be gauged and improved by refining question wording and evaluating responses over time
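The reliability-coefficient check above can be sketched in code. This is a minimal illustration, not the statistical procedure the card prescribes: it computes a Pearson correlation between two administrations of the same test and compares it with the 0.75 (75%) threshold mentioned above. The score lists are made-up sample data.

```python
# Compute a simple reliability coefficient (Pearson correlation) between
# two administrations of the same test and compare it with the 0.75
# threshold from the card. The score lists are made-up sample data.

def pearson_r(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

first_attempt  = [72, 85, 90, 64, 78, 88, 70, 95]
second_attempt = [75, 82, 93, 60, 80, 85, 74, 97]

r = pearson_r(first_attempt, second_attempt)
print(f"reliability coefficient: {r:.2f}")
print("meets 0.75 threshold" if r >= 0.75 else "below 0.75 threshold")
```

In practice this analysis would be run on a real participant sample with statistical software rather than hand-rolled code.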
What are the possible consequences if a test is too easy?
poor job performance, on the job accidents, damage to expensive equipment
What factor maximizes test validity and test reliability?
the degree of difficulty
What could a test that seems to have a low degree of difficulty indicate?
Testers knew the information already, the test was too easy, or the answers were cued in some way
What could a test that seems to have a high degree of difficulty indicate?
information wasn’t presented adequately in person or in reading materials or the item was so difficult only the most knowledgeable could answer it
What are two ways to test reliability?
Split-half reliability, test-retest check of reliability
Split-half reliability
The test is split into two halves (for example, odd-numbered and even-numbered items); participants take the full test, and the correlation between their scores on the two halves is calculated.
Test-retest check of reliability
The same test is administered twice to the same group of people
Shortcoming of test-retest check of reliability
Memory bias: if the interval between tests is too short, people simply remember the answers.
If the interval is too long, other factors enter the equation, such as learning new information.
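The split-half check can also be sketched in code. This sketch uses the odd/even-item variant of the split and adds a Spearman-Brown adjustment, a standard companion step that is not mentioned on the card, to estimate full-test reliability from the half-test correlation. All item scores are made-up sample data.

```python
# Split-half reliability sketch: split one test into two halves (odd vs.
# even items), correlate each person's two half-scores, then adjust with
# the Spearman-Brown formula (an assumption added here, not from the
# card) because each half is shorter than the full test.
# Rows = people, columns = items; 1 = correct, 0 = incorrect (made up).

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

item_scores = [
    [1, 1, 0, 1, 1, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0, 0, 0, 0],
    [1, 0, 1, 1, 1, 1, 0, 1],
    [1, 1, 0, 0, 1, 0, 1, 0],
]

odd_half  = [sum(row[0::2]) for row in item_scores]   # items 1, 3, 5, 7
even_half = [sum(row[1::2]) for row in item_scores]   # items 2, 4, 6, 8

r_half = pearson_r(odd_half, even_half)
full_test_r = (2 * r_half) / (1 + r_half)             # Spearman-Brown
print(f"half-test correlation: {r_half:.2f}")
print(f"estimated full-test reliability: {full_test_r:.2f}")
```

The same `pearson_r` helper also covers the test-retest check: correlate the two administrations' total scores instead of the two halves.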
When would a TD professional use an instrument other than an exam? What would they use?
When evaluation is tied to a needs assessment
They may use surveys, questionnaires, observations, and interviews
What 4 things should you remember when creating surveys, questionnaires, and interview guide instruments?
- Be certain that the questions are directly connected to the measurement plan.
- Determine whether any definitions or other standards exist that need to be clarified.
- Decide whether reading ability or a second language is a concern.
- Explore whether to use a pilot test on the instrument.
Advantages and disadvantages of assessment and tests
Advantages:
- they’re objective
- identify the specific gap between the current and desired performance, knowledge, and skills
Disadvantages:
- they can be time-consuming
- don’t always examine the thought processes that support why an individual performed in a certain way
Advantages and disadvantages of observation
Advantages:
- they are great for measuring skill ability (Evaluating Impact, not Consulting)
- it may be used to create a step-by-step procedure (algorithm) that can be standardized for all learners as a flowchart, diagram, graphic, list of steps, or job aid
- can capture job environment conditions that make a difference and can be incorporated into solution (Consulting, not Evaluating Impact)
- Provides realistic view of situation
Disadvantages:
- some performers may not act as they normally would because they know they are being watched (known as the Hawthorne effect)
- only indicates behavior, not reason for the behavior
- can be difficult to identify when one portion ends and the next begins (Consulting, not Evaluating Impact)
Advantages and disadvantages of interview
Advantages:
- clarify ambiguous or confusing information obtained otherwise
- gives participants ownership in the process
- they provide rich detail through a two-way conversation to clarify statements or ensure understanding of questions
- in-depth information
- use the same wording and interview protocol for consistency
- can be used to design quantitative tools
Disadvantages:
- can be time consuming and labor intensive
- the interviewer must be careful to record exact responses and not interpret them
- those interviewed must represent the target population
Advantages and disadvantages of focus groups
Advantages:
- it can create richer concepts as individuals build on one another’s ideas
- able to observe nonverbal behavior (Consulting, not Evaluating Impact)
- able to interview more people in a shorter amount of time
Disadvantages:
- the group can be influenced by particularly vocal individuals, which gives the impression of unanimity when that is not the case
- time and resource intensive
Advantages and disadvantages of surveys
Advantages:
- inexpensive
- the results are easy to tally
- they provide quick results
- they can reach a lot of people at a distance (participation is easy)
- can be qualitative and quantitative
Disadvantages:
- return rates can often be low, thus not giving a good sample
- they also need to be worded in a way so that everyone understands them
- questions don’t allow for free expression
- constructing questions and selecting scale must be done carefully
What is the best survey type?
The one that provides the needed data, not the one that's the fastest, cheapest, or easiest
Why do TD professionals use work samples?
To identify problem areas that may require further analysis
They may also be used to supplement other assessment methods, validate other data, or gather preliminary information for a study
Advantages and disadvantages of work samples
Advantages:
- can be unobtrusive
- provides direct data on actual work
Disadvantages:
- TD professional may need specialized content knowledge
- gathering and grading work samples can intimidate employees
Extant or archival data
Existing records, reports, and data that may be available inside or outside the organization
Examples: job descriptions, competency models, benchmarking reports, annual reports, financial statements, strategic plans, mission statements, staffing statistics, climate surveys, grievances, turnover rates, absenteeism, suggestion box feedback, and accident statistics
Why would a TD professional use extant data?
for organizational needs analysis and current performance analysis
Advantages and disadvantages of extant data
Advantages:
- provides exact or reliable numbers (hard data) for consistency
- can enable an examination of trends and patterns over time.
Disadvantages:
- usually collected for other purposes, so performance issues may need to be inferred from patterns in the data
- TD professionals cannot control methodology, so it might be mixed with extraneous data
- may not be exactly the raw data required
What are 5 questions to ask when selecting data collection methods?
- What are the goals or expectations of the employees being studied? How do they relate to organizational goals?
- How will the organizational climate affect the need for anonymity, survey return rates, or the willingness of employees to be interviewed?
- Who will administer, score, and interpret the results?
- Does the use of the tools require special knowledge?
- Is the scoring objective?
3 best practices for OTJ checklists used for observations?
Include room to add additional comments.
Clearly define each behavior.
Train observers to avoid interpretations.
What are 5 best practices for interviews?
Determine the specific information that is needed.
Pilot test the interview.
Train the interviewers.
Ensure the interviewers can clearly explain the instructions to those who are interviewed.
Plan a statement about anonymity and how the results will be used.
Structure of an interview
Begin with consistently worded questions
Can probe for additional information
What are best practices for surveys or questionnaires? (8)
Keep the survey or questionnaire as short as possible.
Identify ways to obtain a high return rate.
Be sure the instructions are clear.
Select the question type that best meets the purpose (for example, multiple choice, multiple answer, ranking preferences, open-ended, or scaled).
Decide if anonymity is required and how it will be addressed.
Avoid leading questions.
Use simple language.
Avoid asking more than one question at a time.
What are two factors that could affect extant data?
Other variables, such as equipment downtime or external expectations
What are some decisions that must be made about evaluation instruments?
the _________ the tool will serve
the _________ or _________ that will be used to present and _________ _________
what _________ or _________ _________ will be used (_________ or a different _________ )
what _________ are needed
how _________ and _________ should be captured
the degree of _________ that the tool needs
how the tool will be _________
the _________
how the results will be _________ , _________ , and _________
how the results will be _________
how to reach a high level of _________ .
purpose
format or media; track results
ranking or rating scale; Likert or a different scale
demographics
comments and suggestions
flexibility
distributed
timeframe
tracked, monitored, reported
communicated
return
What are analytics from technology platforms?
Tracking data that automatically records things such as what was done or for how long; TD professionals can collect, compile, and analyze the data produced and make inferences
Examples: xAPI measures, customer service call times, break times, time to solution
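The collect-compile-analyze step for platform analytics can be sketched as follows. The statements below are made-up sample data shaped loosely like xAPI results (the xAPI specification records durations as ISO 8601 strings such as "PT30M"); the field names and the `duration_minutes` helper are illustrative assumptions, not a real platform's API.

```python
import re

# Minimal sketch of compiling analytics from xAPI-style tracking data.
# The statements are made-up sample records; only the "verb" and the
# "result" duration (ISO 8601, e.g. "PT30M") are used here.

statements = [
    {"verb": "completed", "result": {"duration": "PT25M"}},
    {"verb": "completed", "result": {"duration": "PT40M"}},
    {"verb": "attempted", "result": {"duration": "PT5M"}},
]

def duration_minutes(iso):
    """Convert a simple ISO 8601 duration (PTnHnMnS) to minutes."""
    m = re.fullmatch(r"PT(?:(\d+)H)?(?:(\d+)M)?(?:(\d+)S)?", iso)
    hours, mins, secs = (int(g or 0) for g in m.groups())
    return hours * 60 + mins + secs / 60

completed = [duration_minutes(s["result"]["duration"])
             for s in statements if s["verb"] == "completed"]
print(f"average time to complete: {sum(completed) / len(completed):.1f} minutes")
```

A real learning platform would expose far richer statements (actor, object, context); the point is only that such records can be aggregated into measures like average time to solution.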
What are 3 types of assessments/tests?
- knowledge
- observation
- analysis of work results
Considerations for implementing assessments or tests
- formulate questions and measurement criteria carefully to ensure accurate interpretation
- a pilot should be conducted with a small sample first to ensure validity and reliability