Critical Evaluation Flashcards
Critical Evaluation
- Examining an idea, process or event with an open, objective and inquiring mind.
- Critical skill in EBDM using sound data to hypothesize, assess and select solutions
Critical Evaluation Includes
- Data advocacy
- Data gathering
- Data analysis
- EBDM
Data Advocacy
- Developing inquiring mindset, learning what data drives the business and where it can be found
- Developing partnerships across the organization to promote EBDM
- Modeling the skill of EBDM to the entire organization through the decisions HR makes and the plans of action it undertakes
Data Gathering
Knowing what is considered sufficient, credible, and objective evidence and where to find it
Data Analysis
Being able to organize data so it reveals patterns and to analyze it to detect logical relationships
EBDM
Ability to apply the results of data gathering and analysis to make better business decisions
Effective Data Advocates
- Do not show analysis just to show analysis
- Focus on making informed decisions that minimize risk and maximize opportunities
- Assist in building a data-driven culture
- Encouraging EBDM throughout the organization from the bottom up
Six Steps in Evidence-Based Decision Making
- Ask
- Acquire
- Appraise
- Aggregate
- Apply
- Assess
Ask Step in Evidence-Based Decision Making
When there is a problem, translate the problem into a question that can be answered through data gathering.
Acquire Step in Evidence-Based Decision Making
Gather data from varied sources
Appraise Step in Evidence-Based Decision Making
Determine if evidence gathered is relevant, valid, reliable, accurate, complete and unbiased
Aggregate Step in Evidence-Based Decision Making
- Combine and organize data to prepare it for analysis
- Determine the priority to be given to different types of information
Apply Step in Evidence-Based Decision Making
- See logical connections within the data and issue
- Use data to draw conclusions, develop solutions, win sponsor support for a decision and take action.
Assess Step in Evidence-Based Decision Making
Monitor the solution that has been implemented and objectively measure the extent the objectives have been attained.
Become an HR Data Advocate
- Develop a questioning mind
- Build fluency in the scientific literature of HR
- Scan resources to identify new and reliable sources of data and monitor topics that are being discussed
- Gather data on a continuous basis
- Use evidence when communicating with stakeholders
- Institutionalize the competency in the HR function
Quantitative data
Objective measurements that can be verified and used in statistical analysis
Qualitative data
- Subjective evaluation of actions, feelings, or behaviors
- Data can be assigned numeric values, but the numbers themselves do not hold significance
Qualitative data can be made by
- Third-party observers
- Self-assessments
Which type of data (qualitative or quantitative) is more important to HR professionals? How is one determined?
- Both are important
- Purpose of research usually determines the type of data collected.
Assessing Validity of Data Sources (data not in-house)
- Authority
- Bias
- Whether sources of data used in the publication are clearly cited
- Relevance of the facts
- Currency of the data
- Whether the data is being offered as proof of an argument
- Whether the argument is sound and the deductions from the data logical
Reasons for Interviews
- Useful in identifying topics that can be explored in focus groups or surveys
- Focus on specific, high-value employees and uncover targeted retention information - or engagement failures (exit interviews)
- Organizational “heroes” - people recognized and respected in the organization may add cultural perspective
Purpose of interviews
- Gives the opportunity for follow-up questions that may not be possible in survey
- Or discouraged by the size, composition, or timing of a focus group
Why are interviews typically not the sole form of gathering data?
Due to the time and labor required to conduct them.
When multiple interviewers are being used
All interviewers must be carefully trained and prepared so that all interviews are conducted in same manner without bias
Interview Advantages
- Safer, confidential environment may generate significant information.
- Comments can suggest direction for further group research (focus groups and surveys).
Interview Challenges
- Can be time-intensive.
- Requires strong relationship-building skills.
- Requires vigilance to avoid bias from influencing questions and interpretation of answers.
Effective Interviewing Includes
- Interview guide or instrument is created
- Establish a positive and trusting relationship with interviewee
Interview guide or instrument
- Should be drafted and reviewed by other team member or client
- Some straying from planned questions may be helpful, but limiting it produces consistent information, which results in more valid and more easily combined and reported data
How to establish a positive and trusting relationship with interviewee
- Time and location should be convenient for interviewee
- Plan a reasonable length for the interview - and the actual interview should not go past this time
- Confidences should be kept
- Neutral and non-judgmental reactions to comments
- Take notes - but not so many that you lose eye contact and miss non-verbal cues
- Start with safe questions to build rapport and end by inviting subjects to offer information that was not included in the interview guide
Focus group
Small group of invited persons (typically six to twelve) who actively participate in a structured discussion, led by a facilitator, for the purpose of eliciting their input.
How long do focus groups last?
Typically 1-3 hours, depending on topic and purpose
Focus groups that follow a survey
- Provide a more in-depth look at issues that were raised in the survey
- Collect qualitative data that enriches survey results
Considerations regarding focus groups participation
- Are intended to provide a microcosm of the population being studied - participants must be chosen to ensure representative information
- Random selection is used so that every employee has an equal chance of being selected
- Participation should be voluntary
- These measures help ensure that focus groups will be productive sessions with employees who are willing to share values and opinions
Focus Group Advantages
- Provides a format that is flexible and relatively comfortable for discussion
- Allows for group brainstorming, decision making, and prioritization
- Can provide group consensus
- Enables HR to learn about employee needs, attitudes, and opinions in a direct format
- Gives employees direct input
Focus Group Challenges
- Tends to foster “group think” conformity
- May be difficult to control; can become a forum where participants go off on tangents
- Generally don’t allow for deep discussions, depending on time constraints and the number of participants
- Can provide skewed or biased results if participants are not representative
To conduct an effective focus group HR must consider
- Importance of planning
- Context in which a focus group may occur
- Importance of the facilitator
- Importance of a recorder
How to plan for a focus group
- Clearly defined objectives - as it influences all focus group questions and the structure and flow of the discussion
- Stimulus materials should be designed and debugged in advance
Context of focus groups
- Cultural effects - both organizational and national can affect participation
- Legal environments that can affect information gathered
Good focus group facilitators
- Know the topic well
- Are good listeners
- Have good understanding of group dynamics and skill in conflict resolution (if differences in opinions arise)
- Allow group perspectives without interjecting bias or allowing one individual to dominate
- Bring enthusiasm to the session (enthusiasm is contagious in a group setting)
- Facilitation skills for activities and exercises
- Conscious of time allocation and usage
Choosing a facilitator for focus groups
If the organization does not have qualified staff, it should hire a facilitator from outside the organization
Recorder of focus groups
- One person designated as the note taker to record comments on flip charts, etc.
- Allows the facilitator to remain focused on group dynamics and enriches the focus group experience
Focus Group Tools
- Mind mapping and affinity diagramming
- Nominal group technique (NGT)
- Delphi technique
Mind mapping
- Data-sorting technique
- Begins with discussion of a core idea; group members add related ideas and indicate logical connections, eventually grouping similar ideas
- Can be done on paper or whiteboard with sticky notes
Affinity diagramming
- Data-sorting technique
- Group categorizes already collected data and subcategorizes data until relationships are clearly drawn
Nominal group technique (NGT)
- Technique in which participants each suggest ideas through a series of rounds and then discuss the items, eliminate redundancies and irrelevancies, and agree on the importance of the remaining items.
- Can be practiced by individuals, subgroups or entire groups
- Initial sorting of ideas can be done before returning to main group to get consensus
Delphi technique
- Technique that progressively collects information from a group of anonymous respondents
- Used to avoid “group think”
Survey and Questionnaires
Inexpensive way to gather large amount of data from large or dispersed group of subjects
Survey and Questionnaire Challenges
- Obtaining a valid sample
- Designing the survey with analysis in mind
- Asking the right questions
Obtaining a valid sample for surveys and questionnaires
Survey results must be truly representative - including those who respond and the surveys that are returned
How to get people to complete surveys and questionnaires
- Explain the purpose and importance of the survey
- Make it easy to complete - short and easy to understand
- Choose a survey approach suited to respondents (e.g., an online survey will not work well if only a few respondents have online access)
How to design surveys with analysis in mind
- Questions should make responses easy to compare and rely on quantifiable answers (e.g., numeric rating scales)
- Free-form feedback can enrich the research
Asking the right questions in surveys
Questions should map various internal and external environmental factors that affect attitudes and work
Survey/Questionnaire Advantages
- Efficient way to gather a lot of data from a large and dispersed group
- Easier to quantify data for analysis and reporting
Survey/Questionnaire Challenges
- Can be difficult to obtain an acceptable response rate
- Difficult to follow up on data from anonymous sources
- Relies on self-reporting, which can be biased
- Requires time and statistical expertise to assess sample and compile and analyze data
Observation as Data Source
- Gather data by observing the workplace and work processes
- Removes self-reporting filter in interviews, surveys and focus groups
- Can note factors participants are unaware of, have become accustomed to, or are reluctant to mention for personal reasons
- Can strengthen HR's understanding of the work at hand and the culture of the workplace
Observation Advantages
- Provides firsthand and immediate data rather than self-reported data, which can be affected by memory and selectivity.
- Is time-efficient for subjects
Observation Challenges
- Requires skill to be unseen. When the group is very aware of the observer, the data becomes less reliable.
- Requires vigilance to remove personal bias from observations.
- Requires experience to note significant behaviors.
- Observations may not be representative of the entire body of data (i.e., the totality of every meeting, every work process, every transaction).
Existing Data and Documents
Can include information from the organization itself, from public information sources (ex: government agencies) or from industry/professional associations
Sources of Existing Data and Documents Include
- Official documents, such as organization histories and vision and strategy statements, which can help the team understand the organization’s business and culture.
- Performance data over multiple periods from the organization’s financial records as well as data from other organizational databases.
- Performance data from the organization’s HR information system (e.g., turnover rates, employee complaints, incident reports).
- Correspondence and reports.
- Industry data that can provide information about external environments and performance benchmarks.
Advantages of Using Existing Data
- Eliminates the effects of observation and involvement and possible bias of facilitator/interviewer/observer
- Rich, multi-perspective source of data
Challenges of Using Existing Data
- Can be time-intensive
- Requires experience to extract key data
- May require ingenuity to find data
Artifacts
- Objects created by members of a culture that convey a sense of that culture’s values and priorities, beliefs, habits and rituals, or perspectives.
- They can provide insight into aspects of an organization’s culture that its members may not be able to or may not want to articulate to an outsider.
Artifacts Include
- Physical workplaces that can suggest characteristics of organizational culture - (emphasis on diplomas and certificates)
- Virtual environments - (social media providing clues on how organization is perceived by outsiders and employees)
Using Artifacts
- Can be used when they confirm or conflict with findings gathered by other means
- Without context, a researcher can misinterpret the meaning of important artifacts
Artifacts Advantages
- Provides additional insight into cultural issues
- Can be observed without the help of those being observed
Artifacts Disadvantages
- Requires researcher to understand the principles of culture
- Can create misunderstandings if the researcher is not familiar with the culture
Reliability
How well a measurement instrument provides consistent results.
Errors that can create inconsistent results in data
- Failure to maintain same conditions or correct for differences
- Cultural differences that create different interpretations of questions
- Bias in using the tool to gather data
Validity
How well a measurement instrument measures what it is intended to measure.
Validation answers two questions
- What does the instrument measure?
- How well does the instrument measure it?
Errors in validity
May be damaged by using irrelevant criteria to develop measures
Statistical Sampling
- Used when population to be analyzed is very large or when data cannot be obtained from the entire population
- Sample must be representative
- The smaller the sample, the more likely the results will be affected by statistical outliers - values that differ significantly from the average
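Random selection of a representative sample can be sketched with Python's standard library; the employee IDs below are hypothetical:

```python
import random

# Hypothetical population: 500 employee IDs
population = [f"EMP{n:04d}" for n in range(500)]

# Simple random sample: every employee has an equal chance
# of selection (seeded only to make the sketch repeatable)
random.seed(42)
sample = random.sample(population, k=50)

print(len(sample))       # 50
print(len(set(sample)))  # 50 -- sampling is without replacement
```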
Errors are introduced to statistical study when
- Incorrect data is used - a measurement may have been taken incorrectly, or a number entered incorrectly
- The study's design includes, unintentionally or intentionally, different types of biases that affect outcomes
Statistical analysis types of biases
- Sampling
- Selection
- Response
- Performance
- Measurement
Sampling Bias
Sample that does not represent the entire population
Selection Bias occurs when
- In a controlled study, when participants are not randomly assigned to control and experimental groups
- Researchers choose to enroll only certain types of participants
Controlled Study
Assign participants to a control group that does not experience the intervention or condition being tested and one or more experimental groups that do experience the intervention or conditions
Response Bias
- Inverse of selection bias
- Researchers invite a representative sample to join a study but those who respond and accept are not representative
Performance Bias
Participants in a controlled study behave differently because they are being studied
Measurement Bias
Measurements are taken incorrectly, either unintentionally (due to lack of training or hard-to-measure procedures) or intentionally (due to some type of bias)
HR use of methodology of study
- Reviewing a study's methods may reveal errors or the potential for error
- If creating its own study, HR should consult with statistical experts and have them review the methodology
Descriptive statistics
- Process of sorting data in different ways to provide a more accurate and in-depth understanding of what the data is showing
- Enables the process of inferring the meaning behind data descriptions
Data measurement tools use in descriptive statistics
Used to understand the distribution patterns and characteristics of the dataset
Frequency Distributions
- Used to sort data into groups according to some factor (ex: years of employment)
- Allows analysts to understand the distribution of the data they are working with - whether the data falls in a normal pattern around a central value or is more broadly or narrowly dispersed over the data range
- Help locate peaks within data range
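A minimal sketch of a frequency distribution in Python, grouping hypothetical years-of-employment data into five-year bins:

```python
from collections import Counter

# Hypothetical years-of-employment data for twelve employees
years = [1, 2, 2, 3, 3, 3, 4, 4, 5, 7, 10, 15]

# Sort values into five-year groups to reveal the distribution
bins = Counter((y // 5) * 5 for y in years)

for start in sorted(bins):
    print(f"{start}-{start + 4} years: {bins[start]}")
# The peak of this distribution falls in the 0-4 year group
```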
Quartiles and Percentiles
- Describe dispersion across a group of ranked data
- Frequently used in benchmarking
Quartiles
- Divide data into quarters
- First quartile (Q1) contains all data below the 25th percentile
- Second quartile (Q2) ends at the center, or 50th percentile
- Third quartile (Q3) ends at the 75th percentile
- Fourth quartile (Q4) ends with the last value, at the 100th percentile
Percentile
- Indicates the proportion of the dataset that falls below a certain value
- Ex: value in the 90th percentile is greater than 90% of the values in the dataset
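Quartile and percentile cut points can be computed with Python's standard library (the salary data below is hypothetical):

```python
import statistics

# Hypothetical ranked salary data (in thousands)
salaries = [42, 45, 48, 50, 52, 55, 58, 60, 65, 70, 80, 95]

# Cut points that divide the ranked data into four quarters
q1, q2, q3 = statistics.quantiles(salaries, n=4)
print(q1, q2, q3)  # Q2 equals the median (50th percentile)

# A value at the 90th percentile is greater than 90% of the values
p90 = statistics.quantiles(salaries, n=100)[89]
share_below = sum(s < p90 for s in salaries) / len(salaries)
print(share_below)
```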
Interquartile Range
- Applies the concept of quartiles to measures of central tendency.
- It includes all of the data values in Q2 and Q3 - 25% of the values above the midpoint and 25% below it.
- Used to indicate a range of confidence in an estimate. P50 is considered safe: half of estimates will be above and half below
Standard deviation
- The distance of any data point from the center of a distribution when data is distributed in a “normal” or expected pattern.
- Typically shown in a bell curve
Standard deviation in normal distribution
- 68% of data lies within one standard deviation
- 95% of data lies within two SDs
- 99% lies within three SDs.
Standard deviation can be expressed as
SD or the Greek letter sigma [σ]
How to calculate standard deviation
Easily calculated on spreadsheet programs or statistical analysis software
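As a sketch, Python's standard-library `statistics` module computes standard deviation directly; the two commute-time datasets below are hypothetical and show how dispersion drives the result:

```python
import statistics

# Hypothetical commute times (minutes) for two teams
tight  = [28, 30, 31, 29, 30, 32, 30]   # tightly grouped around 30
spread = [5, 18, 30, 44, 60, 12, 41]    # widely dispersed

# pstdev treats the data as a whole population;
# stdev would treat it as a sample drawn from a larger population
print(statistics.pstdev(tight))   # low standard deviation
print(statistics.pstdev(spread))  # high standard deviation
```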
Low standard deviation
Data curve is high and narrow and data points are tightly grouped around central value
High standard deviation
Data curve is flatter, longer, and more spread out. There are more outliers in this dataset
Outliers
Values that differ significantly from the central values of a dataset
Measures of central tendency
Mean, median and mode
Median
Middle value in a range of values.
What percentile is the median
50th
Median is the preferred measure of central tendency when
- When the distribution of the dataset is skewed - contains a few excessively high or low values
- Also used in frequency distributions
Mode
The most frequently occurring value in a set of data.
Mean
Average score or value.
Ways to calculate mean
- Unweighted mean
- Weighted mean
Unweighted mean
- Raw average of data that gives equal weight to all values
- No regard for other factors.
Weighted mean
Average of data that adds factors to reflect the importance of different values.
Weighted mean is used when
- There are significant outliers in spread of data
- Values are not considered equally impactful
Weighted mean is calculated by
- Multiplying individual values by a factor that adjusts the value.
- The results are then summed
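The measures of central tendency and the weighted-mean calculation can be sketched in Python (all values are hypothetical):

```python
import statistics

# Hypothetical performance ratings
scores = [3, 4, 4, 5, 9]

print(statistics.mean(scores))    # unweighted mean: 25 / 5 = 5
print(statistics.median(scores))  # middle value: 4
print(statistics.mode(scores))    # most frequent value: 4

# Weighted mean: multiply each value by an importance factor,
# sum the results, then divide by the total weight
values  = [3, 4, 5]
weights = [1, 2, 7]  # hypothetical importance factors
weighted_mean = sum(v * w for v, w in zip(values, weights)) / sum(weights)
print(weighted_mean)  # (3*1 + 4*2 + 5*7) / 10 = 4.6
```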
Analytics allows better workforce decisions by
- Consider the past and present and forecast the future.
- Connect multiple data items.
- Provide computational analysis of data or statistics.
- Provide visual outputs of patterns and trends.
- Provide insights that can drive strategy
Data Analysis
Exposes important connections and patterns in data
Analytical Approaches
- Variance analysis
- Ratio analysis
- Trend analysis
- Regression analysis
- Root-cause analysis
- Scenario analysis
Variance analysis
- Statistical method for identifying the degree of difference between planned and actual performance or outcomes.
- Commonly used to analyze against objective baselines
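A minimal sketch of variance analysis in Python, using hypothetical planned-versus-actual hiring numbers:

```python
# Hypothetical quarterly hiring plan vs. actual hires
planned = {"Q1": 20, "Q2": 25, "Q3": 30, "Q4": 25}
actual  = {"Q1": 18, "Q2": 27, "Q3": 24, "Q4": 25}

# Variance = actual minus planned, also expressed against the baseline
for quarter in planned:
    variance = actual[quarter] - planned[quarter]
    pct = variance / planned[quarter] * 100
    print(f"{quarter}: variance {variance:+d} ({pct:+.1f}% of plan)")
```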
Ratio analysis
- Comparing the sizes of two variables to produce an index or percentage
- Often used to produce percentages (e.g., an HR turnover rate)
- Commonly used to analyze financial statements.
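Ratio analysis can be sketched in a few lines; the separation and headcount figures below are hypothetical:

```python
# Hypothetical annual figures
separations = 18     # employees who left during the year
avg_headcount = 240  # average number employed during the year

# The ratio of two variables expressed as a percentage
turnover_rate = separations / avg_headcount * 100
print(f"Annual turnover: {turnover_rate:.1f}%")  # Annual turnover: 7.5%
```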
Trend analysis
- Statistical method that examines data from different points in time to determine if a variance is an isolated event or if it is part of a longer trend.
- Can forecast future conditions by establishing the direction and degree of change over time
- Important tool in discovering recurring peaks or troughs in activity
Regression analysis
- Statistical method used to determine whether a relationship exists between variables and the strength of the relationship.
- Data points are placed on scattergrams
- Shape of the line suggests whether a correlation is likely
- Positive or negative
- Weak or strong
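A minimal illustration of testing for correlation: computing the Pearson correlation coefficient by hand in Python over hypothetical training-hours and quality-score data (a full regression would also fit a line, which is omitted here):

```python
import math

# Hypothetical data: training hours (x) vs. quality score (y)
x = [2, 4, 5, 7, 8, 10]
y = [55, 60, 62, 70, 71, 80]

n = len(x)
mean_x, mean_y = sum(x) / n, sum(y) / n

# Pearson correlation: covariance divided by the product of spreads
cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
r = cov / (sd_x * sd_y)

print(round(r, 3))  # close to +1: a strong positive correlation
```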
Root-cause analysis
- Also known as the five whys method
- Type of analysis that starts with a result and then works backward to identify fundamental cause.
- Each cause is queried to find a preceding cause, conditions or actions that led to this effect
Scenario/what-if analysis
- Statistical method used to test the possible effects of altering the details of a strategy to see if the likely outcome can be improved.
- Aided with software applications and models
Most common data analysis tool
Spreadsheet program that allows data to be sorted and viewed in different ways
Graphic Presentation of Data Analysis Includes
- Pie charts
- Histograms
- Trend Diagrams
- Pareto Chart
- Scatter Diagram
Pie Chart
A circle divided into segments representing each category's share of the whole. Textual data can be included in callouts or an attached table for more precise communication
Pie Chart Application
Used to present high-level impression of data distribution as a part of a whole
Histogram
- Bar chart
- Graphically shows the sorting of data into groups arranged in the shape of a statistical distribution
- Shows a central tendency and dispersion around that tendency. This appears as columns of varying heights or lengths.
- Can include a comparative referent, such as a target or range of values. They can also be designed to show comparisons over time (usually through multiple columns for each category).
Histogram use
To sort data and support rapid comparison of categories of data
Trend Diagram
- Plots data points on two axes.
- Horizontal - time
- Vertical - volume
Trend Diagram use
Used to test for presence of cycles or developing trends
Pareto Principle
80% of effects come from 20% of causes
Pareto Chart
- Applies Pareto principle in a histogram
- Categories of data are ranked
- X axis - categories of data, ranked by size
- Y axis - number of occurrences
- A cumulative percentage line plots each category's contribution to the whole, making it easier to identify the 80/20 split
Pareto Chart use
Distinguishes between the “vital few” categories that contribute most of the issues and the “trivial many” categories of infrequent occurrence to support more-focused quality improvement activities.
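The ranking and cumulative-percentage logic behind a Pareto chart can be sketched in Python; the complaint counts below are hypothetical:

```python
# Hypothetical counts of employee complaints by category
complaints = {"Scheduling": 45, "Pay errors": 25, "Facilities": 10,
              "Parking": 8, "Cafeteria": 7, "Other": 5}

total = sum(complaints.values())
# Rank categories from largest to smallest contribution
ranked = sorted(complaints.items(), key=lambda kv: kv[1], reverse=True)

# Cumulative percentage line: running share of the whole
running = 0
for category, count in ranked:
    running += count
    print(f"{category:<12} {count:>3}  cumulative {running / total:.0%}")
# Here the top two categories (the "vital few") account for 70%
```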
Scatter Diagram
- Plots data points against two variables that form the chart’s x and y axes.
- Each axis is scaled.
- The pattern formed by the plotted data describes the correlation between the two variables
Reading strength of correlation on scatter diagram
The tightness of clustering indicates the probable strength of the correlation.
Positive correlation on scatter diagram
- Line rises from lower left to upper right quadrant
- As x increases, y increases
Negative correlation on scatter diagram
- Line falling from the upper left to the lower right quadrant
- As x increases, y decreases
Scatter diagram application
Can be used to test a possible causal relationship and narrow the focus of subsequent tests.