Evaluation: People and programs
Person evaluation (individual level)
Recognise people’s achievements
Encourage future performance
Program evaluation (systems level)
Identify how a policy or intervention is working
Identify areas for improvement/refinement
Decisions to continue/abandon program
Person evaluation cycle
- Career planning
- Planning proposal: Concrete goals and expected outcomes
- Performance agreement: Agree on performance indicators, targets, timeframe
- Performance evaluation: Compare indicators with targets
- Feedback discussion: Revise and update career goals
QUT’s Performance and evaluation process: To be completed at the beginning of the cycle
- Section 1: Career development planning: Where staff may indicate their long-term academic career goals
- Section 2: Planning proposal: Where staff initially draft, then finalise, their goals for the next twelve months under each of the three academic areas of achievement, and identify the support and resources they may require to achieve these goals
- Section 3: Performance plan agreement: Where the staff member and the supervisor agree on the outcomes of the planning discussion and the subsequent performance plan
QUT’s Performance and evaluation process: To be completed at the end of the 12-month cycle
- Review of performance: At the end of a 12-month PPR-AS cycle, the supervisor documents and provides performance feedback to the staff member, and the staff member makes a self-assessment
- Confirmation of performance feedback discussion: The supervisor and the staff member each sign off on the PPR discussion; the staff member’s signature denotes participation in the process and acknowledgement of the supervisor’s comments
Why do we evaluate people?
- Assign rewards (e.g., bonuses)
- Identify where help/training is needed
- Identify when people can take on greater challenges and responsibilities (e.g., promotion)
- Reinforce good behaviours
- Extinguish bad behaviours
What do we evaluate? Broad level
- Performance: task-related behaviour
- Effectiveness: evaluation of the standard of performance
- Productivity: cost of achieving a level of effectiveness (time, money, burnout); see the sketch below
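To make the effectiveness/productivity distinction concrete, here is a minimal Python sketch that treats productivity as effectiveness achieved per unit of cost. The function name and the figures are hypothetical illustrations, not from the lecture:

```python
# Hypothetical sketch: productivity as effectiveness per unit of cost.
# The function name and figures below are illustrative, not from the slides.

def productivity(effectiveness: float, cost: float) -> float:
    """Return effectiveness achieved per unit of cost (time, money, ...)."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return effectiveness / cost

# Two workers reach the same effectiveness score (85/100), but one
# spends more hours doing so and is therefore less productive.
print(productivity(85, 40))  # 2.125 points per hour
print(productivity(85, 60))  # ~1.417 points per hour
```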
What do we evaluate? Detailed level
Bartram’s (2005) “Great Eight” competency model
- Leading/deciding
- Supporting/cooperating
- Interacting/presenting
- Analysing/interpreting
- Creating/conceptualising
- Organising/executing
- Adapting/coping
- Enterprising/performing
Who evaluates?
360-degree feedback
- Team members
- Supervisor
- Other peers
- Clients
- Subordinates
How do we evaluate? (Landy & Conte, 2013)
Objective measures
Quantitative measures of production, e.g., sales, outputs
Academics:
Research (papers published; article views; number of citations, h-index, i10-index…); see the sketch after this list
Teaching (unit ratings; number of units taught)
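The research metrics named above are simple to compute. A minimal Python sketch of the h-index and i10-index; the publication record used in the example is made up:

```python
# Sketch of two citation metrics mentioned above.
# `citations` holds the citation count for each of a researcher's papers.

def h_index(citations: list[int]) -> int:
    """Largest h such that h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations: list[int]) -> int:
    """Number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical record: five papers with these citation counts.
papers = [25, 8, 5, 3, 0]
print(h_index(papers))    # 3 -- three papers have >= 3 citations each
print(i10_index(papers))  # 1 -- one paper has >= 10 citations
```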
How do we evaluate? (Landy & Conte, 2013) Judgemental measures
Allow consideration of contextual factors not captured by objective measures
e.g., a supervisor’s overall impression
Ratings are made relative to other employees and are influenced by the perceived difficulty of the job or other contextual information
Why abandon annual performance reviews? (Rock & Jones, 2015)
Measurement factors
- Annual reviews are disconnected from the timescales of work
- Multiple factors (and people) contribute to one’s performance
Why abandon annual performance reviews? (Rock & Jones, 2015) Psychological factors
- Competition vs. collaboration: performance rankings set up competitive mindsets
- Annual reviews don’t satisfy needs for learning and growth, which are better fostered by more immediate feedback
- More frequent communication = more informative feedback, shifting the conversation from debating performance ratings to discussing development opportunities
Criteria: Evaluations should be:
- Developmental
- Tied to organisational objectives
- Specific
- Sufficiently frequent:
  - according to a person’s experience (less experience, more frequent)
  - task timeline (shorter task timelines, more frequent)
  - timely for decision-making
How will you be able to tell if your proposed program or intervention works?
Program evaluation cycle
- Engage stakeholders: Understanding key issues
- Describe program: Goals and purpose, expected effects
- Design evaluation: Methods, users, agreements
- Gather evidence: Indicators, sources, quality, quantity, practicality
- Justify conclusions: Analyses, interpretation, recommendations
- Implement lessons: Tweak or revise program
Program evaluation standards: Utility
Information will serve users’ needs:
- Measures are directly related to program goals
- It is clear what the findings can show