10. Evaluation (people/projects) Flashcards

1
Q

person evaluations

A

(individual level)
Recognise people’s achievements
Encourage future performance

2
Q

program evaluations

A

(systems level)
Identify how a policy or intervention is working
Identify areas for improvement/refinement
Inform decisions to continue or abandon the program

3
Q

what is the process of person evaluation?

A

career planning (describe career and work goals)
-> planning proposal (create goals and expected outcomes)
-> performance agreement (agree on performance indicators, targets, timeframe)
-> performance evaluation (compare indicators with targets)
-> feedback discussion (revise and update career goals)
-> back to career planning

4
Q

Why do we evaluate people?

A
Assign rewards (e.g., bonuses)
Identify where help/training is needed
Identify when people can take on greater challenges and responsibilities (e.g., promotion)
Reinforce good behaviours
Extinguish bad behaviours
5
Q

what do we evaluate?

A

performance
effectiveness
productivity

6
Q

what do we evaluate in performance?

A

task-related behaviour

7
Q

what do we evaluate in effectiveness?

A

evaluation of the standard of performance

8
Q

what do we evaluate in productivity?

A

cost of achieving a level of effectiveness

e.g., time, money, burnout

9
Q

what is Bartram’s GREAT EIGHT competency model?

A
Leading/deciding
Supporting/cooperating
Interacting/presenting
Analysing/interpreting
Creating/conceptualising
Organising/executing
Adapting/coping
Enterprising/performing
10
Q

who evaluates?

A
supervisor
peers
clients
subordinates
team members

feedback goes in a circle: 360-degree feedback

11
Q

how do we evaluate?

A

with objective measures and judgemental measures

12
Q

what are the objective measures of evaluation?

A

Quantitative measures of production, e.g., sales, outputs
Academics:
- Research (papers published; article views; number of citations, h-index, i-10 index…; see the h-index sketch below)
- Teaching (unit ratings; number of units taught)
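
Of these, the h-index is the one that is itself a small algorithm: the largest h such that h of the papers each have at least h citations. A minimal Python sketch with made-up citation counts (the helper name h_index is illustrative, not from any particular library):

  # h-index: largest h such that at least h papers have >= h citations each
  def h_index(citations):
      cites = sorted(citations, reverse=True)   # most-cited papers first
      h = 0
      for rank, count in enumerate(cites, start=1):
          if count >= rank:                     # this paper still supports h = rank
              h = rank
          else:
              break
      return h

  # hypothetical citation counts for one researcher's papers
  print(h_index([25, 8, 5, 3, 3, 1, 0]))        # prints 3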

13
Q

what are the judgemental measures of evaluation?

A

allow consideration of contextual factors not captured by objective measures
e.g., a supervisor’s overall impression
ratings relative to other employees, influenced by perceived job difficulty or other contextual information

14
Q

Why do we abandon annual performance reviews?

A

due to measurement and psychological factors

15
Q

measurement factors for abandoning annual performance reviews

A
annual reviews are disconnected from timescales of work
multiple factors (and people) contribute to one’s performance
16
Q

psychological factors for abandoning annual performance reviews

A

Competition vs. collaboration: performance rankings set up competitive mindsets
Annual reviews don’t satisfy needs for learning and growth, which are better fostered by more immediate feedback
More frequent communication = more informative feedback
Shifts the focus from debating performance ratings to discussing development opportunities

17
Q

what should person evaluations be?

A

Developmental
Tied to organisational objectives
Specific
Sufficiently frequent

18
Q

how should person evaluations be sufficiently frequent?

A

by evaluating according to:
the person’s experience (less experience, more frequent)
the task timeline (shorter timelines, more frequent)
what is timely for decision-making

19
Q

types of evaluation

A
proactive
clarificative
interactive
monitoring
impact
20
Q

proactive evaluation

A

Assessing need for a program

If a program is required, assessing how it has been done elsewhere

21
Q

clarificative evaluation

A

Understand rationale and plausibility of a proposed program prior to commitment

22
Q

interactive evaluation

A

Inform stakeholders about issues and concerns with a planned program
Use knowledge and expertise to improve a planned program

23
Q

monitoring evaluation

A

Assessing program processes

Is the process of implementing the program on track?

24
Q

impact evaluation

A

Identify the program/intervention outcomes

Inform decisions about future programs, e.g., whether to use approach in other settings

25
Q

the steps to program evaluation

A

engage stakeholders (understand key issues)
-> describe program (goal & purpose, expected effects)
-> design evaluation (method, users, agreements)
-> gather evidence (indicators, sources, quality, quantity, practicality)
-> justify conclusions (analysis, interpretation, recommendations)
-> tweak or revise program (implement lessons)
-> back to engage stakeholders

26
Q

What are the general standards of program evaluations?

A

utility
feasibility
propriety
accuracy

27
Q

utility standard for program evaluations

A

Information will serve user needs
Measures are directly related to program goals
It is clear what the findings can show

28
Q

feasibility standard for program evaluations

A

Information is practical to collect
Use of resources in proportion to the program/issue
Methods require skills within organisation’s capacities

29
Q

propriety standard for program evaluations

A

Information is obtained ethically
Respect the rights and welfare of those involved
Adheres to relevant laws and workplace policies

30
Q

accuracy standard for program evaluations

A

Information is accurate
Reliability: indicators produce consistent responses
Validity: Indicators fairly represent the construct/idea

31
Q

How does one describe the program?

A

What is the main purpose of your specific program/intervention? For example…

  • change behaviours directly
  • change attitudes
  • change social norms
  • change public awareness
  • change knowledge

Evaluation needs to be linked to your purpose

32
Q

what are the ideal features for evaluation measures?

A
  1. Derived from the theory driving the intervention
  2. Includes measures of…
    - the intervention itself (whether it was implemented correctly)
    - its impact (consequences)
  3. Include both immediate and longer-term consequences
  4. Valid
    - measures are clearly related to the theoretical construct
  5. Reliable
    - e.g., survey items are rated consistently – highly correlated with each other (see the sketch after this list)
  6. Uses multiple methods
    to make up for weaknesses of any single type of measure
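
The reliability point above has a standard formula behind it: Cronbach’s alpha, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the summed scores). A minimal Python sketch with hypothetical survey ratings (the data and the helper name cronbach_alpha are illustrative only):

  import numpy as np

  def cronbach_alpha(ratings):
      # ratings: one row per respondent, one column per survey item
      x = np.asarray(ratings, dtype=float)
      k = x.shape[1]                          # number of items in the scale
      item_vars = x.var(axis=0, ddof=1)       # variance of each item
      total_var = x.sum(axis=1).var(ddof=1)   # variance of summed scale scores
      return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

  # hypothetical 1-5 ratings: 5 respondents x 3 items of one scale
  ratings = [[4, 5, 4],
             [2, 2, 3],
             [5, 4, 5],
             [3, 3, 3],
             [4, 4, 5]]
  print(round(cronbach_alpha(ratings), 2))    # approx 0.91: items rated consistently
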
33
Q

what are the types of measures for gathering evidence?

A
Behaviours
Behavioural intentions
Attitudes
Perceptions:
- Knowledge
- Social norms
  -> Descriptive: reports of observed behaviours (what people actually do)
  -> Injunctive: shared expectations about what people should/should not do
- Awareness
34
Q

what is the hierarchy for selecting intention, attitude and norm measures in gathering evidence?

A
  1. USE existing scales when suitable
  2. ADAPT existing scales when possible
  3. DEVELOP new scales when there are no suitable/adaptable existing scales
35
Q

what are the issues in creating items for gathering evidence?

A

leading or biased questions
e.g., “how much do you dislike getting up early in the morning”

double-barrelled questions (unless the topics are very closely aligned)
e.g., “please rate the extent to which you enjoy lectures and tutorials”

use scales that match the level of detail people are likely to use
e.g., 5-, 7-, 9-point scales; percentages

36
Q

what are the direct measures for measuring norms in the theory of planned behaviour?

A

descriptive

injunctive

37
Q

descriptive measures of norms

A

(what others do)
many people who are important to me do…
people in my life whose opinions I value do…

38
Q

injunctive measures of norms

A

(what others expect of you)
many people who are important to me think that I should…
it is expected of me that I will…
people in my life whose opinions I value would approve of…

39
Q

what are the belief-based measures for measuring norms in the theory of planned behaviour?

A

strength

motivation to comply

40
Q

strength measures of norms

A

(measurement similar to injunctive)
My family thinks that I should…
usually tied to distinct reference groups

41
Q

motivation to comply measures of norms

A

E.g., generally, how much do you want to do what your family thinks you should do?

42
Q

what is a common multi-method form of research?

A

Field studies

43
Q

what are field studies?

A

a general term for methods emphasising observing and interacting with people in natural contexts

44
Q

what do field studies include?

A

observation
inquiry
breach

45
Q

what is observation in field studies?

A

observe/record

could be observing behaviour directly, or examining records of behaviour (e.g., transcripts of conversations)

46
Q

what is inquiry in field studies?

A

interview people in relevant situations

gain knowledge about context as well as people

47
Q

what is breach in field studies?

A

record reactions to unexpected actions

e.g., ethnomethodology – reactions to norm-breaking