Violence Risk Assessment Flashcards

Week 1

1
Q

how do evaluators measure predictive validity?

A

they calculate the area under the curve (AUC):

  • rate offenders using the tool
  • follow up to see which offenders reoffend
  • calculate the AUC
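The three steps above can be sketched numerically. This is a minimal illustration with made-up scores and outcomes (no real tool or data), using the pairwise definition of AUC:

```python
# Hypothetical tool scores and follow-up outcomes (1 = reoffended, 0 = did not).
scores   = [3, 7, 5, 9, 2, 4, 8, 6]
reoffend = [0, 1, 0, 1, 0, 1, 0, 1]

def auc(scores, outcomes):
    """AUC = probability that a randomly chosen recidivist outscores a
    randomly chosen non-recidivist (ties count as half)."""
    pos = [s for s, o in zip(scores, outcomes) if o == 1]  # recidivists
    neg = [s for s, o in zip(scores, outcomes) if o == 0]  # non-recidivists
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

print(auc(scores, reoffend))  # 0.75
```

An AUC of .50 would mean the scores are no better than chance; values around .70 fall in the moderate range mentioned in later cards.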
2
Q

example of a VRAG limitation: what if someone showed an increase in symptoms of schizophrenia? would this change the rating?

A

unfortunately not, as the items only cover historical, static factors

3
Q

availability heuristic (confirmation bias)

A

we recall certain information more easily if it is important or noticeable, so we assume it is true

4
Q

example of availability heuristic

A

confirmation bias

5
Q

an example of representativeness heuristic is

A

base rate neglect

6
Q

T/F: actuarial tools do not allow for discretion or flexibility

A

TRUE. If it's not on the tool, you can't consider it. However, some discretion may be important; for example, an evaluator should be able to bump up the rating of someone hearing voices telling them to kill others.

7
Q

what are the pros of unstructured clinical judgement?

A

easy, convenient, individualized

8
Q

anchoring heuristic

A

overly influenced by whatever info is learned first (e.g. framing)

9
Q

false negative error

A

thinking someone is low risk but they’re actually high risk

10
Q

what does AUC of .70 mean in comparison to AUC of .50

A

AUC of .70 means that if you randomly selected a
recidivist and a non-recidivist, there is a 70% chance the
recidivist would score higher than the non-recidivist

AUC of .50 means that the tool is no better than chance

11
Q

representative heuristic (base rate neglect)

A

rely on stereotypes or evidence that seems prototypical.

12
Q

true negative

A

rated low risk, and they truly are low risk

13
Q

true positive

A

rated high risk, and they truly are high risk

14
Q

pros of VRAG

A
  • better reliability
  • better predictive validity
  • more transparent
  • easier to defend
15
Q

when were risk assessment tools first developed

A

around 1985

16
Q

framing is an example of

A

anchoring heuristic

17
Q

what are the challenges that evaluators may face?

A

The challenge is making an accurate assessment about someone given:

  • limited time
  • limited validity
  • limited information, since the person providing it usually wants to steer the examiner's opinion in a direction that favours them
  • ensuring that prior involvement in the case does not affect your decisions
18
Q

T/F: we predict around half of the people we evaluate will reoffend

A

False.

more like 10%

19
Q

Framing (anchoring heuristic)

A

drawing different conclusions from the same information, depending on how or by whom that information is presented

20
Q

_______ ______ is poor. ______ tools are better than ________ but still have some problems. So…another option is ______ ________ _______

A

Unstructured judgement is poor. Actuarial tools are better than unstructured judgement but still have some problems. So another option is Structured Professional Judgement.

21
Q

T/F: Predictive validity of VRAG is perfect

A

FALSE. it’s good but not perfect. AUC = .68 (moderate range)

22
Q

where can risk assessments happen in the justice system, psychiatric settings, and education?

A
Justice system:
• Placements (e.g., pretrial detention)
• Sentencing (e.g., death penalty)
• Supervision/security (e.g., max security)
• Treatment (e.g., services)
• Release from custody (e.g., parole)

Psychiatric:
• Warn potential victims (duty to protect)
• Civil commitment (i.e., hospitalized as a danger to others)

Education:
• Threats to harm another student

23
Q

what type of tool is the Violence Risk Appraisal Guide (VRAG)?

A

actuarial: scores are added up, with no discretion. It produces estimated probabilities (e.g., a 78% chance this person will reoffend)

24
Q

how is information combined in actuarial tools?

A

adding up the total scores
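That combination rule can be sketched in a few lines. Everything here is invented for illustration (the items, weights, and norm table are hypothetical, not actual VRAG content):

```python
# Invented item weights -- illustrative only, not real VRAG items.
ITEM_WEIGHTS = {
    "prior_violence": 2,
    "young_age_at_first_offense": 1,
    "substance_misuse": 1,
}

def total_score(ratings):
    """Sum the weights of items rated present; no evaluator discretion applied."""
    return sum(ITEM_WEIGHTS[name] for name, present in ratings.items() if present)

def estimated_probability(score):
    """Look up the total in a (hypothetical) norm table of reoffense rates."""
    norms = {0: 0.10, 1: 0.20, 2: 0.35, 3: 0.55, 4: 0.75}
    return norms[score]

ratings = {"prior_violence": True,
           "young_age_at_first_offense": True,
           "substance_misuse": False}
print(estimated_probability(total_score(ratings)))  # 0.55
```

Note the design choice the cards criticize: the mapping from total score to probability is fixed in advance, so nothing outside the item list can move the estimate.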

25
Q

T/F: Tools have higher AUC score than Unstructured Judgement

A

TRUE.

26
Q

what items does the VRAG look at? historical or static?

A

both. It looks at 12 items, developed by selecting factors that predicted violence in a sample

27
Q

what are the cons of unstructured clinical judgement?

A
  • lacks reliability
  • poor predictive validity
  • accurate in NO MORE THAN 1/3 of predictions
  • lacks transparency
  • difficult to defend
28
Q

what are some of the limitations with risk assessment tools?

A
  • best option, but not crystal balls into the future
  • biases can STILL occur even when tools are used

29
Q

how many tools are out there?

A

over 400

30
Q

what is the purpose of the VRAG?

A

designed to predict violence in adults

31
Q

what are the pros of having risk assessment tools?

A
  • better inter-rater reliability
  • better predictive validity (moderate range)
  • more transparent
  • easier to defend
32
Q

to use tools such as the HCR-20 V3, you must have

A

a Ph.D. in clinical psychology or a related field

33
Q

false positive error

A

thinking someone is high risk when they are actually low risk
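The four prediction outcomes from these cards (false positive, false negative, true positive, true negative) form the standard 2x2 table, which can be sketched as:

```python
def outcome(predicted_high_risk, actually_reoffended):
    """Label one prediction using the standard 2x2 terms."""
    if predicted_high_risk and actually_reoffended:
        return "true positive"
    if predicted_high_risk and not actually_reoffended:
        return "false positive"   # rated high risk, truly low risk
    if actually_reoffended:
        return "false negative"   # rated low risk, truly high risk
    return "true negative"        # rated low risk, truly low risk

print(outcome(True, False))  # false positive
```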

34
Q

confirmation bias is an example of

A

availability heuristic

35
Q

base rate neglect (representativeness heuristic)

A

judging likelihood of an outcome without considering accurate information about the probability of that outcome

e.g., we predict that around half of the people we evaluate will reoffend

36
Q

base rate neglect is an example of

A

representativeness heuristic

37
Q

example of anchoring heuristic

A

framing

38
Q

cons of VRAG

A
  • not a crystal ball, and biases can still occur (true of all tools)
  • doesn't allow for any discretion or flexibility, and focuses on static factors (an issue mostly with actuarial tools)
39
Q

what is a limitation of the VRAG?

A

factors on the VRAG are static (unmodifiable), so risk can change over time but the rating will not.

40
Q

how is information combined in Structured Professional Judgement?

A

items are not added up; instead, evaluators make an overall rating of low/moderate/high

41
Q

Confirmation bias (availability heuristic)

A

look for evidence that confirms our initial impression

e.g., we have already made up our mind that someone is high risk