Human Error Flashcards

1
Q

human error

A

a failure to perform a task satisfactorily, where that failure cannot be attributed to factors beyond the human's immediate control.
Human error is implicated in:
• 78% of aircraft incidents between 1959 and 1995
• 56% of aircraft incidents between 1995 and 2004
• 52% of power plant root causes
• 38% of self-reported industrial accidents
• 88% of accidents caused by an individual worker
• 70% of anaesthetic incidents by the surgeon

2
Q

error descriptions & goals & outcomes

A

Descriptions:
• phenomenological descriptions: what the error was
• psychological descriptions: the information processing that leads to error (i.e., the underlying causes)
Goals:
• unintentional vs. intentional, e.g., an error on a test vs. the speeds most of us drive
Outcomes:
• recovered: an error with the possibility for damage, but none actually occurred (e.g., a patient is given an overdose prescription but loses it)
• unrecoverable: an error where damage could not be avoided; a recovered error today could turn into an unrecoverable error tomorrow

3
Q

human error categories

A

• error of commission: a person performs a task or step that should not have been performed; a.k.a. Type I error / false positive / false alarm (e.g., hitting your thumb with a hammer)
• error of omission: a person fails to perform a task or step; a.k.a. Type II error / false negative / miss (e.g., forgetting to unplug the coffeemaker)
• sequential error: a person performs a task or step out of sequence (e.g., lighting a fire before opening the fireplace flue damper)
• time error: a person performs a task or step, but too early, too late, or at the wrong speed (e.g., going through an intersection on a red light)
• extraneous act: a person introduces a task or step that should not have been performed, i.e., an action from an unrelated series (e.g., lighting a fire, then unplugging the coffeemaker)

4
Q

types of failures

A
Classified by where the error originates:
• design error: the designer does not take human abilities into account
• manufacturing error: the system is not built according to the design
• installation/maintenance error: the system is not installed or maintained correctly
• operating error: the system is not operated according to the intended procedure
5
Q

errors of execution

A

• slip: unintentionally performing an incorrect action (e.g., stepping on a banana peel and falling down); an error of action execution
• mode error: performing the correct response, but while in the wrong mode of operation; a kind of slip (e.g., in paint software, attempting to draw something while the eraser tool is selected); an error of attention/memory
• lapse: neglecting to perform a required action (e.g., forgetting to take your twice-a-day medicine); an error of memory

6
Q

errors of intention

A

• mistake: selecting an action and carrying it out successfully, but it is the wrong action (e.g., smoking banana peels to try to get high); an error of planning (choosing the wrong decision-making rule, or lacking background knowledge); may be due to memory, perception, or cognition
• violation: intentionally contravening a standard (operating procedures, codes of practice, laws, etc.); implies a (governing) social context (e.g., sabotage); is this necessarily an error?

7
Q

complete analysis of error

A
1. Describe system goals and functions
2. Describe the situation
3. Describe tasks and jobs
4. Analyze tasks for which errors are likely
5. Estimate the probability of each error
6. Estimate the probability that the error is not corrected
7. Devise means to increase reliability (to decrease error)
8. Repeat steps 4-7 in light of the changes
Note: different types of errors may need different types of actions to prevent them.
8
Q

error (as opposed to human error)

A

error: an action (or lack of action) that violates the tolerance limit(s) of the system
• defined in terms of system requirements and capabilities
• doesn't imply anything about humans; the cause may be a system flaw

9
Q

Error Probability aka…

A

aka Human Error Probability (HEP):
HEP = (# of errors) ÷ (total # of opportunities for error)
HEPs for specific actions are found in tables, e.g.:
• select the wrong control in a group of labeled, identical controls = .003
• fail to recognize the incorrect status of an item in front of the operator = .01
• turn a control the wrong direction, under stress, when the design violates the population norm = .5
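For illustration only, a minimal Python sketch of the HEP arithmetic; the function name and counts are hypothetical, not from the course material. It also shows the R = 1 - EP relationship from the reliability card.

    def human_error_probability(errors: int, opportunities: int) -> float:
        """HEP = (# of errors) / (total # of opportunities for error)."""
        return errors / opportunities

    # Hypothetical counts: 3 wrong-control selections in 1,000 opportunities
    hep = human_error_probability(3, 1000)
    reliability = 1 - hep        # R = 1 - EP (see the reliability card)
    print(hep, reliability)      # 0.003 0.997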

10
Q

Calculation of HEP: THERP (Swain, 1963)

A
• create a fault-tree analysis:
- start at the top with the probability of the correct/incorrect action
- next level: the probability of error, given the last action
- these are conditional probabilities (they are not independent)
- the sum of the partial error probabilities at the bottom is the overall error probability
• e.g., starting a car:
- capital letter = correct outcome; lower case = error
- conditional probability: p(b|a) means "the probability of b given a"
- K = correct key, k = incorrect key; S = getting the key into the ignition, s = missing the ignition
- p(S|K) is the probability of getting the correct key into the ignition (the only correct outcome)
- p(error) = 1 - p(S|K)
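A rough Python sketch of the key/ignition branch, with hypothetical probabilities (not taken from THERP tables): the success branch multiplies the conditional success probabilities along the correct path, and the overall error probability is the complement. The card's p(error) = 1 - p(S|K) corresponds to the case where the correct key is assumed to have been selected.

    # Hypothetical step probabilities for the success branch
    p_K = 0.995          # p(K): select the correct key
    p_S_given_K = 0.998  # p(S|K): key goes into the ignition, given the correct key

    p_success = p_K * p_S_given_K   # probability of the only fully correct path
    p_error = 1 - p_success
    print(f"p(success) = {p_success:.4f}, p(error) = {p_error:.4f}")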
11
Q

reliability

A

the probability of a successful outcome of the system or component
• also defined in terms of system requirements
• thus, to evaluate a system we must know the goals and purposes of the system
R = (# of successful operations) ÷ (total # of operations)
(also, R = 1 - EP)
In general, reliability goes down as the number of components goes up (i.e., as complexity increases).

12
Q

components in a series

A
• in a series, if any component fails, the whole system fails (e.g., four tires on a car)
Rs = R1 × R2 × … × Rn
e.g., if all components have R = .90:
• n = 1: Rs = .90
• n = 2: Rs = .9 × .9 = .81
• n = 3: Rs = .9 × .9 × .9 = .73
• n = 10: Rs = .9^10 = .35 (!)
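A small Python sketch of the series formula, reproducing the card's numbers (for illustration only):

    # Series reliability: the system works only if every component works,
    # so Rs is the product of the component reliabilities.
    from math import prod

    def series_reliability(component_reliabilities):
        return prod(component_reliabilities)

    for n in (1, 2, 3, 10):
        print(n, round(series_reliability([0.9] * n), 2))   # 0.9, 0.81, 0.73, 0.35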
13
Q

active redundancy

A

all components operate all the time, but only one is needed
• e.g., a Boeing 767 can fly on only one engine
• e.g., traffic signals have multiple lights
• e.g., RAID level 1: data is mirrored across two hard drives
• failure occurs only when both fail: (EP1) × (EP2)
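A similarly small sketch for two redundant components, with hypothetical error probabilities:

    # Active redundancy with two components: the system fails only when
    # both components fail, so the system EP is EP1 * EP2.
    ep1, ep2 = 0.05, 0.05
    system_ep = ep1 * ep2       # 0.0025
    system_r = 1 - system_ep    # 0.9975
    print(system_ep, system_r)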

14
Q

how to improve reliability

A
• hardware factors: KISS (Keep It Simple, Stupid)
- A-10 Thunderbolt II ("Warthog") (1977): ~33% of the air fleet unavailable at any time
- F-111D Aardvark (first "glass cockpit," 1967): ~__% unavailable
- AH-64D Apache Longbow (1998) helicopter gunship: similar record
• human factors:
- use human factors knowledge in design
- use the human as a redundant system component (Human Reliability Analysis)
15
Q

Human Factors Analysis and Classification System (HFACS): overview what is it

A
  • HFACS is a comprehensive framework for understanding error, originally developed for the U.S. Navy and Marine Corps as an accident investigation and data analysis tool
16
Q

Human Factors Analysis and Classification System (HFACS): Swiss Cheese Model

A

• active error: has an immediate effect on system performance; tends to be associated with front-line operators ("the last person who touched it")
• latent error: not immediately apparent; may lie dormant within a system ("an accident waiting to happen")
• when the holes in the Swiss cheese (failed or absent defenses) line up, ending with an active failure, a mishap results

17
Q

Human Factors Analysis and Classification System (HFACS): level 1 unsafe acts

A

Level 1: Unsafe Acts
• errors: e.g., perceptual, skill-based, or decision errors
• violations: e.g., violating training rules

18
Q

Human Factors Analysis and Classification System (HFACS): Level 2: Preconditions for Unsafe Acts

A

Level 2: Preconditions for Unsafe Acts
• environmental factors: e.g., weather conditions
• condition of operators: e.g., distraction
• personnel factors: e.g., miscommunication

19
Q

Human Factors Analysis and Classification System (HFACS): Level 3: Unsafe Supervision

A

Level 3: Unsafe Supervision
• inadequate supervision: e.g., personality conflict
• planned inappropriate operations: e.g., improper crew pairing (a very senior captain with a very junior co-pilot)
• failure to correct a known problem
• supervisory violations: e.g., permitting someone to operate an aircraft without current qualifications

20
Q

Human Factors Analysis and Classification System (HFACS): Level 4: Organizational Influences

A

Level 4: Organizational Influences
• resource management: e.g., management decisions about safety vs. on-time performance
• organizational climate: e.g., formal accountability for actions
• organizational process: e.g., use of standard operating procedures

21
Q

Human Factors Analysis and Classification System (HFACS): pros and cons

A
  • pros & cons:GOOD applied to accident investigation, understanding, & prevention in over 1,000 militaryaviation accidentsBAD criticized for oversimplifying human actions as “correct” or “incorrect”- allows human error to be seen in the context of a system!!!!
22
Q

The Audi 5000: overview

A

• SUA: spontaneous, uncontrolled acceleration of a vehicle when shifted from park to drive or reverse, often with apparent loss of braking
• the "idle stabilizer control" fuel-system component supposedly triggered "transient malfunctions" without warning
• led to a huge recall
• SUA was not a problem unique to the Audi 5000, though

23
Q

The Audi 5000: suspicions…

A

• Why were there more of these incidents among drivers who had relatively little experience driving the Audi 5000? (most incidents occurred within the first 2,000 miles of the car's life)
• Why were there no reported problems with the Audi 4000 Quattro? (it had an identical idle stabilizer mechanism)
• Why did this only happen in cars starting from rest?
• Why were many accelerator pedals bent, or even snapped off?

24
Q

The Audi 5000: Audi’s explanation, NHTSA research

A

Audi's explanation:
• Audi examined 270 incidents; only 6 idle-speed stabilizers were found defective, and those defects would not cause SUA
• the engine cannot override the brakes (demonstrated for NBC)
• Audi's conclusion: driver error; drivers pressed the accelerator instead of the brake
NHTSA findings:
• no mechanism besides the gas pedal could accelerate the engine to full power
• a minor, ~2-second surge could startle the driver into pushing the accelerator rather than the brake
• unusually problematic placement of the pedals
• conclusion: pedal misapplication, stemming from a pedal design defect

25
Q

The Audi 5000: pedal design

A
• pedals were coplanar, for easy "heel/toe" operation; reaction time (RT) is faster (Casey & Rogers, 1987)
• redesigned due to confusion in 1983
• pedal configuration is related to (rare) pedal errors (Rogers & Wierwille, 1988)
• the centre line is misperceived to be to the right of the centre of the vehicle, but this is not related to pedal errors (Vernoy, 1989)
• drivers are more familiar with pedals aligned farther to the right
• design-induced error ("DIE"): the design causes errors to occur (as opposed to operator-induced error); note this is not merely design interacting with error