Accidents And Risks Flashcards
Critical Systems Essentials
- Safety: The system should not harm people or the system’s environment
- Reliability: The system must operate without serious failures
- Availability: The system must be available to deliver services when requested to do so
- Security: The system must be able to protect itself and its data from malicious use
Risk is a combination of:
- The likelihood of an accident
- The severity of the potential consequences
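A common way to make this concrete (a standard convention, not stated on the card) is to treat risk as the product of the two factors above:

$$\text{Risk} = \text{Likelihood of accident} \times \text{Severity of consequences}$$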
Risk Factors in Technological Societies
- Increasing complexity
- Increasing exposure
- Increasing automation
Linear temporal logic: propositional connectives
- ¬ : Not
- ∨ : Or
- ∧ : And
- → : Implies
Linear temporal logic: temporal operators
- X : Next
- F : Future
- G : Globally
- U : Until
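A minimal sketch of how these operators can be read, written in Python over a finite trace of states (the `atom`/`Formula` encoding and the example proposition names are illustrative assumptions, not part of the flashcards; full LTL is defined over infinite traces, so evaluating on a finite trace is a simplification):

```python
from typing import Callable, Dict, List

State = Dict[str, bool]                        # a state assigns truth values to atomic propositions
Formula = Callable[[List[State], int], bool]   # a formula is evaluated at position i of a trace

def atom(name: str) -> Formula:
    return lambda trace, i: trace[i].get(name, False)

def NOT(f: Formula) -> Formula:
    return lambda trace, i: not f(trace, i)

def AND(f: Formula, g: Formula) -> Formula:
    return lambda trace, i: f(trace, i) and g(trace, i)

def OR(f: Formula, g: Formula) -> Formula:
    return lambda trace, i: f(trace, i) or g(trace, i)

def IMPLIES(f: Formula, g: Formula) -> Formula:
    return lambda trace, i: (not f(trace, i)) or g(trace, i)

def X(f: Formula) -> Formula:
    # Next: f holds in the next state (false here if there is no next state)
    return lambda trace, i: i + 1 < len(trace) and f(trace, i + 1)

def F(f: Formula) -> Formula:
    # Future (eventually): f holds at some state from i onwards
    return lambda trace, i: any(f(trace, j) for j in range(i, len(trace)))

def G(f: Formula) -> Formula:
    # Globally (always): f holds at every state from i onwards
    return lambda trace, i: all(f(trace, j) for j in range(i, len(trace)))

def U(f: Formula, g: Formula) -> Formula:
    # Until: g eventually holds, and f holds at every state before that
    return lambda trace, i: any(
        g(trace, j) and all(f(trace, k) for k in range(i, j))
        for j in range(i, len(trace))
    )

# Example safety-style property: "every request is eventually granted" -- G(request → F grant)
req, grant = atom("request"), atom("grant")
spec = G(IMPLIES(req, F(grant)))

trace = [{"request": True}, {}, {"grant": True}, {}]
print(spec(trace, 0))   # True: the request at position 0 is granted at position 2
```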
Three fallacies around accidents:
• Assuming human error was the cause
• Assuming a technical failure was the cause
• Ignoring organisational factors
Why is human error so often assumed?
• Convenience and Simplicity: Blaming human error is an easy explanation, avoiding deeper systemic investigations.
• Cognitive Bias: Hindsight and confirmation bias lead to focusing on human mistakes rather than system flaws.
• Final Line of Defense: Humans are often relied upon to catch and fix errors, so their failure is assumed to be the cause.
• Systemic Neglect: Underlying issues like poor training, fatigue, or bad design are often ignored.
• Lack of Human Factors Awareness: Systems are not designed to account for human limitations in complex or stressful environments.
• Organizational Pressures: Companies shift blame to individuals to protect reputations, reduce liability, or avoid costly changes.
Result: Oversimplifying accidents as human error prevents addressing root causes, leading to repeated failures and unfair blame.