Lessons from TMI: Lessons from big industrial accidents and their relevance for the Pharma industry
Solutions to complex problems in the high-stakes, high-consequence environment of global pharmaceuticals, including clinical research, healthcare informatics, and public health. We blend established pharma-sector methodologies, innovation, and adaptations transferred from other sectors to identify and resolve consequential practices that pose risk and often result in avoidable patient casualties.
ARETE-ZOE, LLC Registered address: 1334 E Chandler Blvd 5A-19, Phoenix 85048 AZ, USA
Three Mile Island, PA, USA (1979)
Diagram: reactor schematic (cadmium rods, cooling circuit, steam generator, reactor, steam pipes, turbines)
03:58 AM, March 28, 1979
Routine repair of clogged filter
Trace of water left inside air circuit
Alarms went off, causing confusion in the control room
Computers interpreted “water in air system” as “dangerous invader” and shut down the pumps in the cooling circuit
Pumps in cooling circuit shut down
Why: misinterpreted information from the air circuit
Consequence: cooling system disabled, reactor heating up
Operators faced a situation they were not trained for and that was not covered by their procedures
Pressure built up in the primary cooling system within the reactor
The computer ordered the cadmium rods to plunge into the reactor, stopping the chain reaction
Pressure and heat within the reactor continued to build
Pilot-operated relief valve (PORV) opened to vent the pressure and failed to reclose
Indicator incorrectly shows that the valve has now reclosed
Design: the indicator correctly showed that the control order had been “sent”, not that it had been “obeyed”
Diagram: Intervention → Disease → Surrogate endpoint / True clinical endpoint
In clinical trials, a surrogate endpoint (or surrogate marker) is a measure of the effect of a specific treatment that may correlate with a true clinical endpoint but does not necessarily have a guaranteed relationship with it.
Surrogate endpoint
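The analogy to the valve indicator holds: a surrogate can move without the true endpoint moving. A minimal illustrative sketch (invented mechanism and numbers, not from the deck): two hypothetical drugs shift the surrogate equally, but only one shifts the true outcome, because only one acts on the disease itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
severity = rng.normal(size=n)  # disease severity drives marker and outcome

# Drug A acts on the disease: both surrogate and true outcome improve.
surrogate_a = (severity - 1.0) + rng.normal(scale=0.5, size=n)
outcome_a = (severity - 1.0) + rng.normal(scale=0.5, size=n)

# Drug B acts on the marker only: surrogate improves, outcome does not.
surrogate_b = (severity - 1.0) + rng.normal(scale=0.5, size=n)
outcome_b = severity + rng.normal(scale=0.5, size=n)

print(f"Drug A: surrogate {surrogate_a.mean():+.2f}, outcome {outcome_a.mean():+.2f}")
print(f"Drug B: surrogate {surrogate_b.mean():+.2f}, outcome {outcome_b.mean():+.2f}")
```

Seen only through the surrogate, the two drugs look identical; only the true endpoint separates them.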
Valve open, reactor coolant leaking out without anyone’s knowledge
Stuck valve undetected
Why: misinterpreted information from indicators
Consequence: cooling system disabled, reactor heating up
Communication failure
Single phone line in the control room
Key people unable to get through
OBSERVE
Incorrect readings from instruments: volume of coolant measured indirectly
ORIENT
→ Incorrect conclusions: loss of trust in the instruments; system did not behave as expected
DECIDE
Decision: to turn off reactor pumps
→ Overheated reactor
Training
Decisions based on incorrect, misleading or no information
7:15 AM: stuck valve finally discovered
Pumps restarted
Reactor still overheating
Misleading temperature readings
Why: instruments not designed for temperatures this high
Radiation in the control room
Mounting pressure
8:33 AM: general emergency declared
Misleading and deceptive information provided to the public by the company
Minimal information provided to state administration and regulators
Partial evacuation within a 5-mile radius
Freedom of speech, anyone?
● U.S. v. Caronia
● Amarin v. FDA
Not an issue during TMI; a real concern now in off-label promotion (must be “truthful”)
Reactor cracked
Sample of contaminated coolant
Basement full of contaminated water
Radioactive gas was eventually released into the atmosphere
Oh, BTW, it can blow up because of accumulated hydrogen
Fierce dispute within the NRC over whether this could happen. The risk did not materialize; the partial reactor meltdown did not result in any additional release of radiation.
Complex combination of minor equipment failures and major inappropriate human actions
1973 oil crisis
Cheap power needed
Political topic
Gov’t subsidies
Probability assessment flawed
“That can’t happen here” mindset
Risk = Probability × Consequence
Combination of lesser events:
• Misjudged probability
• Misinformation
• Confusion
• Inadequate training
• Inappropriate human response
• Ordinary mistakes in a high-stakes environment
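Why the probability term gets misjudged: a minimal sketch with invented numbers (not from the accident record). Treated as a single coincidence, a chain of lesser events looks impossible; accumulated over an entire fleet’s operating years, it is anything but.

```python
# Illustrative only: invented per-plant-year probabilities of four lesser events.
p_events = [0.01, 0.02, 0.05, 0.10]

# Naive view: all four must coincide in one plant-year.
p_all = 1.0
for p in p_events:
    p_all *= p  # assumes independence
print(f"All four coinciding in one plant-year: {p_all:.1e}")

# Fleet view: the same "negligible" number taken over many plant-years.
plant_years = 50 * 40  # assumption: 50 plants, 40 years each
p_at_least_once = 1 - (1 - p_all) ** plant_years
print(f"At least one occurrence fleet-wide: {p_at_least_once:.2%}")
```

With these made-up figures the single-year chance is one in a million, yet the fleet-wide chance approaches 0.2%: “that can’t happen here” is a statement about the wrong denominator.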
Root cause: Human factors
Organization failed to learn from previous failures
RECOMMENDATIONS: Fundamental changes in
• Organizational procedures and practices
• The attitude of regulators
• Operator training, updates
• Emergency response and planning
The need for change
Focus on equipment safety and large-break accidents
RECOMMENDATIONS
• More attention to human factors
• Combination of lesser events (slower to develop, more likely)
• Training, fitness for duty
• Organizational structure
• Communication
The need for change
“It is the responsibility of the NRC to issue regulations to assure the safety of nuclear power plants. However, regulations alone cannot assure safety. Once regulations become as voluminous and complex as those now in place, they can serve as a negative factor in nuclear safety. The complexity requires immense efforts to assure compliance. The satisfaction of regulatory requirements is equated with safety.”
Requirement v. Consequence
Compliance v. Safety culture
• Focus on compliance with regulations instead of intrinsic system safety
• Inspection manual voluminous and complex, unclear to many inspectors
• Enforcement actions limited/unused
• Reliance on industry’s own data
• No systematic evaluation of patterns
• Unclear roles & responsibilities
The role of regulators
The NRC has erred on the side of the industry’s convenience rather than its primary mission of assuring safety
HUMAN FACTORS
Fiduciary responsibilities of public servants
The role of regulators
Worst problem: Loss of public trust
• Misinformation
• Deception
• Misunderstanding
• Fear & confusion
Major regulatory reform
Transformation of the industry
Outcome
Chernobyl, Ukraine, USSR (1986)
Orders received to carry out tests to find out how much energy could be saved during a routine maintenance shutdown.
Numerous safety mechanisms had to be turned off to make this test possible
Power levels lowered to perform tests
Operator failed to program the computer to prevent power from dropping below the minimum safe level
Emergency core cooling system shut off
= Bad idea
Automatic scram devices and alarms switched off
= Very bad idea
Control rods withdrawn too far to reinsert quickly
Chernobyl nuclear plant, unit 4
April 26, 1986
1:23 AM – 5 AM
Study into systemic factors
Causes of disaster
• Irresponsibility
• Negligence
• Indiscipline
Long record of sometimes fatal accidents → ACCIDENT WAITING TO HAPPEN
National 5-year production goals oblivious to reality
Training often suspect and shoddy
Lax observance of rules and regulations
Flawed performance metrics
HUMAN FACTORS
Systemic factors
Sweeping changes in Soviet society
Disintegration of the Empire due to loss of credibility
Outcome
Volkswagen, Germany (2015)
Martin Winterkorn, CEO of Volkswagen AG, acknowledged that 11 million vehicles were equipped with diesel engines containing defeat devices to cheat pollution tests
…And spreading
Criminal probe underway
Cause entirely internal
Flawed performance metrics
VW very sensitive to its own image. Internal pressure to improve metrics caused someone to manipulate the system, in a manner that amounted to conspiracy.
Root cause
Behavior of organizations follows the same principles regardless of industry
Lessons learned?
TMI / Chernobyl / VW
• Formally regulated industries
• High-stakes, high-consequence environment
• Information flow within organization
• Communication with stakeholders
• Public trust essential
• Accident caused by systemic factors impacts the whole industry
Common attributes
TMI / Chernobyl / VW
HUMAN FACTORS
• Requirement v. Consequence
• Individual and collective accountability
• Poor leadership
• Flawed performance metrics
• Failure to learn from previous errors
• Communication with stakeholders / public
• Regulatory response
• Delivery/enforcement of regulation
Common root cause?
TMI / Chernobyl / VW
• Subject to the same human frailties
• Oblivious to ambiguity
• Requirement v. Consequence
• Public trust essential
HUMAN FACTORS
Regulators and elected officials
HUMAN FACTORS within an ethical ENVIRONMENT
• Leadership
• Reporting structure
• Organizational culture
• Experience, training, education
• Demographics
Capabilities / Frailties / Values
Probability of detrimental consequence
• Vulnerability in process: accidental
• Threat (capability + intent/ability): malicious
What is risk?
Risk = probability of detrimental consequence
Qualifying consequence: Patient injury
Safety signal: it takes a significant number of casualties with an attributed causal relationship to produce a signal, i.e. a statistically significant cause attributed to a drug
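A minimal sketch of how such a signal is surfaced from spontaneous reports by disproportionality screening, here the proportional reporting ratio (PRR); the 2×2 counts are hypothetical, not real FAERS data.

```python
# Hypothetical report counts (illustrative only):
#                   reaction X   all other reactions
# drug of interest      a              b
# all other drugs       c              d
a, b = 30, 970
c, d = 200, 98_800

prr = (a / (a + b)) / (c / (c + d))  # proportional reporting ratio
print(f"PRR = {prr:.1f}")
```

Here PRR ≈ 14.9: reaction X is reported roughly fifteen times more often for this drug than for the background. Conventional screens flag a PRR above about 2 with at least a handful of cases for causality review; disproportionality alone never attributes cause.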
Qualifying consequence
Atypical manifestation of disease
PATIENT INJURY
• Misdiagnosis
• Prescribing error
• Wrong dose (predictable, unpredictable)
• Individual variability in response
• Off-label use (appropriate, inappropriate)
• Dispensing error / incorrect substitution
• Non-compliance with treatment
• Self-medication (OTC, Rx, illicit)
• Drug interactions (known, unknown)
• Misleading information on drug
• Counterfeit medications
Attribution
Limitation of science / Honest mistake / Omission / Commission / Deception / False claim
Patient
Individual
Drug manufacturer
Clinician
Pharmacist
Healthcare facility
Regulator
Insurer
Elected officials
Population
The only way to change the behavior of organizations is… …to create COMMON CONSEQUENCE
Adverse outcome: Consequences
Risk = probability of detrimental consequence
• Vulnerability in process: accidental
• Threat (capability + intent/ability): malicious
Detection of vulnerabilities
Quality risk management
Steps
• Identify vulnerabilities
• Impose safety constraints
• Enforce these constraints: by design, by operations
• Define accountability for control of vulnerabilities and acting upon them (R&R)
• Enable decision-makers
Tools: systems modeling; record of past events (EV, FAERS); FTA, FMEA, FMECA (see the sketch below); HAZOP, HACCP, PHA; ICH Q9; ICH E2E
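A minimal sketch of the FMEA-style scoring named above, using invented failure modes and 1–10 severity/occurrence/detection scales; the risk priority number (RPN = S × O × D) is one conventional way to rank vulnerabilities for action.

```python
# Illustrative FMEA-style ranking; failure modes and scores are invented.
failure_modes = [
    # (description, severity, occurrence, detection), each scored 1-10;
    # detection 10 = hardest to detect before harm occurs
    ("Dispensing error / incorrect substitution", 8, 4, 6),
    ("Misleading indicator (command sent, not obeyed)", 9, 2, 9),
    ("Unknown drug interaction", 7, 3, 8),
]

ranked = sorted(
    ((s * o * d, name) for name, s, o, d in failure_modes),
    reverse=True,
)
for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
```

Note how the TMI-style indicator failure ranks high despite its low occurrence score: severity and poor detectability dominate the product.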
Risk assessment
Imposing constraints on a system whilst ensuring enforceability of these constraints by design and operations
Diagram: STAMP control loop. A human supervisor (controller, holding a model of the process and a model of the automation) works through displays and controls; an automated controller (holding models of the process and of the interfaces) drives actuators and reads sensors; the controlled process turns process inputs into process outputs under external disturbances; sensors report measurable variables, actuators set controlled variables.
Systems-Theoretic Accident Model and Processes (STAMP)
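A minimal sketch (invented dynamics, not a STAMP tool) of the loop above, showing the failure mode TMI exposed: when the feedback channel confirms that a command was sent rather than obeyed, the controller’s process model diverges from reality, and control actions stop while the hazard continues.

```python
# The controller's process model is updated from feedback, not the true state.
true_valve_open = True     # the valve is stuck: it never actually recloses
model_valve_open = True    # what the controller believes
coolant = 100.0

for step in range(4):
    if model_valve_open:
        print(f"step {step}: controller commands CLOSE")
        model_valve_open = False   # indicator only confirms the command was SENT
    coolant -= 7.5                 # reality: valve still open, coolant escaping
    print(f"step {step}: model open={model_valve_open}, "
          f"true open={true_valve_open}, coolant={coolant:.1f}")
```

After the first step the model says the system is safe, so no further action is taken; the loss develops silently until an independent observation contradicts the model.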
Simplified models of complex environment
Tools to enable decision-makers
• Reduce ambiguity and uncertainty
• Accountability for acting upon vulnerabilities
• Limit liability
Tools are no substitute for good leadership
PUBLIC TRUST
System models
HUMAN FACTORS
Training
Correct input – accurate and timely orientation
Thank you