patient_safety


CURRICULUM FOR PATIENT SAFETY

This document is a product of the Society for Academic Emergency Medicine Patient Safety Task Force. The curriculum was designed and intended for general use in Emergency Medicine residency training programs, but it can also be used for teaching medical students and trainees in other areas of medicine. The contents contain reports of medical error and unintended harm. Although based on real cases, the details have been modified and do not reflect any given case.



TABLE OF CONTENTS

Module 1: Awareness of Error: Bringing a Safety-Conscious Culture to Medicine

Module 2: Definitions and Models of Error

Module 3: Cognitive Error and Medical Decision-Making

Module 4: Learning From the Experience of Others

Module 5: Complications from Invasive Procedures

Module 6: Medical Error from a Systems Perspective

Module 7: Living with the Reality of Error



1 Module 1. Awareness of Error: Bringing a Safety-Conscious Culture to Medicine

1.1 Goals and Objectives

1. Understand the concept that Medicine is a high-risk industry, and that error is common and perhaps inevitable.
2. Learn the scope and magnitude of error in Medicine.
3. Understand how traditional medical education interferes with the ability to acknowledge and respond to error.
4. Understand that future improvements in medicine rely on recognizing error and developing effective reporting mechanisms.
5. Demonstrate understanding by participating in Quality Improvement activities to identify medical error.

1.2 Content Outline

I. Awareness of Error
A. Introduce other Major Disasters (nonmedical)
1. Concept of error. Describe error in Three Mile Island, the Challenger Accident, or another major non-medical error.
2. Concept of the high-risk industry (nuclear reactors)
3. Concept of the high-reliability organization
B. The Reality of Medical Error
1. Examples of individual cases: describe a high-profile case to catch the audience's attention to the shocking reality of error. Include newspaper headlines and news reports of significant medical error.
2. Magnitude of error in medicine
a) In a major New York study, adverse events occurred in 3.7% of hospital admissions; of these, 2.6% caused permanent disability and 13.6% ended in death.1 Extrapolated to the general population, medical errors in hospitalized patients would lead to 98,000 deaths each year in the US.2
b) Deaths from medical errors may exceed the number of deaths each year from motor vehicle collisions, breast cancer, or AIDS.2
c) If these numbers are extrapolated to the general population, the number of deaths resulting from iatrogenic harm is equivalent to the harm from "three jumbo-jet crashes every 2 days".3 (A rough arithmetic check of this comparison appears at the end of this outline.)
d) Australian data are just as alarming: 16.6% of hospitalizations were complicated by adverse events, with 13.7% leading to permanent disability and 4.9% ending in death.4
e) Other studies report similar statistics.5,6



f) One study in an intensive care setting found that an average of 1.7 errors occurred per patient each day.7
g) One university teaching center found that 14% of all in-house cardiac arrests were attributable to iatrogenic causes.8
3. Error in Emergency Medicine
a) The Harvard Practice Study found that of all detected adverse events, 2.9% occurred in the Emergency Department, but 70.4% of these were determined to be due to negligence and 93.3% were judged as "preventable".9,10
b) The Emergency Department is an environment of great risk:
(1) Undifferentiated problems of varying acuity
(2) Incomplete information about patients
(3) High degree of uncertainty despite urgency to intervene
(4) Distractions and the multitasking nature of the work11
(5) Shift work, with frequent trade-offs and an informal communication network regarding patient care
c) The Emergency Department is an environment where errors are less visible, and thus less correctable.
(1) Lack of feedback limits awareness of how individual actions ultimately impacted the patient.12 Clinicians may never know if their impressions/decisions/actions were wrong, and thus will neither learn from the error nor even know that an error occurred.
(2) Patients ultimately leave the department or the shift ends; the assumption is that everything was fine, unless informed otherwise. We have very little sense of our accuracy or effectiveness.
(3) Individuals usually act as part of a team. When errors occur, the responsibility is diffused across shifts and between teams. It may be difficult to determine what role individual actions ultimately had in the outcome.
C. How did we arrive at this predicament? The very system and people trained to help are caught in a process that ultimately causes harm.
1. Traditionally, Medicine deals with 1:1 physician:patient interactions in a simple system. One patient, one decision at a time.
2. Modern Medicine has changed.
a) Technological advances have made medicine a high-tech business. We can now make life in a test tube, we can transplant organs, we have mapped the human genome. But:
(1) High tech also carries high risk.
(2) If we are capable of heroic space-age cures, are we not certainly beyond mundane errors?
b) Medical care is now delivered in a fragmented way.
(1) There is an increasing reliance on specialists and sub-specialists. No one clinician is able to provide all services for patients. Patients end up maneuvering their way between consultants and specialists.
(2) Even within a single care unit, care is fragmented between teams and across shifts.13



c) There is an explosion of treatment modalities available. Most physicians have stopped trying to learn "pharmacology" and resort to pocket pharmacopoeias and calculators to check for drug interactions and side effect profiles. The information base is beyond the ability of most to retain all the useful information necessary in day-to-day practice.
d) Invasive procedures are increasingly common. Because they are common, it is easy to forget that they come with inherent risk.
e) Sophisticated care has evolved without a sufficient infrastructure to support the delivery of such care.
3. Traditional Medical Education doesn't prepare students to acknowledge error.14 Common ideas/attitudes of teaching programs:
a) Medicine is taught in an authoritarian manner with a sense of absolute right/wrong.
b) Medicine is infallible; we should be perfect.
c) The most senior clinician is right, by definition.
d) There is always one right answer.
e) Confidence equals competence.
f) Error equals incompetence, negligence, or laziness. Error carries shame.
g) "It doesn't matter if you're right or wrong, but you must be decisive and confident. Otherwise, how can the patient trust you?" (Anecdote from the halls of one training institution)
h) "Everything would be just fine (no errors, no harm) if everyone just did their job responsibly." (Quote from a surgeon)
4. Needed reform in Medical Education: changes in attitudes and principles.
a) Medical decision-making always carries some degree of uncertainty.
b) The best clinicians adapt, learning to be decisive in the face of uncertainty but flexible enough to change if the facts don't fit.
c) There is risk in medicine and risk in the actions we take. Every decision should seek to achieve a balance, to optimize risks and benefits.
d) We are fallible. The system is fallible. We should keep our eyes open to prevent harm, report it when we see it, and investigate and understand the causes.
D. What can we do to change?
1. We must acknowledge and adjust to the concept that error is inevitable and that Medicine is a high-risk industry. The biggest threat to improvement is our inability or unwillingness to accept imperfection and seek to adapt to the reality of error.
2. Our training programs need to teach medical decision-making in the context of uncertainty.
3. Our systems need designs that offer tools for avoiding error, recognizing error, and minimizing harm.
4. A more open view towards error can allow for reporting of error, investigating error, understanding error, and eventually, the development of error reduction strategies.
5. We need to evolve towards a "Safety-Conscious Culture", aware of error and determined to use our creative resources to reform health care.



6. The design of systems that deliver health care limits our ability to apply our advanced technology. The actual quality of health care relies on better system design.
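A rough arithmetic check of the "jumbo-jet" comparison in item I.B.2.c, using only the 98,000-deaths-per-year figure quoted above (a sketch; published versions of the comparison rest on varying national estimates of iatrogenic deaths):

\[
\frac{98{,}000\ \text{deaths/year}}{365\ \text{days/year}} \approx 268\ \text{deaths/day}
\quad\Rightarrow\quad 2 \times 268 \approx 537 \approx 3 \times 179\ \text{deaths per "crash".}
\]

Since a fully loaded jumbo jet carries considerably more than 179 passengers, the comparison is best presented to learners as an order-of-magnitude illustration rather than an exact equivalence.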

1.3 Teaching Methodology A one hour introduction by lecture, and/or recommended reading. Ideally, changes in attitude and culture begin by changes in behavior and attitudes in recognized leaders and role-models. Mentees will follow strong leaders. Leaders who are able to acknowledge their own errors, demonstrate how to address error, and actively reform their own system will demonstrate the principles needed to change the culture of medicine. These principles can be demonstrated in Mortality and Morbidity conferences, Quality Assurance activities, and daily teaching sessions. Many of the concepts from Module 1 can be introduced by viewing the following video: Beyond blame: solutions to America’s other drug problem [videotape]. Solana Beach, CA:Bridge Medical Inc.; 1997. Refer to: http://www.bridgemedical.com/beyond_blame.shtml. This is a powerful video of interviews with three health care professionals who were involved in fatal medication errors.

1.4 Recommended Reading Leape LL. Error in medicine. JAMA. 1994;272(23):1851-1857. Blumenthal D. Making medical errors into “medical treasures” [editorial]. JAMA. 1994;272(23):1867-1868. CITATIONS 1. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991;324(6):370-376. 2. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Institute of Medicine. Washington, D.C.:National Academy Press;2000. 3. Leape LL. Error in medicine. JAMA. 1994;272(23):1851-1857. 4. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust. 1995;163:458-471. 5. Gawande AA, Thomas EJ, Zinner MJ, et al. The incidence and nature of surgical adverse events in Colorado and Utah in 1992. Surgery. 1999;126(1):66-75.



6. Fleming ST. Complications, adverse events, and iatrogenesis: classifications and quality of care measurement issues. Clin Perform Qual Health Care. 1996;4(3):137-147. 7. Donchin Y, Gopher D, Olin M, et al. A look into the nature and causes of human errors in the intensive care unit. Crit Care Med. 1995;23(2):294-300. 8. Bedell SE, Deitz DC, Leeman D, et al. Incidence and characteristics of preventable iatrogenic cardiac arrests. JAMA. 1991;265(21):2815-2820. 9. Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients: results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324(6):377-384. 10. Leape LL. The preventability of medical injury. In: Bogner MS, ed. Human Error in Medicine. New Jersey:Lawrence Erlbaum Assoc, Inc.;1994:13-25. 11. Chisholm CD, Collison EK, Nelson DR, et al. Emergency department workplace interruptions: are emergency physicians "interrupt-driven" and "multitasking"? Acad Emerg Med. 2000;7(11):1239-1243. 12. Croskerry P. The feedback sanction. Acad Emerg Med. 2000;7(11):1232-1238. 13. Shepherd A, Kostopoulou O. Fragmentation in care and the potential for human error. In: Johnson C, ed. Proceedings of the First Workshop on Human Error and Clinical Systems. Glasgow Accident Analysis Group Technical Report G99-1. Glasgow: Glasgow Accident Analysis Group; 1999. 14. Pilpel D, Schor R, Benbassat J. Barriers to acceptance of medical error: the case for a teaching programme. Med Educ. 1998;32(1):3-7. OTHER RECOMMENDED REFERENCES Weingart SN, Wilson RM, Gibberd RW, et al. Epidemiology of medical error. BMJ. 2000;320:774-777. http://www.npsf.org. The website for the National Patient Safety Foundation is a valuable reference, including links to other safety websites.



2 Module 2. Definitions and Models of Error

2.1 Goals and Objectives

6. Learn the basic definitions for error, failure of execution, failure of planning, adverse event, and near miss. Recognize that these definitions are evolving as we learn more about error.
7. Learn and understand Reason's "Swiss-cheese" model of error.
8. Define active and latent error. Be able to explain in what ways they differ (visibility, proximity to error).
9. Understand and be able to describe the multifactorial nature of medical error.
10. Become familiar with different models of safety. Apply different models to an error analysis to find factors that contribute to medical error.
11. Demonstrate an analysis of a medical mishap. Apply the knowledge from this presentation to identify and analyze error and form strategies to prevent error.

2.2 Content Outline

II. The Science of Error: Begin the section with the presentation of a case, allowing the audience to briefly discuss the cause(s).

An experienced nurse is caring for a patient with mild epigastric pain. The patient is relatively young and not particularly ill. The physician has ordered an intravenous dose of Pepcid and hydration for symptomatic relief of what he believes is a mild exacerbation of a peptic ulcer. The nurse reaches in the cabinet for the Pepcid in a par stock of items kept for routine use in the department. She administers the medication in the usual dose as ordered. Moments later the patient is found unresponsive. Investigation uncovers the error: the patient inadvertently received the paralytic Pavulon. The nurse is terminated. What was the error? What action(s) should be taken?

A. Basic Definitions1-4
1. Error: Failure of a planned action to be completed as intended or use of a wrong plan to achieve an aim,1 or a human action that fails to meet an implicit or explicit standard.4
a) Error of execution: failure of a planned action to be completed as intended.
b) Error of planning: use of a wrong plan to achieve an aim.
2. Adverse event: an injury resulting from a medical intervention; generally implies harm independent of the disease or condition itself. (Sometimes applied more generally to failure to apply a standard of care to prevent harm or intervene in a disease process when such treatment or intervention is appropriate.)
3. Active error: An error that occurs at the level of the frontline operator, usually visible, usually attributable to a human, whose effects are frequently immediate. Active errors are most commonly named as primary causes of harm, since they are visible and proximate to the moment of harm.
4. Latent error: Problems in an organization's design, maintenance, or training that exist prior to any particular moment of harm and that set the conditions that make harm more likely. These problems are less visible than active error but may nonetheless contribute to the likelihood of injury.
5. Near-miss: An action that set into motion conditions that might otherwise have caused harm, but did not. Recognition of near-misses can identify likely sources of future harm.
B. Models of Safety and Error
1. Reason's "Swiss-cheese" model5,6:
a) Describes active and latent error.
b) Recognizes human and system causes of error.
c) Describes characteristics of high-reliability organizations:
(1) They acknowledge and plan around human variability and fallibility.
(2) They anticipate the worst and are organized to deal with failure.
(3) Planning for failure helps avoid harm when failures occur.
2. Vincent organizational accident model: evaluates the task, the team, the work environment, and the organization.8
3. Haddon injury prevention matrix: examines the host, the vector, and the environment in three phases: pre-event, event, and post-event.9,10
4. Helmreich: Crew Resource Management. Focuses on teamwork, interpersonal interactions, and management of the team.11,12
5. Failure Modes and Effects Analysis (FMEA): Prospectively examines the likelihood of failure and designs (or redesigns) the system to minimize the potential for harm.13-15 Should an injury or failure occur, the FMEA is reviewed to determine which factors functioned as intended and what was wrong with the system design.13-15 (A minimal scoring sketch appears at the end of this outline.)
6. No single model has been accepted for Medicine.
(1) The concept of safety in Medicine is relatively new and evolving.
(2) Medicine is a complex system; it is difficult to isolate one model that works for all types of error.
(3) Many of the models overlap.
(4) Many of the safety models are based in disciplines outside of medicine, so few people are familiar and comfortable with them.
(5) For now, it is useful to view multiple models and see what they offer to our setting. Perhaps they are all relevant for different problems and different cases.
C. Analyzing adverse events and medical mishaps
a) Root cause analysis.7 Currently the accepted (and required) method of analyzing medical harm in hospital settings.



b) Adverse event analysis should include:
(1) Recognizing the human component of error
(2) Identifying the system/latent component of error
(3) Planning change to prevent harm
(4) Tracking changes to confirm that suggested reforms have been made and that they impact risk reduction
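To make the prospective scoring behind FMEA (item B.5 above) concrete, here is a minimal sketch. It uses the risk priority number (severity x occurrence x detectability) common in classic industrial FMEA; the VA healthcare adaptation (HFMEA, cited above) uses a different hazard-scoring matrix. The process steps, failure modes, and ratings below are hypothetical illustrations, not data from the curriculum or its references.

    # Minimal FMEA-style scoring sketch. Assumptions: 1-10 ordinal scales and
    # classic RPN = severity x occurrence x detectability; all values hypothetical.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        step: str           # process step being analyzed
        failure: str        # how that step could fail
        severity: int       # 1 (negligible harm) to 10 (catastrophic harm)
        occurrence: int     # 1 (rare) to 10 (frequent)
        detectability: int  # 1 (almost always caught) to 10 (almost never caught)

        def rpn(self) -> int:
            # Risk priority number: higher scores get redesign attention first.
            return self.severity * self.occurrence * self.detectability

    # Hypothetical failure modes for the medication-stocking case in this module.
    modes = [
        FailureMode("Stock ED medication cabinet", "look-alike vials stored side by side", 9, 4, 7),
        FailureMode("Select and administer IV drug", "wrong vial chosen under time pressure", 9, 3, 5),
        FailureMode("Write medication order", "dose chosen without a current weight", 7, 5, 4),
    ]

    # Rank prospectively, before any harm occurs, so redesign targets the riskiest steps.
    for m in sorted(modes, key=lambda fm: fm.rpn(), reverse=True):
        print(f"RPN {m.rpn():3d}  {m.step}: {m.failure}")

In a teaching session, learners could be asked to re-score the failure modes after a proposed redesign (for example, separating look-alike vials) and observe how the ranking changes.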

2.3 Teaching Methodology Content can be covered in a one hour interactive small group setting. Begin with presentation of a sample case. Review the content above and then re-analyze the case presented at the beginning, allowing the audience to expand the investigation to include a more thorough search for contributing causes. (In this case, medications were stocked in alphabetical order, without regard to nature of drug. Pepcid was side-by-side with Pavulon. A good investigation will uncover other system problems. Allow the group to be creative. Vary the case to include a variety of problems and solutions.) If time allows, take a more detailed case and allow the audience to review a wider variety of issues. Break into small groups, giving each group a different model to apply to an adverse event (Reason/Vincent, Haddon, Helmreich, and FMEA). After a few minutes, have the groups share their results. Did the method of analysis affect the conclusions reached? Allow time to include strategies for solving problems that are identified. How could an individual make such error less likely? How can the system be designed to make such error unlikely? Can error be made more visible, more likely to be detected, and more likely to be rescued?

2.4 Recommended Reading Reason J. Human error: models and management. BMJ. 2000;320:768-770. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998;316:1154-1157. Brasel KJ, Layde PM, Hargarten S. Evaluation of error in medicine: application of a public health model. Acad Emerg Med. 2000;7:1298-1302.

CITATIONS 1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Institute of Medicine. Washington, D.C.: National Academy Press; 2000. 2. Handler JA, Gillam M, Sanders A, et al. Defining, identifying, and measuring error in emergency medicine. Acad Emerg Med. 2000;7(11):1183-1188.


3. Meurer S. Patient Safety Term Glossary [e-mail attachment]. The Patient Safety Discussion Forum <PATIENTSAFETY-@LISTSERV.NPSF.ORG>, item #1728 (August 22, 2001). Available at: http://patientsafety-l@listserv.npsf.org/SCRIPTS/WA-NPSF.EXE?A2=ind0108&L=PATIENTSAFETY-L&P=R11365. Accessed June 3, 2002. 4. Senders JW, Moray NP. Human Error: Cause, Prediction, and Reduction. New Jersey: Lawrence Erlbaum Associates; 1991. 5. Reason J. Human error: models and management. BMJ. 2000;320:768-770. 6. Reason J. Managing the Risks of Organizational Accidents. Vermont:Ashgate Publishing Company; 1997. 7. Joint Commission on Accreditation of Healthcare Organizations. What Every Hospital Should Know About Sentinel Events. N.p.;2000. 8. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998;316:1154-1157. 9. Brasel KJ, Layde PM, Hargarten S. Evaluation of error in medicine: application of a public health model. Acad Emerg Med. 2000;7(11):1298-1302. 10. Haddon W Jr. A logical framework for categorizing highway safety phenomena and activity. J Trauma. 1972;12(3):193-207. 11. Helmreich RL, Schaefer H. Team performance in the operating room. In: Bogner MS, ed. Human Error in Medicine. New Jersey:Lawrence Erlbaum Associates; 1994:225-253. 12. Helmreich RL. On error management: lessons from aviation. BMJ. 2000;320:781-785. 13. Hazard analysis models and techniques. In: Leveson NG. Safeware: System Safety and Computers: a Guide to Preventing Accidents and Losses Caused by Technology. Massachusetts:Addison-Wesley Publishing Company, Inc.;1995:313-358. 14. VA National Center for Patient Safety. Healthcare Failure Mode and Effects Analysis Course Materials (HFMEA™). Available at: http://www.patientsafety.gov/HFMEA.html. Accessed June 2, 2002. 15. DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using health care failure mode and effect analysis™: The VA National Center for Patient Safety's Prospective Risk Analysis System. The Joint Commission Journal on Quality Improvement. 2002;27(5):248-267.



3 Module 3. Cognitive Error and Medical Decision-Making

Introduction: Emergency medicine is a discipline dense with decision-making. The Harvard Practice Study concluded that diagnostic errors (flawed decisions) are common in emergency practice.1 How can we address this? How do we improve our decisions? Much of medical decision-making is undefined and poorly understood. We speak of intuition, 'gestalt', and clinical judgment. We think, observe, probe, and then conclude in a largely abstract and invisible manner until we come to a 'clinical impression'. We feel strongly about our conclusions and possess a firm conviction of our viewpoint. We argue, cajole, and debate with consultants to convince them of our opinion. We work to persuade patients to agree to our plan since we are sure we know what is best for them. And then, we err. We don't know why or how or what to do differently next time. We relive tense clinical moments and tough decisions, we speak with peers whom we respect, we study again, and we hope we learn and will do better next time. For past generations, perhaps this was enough. The cultural perception that clinicians are knowledgeable, highly trained, responsible, and caring professionals dedicated to doing their best was sufficient to engender trust and respect. The reality, however, is that harm occurs despite this, and more often than we care to admit. The old question "How many lives does it take to train a good chief resident?" admits to harm from inexperience, the price we all pay in the indoctrination of medical training. Can we do better? Is there a science to decision-making?

3.1 Goals and Objectives

12. Understand that a significant amount of error in Emergency Medicine arises from cognitive error and flawed decision-making. Although retrospective critiques of medical decisions are subject to hindsight bias, significant improvements in health care can be achieved by optimizing medical reasoning.
13. Learn the five steps in medical decision-making in the hypothetico-deductive model.
14. Learn basic statistics that are useful in selecting and interpreting diagnostic tests.
15. Define and understand prevalence, sensitivity, specificity, pretest probability, posterior probability, and Bayesian analysis.
16. Learn how "specialty bias" leads clinicians of different specialties to have different perspectives of disease and differing approaches.
17. Learn the concept of "thresholds" and how it influences diagnostic and treatment endpoints.
18. Learn definitions of "heuristics" and "cognitive bias" and how these concepts influence medical decisions.
19. Learn about strategies for minimizing harm from cognitive error.
20. Learn and appreciate how system design can affect cognitive error and explore what system changes can optimize decision-making.
21. Understand what "cognitive forcing strategies" and "forcing functions" are and how they can be employed to prevent some types of error.
22. Demonstrate understanding by applying these concepts to a medical error you have observed.

3.2 Content Outline

III. THE SCIENCE OF MEDICAL DECISION-MAKING
A. Awareness of our Thought Process in Decision-making: Metacognition2
1. Most of the focus during early medical training is spent studying disease and pathophysiology.
2. The skills required to apply that knowledge base are presumed to be acquired somewhat naturally during apprenticeship and mentoring.
3. The process of clinical decision-making is learned largely in clinical settings by interacting with mentors. The abstract and mostly ill-defined process of making decisions gives the appearance that decisions are highly variable and sometimes arbitrary. Senior physicians argue that their decisions are "based on my experience" and "in my best judgment", giving the student little guidance or instruction in the internal process of decision-making.
4. If we are to reduce errors in diagnosis and medical decisions, we need to recognize, define, and teach decision-making strategies. If we are to understand how we err, we must understand how we think.
B. The Hypothetico-deductive Model of medical reasoning.3,4 Describes the five steps of making a diagnosis and selecting treatment. Errors may occur in any one of these steps.
1. Hypothesis Generation: Within minutes of a patient encounter, the physician takes cues from the general appearance of the patient and attempts to determine the severity/acuity of illness, then looks for general patterns of illness. A differential diagnosis is considered, usually with one or more leading theories.
2. Hypothesis Refinement: The physician begins to gather data to test the theory. Further questions are asked; a plan for testing is made. The process of confirming and eliminating diagnoses begins.
3. Testing the Hypothesis: Having chosen a diagnostic strategy, the clinician interprets the results.


4. Causal Reasoning: Based on the information gathered, the clinician attempts to assimilate the clinical condition into one or more diagnoses.
5. Diagnostic Verification: A conclusion is reached. A working diagnosis is made and forms the basis for a management plan.
6. Errors: Understanding the steps in diagnosis can serve as a basis for understanding how and why errors occur in the process. Mistakes can occur at any of these steps.
a) Knowledge gap or inexperience
b) Failure to recognize a disease pattern
c) Misinterpretation/misapplication of diagnostic testing
d) A simple classification of error was described by Elstein5: diagnostic failure/error can occur either by (1) an inaccurate estimate of pretest probability/prevalence (which would occur during hypothesis generation), or (2) an inaccurate estimate of the strength of the evidence (hypothesis testing and hypothesis refinement).
C. Basic Medical Statistics: Normative principles that guide the selection of diagnostic tests, the interpretation of tests, and medical decision-making.6,7
1. Basic statistics are valuable in understanding and applying diagnostic tests. The probability of disease and the application of diagnostic testing can be subject to objective statistical reasoning.
2. Knowledge of basic statistics should guide the interpretation of tests and the establishment of diagnostic and treatment endpoints.
3. Important concepts to master:
a) Disease prevalence
b) Pretest probability, posterior probability, Bayes' theorem
c) Sensitivity, specificity, positive and negative predictive values
(A brief worked sketch of these relationships follows this outline segment.)
4. Errors occur when:
a) The risk of disease is inaccurately estimated because of a faulty estimate or disregard of disease prevalence.
b) There is misapplication or misinterpretation of diagnostic tests.
c) The potential value of treatment or the risk of withholding treatment is not accurately assessed. The relative value and risk of different treatments may be misunderstood.
D. Decision Analysis: Decision trees and elaborate decision analysis have been developed and applied in limited settings.8 This approach considers diagnostic and treatment options, possible outcomes, and the probability of each outcome. It has had limited application in emergency settings:
a) Difficult to consider all patient-specific considerations
b) Cannot be completed within the time constraints of patient visits in the Emergency Department
c) Not yet automated
E. Evidence-Based Medicine: the application of current medical literature to answer clinical questions.9,10 Evolving, but challenged by the practical difficulty of accessing the information in a timely enough way to influence actual patient care.
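A brief worked sketch of the test-interpretation relationships in item C. These are standard formulas consistent with the diagnostic-testing references cited above; the numerical values are illustrative assumptions, not data from any cited study. With pretest probability (prevalence) $p$, sensitivity $Se$, and specificity $Sp$, Bayes' theorem gives the post-test (posterior) probability of disease after a positive or negative test:

\[
P(D \mid T^{+}) = \frac{Se \cdot p}{Se \cdot p + (1 - Sp)(1 - p)},
\qquad
P(D \mid T^{-}) = \frac{(1 - Se)\,p}{(1 - Se)\,p + Sp\,(1 - p)}.
\]

For example, with assumed values $p = 0.10$, $Se = 0.90$, and $Sp = 0.80$, a positive test raises the probability of disease only to about $0.09 / (0.09 + 0.18) \approx 0.33$, and a negative test lowers it to about $0.01 / (0.01 + 0.72) \approx 0.01$; the same test applied at a different prevalence yields very different posterior probabilities, which is the quantitative core of items C.4.a and C.4.b. One common statement of the treatment threshold from the decision-analysis literature associated with Pauker and Kassirer (item D; this formulation is a standard one, not quoted from the curriculum) is to treat when the estimated probability of disease exceeds

\[
p^{*} = \frac{\text{net harm of treating the non-diseased}}{\text{net harm of treating the non-diseased} + \text{net benefit of treating the diseased}}.
\]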



F. Practical applications of decision analysis and evidence-based medicine are possible. There are a number of helpful shortcuts that can guide bedside management.
1. A thoughtful review of the interpretation and use of common tests can be done through study outside the clinical arena and applied on the spot clinically without extensive analysis.
2. Although elaborate decision analysis is impractical in the clinical arena of a busy ED, the same knowledge and skills can be applied by experts to develop treatment guidelines and algorithms that can be used to simplify decision-making at the bedside.
3. Consensus statements, "best-practice" guidelines, prediction rules, and clinical decision rules help simplify common decisions.
G. Specialty Bias: Each medical specialty has its own rules that may differ from other disciplines. These approaches are valid for the defined patient population and clinical setting of each discipline. Over time we have even evolved separate cultural identities. We act differently, speak differently, and even dress differently. We serve different functions for different patients in different settings at different stages of their lives. Our patients are filtered to us by different mechanisms with a variety of screens. These differences affect our pattern of practice and our methods for decision-making. We should note how these differences affect our style of decision-making, and our biases.
1. The approach to patients in the Emergency Department is affected by acuity, urgency, and indecision.
2. The "Emergency Medicine approach" to hypothesis generation is driven by the need to make sure that nothing "bad" is present.
3. Our aim is not so much to be "accurate" as to come to a safe endpoint. We ignore statistics on disease prevalence to rule out the less likely but more threatening risks.
4. These characteristics affect all stages of the diagnostic workup and decision-making in Emergency Medicine. Although this approach is widely accepted, we should recognize our own bias, how it differs from other disciplines in Medicine, and in what way it poses its own risk.
H. The Concept of Thresholds3: In Emergency Medicine, clinicians function in the face of uncertainty, acuity, and time pressures. Most decisions must be made rapidly and often with incomplete data. There may not be sufficient time or data to form strong endpoints or diagnostic conclusions. In the face of uncertainty and time constraints, we rely on less formal diagnostic criteria, driven not by decisive diagnoses but rather by thresholds.
1. Threshold to Test: When the risk of harm from a missed diagnosis is high and our ability to predict that condition on clinical grounds alone is poor, we may have a low threshold to test, even when we believe the condition does not exist. E.g. the current standard for admitting low-risk chest pain; obtaining a CT scan in a well-appearing patient with a bad headache to rule out a sentinel bleed from an aneurysm.



2. Threshold to Treat: When the consequences of an untreated condition are bad and the risk of significant complications of treatment is low, we may choose to treat prior to establishing a definitive diagnosis.
3. Threshold to Admit: When a patient's condition merits it, the Emergency Medicine endpoint may be a decision to admit. Simply put, the patient is sick. We may reach a diagnostic endpoint; we may not. We don't necessarily reach a diagnostic endpoint prior to reaching a disposition endpoint. This does not minimize our role in triage; sometimes the patient is better served by not stating a definitive diagnosis.
4. Threshold to Discharge/Refer: We don't always reach a final or definitive diagnosis. After ruling out severity and acuity, left with a nonacute or nonspecific process, we may conclude that there is nothing to be gained by further ED investigation. We reach an endpoint to discharge with follow-up care or referral.
5. We choose to optimize risk for the patient, with less regard for diagnostic accuracy.
6. Unlike some disciplines, we must reach some immediate endpoint. We are forced to declare a diagnosis, or back down and simply declare a disposition.
I. Heuristics.2,3,5,11,12 Two main characteristics set Emergency Medicine practice apart from most medical specialties: uncertainty and time-pressured decision-making. To meet the demand to act decisively, clinicians rely on heuristics (shortcuts and rules of thumb) to guide many clinical decisions. This skill is a strength in some settings but can be one source of error. The ability to act quickly in an emergency is a benefit afforded by heuristics; however, the application of heuristics can be flawed and lead to errors.
1. Algorithmic approaches are one example. Simplified schemes can help clinicians function quickly and decisively when stressed. Team members may function better when the algorithm is 'shared knowledge', such as ACLS. However, the heuristic approach can fail if it does not address a significant problem. E.g. cardiac arrest due to tension pneumothorax might be missed by routine ACLS protocols.
2. Simple "if, then" rules help guide reactive behavior in stressful conditions. They should serve to guide initial actions, but be replaced by more thoughtful action as time allows.
J. Cognitive Science: Cognitive psychologists have attempted to understand normal human cognition and classify cognitive errors.2,4,5,11-14 Can they help us understand how we make errors in medicine? Some errors occur because of faulty information processing, explained in part by cognitive biases. Faulty cognitive patterns have been observed in everyday life, are a part of being human, and may impact clinical decision-making. In the face of unknown probabilities, unlimited diagnostic possibilities, and nonspecific disease presentations, we function largely in the realm of uncertainty. Cognitive psychology looks at how people perceive probability and how our own intuition can mislead us. The science of cognitive psychology and the understanding of cognitive bias can help us understand how we think and how we err. Awareness of these cognitive biases may help avoid some errors in medicine.



1. Availability Bias2,5,11,12,14: The likelihood of a specific diagnosis will be perceived to be greater if specific instances of that condition are easily brought to recall (e.g. a recent lecture or conference on that topic). This is even more so if there is "salience" to the recall, that is, a dramatic or personal experience with a condition. In our institution, the workup of patients has been anecdotally observed to vary after intense M&M discussions of missed diagnoses. None of these factors alters the likelihood of disease in any given patient, yet they do influence our management of individual patients.
2. Representativeness Bias2,5,11,12,14: The judgment that a condition exists because of the way the presentation resembles that condition. Based on a clinician's experience, the closer a presentation resembles a pattern, the more likely the clinician will pursue that diagnosis. The limitation of this bias is that it fails to consider prevalence and prior probability. The teaching adage, "If it looks like a duck, walks like a duck, and quacks like a duck…it must be a duck," will fail you if you live some place where ducks have not been spotted for many years. A common teaching acknowledges this principle: an atypical presentation of a common condition is more likely than a typical presentation of an uncommon condition.
3. Confirmation bias2,5,14: Clinicians tend to seek information to confirm their initial impression, even though that data may not alter the probability of disease. Physicians may be misled towards thinking that a series of similar tests adds diagnostic certainty. They might be better served by testing to exclude other possible, perhaps also likely, conditions.
a) Failure to do so is also referred to as "vertical thinking"; "horizontal thinking" expands on diagnostic options and is more effective at ruling out other conditions.
b) A type of confirmation bias is commonly seen in young medical students in Emergency Medicine. Given an undifferentiated patient with chest pain, they may ask a series of questions seeking information about cardiac disease. Their "chest pain history" may be overly focused on eliciting angina and acute coronary syndromes. Regardless of how the patient answers the questions, if only questions about angina are asked, even one or two positive responses will end with a concern for heart disease. A few questions to rule out alternative diagnoses will better serve the clinician.
4. Search Satisficing Bias2,15: Finding one abnormality, the clinician fails to search for others.
a) E.g. A car approaches a railway intersection. The red lights are flashing, the guardrails are down, and the driver notes a train approaching from the right. After a long delay, the train passes, but the warning signals persist. Impatient from the delay, the driver maneuvers around the rails, past the flashing signals, and is struck by an unexpected train coming from the other direction.
b) Common examples in Medicine include failing to note a second fracture, distracted by the first; failing to search for co-ingestants in poisonings, etc.16 Having found one abnormality, it is common to let one's guard down and stop searching for or noticing other abnormalities.
5. Anchoring Bias2,12: When estimating the probability of a given event, people usually begin with an estimate, a starting point, and then adjust or refine their estimate based on additional information. The endpoint is influenced by the starting point.
a) When students are asked to estimate the final product of a series of numbers, students starting with the larger numbers will end with a larger final product estimate than those who started with the smaller numbers first, even when the final products are actually the same.12
b) In Medicine, this concept has been applied to initial diagnostic impressions, when clinicians first assess the diagnostic possibilities in a patient. Within seconds to minutes, the clinician makes some estimate of disease severity and acuity and derives leading diagnostic considerations. This initial impression ultimately influences the final diagnosis.
c) The ability to focus and anchor early in patient assessment is viewed as a desirable trait in Emergency Medicine; however, it can lead to predictable flaws and errors if the clinician fails to recognize and adjust when subsequent information and clinical data contradict a wrong first impression.
6. Premature diagnostic closure3: Reaching a diagnostic endpoint with unfounded certainty, then failing to assimilate additional data that contradict that endpoint.
a) E.g. A patient presents with wheezing, is "labeled" asthma, and improves with beta-agonist therapy. A focused history fails to expand on night-time symptoms (orthopnea) and progressive exertional dyspnea. The physician performs an exam with an intent to confirm findings consistent with asthma and fails to notice JVD, an S3, and peripheral edema. A chest x-ray is done to rule out an infiltrate and shows cardiomegaly. An overly focused approach leads to anchoring on the wrong diagnosis. The patient improves on asthma treatment and is sent home, but returns in worsening CHF a day later. When done in a time-pressured environment, anchoring may lead to premature diagnostic closure.
b) Premature diagnostic closure can also occur when the clinician seeks the most available or most convenient diagnosis.
7. Zebra retreat4,17: A diagnosis is considered but not pursued because the physician may wish to avoid an unfamiliar diagnostic path.
8. Omission bias5,18: Physicians tend to favor inaction when risks are great, even if the risk of not acting may be greater. If treatment is risky, we tend to favor letting the natural disease state cause harm rather than risk causing harm ourselves.
9. Outcome bias5,19: The quality of a decision is often judged by the clinical outcome. In fact, bad decisions may result in good outcomes, and good decisions can result in bad outcomes. Judging the quality of decisions based on outcome may be intuitively appealing, but it is flawed and biased. This is one factor in hindsight bias.20 We are more critical of decisions when the outcome is known to be poor.



10. Prevalence bias2,21: A mistake based on misjudging the true base rate of a disease. Occurs when one weights a diagnostic possibility more heavily than its likelihood in the population warrants. Analogous to searching for a zebra in a herd of horses. Could also apply to failing to consider a diagnosis that is more likely in favor of one less prevalent in the general population of patients with similar characteristics.
11. Conjunction fallacy2,11: The likelihood of two or more independent conditions occurring together is overestimated by mistakenly linking them in a cause-and-effect relationship.
K. Avoiding Cognitive Error: Each of the cognitive biases can account for error. However, at times the strength of heuristics and cognitive bias can help us function. The ability to focus and anchor early with the impression of an acute MI allows us to proceed definitively towards thrombolytics. Our recognition of that bias can keep us vigilant in looking for alternative diagnoses.16 How do we address cognitive errors?16
1. Reduce cognitive load.
a) Simplified diagnostic and treatment protocols drawn up in advance and used systematically.
(1) Drip books for resuscitation medications to avoid calculations at the bedside under stressful conditions.
(2) Institutional and interdepartmental guidelines for the workup of difficult, problematic, or common conditions, especially those crossing specialty boundaries. E.g. an approach to young women with RLQ pain; an approach to the diagnostic evaluation of suspected pulmonary emboli or aortic dissections; conditions where there are multiple appropriate tests and multiple possible consultants; conditions where standards of care are rapidly changing and treatment is controversial.
(3) Algorithms: e.g. ACLS, ATLS.
(4) Clinical practice guidelines, consensus statements.
b) Electronic templates: to elicit pertinent history and physical findings of high-risk conditions.
c) Memory devices: e.g. a palm pilot for drug interactions.
d) Special resources: e.g. PoisonIndex, Poison Control, available specialty consultation.
2. Teamwork: shared responsibility across the team, with team members cross-checking one another.
3. Cognitive Forcing Strategies
a) Recognition of common errors and cognitive biases. Efforts to recognize and act with awareness of common errors and measures to avoid error.
b) Recognition of high-risk patients, high-risk settings, high-risk moments, and high-risk diagnoses. (See Module 4.)
c) Cardinal rules of Emergency Medicine, the pitfalls to avoid. E.g. When one fracture is identified, always look for a second. If a woman has abdominal pain, always check for pregnancy.
d) Computer systems with locking functions.



(1) Pediatric drugs are not given without knowing the current weight of the child.
(2) Nephrotoxic drugs are limited when the creatinine is abnormal, unless the clinician acknowledges the creatinine value and confirms the desire for the drug.
(3) The list of allergies must be current; a medication is not released if there is a known allergy.
(4) Pharmacy gives information regarding medication interactions to the physician, particularly if it influences current management.
(5) System design to help decision-making: future developments.
(A minimal sketch of one such forcing function appears at the end of this outline.)
4. Find and correct system problems that contribute to error-prone settings. Some cognitive errors are really system errors. Decisions made with faulty data, incomplete data, and miscommunication are better approached by system design.
5. Optimize human performance factors.
a) Affective error: Patients may elicit a visceral bias in staff. Patients who appear 'difficult' or 'demanding' can be perceived as less deserving of care. It is natural for some patients to annoy, anger, or frustrate the staff. Clinicians must be aware of this visceral response to avoid allowing affective error to influence clinical judgment. At times, individual clinicians may demonstrate a consistent pattern of reacting to specific types of patients. When affective bias is found to influence clinical judgment, specific measures need to be taken to advise and counsel the clinician.
b) Personal impairment: Health care systems should be set up to recognize and address the needs of clinicians themselves. Fatigue, shift work, excessive demands, grief, and substance abuse can all impact clinicians. Individual systems should recognize the risk that impaired clinicians pose and implement strategies to care for the care providers themselves.
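To illustrate what the computerized forcing functions in item K.3.d might look like in practice, here is a minimal sketch. The order fields, drug lists, and creatinine threshold are hypothetical, and real order-entry systems are far more involved; the point is only that the system refuses to release an order until the required information is supplied or the clinician explicitly acknowledges the risk.

    # Minimal forcing-function sketch for a hypothetical order-entry check.
    # Assumptions: order data arrive as a simple dict; the drug lists and the
    # creatinine threshold below are illustrative only.

    PEDIATRIC_WEIGHT_BASED = {"gentamicin", "vancomycin"}    # hypothetical list
    NEPHROTOXIC = {"gentamicin", "vancomycin", "ibuprofen"}  # hypothetical list

    class OrderBlocked(Exception):
        """Raised when a forcing function stops an order from being released."""

    def release_medication(order: dict) -> str:
        drug = order["drug"].lower()

        # Forcing function 1: pediatric weight-based drugs require a current weight.
        if order.get("age_years", 999) < 12 and drug in PEDIATRIC_WEIGHT_BASED:
            if order.get("weight_kg") is None:
                raise OrderBlocked("Current weight required before releasing a pediatric dose.")

        # Forcing function 2: nephrotoxic drugs with an abnormal creatinine require
        # explicit acknowledgement by the ordering clinician.
        if drug in NEPHROTOXIC and order.get("creatinine", 0.0) > 1.5:
            if not order.get("creatinine_acknowledged", False):
                raise OrderBlocked("Abnormal creatinine must be acknowledged before release.")

        # Forcing function 3: never release a drug on the patient's allergy list.
        if drug in {a.lower() for a in order.get("allergies", [])}:
            raise OrderBlocked("Ordered drug appears on the allergy list; order not released.")

        return f"{order['drug']} released to nursing for administration."

In use, the order-entry or pharmacy workflow would call release_medication() and surface any OrderBlocked message back to the ordering clinician, rather than silently dispensing the drug.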

3.3 Teaching Methodology In general, most of this content can be introduced in a one hour lecture. Specific cases can be used to demonstrate medical decision making and illustrate cognitive bias. The student is urged to recall specific errors and identify cognitive biases. A problem-solving session can be added when students can identify particular types of cognitive bias and strategies for minimizing harm. Students/fellows can be encouraged to pursue further work to find ways to design a safer environment to minimize cognitive load and avoid cognitive failure.

3.4 Recommended Reading

Kassirer JP, Kopelman RI. Learning Clinical Reasoning. Baltimore: Lippincott Williams & Wilkins;1991. A highly readable text reviewing clinical decision-making, using cases to demonstrate error. This should be a part of every medical student's library.



Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med. 2000;7(11):1223-1231. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999;6(9):947-952. Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med. 1999;74(7):791-794. CITATIONS 1. Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients: results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324(6):377-384. 2. Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med. 2000;7(11):1223-1231. 3. Kassirer JP, Kopelman RI. Learning Clinical Reasoning. Baltimore;Williams & Wilkins;1991. 4. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999;6(9):947-952. 5. Elstein AS. Heuristics and biases: selected errors in clinical reasoning. Acad Med. 1999;74(7):791-794. 6. The use and interpretation of diagnostic tests. In: Kassirer JP, Kopelman RI. Learning Clinical Reasoning. Baltimore:Williams & Wilkins;1991:17-28. 7. McNeil BJ, Keeler E, Adelstein SJ. Primer on certain elements of medical decision making. N Engl J Med. 1975;293(5):211-215. 8. Pauker SG, Kassirer JP. Decision analysis. N Engl J Med. 1987;316(5):250-258. 9. Sackett DL, Straus SE, Richardson WS, et al. Evidence-Based Medicine: How to Practice and Teach EBM. 2nd ed. New York,NY:Livingstone; 2000. 10. Corrall CJ, Wyer PC, Zick LS, et al. Evidence-based emergency medicine. How to find evidence when you need it, Part 1: databases, search programs, and strategies. Ann Emerg Med. 2002;39(3):302-306. 11. Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. Cambridge University Press; 1982. 12. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185:1124-1131.



13. Redelmeier DA, Ferris LE, Tu JV, et al. Problems for clinical judgment: introducing cognitive psychology as one more basic science. Can Med Assoc J. 2001;164(3):358-360. 14. Dawson NV, Arkes HR. Systematic errors in medical decision making: judgment limitations. J Gen Intern Med. 1987;2:183-187. 15. Simon HA. Reason in Human Affairs. London:Basil Blackwell; 1983. 16. Croskerry P. Cognitive Forcing Strategies in Clinical Decision Making. Presented at the annual meeting of the Society for Academic Emergency Medicine; May 8, 2001; Philadelphia, PA. 17. Croskerry P. Avoiding pitfalls in the emergency room. Can J Contin Med Educ. 1996;Apr:1-10. 18. Elstein AS, Holzman GB, Ravitch MM, et al. Comparison of physicians' decisions regarding estrogen replacement therapy for menopausal women and decisions derived from a decision analytic model. Am J Med. 1986;80(2):246-258. 19. Gruppen LD, Margolin J, Wisdom K, et al. Outcome bias and cognitive dissonance in evaluating treatment decisions. Acad Med. 1994;69(10 Suppl):S57-S59. 20. Caplan RA, Posner KL, Cheney FW. Effect of outcome on physician judgments of appropriateness of care. JAMA. 1991;265(15):1957-1960. 21. Tversky A, Kahneman D. Availability: a heuristic for judging frequency and probability. Cognit Psychol. 1973;5:207-232.



4 Module 4. Learning From the Experience of Others

4.1 Goals and Objectives

23. Recognize that much of medicine can be learned by sharing the experiences of others.
24. Learn the types of errors reported commonly by others in your practice setting.
25. Know and be able to list high risk diagnoses, high risk patients, high risk settings, and critical actions.
26. Learn types of injury patterns that are frequently missed or difficult to detect.
27. Learn the most frequent causes of malpractice cases and the type of harm that society views as negligent.
28. Learn, adopt, and demonstrate a lifelong habit of considering the pitfalls of diagnosis and management for each condition you see, the pathophysiology of error.

4.2 Content Outline

Experienced clinicians report types of cases that are commonly missed or difficult, and errors that lead to harm. This experience should be shared as a valuable knowledge base. We should not all have to learn the same lessons by making the same errors.

IV. Common Errors: Learning from the Experience of Others
A. High Risk Diagnoses not to miss. Diagnoses that must be established in the ED to avoid harm from delay. This is a list we teach our residents. These are the "must know" diagnoses of Emergency Medicine.
1. Acute MI, especially difficult with atypical chest pain and nondiagnostic EKGs.
2. Pulmonary embolus
3. Aortic disasters: aneurysms, dissections, trauma.
4. Pericardial tamponade
5. Tension pneumothorax
6. Acute upper airway obstruction, either impending or from conditions that have the potential to obstruct.
7. Acute arterial insufficiency, embolic or thrombotic, including mesenteric ischemia and acute emboli to the extremities.
8. Acute compartment syndrome, with fractures or crush injury.
9. Acute neurological deficits, especially if progressive. Includes evolving strokes, subarachnoid hemorrhage, intracranial hemorrhage, increased intracranial pressure with tumor, edema, pseudotumor, or hydrocephalus, as well as medical causes such as Guillain-Barré syndrome and botulism.
10. Acute spinal cord injuries; cord compression.
11. Intraabdominal or retroperitoneal hemorrhage.
12. Perforated viscus.
13. Life-threatening infections: meningitis, sepsis, Fournier's gangrene, necrotizing fasciitis.
14. Testicular and ovarian torsion.
15. Ectopic pregnancy.
16. Metabolic emergencies: hyperkalemia, hypoglycemia.
17. Acute adrenal insufficiency.
18. Acute narrow-angle glaucoma.
B. Critical Actions Not to Miss
1. Antibiotics in the septic patient
2. Steroids in the hypotensive steroid-dependent patient
3. Thrombolytics (or an alternative) in acute MI
C. High Risk Patients
1. Extremes of age: the very young and very old.
2. Asplenic, neutropenic, or immunocompromised patients.
3. Anyone with altered mental status (delirium, dementia).
D. High Risk Moments
1. Transitions between shifts, new teams.
2. System in crisis: a change in the lab, a shortage of techs, times when the system is likely to fail.
3. July, in academic centers.
4. Patient transfers: to other departments, between institutions.
E. Common errors based on Injury Patterns and Special Problems
1. Clenched fist injuries
2. Human and animal bites
3. Alkali burns
4. Hydrofluoric acid burns
5. Disk battery ingestion
6. High velocity deceleration injuries: aortic injuries
7. Injury patterns in falls: LS fractures, calcaneal fractures
8. Lapbelt injuries: duodenal hematomas, Chance fractures, pancreatic injuries, mesenteric avulsion
9. Quiet poisons: asymptomatic early on or easy to miss
a) Acetaminophen
b) Salicylates
c) Carbon monoxide
F. Common Misses in Radiology Studies: fracture patterns that are commonly missed or difficult to visualize.
1. Scaphoid fracture
2. Scapholunate dissociation
3. Radial head fracture
4. Lisfranc fracture/dislocation
5. Calcaneal and talus fractures
6. Tibial plateau fractures
7. Spinal fractures are commonly multiple. Finding one fracture should mandate a thorough search for a second.
G. Malpractice Cases, illustrating risk.1 The leading causes of malpractice cases, in order of decreasing frequency:
1. Missed fractures
2. Wound care complications, particularly foreign bodies
3. Missed MI
4. Abdominal pain
5. Missed meningitis
6. Spinal cord injury
7. Subarachnoid hemorrhage, CVA
8. Ectopic pregnancy
H. Causes of Preventable Iatrogenic Cardiac Arrest in hospitalized patients2
1. Medication errors, drug toxicity, unintended drug side effects. The most common medications involved were digoxin, antiarrhythmics (especially those that prolong the QT interval), KCl, Haldol, and lidocaine.
2. Complications of invasive procedures (see Module 5).
3. Failure to address abnormal lab values:
a) Anemia
b) Hypokalemia
c) Hypoglycemia
4. Inadequate response to a change in the patient's clinical status.
I. Physicians need to develop a lifelong approach to learning, conscious of error patterns and pitfalls to avoid. At least one Emergency Medicine text has included common pitfalls with the description of most diagnoses.3
J. Tools to help avoid errors:
1. Electronic templates
2. Computer memory aids
3. Individual system aids
4. Consensus statements
5. Practice guidelines
6. Institutional measures, such as interdepartmental guidelines to sort through difficult and problematic cases
7. M&M conferences to guide problem-solving and identify difficult diagnoses and system problems
8. Anecdotes, e.g. the Lancet series on "Uses of Error"
9. Simulations of high risk situations
10. Oral exam practice, as one method of mentally rehearsing tough clinical scenarios

4.3 Teaching Methodology

A one hour lecture on the above content will familiarize the student with common scenarios in which harm is likely.



Stressful, acute, and high risk scenarios can be taught by simulation. Although high fidelity simulation is ideal, even a lower tech version of bedside "resuscitation drills", disaster drills, and moulage can be used to recreate a realistic clinical encounter. This can be done in teams or individually. It can be acted out or visualized mentally. The reenactments can help students rehearse actions as they would like to perform them, until they have sufficient practice to feel comfortable. The instructor can intensify the experience or vary it to achieve a more sophisticated problem or difficult scenario. Common errors can be introduced to test the student. E.g. another actor in the simulation can query the student to explain their actions or challenge them to try an alternative action; system problems can be introduced to test the student's ability to detect and adapt to error-prone settings.

4.4 Recommended Reading Lancet "Uses of Error" series. Academic Emergency Medicine "Profiles in Patient Safety" series. CITATIONS 1. Freeman L, Antill T. Ten things emergency physicians should not do unless they want to become defendants. American College of Emergency Physicians Foresight: Risk Management for Emergency Physicians. September 2000:1-11. 2. Bedell SE, Deitz DC, Leeman D, et al. Incidence and characteristics of preventable iatrogenic cardiac arrests. JAMA. 1991;265(21):2815-2820. 3. Harwood-Nuss A, Wolfson AB, Linden CH, et al, eds. The Clinical Practice of Emergency Medicine. 3rd ed. Philadelphia: Lippincott Williams and Wilkins; 2001. OTHER SUGGESTED READING Ely JW, Levinson W, Elder NC, et al. Perceived causes of family physicians' errors. J Fam Pract. 1995;40(4):337-344. In their own words, what physicians think caused their errors.



5 Module 5. Complications From Invasive Procedures
5.1 Goals and Objectives
29. Understand the risks inherent in invasive procedures.
30. Learn a nine-step approach to procedures to avoid harm.
31. Demonstrate use of this knowledge by applying it in a case study.
32. Develop a life-long skill of optimizing results in procedures by minimizing complications.

5.2 Course Content
V. COMPLICATIONS FROM INVASIVE PROCEDURES
Introduction: A stable medical patient is admitted with a moderate pleural effusion, CHF, and hypokalemia (K+ 3.0). Thoracentesis is performed without apparent incident. Fifteen minutes after the procedure is completed he is found unresponsive, in cardiac arrest. The “code team” is called and “routine ACLS” is performed without success. The patient dies. The case was reviewed extensively at a Morbidity and Mortality conference and by a departmental Oversight/Quality Assurance review. All the reviews focused on failure to correct hypokalemia as the root cause of the death. A multidisciplinary committee later critiqued the case; its members were the first to ask whether anyone had suspected or evaluated the patient for a pneumothorax. No one had. What went wrong?
A. Complications from surgical and invasive procedures are a leading cause of iatrogenic injury.1
B. Medical procedures are frequently taught informally, lacking rigor and consistent standards. Many bedside procedural skills are taught by housestaff, to housestaff, in a traditional “see one, do one, teach one” approach.
C. Nine steps to minimize error and reduce the risk of complications from procedures (see the illustrative checklist sketch at the end of this section):
1. Indication: know the indication for the procedure, including alternatives and risks. Consider alternative techniques to minimize the risk for each patient.
2. Contraindications: consider the relative and absolute contraindications for the procedure.
3. Know the possible complications.
4. Preparation: prepare the patient to optimize the result.
5. Perform the procedure, applying techniques to minimize the risk of complications.
6. Post-procedure care should include steps to monitor the patient for common complications.
7. Know how to recognize a complication should one occur. This includes ensuring that ancillary staff and those who monitor the patient in your absence can recognize it as well.
8. The team responsible for the procedure must know how to act if a complication is detected.
9. The system should be designed to facilitate detection and treatment of complications.
10. There is a formal knowledge base for procedures. Students should be familiar with good basic textbooks that cover this content.2
D. Consider the case presented in the introduction. How could the individuals change their habits to avoid future complications? How could the system be changed to better detect and handle complications? In the example given, discussion with the department uncovered deficiencies in the manner in which procedures were taught and monitored. The physicians who performed the procedure were not adequately trained to do it and did not recognize the possible complications. The nurses were not part of the plan; they were not notified of any need for increased monitoring after the procedure and had no formal training to identify complications. The students should be given time to develop plans to optimize the system response to this error, including response times from radiology, better communication with “code teams”, etc.
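To make the nine steps concrete, here is a minimal sketch of how they might be encoded as a generic electronic checklist that a department could adapt per procedure. The step wording mirrors the outline above; the data structure, field names, and the thoracentesis entries are illustrative assumptions, not part of the curriculum.

# Illustrative sketch only: the nine steps from section C encoded as a
# reusable checklist. The example ProcedurePlan entries are hypothetical.
from dataclasses import dataclass, field

NINE_STEPS = [
    "Indication (including alternatives and risks)",
    "Relative and absolute contraindications",
    "Possible complications",
    "Patient preparation",
    "Technique chosen to minimize complication risk",
    "Post-procedure monitoring plan",
    "How a complication would be recognized (by you and by covering staff)",
    "How the team will act if a complication is detected",
    "System supports for detecting and treating complications",
]

@dataclass
class ProcedurePlan:
    procedure: str
    answers: dict = field(default_factory=dict)  # step index -> documented plan

    def missing_steps(self):
        """Return the steps that have not been documented yet."""
        return [step for i, step in enumerate(NINE_STEPS) if i not in self.answers]

# Example use: a partially completed plan flags the steps still undocumented.
plan = ProcedurePlan("thoracentesis")
plan.answers[0] = "Symptomatic moderate effusion; alternatives discussed"
plan.answers[5] = "Post-procedure CXR; repeat vitals q15min for 1 hour"
for step in plan.missing_steps():
    print("NOT DOCUMENTED:", step)

In the introductory case, such a checklist would have flagged the absent post-procedure monitoring plan and the absent plan for recognizing a pneumothorax before the procedure was ever performed.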

5.3 Teaching Methodology
A brief lecture will introduce students to the fact that invasive procedures are a common source of iatrogenic harm, and a nine-step process to minimize procedural complications will be reviewed. The students will then be given the chance to analyze, critique, and solve a case involving an adverse event. This content is best taught in a small-group format to facilitate discussion and interaction. Additional time could be spent reviewing individual procedures, their complications, and risk reduction strategies. A portion of this could be done as independent study, with students assigned a variety of procedures to study. Another alternative is to allow the students to take the role of a hospital administrator and plan system reform to minimize risk from procedures.

5.4 Recommended Reading
CITATIONS
54. Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients: results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324(6):377-384.
55. Roberts JR, Hedges JR, eds. Clinical Procedures in Emergency Medicine. Philadelphia: WB Saunders Company; 1985.



6 Module 6. Medical Error from a Systems Perspective
6.1 Goals and Objectives
33. Review and understand Reason’s “Swiss-cheese” model of error and the concept of active and latent error.
34. Understand how difficult latent error is to see, yet how important it is to remedy.
35. Define the “system” and how it can contribute to harm.
36. Learn and describe the characteristics of high reliability organizations.
37. Know and be able to list features of high reliability design.
38. Understand the role of teamwork in patient safety and the role of teamwork failure in medical error.
39. Understand the need for improved information networks in medicine to improve care between teams.
40. Learn the common causes of medication errors and strategies to avoid them.
41. Learn the role of equipment failure in medical error and possible solutions.
42. Learn how system design can impact risk for error.
43. Understand and list changes that are necessary to improve health care delivery in complex systems.
44. Develop a lifetime pattern of looking for and understanding error to improve the system where you work.

6.2 Content Outline
VI. THE ROLE OF THE SYSTEM IN MEDICAL ERROR
A. Models of Medical Error:
1. Review Reason’s “Swiss cheese” model and the concept of latent and active error.1
2. Deming’s red bead demonstration2
B. Medicine and Health Care Delivery: Where we are now.
1. Our “system” for health care delivery has evolved out of need.
a) Highly dependent upon individuals to “make it work”.
b) Individuals typically identify and patch broken areas of the organization.
c) The system is stressed and overloaded.


d) There are lots of separate “pods” of care, loosely connected by informal communication networks.
2. The state of Emergency Medicine:
a) High volume
b) Unpredictable workload
c) High acuity
d) Noise
e) Many distractions in a climate of loosely controlled chaos
f) Little reserve to expand in case of disaster
C. The Role of System Failure in Medical Error. Where things can (and do) go wrong.
1. Essentially any part of the system can “break” and ultimately contribute to risk and harm. On any one day there are probably a number of parts of the system that are not fully functional.
2. A lot of things can be broken (latent defects) before harm results. After a while no one views these broken pieces as unusual. Humans develop behaviors to compensate for the chronic system flaws. Over time, we hardly notice them.
3. Since the “brokenness” is common and usually distant from the sharp point of error and the moment of harm, it is easy to dismiss the relationship of system problems to any given point of harm.
4. Health care providers are highly responsible, conscientious, and quick to accept blame. The medical “culture” promotes individual accountability and is suspicious of anyone looking beyond the sharp point to find other causes, as if looking for someone or something else to share the blame.
5. The more distant a system flaw is from the sharp point, the more likely it contributes to a wider population at risk. Even those who “fix” the system may not see how it impacts any individual patient. The role of a system flaw in any one event, and its significance to future risk, is frequently overlooked.
6. A more reasonable approach to improvement is to look for all contributing causes of harm in an attempt to minimize risk for all patients, current and future.
D. What makes up “The System”? The components that make Emergency Departments and hospitals function:
1. Staffing
2. Equipment
3. Medications
4. Laboratory
5. Radiology
6. Other specialty diagnostic services
7. Inpatient beds
8. Blood Bank
9. Patient treatment services:
a) Operating room
b) The cath lab
c) Dialysis center
d) Interventional Radiology
e) Radiation therapy
10. Consultants
11. Pharmacy
12. Emergency Medical Services, paramedics
13. The phone, paging systems, and other communication modalities
14. General operating policies, and more
Before going on, let’s take a case and see how system problems can impact risk.
It’s a holiday weekend. A system change is about to take place: the hospital lab is changing its reporting mechanism. For a one- to two-day period, all results will be reported by phone until a new computer system is installed. The change has been scheduled for a three-day weekend, on the assumption that the hospital will be quieter than normal and the transition will be easier. These assumptions are wrong. No changes have been made in lab staffing to accommodate the transition. The recipe for disaster is set.
The ED is packed and several very ill patients occupy much of the staff’s attention. A young, well-appearing woman presents with ‘nonspecific’ complaints. She has no prior medical problems. She is seen by an ED resident who is puzzled by her presentation; she doesn’t fit a pattern he recognizes. He scribbles a few basic orders for “routine labs” and rushes back to the more acutely ill patients. While he is busy with a resuscitation, the rest of the ED fills up with a number of patients. The entire ED is now backed up. Everyone is overwhelmed. He briefly revisits her when things calm down and notes a CBC result that surprises him (very low Hgb and low platelets). Not sure how to explain it, he once again looks for a pattern he can recognize, then orders another diagnostic test that requires the patient to be moved to Radiology; not a bad move, since it frees another bed in the ED and allows him some distance from a tough diagnostic problem he can’t figure out. Maybe he can do better when he clears other, simpler cases out of the ED.
Eventually the shift comes to an end. He still doesn’t know what to do with his diagnostic dilemma. He discusses the results with the ED Attending, and together they find a reason to admit (symptomatic anemia), although they do not find a unifying diagnosis that accounts for her presentation. They assure themselves that the inpatient team will be able to spend more time pursuing an appropriate workup. The resident places the results of the CBC on the patient chart, using the short-hand abbreviation typically used for CBCs, but leaves 2 portions of the icon blank. The recorded results are abnormal, and the blanks give reason to question just what this record means.
There is no formal report from the lab. Part of the transition means that there will be no written or electronic records of lab results; all labs have to be verified by phone. By late in the day, it’s clear that the lab cannot handle the number of phone calls necessary to verify lab results. The lab has not been staffed with additional people to handle the calls, so the lab workers are caught in a predicament. If they answer all the phone calls, they can’t do their usual
work. If they perform their usual tasks, they simply can’t answer all the calls. Eventually, they can’t keep up with the lab tests and have fewer and fewer results to report, even if they do answer the phone. Eventually the system breaks down. Specimens are lost. No one has any confidence that they can get a specimen to the lab, track it, and get a result that is valid.
Meanwhile, our patient arrives on a general medicine ward. An intern sees her and notes the CBC result. He discusses the case with an Attending, but both are perplexed by the abbreviated lab result. The CBC doesn’t make sense to them either. The Attending asks the intern to check the result with the lab. The intern calls the lab but fails to get any result. Eventually, he orders a repeat specimen. The turnaround time for labs is now growing, and the result is delayed. His shift is over; he signs out to an incoming intern with a request to just check the result.
After midnight, a nurse notifies the on-call intern of the CBC result. The on-call intern hasn’t seen this patient but decides to order a blood transfusion. Her vital signs are ‘normal’ and he is reassured that nothing acute is happening that would merit a bedside assessment. He is busy with new admissions and decides that no further action is indicated. The nurse calls the Blood Bank, but they are short-staffed; a few of their staff are ‘taking advantage’ of the holiday weekend. The Blood Bank will not be able to follow through on the order until the next shift. The patient is stable and there seems no reason to argue or call for backup help. The nurse shares this information with her replacement for the next shift, but somehow the information is lost and not followed up on. Until 8 am.
Now the day team is available, the holiday weekend is over, and the hospital is fully up and running. The first intern returns, finds the results of the CBC, notes the patient looking a ‘little worse’, and reorders the transfusion. Meanwhile, the intern’s supervisor, seeing the full CBC result, begins to recognize a pattern that disturbs him. He calls for a Hematology consult. A few hours later, the patient is noted to be disoriented. The Hematologist orders an emergency procedure, but the patient is now too unstable to tolerate it. It’s too late. The patient dies.
What went wrong? It’s true, individuals failed this patient. But so did the system. Many of the problems in this case were due to bad planning, bad communication, and a bad system. Many of the problems that occurred were predictable if someone had thought through possible scenarios. Who’s in charge here, anyway? Who set this system in place? Who was responsible for the failure in the lab? The Blood Bank? If the system is broken, how do we fix it?
E. System reform: Where we need to be.
1. We need a more formal design for the organizations that deliver care and a sufficient infrastructure to support safe and reliable processes.
2. We must recognize that Medicine is a high-risk industry and should operate with safety principles to assure quality care free of harm.
F. We are not alone.
1. In fact, there are other high-risk industries. Aviation, nuclear reactors, and NASA space exploration are a few examples of high-risk industries that have achieved success, primarily by designing safer systems.
2. The Anesthesia Safety Movement began a decade ago and has paved the way for other disciplines in Medicine.
G. Characteristics of High Reliability Organizations3,4
1. Awareness of risks (we in Medicine have been in denial too long)
2. Vigilance in looking for weaknesses to fix
a) Anonymous reporting of near misses (the airline industry; more recently Medicine)
b) Critical incident monitoring (Anesthesia, ICU)5
c) Software pirates in the development of software (engineers hired to find every way possible to break new software, to help design better software)
3. Anticipate failure; design and re-design expecting failures.
4. Plan for failure with redundancy and backups.
5. Make failure visible, so it can be recognized and fixed prior to harm.
a) A good example to mention is radiation injury from computer malfunction. A tech delivered the dose; the computer failed to show that the dose had been received. The tech delivered 5 doses before realizing the computer display had failed.6
6. Have rescue systems in place.
H. High Reliability Design8: Creating safety systems in Health Care Organizations. To run a profitable, efficient, and safe business, you need to:
1. Standardize
2. Simplify
3. Automate
4. Take human factors into account:
a) Minimize fatigue, stress, and boredom.
b) Promote teamwork (an interdependent network of people who are streamlined to complement one another and maximize each other’s productivity, efficiency, and accuracy).
c) Reduce reliance on human memory (it is too often unreliable).
d) Reduce the need for human vigilance (it will fail when the human is distracted, distressed, or overloaded).
I. In high-risk and complex settings, 2 additional principles are important:
1. There must be a sufficient information network to allow free exchange of the information needed to contribute to the team effort.
2. Communication is a high priority between team members.
J. To re-design the system of Health Care Delivery, the most obvious areas to focus on include:
1. The individual parts of the system
2. Teamwork
3. The information network between teams, systems, and specialists
4. Medication errors
5. Equipment failure
K. Anatomy of the System
1. Recognize that any part of the system can impact care. A broken x-ray machine slows radiology turnaround times and prolongs ED waits. On a good day maybe no one knows the difference. But what if someone with atypical
chest pain sits in the waiting room quietly waiting their turn, only to have an acute MI? Any system problem can eventually cause harm. Administrators need to understand the role that system problems ultimately play in direct patient care and the potential for harm.
L. Teamwork failure
1. Health care delivery relies on the coordinated efforts of many individuals. Even if we don’t recognize it, we function as a team. As Medicine has grown more sophisticated and complex, it’s generally true that no one individual assesses a patient, considers a diagnosis, carries out a test, interprets the test, plans treatment, and administers the treatment. In fact, most medical interactions require the coordination of many individuals. As care requires more and more people, the potential for error increases. To work effectively within such a complicated system, we need to adopt principles of teamwork. We should emphasize the “3 Cs of Teamwork”:
a) Coordination of efforts to achieve the desired outcome
b) Cooperation
c) Communication
2. Why is this so difficult?
a) Medicine is taught as if it is administered one-on-one. The fact that health care is delivered by a team isn’t really encountered until the student enters a practice setting.
b) Doctors train with doctors; nurses train with nurses; pharmacists train with pharmacists. We don’t talk to one another until we meet in a practice setting and often don’t understand each other’s roles, language, or style. In addition, Emergency Medicine is practiced with different teams covering shifts; other specialties function in teams that cross-cover while on call. In spite of this, there is no recognition of the need to acquire teamwork skills in medical education.
c) All this contributes to what West describes as “structural secrecy”, with little understanding between team members and failure to exchange information.7
d) The homophily principle7: nurses tend to talk to nurses; doctors to doctors. Exchanges with social networks outside our own are more formal and less open. There are subtle communication barriers between workers in a hierarchy. Nurses may be slow to question a physician’s orders, and may be unwilling to share their clinical impressions until they reach a predefined point (e.g. a call order). Reading nursing notes can reveal surprising concerns nurses have about patients that they will share with one another but not necessarily with the physician. This principle was thought to contribute to the Challenger disaster, when concern regarding possible O-ring failure was not passed up to the people ultimately responsible for approving the start of the countdown to launch.
3. How do we implement Teamwork Strategies?
a) The MEDTEAMS PROJECT8: currently developing protocols for building and developing teams in Emergency Medicine. Teams are responsible for
interacting, communicating information about the patient, sharing in the plan for the patient, and cross-checking other team members’ work.
b) In the future, simulations may help assess the effectiveness of teams and improve their function.9
(1) Videotaping acute care resuscitations can reveal how effectively team members work, expose weaknesses, and suggest revised strategies for working together.
(2) Resuscitation drills and simulations can help team members practice together.
4. General Guidelines for Teams
a) Less hierarchy in decision-making
b) More sharing of tasks
c) Cross-checking of team members’ work
d) Communication is essential to function.
e) Minimize the number of handoffs and team transitions.
5. Why does this matter?
a) A recent review of malpractice claims cited teamwork failure as a significant contributor in more than half of the cases of preventable death or major permanent impairment.8 It is not surprising that a system that requires teams to deliver care would suffer if teamwork failed.
b) A Family Practice study showed that a common system cause of problems was failure to communicate adequately between teams.10
M. The Information Network Between Teams and Systems10
1. A significant problem for much of medicine is getting information from one system to another, e.g.:
a) Amended X-ray reports
b) Amended lab reports
c) Reports of pathology: abnormal Pap smears
d) Consultant reports
Most Emergency Departments create a plan to deal with these problems, but unless individuals create a plan, many reports can be lost. One hospital found that routine thyroid function tests frequently weren’t reported prior to hospital discharge. The results would eventually find their way to medical records and be filed with the inpatient chart. When the patient returned to clinic, the outpatient chart would be pulled. Eventually someone figured out that inpatient lab results went only to the inpatient chart, and outpatient labs only to the outpatient chart. Most physicians weren’t aware of how many lab results were missed over time. Much of medicine lacks an information network sufficient to communicate adequately between treatment centers and between teams of care providers.
2. Communicating and Interacting Between Teams and Across Specialties
a) As we specialize, we become suspicious and distrusting of other specialists. We each have time pressures and different priorities. At times,
it can appear that we think differently about similar patient problems. We may be competing within our own institution for more staff lines, more budget, and more space. We may lose sight of our common goals for patients. However, ideal systems plan their approach to patients considering issues that cross specialty barriers.
b) Some clinical problems demand interaction between multiple specialists in time-pressured settings. Where such situations exist, preexisting protocols can take the heat off the moment and streamline care. E.g., one institution experienced a series of problems with aortic dissections. Emergency physicians haggled with a variety of consultants, never quite able to get agreement as to the best diagnostic modality. Finally a multidisciplinary conference was held at which everyone was allowed to debate a variety of approaches. Aortic dissections could be worked up with TEE, chest CT, spiral CT, angiogram, or MRI. Consultants could include Cardiology, the MICU, Chest Surgery, and Vascular Surgery. Procedures could require regular radiology or special interventional radiologists. Clinicians who had worked at other institutions were influenced by institutional variations in care. In the past, every case seemed to go down a different path, with lots of controversy and delays in patient management. The multidisciplinary conference led to a protocol that everyone could support and follow. Patient care was streamlined and a pathway was cleared to facilitate a more rapid and efficient diagnostic plan.
N. Medication Errors4,11
1. One of the leading causes of iatrogenic harm.
2. Many causes, many solutions. Consider some examples:
Case #1: A child comes to the ED for a simple lip laceration, receives conscious sedation, is found apneic, and dies.
Case #2: A patient is given a “low” dose of methotrexate to make his tumor more radiosensitive. The standard treatment calls for a given dose per body surface area delivered over 96 hours. Instead, the patient receives the full dose on each of 4 consecutive days, receiving 16 times the intended dose. He develops bone marrow failure and neutropenic sepsis.
Case #3: An experienced senior doctor wants to use esmolol. He has read about it, knows its indications and contraindications, and has sound reasoning for selecting it. He calculates the dose. He asks someone to double-check his math, since this is a new calculation for him. The dose is prepared as ordered and administered. Within minutes, the patient experiences an asystolic arrest. Investigation found that the calculation was off by a factor of 100.
Case #4: Two patients are side by side in a resuscitation suite. Team #1 is preparing to intubate its patient. Team #2 is assessing a patient with a dysrhythmia and suspected digoxin toxicity. Digibind has been ordered and is at the bedside. The medication nurse for Team #2’s patient steps over to assist
Team #1 during the intubation. Meanwhile, Team #2 is concerned about the delay in administering the Digibind. The Team #2 doctor sees an unlabeled syringe at the bedside, asks the nurse what it is, then administers it himself, thinking it is Digibind. Patient #2 stops breathing. The unlabeled syringe contained succinylcholine.
Case #5: Two intoxicated patients are side by side in the Emergency Department. By convention, the ED administers IM magnesium to chronic alcoholics to treat presumptively for magnesium deficiency. A nurse administers the usual dose to both patients. A few minutes later, both patients experience respiratory arrests. What happened? The hospital had changed its magnesium supplier, although the ED staff were not informed of the change. The new vials looked the same but contained a more concentrated formulation. The patients arrested from magnesium overdoses.
Case #6: An agency nurse is filling in at a busy ICU. There is a severe nursing shortage, and several agency nurses are required to fill the usual spots. An order is written to administer IV KCl. The nurse, unfamiliar with the medication, pushes the dose as a bolus. The patient arrests.
3. Possible Solutions to Medication Errors4,11
a) A centralized reporting mechanism to share medication errors and find strategies to avoid them
b) Computerized order entry
c) Unit dosing
d) High-risk IV medications should be dispensed by a central pharmacy.
e) Commonly used high-risk drips should come in premixed solutions, e.g. KCl.
f) Standardize prescribing rules and abbreviations.
g) Standardize the process for medication dose, timing, and delivery.
h) Pharmacy computer software interfaced with clinical and lab data (a minimal illustrative sketch of such checks follows this list):
(1) Avoid giving nephrotoxic drugs to patients with elevated creatinine.
(2) Avoid giving medications in the face of documented allergies.
(3) Avoid common adverse drug interactions.
(4) Check medication doses against patient weight (especially in pediatrics).
(5) Avoid giving K+ if the last lab result showed a high K+.
i) Include pharmacists on rounds with the medical team, to warn of unfavorable drug interactions.
j) Be aware of ‘look-alikes’ and ‘sound-alikes’.
k) Inform patients about their medications. Involve them in the process.
l) In the ED:
(1) Stock resuscitation meds and paralytics in a separate place, away from other routine meds.
(2) Train teams to recognize the common drugs they use, to spot errors.
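To illustrate item h), here is a minimal sketch of the kind of rule-based check a pharmacy system interfaced with clinical and lab data might run before dispensing. The thresholds, drug names, and function are illustrative assumptions only; they do not represent any actual clinical decision-support product, and they are not clinical recommendations.

# Illustrative sketch only: simple pre-dispensing checks of the kind
# described in item h). Drug lists and thresholds are made-up examples.

NEPHROTOXIC = {"gentamicin", "vancomycin"}        # hypothetical example list
MAX_WEIGHT_BASED_DOSE = {"acetaminophen": 15.0}   # mg/kg per dose, example value


def order_warnings(drug, dose_mg, weight_kg, allergies, creatinine, potassium):
    """Return a list of warnings for a proposed medication order."""
    warnings = []
    drug = drug.lower()
    if drug in (a.lower() for a in allergies):
        warnings.append("Documented allergy to " + drug + ".")
    if drug in NEPHROTOXIC and creatinine is not None and creatinine > 1.5:
        warnings.append(drug + " is nephrotoxic and creatinine is elevated.")
    if drug in MAX_WEIGHT_BASED_DOSE and weight_kg:
        if dose_mg / weight_kg > MAX_WEIGHT_BASED_DOSE[drug]:
            warnings.append("Dose exceeds the weight-based maximum for " + drug + ".")
    if drug in {"potassium chloride", "kcl"} and potassium is not None and potassium > 5.0:
        warnings.append("Last K+ was high; reconsider the KCl order.")
    return warnings


# Example: a pediatric acetaminophen order checked against weight.
for w in order_warnings("acetaminophen", dose_mg=400, weight_kg=12,
                        allergies=[], creatinine=0.4, potassium=4.0):
    print("WARNING:", w)

The design point is that each check encodes one of the pitfalls listed above so that the computer, rather than human vigilance, catches the routine cases; real systems would draw these rules and thresholds from maintained formularies and laboratory interfaces.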
Our Case Examples: allow the group to problem-solve. Here are some possible solutions.
Case #1: A protocol for conscious sedation. Require supervision of junior housestaff prior to administration.
Case #2: Methotrexate and other chemotherapy orders should be controlled by a designated pharmacist and administered by a designated chemotherapy nurse. Orders should be handled through a separate computer system, programmed to recognize indications and common dosages of chemotherapy agents. The pharmacist and nurse should round with the oncologist to be familiar with patients and protocols and to discuss management strategies.
Case #3: A reference book of IV drips is now kept in the department. Physicians are encouraged to reference it. Nurses are required to double-check all doses against the drip book. No more bedside calculations.
Case #4: All medications are to be drawn up in prelabeled syringes. The question of a team member ‘changing teams’ could be discussed.
Case #5: This solution is harder. Obviously, the nurse should have checked the vial, but she had probably done so many times before. The manufacturer could have made the vial with the new strength look different, to draw attention to the change. Why was the change made in the first place? There was probably no indication to change the formulation, and perhaps it was simply an oversight that it was changed at all. Some might question whether the magnesium should have been ordered in the first place. This case should foster good discussion from many angles.
Case #6: JCAHO has solved this one. This was a not-so-rare problem, and it is a good example of a regulatory agency taking the problem into its own hands and eliminating KCl vials from the wards; all KCl is now available in premixed solutions. It does raise the question of how long it took us to recognize this was a problem. Would we have known sooner if we had a central mechanism for reporting medical errors? We could also discuss why an agency nurse was placed in an ICU setting with an apparent lack of expertise.
O. Equipment Failure and Human Factors Engineering12
A hospital visitor is struck and injured by a piece flying off an oxygen cylinder.
Five patients die from excessive radiation doses caused by a computer malfunction during irradiation.
A defibrillator sets off a fire at a bedside. Several patients and staff are injured in the fire.
A newborn is electrocuted when monitoring equipment is applied incorrectly.
A series of patients are harmed when defective IV infusion pumps inadvertently bolus medication.
An intubated patient is placed in an isolation room. The nurse caring for the patient needs to leave to attend to a task outside the room. She asks another nurse to watch her patient. When she returns, the patient is extubated, alarms sounding. The covering nurse couldn’t hear the alarms through the closed doors of the isolation room.
1. Human factors engineering addresses the human-machine interface. Better equipment design can lead to better and safer use. The answer to these problems probably rests with future designs.
2. Can you solve the problems posed above?
P. System Design, in General14
1. Standardize: similar floor layouts, similar equipment, and standardized policies to govern bedside care; treatment protocols and “best practice” guidelines.
2. Simplify
3. Automate
4. Improve teamwork
5. Design better information networks
Q. The Inevitable Resistance to Change
1. Medicine values individual thought. Physicians develop their own “style” and have “their own” way of practicing. Standardization, treatment protocols, and guidelines will offend some. The benefits of standardization and simplification may not be appreciated by the physician who is confident he knows better.
2. There is still a hierarchy in medicine that may reject some of the teamwork principles discussed here. We may encourage better communication and more open discussion of management decisions between doctors, nurses, and pharmacists, but it will take time for these ideas to gain wide acceptance.
R. Changes for the Future15
1. The need to report4
a) Improvement seen in the airline industry with the use of anonymous reporting
b) Improvement in anesthesia safety with the use of critical incident monitoring5,16,17
c) Improvement in ICU care with critical incident monitoring18
2. The need to investigate, to look beyond the sharp end of error.
3. The need to innovate: to understand more about human cognition, decision-making, medical team structure, communication, and medical infrastructure; better use of computers and automation; medical simulations for improving individual and team performance.
4. The imperative to be involved. Professional groups can and should lead the way, e.g. the National Patient Safety Foundation and individual professional societies.
5. Improve.



6.3 Teaching Methodology
Some of the content can be presented in a lecture format, but it is probably better learned through application and problem-solving with actual cases. The teamwork portion can be introduced in lecture. Bedside drills with doctors, nurses, pharmacists, and technicians would be useful for each group to see patient care problems from the others’ perspectives.

6.4 Recommended Reading
Vaughan D. The dark side of organizations: mistake, misconduct and disaster. Annu Rev Sociol. 1999;25:271-305.
Adams JG, Bohan JS. System contributions to error. Acad Emerg Med. 2000;7(11):1189-1193.
Leape LL. A systems analysis approach to medical error. J Eval Clin Pract. 1997;3:213-222.
Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Institute of Medicine. Washington, D.C.: National Academy Press; 2000.

CITATIONS
56. Reason J. Human error: models and management. BMJ. 2000;320:768-770.
57. Lightning Calculator. The Red Bead Experiment. Available at: http://www.qualitytng.com/REDBEAD.HTM. Accessed February 8, 2002.
58. Sagan SD. The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, NJ: Princeton University Press; 1993.
59. Creating safety systems in health care organizations. In: Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Washington, D.C.: National Academy Press; 2000.
60. Galletly DC, Mushet NN. Anaesthesia system errors. Anaesth Intens Care. 1991;19:66-73.
61. Casey S. Set Phasers on Stun and Other True Tales of Design, Technology, and Human Error. 2nd ed. Santa Barbara: Aegean Publishing Company; 1998.
62. West E. Organisational sources of safety and danger: sociological contributions to the study of adverse events. Qual Health Care. 2000;9:120-126.
63. Risser DT, Rice MM, Salisbury ML, et al. The potential for improved teamwork to reduce medical errors in the emergency department. Ann Emerg Med. 1999;34(3):373-383.
64. Small SD, Wuerz RC, Simon R, et al. Demonstration of high-fidelity simulation team training for emergency medicine. Acad Emerg Med. 1999;6:312-323.
65. Bhasale A. The wrong diagnosis: identifying causes of potentially adverse events in general practice using incident monitoring. Fam Pract. 1998;15(4):308-318.
66. Bates DW. Using information technology to reduce rates of medication errors in hospitals. BMJ. 2000;320:788-791.
67. Hyman WA. Errors in the use of medical equipment. In: Bogner MS, ed. Human Error in Medicine. New Jersey: Lawrence Erlbaum Assoc, Inc.; 1994:327-347.
68. Spath PL. Reducing errors through work system improvements. In: Spath PL, ed. Error Reduction in Health Care: A Systems Approach to Improving Patient Safety. Chicago: Health Forum Inc.; 2000:199-234.
69. Hale A. Approaching safety in healthcare: from medical errors to healthy organizations. In: Vincent C, de Mol B, eds. Safety in Medicine. Amsterdam: Pergamon; 2000:247-261.
70. Woods D. Moving Forward on Patient Safety: Inquiry, Innovation, and Learning. Available at: http://www.uth.tmc.edu/schools/sahs/SpeakerSeries/woods2-99.html. Accessed February 8, 2002.
71. Cooper JB, Newbower RS, Long CD, et al. Preventable anesthesia mishaps: a study of human factors. Anesthesiology. 1978;49:399-406.
72. Gaba DM. Anaesthesiology as a model for patient safety in health care. BMJ. 2000;320:785-788.
73. Wright D, Mackenzie SJ, Buchan I. Critical incidents in the intensive therapy unit. Lancet. 1991;338:676-678.
OTHER BACKGROUND READING
Stahlhut RW, Gosbee JW, Gardner-Bonneau DJ. A human-centered approach to medical informatics for medical students, residents, and practicing clinicians. Acad Med. 1997;72(10):881-887.
Nolan TW. System changes to improve patient safety. BMJ. 2000;320:771-773.



7 Module 7. Living with the Reality of Medical Error
7.1 Goals and Objectives
45. Understand the inevitability of error.
46. Develop a strategy for effectively learning from and coping with errors you encounter.
47. Understand the ethics of error, the need to report, and the need to inform patients.

7.2 Course Outline
VII. COPING WITH MEDICAL ERROR: Coming to terms with your own errors.
A. Acknowledge (recognize your role in the error)
B. The need to inform the patient: The AMA guidelines state: “A physician shall … be honest in all professional interactions.” Moreover, in cases in which “a patient suffers significant medical complications that may have resulted from the physician’s mistake…the physician is ethically required to inform the patient of the facts necessary to ensure understanding what has occurred”.
C. The need to report
D. The need to cope with the reality of error.
1. Understand (appreciate the many factors that contributed to the error and what role you played)
2. Learn (better understand the disease, the patient, the system, and your own personal factors that caused harm)
3. Adapt (find a constructive way to apply your new knowledge without overreacting or assuming maladaptive risk-aversion behavior)
4. Improve the system to make that error less likely.

7.3 Teaching Methodology General reading and group discussion.

7.4 Recommended Reading
Christensen JF, Levinson W, Dunn PM. The heart of darkness: the impact of perceived mistakes on physicians. J Gen Intern Med. 1992;7:424-431. An excellent account of how physicians cope with errors, much of it written in the physicians’ own words. It contrasts those who believe they should be perfect and work harder and longer with those who deny error and attribute it to lack of control over illness.
Wu AW, Folkman S, McPhee SJ, et al. How house officers cope with their mistakes. West J Med. 1993;159(5):565-569.
Wusthoff CJ. MSJAMA: medical mistakes and disclosure: the role of the medical student. JAMA. 2001;286(9):1080-1081.
AMA principles of ethics. See American Medical Association. Principles of Medical Ethics. www.ama-assn.org/ama/pub/category/2512.html. Accessed August 10, 2001.
AMA Council on Ethical and Judicial Affairs and Southern Illinois University School of Law. Code of Medical Ethics, Annotated Current Opinions. Chicago, IL: AMA; 1994.
