FibonacciMD

Solving the Difficult Problem of Medical Errors

Errors in the delivery of medical care are clearly a significant issue.


This article examines some important causes of medical errors, with real-life examples and possible ways to correct them. The practice of health care will never be completely error-free, but everyone involved in healthcare should strive each day to reduce the number of avoidable errors to the best of their ability.

General Medicine

Continuing Medical Education

Eligible for CME Credit

FREE Online CME Test on FibonacciMD.app



The Institute of Medicine’s landmark report in 1999, "To Err is Human," estimated that there were 44,000 to 98,000 iatrogenic deaths a year in the U.S., making it the sixth leading cause of death.[1,2] A later study in 2016 estimated that there were actually 251,254 deaths per year from medical errors, using studies published after the Institute of Medicine’s report, which made medical errors the third leading cause of death in the U.S.[2]


A study of errors in a busy emergency department involving 1,935 patients found there were 346 errors in one week. Twenty-two percent of the errors involved diagnostic studies, 16% administrative procedures, 16% pharmacotherapy, 13% documentation, 12% communication, 11% environmental, and 9% other issues. Reported errors occurred in almost every aspect of emergency care and were made by all staff categories of the care team. Seven of the 346 errors, or 2%, caused adverse events for patients. Using the numbers from that study, an emergency department with 50,000 visits per year would be expected to have about 8,941 errors and 181 adverse patient events a year.[3]
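To make that extrapolation explicit, here is a minimal arithmetic sketch; the 50,000-visit volume is the article's illustrative figure, and the per-visit rates come from the study's one-week counts cited above.

```python
# Extrapolating the ED study's one-week counts to an annual visit volume.
patients = 1935          # patients studied in one week
errors = 346             # errors found in that week
adverse_events = 7       # errors that caused patient harm

annual_visits = 50_000   # illustrative ED volume used in the article

errors_per_visit = errors / patients           # ~0.179 errors per visit
adverse_per_visit = adverse_events / patients  # ~0.0036 adverse events per visit

print(f"Expected errors per year: {errors_per_visit * annual_visits:,.0f}")           # ~8,941
print(f"Expected adverse events per year: {adverse_per_visit * annual_visits:,.0f}")  # ~181
```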


A review of 30,195 records of hospitalized patients in New York State found that approximately 1% of the patients had negligent care, with 2.6% of that group receiving permanently disabling injuries and 13.6% dying from the errors in their care. The authors, extrapolating that data to the whole population of New York State, estimated a statewide total of 27,179 injuries per year from errors, including 877 cases of permanent disability and 6,895 deaths. The most common areas of negligence in this study were diagnostic mishaps, noninvasive therapeutic mishaps, errors in management, and care delivered in the emergency department.[4,5]


These findings make clear that errors in the delivery of medical care are a significant issue. The sections that follow look at some of their important causes, with real-life examples and possible ways to correct them.


Why Are Near Misses Important?

The Swiss cheese model of error holds that serious errors are prevented from causing injury by a series of barriers. Each barrier has unintended weaknesses, or holes, like Swiss cheese. When by chance all the holes align, meaning there were multiple errors or omissions, the error reaches the patient and can cause harm. In a near miss, for example, two people may make mistakes while taking care of a patient, but a third person acts as the stopgap that prevents patient harm from occurring. All near misses are opportunities for process improvement and should be investigated to close the holes, improve the process, and prevent future harm.[6]


Swiss Cheese Theory of Error


The Example of Wrong Site Surgery


In 2004, in response to a number of reports of wrong-site surgeries, the Joint Commission mandated a universal protocol to be used in all surgical suites.[7] It included a complete pre-procedure verification process, clear marking of the procedure site, and a time-out before the procedure in which all team members agree that it is the correct patient, the correct operative site, and the correct procedure about to be performed.[8] Despite this, between 2007 and 2009, a hospital in Rhode Island performed five wrong-site surgeries, for which it was fined, mandated to install cameras in each operating room, and assigned a state health inspector to the operating room suite for at least a year.[9]


Wrong-site surgery, although significantly reduced by the Joint Commission protocol, still occurs. In 2021, 85 wrong-site surgery events and an additional 28 wrong-patient, wrong-procedure, or wrong-implant events in the United States were reported to the Joint Commission. Problems found included missing or incomplete documents, inconsistent use of operative site marking, inadequate patient verification by the team because of rushing or other distractions, ineffective handoff communication, site marks being removed during prep, time-outs performed without full staff participation, and, possibly most important, an inconsistent organizational focus on patient safety, staff who were passive or not empowered to speak up, and policy changes that were not followed by adequate and consistent staff education.[10]


The issue of wrong-site surgery illustrates how hard it is to achieve perfection in medical practice. A well-designed, Joint Commission-mandated protocol is in place that could potentially be close to 100% effective in reducing these errors, but because of organizational deficiencies and individuals not following the protocol precisely, wrong-site surgery still occurs. Had just one person in the affected cases spoken up, the error might not have occurred.


Is 99% Error-Free Good Enough?

Is 99% error-free good enough? That depends on the process involved, the number of people affected, and the state of knowledge. In the U.S., if 99% of airplanes arrived safely, about 450 planes would crash every day. Even at 99.99% airline safety, 4 to 5 airplanes would crash every day, and many people would refuse to fly.[11] If midwives and obstetricians in the U.S. safely caught 99% of newborns during delivery, approximately 36,000 babies would be dropped on the floor each year.[12] It is clear that for some procedures 99% error-free is far from acceptable and may in fact represent failure. One of the real challenges and goals of medical care should be to try to be as error-free as possible.
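The arithmetic behind these figures is simple volume scaling. The sketch below assumes roughly 45,000 U.S. flights per day and 3.6 million births per year, the approximate figures implied by the article's numbers.

```python
# Scaling a "small" failure rate by volume (approximate, assumed inputs).
daily_flights = 45_000       # assumed U.S. flights handled per day (approximate)
annual_births = 3_600_000    # assumed U.S. births per year (approximate)

for success_rate in (0.99, 0.9999):
    crashes_per_day = daily_flights * (1 - success_rate)
    print(f"{success_rate:.2%} safe arrivals -> about {crashes_per_day:,.0f} crashes per day")
# 99.00% -> about 450 per day; 99.99% -> about 4 to 5 per day

dropped_newborns = annual_births * 0.01      # deliveries "caught" only 99% of the time
print(f"99% safe deliveries -> about {dropped_newborns:,.0f} dropped newborns per year")  # ~36,000
```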


There are several models from industry that have been tried, with varying levels of success, in healthcare. One is Six Sigma, which uses statistical analysis and a management system to reduce errors until 99.99966% of all opportunities are free of defects (3.4 defects per one million opportunities), an error rate roughly six standard deviations from the mean. Just because an error-correcting procedure works when it is first launched does not mean it will still be standard practice a few months later; one of the major principles of Six Sigma is not to assume a process keeps working after introduction, but to recheck it on a regular basis to ensure compliance.
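The Six Sigma target is usually tracked as defects per million opportunities (DPMO). Below is a minimal sketch of that calculation using hypothetical audit counts; re-auditing on a schedule, as recommended above, simply means rerunning the same measurement on fresh data.

```python
# Defects per million opportunities (DPMO), the metric behind the Six Sigma target.
def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    return defects / (units * opportunities_per_unit) * 1_000_000

SIX_SIGMA_TARGET = 3.4  # defects per million opportunities

# Hypothetical audit: 12 dosing errors in 4,000 medication administrations, each
# with 5 opportunities for error (right patient, drug, dose, route, time).
observed = dpmo(defects=12, units=4_000, opportunities_per_unit=5)
print(f"Observed: {observed:.0f} DPMO  (Six Sigma target: {SIX_SIGMA_TARGET} DPMO)")  # 600 vs 3.4
```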


In the lean manufacturing process developed at Toyota, everyone who works in the factory is also expected to incorporate quality control into their assignments. Workers have the ability, called an andon, to stop the assembly line if they find an error that cannot easily be fixed; it is considered better and less expensive to fix a problem during manufacturing than to try to fix it retroactively after manufacturing is complete. One hospital that uses lean concepts is Virginia Mason Hospital in Washington State. One of the processes it instituted was a type of andon that allowed any employee to send an alert whenever an object, person, or process could potentially cause harm to a patient. These alerts could be transmitted in person, by phone, or via a website form. A problem the hospital had to overcome was changing its culture to one that could examine the issues behind errors without necessarily assigning blame for the error or near miss. It also needed to create a system the employees could trust to respond in a timely fashion when an alert was activated.[13]


Examples of System Failures


A nurse was recently convicted of gross neglect and negligent homicide after trying to get Versed (midazolam) from a computerized medication cabinet but instead removing and administering the paralytic agent vecuronium, which caused the patient’s death. The nurse initially tried to withdraw Versed from the cabinet by typing in “VE,” without realizing she should have been searching by its generic name, midazolam. When the cabinet did not dispense Versed, the nurse triggered an override function that unlocked a larger group of medications, then searched for “VE” again. This time, the cabinet offered to dispense vecuronium. Using overrides was common at the hospital, as an upgrade to the hospital’s electronic health records system was causing significant delays at the medication cabinets. The nurse did not recognize the error, overlooking or bypassing at least five warnings or pop-ups saying she was withdrawing a paralyzing medication.[14] Additionally, midazolam is a liquid, while vecuronium is a powder that needs to be mixed before administration.[15,16]


Another case involved a famous actor’s newborn twins, who were twice given 1,000 times the intended dose of heparin, as was one other child. Nurses mistakenly administered heparin with a concentration of 10,000 units per milliliter instead of 10 units per milliliter. A pharmacy technician took the heparin from the pharmacy’s supply without having a second technician verify the drug’s concentration, as was required by hospital policy. When the heparin was delivered to a satellite pharmacy in the pediatrics unit, a different technician there also did not verify the concentration, as was required. Finally, the nurses who administered the heparin violated policy by neglecting to verify that it was the correct medication and dose beforehand. Fortunately, none of the infants were injured by the mishap, but they needed protamine sulfate to reverse the heparin.[17]


In a case the author is aware of, a patient came to an emergency department complaining of headache and had a CT scan of the head. The overnight radiologist’s reading of the CT scan noted an abnormality, but he couldn’t actually define it. The emergency department physician did not call the overnight radiologist to get more information and the patient was diagnosed with sinusitis and sent home on oral antibiotics. The next day the hospital radiologist officially read the CT scan abnormality as a brain abscess but did not apparently call anyone about that finding. No one in the ED reviewed this obviously positive report. A few days later the patient returned doing much worse, needed emergency surgery and ended up with permanent severe neurological deficits.


The above three cases all reflect total system breakdowns, and in two of them multiple people did not follow established procedure. There was also ineffective communication in some of the cases, and no review of a positive finding in one. System breakdowns like these consist of both individual and institutional failures.


Prevention of System Failures


An organization should create patient care systems that are as foolproof as possible and ensure adequate staff education; a critical further step is intermittently reviewing each procedure to confirm that it is actually being followed and that it does reduce or eliminate errors. One of the main keys to successful process improvement is data collection, to determine objectively what is occurring, with frequent re-evaluation of the process to ensure compliance.


Mindfulness is paying attention to what is happening in the moment. A common factor in many of the clinical vignettes discussed here is that the practitioner was running on autopilot rather than fully paying attention and truly being in the moment. Most people have probably experienced highway hypnosis: driving while thinking about something else and then suddenly realizing they have no recollection of the last few minutes of driving. The same thing can happen while performing repetitive tasks, and it is critical that practitioners involved in patient care not allow it to happen. As some of these cases demonstrate, just a moment of inattention may be disastrous for the patient and career-ending for the practitioner.


One pilot study trained nurses to perform a rapid awareness scan of their body from head to feet and then focus on the sensation of one breath just before preparing medication and again before administering it to help them be fully present for the tasks.[18,19] This is one technique that can be used to improve focus and awareness while at work. Another study gave mindfulness training to nurses and found there was a self-reported decrease in medication errors after the training.[20]


Medication errors are common. One study identified 6.5 adverse events related to medication use per 100 inpatient admissions, with more than a quarter of these events due to preventable errors.[21,22] In another study of serious medication errors, 39% occurred during the ordering stage, 38% during medication administration, and the remaining 23% in roughly equal numbers during the transcription and dispensing stages.[23]

To try to decrease medication errors, some organizations have mandated barcode scanning of both the medication and the patient’s wristband before administration, to ensure the correct medication and dose are given to the correct patient. In one study, the rate of potential adverse drug events (other than those associated with timing errors) fell from 3.1% without barcoding to 1.6% with barcoding. Transcription errors occurred at a rate of 6.1% on units that did not use barcodes but were completely eliminated on units that did. The rate of timing errors in medication administration fell by 27.3% with the use of barcoding.[21]
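Conceptually, barcode medication administration is a matching check between the active order, the scanned wristband, and the scanned medication. The sketch below is a hypothetical illustration of that logic, not any vendor's actual system; the identifier formats are invented.

```python
# Hypothetical sketch of a barcode medication administration (BCMA) check:
# the scanned wristband and medication must both match the active order.
def bcma_check(order: dict, scanned_patient_id: str, scanned_med_code: str) -> list:
    problems = []
    if scanned_patient_id != order["patient_id"]:
        problems.append("Wristband does not match the patient on this order.")
    if scanned_med_code != order["med_code"]:
        problems.append("Scanned medication does not match the ordered medication.")
    return problems

order = {"patient_id": "MRN-1001", "med_code": "NDC-0000-0000"}  # invented identifiers
issues = bcma_check(order, scanned_patient_id="MRN-1001", scanned_med_code="NDC-9999-9999")
print(issues if issues else "OK to administer")  # flags the medication mismatch
```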


After the Versed-vecuronium error described previously, some automated medication system manufacturers switched to requiring at least five letters of a requested drug name rather than just two, to try to prevent similar errors.[14]


With respect to radiology and laboratory report follow-up, one recommendation that allows each practitioner to do their own quality control is to keep a list of patients who had labs and radiologic tests and then check whether the results have come back and need action. By doing this, a practitioner can quickly ascertain how effective the institution’s or office’s system is, how much trust to place in it, and whether it needs to be changed. During my career I occasionally found laboratory and radiologic test reports that never returned, as well as radiologists and laboratory technicians who had not called about a critical result as they were mandated to by protocol. Sometimes a test was cancelled by the lab for something as simple as clotted blood in the tube or an insufficient quantity of blood, and because no result or report ever came back, it was only when I tracked down the test that I found this out. There were also occasions when others who were responsible for following up my patients’ results on days I was not working did not do so.


Checklists have been used in the aviation industry to ensure safety and are now used extensively in healthcare. Checklists allow complex tasks to be done correctly and in the correct order. A checklist lays out a procedure in steps so that even inexperienced or temporary staff can perform it, and it may be especially helpful for seldom-performed or critical tasks. Checklists can also include instructions on what to do if there is a problem. They have been shown to improve compliance with procedures, and they additionally provide a record that can be reviewed later to ensure that the correct procedure was performed.[24]


Are Errors Possible With Computerized Systems?

While helpful, a computerized system is not a total panacea for preventing errors. The author is aware of an electronic medical record (EMR) with computerized radiological and laboratory test ordering and reporting in which results were supposed to go back to the inbox of the ordering doctor or the ordering site. Unfortunately, some of the laboratories had incorrect data codes for a number of the providers, and some lab reports were not ending up in anyone’s inbox. On top of that, when radiology requests had missing information, the EMR was sending the reports to a dead-letter type file, and they were not necessarily being returned to the practitioners. It was not until these shortcomings were recognized that the problems were corrected.


In 2016, a computerized EMR cardiovascular risk calculator used by about a third of general practitioners in England, incorrectly calculated the risk to patients. There was concern that thousands of patients may have been incorrectly prescribed or taken off statins as a result of that computer error.[25]

A large health care institution reduced wrong-patient orders by 41% by requiring clinicians to type in the patient’s initials, gender, and age when placing an order. A more passive approach, in which the EMR displayed an ID-verifying alert requiring a single-click confirmation of the patient’s identity, reduced wrong-patient electronic orders by only 16%. The error rate could not be reduced beyond 41% because some providers, working on the wrong patient’s data screen, entered that patient’s initials, gender, and age without carefully verifying they had the correct patient; essentially an autopilot issue.[26]
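The "active" intervention amounts to forcing re-entry of identifiers and comparing them against the open chart before the order is accepted. The sketch below is a hypothetical illustration of that check, with invented field names; as the study found, it still fails when a clinician copies the identifiers from the wrong open chart without verifying them, which is why the improvement plateaued at 41%.

```python
# Hypothetical sketch of an "active" ID re-entry check at order entry: the clinician
# retypes initials, gender, and age, which are compared against the open chart.
def verify_patient(chart: dict, typed_initials: str, typed_gender: str, typed_age: int) -> bool:
    return (
        typed_initials.strip().upper() == chart["initials"]
        and typed_gender.strip().upper() == chart["gender"]
        and typed_age == chart["age"]
    )

open_chart = {"initials": "JS", "gender": "F", "age": 62}  # invented example record
if not verify_patient(open_chart, typed_initials="js", typed_gender="f", typed_age=62):
    raise ValueError("Typed identifiers do not match the open chart; order blocked.")
print("Identifiers match; order may proceed.")
```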


EMRs can do many things well, such as allowing instant access to a patient’s medical history or flagging a potential adverse drug interaction, but as can be seen, using a computer system does not necessarily prevent all errors. To avoid making errors, practitioners cannot go on autopilot and cede complete control to the EMR when making clinical decisions.


Errors in Diagnostic Interpretation


One potential source of error is in diagnostic test interpretation. If a test has 90% sensitivity and comes back negative, is the patient truly negative for the condition, or is the patient one of the 10% of affected patients the test misses? Trying to apply population statistics to an individual patient without further information may not be accurate; in the 90% sensitivity example, relying on a negative test alone will miss 10% of the patients who actually have the condition. This is where the art of medicine comes in: knowing when to get further diagnostic procedures when a test is negative, and knowing when to stop testing to avoid unneeded tests and the additional errors that false positive results can create.

Knowing the limitations of each test that is ordered is critical. For example, if a CT scan is ordered to diagnose pulmonary embolism, the clinician needs to understand that in a patient with a low probability of an embolism, a CT angiogram has a total positive predictive value of only 58%. Positive predictive value (PPV) is the probability that a patient with a positive (abnormal) test result actually has the disease. The PPV varies widely depending on where in the lung the clot is seen: a clot in a main or lobar pulmonary artery has a 97% PPV, while one in a subsegmental pulmonary artery has only a 25% PPV. Conversely, in a patient with a high probability of an embolism, a negative CT angiogram has a negative predictive value (NPV) for pulmonary embolism of only 60%. Simply assuming that a positive CT angiogram represents a pulmonary embolism and a negative one rules it out may lead to a number of diagnostic errors.[27] Use of D-dimer testing in low-probability patients may be helpful in decision making, with a negative test making the diagnosis highly unlikely. In a high-probability patient, ordering a CT venogram of the lower extremities in addition to a CT angiogram increases the NPV of testing from 60% to 82%.[27]
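To see how strongly PPV and NPV depend on pretest probability, here is a minimal worked example using Bayes' rule with the article's hypothetical 90% sensitivity and an assumed 95% specificity; the numbers are illustrative, not PIOPED II data.

```python
# Post-test probabilities from sensitivity, specificity, and pretest probability (Bayes' rule).
def ppv(sens: float, spec: float, pretest: float) -> float:
    true_pos = sens * pretest
    false_pos = (1 - spec) * (1 - pretest)
    return true_pos / (true_pos + false_pos)

def npv(sens: float, spec: float, pretest: float) -> float:
    true_neg = spec * (1 - pretest)
    false_neg = (1 - sens) * pretest
    return true_neg / (true_neg + false_neg)

sens, spec = 0.90, 0.95  # 90% sensitivity (article's example), 95% specificity (assumed)
for pretest in (0.05, 0.50, 0.80):
    print(f"pretest {pretest:.0%}: PPV {ppv(sens, spec, pretest):.0%}, "
          f"NPV {npv(sens, spec, pretest):.0%}")
# pretest 5%:  PPV ~49%, NPV ~99%
# pretest 50%: PPV ~95%, NPV ~90%
# pretest 80%: PPV ~99%, NPV ~70%
```

The same test is very reassuring when negative in a low-probability patient but much less so in a high-probability patient, which mirrors the PIOPED II pattern quoted above.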


Provider Cognitive Bias


There are over 30 types of cognitive bias that can affect decision making. A review of cognitive bias studies found that 50% to 100% of tested physicians were affected by at least one cognitive bias, and the presence of cognitive biases was associated with diagnostic inaccuracies in 36.5% to 77% of the presented case scenarios in those studies. Cognitive biases and personality traits may affect clinical reasoning processes, which may lead to errors in the diagnosis, management, or treatment of medical conditions. Evidence from some studies suggests that physicians who exhibit anchoring, information bias, overconfidence, premature closure, representativeness, and confirmation bias are more likely to make diagnostic errors.[28]


Use of heuristics is common in medicine because it improves efficiency. A heuristic is a mental shortcut that helps the user make decisions and judgments quickly,[20] but it may ignore part of the available information.[29] Heuristics are used in rule-out protocols as a guide to diagnosing and treating patients, but they can be misleading if the initial diagnostic impression is incorrect. An example might be looking for ST-segment elevation on an EKG as a sign of myocardial infarction, which then triggers aspirin administration, activation of the catheterization lab, and so on. However, if the ST-segment elevation is actually more consistent with pericarditis and this is not recognized, a series of unnecessary interventions may take place.


Confirmation bias occurs when clinicians selectively accentuate the importance of evidence that supports their beliefs, or what they want to believe to be true, while ignoring evidence that does not confirm those ideas.[30] A real-life example of confirmation bias the author is aware of involved a patient who had a seizure and complained of the worst headache of her life. She was admitted to a hospital and, despite telling the doctors multiple times that she did not drink alcohol, was given a discharge diagnosis of alcohol withdrawal seizure and sent home. She returned to a different hospital a few days later with a new severe headache and was diagnosed with a subarachnoid hemorrhage due to an aneurysm.


Anchoring bias is closely related to confirmation bias and comes into play when interpreting evidence. It refers to physicians’ practice of prioritizing information and data that support their initial impressions, even when those first impressions are wrong.[31] A real-life example of anchoring bias combined with confirmation bias the author is aware of involved an intoxicated patient in an emergency department whom the doctors tried to discharge after he sobered up, but who refused to leave. The patient complained of back pain and leg weakness, which the doctors shrugged off as they continued to press him to leave rather than admit him. A fever of 104 degrees F was documented on the chart late at night but was discounted by the physicians because the patient was afebrile the next day. After the patient had been in the emergency department for almost a day, a new physician re-examined him and found leg weakness, a history of fever, and back pain. Within a few hours the patient was in the operating room for his epidural abscess.


Framing, also known as the framing effect, occurs when decisions are influenced by the way information is presented. Physicians in one study recommended timelier follow-up for cancer screening when risk was presented as a frequency, such as 2 in 100 patients, rather than as a 2% probability of developing cancer. It was also found that data presented as an increase in mortality with non-intervention were more likely to prompt physicians to recommend a cancer procedure than the same data presented as a decrease in survival.[32]


Premature closure is the acceptance of a diagnosis before it has been objectively established and alternative diagnoses have been fully investigated.[33]


The representativeness heuristic leads people to judge how likely a specific event is by comparing it with an existing mental prototype. While this speeds the diagnostic process, it may be based on incomplete information. In one study, nurses and student nurses who received historical information not directly related to the patient’s symptoms, such as a job loss or alcohol use, were more likely to dismiss a patient’s heart attack or stroke symptoms in favor of a less serious diagnosis because of the representativeness heuristic.[34]


Overconfidence bias is a frequent cause of delayed and missed diagnoses. Confidence is valued over uncertainty in the medical field, and it may be viewed as a weakness and a sign of vulnerability for clinicians to appear unsure. Overconfidence bias seems to be especially dependent on the manner in which the individual gathers evidence to support a belief, and overconfidence bias and confirmation bias are frequently seen together.[35] A trial involving rural healthcare workers that investigated overconfidence bias found that overconfident healthcare workers were 26% less likely to manage patients correctly. Overconfident providers performed 18% fewer history questions and physical examinations than providers who did not overestimate the quality of their medical judgment. The authors of the study concluded that overconfidence led to two undesirable behaviors. The first was that information was gathered less thoroughly, because the providers overestimated the accuracy of their intuition. The second was that overconfident providers were less likely to feel the need for assistance in the form of clinical guidelines or advice from peers.[36]

Errors from Interruptions, Psychological Issues and Work Safety

In a self-reporting study, physicians with symptoms of burnout, fatigue and recent suicidal ideation were more likely to report making an error. In addition, those physicians who worked in facilities with worse work unit safety grades (a reflection of the patient safety practices in a work unit) also reported more errors. The authors suggested that a combination of physician-targeted burnout interventions and unit-targeted patient safety improvement measures were needed in order to provide the most effective error prevention.[37]


A study of emergency physicians found they were interrupted in their workflow an average of 11.12 times per hour, with many of the interruptions occurring at the staff station and coming from colleagues; such interruptions may increase cognitive burden and potentially cause mishaps, delays, and unintentional errors. More than half of the interruptions in patient rooms were considered low priority and, if eliminated, could allow physicians to concentrate solely on immediate patient care.[38] Another study found emergency physicians were interrupted an average of 9.7 times per hour compared with 3.9 times per hour for primary care physicians (PCPs); however, the PCPs spent an average of 11.4 minutes per hour performing simultaneous tasks compared with 6.4 minutes per hour for emergency physicians. Interruptions have been found to be one of the major factors contributing to drug-dispensing errors, and they may contribute to physician job stress, fatigue, and sleep deprivation.[39]

Conclusions


A number of studies have indicated that errors in medicine are unfortunately frequent and have the potential to cause patient harm. This article has looked at only a sampling of possible errors.


Individual practitioners can try to be aware of their own biases and not let them influence the care they deliver. They can also ensure that protocols designed to reduce error are followed precisely. They can be mindful when practicing and avoid going on autopilot, which is a frequent source of error; this is especially important given the number of interruptions and the amount of multitasking clinicians experience daily at work.


As practiced in lean manufacturing, each person can become a quality assurance inspector for their organization, and report any near-misses or quality issues that arise. A simple process improvement assessment that can be done by any practitioner is checking to see if radiology and laboratory test reporting is being handled correctly by their facility.


While computers can help reduce errors in some ways, they are not foolproof, and orders still need to be reviewed carefully to ensure they are correct. Clinicians also need to understand the limitations of their testing and what the results actually mean. For every medication ordered, drug interactions should be checked, and of course the patient should be specifically asked about allergies before a medication is given, as they may have forgotten to mention a serious allergy during triage or on a previous visit.


Organizations should have a culture of excellence and use protocols and technology to assist in error reduction goals. Near misses should all be investigated to help avoid a future error that causes harm. Checklists may be helpful for complex, critically important, or infrequently performed tasks where errors are likely, and also when less experienced or temporary people are working. A common institutional error is to assume that just because a process is in place it is functioning well; critical error reduction processes should be reviewed on a regularly scheduled basis to see whether they are being performed properly or whether compliance has fallen off. Employees should be encouraged to inform leadership, without fear of retribution, when potential errors are observed, and there needs to be a robust education process to ensure quality is maintained. Finally, workplace wellness programs and improving safety in the workplace may help decrease some of the psychological and environmental issues that can lead to errors.


“To Err is Human”[40] reflects the reality of the human condition. The practice of health care will never be completely error-free, but each person involved in healthcare should strive each day to reduce the number of avoidable errors to the best of their ability.



 

CME Eligible

FREE Online CME Test with valid email on www.FibonacciMD.app




 

References

[1] Carver N, Gupta V, Hipskind JE. Medical Error. [Updated 2021 Nov 20]. In: StatPearls. Treasure Island (FL): StatPearls Publishing; 2022 Jan. Retrieved from: https://www.ncbi.nlm.nih.gov/books/NBK430763/

[2] Makary MA, Daniel M. Medical error—the third leading cause of death in the US. The BMJ. BMJ 2016;353:i2139. Retrieved from: https://www.bmj.com/content/353/bmj.i2139

[3] Fordyce J et al. Errors in a busy emergency department. Annals of Emergency Medicine. 2003;42(3):324-333. Retrieved from: https://pubmed.ncbi.nlm.nih.gov/12944883/

[4] Brennan TA et al. Incidence of Adverse Events and Negligence in Hospitalized Patients — Results of the Harvard Medical Practice Study I. N Engl J Med 1991; 324:370-376. February 7, 1991. Retrieved from: https://www.nejm.org/doi/10.1056/NEJM199102073240604?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%20%200www.ncbi.nlm.nih.gov

[5] Leape LL et al. The Nature of Adverse Events in Hospitalized Patients — Results of the Harvard Medical Practice Study II. N Engl J Med 1991; 324:377-384. February 7, 1991. Retrieved from: https://www.nejm.org/doi/10.1056/NEJM199102073240605?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%20%200www.ncbi.nlm.nih.gov

[6] Perneger TV. The Swiss cheese model of safety incidents: are there holes in the metaphor?. BMC Health Serv Res. 2005;5:71. Published 2005 Nov 9. Retrieved from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1298298/

[7] Universal Protocol for Preventing Wrong Site, Wrong Procedure, Wrong Person Surgery. Patient Safety Network, Agency for Healthcare Research and Quality. April 26, 2006. Retrieved from: https://psnet.ahrq.gov/issue/universal-protocol-preventing-wrong-site-wrong-procedure-wrong-person-surgery

[8] The Universal Protocol for Preventing Wrong Site, Wrong Procedure, and Wrong Person Surgery. The Joint Commission. 2022. Retrieved from: https://www.jointcommission.org/-/media/tjc/documents/standards/universal-protocol/up_poster1pdf.pdf

[9] Fiore K. Hospital Fined for Wrong-Site Surgery. Medpage Today. November 3, 2009 Retrieved from: https://www.medpagetoday.com/hospitalbasedmedicine/hospitalists/16788

[11] Federal Aviation Administration. Air Traffic By The Numbers. March 18, 2022. Retrieved from: https://www.faa.gov/air_traffic/by_the_numbers/

[12] Births and Natality. CDC. last reviewed: February 16, 2022. Retrieved from: https://www.cdc.gov/nchs/fastats/births.htm

[13] McIntyre M. How Can Andons Improve Patient Safety? Virginia Mason Institute. February 17, 2016. Retrieved from: https://www.virginiamasoninstitute.org/how-can-andons-improve-patient-safety/

[14] Kelman B. As a nurse faces prison for a deadly error, her colleagues worry: Could I be next? NPR. March 22, 2022. Retrieved from: https://www.npr.org/sections/health-shots/2022/03/22/1087903348/as-a-nurse-faces-prison-for-a-deadly-error-her-colleagues-worry-could-i-be-next

[15] Kelman B. Nurse Convicted of Neglect and Negligent Homicide for Fatal Drug Error. Kaiser Family Foundation. March 25, 2022. Retrieved from: https://khn.org/news/article/radonda-vaught-nurse-drug-error-vanderbilt-guilty-verdict/

[16] Kelman B. In Nurse’s Trial, Investigator Says Hospital Bears ‘Heavy’ Responsibility for Patient Death. Kaiser Family Foundation. March 24, 2022. Retrieved from: https://khn.org/news/article/radonda-vaught-fatal-drug-error-vanderbilt-hospital-responsibility/

[17] Ornstein C. Dennis Quaid files suit over drug mishap. Los Angeles Times. Sept. 16, 2014. Retrieved from: https://www.latimes.com/entertainment/gossip/la-me-quaid5dec05-story.html

[18] Durham ML. DNP et al. Reducing Medication Administration Errors in Acute and Critical Care: Multifaceted Pilot Program Targeting RN Awareness and Behaviors. The Journal of Nursing Administration: February 2016 - Volume 46 - Issue 2 - p 75-81. Retrieved from: https://journals.lww.com/jonajournal/Abstract/2016/02000/Reducing_Medication_Administration_Errors_in_Acute.6.aspx

[19] Durham ML, Mindfulness for medication safety, Break the cycle of rushing and multitasking. American Nurse Journal. Volume 15, Number 7 pp. 24-26. July 2020. Retrieved from: https://www.myamericannurse.com/wp-content/uploads/2020/06/an7-Mindfulness-622.pdf

[20] Daigle S, Talbot F, French DJ. Mindfulness-based stress reduction training yields improvements in well-being and rates of perceived nursing errors among hospital nurses. J Adv Nurs. 2018 Oct;74(10):2427-2430. Retrieved from: https://pubmed.ncbi.nlm.nih.gov/29869350/

[21] Poon EG. Effect of Bar-Code Technology on the Safety of Medication Administration. N Engl J Med 2010; 362:1698-1707. Retrieved from: https://www.nejm.org/doi/full/10.1056/nejmsa0907115

[22] Bates DW, Cullen DJ, Laird N, et al. Incidence of Adverse Drug Events and Potential Adverse Drug Events: Implications for Prevention. JAMA. 1995;274(1):29–34.

[23] Leape LL, Bates DW, Cullen DJ, et al. Systems Analysis of Adverse Drug Events. JAMA. 1995;274(1):35–43. Retrieved from: https://jamanetwork.com/journals/jama/article-abstract/389137

[24] A Gawande. The Checklist Manifesto, How To Get Things Right. Metropolitan Books, Henry Holt and Company LLC. 2009.

[25] Iacobucci G. Computer error may have led to incorrect prescribing of statins to thousands of patients BMJ 2016; 353 :i2742. Retrieved from: https://www.bmj.com/content/353/bmj.i2742

[26] Adelman JS et al. Understanding and preventing wrong-patient electronic orders: a randomized controlled trial, Journal of the American Medical Informatics Association, Volume 20, Issue 2, March 2013, Pages 305–310, Retrieved from: https://doi.org/10.1136/amiajnl-2012-001055

[27] Stein et al., Diagnostic Pathways in Acute Pulmonary Embolism: Recommendations of the PIOPED II Investigators, Radiology 2007. Retrieved from: https://pubs.rsna.org/doi/10.1148/radiol.2421060971

[28] Saposnik, G et al. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak 16, 138 (2016). https://doi.org/10.1186/s12911-016-0377-1

[29] Pines JM. Profiles in Patient Safety: Confirmation Bias in Emergency Medicine. Academic Emergency Medicine. Volume 13, Issue 1. January 2006. Retrieved from: https://onlinelibrary.wiley.com/doi/epdf/10.1197/j.aem.2005.07.028

[30] Marewski JN, Gigerenzer G. Heuristic decision making in medicine. Dialogues Clin Neurosci. 2012;14(1):77-89. Retrieved from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3341653/

[31] Doherty TS, Carroll AE. Believing in Overcoming Cognitive Biases. AMA Journal of Ethics. Sept. 2020. Retrieved from: https://journalofethics.ama-assn.org/article/believing-overcoming-cognitive-biases/2020-09

[32] Bui TC, Krieger HA, Blumenthal-Barby JS. Framing Effects on Physicians' Judgment and Decision-Making. Psychol Rep. 2015;117(2):508-522. doi:10.2466/13.PR0.117c20z0.

[33] Maurier D, Barnes DK. Premature Closure: Was It Just Syncope? PSNet. November 25, 2020. Retrieved from: https://psnet.ahrq.gov/web-mm/premature-closure-was-it-just-syncope#4

[34] Brannon LA, Carson KL. The representativeness heuristic: influence on nurses’ decision making. Applied Nursing Research, Volume 16, Issue 3, Pages 201-204. 2003. Retrieved from: https://doi.org/10.1016/S0897-1897(03)00043-0.

[35] Croskerry P, Norman G., PhD Overconfidence in Clinical Decision Making. The American Journal of Medicine (2008) Vol 121 (5A), S24 –S29. Retrieved from: https://www.amjmed.com/article/S0002-9343(08)00152-6/fulltext

[36] Kovacs RJ et al. Overconfident health workers provide lower quality healthcare. Journal of Economic Psychology, Volume 76. 2020. Retrieved from: https://doi.org/10.1016/j.joep.2019.102213

[37] Tawfik DS, Profit J, Morgenthaler TI, et al. Physician Burnout, Well-being, and Work Unit Safety Grades in Relationship to Reported Medical Errors. Mayo Clin Proc. 2018;93(11):1571-1580. Retrieved from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6258067/

[38] Blocker RC et al. Physician, Interrupted: Workflow Interruptions and Patient Care in the Emergency Department, The Journal of Emergency Medicine, Volume 53, Issue 6, 2017,Pages 798-804. Retrieved from: https://pubmed.ncbi.nlm.nih.gov/29079489/

[39] Chisholm CD, Dornfeld AM, Nelson DR, Cordell WH. Work interrupted: a comparison of workplace interruptions in emergency departments and primary care offices. Ann Emerg Med. 2001 Aug;38(2):146-51. Retrieved from: https://pubmed.ncbi.nlm.nih.gov/11468609/

[40] Alexander Pope quote. BrainyQuote.com. Retrieved from: https://www.brainyquote.com/authors/alexander-pope-quotes


IMIT takes pride in its work, and the information published on the IMIT Platform is believed to be accurate and reliable. The IMIT Platform is provided strictly for informational purposes, and IMIT recommends that any medical, diagnostic, or other advice be obtained from a medical professional. Read full disclaimer.


