Every physician has probably incorrectly diagnosed a patient at some point over the course of his or her career.
In general, the rate of medical misdiagnosis is estimated to be about 10% to 15% (Arch Intern Med. 2005;165:1493-1499). A 2014 study found that two-thirds of 681 respondents to an online survey conducted by the American Academy of Otolaryngology-Head and Neck Surgery (AAO-HNS) reported an event in the last six months that they felt should not have happened (Otolaryngol Head Neck Surg. 2014;150:779-784). Diagnostic errors were reported in five cases, two of which resulted in major morbidity, and errors in testing were reported in 24 cases, seven of which resulted in major morbidity.
“In surgical specialties, misdiagnosis carries a risk of major morbidity,” said Rahul Shah, MD, co-author of the AAO-HNS study and an otolaryngologist at Children’s National Medical Center in Washington, DC. “When we get it wrong, it leads to big problems.”
The Study of Errors
Diagnosis is never going to be perfect, said Mark Graber, MD, senior fellow in healthcare quality and outcomes at the research institute RTI International. “There are 10,000 different diseases, and atypical presentations are not unusual. The available evidence suggests that physicians get it right about 90% of the time.”
Maybe that’s good. But Dr. Graber’s question is, “Can we do better?”
Answering that question has become Dr. Graber’s primary mission. In 2011, he founded the Society to Improve Diagnosis in Medicine, and in 2014 he launched the journal Diagnosis.
Dr. Graber believes that one of the best ways to improve diagnosis is to study how errors occur in the first place. In a 2005 study, he and his colleagues studied 100 cases of diagnostic error in internal medicine, collected from five large academic medical centers (Arch Intern Med. 2005;165:1493-1499). In a small number of cases (7%), the mistake was deemed no fault: The disease presented in a very unusual way, or patient behavior undermined the diagnosis. Another 19% of misdiagnoses were system-related errors, such as equipment failure or communication breakdown. In 28% of the cases, cognitive errors on the physician’s part were to blame—either faulty knowledge, faulty information gathering, or faulty synthesis of the data. In 46% of cases, both systems and cognitive errors played a role.
Faulty knowledge and faulty information gathering were evident in Dr. Graber’s analysis, but the most common type of cognitive error was faulty synthesis of information into a diagnosis. In 321 instances of cognitive error, faulty synthesis played a role 83% of the time. For example, physicians overestimated the importance of a symptom, were distracted by an element of the patient’s history, failed to apply appropriate heuristics, or settled prematurely on a diagnosis.
Learn How You Think
The human mind uses rules of thumb that help it make decisions or quickly slice through an overwhelming amount of input. These shortcuts can also lead to certain types of mistakes, what psychologists call “cognitive biases.”
Pat Croskerry, MD, PhD, of Dalhousie University in Halifax, Nova Scotia, has written and lectured on cognitive-related factors in medical decision making. He described the two ways we make decisions: intuitive and rational. Intuitive decision-making is fast and compelling—it is the snap judgment or the gut instinct. Rational decision-making is slower and more deliberate.
While intuitive decision-making often serves physicians well, it is occasionally catastrophic. By virtue of its very speed and certainty, it can blind a physician to alternative explanations for a particular patient’s presentation. It’s important to understand that everyone relies on intuitive decision-making, that it’s part of human nature, and that you must recognize it in order to overcome it.
What are cognitive biases? Here are some key biases that take place in a physician’s exam room:
- Anchoring bias. “That’s when you latch on to the first thing you see,” said David Eibling, MD, a professor of otolaryngology at the University of Pittsburgh Medical Center. By focusing on the most obvious symptom or the most likely diagnosis, a physician may fail to take into account conflicting information.
- Confirmation bias. Once a physician has a diagnosis in mind, he or she will seek confirming evidence. “But they may forget to look for refuting evidence,” Dr. Eibling said.
- Availability bias. That’s when a physician relies on recent experience, such as, “I’ve seen four of these cases this month; this must be number five.” By the same token, physicians may miss diseases they haven’t seen for a while.
Dr. Eibling said it’s worth learning about cognitive biases because doing so allows you to step back and see how you think, then determine where that thinking might lead you astray. Also, he said, it’s reassuring for many physicians to learn that these mistakes are encoded into how our brains work.
“Everybody gets uncomfortable when you start talking about errors,” Dr. Graber said. “But everybody gets interested when you show them the patterns of human error.” For example, you put your car keys down on the kitchen counter instead of in the bowl near the mail, and then you can’t find them. Everyone recognizes that scenario, Dr. Graber said. “The silly cognitive mistakes we make in our everyday lives are the same ones that can lead to diagnostic errors.”
Engage the Patient
Indeed, one might think that the trickiest diagnoses are those in which physicians make the most mistakes, but that’s not the case. “We’re more prone to make these errors when we’re close to home, when we’re in our comfort zone,” Dr. Shah said. “We’re less likely to do so when we’re in uncharted waters because we are usually hyperaware in these situations.”
That’s because physicians are more likely to double-check things when they’re uncertain. They’re more likely to follow up to get new information and are more open to changing their mind about a diagnosis. Double-checking and following up are signs of rational decision-making.
“Making a list of differential diagnoses is OK,” said Adam Folbe, MD, associate professor and director of special projects for the department of otolaryngology-head and neck surgery at Wayne State University School of Medicine in Detroit. “And it’s OK to include the patient in your uncertainty.” That might mean talking through the possibilities and asking patients to be partners in the process, ready to inform the physician of any change in status.
In a 2014 study, Dr. Folbe and his colleagues studied 78 pediatric otolaryngology cases that ended in litigation (Laryngoscope [published online ahead of print March 7, 2014]. doi: 10.1002/lary.24663). Misdiagnosis or failure to diagnose in a timely manner was a factor in 41% of the cases.
Cultivating good relationships with patients can protect physicians against lawsuits, Dr. Folbe said, something he talks a lot about with his trainees. Yes, it might take a little more time, but “spending two minutes on informed consent earns trust and saves you in the long run,” he said. “Communication is key.”
Dr. Graber agreed, advising physicians to make patients partners in their own care. Physicians should think out loud: “Here’s what I think is going on, but you need to let me know how things play out.” And then follow up. “Patients can play such an important role in this game,” he said.
Review Your Work
In addition to learning about the traps of intuitive decision-making, physicians could surely benefit from reviewing their own work. Dr. Folbe is a “big fan” of morbidity and mortality reports. His group conducts these every six weeks, but, he said, “I think it’s missing in private practice.”
Reviewing past diagnoses is common in some specialties, such as radiology and pathology, Dr. Eibling said. “They’ll reassess a percentage of cases or films,” he said, and determine their own accuracy rate. “The idea of doing this in a clinical practice is kind of new.” Even a single-physician practice could do something like that, Dr. Shah said. “If the physician sat down with their office manager once a quarter and reviewed 10 charts,” he added, “they could do it over lunch.”
Further, Dr. Shah said, such an exercise could help the practice achieve a key component—practice assessment—of the Maintenance of Certification (MOC), as put forth by the American Board of Medical Specialties. The Affordable Care Act specifically recognizes the MOC as a worthy tool for maintaining and reporting high-quality patient care.
Mimi Kokoska, MD, professor and vice-chair at Indiana University School of Medicine’s department of otolaryngology-head and neck surgery, reviewed 50 consecutive head and neck cancer patients over a three-year period (2009-2012) and tracked diagnostic delays and diagnostic errors. She and her colleagues studied electronic medical records (EMRs) and classified delays and errors using criteria similar to those described in the Institute of Medicine’s patient-safety report, “To Err Is Human” (National Academies Press; 2000).
The results of the study, presented at this year’s Combined Otolaryngology Spring Meetings, found 57 diagnostic errors and delays in the 50 patients. That might sound like a lot, but Dr. Kokoska said there’s no way of knowing the denominator or the upper bounds. “How many were possible? Theoretically, we could make 100 errors per patient,” she said.
More important than the numbers was the prevalence of different types of errors. Delays were the most frequent problem: Of the 57 cases identified, 31.6% were clinic delays (more than two weeks from referral) and 26.3% were treatment delays (more than two months of inaction after diagnosis, which includes noncancer-related findings on imaging studies). Misdiagnoses made up 15.8% of the problems. Failure to test and use of outmoded tests were also quantified.
Dr. Kokoska said political concerns can make performing this kind of analysis difficult, but if such studies are not undertaken, “how will we know where to focus evidence-based quality improvement efforts?”
“When diagnosed at an early stage, the prognosis for head and neck cancer is so much better,” she said. “So it’s important.”
Address Errors Created in Systems
Many systems errors involve delays in patients seeing a specialist or getting a diagnostic test, as well as delays in getting test results back to the initial physician. And, beyond a slow-moving system of communication, there are errors in the communication itself.
Physicians may rely on their own reading of a scan or the written notes that accompany it rather than talk to the radiologist, said Dr. Graber. Subtleties or secondary, nonobvious findings might be missed. These things are easily communicated in a live conversation, but it can be next to impossible to get two physicians on the phone at the same time. “Why doesn’t this happen? Because communication is burdensome,” Dr. Shah said. “The radiologist’s call interrupts my work flow. I call him back and interrupt his work flow.”
Better notes could help, but it’s complicated by the variety of systems. “Even notes are tricky, because different facilities use different EMR platforms and they’re not compatible,” Dr. Shah said.
Checklists, which are becoming more popular with surgeons, are sometimes very group specific, built for a particular specialty at a particular hospital. But so far these tend to be more procedural than diagnostic. Usually you start with the diagnosis and then the decision tree follows, Dr. Folbe said.
Computerized diagnostic algorithms may be a physician’s tool of the future, but for now simply recognizing one’s limits would help. “We’re not supercomputers,” Dr. Folbe said. “It’s okay to look things up. It’s okay for physicians to admit they don’t know something or to consult with their colleagues.”
The focus on systems is worthy, but a true human factors analysis is a big investment of time and resources. “For a small otolaryngology practice, there’s no budget for this,” Dr. Shah said.
He offered up a low-cost solution: the mnemonic STAR, which stands for stop, think, act, review. “It’s very simple, but it can be very helpful,” Dr. Shah said. “For me, it’s a reminder to stop and focus on what’s in front of me,” he added, whether it’s a patient or a scan or a prescription. Even if—or especially if—your pager is buzzing.
Jill Adams is a freelance writer based in New York.