ORLANDO—A patient with a history of a “difficult airway” is under the care of an intensive care unit resident who is under pressure to transfer patients out of the ICU. The patient develops an air leak around the endotracheal tube, which is taken as a sign that the patient is ready for extubation, and the tube is removed without additional airway expertise from anesthesiology or otolaryngology. The airway rapidly obstructs, the ICU team cannot resolve the obstruction, and an emergency tracheostomy has to be performed; the delay in securing the airway leads to anoxic brain injury.
The incident, discussed during the meeting’s panel session “Human Error and Patient Safety,” shows how errors can result not from a lack of knowledge but from cognitive biases, said Karthik Balakrishnan, MD, MPH, a pediatric otolaryngologist at Mayo Clinic in Rochester, Minn.
The session explored the pitfalls of bias into which physicians can easily fall if they’re not careful, and the system changes that can support physicians and prevent errors from occurring.
In the extubation example, the resident and the ICU team fell victim to several clear biases. There was anchoring bias: the team relied too heavily on the air leak as a reason to extubate and discounted the patient’s history of a difficult airway. There was motivational bias: they wanted to keep the attending physician happy by moving patients along. And there was confirmation bias, Dr. Balakrishnan said. “There was intense pressure to get the patient extubated, so they found reason to extubate,” he said.
Since then, the institution has developed an airway checklist. Before any ICU patient is extubated at the bedside, the team must complete a checklist identifying possible complications, and ancillary staff and equipment are put in place to prevent them.
Ellis Arjmand, MD, PhD, director of otolaryngology at Texas Children’s Hospital in Houston and the session moderator, said most decisions that lead to errors “are not intentional or due to a knowledge deficiency, but rather are due to the quality of the decision—thus, awareness of the factors that can bias decision quality is essential.”
Jo Shapiro, MD, chief of otolaryngology at Brigham and Women’s Hospital in Boston and director of the Center for Professionalism and Peer Support, said much of that center’s purpose is to help doctors move away from a natural inclination to distance themselves from colleagues who’ve made errors. “We want to say to ourselves, ‘I would never have done that,’” she said.
An atmosphere in which errors lead to sadness, shame, and fear makes for a system in which errors can’t be deconstructed so that lessons can be learned from them, she said. So, at the Center for Professionalism, physicians receive support from fellow physicians who are separate from those actually deconstructing what went wrong. “The purpose is to have people feel and experience a culture, a system, where we acknowledge the devastation that most of us feel after things have gone wrong,” she said.
Dr. Shapiro noted that, at meetings, it’s the sessions with the most technical information that tend to draw the biggest crowds, as though “the way to avoid errors is just ‘learning more.’” But sessions on reviewing pitfalls such as bias should be required, she said. “It is these issues that are going to get us and our patients into trouble.”
Brian Nussenbaum, MD, who runs the quality improvement program in the department of otolaryngology-head and neck surgery at Washington University School of Medicine in St. Louis, said that cases offering lessons for avoiding errors are discussed at a conference each month. The most serious cases are discussed, he said, but so are the most useful “near-miss” cases—the ones that are often quickly forgotten. “These are the events that you could probably learn the most from, because generally when near-misses happen, people just take a sigh of relief, then move on, rather than thinking about exactly what happened,” he said. Cases with good outcomes are also discussed when they hold valuable lessons.
The program tries to follow a “just culture” model, in which personal accountability and systems accountability are balanced—a model that, he said, “holds the individuals accountable for their actions but not for the flaws in the system.”
Rahul Shah, MD, MBA, chief of the Quality and Safety Office at Children’s National Health System in Washington, DC, stressed the importance of calling in a second surgeon when you feel under pressure and face difficult decisions. That second set of eyes can be valuable and is not subject to the same biases as the first physician. He has made it a rule that no patient codes or dies on the operating table without a second surgeon in the room.
Dr. Shah also said it’s important for experienced physicians to share their heuristics—their ways of thinking and making decisions—with new learners, such as residents. Those heuristics relate to “the underlying culture” at a center. “It’s invariably safer to fly home today or tomorrow than it is to have surgery in any one of our hospitals—and we have to ask ourselves why,” he said.
Team Approach
After the session, Dr. Arjmand said he organized the panel after being involved in root-cause analyses of cases that “too often” fell into a category in which everyone had done their job, yet an error occurred that “could have happened to anyone.” He wanted to zero in on the biases behind those types of errors.
Simple measures, such as bringing in a second surgeon or holding a conference before surgery, can help prevent those kinds of errors. “Those things actually serve the purpose of trying to reduce the possibility of bias in a decision, but we haven’t understood it using that language,” Dr. Arjmand said.
Farrel Buchinsky, MD, a pediatric otolaryngologist at Allegheny General Hospital in Pittsburgh, said he likes the collegial approach to preventing errors. “I’ve read many books on these topics, and they’re all about you and what you can do for yourself, but what I found most useful was how you can build it into your team approach and how you can rely on your colleagues,” he said. Talking to colleagues about errors, without notes, as they do at Dr. Shapiro’s center, helps physicians overcome their reluctance to let down their guard, he said.
Additionally, he thought panelists were wise to underscore the importance of bringing a second physician into the operating room. “You think you’re bringing him in because you need a second pair of hands,” he said. “You need a second brain, and more importantly than a second brain, a brain that hasn’t gone through what you’ve just been going through for the past half hour or an hour.”