Death of a Patient: Functions but No Process [the root cause of adverse events in healthcare’s complex system]


Steven J. Spear
September, 2008


Site editor:


Joaquim Cardoso MSc.
Health Transformation — Review

September 19, 2022


This is a republication of an excerpt of the book “Chasing the Rabbit: How Market Leaders Outdistance the Competition and How Great Companies Can Catch Up and Win.”


In America, there is a heartbreaking gap between the care we actually have and the care we should get.

Conditions that previously could not even be described or diagnosed can now be treated and even cured, including infertility, myriad forms of cancer, and a host of genetic diseases.

Limb reattachment and reconstructive surgery are now possible, along with minimally invasive orthopedic procedures and the conversion of HIV into a chronic condition.

To find a previous period when life could be improved and restored with such certainty, we would have to go back to the age of biblical miracles — fertility being granted for faithful prayer, the dead being restored to the living, ailments like blindness being cured by the laying-on of hands.


And yet modern medicine has become a terrible disappointment.


There are the exorbitant costs, but even for those who can find and afford treatment, the risks are considerable.

The Institute of Medicine has published studies estimating the number of patients who lose their lives to medical error — defined as the mismanagement of medical care while a person is hospitalized — to be as high as 98,000 each year, out of the roughly 33 million hospitalizations that occur annually.



This does not include a roughly equal number felled by hospital-acquired infections.

This makes the risk of injury one in a few hundred, the risk of avoidable death one in a few thousand.



As Dr. Lucian Leape, a pioneer in the patient-safety movement, described it at a lecture I attended, you would have to ride in motorized hang gliders or parachute off bridges to face risks similar to those of being a hospitalized patient.

And that is only for acute care. There are those who succumb to illness because of failures in preventive, primary, and chronic care as well.




It shouldn’t be like this. Medical science is great and the people who employ it are bright, well educated, well trained, hardworking, and altruistic.


But they work in systems that compromise their best efforts.

For instance, the Annals of Internal Medicine published a series of articles called “Quality Grand Rounds.”


These were detailed case studies of breakdowns in the delivery of care that led to human suffering.

The variety of things that could go wrong was both shocking and fascinating.

My friend and colleague, Dr. Mark Schmidhofer, and I began to wonder what these cases had in common. We found out that the answer was “plenty.”


In all of the cases that we examined, there were common characteristics that led to painful results.


People lacked a systems view — a full appreciation of how the work they did was affected by and affected the work of other people.




Granted, it was exceptionally difficult to understand all the nuances of how such a complex system worked, but the people in these cases did not advance their understanding even when there were warnings that they should have.

Rather than push for ever-better clarity as to how things should work, they were exceedingly tolerant of ambiguities regarding who was supposed to do what, how to convey information from one person to the next, or how to perform a particular task.

And even when it was obvious that something was wrong, they worked around the problem, relying on extra vigilance and extra effort.

Thus they imposed on themselves the same set of problems day after day, consistently turning down the chance to understand the complex interactions of people, technology, place, and circumstance better and thus improve the system as its flaws were discovered.




The case of Mrs. Grant


Let’s look at a case in which skilled and dedicated workers in different departments failed to heed warnings that they didn’t fully understand how their work affected each other. Their failure to do so killed a patient.

Mrs. Grant, a 68-year-old woman, was recovering from successful, elective cardiac surgery.

At 8:15 a.m., the day nurse who was just starting her shift discovered that Mrs. Grant was having a full-body seizure.

A code was called; blood was drawn and rushed to the lab. Mrs. Grant was then raced to radiology for a scan.

Was there an undetected mass, blood clot, blood leak, or other neurological cause? All those tests were negative.

Mrs. Grant was wheeled back to the nursing unit.

Awaiting the code team were the shocking results of the blood work: an undetectably low serum-glucose level. Mrs. Grant had nearly no blood sugar.

Her brain was sputtering like an engine pulling from a dry gas tank.

Hurried attempts to intervene intravenously failed. Mrs. Grant’s condition worsened and she went into a coma.

Weeks later, her family withdrew life support. What had happened?


To the hospital’s credit, an investigation was started immediately, with interviews, analyses, and a reconstruction of events.


They started by talking with the day nurse. What did she know? Nothing, as it turned out. She had just started her shift; her first interaction with Mrs. Grant had occurred when she observed the seizures and alerted the code team.

The night nurse had more to say, but nothing that seemed to shed any light — at first.

Apparently, he was at the nursing station when an alarm sounded at 6:45 a.m.

The monitor was reporting a possible occlusion, a potentially life-threatening blood clot, in the catheter that snaked through Mrs. Grant’s vein to administer medication.

Understanding the severity of the situation, he hurried to Mrs. Grant’s room, loaded a syringe with a dose of the anticoagulant heparin, and injected it into the line.

He then checked that Mrs. Grant was resting comfortably and resumed caring for his other patients. Not until the code was sounded about an hour and a half later did he see Mrs. Grant again.

The investigation was drawing blanks until someone taking inventory of Mrs. Grant’s room asked where the empty vial of heparin was.

It should have been in the sharps container, the box in which used vials, needles, and other dangerous materials are disposed, but it was not.

Nursing is a hectic job with constant bursts of short-duration tasks and care for one patient inextricably intertwined with that of others.

The vial might have gotten swept up in the flurry of work. The investigators immediately started searching for it elsewhere.

Was it on a counter or in a cabinet? Had someone carried it to another patient’s room? It could not be found.

Then the staffer taking inventory asked a more ominous question: Why was there an empty multidose vial of insulin in the disposal box?

No one had an answer, certainly not the night nurse. There had been no orders to give Mrs. Grant insulin.

The vial did not appear to have been carried in from another patient’s room. There was only one remaining explanation, and the implications were horrifying.

What started as an investigation likely turned into an interrogation, with someone demanding of the night nurse, “Where is the empty heparin vial? Why do we have an empty vial of insulin?” “I don’t know!” he must have protested, asserting both ignorance and innocence.

Then there was a jarring finding. Someone realized that insulin and heparin vials could not easily be distinguished from each other.

By size, shape, weight, and texture, they had the same feel. They looked alike, too.

A quick glance was insufficient to tell one from the other because both vials were clear glass containing a clear fluid.

Yes, they were labeled differently, but the vials were small, the labels were even smaller, and the type on the labels was smaller still.

Of course, a person could distinguish one from the other — but in a rush, responding to an alarm, in a darkened room at the end of an overnight shift? Not likely.

The team then realized what must have happened.


The nurse, in his rush to help Mrs. Grant, had reached for a vial of heparin and somehow grabbed a vial of insulin instead.

The vials were stored only 18 inches apart on the same nursing cart. Maybe he reached for the wrong location.

Maybe an insulin vial had found its way into the heparin stock. We’ll never know.

Once the vial was in his hand, and certainly once the medicine was in the syringe, he could not know that his best efforts to protect his patient from the ill effects of a possible blood clot were going to kill her.



Whom would you blame for Mrs. Grant’s death? One could immediately blame the nurse.


After all, he was the one who delivered the fatal dose. While that might be emotionally gratifying, it leaves open the question of what he had actually done wrong.

He heard the alarm, interpreted it correctly, rushed to do what was correct in that situation, and reached for what he thought was heparin.

You can argue that he should have checked, double-checked, and even triple-checked that he had the right medication, but there is overwhelming evidence that relying on vigilance, monitoring, and otherwise being careful is a poor defense against error.

People are not wired to be reliably careful.

That is why, for example, such a large investment has been made in designing aircraft cockpits so that it is difficult to do the wrong thing — mistake thrust for flaps, turn or descend too sharply — and easy to do what is right.




That is why airplane crews are allowed to work only so many hours at a stretch, only so many hours in a day, and only so many days in a week.

Yet here was Mrs. Grant’s nurse, rushing to save a life early in the morning and at the end of his shift, hardly the sweet spot of his circadian rhythm.




Do we insist that he should have interrupted his urgent response in order to examine the vials calmly and coolly — in dim light, mind you? As I mentioned above, this kind of carefulness is not really what we are wired for.

The truth is, the nurse was tricked — by packaging, presentation, lighting, and timing — into killing Mrs. Grant.



But who set this booby trap? Was it the pharmacy staff? After all, it was their job to prepare, package, and present medication to the nursing staff.


But the pharmacy staff might protest that they had done nothing wrong. 

Mrs. Grant did not die because the medication was of the wrong concentration or contaminated or mislabeled.

The nurse had done what a nurse was expected to do. The pharmacy staff had done what a pharmacy staff is expected to do. But Mrs. Grant was dead.



The real problem is that the system’s pieces may have worked, but their interaction failed, as the work of the pharmacy was grossly flawed from the perspective of nursing. Why?


If this hospital was like many with which I am familiar, it had a hierarchy within nursing — a charge nurse, the unit’s nursing manager, and a chief of nursing — and a hierarchy in pharmacy.

But what it likely lacked was someone responsible for the whole process of medication administration — all the way from the doctors who write the orders to the pharmacy where the orders are checked and filled to the nurses who give the patients their meds.

In the absence of an efficient way — or perhaps any way at all — to manage the functional pieces in the service of the whole process, there turned out to be a fatal disconnect.



But if that process was booby-trapped, you might ask, wouldn’t people like Mrs. Grant get killed all the time? And wouldn’t that already have gotten management’s attention?



This brings us to workarounds, firefighting, making do, and other means of coping with system chatter as a basic pathology of complex work systems.


David Bates, a physician at Brigham and Women’s Hospital in Boston and the author of the case about Mrs. Grant, has done extensive research on the frequency of medication-administration errors.

He and his colleagues discovered that for every patient killed by an error in medication administration, 5 to 10 are injured. (For example, Mrs. Grant might have lived but suffered harm.)

For every injury, there are 5 to 10 close calls. (Mrs. Grant’s nurse might have caught his mistake just as he was about to inject the insulin.)

For every close call, there are 5 to 10 slips and mistakes. (Mrs. Grant’s nurse might have picked up a vial of insulin, noticed that it was the wrong medication, put it back, and picked up the correct one instead.)

Behind the one mistake that killed Mrs. Grant, we can imagine 5 to 10 injuries, 25 to 100 close calls, and 125 to 1,000 slips and mistakes — a total of between 155 and 1,110 chances for someone at that hospital to say, “Hey, these vials are easy to mix up! Let’s do something about it before we kill someone.”
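The tiered arithmetic above can be checked with a quick sketch. This is a hypothetical illustration, not code from Bates’s study; the `pyramid` helper is invented here, and the 5 and 10 multipliers are simply the ratios quoted above applied to the three tiers below the single fatal error.

```python
# Incident pyramid implied by the 5-to-10 ratios quoted above.
# Starting from 1 fatal medication error, each tier below it
# (injuries, close calls, slips/mistakes) multiplies by the ratio.

def pyramid(ratio, tiers=3):
    """Return per-tier counts below one death, and their total."""
    counts = [ratio ** k for k in range(1, tiers + 1)]
    return counts, sum(counts)

low_counts, low_total = pyramid(5)     # lower bound of the 5-to-10 rule
high_counts, high_total = pyramid(10)  # upper bound

print(low_counts, low_total)    # [5, 25, 125] 155
print(high_counts, high_total)  # [10, 100, 1000] 1110
```

The totals reproduce the 155-to-1,110 range of warning events behind the single fatal mistake.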


That’s exactly what would have happened in a high-velocity organization. But in low-velocity organizations, people suppress those indications that the work processes are inadequate and have to be modified.


When they run into obstacles, they treat them as the normal noise of the process, its unavoidable perturbations, around which they must work.

They get the job done, but they do nothing to give the next person a higher chance of success and a lower chance of failure.


For example, my colleague, Harvard professor Anita Tucker, made detailed minute-by-minute observations of nurses and found that they confronted some sort of operational failure — a glitch, an interruption, a misunderstanding, the absence of something needed — every few minutes.


Ninety percent of the time the nurses found a way to make do, finish the task, and meet their other responsibilities.

What do you think they did the remaining 10 percent of the time? Remember: Even if only 1 in 10 slips, mistakes, and close calls with insulin and heparin had been investigated, that might have saved Mrs. Grant’s life.

Unfortunately, at least for the nurses Tucker tracked, the other 10 percent of the time they did not draw someone’s attention to the problem so its causes could be rectified and its recurrence prevented.

They did show the problem to someone else, but that was only to get help working around it: a fellow nurse who could decipher a particular doctor’s illegible handwriting or someone heading down the hall who could snag some gloves or gowns.


Some of the temporary fixes were creative and expressed the nurses’ determination to meet the needs of their patients, but they had the inadvertent consequence of leaving in place the factors that had caused the problem in the first place.


(One nurse with whom I worked, when confronted with this reality of working around problems, blurted out, “I thought I was a great problem solver, but I just realized I’ve been solving the same problem every day for twenty years!”

We’ll visit with her later and see the results of her change from persistently working around problems to seeing problems and solving them.)

Applying the findings of Bates and Tucker to the situation in Mrs. Grant’s hospital, it is possible that despite the hundreds if not thousands of warnings that there was something deficient in the way those medications were presented to nurses, nothing was done in an environment of workarounds and firefighting, leaving Mrs. Grant and her nurse to their fates.



What killed Mrs. Grant? The nurse? The pharmacist? No. It was an ineffective approach to managing complex interactive work that proved to be her undoing.


It was not clear to people how their work fit into a larger system.

The nature of the situations in which they found themselves was often ambiguous, and even when it was obvious that something was amiss, they kept plugging away, dealing as best they could with one thing after another.

Diane Vaughan calls this the normalization of deviance.



There is a sad irony here. The hospital staff was able to determine what had killed Mrs. Grant only because they did exactly the right thing, quickly swarming the catastrophic situation once it had been discovered.


If they had waited a day or even an hour, memories might have changed, the sharps container might have been emptied, and the conditions that had allowed the problem to occur might have changed enough to prevent anyone from ever figuring out what had happened.

If that staff had only worked in an organization that trained and expected them to swarm small discrepancies — the slips, mistakes, and close calls — with the same velocity and determination, they would have seen the medication-administration system’s vulnerabilities earlier, and this disaster might have been averted.




About the authors:


Steven J. Spear is a Senior Lecturer at MIT’s Sloan School of Management and Senior Fellow at the Institute for Healthcare Improvement.

