In Criminalizing Error, We Are Doomed to Repeat Our Mistakes

Sending a nurse to prison for causing a patient’s death may satisfy the thirst for vengeance, but it won’t make hospitals any safer.

Jessie Singer

April 5, 2022

RaDonda Vaught, charged with reckless homicide, arrives for a court hearing on February 20, 2019, in Nashville, Tenn. (Mark Humphrey / AP Photo)

Two weeks ago in Nashville, a jury found former Vanderbilt University Medical Center nurse RaDonda Vaught guilty of criminally negligent homicide and abuse of an impaired adult for unintentionally giving her patient, 75-year-old Charlene Murphey, the wrong medication. Murphey died; Vaught faces up to six years in prison for the abuse charge and up to two years for homicide.

Vaught’s prosecution sets a dangerous precedent. When we criminalize human error, we are doomed to repeat our mistakes. Consider Vaught’s story: The nurse, who admitted to being complacent and distracted, selected the wrong medicine from an electronic medication dispensing cabinet, overriding at least five warnings from the system about a potential error. But, Vaught testified, overriding such warnings was an everyday practice at that medical center, necessary because technical problems frequently beset the dispensing cabinet and medical records systems, delaying urgent care. In fact, she said, Vanderbilt explicitly instructed nurses to override the system as a workaround. Just three days of caring for Murphey required at least 20 overrides.

Criminalizing Vaught’s error implies that administering the wrong medication was, by itself, what caused Murphey’s death—that the problem was a bad person, not a bad system. But Vaught’s error was simply the last thing that went wrong in a long causal chain. Even putting aside the institutionalized practice of overriding warnings, Vaught administered the wrong medication in a high-pressure work environment undergoing an overhaul of its medical records system, where nurses were required to put in 12-hour shifts while orienting a new employee during a nationwide nurse shortage. We do not know exactly how distracting Vaught’s circumstances were, or what the patient-to-nurse ratio at Vanderbilt was that day, but we do know that a higher patient-to-nurse ratio means a higher rate of accidents, and that the more distracted a nurse is, the higher the likelihood of medical errors.

The prosecution of Vaught’s error is a prime example of what workplace safety expert Sidney Dekker calls the “bad apple” theory—the idea that errors in the workplace are caused by problematic people. Applied to Vaught’s error, for example, this theory would tell us that Vanderbilt was inherently a safe place to work and that one nurse’s carelessness made it unsafe. When accidents happen, by this logic, they are the fault of a few “bad apples,” and employers can make the workplace safe by removing, retraining, or otherwise punishing the problematic ones.


“Punishment focuses the attention on only one contributing factor or component. In a complex system, you can never explain, let alone improve, its functioning by reference to the performance or nonperformance of a single component. That is literally nonsensical,” Dekker explained to me. “Complex systems fail and succeed because of the interaction between a whole host of factors.” Dekker advocates instead what he calls the “new view”—the idea that if people are making mistakes and getting hurt, it should be an indicator that the system in which they work is unsafe.

Still, you can find the “bad apple” theory employed in response to tragedy and disaster across a wide range of workplaces: the Arizona autonomous vehicle test driver prosecuted for negligent homicide after looking at her phone just before the Uber AV killed a pedestrian, even though Uber had disabled the vehicle’s emergency brakes; the Colorado truck driver sentenced to 10 years in prison when his brakes failed, resulting in the death of four people, even though he did not own the truck; the five Washington construction workers charged with manslaughter in a fatal trench collapse, including two who were themselves buried in the trench.

This prosecutorial grandstanding rarely has anything to do with preventing similar “accidents” from happening again. Individual punishment as a response to a systemic problem may even increase the likelihood that disaster repeats. Here’s why: “Bad apple” solutions leave in place the unsafe systems that drove the harm in the first place. After an accident, do you fire the injured construction worker or change the trench-digging procedure company-wide? Do you punish the truck driver—or implement new brake tests? Do you send the nurse to prison—or reprogram the electronic medication cabinet? A great deal of power rests with whoever decides the answer to these questions, and that answer often predicts whether the same accident will happen again.

It is important to note why “bad apple” solutions are adopted by those in power in cases such as Vaught’s. Blaming human error is a win-win for those in charge: It sends a message that the system is safe and that human error caused the aberration—and it makes the blamers look like intelligent go-getters taking action to address the problem.

But there is clear evidence that the threat of such retribution makes people less likely to report mistakes—or the dangerous conditions that lead to them. Workers in the construction industry are less likely to report injury and danger in a punitive environment; health care workers commonly fail to report errors for fear of punishment; a study of air traffic controllers found that reports of errors and near-misses fell by half after controllers were prosecuted for one. Both the American Hospital Association and the American Nurses Association warn that Vaught’s prosecution could have a chilling effect on problem-solving and safety.

Nurses are products of their environment, just like AV test drivers, long-haul truckers, construction workers, and air traffic controllers. Punishing them may satisfy the thirst for vengeance, but it won’t make patients safer. Putting Nurse Vaught in jail lulls us into thinking that the situation is resolved while the root causes and systemic dangers remain. Vaught won’t be making the mistake next time, but—unless conditions change—the same “accident” is doomed to repeat.

Jessie Singer is a journalist and the author of There Are No Accidents: The Deadly Rise of Injury and Disaster—Who Profits and Who Pays the Price, recently published by Simon & Schuster.

