When the System Fails: The Critical Role of Controls in SMS
- Jason Starke, Ph.D.

In a well-functioning Safety Management System (SMS), no single safeguard—least of all a human being—should be all that stands between routine operations and a catastrophic outcome. Yet time and again, we see systems that inadvertently position people as the final barrier to failure. One of the most sobering recent examples comes not from aviation, but from healthcare.
The case of nurse RaDonda Vaught has become a widely discussed and deeply divisive example in the medical community, but it also offers significant insights for SMS practitioners in any high-risk industry.
The Incident
In 2017, a patient at Vanderbilt University Medical Center was prescribed Versed, a sedative, to ease anxiety before a PET scan. Instead, Vaught mistakenly administered vecuronium, a paralytic. The patient died, and Vaught was ultimately charged and convicted of criminally negligent homicide—despite self-reporting the error and cooperating fully with the investigation.
While heartbreaking, the case is not just a story of individual error. It is a story of systemic failure—the kind that SMS is designed to identify and mitigate.
Control Failure and System Design
At the heart of this case is a critical control: the automated medication dispensing system. The system was widely known to be unreliable, and overriding it had become common practice among nurses. In SMS terms, we would label this an ineffective or degraded control—a barrier intended to prevent a high-consequence event, but one that fails under operational pressure.
Vaught followed what had become "tribal knowledge": override the machine to complete the task. The unsafe condition was normalized. And this normalization of deviance—when a poor system design becomes the accepted norm—is precisely what an SMS is supposed to root out.
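One way to surface that kind of drift before it surfaces itself is to treat override frequency as a leading indicator. Below is a minimal sketch in Python of what such a check could look like; the control names, counts, and the five-percent threshold are all hypothetical illustrations, not data from the Vanderbilt case.

```python
from dataclasses import dataclass

@dataclass
class ControlAudit:
    """Audit record for one safety control (hypothetical structure)."""
    name: str
    activations: int  # times the control worked as designed
    overrides: int    # times staff bypassed it to get the job done

    @property
    def override_rate(self) -> float:
        total = self.activations + self.overrides
        return self.overrides / total if total else 0.0

def flag_normalized_deviance(audits, threshold=0.05):
    """Return controls bypassed more often than the threshold,
    i.e. where the workaround is becoming the de facto procedure."""
    return [a for a in audits if a.override_rate > threshold]

# Illustrative numbers only, not data from the actual case.
audits = [
    ControlAudit("automated dispensing cabinet", activations=180, overrides=120),
    ControlAudit("two-person high-risk med check", activations=950, overrides=12),
]

for control in flag_normalized_deviance(audits):
    print(f"DEGRADED CONTROL: {control.name} "
          f"overridden {control.override_rate:.0%} of the time")
```

The specific numbers matter less than the design choice: once overrides are counted rather than quietly tolerated, normalized deviance shows up in the audit data long before it shows up in an accident report.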
Latent Conditions and Organizational Drift
This is a textbook example of what we refer to in SMS as latent conditions—systemic weaknesses that lie dormant until aligned with active failures, producing an incident. Long hours, high stress, understaffing, power imbalances between doctors and nurses, and broken controls all aligned to create a high-risk environment. The organization had drifted into unsafe territory without immediate consequence—until tragedy struck.
The Human as the Last Line of Defense
In a robust SMS, we aim for defenses-in-depth. When controls fail, additional mitigations should activate before a person becomes the final safeguard. In this case, every other layer failed, leaving Vaught—fatigued, rushed, and operating in a high-pressure environment—as the only thing standing between the patient and disaster.
That’s not a safety net. That’s a setup.
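Some rough arithmetic shows why layering matters. If defensive layers fail independently (a strong assumption in practice, but useful for intuition), the chance of a catastrophic breakthrough is the product of each layer's failure probability. The sketch below uses purely illustrative numbers:

```python
import math

def p_all_layers_fail(failure_probs):
    """Chance that every independent defensive layer fails at once."""
    return math.prod(failure_probs)

# Illustrative probabilities only, not measured values.
human_alone = p_all_layers_fail([0.01])              # one tired human, ~1% slip rate
defended    = p_all_layers_fail([0.01, 0.05, 0.10])  # human plus two engineered barriers

print(f"human as the only barrier: {human_alone:.4f}")   # 0.0100
print(f"three independent layers:  {defended:.6f}")      # 0.000050
```

Even two additional, modestly reliable layers reduce residual risk by orders of magnitude. A lone, fatigued human enjoys no such margin.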
Reporting Culture and Just Culture
One of the most alarming ripple effects of this case was the response from the broader healthcare community: if self-reporting leads to criminal charges, why report at all?
A strong SMS thrives on a just culture—one where individuals are held accountable for reckless behavior, but not punished for system-induced errors. Vaught’s prosecution has created fear and mistrust, threatening the very foundation of safety reporting. If nurses (or pilots, or maintenance crews) are too afraid to report close calls, we lose the most valuable data we have for proactive risk management.
SMS Takeaways
This case is a cautionary tale—and a call to action for safety professionals. Here are key takeaways through the lens of SMS:
- Audit your controls regularly. Are they working as designed? Are they being bypassed? If so, why?
- Strengthen defenses-in-depth. Don’t rely on the human as the final barrier. Humans are fallible—especially under fatigue and stress.
- Listen to the frontline. Tribal knowledge is a warning sign. If staff have "workarounds," your system is already failing.
- Protect reporting culture. Punishing honest errors erodes trust and disables proactive safety.
- Apply the Substitution Test. Would another qualified person, in the same context, have made the same mistake? If so, the system—not just the individual—is flawed.
Final Thoughts
As safety professionals—whether in aviation, healthcare, energy, or another high-risk field—our work is to design systems that fail safely. That means recognizing that every human is susceptible to error and ensuring our systems are robust enough to absorb those errors without catastrophic consequences.
The RaDonda Vaught case is a reminder that when SMS is absent, ignored, or hollow, the result isn't just inefficiency—it can be irreversible harm.
Let’s build systems where safety is not the exception, but the expectation.