Sunday, July 29, 2012

Adverse Outcomes in Emergency Medicine: Poor Judgment, Flawed Decision or a Fragile Socio-technical System?

Trials and Tribulations in Emergency Medicine: The Unintended Death of Rory Staunton
"Knowledge and error flow from the same mental resources, only success can tell the one from the other." - Ernst Mach (1905)
Can a small nick, cut or scrape kill someone? Unfortunately, the answer turned out to be yes for Rory Staunton, a strapping 12-year-old who cut himself while diving for the basketball in gym class. The simple cut set off an irreversible series of events -- the wound was infected by bacteria (Group A streptococcus), leading to septic shock -- that ended in his untimely death in a New York City hospital. (You can read the full story here as reported by The New York Times.)

Patient safety -- and the prevention of iatrogenic (medical) error -- is of great concern to medical practitioners, insurers, hospital systems and, last but not least, patients and their families. Despite a wealth of research on this topic over the last several years, by scholars ranging from physicians to human factors scientists, mishaps like the one reported above still occur. Fortunately, the more egregious kinds, such as wrong-limb amputation and wrong-site surgery, have largely been mitigated in recent years by changes in procedures.

Returning to Rory: he displayed signs such as fever, an upset stomach and blotches on the skin. The first doctor to treat him, his pediatrician, seems to have concluded that these were unrelated to the wound and were perhaps caused by a stomach flu. (In hindsight, it is now possible to infer that they may have been the first signs of septic shock.) When his signs didn't subside, the pediatrician sent him to the emergency department at NYU Langone Medical Center (LMC).

This transfer of care, from one physician and clinic to another, is where the complexity begins: knowledge about the case and information from Rory (the patient himself), his family and his pediatrician had to be passed on to the hospital. Together these entities form a socio-technical system, consisting of many players, technologies and moving parts.


One of the first things done for an incoming patient at LMC's Emergency Department (ED) is to screen him or her for sepsis. This is done using a checklist, and it was done for Rory upon admission. At that time, he did not show the three or more indications required to raise the red flag for sepsis. The initial hypothesis of a stomach flu, first formulated by the pediatrician, was pursued by LMC's emergency physician as well. The ED physician thus set out to alleviate Rory's symptoms under the assumption that they were caused by the stomach flu. She administered IV fluids, which seem to have improved Rory's condition. That improvement seems to have been taken as sufficient grounds to discharge Rory from the hospital -- even though it was not, because readings and test results (Fig. 1) collected by LMC a few hours after admission did reveal that Rory was entering septic shock. Neither the ED physician nor the hospital seems to have registered this newly acquired information, which could have prevented the premature discharge, or at least served as a trigger to contact Rory's family after discharge so that treatment for sepsis could begin.
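To make the screening logic concrete, here is a minimal sketch, in Python, of how a threshold-based triage checklist of this kind works. The criteria, variable names and the cutoff of three are my own illustrative assumptions; this is not the actual LMC screening tool shown in Fig. 1.

```python
# Minimal sketch of a threshold-based sepsis screen (illustrative assumptions only;
# NOT the actual LMC triage tool in Fig. 1).

SEPSIS_CRITERIA = {
    "abnormal_temperature": lambda v: v["temp_c"] > 38.3 or v["temp_c"] < 36.0,
    "tachycardia":          lambda v: v["heart_rate"] > 90,
    "tachypnea":            lambda v: v["resp_rate"] > 20,
    "suspected_infection":  lambda v: v["suspected_infection"],
}

def sepsis_flag(vitals, threshold=3):
    """Return (flag, count); flag is True when `threshold` or more criteria are met."""
    count = sum(1 for check in SEPSIS_CRITERIA.values() if check(vitals))
    return count >= threshold, count

# A patient can meet some criteria yet still fall below the cutoff, so no red flag is raised.
print(sepsis_flag({"temp_c": 39.0, "heart_rate": 95,
                   "resp_rate": 18, "suspected_infection": False}))  # (False, 2)
```

The brittleness is easy to see: a patient one criterion short of the cutoff generates no alert at triage, and unless the screen is re-run as new readings arrive, the earlier "all clear" quietly stands.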



Fig. 1 Severe Sepsis Triage Screening Tool (via NYT -- See this bigger image for details)


Rory's physicians seem to have fallen prey to an "anchoring heuristic": they held firm to their first hypothesis of stomach flu and never saw his signs and symptoms in a different light. Anchoring is not unusual in human cognition; one anchors to the first hypothesis and fails to consider alternatives, resulting in a decision bias sometimes called a first-hypothesis error.

Rory's incorrect discharge turned out to be the point of no return. He went into septic shock at home (fever, nausea, pain, etc.). He was rushed back to LMC's emergency department, and despite the best efforts of the doctors he could not be saved.

Ex post facto, of course, with our "hindsight bias," it is easy to blame the emergency physician for prematurely discharging the patient and to attribute it to negligence or at-risk, reckless behavior. But we have little insight into the circumstances (from technical constraints to financial pressures) that may have led to that incorrect decision. It is highly likely that this was an unintentional error: at the time, the physician had no idea what would follow.

In this post, I will simplify this somewhat complex systems issue and highlight just two major theories that may partly explain the failure of the system in Rory's case:
1) The Swiss Cheese Model
2) The blunt-end/sharp-end model

Despite the safety barriers we humans devise in complex, safety-critical systems (aviation, nuclear power, medicine, etc.), an adverse event somehow worms its way through the "holes" in these barriers. This was iconically captured by the safety expert James Reason in his Swiss Cheese Model (Fig. 2).




Fig. 2: Swiss Cheese Model for Accident (via S2S)

The latent failures are "invisible" (hidden in the system) until stresses and strains in the system cause the proverbial holes in the Swiss cheese slices to line up and expose them.
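As a back-of-the-envelope illustration (not Reason's formal model), one can treat each defensive layer as stopping a hazard with some probability; an adverse event reaches the patient only when it finds a hole in every layer, and latent failures effectively enlarge those holes. The numbers below are made up purely to show the effect.

```python
# Back-of-the-envelope illustration of the Swiss cheese idea (made-up numbers,
# not Reason's formal model): a hazard causes harm only if it slips through a
# "hole" in every defensive layer.

from math import prod

def p_adverse_event(hole_probabilities):
    """Probability that a hazard passes through every layer of defense."""
    return prod(hole_probabilities)

healthy_system  = [0.05, 0.05, 0.05]   # three layers, small holes
strained_system = [0.30, 0.20, 0.25]   # latent failures have enlarged the holes

print(p_adverse_event(healthy_system))    # 0.000125 -> holes rarely line up
print(p_adverse_event(strained_system))   # 0.015    -> roughly 100x more likely
```

The point of the toy numbers is that no single layer has to fail outright; modestly enlarged holes across several layers are enough to make the fatal alignment far more likely.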

Oftentimes, after a tragic accident has occurred, the spotlight is put on the human agent -- whether a physician, pilot or operator -- who is on the front lines making the final call, decision or intervention. This is referred to as the sharp end of the system. Little attention is paid to the "back office," or blunt end, of the system, where policies, procedures, training and financial/throughput pressures exist and may influence how the sharp end performs. This was best captured by Woods & Cook in their blunt-end/sharp-end model (Fig. 3).


Fig. 3 Blunt-end/Sharp-end (via Woods & Cook)

When one analyzes the blunt end, it is possible that any of the following factors (among others) contributed to the physician's unintentional errors:


  1. Are physicians trained to avoid diagnostic biases (e.g., the anchoring heuristic)?
  2. Insufficient time due to "throughput" pressures
    • Patient turnover driven by financial pressures
    • ER capacity: does the ER have the resources (rooms, medical staff, etc.) to permit a wait-watch-and-observe protocol for a patient whose recovery status is unclear, or is there pressure to make room for the next incoming patient due to patient volume?
  3. Can EMR (Electronic Medical Records) or other technology aid physicians in diagnosis and not only flag a premature discharge but prevent it? (A minimal sketch of such a guard follows this list.)
  4. Can there be policies and technologies that keep the patient's caregivers in the loop -- i.e., make the process transparent with an open dialogue?
    • Then, even if the professionals missed something flagged by the EMR, the patient or his caregivers would be notified electronically (smartphone, computer, etc.) and could in turn resume the dialogue with the clinicians to clarify it or seek further assistance.
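As one illustration of items 3 and 4, here is a minimal sketch of an EMR-side discharge guard that re-runs the sepsis screen on the latest results and notifies both the care team and the caregivers. The function names, the reuse of the screening sketch above and the notification hook are all hypothetical; this is not a description of any vendor's system.

```python
# Hypothetical sketch of an EMR discharge guard (not any vendor's actual system).
# It re-runs the sepsis screen on the most recent results and, if the picture has
# changed since triage, holds the discharge order and notifies the care team and
# the patient's caregivers.

def guard_discharge(latest_results, notify, threshold=3):
    """Return True if discharge may proceed, False if it should be held for review."""
    flag, n_met = sepsis_flag(latest_results, threshold)  # reuses the earlier screening sketch
    if flag:
        notify("care_team",  f"Discharge held: {n_met} sepsis criteria now met.")
        notify("caregivers", "New results need review before discharge; please contact the ED.")
        return False
    return True
```

A guard like this diagnoses nothing by itself; it simply ensures that a late lab result which tips the screen over the threshold holds the discharge, rather than letting the earlier, reassuring impression stand unchallenged.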

There are also things that can be done outside the system. For instance, how about educating the public that a freshly incurred wound should be washed under running water before putting on a band-aid? Washing the wound is the first line of defense against infection and possible sepsis down the road.

Blaming a physician at the "sharp end" of the system may be emotionally satisfying. But the physician's poor judgment or flawed decision making at the sharp end was more than likely the product of a fragile socio-technical system. What is needed is a more reasoned and seasoned approach at the socio-technical systems level, with its many layers, to prevent the next tragic error.

Ernst Mach observed that "knowledge and error flow from the same mental resources, only success can tell the one from the other." A well-designed healthcare socio-technical system has the potential to favor knowledge over error and to promote recovery rather than adverse outcomes.

Moin Rahman
Founder/Principal Scientist
HVHF Sciences, LLC

HVHF on Facebook
http://www.linkedin.com/company/hvhf-sciences-llc

Saturday, July 21, 2012

Colorado Shooting Tragedy: A Socio-Technical System Failure to the "Left-of-Fire"

Once again, the inability to devise, legislate and put in place a policy that would prevent senseless shootings has cost 12 lives in Colorado. (The shooter, James Holmes, 24, was able to legally purchase his weapons in gun stores and many rounds of ammunition online.)

The objective of this post is not to debate the history or constitutionality of the Second Amendment -- for that, see this fine piece published in The New Yorker -- but to study the role of technologies that have the potential to kill in civil society.

Socio-technical Systems
A civil society is a socio-technical system in which society (a group of citizens) must interact with both simple and complex technologies, ranging from electricity to automobiles. These technologies can kill if citizens are reckless, negligent -- or malevolent in their intent. To prevent this, the socio-technical system (STS) at large, with its different strata (politicians, policy makers, regulators, business owners, engineers and end-users, among others) and interacting technological components, has to plan, coordinate, engineer, implement and execute a robust and reliable system that prevents injury and death.

One good example of such an STS -- a quasi-ecosystem, if you will -- is the motor vehicle. Consider two organizations in the motor vehicle STS: the Department of Motor Vehicles (DMV) and the National Highway Traffic Safety Administration (NHTSA). The DMV is charged with ensuring drivers have the necessary knowledge and skills to safely operate an automobile. The NHTSA is charged with ensuring that vehicles are designed to meet minimum standards for safety. Despite these measures there were 32,885 motor vehicle deaths in the US in 2010. Of course, one should keep in mind that the primary purpose of a motor vehicle is to provide a means of transport, not to kill. Needless to say, the kind of socio-technical system seen in the case of motor vehicles is lacking for guns.

Left-of-Fire
In counter-insurgency warfare, a new doctrine called "left-of-boom," based on social science and network theory, was developed to put an end to roadside bombs (a.k.a. improvised explosive devices, IEDs) set off by insurgents. If one were to visualize bomb-making as a supply chain in an insurgency socio-technical system, moving from left to right, one finds a funder for the operation, a technical planner, a material purchaser, a supplier of components, an assembler, an intelligence gatherer, a bomb planter and a trigger puller -- all occurring to the left of the boom (the bomb going off) on the road. To the right of the boom is the response: paramedics, forensics, law enforcement and legal prosecution, among others. By going to the left of boom and disrupting the supply chain, counter-insurgents (the allied military) in Iraq reduced roadside fatalities caused by IEDs.

In the case of motor vehicles we have a fairly robust STS -- to the "left of crash" -- that has reduced road fatalities over the years, though it is still not good enough. In the case of guns, the STS that exists today, to the "left of fire," is woefully inadequate. Beyond legal controls, we lack technologies that specifically address the human factor (the "end-user") at the point of interaction (the gun-human interface). For instance, we do not know how to lock out a weapon when the user at that moment happens to be deranged, inebriated, or experiencing a moment of rage.

Until we have a robust STS for guns, with policies, laws and technologies that provide the necessary checks and controls, gun violence, unfortunately, is here to stay. Or it could be prevented by design, by taking a socio-technical systems perspective founded in technology, human factors psychology and systems engineering. If one were to express this in the language of the Second Amendment, it would go like this:
A well regulated socio-technical system being necessary to the security of all citizens in a free state, the right of the people not to be grievously injured by someone else's inability to safely bear arms shall not be infringed.

Moin Rahman
Founder/Principal Scientist
High Velocity Human Factors "HVHF" Sciences
http://hvhfsciences.com/
HVHF on Facebook
http://www.linkedin.com/company/hvhf-sciences-llc

Thursday, July 12, 2012

Automation's Biggest Irony (after all these years): The Non-Surprise

Bainbridge (1987), in "Ironies of Automation," observed that automatic equipment seems to function best when the workload is light and the task routine; when the operator needs assistance because the automation cannot handle a novel situation, causing a spike in the operator's workload, the automatic equipment is of least assistance. This is the "irony" of automation.

This "irony" seems to have some relevance to the crash of Air France 447 as reported by IEEE Spectrum. In short, the pilot had no idea as to why the autopilot may have disengaged suddenly at cruising altitude -- a surprise (!) -- which resulted in inappropriate pilot inputs. (The pilots were unaware that all three air speed sensors (pitot tubes) were defective -- giving incorrect inputs to the flight computers due to the formation of ice crystals  -- and as the autopilot didn't have airspeeds to work with, it automatically disengaged.)



The biggest irony of automation, after all these years of human factors research and design, should really be viewed as a "non-surprise" for the following reasons:

  1. Automation is not fail-proof, and it can have dangerous consequences when the human operator is suddenly put in charge after an [automation] failure, thrusting him or her into a situation where the stakes are high and the time available is short.
  2. A sudden failure of automation in a highly complex system, whose inner workings are opaque to the operator, may be beyond the cognitive means of a highly stressed (panicky) operator to troubleshoot and recover from in time.
The above (#2) happens when a pilot is suddenly forced to shift roles from passive monitor ("out of the loop") to active operator ("in the loop") and must grapple with the situation and grasp what is going on by rapidly developing a veridical mental model of it. Furthermore, this ability can be impaired by danger- or stress-induced impoverishment of the operator's cognitive control (rational thinking), resulting in disorganization of thought and/or inappropriate responses. (The latter topic forms the intellectual underpinnings of "High Velocity Human Factors.")

Years of experience have shown that automation will invariably abdicate its responsibility when its performance envelope is exceeded, bewildering the operator -- which should come as no surprise to designers. So I will refer to it as a Non-Surprise. It thus behooves designers to provide "means" that are not mentally taxing (i.e., that do not require cognitive transformations and inferential reasoning) by which a highly stressed operator can comprehend and take control of a non-normal situation. But what are the "means" to this end? I will reserve that for another post.

Moin Rahman
Founder/Principal Scientist
High Velocity Human Factors "HVHF" Sciences
http://hvhfsciences.com/
HVHF on Facebook
http://www.linkedin.com/company/hvhf-sciences-llc