Moin Rahman, Principal Scientist, HVHF Sciences, presents the Direct Perception-Action Coupling (DPAC) approach to designing hyper-intuitive interactions between professional end-users and mission-critical products, particularly in high-stakes, time-compressed situations. The DPAC approach is especially suited to informing the design of technology (computers, communication devices, cockpits, life-sustaining devices, etc.) in safety-critical domains such as warfighting, firefighting, emergency medicine, aviation, and automobiles, among others.
Rahman, M., Balakrishnan, G., & Bergin, T. (2011). Designing Human-Machine Interfaces for Naturalistic Perceptions, Decisions and Actions occurring in Emergency Situations. Theoretical Issues in Ergonomics Science, 13(3), 358-379.
Available online at: http://www.tandfonline.com/doi/abs/10.1080/1463922X.2010.506561
Rahman, M. (2012). Direct Perception-Action Coupling: A Neo-Gibsonian Model for Critical Human-Machine Interactions under Stress. In Proceedings of the Human Factors and Ergonomics Society 56th Annual Meeting. Santa Monica, CA: Human Factors and Ergonomics Society.
Available online at: http://pro.sagepub.com/content/56/1/1401.abstract?etoc
"Smart" Embodied Interactions - Examples
Caterpillar M-Series Motor Grader (latest model) -
Note: More than 15 levers present in the previous model were replaced with two joysticks that embody the DPAC approach discussed in the presentation.
Nest Learning Thermostat
Black & Decker Gyro Screwdriver
-------------------------------------------
Moin Rahman
Founder/Principal Scientist
HVHF Sciences, LLC
"Designing systems and solutions for human interactions when stakes are high, moments are fleeting and actions are critical."
Over the last two decades, officers of the New York Police Department (NYPD) have been involved in inadvertent shootings of unarmed citizens (e.g., Amadou Diallo, Sean Bell), which have resulted in the tragic loss of innocent lives. The most recent incident occurred on Oct. 4, when Noel Polanco, 22, was fatally shot by Detective Hassan Hamdy, 39, a 14-year veteran assigned to the elite Emergency Service Unit. Needless to say, these incidents not only besmirch the reputation of a police department whose function in a republic is to protect and serve its citizens but also cause irreparable harm to police-community relations.
Mr. Polanco's Honda was pulled over on the Grand Central Parkway because, the police said, it had cut off their vehicles.
The questions raised by these tragic incidents are many. But one crucial question of great interest to the human factors/cognitive scientist is what may have caused the police officer not to realize that the citizen was unarmed and meant no harm -- and yet to pursue a course of action (use of deadly force) that was incompatible with the situation encountered (an unarmed civilian). Put in human factors jargon (taxonomy [Marx, 2008]), did the police officer commit:
Normal Error: an inadvertent action (slip, lapse, mistake)?
At-Risk Behavior: a choice in which the risk was not recognized or was believed justified?
Reckless Behavior: conscious disregard of an unreasonable risk?
The job of a police officer is certainly not easy. He must accomplish speedy and successful "sensemaking" (making sense of one's experience and giving it meaning) and situated social cognition (reading and interpreting the intent of a civilian who may appear to pose a threat) in a high-stakes situation, where incorrect sensemaking may end up costing him his own life.
So the question is: how should the citizens of a republic recruit a police officer with the right psychological profile, one who will not indulge in Reckless Behavior; legally arm him; and then ensure that he is trained so that he does not fall victim to his own poor choices (At-Risk Behavior) or commit a Normal Error? This is also shaped by the relationship between a police officer and a citizen, through what is referred to as "Power Distance." Power Distance is one of five dimensions of culture identified by the sociologist Geert Hofstede, who defined it as:
"the extent to which the less powerful members of organizations and institutions (like the family) accept and expect that power is distributed unequally."
The power distance between police and citizenry varies. It may be very large in a police state, moderate in a country like the United States (and may vary from city to city), and relatively low in European democracies such as Great Britain. The abuse and misuse of power distance are serious problems that can cause loss of life in risky socio-technical systems such as police departments, the military, aviation or hospitals, as they set expectations for how personnel (or the bureaucracy) should perform within the in-group and interact with the out-group. Problems caused by power distance are typically symptoms of a decrepit organizational culture marked by poor management, training and operational protocols.
Having introduced the above concepts, let us now consider Mr. Polanco's shooting through them. Early reports on this shooting indicate that "Mr. Polanco was driving erratically, switching lanes while speeding, and twice cutting off two police trucks carrying nine officers of the Emergency Service Unit..." The narrative given by a passenger in Mr. Polanco's car suggests that the officers committed an act of road rage, allegedly losing their temper when their authority (power distance) was challenged on the road (Mr. Polanco cutting off the two police trucks, twice). This may have been exacerbated when the police officers approached Mr. Polanco's Honda, which was forced to a stop, and ordered those inside the car to show their hands -- and Mr. Polanco didn't comply. According to the Times, a passenger in the Honda, Diane Deferrari, said in an interview that Mr. Polanco "...had no time to comply and that, in that instant Detective Hamdy fired the shot."

It is too early to conclude what may have caused Detective Hamdy to fire that fatal shot. Was it Reckless Behavior? Or a case of Normal Error or At-Risk Behavior brought about by poor sensemaking? Some may be inclined to lean towards Reckless Behavior, even though this was the very first shot Detective Hamdy fired in his 14-year career as a police officer. They may point to the two lawsuits brought against him for allegedly not following proper procedures when apprehending suspects. On the other hand, the portrait of Detective Hamdy is somewhat complex: earlier this year he was accorded the status of a hero for helping rescue five people from a burning apartment building.

One may also wonder whether Detective Hamdy's prior professional background in the military, which has a very different conception of the use of fatal force and power distance than policing, may have influenced his decision making in the situation discussed above. (Detective Hamdy served four years in the Marine Corps, rising to the rank of sergeant in an artillery division, and earned medals for good conduct.) Furthermore, did Detective Hamdy's current assignment in the Tactical Apprehension Unit (TAU) of the NYPD -- a very stressful and risky operational setting that may employ a somewhat larger Power Distance than typical street policing -- play a role in this shooting? Did what might be called a negative transfer of skills, experience and training -- from TAU work, which is geared towards taking on criminals and gangs, to a situation of a different nature (a driver operating a vehicle unsafely on the road) -- get the better of him?

Next, one may also attribute Detective Hamdy's inappropriate decision making to issues raised by High Velocity Human Factors (HVHF). Did the autonomic arousal -- what is termed "predatory cardiovascular reactions" (much like the arousal a predator experiences when chasing prey) -- triggered by the adrenaline released in the car chase play a role? Perhaps this autonomic arousal kept him from pausing (sensemaking) before opening fire -- for instance, to consider the possibility that Mr. Polanco may not have heard the officers' orders to raise his hands. Or was there a real threat perceived by Detective Hamdy when Mr. Polanco didn't raise his hands from the steering wheel? (An earlier report indicates that a power drill was found on the passenger seat of the vehicle.)
Did danger-induced emotional arousal distort the facts [via perceptual mechanisms such as the "snake in the grass effect"], much as the officers who shot Amadou Diallo mistook his black wallet for a gun? We may not know until the inquiry is complete. In the meantime, there is one thing -- training -- that certainly needs to be revisited in the best interests of all concerned.

Professionals in a variety of fields are trained under the rubric referred to as KSAs (Knowledge-Skills-Attitudes). These can be briefly described as follows in the context of doing a job, whether it be flying a plane or being a police officer:
Knowledge: need to know
Skills: need to do
Attitudes: need to feel
Training of professionals is typically very good on the first two (K & S). But it is always a challenge with the last one, Attitudes. In policing, particularly in time-compressed, high-stakes situations, an officer may not have enough time for analysis of the situation, rational thought and decision making. He literally has to go with somatic situation awareness (see publications), or what is referred to as "gut feeling" in the vernacular. How does one "train" gut feeling to make the right decisions when danger is imminent and the moments are fleeting?

This issue has been studied under the auspices of HVHF by bringing to bear both evolutionary psychology and neuroscience. The process of interpreting this body of science and translating it into a pedagogical curriculum -- one that inculcates in an officer the ability to feel (Attitude: need to feel) and interpret the intent (situated social cognition) of a civilian -- has just begun. This work needs to be accelerated so that officers, and soldiers too (particularly in COunter INsurgency operations; COIN), do not become victims of their own circumstances, ending up fatally shooting innocent civilians like Mr. Polanco, or killing one of their own by committing fratricide.

Moin Rahman
Founder/Principal Scientist
HVHF Sciences, LLC
HVHF on Facebook
http://www.linkedin.com/company/hvhf-sciences-llc
http://www.linkedin.com/in/moinrahman
Fragments of Attention: The Perils of Multitasking
Any man who can drive safely while kissing a pretty girl is simply not giving the kiss the attention it deserves. - Albert Einstein
Student and teenage drivers are often told to "keep your eyes on the road." Seasoned drivers are told, "don't text or talk on the cellphone while driving." These admonitions are spot-on. Yet one rarely hears someone telling drivers to "pay attention to the road." After all, what is the point of LOOKING but not SEEING? That is, one should not only look at the road but also cognitively register the stimuli on it (other cars, pedestrians, etc.), including their intentions, trajectories, and the unexpected appearance of new stimuli. Thus, a good driver has to monitor, by paying attention, both the present state of affairs on the road and its potential future. He must do all this while simultaneously attending, albeit with automaticity (for the expert driver), to other driving tasks such as steering and keeping the gas pedal depressed; and, at times, to undesirable secondary tasks like arguing with a colleague about some abstruse or mundane topic -- say, the Higgs boson [the "God particle"] or where to go for lunch.
The challenge posed by a modern automobile's cockpit, with its multitude of devices (GPS, entertainment system, climate control, etc.; Figure 1a), when combined with heavy traffic and poor decision making by the driver (focusing on, say, track selection on his music player rather than lane selection on the road), can result in adverse outcomes -- simply because there are limits to how many tasks one can perform by dividing attention while driving. Findings from a recent study on the perils of multitasking -- texting while driving -- provide further details about the limited bandwidth of divided attention.
Figure 1a: The complexity of a modern automobile cockpit has multiplied with the increase in the number of subsystems (GPS, climate control, etc.) that are unrelated to the primary task of driving
Flying a plane requires the pilot to both aviate and navigate at once. A pilot has to divide attention (scan) across six essential primary flight instruments (Figure 1b) to keep the plane flying and heading in the right direction. This does not include chatting with the co-pilot, monitoring radio traffic, or even watching engine parameters. The inherent task of flying an airplane fragments attention and tests its limits.
Figure 1b: The primary flight instrumentation (the minimum) required to aviate and navigate a plane
1: Airspeed Indicator (ASI); 2: Attitude Indicator (AI); 3: Altimeter; 4: Turn Coordinator (TC); 5: Directional Gyro (DG); 6: Vertical Speed Indicator (VSI). For more details read this article.
So one needs to pay attention to Einstein's words (quoted above) if one wishes not to crash the car -- or, per its corollary, to deliver and experience the pleasure of a quality kiss!
The "Spotlight" of Attention: when it holds and when it folds
To simplify the understanding of attention, it can be thought of as a spotlight: where the spotlight of attention is pointed determines what we hear or see. Paying attention to what is being said by a man to our right at a cocktail party -- shining the "attentional" spotlight on him -- may result in us ignoring what is being said by the lady on our left. This spotlight theory holds true until someone across the room suddenly utters our name in passing conversation. Our attention is immediately drawn to our name because of its familiarity and the emotional valence it carries for us. This is known as the Cocktail Party Effect.
In dangerous and threatening situations, our brain has been evolutionarily wired to "preattentively" process negative information. Studies have shown that an angry face is spotted faster in a collection of happy faces, whereas the opposite -- spotting a happy face in a collection of angry faces -- is much slower (Figure 2).
Figure 2: Preattentive Processing of the Angry Face - "Angry Face Effect"
This subconscious spotlighting of attention is also revealed in what Prof. Joseph LeDoux refers to as the "Snake in the Grass Effect." Say you're walking in the woods and suddenly feel something gliding past your shin. Your immediate response, even before you can consciously attend to the plausible cause, is most likely to recoil in shock. Only then do you look at the cause: was it something dangerous (a snake) or just a fallen branch? Our unconscious spotlight of attention, via the fear response (the emotional brain), may save our life if it happens to be a snake.
These types of biases (survival instincts) may sometimes prove fatal in high-risk professions such as law enforcement and combat. One tends to have an implicit fear response, which may result in tragic consequences such as the shooting of Amadou Diallo, an innocent civilian, or fratricide (the A-10 Thunderbolt air-to-ground friendly-fire incident in the Iraq war).
Attention can be exteroceptive, i.e., linked to the sensory system (vision, hearing, touch, etc.), or nonexteroceptive (purely in the mind). In the exteroceptive case, one can figure out what is flying at a distance (a bird or a plane) by paying attention to it; in the nonexteroceptive case, one can multiply (22 x 38) by holding the numbers in mind and applying certain rules to get the answer.
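To make the nonexteroceptive case concrete, the "certain rules" amount to holding intermediate results in working memory while the next step is computed:

$$22 \times 38 = 22 \times (40 - 2) = 880 - 44 = 836$$

No external stimulus is involved; the entire computation, and the attention it demands, is internal.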
What is even more interesting is that exteroceptive attention may cut across sensory modalities. For instance, "looking" with one's eyes at the source of a sound may help us "hear" better and understand what is being said. We may have unconsciously stared at the speakers of a PA system, or in the direction from which the sound is propagating, to better hear what is being said in a noisy airport. And sometimes looking at the speaker's lips can influence what we hear! Check out this illusion, called the McGurk effect, below.
Close your eyes, play this movie, and listen. Open your eyes, replay and listen again. Is he saying "ba ba" or "da da"? It's called the McGurk effect. The man in this video, and its creator, is Arnt Maasø, associate professor at the University of Oslo. (More on the McGurk effect.)
Communication and its comprehension (or the opposite: misunderstanding or mishearing) are key determinants of whether a crew, squad or team succeeds. This can apply to, say, an aircraft cockpit, a surgical theater, a coal mine, or a marine unit in the heat of urban combat. Poor communication skills and protocols, or noisy radio links, can result in dire outcomes or false alarms -- as was the case when a false alarm was issued at a Shell refinery.
Funneling-Tunneling of Attention
On 29 Dec 1972, Eastern Air Lines Flight 401, a Lockheed L-1011, had reached its destination, Miami, and was preparing to land. At that time, the crew encountered a problem with the landing gear and became preoccupied with troubleshooting it. The crew's attention was tunneled on the problem and nobody noticed the gradual loss in altitude -- nobody paid attention to the altimeter -- and the L-1011 crashed, killing 101 of the 176 people on board.
Focusing attention in a tunnel, without periodically scanning for what else might be going on in the environment, can be costly. Put another way, the tunneling and funneling of attention -- the switching between focused and divided attention -- should be optimized to minimize error (Figure 3).
Figure 3: Attention Funneling-Tunneling
The pilot, soldier, police officer, firefighter or surgeon -- or a tennis player -- should have a strategy for switching attention from the funnel to the tunnel and vice versa. Obviously, this comes with experience. And system designers can design the user interface of a cockpit or an infusion pump to draw the attention of the human agent with proper annunciation (e.g., an alarm) when attention has to be directed to it. (Poor design of multiple alarms in complex systems such as power plants, cockpits, or even cars is another big problem that can contribute to accidents. This is referred to as the "too much, too soon? or too little, too late?" problem.)
Sports too, and tennis in particular, demand strategic switching of attention between the tunnel and the funnel. The great maestro Roger Federer's visual (and, by extension, cognitive) attention is legendary and has been analyzed in depth. For example, focusing on the ball at the point of contact (exteroceptive attention; Figure 4), together with the nonexteroceptive aspects of attention -- deciding where to place the ball, reading the opponent's intent, and judging the trajectory, speed and spin of the ball -- is essential to winning matches.
Figure 4: Roger Federer's focused visual attention (exteroceptive) as he makes contact with the ball
This posting has just touched the tip of the iceberg with regard to attention. Research on attention has come a long way with the increasing complexity of technology (air traffic control, cockpits, drones (UAVs), nuclear power plants, etc.). Dr. Christopher Wickens, Professor Emeritus at the University of Illinois and the doyen of cognitive/engineering psychology, has enhanced our understanding of attention with his models of attention (multiple resource theory, SEEV), which are applied to the design of human-machine interfaces for complex technologies to exploit the innate strengths of human attention. I will discuss them in a future post.
But until then, let us be cautious about our capacity to pay attention to a number of things at once, and wary of getting sucked into the funnel -- and then the tunnel -- of attention, never to switch back. The training of human agents (drivers, pilots, surgeons, police officers, firefighters, process control operators, among others) and the design of technology compatible with the predilections and capabilities of human attention are matters of great import that deserve GREAT ATTENTION.
Trials and Tribulations in Emergency Medicine: The Unintended Death of Rory Staunton
"Knowledge and error flow from the same mental resources, only success can tell the one from the other." - Ernst Mach (1905)
Can a small nick, cut or scrape kill anyone? The answer, in the affirmative, seems to have been the unfortunate outcome for a strapping 12-year-old lad, Rory Staunton, who cut himself while diving for a basketball in gym class. The simple cut led to an irreversible series of events -- the wound was infected by bacteria (Group A streptococcus), leading to septic shock -- that resulted in his untimely death in a New York City hospital. (You can read the full story, as reported by The New York Times, here.)

Patient safety -- and the prevention of iatrogenic (medical) error -- is of great concern to medical practitioners, insurers and hospital systems -- and, last but not least, to patients and their families. Despite a wealth of research on this topic over the last several years, by scholars ranging from doctors to human factors scientists, mishaps like the one reported above still occur. Fortunately, the more egregious kind, such as wrong-limb amputation or wrong-site surgery, have by and large been mitigated in recent years due to changes in procedures.

Returning to Rory: he displayed signs such as fever, an upset stomach and blotches on the skin. The first doctor to treat him, his pediatrician, seems to have concluded that these were unrelated to the wound -- perhaps caused by a stomach flu? (Although in hindsight it is now possible to infer that they may have been the first signs of septic shock.) When his signs didn't subside, the pediatrician sent him to the emergency department at the NYU Langone Medical Center (LMC). This transfer of care, from one physician and clinic to another, is the beginning of complexity, as knowledge and information about the case -- from Rory (the patient himself), his family, and his pediatrician -- had to be passed on to the hospital. These entities together form a socio-technical system, as it consists of many players, technologies and moving parts.
One of the first things done to an incoming patient at LMC's Emergency Department (ED) is to screen him/her for sepsis. This is done using a checklist, which was also applied to Rory upon admission. At that time, he didn't have the three or more indications required to raise the red flag for sepsis. The initial hypothesis of a stomach flu, first formulated by the pediatrician, was pursued by LMC's emergency physician as well. The ED physician thus decided to alleviate Rory's condition under the assumption that it was caused by the stomach flu. She administered IV fluids, which seemed to improve Rory's condition. This improvement seems to have provided a sufficient, but NOT conclusive, basis for the ED physician to discharge Rory from the hospital: later readings and tests (Fig. 1) done by LMC -- a few hours after admission -- did reveal that Rory was entering septic shock. Neither the ED physician nor the hospital seems to have mentally registered this newly acquired information, which could have either prevented the premature discharge or served as a trigger to contact Rory's family even after the patient was discharged, to begin treatment for sepsis.
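To make the checklist mechanics concrete, here is a minimal sketch of how such a threshold-based screen might work. The criteria and cutoffs below are illustrative, SIRS-style examples -- not LMC's actual protocol -- and the patient values are hypothetical:

```python
# Illustrative sketch of a threshold-based sepsis screen.
# SIRS-style criteria with illustrative cutoffs -- NOT any
# hospital's actual protocol; patient values are hypothetical.

def sepsis_indications(vitals):
    """Return the list of screening criteria the patient meets."""
    indications = []
    if vitals["temp_c"] > 38.3 or vitals["temp_c"] < 36.0:
        indications.append("abnormal temperature")
    if vitals["heart_rate"] > 90:
        indications.append("tachycardia")
    if vitals["resp_rate"] > 20:
        indications.append("tachypnea")
    if vitals["wbc_per_ul"] > 12_000 or vitals["wbc_per_ul"] < 4_000:
        indications.append("abnormal white cell count")
    return indications

def raise_red_flag(vitals, threshold=3):
    """All-or-nothing rule: red-flag only at 3+ indications."""
    return len(sepsis_indications(vitals)) >= threshold

# A hypothetical patient with two indications passes the screen,
# even though sepsis may already be under way.
patient_at_admission = {"temp_c": 39.0, "heart_rate": 110,
                        "resp_rate": 18, "wbc_per_ul": 9_000}
print(sepsis_indications(patient_at_admission))  # 2 of 4 criteria
print(raise_red_flag(patient_at_admission))      # False -> no red flag
```

The weakness is visible in the last two lines: an all-or-nothing threshold happily passes a patient who is trending toward sepsis but sits just below the cutoff.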
Rory's physicians seem to have deployed an "anchoring heuristic," as they held firm to their first hypothesis of stomach flu and never saw his signs and symptoms in a different light. The anchoring heuristic is not unusual in human cognition: one mistakenly anchors to the first hypothesis and doesn't consider alternatives, resulting in a decision bias, or first-hypothesis error. Once Rory was incorrectly discharged, it turned out to be a point of no return. He went into septic shock at home (fever, nausea, pain, etc.). He was rushed back to LMC's emergency department, and despite the best efforts of the doctors he could not be saved.

Ex post facto, of course, with our "hindsight bias," it is easy to blame the emergency physician for prematurely discharging the patient and to attribute it to negligence or at-risk, reckless behavior. But one has no idea of the circumstances (from technical to financial pressures) that may have led to that incorrect decision. And it is highly likely that this was an unintentional error on the part of the physician, who at the time had no idea of what might occur thereafter.

In this posting, I will simplify this somewhat complex systems issue and highlight just two major theories that may partly explain the failure of the system in Rory's case:
1) Swiss cheese model
2) Blunt-end/sharp-end model
Despite the safety barriers we humans devise in complex and safety-critical systems (aviation, nuclear power, medicine, etc.), somehow an adverse event worms its way through the "holes" in these barriers. This was iconically explained by the safety expert James Reason with his Swiss Cheese Model (Fig. 2).
The latent failures are "invisible" (hidden in the system) until the stresses and strains in the system align the proverbial holes in the Swiss cheese slices and expose them. Oftentimes, after a tragic accident has occurred, the spotlight is put on the human agent -- whether a physician, pilot or operator -- who is on the front lines making the final call, decision or intervention. This is referred to as the sharp end of the system. Little attention is paid to the "back office," or blunt end, of the system, where policies, procedures, training, and financial/throughput pressures exist and may influence how the sharp end performs. This was best captured by Woods & Cook in their blunt-end/sharp-end model (Fig. 3).
When one analyzes the blunt end, it is possible that any one (or more) of the following could have contributed to the physician's unintentional errors:
Are the physicians trained to avoid diagnostic biases (e.g., anchoring heuristic)?
Insufficient time due to "throughput" pressures
Patient turnover due to financial pressures
Capacity of the ER (does the ER have the resources -- room, medical staff, etc. -- to permit a wait-watch-observe protocol for a patient whose recovery status is unclear, or is there pressure to make room for the next incoming patient due to patient volume?)
Can EMR (Electronic Medical Records) or other technology aid physicians in diagnosis, and not only flag a premature discharge but prevent it? (A sketch of such a rule follows this list.)
Can there be policies and technologies that can keep the patient's caregiver in the loop -- i.e., make the process transparent with an open dialogue?
That way, even if the professionals missed something flagged by the EMR, the patient or his caregivers would be notified electronically (smartphone, computer, etc.) and could in turn resume the dialogue with the professional healers to clarify it or seek further assistance.
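As a thought experiment on the EMR point above, here is a minimal sketch of what such a discharge-gating rule might look like. Everything here -- the names, the threshold, the notification channel -- is a hypothetical assumption, intended only to show that re-running the screen on each new result, and gating discharge on it, is a small, tractable piece of logic:

```python
# Hypothetical sketch: an EMR rule that re-runs the sepsis screen on
# every new result and gates discharge on it. Names, threshold and
# notification channel are all illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    name: str
    screen_history: list = field(default_factory=list)
    discharged: bool = False

def on_new_results(record, indication_count, notify):
    """Called by the EMR each time new labs/vitals are filed."""
    record.screen_history.append(indication_count)
    if indication_count >= 3:
        if not record.discharged:
            notify(f"{record.name}: sepsis criteria met -- hold discharge")
        else:
            # The patient has already left: close the loop anyway.
            notify(f"{record.name}: post-discharge sepsis flag -- "
                   "contact patient/caregivers to return for treatment")

record = PatientRecord("R.S.")
on_new_results(record, 2, print)  # admission screen: below threshold
record.discharged = True          # premature discharge
on_new_results(record, 4, print)  # late labs arrive: alert still fires
```

The design point is the last line: the alert fires on the arrival of new information, not on where the patient happens to be, which keeps both the clinicians and the family in the loop after discharge.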
There are also things that can be done outside the system. For instance, how about educating the public that a freshly incurred wound should be washed under running water before putting on a band-aid? Washing the wound is the first line of defense against infection and possible sepsis down the road.

Blaming a physician at the "sharp end" of the system may be emotionally satisfying. But the physician's poor judgment or flawed decision making at the sharp end is more than likely to have been the product of a fragile socio-technical system. What is needed is a more reasoned and seasoned approach at the socio-technical systems level, with its many layers, to prevent the next tragic error. Ernst Mach observed, "Knowledge and error flow from the same mental resources, only success can tell the one from the other." A well designed healthcare socio-technical system has the potential to favor knowledge over error, and to promote recovery rather than adverse outcomes.
Once again the inability to devise, legislate and put in place a policy that will prevent senseless shootings has cost 12 lives in Colorado. (The shooter, James Holmes, 24, was able to legally purchase his weapons in gun stores and many rounds of ammunition online.)
The objective of this post is not to debate the history or the constitutionality of the Second Amendment, which can be read about in this fine piece published in The New Yorker, but to study the role of technologies that have the potential to kill in civil society.
Socio-technical Systems
A civil society is a socio-technical system in which society (a group of citizens) must interact with both simple and complex technologies, ranging from electricity to automobiles. These technologies can kill if citizens are reckless, negligent -- or malevolent in their intent. To prevent this, the socio-technical system (STS) at large, with its different strata (politicians, policy makers, regulators, business owners, engineers, end-users, among others) and interacting technological components, has to plan, coordinate, engineer, implement and execute a robust and reliable system that will prevent injury and death.
One good example of this STS -- a quasi-ecosystem, if you will -- is the motor vehicle. Consider two organizations in the motor vehicle STS: the Department of Motor Vehicles (DMV) and the National Highway Traffic Safety Administration (NHTSA). The DMV is charged with ensuring drivers have the necessary knowledge and skills to safely operate an automobile. The NHTSA is charged with ensuring that vehicles are designed to meet minimum standards for safety. Despite these measures there were 32,885 motor vehicle deaths in the US in 2010. Of course, one should take into consideration that the primary purpose of a motor vehicle is to provide a means of transport, not to kill. Needless to say, the kind of socio-technical system seen in the case of motor vehicles is lacking for guns.
Left-of-Fire
In counter-insurgency warfare, a new doctrine called "left-of-boom" was developed, based on social science and network theory, to put an end to roadside bombs (a.k.a. improvised explosive devices, IEDs) set off by insurgents. If one were to visualize bomb-making as a supply chain in an insurgent socio-technical system, moving from left to right, one finds a funder for the operation, a technical planner, a material purchaser, a supplier of components, an assembler, an intelligence gatherer, a bomb planter, and a trigger puller -- all occurring to the left of the boom (the bomb going off) on the road. On the right side of the boom you have the response: paramedics, forensics, law enforcement, and legal prosecution, among others. By going to the left of the boom and disrupting the supply chain, counter-insurgents (the allied military) in Iraq reduced their roadside fatalities caused by IEDs.
In the case of motor vehicles we have a fairly robust STS -- to the "left of crash" -- that has reduced road fatalities over the years, though it is still not good enough. In the case of guns, the STS that exists today, to the "left of firing," is woefully inadequate. Beyond legal controls, we lack technologies that specifically address the human factor (the "end-user") at the point of user interaction (the gun-human interface). For instance, we do not know how to lock out a weapon if the user at that moment happens to be deranged, inebriated, or experiencing a moment of rage.
Until we have a robust STS for guns, with policies, laws and technologies that provide the necessary checks and controls, gun violence, unfortunately, is here to stay. Or it could be prevented by design, by taking a socio-technical systems perspective founded in technology, human factors psychology, and systems engineering. And if one were to express this in the language of the Second Amendment, it would go like this:
A well regulated socio-technical system being necessary to the security of all citizens in a free state, the right of the people not to be grievously injured by someone else's inability to safely bear arms shall not be infringed.
Bainbridge (1987), in "Ironies of Automation," observed that automatic equipment seems to function best when the workload is light and the task routine; when the task requires assistance because automation is incapable of handling a novel situation, causing a spike in the operator's workload, that is when the automatic equipment is of least assistance. This is the "irony" of automation. This irony seems to have some relevance to the crash of Air France 447, as reported by IEEE Spectrum. In short, the pilots had no idea why the autopilot had suddenly disengaged at cruising altitude -- a surprise (!) -- which resulted in inappropriate pilot inputs. (The pilots were unaware that all three airspeed sensors (pitot tubes) were defective, giving incorrect inputs to the flight computers due to the formation of ice crystals; as the autopilot didn't have valid airspeeds to work with, it automatically disengaged.)
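To see why an autopilot "gives up" in such a case, consider a minimal sketch of redundancy logic. This is emphatically not Airbus's actual implementation; the voting scheme, tolerance and names are illustrative assumptions. With triple-redundant sensors a single bad channel can be outvoted, but when no two channels agree there is no trustworthy airspeed left, and the automation hands control back to the pilot:

```python
# Illustrative sketch (not Airbus logic): triple-redundant airspeed
# voting. One bad channel is outvoted; when no two channels agree,
# there is no trustworthy airspeed and the autopilot disengages --
# handing a surprised pilot a degraded aircraft.

from itertools import combinations

TOLERANCE_KTS = 10  # illustrative agreement tolerance

def voted_airspeed(readings):
    """Return the mean of an agreeing pair of channels, or None."""
    agreeing = [(a, b) for a, b in combinations(readings, 2)
                if abs(a - b) <= TOLERANCE_KTS]
    if not agreeing:
        return None  # no quorum: the reading is unusable
    a, b = agreeing[0]
    return (a + b) / 2

def autopilot_step(readings):
    airspeed = voted_airspeed(readings)
    if airspeed is None:
        return "AUTOPILOT DISENGAGED: pilot has control"
    return f"holding cruise at {airspeed:.0f} kts"

print(autopilot_step([272, 274, 271]))  # normal: all channels agree
print(autopilot_step([272, 274, 90]))   # one iced probe is outvoted
print(autopilot_step([90, 145, 210]))   # all disagree -> disengage
```

Note what the last line implies for the operator: the handover happens at precisely the moment the system itself no longer knows the airspeed, which is Bainbridge's irony in miniature.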
The biggest irony of automation, after all these years of human factors research and design, should really be viewed as a "non-surprise" for the following reasons:
Automation is not fail-proof, and it can produce dangerous consequences when the human operator is suddenly put in charge of an [automation] failure, thrusting him/her into a situation where the stakes are high and time is short.
A sudden failure of automation in a highly complex system, whose inner workings are opaque to the operator, may prove beyond the cognitive means of a highly stressed (panicky) operator to troubleshoot and recover from in time.
The above (#2) happens when a pilot is suddenly made to shift roles from a passive monitor ["out of the loop"] to an active operator ["in the loop"] and is forced to grapple with the situation and grasp what is going on by rapidly developing a veridical mental model of the situation. Furthermore, this ability could be impaired by danger- or stress-induced impoverishment of the operator's cognitive control (rational thinking), resulting in disorganization of thought and/or inappropriate responses. (The latter topic forms the intellectual underpinnings of "High Velocity Human Factors.")

Years of experience have shown that automation will invariably abdicate its responsibility when its performance envelope has been exceeded, and bewilder the operator -- which should come as no surprise to the designers. So I will refer to it as a Non-Surprise. Thus it behooves designers to provide "means" -- ones that are not mentally taxing, e.g., that do not require cognitive transformations and inferential reasoning -- by which a highly stressed operator can comprehend and take control of a non-normal situation. But what are the "means" to this end? I will reserve that for another post.

Moin Rahman
Founder/Principal Scientist
High Velocity Human Factors "HVHF" Sciences
http://hvhfsciences.com/
HVHF on Facebook
http://www.linkedin.com/company/hvhf-sciences-llc