Thursday, July 12, 2012

Automation's Biggest Irony (after all these years): The Non-Surprise

Bainbridge (1987), in "Ironies of Automation," observed that automatic equipment seems to function best when the workload is light and the task routine; when the task requires assistance because the automation is incapable of handling a novel situation -- causing a spike in the operator's workload -- that is precisely when the automatic equipment is of least assistance. This is the 'irony' of automation. 

This "irony" seems to have some relevance to the crash of Air France 447 as reported by IEEE Spectrum. In short, the pilots had no idea why the autopilot had suddenly disengaged at cruising altitude -- a surprise (!) -- which resulted in inappropriate pilot inputs. (The pilots were unaware that all three airspeed sensors (pitot tubes) had been compromised by the formation of ice crystals and were feeding incorrect readings to the flight computers; with no valid airspeeds to work with, the autopilot automatically disengaged.)
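To make the failure mode concrete, here is a minimal sketch (my own illustration, not the actual Airbus flight-control logic, which is far more elaborate) of how an autopilot that cross-checks redundant airspeed sensors can disengage the moment they disagree; the tolerance and readings below are invented for illustration:

```python
# Illustrative sketch only -- NOT the actual Airbus flight-control logic.
# It shows the general pattern behind the AF447 scenario: when redundant
# airspeed sources disagree beyond a tolerance, the automation declares
# the data invalid and disengages, handing control to a surprised pilot.

DISAGREEMENT_TOLERANCE_KTS = 20.0  # hypothetical threshold, in knots

def airspeed_valid(readings_kts):
    """Return True only if all redundant sensors agree within tolerance."""
    return max(readings_kts) - min(readings_kts) <= DISAGREEMENT_TOLERANCE_KTS

def autopilot_step(readings_kts, autopilot_engaged):
    """One control cycle: disengage if the airspeed data cannot be trusted."""
    if autopilot_engaged and not airspeed_valid(readings_kts):
        autopilot_engaged = False
        print("AUTOPILOT DISENGAGED: unreliable airspeed -- pilot has control")
    return autopilot_engaged

# Normal cruise: three pitot tubes agree, so the autopilot stays engaged.
engaged = autopilot_step([272.0, 271.5, 272.3], autopilot_engaged=True)

# Ice crystals corrupt two readings; the automation abdicates at the
# worst possible moment, with no explanation of *why* it gave up.
engaged = autopilot_step([272.0, 145.0, 90.0], autopilot_engaged=engaged)
```

Note what the sketch makes plain: the handoff is abrupt and unexplained, and it occurs precisely when the data -- and hence the situation -- is at its most confusing.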

The biggest irony of automation, after all these years of human factors research and design, should really be viewed as a "non-surprise" for the following reasons:

  1. Automation is not fail-proof, and it can have dangerous consequences when the human operator is suddenly put in charge upon an [automation] failure, thrust into a situation where the stakes are high and time is short. 
  2. A sudden failure of automation in a highly complex system, whose inner workings are opaque to the operator, may prove beyond the cognitive means of a highly stressed (panicked) operator to troubleshoot and recover from in time. 
The above (#2) happens when a pilot is suddenly made to shift roles from a passive monitor ["out-of-the-loop"] to an active operator ["in-the-loop"] and is forced to grapple with the situation, grasping what is going on by rapidly developing a veridical mental model of it. Furthermore, this ability can be impaired by danger- or stress-induced impoverishment of the operator's cognitive control (rational thinking), resulting in disorganization of thought and/or inappropriate responses. (The latter topic forms the intellectual underpinnings of "High Velocity Human Factors.")

Years of experience have shown that automation will invariably abdicate its responsibility when its performance envelope has been exceeded, bewildering the operator -- which should come as no surprise to the designers. So I will refer to it as a Non-Surprise. Thus it behooves designers to provide "means" by which a highly stressed operator can comprehend and take control of a non-normal situation -- means that are not mentally taxing, i.e., that do not require cognitive transformations or inferential reasoning. But what exactly are the "means" to this end? I will reserve that for another post.  
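By way of a preview only -- a deliberately simplified sketch, not the full answer reserved for that post -- consider a display that states the cause and the immediately required action in plain language, instead of presenting raw data for a stressed operator to transform and reason over. The fault code, wording, and values below are all hypothetical:

```python
# A simplified sketch of one such "means" (hypothetical, for illustration):
# rather than dumping raw fault data on the operator, the system states the
# cause and the immediately required action in plain language, so that no
# cognitive transformation or inferential reasoning is needed.

# Hypothetical fault codes mapped to direct, action-oriented annunciations.
ANNUNCIATIONS = {
    "ADR_DISAGREE": (
        "UNRELIABLE AIRSPEED",                 # what happened
        "PITCH 5 DEG NOSE UP / SET CLIMB THRUST",  # what to do right now
    ),
}

def annunciate(fault_code):
    """Render a fault as cause + immediate action, not as raw data."""
    cause, action = ANNUNCIATIONS.get(
        fault_code, ("UNKNOWN FAILURE", "MAINTAIN CURRENT ATTITUDE AND THRUST"))
    return f"{cause} -- {action}"

print(annunciate("ADR_DISAGREE"))
# UNRELIABLE AIRSPEED -- PITCH 5 DEG NOSE UP / SET CLIMB THRUST
```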

Moin Rahman
Founder/Principal Scientist
High Velocity Human Factors "HVHF" Sciences
http://hvhfsciences.com/
HVHF on Facebook
http://www.linkedin.com/company/hvhf-sciences-llc