This "irony" seems to have some relevance to the crash of Air France Flight 447, as reported by IEEE Spectrum. In short, the pilots had no idea why the autopilot had suddenly disengaged at cruising altitude -- a surprise (!) -- which resulted in inappropriate pilot inputs. (The pilots were unaware that all three airspeed sensors (pitot tubes) had been disabled by the formation of ice crystals and were feeding incorrect data to the flight computers; with no valid airspeeds to work with, the autopilot automatically disengaged.)
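The failure mode described above -- redundant sensors that stop agreeing, causing the automation to hand control back to the human -- can be illustrated with a minimal sketch. This is a hypothetical voting scheme, not the actual Airbus flight-control logic; the function names and the disagreement threshold are assumptions for illustration only.

```python
# Hypothetical sketch of triple-redundant sensor voting (NOT actual
# Airbus logic). If too few sensors report valid data, or the valid
# readings disagree beyond a tolerance, no consensus exists and the
# autopilot disengages rather than act on unreliable inputs.

TOLERANCE_KTS = 20.0  # illustrative disagreement threshold, in knots

def airspeed_consensus(readings):
    """Return a consensus airspeed, or None if no reliable consensus exists."""
    valid = [r for r in readings if r is not None]
    if len(valid) < 2:
        return None  # not enough data to cross-check
    if max(valid) - min(valid) > TOLERANCE_KTS:
        return None  # sensors disagree; no single value is trustworthy
    return sum(valid) / len(valid)

def autopilot_can_engage(readings):
    """Autopilot stays engaged only while a consensus airspeed exists."""
    return airspeed_consensus(readings) is not None

# Normal cruise: all three pitot tubes roughly agree.
print(autopilot_can_engage([272.0, 271.5, 272.3]))  # True
# Icing: the tubes feed mutually inconsistent values -> disengage.
print(autopilot_can_engage([120.0, 65.0, 180.0]))   # False
```

Note what the sketch does not do: it gives the crew no indication of *why* consensus was lost, which is precisely the opacity problem discussed below.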
The biggest irony of automation, after all these years of human factors research and design, should really be viewed as a "non-surprise" for the following reasons:
- Automation is not fail-proof, and it can have dangerous consequences when the human operator is suddenly put in charge after an [automation] failure, thrust into a situation where the stakes are high and the time available is short.
- A sudden failure of automation in a highly complex system, whose inner workings are opaque to the operator, may prove beyond the cognitive means of a highly stressed (panicky) operator, leaving him/her unable to troubleshoot the situation and recover in time.
Years of experience have shown that automation will invariably abdicate its responsibility when its performance envelope has been exceeded, bewildering the operator -- which should come as no surprise to the designers. So I will refer to it as a non-surprise. Thus it behooves designers to provide "means" that are not mentally taxing -- i.e., that do not require cognitive transformations and inferential reasoning -- by which a highly stressed operator can comprehend and take control of a non-normal situation. But what are the "means" to this end? I will reserve that for another post.
High Velocity Human Factors "HVHF" Sciences