
Air France 447 and “The Automation Paradox”

A deadly combination of pilot confusion, “warning system ergonomics” design, and inadequate pilot training was responsible for the crash of Air France Flight 447 on 1 June 2009. This is the conclusion reached by the BEA (Bureau d’Enquêtes et d’Analyses pour la sécurité de l’aviation civile), the French authority responsible for carrying out safety investigations into accidents and serious incidents in civil aviation, in its final report on the crash. BEA’s 224-page report indicated that the aircraft might have been flown out of danger had the pilots realized the situation they were facing.


And what does this have to do with the automation paradox? ….

Let’s continue with this interesting article, written by Robert N. Charette for IEEE Spectrum’s risk analysis blog, which features daily news, updates, and analysis on computing and IT projects, software and systems failures, successes and innovations, security threats, and more.



The summary of the report (pdf) provides the following account, which begins with the “unleashing event”: the icing over of the three Pitot tubes that provide airspeed data to the Airbus A330-200’s flight computers, which subsequently caused the aircraft’s autopilot to disengage:

The blockage of the Pitot probes by ice crystals in cruise was a phenomenon that was known but misunderstood by the aviation community at the time of the accident. From an operational perspective, the resulting loss of all airspeed information was an identified malfunction. After initial reactions involving basic airmanship skills, it was supposed to be diagnosed by pilots, and managed if necessary by precautionary inputs on the pitch attitude and thrust detailed in the associated procedure.

The occurrence of the failure in the context of flight in cruise completely surprised the crew of flight AF 447. The apparent difficulties in handling the aeroplane in turbulence at high altitude resulted in over-handling in roll and a sharp nose-up input by the PF [pilot flying]. The destabilisation that resulted from the climbing flight path and changes in pitch attitude and vertical speed therefore added to the incorrect airspeed indications and ECAM [Electronic Centralized Aircraft Monitoring] messages that did not help any diagnosis. The crew, whose work was becoming disrupted, likely never realised they were facing a «simple» loss of all three airspeed sources.

In the first minute after the autopilot disconnection, the failure of the attempt to understand the situation and the disruption of crew cooperation had a multiplying effect, inducing total loss of cognitive control of the situation. The behavioural assumptions underlying the classification of a loss of airspeed information as «major» were not validated in the context of this accident. Confirmation of this classification therefore requires additional work in terms of operational feedback in order to modify, where necessary, crew training, the ergonomics of the information made available to them, as well as the design of procedures.

The aeroplane went into a sustained stall, signalled by the stall warning and strong buffet [the warning at one point sounded continuously for 54 seconds but apparently was ignored]. Despite these persistent symptoms, the crew never understood they were in a stall situation and therefore never undertook any recovery manoeuvres. The combination of the warning system ergonomics, the conditions under which pilots are trained and exposed to stalls during their professional and recurrent training, did not result in reasonably reliable expected behaviour patterns.
In short, as BEA head Jean-Paul Troadec put it at the crash report news conference last Thursday at Le Bourget Airport in Paris:

“It seems that the pilots did not understand the situation and they were not aware that they had stalled.”

However, Troadec also made it very clear that BEA was not blaming the pilots alone for the accident:

“If the BEA thought that this accident was only down to the crew, we would not have made recommendations about the systems, the training, etc.”
He went on to say:

“What appears in the crew behavior is that most probably, a different crew should have done the same action. So, we cannot blame this crew. What we can say is that most probably this crew and most crews were not prepared to face such an event.”

In fact, BEA made a total of 25 recommendations (pdf) covering everything from better training of aircrews to changes in display logic to improvements in search and rescue. Training pilots to fly aircraft manually at high altitudes is seen as a major need.
Many of the recommendations also deal with the so-called “automation paradox,” which, as I wrote about for IEEE Spectrum, concerns the situation where “the more reliable the automation, the less the human operator may be able to contribute to that success. Consequently, operators are increasingly left out of the loop, at least until something unexpected happens. Then the operators need to get involved quickly and flawlessly.”
In the Air France Flight 447 case, the crash report stated that the occurrence of the failure in the context of flight in cruise “completely surprised the pilots,” and thus, being “startled,” they were never able to comprehend what difficulty had caused the autopilot to disengage.
Going back to the summary of the crash report (with my highlighting):

“At present, recognition of the stall warning, even when associated with buffet, assumes that the crew assigns a minimum degree of «legitimacy» to the alarm. This in turn assumes sufficient prior experience with stall conditions, at least some cognitive availability and understanding of the situation, as well as knowledge of the aeroplane (and its protection modes) and its flight physics. A review of pilot training did not provide convincing evidence that the associated skills had been correctly developed and maintained.”

“More generally, the dual failure of the expected procedural responses shows the limits of the current safety model. When action by the crew is expected, it is always assumed that they will have the capacity to initially control the flight path and to rapidly diagnose and identify the correct entry in the dictionary of procedures. A crew may encounter an unexpected situation causing a momentary but profound loss of understanding. If, in such cases, the assumed capacity to initially control and then to diagnose is lost, the safety model is in «common failure mode». In this occurrence, the inability to initially control the flight path also made it impossible to understand the situation and find the appropriate solution.”

What the pilots seemed to need was something akin to the “digital parachute” I blogged about last year, being developed by Rockwell Collins, which would take control of the aircraft’s flight management system and return the aircraft to level flight (if possible) when a pilot hits a “panic button” during an emergency.
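To make the “panic button” idea concrete, here is a minimal sketch of what such recovery logic might look like. This is not the actual Rockwell Collins design, whose details are not public in the article; every name, gain, and threshold below is invented purely for illustration of the concept of driving the aircraft back toward a level-flight target.

```python
# Hypothetical "panic button" recovery sketch. All names, targets, and
# thresholds are assumptions for illustration, not a real avionics design.
from dataclasses import dataclass


@dataclass
class AircraftState:
    pitch_deg: float     # nose-up positive
    roll_deg: float      # right-wing-down positive
    airspeed_kt: float   # indicated airspeed in knots


def panic_button_recovery(state: AircraftState) -> dict:
    """Compute control targets that nudge the aircraft toward level flight."""
    TARGET_PITCH_DEG = 2.5    # roughly a cruise attitude (assumed value)
    TARGET_ROLL_DEG = 0.0     # wings level
    MIN_SAFE_SPEED_KT = 250.0  # illustrative low-speed threshold

    return {
        # Simple proportional corrections toward the level-flight targets.
        "pitch_cmd_deg": TARGET_PITCH_DEG - state.pitch_deg,
        "roll_cmd_deg": TARGET_ROLL_DEG - state.roll_deg,
        # If airspeed is low (or reads unreliably low), add thrust.
        "thrust_increase": state.airspeed_kt < MIN_SAFE_SPEED_KT,
    }


# Example: a nose-high, banked, slow aircraft (roughly the AF447 upset regime).
cmds = panic_button_recovery(
    AircraftState(pitch_deg=15.0, roll_deg=-10.0, airspeed_kt=110.0)
)
print(cmds)  # nose-down pitch command, right-roll command, thrust increase
```

The point of the sketch is the design idea, not the numbers: once the pilot delegates, the system pursues a single unambiguous goal (level flight), sidestepping the diagnosis step that the AF447 crew was never able to complete.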
A story in the New York Times states that, “The French news media reported late Wednesday that a judicial panel of experts advising a separate criminal inquiry into the crash had recommended that blame not be placed solely on the pilots, saying that Air France, Airbus and European safety regulators also shared responsibility.”
Supposedly tomorrow, both Air France and Airbus will find out whether they will be charged with manslaughter in regard to their roles leading up to the crash. No word on whether the safety regulators will face legal scrutiny.

