Turbine Pilot

When Automation Goes Bad

August 1, 2000

How FMS programming errors cause mistakes, small and large

In late December 1995, American Airlines Flight 965 crashed in the Andes Mountains, 38 miles north of Cali, Colombia. The ensuing accident investigation revealed that the crew programmed the wrong navigation fix into the aircraft's flight management system (FMS), causing the Boeing 757 to turn away from its cleared route. While this error went undetected, the pilots were also expediting their descent in response to a last-minute runway change issued by the controller. The Boeing's speed brakes had been extended, and the crew was readying for an approach entirely different from the one they had expected. In the midst of this demanding and somewhat confusing scenario, they finally realized that the aircraft was turning the wrong way and began to correct back on course. But they never stopped the aircraft's high rate of descent until the ground proximity warning system (GPWS) sounded. Despite a subsequent attempt to climb, the aircraft crashed near the top of a mountain peak. Only four of the 164 persons on board the flight survived.

The Flight 965 accident was not the first one ever caused in part by an FMS programming error, nor will it likely be the last. A search of NASA's Aviation Safety Reporting System (ASRS) database reveals many hundreds of FMS operational errors that have been reported in both the corporate and airline flying communities. Some resulted in only minor embarrassment to the pilots involved. Others might have progressed to an accident on the scale of the Cali disaster but for the timely intervention of a controller or crewmember. And, as anyone who regularly operates an FMS-equipped aircraft knows, these reported incidents are just a small fraction of the errors that occur on a regular basis. All of which raises the question: How can they be prevented, or their effects at least mitigated?

A typical FMS works by integrating a number of important aircraft navigation and performance functions under one roof. It "knows," for instance, where the aircraft is at all times. In larger aircraft with inertial reference units (IRUs), the crew first programs in a starting position. The FMS then automatically tunes nearby VOR and DME signals, using them to continuously refine the IRU positions as the aircraft moves from its starting point. This is referred to as radio updating. Some FMS installations also use GPS signals to further refine this position updating. When radio or GPS updating is not available, the IRUs will continue to provide inertial-based position information. Over time, however, the IRU positions will drift, meaning that position accuracy will slowly degrade. Even so, IRU position information alone is still adequate for many navigation situations. For instance, triple IRU-equipped aircraft without GPS commonly cross the North Atlantic while out of range of radio updating for several hours at a time. FMSs in lighter aircraft often do not have IRUs; position is then derived strictly from available navigation signals.
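For readers who think in code, the updating logic described above can be caricatured in a few lines. This is a deliberately simplified sketch: the drift rate, the blending weight, and the one-dimensional position are illustrative assumptions. A real FMS uses certified sensor models and Kalman filtering, not a fixed weighted average.

```python
# Toy illustration of FMS position updating: inertial positions drift with
# time since alignment, and radio (or GPS) fixes pull the estimate back.
# All numbers here are assumed for illustration, not real FMS parameters.

def iru_position(true_pos, drift_rate_nm_per_hr, hours):
    """Inertial position slowly degrades (drifts) after alignment."""
    return true_pos + drift_rate_nm_per_hr * hours  # 1-D stand-in for lat/lon

def radio_update(iru_pos, radio_pos, weight=0.9):
    """Blend the IRU estimate toward a VOR/DME or GPS fix.
    The weight is an assumed tuning constant for this sketch."""
    return (1 - weight) * iru_pos + weight * radio_pos

true_pos = 100.0                       # along-track position, nm
iru = iru_position(true_pos, drift_rate_nm_per_hr=2.0, hours=3.0)
print(round(iru - true_pos, 1))        # 6.0 nm of drift after 3 hours

# With radio updating available, the error collapses toward the fix:
updated = radio_update(iru, radio_pos=true_pos)
print(round(updated - true_pos, 1))    # 0.6 nm residual error
```

This is why triple-IRU aircraft can safely cross the North Atlantic without updating: a drift of a few miles per hour is tolerable in oceanic airspace, but the same drift demands radio or GPS refinement before a terminal-area descent.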

An FMS also contains both a navigation database and an aircraft and engine performance database. The navigation database lets the crew select an airport or navaid almost anywhere in the world (limited only by the storage capacity of the database and the system's certified area of operation) and navigate directly to it along a great circle route. The performance database allows the FMS to seamlessly manage the aircraft's autopilot, flight director, and autothrottle systems (if installed). It enables the FMS to honor aircraft performance limits such as maximum speeds or altitudes during automatic flight. At the same time, it can optimize the flight plan for pilot-selected criteria such as best fuel economy or range.
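The "direct to" great-circle navigation mentioned above comes down to standard spherical-earth formulas. The sketch below uses the haversine distance and initial-course formulas; the coordinates in the example are approximate and chosen only for illustration.

```python
# Sketch of the great-circle math behind an FMS "direct-to"
# (spherical-earth approximation; a real FMS uses WGS-84 geodesics).
from math import radians, degrees, sin, cos, atan2, sqrt, asin

def great_circle(lat1, lon1, lat2, lon2):
    """Return (distance in nm, initial true course in degrees)."""
    p1, p2 = radians(lat1), radians(lat2)
    dlon = radians(lon2 - lon1)
    # Haversine formula for along-route distance
    a = sin((p2 - p1) / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    dist_nm = 2 * asin(sqrt(a)) * 3440.065  # mean earth radius in nm
    # Initial great-circle course, normalized to 0-360 degrees
    y = sin(dlon) * cos(p2)
    x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlon)
    course = (degrees(atan2(y, x)) + 360) % 360
    return dist_nm, course

# Roughly San Francisco to Los Angeles (assumed, approximate coordinates):
dist, crs = great_circle(37.62, -122.38, 33.94, -118.41)
print(round(dist), round(crs))  # on the order of 294 nm, course near 138
```

The navigation database supplies the waypoint coordinates; the FMS then flies the continuously recomputed course, which is why a long "direct" leg shows a slowly changing heading rather than a constant one.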

But it is exactly the varied, sometimes subtle ways that the FMS influences aircraft control, and the dynamic nature of flying itself, that create a paradox. Automation intended to make a pilot's job easier sometimes makes it tougher instead. As the Cali accident proves, even pilots highly experienced with FMS use can make programming errors in the heat of battle. So while a pilot's first goal should always be to avoid mistakes in the first place, he or she also needs tools to help find and neutralize errors that still manage to slip through the cracks.

FMS-related errors tend to occur for predictable reasons. They include:

  • Assuming the FMS has been programmed correctly, when in fact it hasn't.
  • Selecting the wrong level of automation for the task at hand.
  • Complacency—letting the FMS do the "thinking," causing the pilots to fall behind the airplane.
  • Pilot misunderstanding of what mode the FMS is in, or how a particular mode works.
  • Programming complex FMS commands at an inappropriate time.
  • Making programming changes in the FMS without the knowledge or concurrence of the other pilot.

The ASRS files are filled with real-world examples of these and other common FMS traps. Consider the simple programming error that almost resulted in a controlled-flight-into-terrain accident for the crew of an airline heavy jet departing from Hong Kong International.

"At the time the controller asked if we were still on the SID. I said, ‘Yes, we are right on it.’ The controller then said, ‘You are right of course and approaching high terrain.’ At this point the captain took control of the aircraft and immediately started a climbing left turn. At the same time he commanded the engineer to push the throttles up to takeoff power. At about this time, we got a GPWS warning. Shortly thereafter the controller said we were clear of the terrain...."

The culprits? A fatigued crew, a last-minute runway change, and an incorrect navaid programmed into the FMS by the first officer and not caught by the captain, resulting in erroneous DME information.

The pilot of a corporate jet wrote these comments after making an FMS programming error that resulted in a missed altitude restriction during descent. Fortunately there was no conflicting traffic, but the lesson about complacency and overreliance on automation was, one hopes, driven home.

"I entered [the] waypoint in my haste, without double-checking the actual name of the entered waypoint on the vertical nav page.... This profile caused me to rely on a bogus descent rate to make the crossing restriction. In the future, both pilots will be cross-checking the old-fashioned way...and not rely[ing] on the vertical nav profile alone."

A Cessna Citation II pilot discovered when not to program an FMS in the following incident.

"Event started while taxiing aircraft from terminal to active runway (100-percent snow-covered area, ramp in approximately one mile visibility with snow and blowing snow). Since we had a single-pilot waiver in the Citation II, I was the only crew on this flight. As a single-pilot operator, work load can be quite high if not properly managed and, more importantly, prioritized! I began taxi while programming an FMS and a GPS for my next destination, along with other checklist items. I looked down to enter info...."

Attempting to type and taxi at the same time in such conditions is obviously a bad idea. The Citation departed the taxiway, fortunately causing only slight damage to a taxiway light. The pilot concluded his report with this observation:

"Moral of story: Do not taxi aircraft unless you can devote 100-percent full attention to that task, especially during inclement weather. Another thought—do not rush anything while in flight or on ground. [An] extra few minutes is very inexpensive compared to damage to aircraft, single pilot or crew."

An airline first officer was the pilot flying (PF) in the following cautionary tale. This report demonstrates how confused a situation can become when a monitoring pilot (MP) makes a programming change to the FMS without the knowledge or concurrence of the PF.

"Inbound to SFO, assigned 11,000 feet, on the FMS Bridge Visual Runway 28R.... I was reviewing the arrival again and when I looked up, we were passing through 10,000 feet. I asked the captain if we had been cleared out of 11,000 feet and he said he thought we were cleared for the approach. I took the airplane, which was on autopilot, and started a climb toward 11,000 feet. At that time ATC called and asked our assigned altitude...."

These incidents illustrate how easily FMS mistakes can occur. So how can pilots keep such errors from progressing to an accident, or prevent them altogether? The recurring themes found in FMS errors have led some users to develop defensive flying strategies, which, if followed, can head off most errors.

  • Set clearly defined crew duties regarding how the FMS will be operated. For instance, when the PF is hand-flying, the MP makes all programming inputs. When operating on autopilot, the PF does the programming. In either case, accuracy should never be sacrificed for the sake of programming speed.
  • Ensure that no pilot makes a programming change without the knowledge of the other crewmember. If one pilot has been "out of the loop" for some reason, such as being on another frequency to check weather, the PF should fully brief any changes that have taken place when the other pilot's attention is again focused on flying.
  • Be sure that one pilot is flying the aircraft at all times, regardless of the level of automation used. An FMS is not a pilot.
  • Always cross-check that a pilot input made to the FMS results in the expected mode of operation. Religiously adhere to an FMS policy of "trust but verify."
  • Apply the appropriate level of automation to a situation. If complex FMS programming is required during a busy or critical time, it is probably better to revert to a simpler autopilot or hand-flown mode until the changes can be made in a less hurried fashion.
  • Back up FMS calculations with old-fashioned mental calculations. The FMS is there to assist the crew, not to take command of the flight.
  • Make sure that both pilots verify that the correct altitude has been set in the altitude alert window when climb or descent clearances are received.
  • Occasionally hand-fly to maintain basic stick-and-rudder skills.
  • Have a raw-data backup plan during critical times, such as when the aircraft is close to the ground. For instance, during a descent in mountainous areas, raw data from navaids should be monitored to ensure that the FMS-computed position is reliable.

To these might be added the admonition for pilots to fully understand all the nuances of their aircraft's FMS and autoflight operation, and to constantly strive for a high degree of checklist discipline and adherence to standard operating procedures. Considering the potential consequences, it is dangerously shortsighted to do anything less.


Links to additional information about using FMS may be found on AOPA Online ( www.aopa.org/pilot/links/links0008.shtml). Vincent Czaplyski holds ATP and CFI certificates. He flies as a Boeing 757 captain for a major U.S. airline.