
Proficient Pilot: Automation dependency

Barry Schiff

Most of us have heard about automation dependency. It is a relatively new aviation term that describes what can happen to pilots who become excessively dependent on automation, particularly when used for aircraft control and navigation. There might be a technical definition of automation dependency somewhere, but I couldn’t find one.

Because it is a complex subject, I won’t attempt to define it here except to say that it is a dependence on automatic systems that can lead a pilot to complacently accept what an aircraft is doing without monitoring and confirming that it is doing what he expects or wants it to do. Further, habitual and excessively reliant use of automation can result in the loss of situational awareness, failure to notice evolving difficulties, and not taking appropriate action to resolve problems. It also can result in the erosion of a pilot’s basic (manual) flying skills.

Consider a hypothetical example of automation dependency that some might regard as extreme. Assume that an airline crew is executing a visual, obstruction-free approach to a lengthy runway at a major airport in a widebody jetliner on a severely clear day. Assume also that this hypothetical crew is accustomed to executing autopilot-coupled approaches with the assistance of autothrottles that automatically maintain the desired approach speed. In this example, the instrument landing system is out of service, and the approach must be flown manually (although vertical guidance is provided by PAPI, a precision approach path indicator). The crew fails to realize during the approach that the autothrottles are not engaged and that indicated airspeed is eroding dangerously while on short final approach. Before long the stick shaker provides loud and tactile notice that a stall is imminent. But it is too late to recover—the aircraft is too low and too slow.

There is nothing exotic or difficult to understand about autothrottles. Their use is relatively simple. When engaged in Airspeed mode, they are closely analogous to cruise control in an automobile. In each case, the desired speed is selected, the automatic system is engaged, and power then adjusts automatically to maintain that speed. If the driver of an automobile were to notice a decrease in speed, he would know that cruise control likely was not engaged and would simply step on the gas to maintain speed. The same is true in a jetliner. If the autothrottles do not maintain the selected speed (for whatever reason), the pilot easily compensates by pushing forward on the manual throttles (thrust levers) and recapturing the desired airspeed. This assumes, of course, that he is monitoring airspeed in the first place.
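For readers who like to see the cruise-control analogy made concrete, here is a minimal sketch of a speed-hold feedback loop. It is a toy illustration only, not real avionics logic: the model, gains, and numbers (a 137-knot target, a drag that bleeds speed toward 100 knots) are assumptions invented for this example. The point it demonstrates is the one in the paragraph above: with the loop engaged, thrust automatically chases the selected speed; with it quietly disengaged, the throttles stay where they were left and airspeed erodes.

```python
# Toy illustration only -- not real avionics. The gains, speeds, and the
# simplified point-mass model are assumptions made up for this sketch.

def simulate(engaged, target_kt=137.0, start_kt=150.0, steps=600, dt=0.1):
    """Return indicated airspeed (knots) after `steps` time slices.

    Simplified model: drag bleeds speed toward a 100-knot floor; thrust
    (commanded by the speed-hold loop when engaged) restores it.
    """
    speed = start_kt
    throttle = 0.0                       # 0.0 = idle, 1.0 = full thrust
    for _ in range(steps):
        if engaged:
            # Proportional speed hold: throttle follows the speed error,
            # clamped to the physical 0..1 range.
            error = target_kt - speed
            throttle = min(max(0.5 * error, 0.0), 1.0)
        # else: the throttles stay where they were left (idle here), and
        # drag quietly erodes airspeed -- the trap in the scenario above.
        drag = 0.05 * (speed - 100.0)    # decays toward the 100-kt floor
        speed += (10.0 * throttle - drag) * dt
    return speed

held = simulate(engaged=True)      # settles near the selected 137 knots
decayed = simulate(engaged=False)  # bleeds off well below approach speed
```

In both the automobile and the jetliner, the remedy is the same and requires no automation at all: notice the speed decay and push the power up by hand.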

Is it possible for pilots to be so dependent on automation that they fail to observe airspeed during an approach, and instead trust completely that the autothrottles are doing the job and maintaining the selected airspeed? Many experts believe so, especially when such a hypothetical crew might have been taken out of its comfort zone by not being able to execute an autopilot-coupled approach. It also is possible that someone in the cockpit did notice an evolving danger on short final approach but failed to provide warning in an assertive and timely manner, a topic for next month.

Automation has become a boon to aviation safety and efficiency, but its use presumes that pilots fulfill their primary obligation—and that is to fly the airplane safely (with or without automated assistance). In other words, it is still the pilot’s responsibility to aviate. A pilot must not relegate this duty to automation. He must monitor and keep in check the four most essential elements of safe flight: airspeed, altitude, heading, and attitude. If an automated system is used to change or maintain any of these elements, it is the pilot’s responsibility to ensure that the results are as desired. A pilot may assign workload to automation, but he must never assume the outcome. In other words, “trust but verify,” a principle that serves pilots as well as it did a president.

Remember American Airlines Flight 965 in 1995? The Boeing 757 was descending toward Cali, Colombia, through a valley defined on both sides by very tall mountains. An incorrect waypoint was entered into the flight-management system and activated without first being verified as correct. The airplane dutifully responded to the errant command and turned toward the mountains defining the left side of the valley while continuing to descend. This was the first accident involving a U.S.-owned Boeing 757 and claimed 159 lives.

There have been a number of accidents in which automation dependency has played a role, and there likely will be many more, especially in general aviation as automation becomes more pervasive.

As they say in showbiz, “Any resemblance to actual events or persons is purely coincidental.”
