August 1, 2006
Bent metal, charred wreckage, and ruined lives point to the many failures in pilot decision making, but the clues to what makes a competent pilot decision maker are far from clear. Competence in aeronautical decision making is quite different from the skills that we learn in becoming pilots. Sure, pilots can describe what makes a good landing or takeoff in objective detail. But how do pilots who have led long, productive flying careers decide their way to the age of gray hair and weathered brow? Is it luck? Or, is there a common thread in the way that these aviators approach the task of decision making?
Past performance is no guarantee of future results. Pilots are tempted to believe that the ability to decide wisely increases with years and hours of flight experience. This may be partially true, but two factors argue against a direct correspondence between hours flown and success in avoiding future calamity. The first is that the aviation environment is full of surprises. New factors are dynamically presented in combinations that have never been seen or envisioned, and practice in past decision scenarios may be of no use in recognizing these novel circumstances. The second factor is a natural tendency to assume that past decisions were good decisions if the outcome was good. If a flight was completed without harm, even though it involved pushing a little hard, the success of the flight is equated with successful decision making. The problem with this thinking is that the link between decisions and results is not direct. Not every act of stupidity is punished. Most of us know pilots who regularly take extraordinary risks and seem to escape unscathed year after year. These pilots validate their choices through the lack of negative consequences. In combination, these two factors ensure that poor aviation decisions are an equal-opportunity hazard. The accidents of experienced pilots may be less frequent, but they also tend to be more severe.
Decision errors can occur from a lack of knowledge, but that does not seem to be the primary reason for bad decisions. Give any pilot a stack of NTSB reports and ask him to identify the chain of errors that led to a particular accident. Our aviation community is a knowledgeable and motivated group as a whole; even the less experienced have both the knowledge and understanding to identify the links in the accident chain without help from the findings section of the report.
A necessary rationalization, even smugness, accompanies the reading of aviation accident reports. If anyone thought he would fall prey to the same errors, he would never set foot in an airplane again. But does anyone think for a moment that the people who have accidents have never read an accident report and felt the same not-to-me immunity? The ability to perceive the environment of flight from an armchair is somehow different from the situation aloft. Something happens in situ that robs us of the knowledge, objectivity, and awareness that were present while sitting under a reading lamp at home.
My interest in the subject of decision making came with a sort of aviation epiphany 11 years ago. At that point, I had been flying as a captain for a major airline for several years and flying my Piper Twin Comanche for family travels for about the same number of years. Although my light-airplane travels began more than two decades prior to this point, the Twin Comanche offered more transportation capabilities than any airplane I had previously owned or rented, which encouraged flying in more demanding conditions, with longer, more complicated trips. During this same period of time, some fellow airline pilots were killed in general aviation accidents. If these accidents had exhibited some special circumstances that separated them from the run-of-the-mill GA accident, they would not have made an impact on my thinking. But these accidents looked like any other general aviation accident — just dumb decisions.
Comparing my light-airplane decision making with my experiences on the airline was enlightening. My light-airplane flying was punctuated by difficult, sometimes gut-wrenching decisions, but my airline decisions were mostly matter of fact.
Airline decisions almost always seek an answer to the question: "What is normal?" Because an airline operates with a set of operating procedures and rules defined in its FAA-approved operations specifications, the question "what is normal?" can be answered reliably for most circumstances. The sheer volume of operations also helps in defining normal. But there are many other institutional factors that shape the way decisions are made. Each flight is dispatched according to a set of criteria that conforms to the company's rules and procedures. The airline's training, culture, and pilots union each plays a role in giving the captain an objective and somewhat detached perspective. The multipilot crew, especially in the era of crew resource management, has a natural way of creating the awareness of opposing views. It is very hard to find a first officer who will knowingly share the ride to the scene of an accident to protect a captain's ego. The association of peers in crew rooms across the system is another way that normal is defined. The accident statistics in organized flight operations bear out the role that organizational influences have on decision making: accident rates decrease as the structure of the organization increases. FAR Part 135 operations enjoy a more favorable accident rate than FAR Part 91 operations as a whole, and FAR Part 121 operations are the safest of all by a significant margin.
By contrast, light-airplane flying decisions seem to seek an answer to the question, "What is possible?" and this inevitably puts the decider at the center of the decision with self-esteem in the equation. Up go the blinders. Detachment from decisions is hard to achieve when the results are personal — as they are in the lightplane cockpit flown with family or friends as passengers. When the airline pilot decides to step into a general aviation setting for fun or business, he leaves behind an invisible cloak of organizational influence. This may explain my observation that airline pilot accidents in general aviation aircraft really do not look a whole lot different than the accident database as a whole.
All of this discussion leads to the question of how the individual general aviation pilot can achieve the benefits of objectivity and detachment that occur in organizational settings. My interest in this subject led me to research in the decision sciences. What I found is that some people are naturally better decision makers than others. The editors of Psychological Investigations of Competence in Decision Making (Cambridge University Press, 2004) suggest, through a collection of research in a variety of fields, that competent decision makers share the ability to stand back from their environment and think about what kind of thinking is required in making decisions. These good decision makers are not only aware of their operational environment, but also they are simultaneously aware of their own mental activity and their own limitations in the decision process. I imagine that the ability of the naturally competent decision maker exists throughout the pilot population in varying degrees. Can this ability be trained into the less gifted? Or, can cultural influences provide structure to pilot decision making?
More than 10 years ago, while merely trying to mimic the conditions of my airline employment, I wrote what I called a "personal operations specifications" for operating my Twin Comanche. The idea of a "personal minimums checklist" was not in vogue at the time. In quiet reflection away from the airport, I recorded what would be considered normal in my lightplane flying. The "ops spec" set a baseline for decision making that included limitations just like my airline manual. It has been revised many times — but never in flight or in preparation for a flight. In the past 10 years, my light-airplane flying has been far more enjoyable. The personal ops spec cuts both ways; it has helped me complete flights that I might have otherwise canceled, and it has helped me scrub flights that I might have considered to the point of distraction. The real benefit to this decision aid is that it helps a pilot stand to the side of a decision and take the "I" out of the process. It adds grace and confidence to decision making by reducing the number of original decisions. The repeated use of this decision aid sets up permanent precepts — a sort of mental furniture that is associated with all flight operations. These associations are automatic so that any weather brief will trigger attention to pertinent factors. Any in-flight situation draws an alert in the same way — automatically. Normal is a place between the boundaries of these mental guideposts where a pilot can function with greater awareness.
There is no idea or gimmick that can remove the uncertainty involved in making decisions in a complex and dynamic aviation environment with the limitations inherent in a light airplane, but the personal ops spec (or personal minimums checklist) helps to leverage mental resources by organizing and simplifying complex factors. It really is not a checklist, however, but a permanent mental fixture that operates in parallel with the self that is involved in thinking out the actions of an actual flight. It is a way to bring knowledge and perspective from the couch to the cockpit.
Chris Burns is a recently retired US Airways Boeing 737 captain. He has owned and flown general aviation aircraft for the past 35 years.
For additional resources on improving your own decision-making plan — whether a personal ops spec or personal minimums checklist — see the AOPA Air Safety Foundation Web site, especially Do the Right Thing — Decision Making for Pilots Safety Advisor.
The short answer is — just about anything that describes normal for the type of flying you do. Aerobatics, banner towing, crop-dusting, pipeline patrol, the $100 hamburger, round-the-world cross-country, or just about any flying pursuit has a place called "normal" where the elements of flight are performed in situations that are routine enough to allow reserve capabilities (aircraft and individual) to deal with the unexpected. Runway lengths for day and night operations; fuel requirements; weather minimums for takeoff, en route, and landing; alternate airport requirements; minimum altitudes; survival equipment; and thunderstorm avoidance and icing are just some of the common areas where personal rules can be developed to guide your flying decisions. Deciding what is normal is the easy part.
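The limit categories above can be written down as simple, concrete numbers. As a purely illustrative sketch (every value and field name below is invented for the example, not a recommendation for any pilot or airplane), a personal ops spec might be recorded as structured data so it can be reviewed, and revised, away from the airport:

```python
# Hypothetical personal ops spec as plain data. All limits are
# invented examples for illustration, not recommended minimums.
PERSONAL_OPS_SPEC = {
    "runway_min_ft": {"day": 2500, "night": 3500},
    "fuel_reserve_min": 60,  # minutes of fuel remaining at landing
    "ceiling_min_ft": {"takeoff": 800, "enroute": 2000, "landing": 600},
    "max_crosswind_kt": 15,
    "icing": "no flight into known or forecast icing",
}

def runway_ok(runway_length_ft: int, is_night: bool) -> bool:
    """Check one limit: is the runway long enough for this operation?"""
    key = "night" if is_night else "day"
    return runway_length_ft >= PERSONAL_OPS_SPEC["runway_min_ft"][key]

# A 3,000-foot runway meets the example day limit but not the night limit.
print(runway_ok(3000, is_night=False))  # -> True
print(runway_ok(3000, is_night=True))   # -> False
```

The point of writing the limits down is the one the author makes: the numbers are fixed in quiet reflection beforehand, never negotiated in flight or during preflight planning.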
Living within those constraints takes practice. The human mind is a treacherous adversary when its objectives are being thwarted by self-imposed rules. Ask any dieter!
Clear-cut determinations can be elusive. Destination forecasts may hint that crosswinds could be a problem upon arrival. Does a flight get canceled on possibilities? Other criteria produce clear signals that conditions are not normal. A reported departure ceiling below your limits, or freezing rain that you can see or feel for yourself, is a clear indication that conditions are not normal. Think of the familiar traffic signal when applying your ops spec or personal minimums. Stop on red, when the proposed flight clearly exceeds your own limits. Proceed cautiously on yellow, when flight-planning indicators hint that a limit may be exceeded. Realize that two or more yellows, that is, two areas of concern like ceilings and thunderstorms, place severe demands on your attention resources. Two or more yellows make a red for me. Of course, green means that all conditions are go for a normal flight.
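The traffic-signal rule is simple enough to state as a few lines of logic. Here is a minimal sketch (the function and concern labels are invented for illustration) of the author's rule that any red stops the flight and that two or more yellows combine into a red:

```python
def signal(concerns):
    """Map flight-planning concerns to a go/no-go color.

    Each concern is a (level, description) pair where level is
    "red" or "yellow". Any red stops the flight; two or more
    yellows make a red; one yellow means proceed cautiously;
    no concerns means all conditions are go.
    """
    reds = [desc for level, desc in concerns if level == "red"]
    yellows = [desc for level, desc in concerns if level == "yellow"]
    if reds:
        return "red"
    if len(yellows) >= 2:  # two or more yellows make a red
        return "red"
    if yellows:
        return "yellow"
    return "green"

# Example: marginal ceilings plus forecast thunderstorms
print(signal([("yellow", "ceilings near limit"),
              ("yellow", "thunderstorms en route")]))  # -> red
```

The value of the rule is not the arithmetic but the precommitment: the combination threshold is decided on the ground, before self-esteem enters the equation.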
You can appraise in-flight conditions the same way. When things are not going as planned, attention is diverted to one or more distractions. Develop an awareness of normal and what the conditions of relaxation and confidence feel like. Form a continuous appraisal of concerns that are siphoning off your attention; these are early indications that cautionary conditions are developing into real concerns. Take early action to return to normal. — CB