
Safety Pilot: Mapping the mind


Beyond voyeurism, the best reason to investigate an accident is to learn how to keep from doing it again. Dozens of brilliant quotes relate to the human ability to screw up and why, in some circles, it’s considered a good idea. This works for business as long as one is right more often than wrong. Some of the greatest successes have occurred after stunning failure. Formula 409, the spray cleaner, worked only after 408 missteps. The light bulb and Post-it notes would never have been discovered without consistent mishap. “If you have made mistakes, even serious ones, there is always another chance for you,” said Canadian actress Mary Pickford. But it’s a safe bet Ms. Pickford did not spend much time as pilot in command.

For those who fly, it’s better to learn vicariously—i.e., let somebody else be the pathfinder. A letter from a reader regarding the August landmark accident (“Safety Pilot Landmark Accidents: Too Close for Comfort,” August 2009 AOPA Pilot) got me thinking about this. A 4,000-hour Twin Comanche pilot, despite datalink weather and assistance from ATC, wound up disassembling the aircraft, himself, and two friends headed to the Carolinas to play golf. The reader asked, “Why was it that a seasoned pilot with 4,000 hours elected to fly into the jaws of a monster? Are there studies by cognitive psychologists and medical specialists that shed light onto the mystery of what it was that might have influenced this particular pilot in command? And most important, what tools are out there to assist us in recognizing when we too might be falling into whatever traps were out there on that fateful day in April 2007?”

It’s a question with no easy answer. Why do humans, let alone pilots, engage in any life-shortening behavior? In some cases they are ignorant of the danger: the golfer who continues to play when a thunderstorm approaches. Sometimes it’s the risk taker playing the odds: the youthful foolishness of some motorcyclists approaching aircraft speeds on the highway. The NTSB does not speculate on a person’s mindset unless there’s compelling evidence to show what he was thinking—that’s a tall order. In some accidents there is circumstantial evidence that pressure may have played a part, but in most cases we get the “what” (“Pilot failed to…[insert omission]”) and seldom the “why.”

So, if you’ll indulge some armchair psychology, here’s my greatly oversimplified Judgment Failure Hypothesis. This absolutely unproven explanation holds that judgment-related accidents, not skill-based ones, have three fundamental origins:

1. The pilot is trying to get too much utility out of the aircraft. Examples include overloading the airplane, trying to take off from or land on a runway that is too short, running out of fuel, flying a VFR-only aircraft into IMC, flying a non-deiced aircraft into icing, flying the aircraft with a known deficiency, or flying into an area of rapidly developing thunderstorms without onboard radar.

2. The pilot is trying to get too much utility out of himself. VFR into IMC is one of the best examples of this. Flying when fatigued is another.

3. The pilot is attempting to have too much fun—this would include buzzing, river-running, improper aerobatics, et cetera.

In the Twin Comanche accident, both 1 and 2 might apply. It’s been said many times that datalink weather is not airborne weather radar and is not to be used tactically. Operating around thunderstorms, I am constantly reminded that what looks similar to a situation I’ve seen before may not be. Context, or the nature of the weather system, is critical. Misjudge it and bad things may happen.

After the accident it’s quite obvious to everyone what went wrong; that’s likely not so clear ahead of time. The statistical truth that safety advocates don’t like is that most of the time, pilots get away with egregiously bad judgment because they didn’t quite get to the last link in the accident chain. This can build a dossier of success based on unsafe practices. We all know an “ace of the base” who is cocky and quite certain that the rules don’t apply to him, and usually he doesn’t come to grief.

The “hot stove” theory does not work well in aviation. Remember Mom’s hot stove warning when we were children? We get smart quickly around stoves because the result is immediate and a burn is certain. That certainty is largely missing from VFR into IMC, buzzing, driving while intoxicated, flying near thunderstorms, and texting while driving. We get away with it most of the time, but the odds of an accident never go away.

Risk is generally analog, not digital. It’s seldom “either/or” but rather a matter of degree. If the bad stuff really were there every time we heard “VFR not recommended” or saw thunderstorms or ice in the forecast, pilots would stop messing with it. Often the weather turns out better than forecast, but the government has been on the losing end of too many “failure to warn” lawsuits, so rather than truth in forecasting there is defensive forecasting.

Mapping the human mind to determine what someone is thinking, and why, is unlikely to happen anytime soon. There are way too many variables and brain chemicals to account for, and humans are notoriously devious. Fortunately, we are allowed to take personal risk in the United States, and smart pilots manage it, believing all the while that Murphy’s Law is for optimists.

Bruce Landsberg writes and speaks on aviation safety topics throughout the year.
