Even with all the mistakes we make, humans exhibit better and more consistently safe decision-making than robots. If you were to ask a group of pilots their predictions for the future of pilotless skies, most would give you 10 reasons why humans are better pilots than artificial intelligence. We exhibit critical thinking skills. We can make rapid decisions in situations we have not been trained for. We consider passenger comfort, et cetera. I wonder why it is, then, that we so often demand absolute robotic perfection of ourselves and our fellow pilots?
In most cases, it’s not the FAA demanding blood when bad things happen. For an honest mistake, the agency usually just requires the pilot to get some more training on the issue that led to the incident. No, we do this to ourselves. When we hear about an incident, we look for all the mistakes those “incompetent” pilots made so we can tell ourselves it would never happen to us. And when I make a mistake, the harshest critic is often…me. While that type of thinking might help us manage the discomfort of things not turning out as planned, it doesn’t make us any safer as a pilot group.
One of the privileges of my position as a designated pilot examiner is that I get asked to give FAA-mandated remedial training to those who have had an aircraft incident, such as a runway excursion or incursion. Many times, when these people walk in, shoulders slumped, the defeat is evident on their faces. But folks, one mistake does not make you incompetent. It makes you human. As a recovering perfectionist myself, I’ll tell you the very worst thing we can do is continue to beat ourselves up over a failed checkride or a mishap in the airplane. That self-doubt becomes a self-fulfilling prophecy, making us weaker instead of stronger.
So, for the last long-faced person who showed up in my office, we watched the courtroom clip from the film Sully before we got down to the business of dissecting the runway excursion that led to remedial training. For those of you who haven’t watched the movie, go do it. As a pilot, it will help you breathe a little easier.
Here’s the part that did it for me: Capt. Sullenberger and his first officer, Jeff Skiles, are in the NTSB hearing, watching simulator pilots re-create the dual engine loss scenario—but instead of a water landing on the Hudson, the sim pilots make it back to LaGuardia and Teterboro. (Note: This scene is Hollywood sensationalized. Sullenberger himself requested that the actual names of the NTSB inspectors not be used because he did not want them unfairly demonized.) Sully then turns the tables by eloquently delivering the point that the sim pilots are effectively robots, having the benefit of warning and 17 practice runs before the courtroom simulations. “You are looking for human error. Then make it human,” he says. In the movie, this prompts the NTSB to agree to a 35-second human-factors delay before the sim pilots can turn for the runway. Those attempts at airport landings are unsuccessful.
The NTSB ultimately rules Sullenberger and his crew were heroes for managing to make a safe water landing and get all 155 souls out alive. But here’s the question: What if the NTSB had ruled pilot error? Would that make Sullenberger and Skiles any less heroic? No. They were proficient pilots who managed to keep their cool and form a plan of action in an impossible situation. Hindsight says there may have been a better plan, but there was no way they could have known for sure there in the heat of the moment.
As a group, we pilots tend to dislike excuses. We demand accountability and logical answers to problems. So, if something went wrong, it must have been the pilot’s fault. And if it was pilot error, he must not be a very good pilot. But no—it was pilot error because that pilot was human. When things happen, we need to analyze the incident (minus the self-inflicted abuse) and determine a course of action that will prevent such a thing from ever happening again.
Yes, after the incident, someone sitting in a warm, still room sipping their coffee may come up with ways we could have done it better. We probably will even do this to ourselves. Was there a way we could have reacted differently when the engine failed? Did we miss something on the preflight that might have prevented the problem? Could we have better briefed our students to prevent them from locking up the brakes? Of course, we could have. And with experience, and practice, and more knowledge, we will do it better next time.
That’s what our failures should be good for. They make us better the next time. We are not robots, we’re something better. We’re human.