Sponsored by Aircraft Spruce

Aircraft Maintenance: The dangers of expectation bias

Humans are naturally wired to recognize patterns, and to perceive patterns that fit our expectations. When flying or maintaining an aircraft, perceiving what we expect to see and missing what is right there in front of us can be costly—or even fatal.

Photo by Mike Fizer.

Expectation bias, a well-established idea in aviation psychology, is a cognitive phenomenon in which our experiences, preconceived beliefs, or desires influence how we perceive, interpret, or act on information. It is closely linked to optimism bias, in which our evaluation of a situation is shaped by our desires rather than by an analysis of the facts. We commonly refer to this as “wishful thinking,” yet we rarely acknowledge when we are guilty of it. Human beings are preprogrammed to seek results that match our expectations and desires, and this creates a dangerous bias that can lead to faulty perception and errors in judgment.

Consider a Cirrus crash in 2022: The pilot contacted air traffic control and reported a brief “stutter” of the engine, followed by rough running. He requested a diversion to a nearby airport. However, as he neared the alternate, the engine smoothed out. Rather than land, the pilot chose to revert to his plan and continue to his original destination. About 15 nautical miles from that destination, the engine began running rough again, followed by a catastrophic engine failure. The pilot activated the airframe parachute and landed in a field. He and the passenger survived, but the aircraft was destroyed. The pilot changed his course of action based on the unrealistic hope that the engine problem had spontaneously resolved and the flight had somehow magically become safe enough to continue.

I have witnessed several cases of questionable maintenance-related decisions that I would attribute to the same expectation bias or wishful thinking. In one case, an aircraft landed with a completely flat main-gear tire. The pilot was anxious to get back to his home airport rather than suffer the delay of performing a repair away from home. The pilot chose to add air to the tire, evaluate whether it appeared to hold pressure, then depart.

In another case, an aircraft owner noted very low engine oil pressure in flight. They landed and brought the airplane to a mechanic for evaluation. Thinking (or hoping) that the low oil pressure was caused by a false indication, the mechanic ran the engine repeatedly while testing and ruling out the sensor and display. Only after confirming that the low oil pressure reading was accurate did the mechanic remove the pressure relief valve and discover a mechanical failure.

In a final case, a mechanic was attempting to identify the cause of an intermittent in-flight engine failure. Although it could not be reproduced on the ground, the mechanic made several attempts to diagnose and correct the problem. Each time it appeared that the issue had been resolved, the failure reappeared during subsequent test flights. Ultimately, the aircraft crashed during one of the test flights, severely injuring the mechanic.

In each of these cases, the individuals involved instinctively chose to accept risk in exchange for indulging their expectation or optimism bias. In the case of the flat tire, the pilot’s desire to return home allowed him to rationalize that simply adding air would remedy the problem. He avoided confronting the question of how all the air in the tire had been lost, and downplayed the risk of landing with a flat tire, should the air leak out again during the flight home. I’m not sure how the story ended, but even an uneventful landing would only serve to reinforce a risky approach to aviation safety.

In the case of the low oil pressure issue, the mechanic’s bias toward a failure in the indication system, rather than a mechanical failure, resulted in repeatedly running the engine with a questionable oil supply. While this may not have posed an immediate safety risk, it did place the long-term health of the engine at risk. In the end, it only took a few minutes to remove the oil pressure relief valve and identify the mechanical failure.

The case of the engine failure was the most tragic. Without the ability to faithfully reproduce the failure on the ground, the mechanic chose to conduct a test flight based on the expectation that the problem had been remedied, nearly losing their life in the process.

How to protect against expectation/optimism bias

Awareness: Simply recognizing our natural desires and expectations can help break the cycle of poor decision making. If you acknowledge that you can be influenced into unsafe situations, you are more likely to implement steps to avoid them.

Questioning: When you are analyzing a situation and developing a plan of action, ask yourself:

  • Am I being objective, or is this wishful thinking?
  • Have I acknowledged all the facts available to me?
  • Are any of my assumptions illogical or highly unlikely?
  • What are the consequences if I am wrong?
  • What options do I have to reduce or eliminate this risk?

Systematic approach: Follow the FAA’s guidelines for using a risk assessment matrix when approaching every operational or maintenance situation.

  • Identify the potential consequences and severity of the risk in a worst-case scenario.
  • Assess the probability of the risk.
  • Develop a plan of action that acknowledges the facts and mitigates the risk.
  • Establish clear boundaries that ensure you will never knowingly place your life at risk.

The highest-severity risks include engine lubrication, fuel delivery, propeller operation, flight controls, electrical system, and fire. For these items, an in-flight abnormality warrants immediate landing. In other cases, you must carefully evaluate the probability and risk of failure, and make a disciplined decision that prioritizes safety while preserving an alternate plan should the situation deteriorate. On the ground, approach maintenance issues with even more scrutiny. You should never fly an aircraft unless you are completely confident in its safety.

As pilots and mechanics, we routinely encounter situations that do not always have clear solutions. Objectivity, however, is the key to safety. Approach problems with a logical plan—one that you are comfortable saying out loud to yourself and your passengers, that addresses and acknowledges your internal bias, and prioritizes the safety of everyone on board. Until next time, I hope you and your families remain safe and healthy, and I wish you blue skies.

Jeff Simon
Jeff Simon is an A&P mechanic, IA, pilot, and aircraft owner. He has spent the last 22 years promoting owner-assisted aircraft maintenance and created the first inspection tool for geared alternator couplings, available at ApproachAviation.com. Jeff is also the creator of SocialFlight, the free mobile app and website that maps more than 20,000 aviation events and hundred-dollar hamburger destinations and offers educational aviation videos. Free apps are available for iOS and Android devices, and users can also visit www.SocialFlight.com.
Topics: Ownership, Aircraft Maintenance

Aircraft Spruce

Sponsor of Aircraft Maintenance
Aircraft Spruce provides virtually everything a pilot or aircraft owner might need. As a Strategic Partner since 2012, the company sponsors programs that bring hands-on knowledge and DIY spirit to AOPA members.