
Bad habits, or bad luck?

AOPA’s annual Flight Training Experience Survey includes only one safety-related question, asking respondents to agree or disagree with the statement that “My flight school supports a culture of safety.” In part, that’s because safety questions aren’t much help in distinguishing between competitors, as almost every respondent gives the same answer. (How many clients are going to stick with a school that routinely terrifies them?) But it’s also because primary students can’t be expected to recognize a strong safety culture when they see one, or be able to tell whether the operator is serious about it or just going through the motions.

That doesn’t make primary students incapable of recognizing poor safety culture when they see it. We’re acquainted with one school that lost students by trying to convince them to fly in aircraft with visible deficiencies, including obvious fuel and oil leaks. The school has since changed management. We’ve also heard from students who quit flying with a certain instructor after he showed them what IMC looks like from the inside—without benefit of a clearance.

As it happens, that first operator never did experience an engine failure or in-flight fire during a training flight. Nor have any of the cloud-busting instructor’s unauthorized adventures ended in a midair collision or impact with a mountain or radio tower (so far). We’d bet, however, that not many readers would take the lack of tragic consequences as proof of the acceptability of those practices.

But what about the opposite side of the equation? The most robust safety culture can only reduce risks, not eliminate them, and even extremely rare events happen repeatedly when given enough chances. (One lady in New Jersey won the state lottery twice in four months—a fact that is much less surprising when you consider that tens of millions of players buy billions of tickets each year.) Suppose an energetic, thoughtful, and carefully monitored safety program could reduce the risk of an accident on any given training flight from one in 15,000 to one in 50,000. With 4.5 million hours of training time in 2015 equating to perhaps 3 million individual flights, accidents will still occur. (And in fact, the observed rate was about one in 23,000, so those hypothetical figures aren’t out of line with reality.)
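The arithmetic behind those figures is easy to check. A minimal sketch, using only the numbers cited in the paragraph above (the 3-million-flight estimate and the hypothetical and observed per-flight rates):

```python
# Back-of-the-envelope check of the article's figures. All inputs come
# from the text itself; none are official statistics.
flights = 3_000_000               # rough flight count implied by 4.5M training hours

baseline = flights / 15_000       # accidents at 1-in-15,000 per flight
improved = flights / 50_000       # accidents at 1-in-50,000 per flight
observed = flights / 23_000       # the observed rate cited in the text

print(f"baseline: {baseline:.0f}")   # 200 accidents
print(f"improved: {improved:.0f}")   # 60 accidents
print(f"observed: {observed:.0f}")   # about 130 accidents
```

Even the optimistic one-in-50,000 rate still leaves dozens of accidents a year across the training fleet, which is the point: a strong safety program shrinks the number, but it can't drive it to zero.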

Lottery drawings aren’t just random; they’re also independent: Winning one doesn’t hurt (or help) your chances of winning another. Accidents at a specific school are not independent, or at least they shouldn’t be. Identifying a point of weakness and taking steps to correct it after the fact should reduce the risk of another, similar misfortune. Even so, a further consequence of randomness is that outcomes don’t automatically even out over the course of a lifetime or career. Some, like the lady in New Jersey, end up luckier than they deserve. Others do worse than expected through no fault of their own. The trick is to distinguish systemic weaknesses from the proverbial run of bad luck.
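That "run of bad luck" can be quantified. If accidents really were independent events with a small per-flight probability, the count at a single school over a period of years would follow a Poisson distribution. The sketch below uses the observed one-in-23,000 rate from above, but the school's flight volume (5,000 flights a year) is an invented figure for illustration only:

```python
import math

# Illustrative only: the per-flight rate is the article's observed average;
# the school's flight volume is a made-up assumption.
p = 1 / 23_000            # per-flight accident probability
flights_per_year = 5_000  # hypothetical volume for one small school
years = 15

lam = p * flights_per_year * years   # expected accidents over 15 years (~3.3)

# Probability of seeing 5 or more accidents purely by chance,
# via the Poisson distribution: P(k) = exp(-lam) * lam^k / k!
p_ge_5 = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(5))
print(f"expected: {lam:.1f}, P(at least 5): {p_ge_5:.2f}")
```

With these made-up inputs, a school performing exactly at the fleet-wide average would still suffer five or more accidents in 15 years roughly 23 percent of the time, which is why a raw accident count, by itself, can't distinguish a weak safety culture from an unlucky one.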

Consider a hypothetical flight school that’s relatively small—one location, a handful of instructors—whose CFIs have a reputation for watchfulness, sobriety, and excellent airmanship. Say its owner is a career aviator whose practical experience is impressive for both its breadth and depth and who cuts no corners on maintenance. Now imagine this flight school has lost five aircraft in the span of 15 years. Is that a sign that its commitment to safety isn’t all this description would imply?

Maybe not. Let’s say one accident was a hard landing on a student solo, and another was a common type of ground-handling mishap. One involved a maneuver for which no guidance was provided by the aircraft’s manufacturer (who subsequently advised against trying it). A catastrophic VFR-into-IMC accident occurred on a ferry flight the pilot had been explicitly told not to attempt, and a midair collision took place in good daylight VMC less than two miles from the school’s home airport. After each, the operator held a safety stand-down to dissect the accident chain and modify its procedures to help prevent any recurrence.

In a book called Extinction: Bad Genes or Bad Luck?, paleontologist David Raup made the case that species are more likely to go extinct—as perhaps 99.9 percent of all those that ever lived already have—because of some sudden shock (like an asteroid impact) than from inherent inferiority to newer competitors. Applying the same argument to flight schools might seem like a stretch, but if the absence of accidents doesn’t prove safety, then their occurrence doesn’t automatically prove safety has been neglected. What’s crucial is the desire to learn from past mistakes, the determination to figure out what went wrong, and the willingness to adjust procedures as needed to make future recurrences still less likely.


David Jack Kenny

Manager, Safety Analysis
David Jack Kenny analyzes GA accident data to target ASI’s safety education programs while also supporting AOPA’s ongoing initiatives and assisting other departments in responding to breaking developments. David maintains ASI’s accident database and regularly writes articles for ePilot, Flight School Business, Flight Training, CFI-to-CFI, and other publications.
