This accident highlights the importance of making certain that students are well-versed in aircraft systems and can readily troubleshoot system malfunctions. An accurate assessment of the symptoms is critical to the decision-making process. Had the pilot been lost and low on fuel, a precautionary landing would have been an appropriate course of action. In reality, the only problem was the failed electrical system. Had the pilot recognized this early in the course of events, he might have been able to conserve electrical power, communicate his problem, and resolve any navigational concerns.
The only problem with the fuel was that the gauges read zero when battery power waned. The flight departed with full fuel, which provides approximately four hours and 30 minutes' endurance. The actual duration of the flight was one hour and 45 minutes. The NTSB findings suggest that if the pilot had performed the fuel consumption calculations, he might not have attempted the precautionary landing. Certainly, if he had recognized the failure of the electrical system, he could have made better decisions regarding the continuation of the flight.
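Running that calculation takes only a moment, using the round figures above:

    4 hours 30 minutes of endurance − 1 hour 45 minutes flown = 2 hours 45 minutes of fuel remaining

In other words, well over half the usable fuel was still aboard when the gauges dropped to zero.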
Even minor differences in aircraft electrical systems can foul up an unsuspecting pilot, causing him to suspect a problem when none exists. To reduce cockpit glare, the Piper Comanche PA-24 electrical system has a feature that automatically dims the landing gear indicator lights when the panel lights are activated. (Many other aircraft with retractable gear have a similar system.) Had the pilot in the following case known of this feature before ferrying the aircraft at night for his friend, he might have saved himself a lot of grief as well as some serious injuries.
It was a dark night in January when the pilot arrived at the nontowered airport at Cabool, Missouri. Weather at the time included scattered clouds at 1,500 and 2,100 feet with visibility of 10 miles. After circling the airport to determine the runway in use, the pilot lowered the gear and prepared to land.
Completing the prelanding checklist on final approach, he noticed that the green landing gear indicator light was not fully illuminated, and he wanted to verify that the gear was in a safe position for landing. While fumbling in the cockpit, he accidentally turned off the navigation and panel lights, causing the gear indicator light to brighten to full intensity. By then, however, he had flown too far down the runway to complete the landing. He saw the lighted wind tee go by at about midfield and, with insufficient runway remaining, executed a go-around. As the runway lights disappeared below and behind, the pilot turned his attention to the instruments, only to find the entire instrument panel blacked out; he had, after all, just turned off the panel lights. While struggling to reach a flashlight in the cockpit, he inadvertently banked the aircraft to the right. The aircraft struck some trees, then slammed into a wooded hillside.
Here again, there was no problem with the aircraft. Everything was operating normally, but the pilot's lack of systems knowledge lured him into thinking that he had a real problem.
Another incident occurred on a June day in Jamestown, North Dakota. A high-time flight instructor (11,000-plus hours) was trying out the single-engine capability of a Piper PA-23 Aztec with two passengers aboard. The pilot apparently did not recognize how limited the aircraft's performance would be with the critical engine feathered. To make matters worse, he did not understand the implications of securing this aircraft's left engine. Nonetheless, the pilot feathered the left engine, and everything went downhill from there. What began as a simple demonstration of aircraft capability became a stunning demonstration of what can happen when a pilot fails to understand aircraft systems.
The Aztec has only one engine-driven hydraulic pump, and it is mounted on the left engine. The landing gear and flaps are hydraulically actuated, so they lose their normal power source when that engine is secured. When the pilot feathered the engine (shut it down completely), he effectively painted himself into a proverbial corner: he now had only a hand-operated pump with which to extend the landing gear and flaps. That operation is a time-consuming chore, and it is especially difficult when the pilot is simultaneously coping with an engine failure at low altitude. The aircraft wouldn't maintain altitude with only one engine operating, and the pilot was forced to land gear-up in a field. On rollout, the aircraft nosed over and was destroyed. Fortunately, there was no fire, and the pilot and passengers escaped without injury.
As the accident report illustrates, a simulated emergency can quickly turn into real trouble. Nothing was wrong with the aircraft's flaps and landing gear; they performed normally. The engine was performing normally until the pilot intentionally feathered it. Had the pilot followed standard training procedures and simulated the engine failure with a "zero thrust" power setting rather than shutting the engine down completely, the hydraulic pump would still have been operating. The pilot's lack of knowledge precipitated a fouled-up situation in an aircraft that was operating normally.
Our final incident involves a retired airline captain flying a Piper Archer. Before departing on a flight, the pilot had noted with curiosity that the oil pressure warning light seemed to stay on longer than usual after engine start. As he increased the rpm, the light extinguished, and the pilot figured that the only problem was a pressure sensor that was out of calibration. Everything else appeared normal during the runup, and he departed into the hazy summer sky without further ado.
As he flew north over the ocean about 10 miles south of New York's Kennedy Airport, the oil pressure warning light suddenly illuminated. Even at 10,000 feet, the pilot knew it would be a stretch to reach land if the engine quit completely, but perhaps he could nurse his powerplant along and gain a few extra miles toward the safety of Long Island's shores. He throttled back to save the engine and gently lowered the nose.
Shortly thereafter, he realized he had another problem. The vacuum system had failed, and the gyros had begun a lazy, lopsided roll to the inoperative mode. By now the shoreline was in sight, and the pilot's attention shifted again to the engine, which promptly sputtered and quit, confirming the expectations generated by the warning light.
The pilot considered securing the engine but thought better of it. First, he scanned the engine instruments to verify the nature of the problem. A glance at the fuel gauges showed that the fuel selector was set to an empty tank. Perhaps the engine had some life left in it after all. He switched tanks, and the windmilling engine quickly responded with a smooth surge of power. Returning to the engine instruments, he realized that while the oil pressure warning light was illuminated, the oil pressure gauge was reading well into the green. Further study of the instruments revealed that while the vacuum system was truly inoperative, the vacuum system warning light was not illuminated.
The engine didn't miss a beat for the remainder of the flight. After the pilot made a precautionary landing at a nearby airport, it was discovered that the vacuum system and oil pressure warning lights had somehow been swapped during maintenance. The illuminated warning light had really been for the vacuum system rather than the oil pressure, which explained why the low oil pressure indicator had remained on longer than usual after engine start. The engine failure itself was purely a matter of fuel system mismanagement, brought on by the distraction of the warning light. Fortunately, the pilot knew his systems. He carefully evaluated the situation and didn't rush to conclusions. Had he secured the engine, the outcome of the flight would have been much different.
The moral of these stories is that things are not always what they seem. Anomalous indications can be confusing, and the prudent pilot will carefully cross-check them before committing to a course of action, especially one as drastic as securing an engine. A pilot's best defense against system SNAFUs is thorough systems knowledge and clear thinking.