
NASA takes synthetic vision to the next level

Assembling cameras, an eyeglass frame, and a tiny projector is easy enough. The result may profoundly increase safety and efficiency for general aviation and commercial aircraft alike by putting the world outside right in front of a pilot’s eye, visible through clouds thick and thin. A truly effective “virtual reality” display on this scale will take a little longer to perfect, though NASA expects to start flight testing this year.

Various teams at NASA’s Langley Research Center in Hampton, Va., are working to give pilots a tool that will keep them safely separated in a NextGen environment when visibility is nonexistent. A display prototype mounted on ordinary safety glasses, refined from a 10-pound helmet developed several years ago, is perhaps less than a decade away from cockpits, either commercial or GA. That timing, said Trey Arthur, an engineer on the project, will depend most on who wants it.

Arthur and the team are working to reduce the time lag between real-world events and the synthetic vision image that is projected on a screen the size of a postage stamp. That latency, the gap between what the pilot's head and aircraft are actually doing and what the display shows, causes disorientation in the air, prompting control inputs that don’t match what’s needed.

“Pilot-induced oscillation,” Arthur calls it.

On the ground, it can result in motion sickness (and has during early tests).
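To see why even a small lag matters on a head-worn display, consider a back-of-the-envelope latency budget. The Python sketch below is purely illustrative; the stage names and millisecond figures are assumptions, not NASA's measurements, and simply show how per-stage delays accumulate into the motion-to-photon lag that can trigger those oscillations and queasiness.

```python
# Illustrative latency budget for a head-worn display.
# Stage names and millisecond figures are assumptions, not NASA's measured values.

LATENCY_BUDGET_MS = {
    "head_tracker_sample": 5.0,   # reading head orientation
    "pose_filtering": 2.0,        # smoothing / sensor fusion
    "scene_render": 12.0,         # drawing the synthetic or camera image
    "display_scanout": 8.0,       # pushing pixels to the micro-display
}

def total_latency_ms(budget):
    """Sum the per-stage delays into a single motion-to-photon latency."""
    return sum(budget.values())

def lag_angle_deg(latency_ms, head_rate_deg_per_s=60.0):
    """How far the displayed image trails the head at a given turn rate."""
    return head_rate_deg_per_s * (latency_ms / 1000.0)

total = total_latency_ms(LATENCY_BUDGET_MS)
print(f"motion-to-photon latency: {total:.1f} ms")
print(f"image lag during a 60 deg/s head turn: {lag_angle_deg(total):.1f} deg")
```

With these assumed numbers, a 27-millisecond lag leaves the image about 1.6 degrees behind a head turning at 60 degrees per second; a mismatch of that kind is exactly what the team is trying to engineer away.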

Photo caption: NASA research engineer Trey Arthur wears a pair of glasses modified to project a virtual version of the world outside on a tiny screen in front of the eye.

Solve that latency problem with faster electronics, and there will be virtually no limit on what a pilot can see. One version of the system is designed exclusively for airport operations, using a three-dimensional database of structures, runways, and taxiways coupled with inertial guidance to create a fully synthetic image of the world outside. A line is drawn pointing the way to the terminal or departure runway, and a pilot can easily navigate even the most complex airports without verbal instruction, as Arthur demonstrated. Controllers would upload taxi instructions digitally, and Automatic Dependent Surveillance-Broadcast (ADS-B) data from other aircraft would also be added to the display.
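As a rough illustration of that data flow, the Python sketch below merges own-ship position, a datalinked taxi route, and ADS-B traffic reports into a single frame for the head-worn display. The Position, AdsbTarget, and build_taxi_frame names are hypothetical, invented for this example rather than taken from NASA's software.

```python
# Minimal sketch, not NASA's software: merge a datalinked taxi route and
# ADS-B traffic with own-ship position into one frame for the display.
# All class and field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Position:
    lat: float
    lon: float

@dataclass
class AdsbTarget:
    callsign: str
    position: Position

def build_taxi_frame(ownship, taxi_route, traffic):
    """Assemble what the display would draw for one frame: the guidance
    line to the assigned runway or gate, plus symbols for nearby traffic."""
    return {
        "guidance_line": [ownship] + taxi_route,      # path drawn on the scene
        "traffic_symbols": [(t.callsign, t.position)  # diamonds or 3-D models
                            for t in traffic],
    }

# Example: a two-waypoint taxi clearance and one aircraft reported via ADS-B.
frame = build_taxi_frame(
    ownship=Position(37.086, -76.360),
    taxi_route=[Position(37.087, -76.361), Position(37.088, -76.363)],
    traffic=[AdsbTarget("N123AB", Position(37.089, -76.362))],
)
print(frame)
```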

One research question Arthur’s team has yet to settle: whether a diamond symbol like those on existing anti-collision displays is sufficient, or whether the on-board database should include high-resolution aircraft images.

In a parallel effort, researchers are feeding the eye-mounted display with images from infrared cameras that move with the pilot’s head, synchronized by a tiny camera mounted on the display to track head movement. A pilot can then spot traffic—or whatever else the heat-seeking camera picks up—straight through clouds or fog. This “enhanced vision system” would require no on-board database.

“It also solves what we call the deer on the runway problem,” Arthur said. “If there’s something out there that’s not in your database, that’s not transponding, that EVS system could potentially pick that up.”
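One simple way such head-slaved imagery could be assembled, sketched below under assumptions of our own (a single fixed wide-field infrared camera, a head tracker reporting yaw and pitch, and made-up field-of-view figures), is to crop the slice of the wide camera frame that matches where the pilot is looking. NASA's hardware may steer the cameras themselves; this is only meant to show the geometry involved.

```python
# Illustrative sketch only: pick the infrared sub-image that matches the
# pilot's head pose. Field-of-view values and the tracker interface are
# assumptions for demonstration, not NASA's design.

def crop_for_head_pose(frame_w, frame_h, cam_hfov, cam_vfov,
                       disp_hfov, disp_vfov, yaw_deg, pitch_deg):
    """Return (left, top, width, height) of the camera-frame window to show."""
    px_per_deg_x = frame_w / cam_hfov
    px_per_deg_y = frame_h / cam_vfov
    crop_w = int(disp_hfov * px_per_deg_x)
    crop_h = int(disp_vfov * px_per_deg_y)
    # Shift the window center by the head's yaw and pitch.
    center_x = frame_w / 2 + yaw_deg * px_per_deg_x
    center_y = frame_h / 2 - pitch_deg * px_per_deg_y
    # Clamp so the window stays inside the camera frame.
    left = int(max(0, min(frame_w - crop_w, center_x - crop_w / 2)))
    top = int(max(0, min(frame_h - crop_h, center_y - crop_h / 2)))
    return left, top, crop_w, crop_h

# Example: a 640x480 IR frame covering 60x45 degrees, a 30x20-degree display,
# pilot looking 10 degrees right and 5 degrees down.
print(crop_for_head_pose(640, 480, 60, 45, 30, 20, yaw_deg=10, pitch_deg=-5))
```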

Arthur said flight testing of some systems should begin later this year. The end results would facilitate key elements of the FAA NextGen plan: increasing airport capacity and reducing fuel consumption – in part by allowing two large aircraft to fly simultaneous instrument approaches to parallel runways.

“They wouldn’t do this in today’s operations,” Arthur said. The “enhanced vision system” based on camera images would theoretically work in any weather, limited only by what the camera can see. “What we call better than visual.”

Pairs of approaching aircraft would be sequenced on a time interval that pilots control themselves, leaving controllers to simply watch the action. Radio congestion would be drastically reduced even as traffic volume dramatically increased.
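A minimal sketch of that kind of pilot-managed spacing, assuming a simple time-based rule: given the lead aircraft's estimated time to the runway and a desired interval behind it, compute the ground speed that holds the gap. The 90-second target and the function name are illustrative assumptions, not an FAA or NASA algorithm.

```python
# Minimal spacing sketch under assumed numbers; not NASA's or the FAA's method.

def spacing_speed_advisory(dist_to_runway_nm, lead_eta_s,
                           desired_interval_s=90.0):
    """Ground speed (knots) needed to cross the threshold
    desired_interval_s after the lead aircraft."""
    own_target_eta_s = lead_eta_s + desired_interval_s
    return dist_to_runway_nm / (own_target_eta_s / 3600.0)

# Example: 12 nm from the runway, lead aircraft crosses the threshold in
# 4 minutes, target spacing 90 seconds behind it.
print(f"fly about {spacing_speed_advisory(12.0, 240.0):.0f} kt")
```

In practice an advisory like this would be recomputed continuously from ADS-B data rather than once, but the arithmetic behind the interval is the same.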

Based on the pace of past developments, Arthur expects this technology to hit the marketplace in perhaps eight years.

Photo caption: NASA research engineer Trey Arthur is part of a team working to reduce image lag that can induce disorientation, and even motion sickness.

“If the technology’s there and the market’s there and the right people push the technology in the market, it seems to be about 10 years when that comes around,” Arthur said, adding enhanced vision concepts are one to two years into that cycle now. “People have seen it—the interest has been there.”

Once motion-simulator and flight testing are complete, Arthur said headquarters will decide whether to patent and license the technology, or simply give it away. NASA has delivered various aviation technologies to GA via both methods.

Arthur said that while the systems are being developed primarily with commercial aviation in mind, there’s no reason to believe versions won’t show up in GA cockpits first. Such was the case when the same NASA group produced panel-mounted synthetic vision displays, Arthur said: Business aircraft owners were among the earliest adopters.

Jim Moore
Managing Editor-Digital Media
Jim Moore joined AOPA in 2011 and is an instrument-rated private pilot and certificated remote pilot who enjoys competition aerobatics and flying drones.
