
Brain direct: In pursuit of an aircraft-mind connection


Future pilots might be invited to have a tiny sensor surgically implanted in their brain, extending the fly-by-wire concept straight to the motor cortex. A more palatable option, perhaps, is simply donning a headset, though accuracy suffers.

If using Wonder Woman’s tiara to fire up your Skyhawk sounds farfetched, it is, for now, according to the scientists working to decipher, through arrays of electrodes hugging the scalp, the complex electrical signals that billions of neurons generate as people pilot simulated airplanes (“look, Ma, no hands”).

There has been a surge in interest—and research investment—in recent years, and the first goal is not to fly hands-free, but to bridge gaps between damaged cells. There is good reason to hope paralysis and similar disabilities with neurological roots can be overcome by creating a direct connection between mind and machine.

Much of this work involves flight simulation, in large part because controlling an airplane with electrical signals from the brain is more complex and challenging than driving a virtual car. Researchers at the University of Illinois at Urbana-Champaign have a lot of practice with test subjects flying simulated aircraft in the laboratory. Associate Professor of Aerospace Engineering Timothy Bretl said in an email it is “unlikely” that systems based on EEGs and similar devices “will give performance that exceeds traditional input devices (e.g., pedals, stick) in an aircraft. That being said, what exactly constitutes an ‘EEG/EMG-based interface’ is becoming less clear.”

Another potential application of reading a pilot’s mind might help increase safety without touching controls: EEG-based systems can already measure a subject’s focus and concentration, alerting when fatigue begins to dull the mind or concentration wavers.

One of Bretl’s students, doctoral candidate Abdullah Akce, explained the challenge of picking out signal from noise and deciphering the kind of instructions from the brain that evoke small, precise movements: Electrodes resting on the outside of the skull capture brain signals at a level similar to a microphone picking up sound in the middle of a crowded stadium. Knowing that the crowd is cheering a play, or booing an umpire, is one thing; picking out individual conversations is quite another.

Akce said computer software can decipher basic commands in a couple of seconds. Think “left,” or “right,” and the machine understands and responds. But fine-tuned control, the sort of constant input required, for example, to fly an instrument approach is far more complex, and current systems lack the accuracy the FAA would require.

“That decision might be noisy,” Akce said. “One out of five commands might be incorrect.”
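To see what that error rate means in practice, here is a minimal sketch, in Python, of one standard way to trade speed for accuracy. Everything in it is invented for illustration: the two-command vocabulary, the 20 percent error rate, and the five-sample majority vote are assumptions, not a description of the software Akce’s group actually uses.

```python
import random
from collections import Counter

COMMANDS = ("left", "right")

def noisy_decode(intended, error_rate=0.2):
    """Stand-in for a BCI classifier: returns the intended command most
    of the time, but roughly one read in five comes back wrong."""
    if random.random() < error_rate:
        return next(c for c in COMMANDS if c != intended)
    return intended

def voted_decode(intended, window=5):
    """Take several consecutive reads and keep the majority, trading
    response time for accuracy."""
    votes = Counter(noisy_decode(intended) for _ in range(window))
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    trials = 10_000
    single_errors = sum(noisy_decode("left") != "left" for _ in range(trials))
    voted_errors = sum(voted_decode("left") != "left" for _ in range(trials))
    print(f"single-read errors: {single_errors / trials:.1%}")
    print(f"five-sample vote errors: {voted_errors / trials:.1%}")
```

Averaged over many trials, the single read is wrong about one time in five, while the five-sample vote is wrong only a few times in a hundred, at the cost of waiting several times longer for each command.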

A more promising route to connecting pilots with aircraft through a “brain-computer interface” or “brain-machine interface,” as they are variously termed, involves implantation—which risks infection, rejection, and other potential side effects. It is a risk that many people with disabilities are willing to take. Researchers from Brown University, Stanford University School of Medicine, Massachusetts General Hospital, and the U.S. Department of Veterans Affairs are collaborating to create technological solutions to neurological injury and disease. They have developed a small, wireless device that is implanted in the brain, and transmits a much more detailed signal to a computer programmed to translate thought into complex action, such as commanding a robotic arm to grip a soda can.

Human subjects will be implanted soon, and wounded warriors might be back on their feet within a few years.

The pace of such efforts has increased dramatically, note the authors of “Brain-Computer Interface: A State of the Art Summary” (Springer, 2013): No more than five laboratories were active in BCI research in the early 2000s. Today, more than 300 laboratories are focused on the effort.

The press has long since seized on the idea of flying by mind, a concept that is enduringly fascinating, if not yet practical.

“Media exaggerate these things,” Akce said. “There needs to be many more improvements for that to happen.”

Already flying by mind—sort of

Pilots flying today will most likely never land an airplane by thinking, “flare, now.” (It would be a terrible shame if a distraction derailed that train of thought.) There are, however, consumer devices already available that can “read your mind,” deciphering some very basic control inputs (or intentions) from the complex cloud of signal that the human brain generates constantly. Students at Hawaii Preparatory Academy made a splash at MacWorld with a scale quadcopter controlled remotely by an Emotiv headset that picks out a few very basic signals from the electroencephalogram noise. Their teacher, Bill Wiecking, said in an email that it is not farfetched to imagine controlling a manned aircraft this way, once the system is refined.

“Our belief is that this will become a tool for both manned and UAV flight,” Wiecking said.

Much depends on what is meant by “control.” Using a brain-direct interface to input a destination into a flight computer, or cycle displays, is a very different proposition than using it to transmit rapid and precise motor impulses that allow pilots to safely land. In other words, replacing a keyboard or knobs on a flight management computer is a very different prospect than replacing the stick and rudder.

The Puzzlebox Orbit, an off-the-shelf aircraft flown with a NeuroSky EEG headset, uses a version of traditional “autopilot” technology to do the actual flying, with signals from a user’s brain giving general direction.
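That division of labor is easy to picture in code. The sketch below is only an illustration of the architecture, not anything drawn from the Orbit’s software; the command names, the 10-degree heading nudges, and the proportional gains are all invented. The point is that the headset only moves a setpoint, while a conventional control loop decides how to bank and climb.

```python
from dataclasses import dataclass

@dataclass
class Setpoint:
    heading_deg: float
    altitude_ft: float

def apply_brain_command(sp: Setpoint, command: str) -> Setpoint:
    """Coarse intent from the headset nudges the autopilot's target;
    it never touches the control surfaces directly."""
    if command == "left":
        sp.heading_deg = (sp.heading_deg - 10) % 360
    elif command == "right":
        sp.heading_deg = (sp.heading_deg + 10) % 360
    elif command == "climb":
        sp.altitude_ft += 100
    elif command == "descend":
        sp.altitude_ft -= 100
    return sp

def autopilot_step(sp: Setpoint, heading_deg: float, altitude_ft: float) -> dict:
    """Conventional inner loop: simple proportional corrections toward
    the setpoint stand in for the stabilization layer."""
    heading_error = ((sp.heading_deg - heading_deg + 180) % 360) - 180
    return {
        "bank_cmd_deg": max(-25.0, min(25.0, 0.5 * heading_error)),
        "climb_cmd_fpm": max(-500.0, min(500.0, 1.0 * (sp.altitude_ft - altitude_ft))),
    }

# One "right" thought shifts the target heading by 10 degrees; the
# autopilot then works out the bank angle over the following seconds.
sp = apply_brain_command(Setpoint(heading_deg=90.0, altitude_ft=3000.0), "right")
print(autopilot_step(sp, heading_deg=90.0, altitude_ft=3000.0))
```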

Jim Moore

Managing Editor-Digital Media
Digital Media Managing Editor Jim Moore joined AOPA in 2011 and is an instrument-rated private pilot, as well as a certificated remote pilot, who enjoys competition aerobatics and flying drones.