After a long flight in instrument conditions, a commercial flight crew begins a challenging approach, and the pilots’ attention suddenly turns to a landing gear light malfunction. With the autopilot engaged in cruise, a pilot loses focus and misses a pertinent radio call. Danger can lurk at either end of the workload spectrum: Pilots are more liable to make errors when overloaded or underworked and inattentive.
But what if a system could tap into pilots’ brain waves to automate tasks when they’re overwhelmed and snap them out of it when their attention drifts? Honeywell Aerospace researchers have used electroencephalography (EEG) sensors to quantify brain activity and cast light on when someone is at higher risk of making mistakes. The system has already had some success in measuring soldiers’ workload in stressful environments, and could inform technologies that nudge pilots and other aerospace workers from error-prone mental states into their neurological “sweet spot,” where their performance is highest.
A little stress can be a good thing: with moderate stimulation, one becomes more focused and motivated to complete difficult tasks. Under too much stress, however, one becomes frazzled and performance deteriorates.
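This inverted-U relationship between arousal and performance is often called the Yerkes-Dodson curve. As a purely illustrative sketch, with made-up parameters rather than empirical values, it can be modeled as a simple function that peaks at a moderate level of stimulation:

```python
def performance(arousal: float, optimum: float = 0.5, width: float = 0.3) -> float:
    """Toy inverted-U model: performance peaks at a moderate arousal level.

    `optimum` and `width` are illustrative parameters, not empirical values.
    Arousal is taken on a 0-1 scale; performance is clipped at zero.
    """
    return max(0.0, 1.0 - ((arousal - optimum) / width) ** 2)

# Performance is highest near the optimum and falls off at both extremes:
bored = performance(0.1)      # underloaded and inattentive
peak = performance(0.5)       # the "sweet spot"
frazzled = performance(0.95)  # overloaded
```

The point of the model is only the shape: both ends of the workload spectrum sit below the peak.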
Recognizing that its systems’ effectiveness depends on human operators, Honeywell has developed a measurement of cognitive effort that can inform future technologies—keeping pilots at the height of the performance curve and making up the difference when performance wanes.
Measuring workload or attentiveness by traditional means can be tricky. Ask a test subject to rate his attentiveness at different times during a task, and his answer will be clouded by perception or gaps in his recollection. Physical responses such as answering an ATC call can amplify our understanding of attentiveness, and physiological measures such as heart rate can also give an indication; but Santosh Mathan, principal scientist in the Honeywell Human Centered Systems Group in Advanced Technology, said EEGs give a more accurate measurement, registering more pronounced differences between low- and high-workload situations.
EEGs record the brain’s electrical activity using electrodes on the scalp. As pilots perform tasks at a simulator wearing the sensors, Honeywell filters electronic noise and artifacts such as eye blinking and produces a single metric of cognitive effort. This metric lays the groundwork for emerging technologies to step in.
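The article does not spell out how the filtered signal becomes a single metric, but a common research proxy for cognitive effort is the ratio of theta-band (4-8 Hz) to alpha-band (8-12 Hz) spectral power, since frontal theta tends to rise and alpha to fall with mental effort. A minimal sketch of that idea, assuming a cleaned single-channel trace and using this ratio purely as a stand-in for Honeywell's proprietary algorithm:

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Mean spectral power of `signal` within the [lo, hi] frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return float(power[mask].mean())

def workload_index(eeg: np.ndarray, fs: float = 256.0) -> float:
    """Collapse a cleaned EEG trace into one number: theta/alpha power ratio.

    This is a generic research proxy for cognitive effort,
    not Honeywell's actual metric.
    """
    return band_power(eeg, fs, 4.0, 8.0) / band_power(eeg, fs, 8.0, 12.0)

# Synthetic example: a theta-dominant trace (high effort) scores higher
# than an alpha-dominant one (relaxed).
fs = 256.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
high_load = np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 10 * t) \
    + 0.1 * rng.standard_normal(t.size)
low_load = 0.3 * np.sin(2 * np.pi * 6 * t) + np.sin(2 * np.pi * 10 * t) \
    + 0.1 * rng.standard_normal(t.size)
```

In practice the hard part is the step the sketch assumes away: removing eye blinks and other artifacts before any spectral measure is trustworthy.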
Honeywell put its algorithms and hardware to the test in the Defense Advanced Research Projects Agency’s Augmented Cognition program: Soldiers wore wireless EEG headsets during a combat training exercise. The system gave a fairly accurate estimate of workload, Mathan said—information that could help avert ill-timed interruptions in the high-workload, high-stakes environment of military operations.
Honeywell further developed the technology to detect subtler differences in cognitive effort, a capability that is fueling efforts to estimate changes in cognitive function among stroke survivors in rehabilitation and soldiers with mild traumatic brain injuries.
In an aviation context, this technology can aid Honeywell in testing its product designs, identifying aspects of a design that make users work harder so that developers can improve it. It could also detect lapses of attention, as when a baggage screener has been staring at a screen for long periods of time. Alerting people of their lapsed attention can renew their focus, although Mathan, a pilot, acknowledged that responses to this kind of notification vary.
“It partly depends on people’s perception of their importance to the task,” Mathan said. People who don’t believe their attention is really compromised, or that it matters, are less likely to respond. Mathan said people may take the indications more seriously as the system proves itself.
In flight, new systems could tackle the doldrums of cruise flight by engaging a pilot with relevant tasks when attention dips; on the high-workload end of the spectrum, future technologies could recognize a pilot reaching mental saturation and bring in automated assistance. Instead of thinking of automation as an all-or-nothing proposition, Mathan said, we could use it in proportion to what’s needed.
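That proportional view of automation can be sketched as a simple policy that grades its response to an estimated mental state. The thresholds, level names, and 0-1 scales below are invented for illustration; no fielded system is being described:

```python
def automation_response(workload: float, attention: float) -> str:
    """Map estimated workload and attention (both on a 0-1 scale, assumed
    to come from an EEG-derived metric) to a graded automation response.

    All thresholds and level names here are illustrative assumptions.
    """
    if attention < 0.3:
        return "engage_pilot"    # attention has drifted: prompt with a relevant task
    if workload > 0.8:
        return "assist_heavily"  # near saturation: automate secondary tasks
    if workload > 0.6:
        return "assist_lightly"  # rising load: offer help, keep the pilot in the loop
    return "monitor"             # sweet spot: leave the pilot flying

# Graded responses instead of an all-or-nothing proposition:
saturated = automation_response(workload=0.9, attention=0.7)  # "assist_heavily"
drifting = automation_response(workload=0.4, attention=0.1)   # "engage_pilot"
```

The design point is the middle tiers: automation scales with need rather than switching fully on or off.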
Or we could fly with our brains. A related project by Honeywell gives “hands free” new meaning: In a video demonstration, a pilot controls a simulator by looking at designated flashing, checkered boxes on a screen. A system processing EEG signals recognizes when the pilot looks at the box flashing at the frequency for left, right, up, or down, and executes the command on the autopilot.
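The flashing-box interface described here matches a well-known paradigm: when a person fixates a stimulus flickering at a given rate, the visual cortex produces an EEG response at that same frequency (a steady-state visually evoked potential). A minimal decoder therefore just asks which flicker frequency dominates the spectrum. The frequencies and command mapping below are assumptions for illustration, not details from Honeywell's demonstration:

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) assigned to each command.
COMMANDS = {8.0: "left", 10.0: "right", 12.0: "up", 15.0: "down"}

def decode_command(eeg: np.ndarray, fs: float) -> str:
    """Return the command whose flicker frequency carries the most EEG power."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2

    def power_at(f: float) -> float:
        # Power in the spectral bin nearest the flicker frequency.
        return float(power[np.argmin(np.abs(freqs - f))])

    return COMMANDS[max(COMMANDS, key=power_at)]

# Simulate a pilot fixating the 12 Hz box: the evoked response oscillates
# at the flicker frequency and stands out against background noise.
fs = 256.0
t = np.arange(0, 2, 1 / fs)
eeg = np.sin(2 * np.pi * 12.0 * t) \
    + 0.2 * np.random.default_rng(1).standard_normal(t.size)
```

Real decoders must also handle harmonics, reject trials where no frequency clearly wins, and guard against false triggers—exactly why such interfaces suit low-stakes tasks first.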
The video gives a dramatic demonstration of neurorobotic control, but practical applications will more likely be in controlling less mission-critical tasks when a pilot’s hands are occupied—moving a map or toggling radio frequencies, for instance.
Commercial Honeywell systems may not tap into pilots’ brains anytime soon, but the research is a logical next step for the company, whose products already aim to minimize excess stimuli at times when workload may overwhelm a pilot. SmartView synthetic vision declutters the screen when it detects an unusual attitude to focus the crew’s attention on correcting it; a 3-D taxi system in the works removes roll and pitch instruments when the aircraft is on the ground and visually emphasizes important markers such as hold short lines while de-emphasizing potential distractions. It’s all part of an effort to manage workload so that pilots can operate in their “sweet spot” of cognitive effort, at the top of their game.