
Drone 'eyes' might be better than yours

A computer-vision collision avoidance solution developed for drones may one day make every aircraft a little safer.

Real-world test encounters near Reno, Nevada, helped prove Casia’s capabilities. Photo courtesy of Iris Automation.

It took months of testing before the crew at Iris Automation realized what the system they had created was really capable of. Now, with about 7,000 real-world close encounters purposely flown, along with more than 40,000 encounters simulated by an increasingly sophisticated computer model, “we’re pretty sure at this point we’re better than human vision,” said Iris Automation CEO Alexander Harmsen.

He co-founded the company and assembled a team of pilots, computer programmers, engineers experienced with fusing multiple sensor systems into a single whole, and aviation industry veterans who have worked for the likes of NASA and Boeing. They've worked together to prove that simple digital cameras connected to sophisticated computers can create an artificial version of situational awareness that is far more comprehensive and accurate than any such picture produced by the human eye and mind.

For perspective, humans have many shortcomings when it comes to seeing and avoiding other aircraft. Painstaking reconstructions of midair collisions, including a 2015 accident involving a U.S. Air Force F–16 and a Cessna 150 over South Carolina, revealed that when any two aircraft wind up on a collision course, many things have to go right to avoid disaster, with a healthy measure of good luck added. The NTSB reconstructed that fatal 2015 collision second by second and determined that the view of both pilots was obstructed at a critical moment by parts of their respective aircraft.

Iris Automation has conducted thousands of real-world intercept tests. Photo courtesy of Iris Automation.

Cameras able to capture light across a range of wavelengths can be mounted on drones in locations that offer clearer views than are possible from inside most cockpits. In the Casia artificial situational awareness system, which went on sale April 26, those cameras feed a compact onboard computer that takes data from the unblinking artificial eyes and starts by solving a rather straightforward geometry problem. Within milliseconds, it sizes up each new object in the visual field and calculates the object’s closing velocity and trajectory, along with what maneuver is required to deconflict. That maneuver is then initiated through a direct connection between Casia and the unmanned aircraft’s flight control system, with no human intervention required.
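As a rough illustration of the kind of geometry a detect-and-avoid system must solve, the sketch below computes a constant-velocity closest point of approach from a relative position and velocity. It is a textbook example only, not Casia’s algorithm, and the alerting thresholds and sample numbers are assumptions chosen for illustration.

```python
# Illustrative only: a minimal constant-velocity closest-point-of-approach (CPA)
# calculation of the kind a detect-and-avoid system must solve. This is not
# Iris Automation's Casia algorithm; thresholds and inputs are assumed values.
import numpy as np

def closest_point_of_approach(rel_pos, rel_vel):
    """Return (time_to_cpa_seconds, miss_distance_meters) for an intruder
    with relative position rel_pos (m) and relative velocity rel_vel (m/s)."""
    speed_sq = np.dot(rel_vel, rel_vel)
    if speed_sq < 1e-9:                      # intruder not moving relative to us
        return float("inf"), float(np.linalg.norm(rel_pos))
    t_cpa = max(-np.dot(rel_pos, rel_vel) / speed_sq, 0.0)   # future approach only
    miss = float(np.linalg.norm(rel_pos + rel_vel * t_cpa))
    return t_cpa, miss

# Example: intruder 500 m ahead and 20 m above, closing head-on at 60 m/s.
t, miss = closest_point_of_approach(np.array([500.0, 0.0, 20.0]),
                                    np.array([-60.0, 0.0, 0.0]))
if t < 30.0 and miss < 150.0:                # assumed alerting thresholds
    print(f"Maneuver: CPA in {t:.1f} s at {miss:.0f} m")
```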

“One of the things our system is quite good at is to always be on, and always be looking at the outside world,” said Harmsen, whose 10 years of experience flying manned aircraft have informed his view that distractions often interfere with seeing and avoiding traffic. He cited research that determined the average human pilot needs 12.5 seconds to see and react to converging traffic, and “for our system, it’s about 200 milliseconds. There’s a huge difference between humans and what our technology would be able to do.”
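For a sense of scale, the short calculation below compares how far converging traffic travels during those two reaction times; the 140-knot combined closure rate is an assumed figure for illustration, not one from Harmsen or the cited research.

```python
# Back-of-the-envelope comparison of reaction times in terms of closure distance.
# The 140-knot combined closure rate is an assumed example value.
closure_kt = 140
ft_per_s = closure_kt * 1.68781          # knots to feet per second (~236 ft/s)

for label, seconds in [("human (12.5 s)", 12.5), ("Casia (0.2 s)", 0.2)]:
    print(f"{label}: traffic closes about {ft_per_s * seconds:,.0f} ft before a reaction")
```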

Indeed, independent research conducted by various universities that tested the limits of human visual observers found that humans tend to make mistakes when estimating closing trajectories and collision risk from the ground. One such study found that observers on the ground consistently miscalculated collision potential and the time available to avoid collision by an average of 17 seconds; it also found that ground observers are unable to reliably spot manned aircraft from the ground beyond a range of about 1,600 feet.

Iris Automation’s computer-vision approach to detect-and-avoid is the first system to reach the market that closes a fundamental gap in safety, allowing drones to automatically evade other objects in the airspace, including manned aircraft that are not broadcasting their position with Automatic Dependent Surveillance-Broadcast (ADS-B) Out or emitting a transponder signal. Such “noncooperative” aircraft will continue flying for years to come, and AOPA has urged the FAA to consider that fact when deciding how to regulate advanced drone missions, including flights beyond the remote pilot's line of sight.

“There are numerous types of manned aircraft flying every day that do not and will not have a transponder or ADS-B installed,” said Rune Duke, AOPA senior director of airspace, air traffic, and aviation security. “Many general aviation aircraft do not have an electrical system that can support cooperative technology. We support the innovative work by Iris Automation and other companies who are leading the industry in detect-and-avoid solutions that are compatible with all types of manned aircraft flying in the same airspace.”

The FAA recently approved the nation's first drone delivery service, with Wing, a subsidiary of Google parent Alphabet, cleared to deliver packages in rural areas. Other drone flights beyond visual line of sight are being conducted on a limited basis in defined airspace, including newly approved programs in Ohio and North Dakota, where ground-based radar is used to detect any noncooperative aircraft approaching a drone operations corridor. This approach is costly to scale up, particularly in areas where radar coverage is limited or entirely unavailable.

Harmsen said the lack of a drone-mounted, active-detection system that can be deployed on a large scale has been the single biggest obstacle to unleashing the potential of drones to operate beyond line of sight for medical emergency response, disaster relief, infrastructure inspection, and a host of other missions that cannot be accomplished with the drone in view.

“This problem, the risk mitigation of noncooperative aircraft, is still the number one problem of this entire industry,” Harmsen said.

Casia does more than solve geometry and trajectory vector problems. It has layers of additional processing power built in that enable it to identify the objects it "sees" and to differentiate among aircraft types, as well as birds and any other objects it may encounter in the air. This higher-order processing and analysis happens in parallel with the fundamental collision avoidance algorithm, and allows the system, which is integrated with the unmanned aircraft’s flight control system, to refine its response as circumstances dictate.
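In broad strokes, that layered design resembles a fast avoidance loop running beside a slower classifier that labels each track. The sketch below illustrates the pattern only; it is not Casia’s architecture, and every function, queue, and value in it is a stand-in.

```python
# Conceptual sketch only: a fast avoidance loop running in parallel with a
# slower object classifier, mirroring the layered processing described above
# in general terms. Not Casia's real architecture; all functions are stubs.
import queue
import threading
import time

detections = queue.Queue()   # (track_id, geometry) tuples from the camera pipeline
labels = {}                  # latest classification result per track id

def classify(image_chip):
    # Stand-in for a slower vision model (aircraft type, bird, other).
    time.sleep(0.05)
    return "light aircraft"

def plan_avoidance(geometry):
    # Stand-in for the fast CPA-style geometry solve.
    return {"climb_m": 30} if geometry["miss_m"] < 150 else None

def avoidance_loop():
    """Fast path: react to each detection within milliseconds."""
    while True:
        track_id, geometry = detections.get()
        maneuver = plan_avoidance(geometry)
        label = labels.get(track_id, "unknown")   # refine with a label if ready
        if maneuver:
            print(f"track {track_id} ({label}): commanding {maneuver}")

threading.Thread(target=avoidance_loop, daemon=True).start()

# Slow path: classification runs in parallel and refines later responses.
detections.put((1, {"miss_m": 40}))
labels[1] = classify(None)
detections.put((1, {"miss_m": 40}))
time.sleep(0.2)
```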

Iris Automation has been a participant in the federal drone integration pilot program, and Harmsen said customers who installed the system through an early adopter program have added to a trove of data that has further refined the company’s computer simulation environment, which augments the real-world trials in which manned aircraft are deliberately flown close to Casia-equipped drones.

Much of the firm's real-world close encounter work has been done in Reno, Nevada, where an Iris Automation team of manned and unmanned pilots has worked to demonstrate the reliability of the system based on proven, simple technology and custom software. Aerial encounters can happen in a huge range of lighting conditions, angles of approach, and closing velocities, and Casia has so far proved capable of handling them all, though testing every conceivable scenario is a daunting proposition.

Harmsen said the system has proved capable of identifying and solving the closing trajectory and avoidance maneuver calculations for an object measuring 10 feet in diameter out to a range of about 1,600 feet. Real-world tests organized by the FAA and others have consistently demonstrated performance superior to human capability in all but one respect, Harmsen said: “Based on pure range, ability to see… if a human (in an aircraft cockpit) was focused on something in the distance, I think (the human pilot) would beat our system,” Harmsen said. “That’s probably the one thing that humans are better at.”
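To put that figure in perspective, a quick back-of-the-envelope calculation shows how small a 10-foot object appears at 1,600 feet; the camera resolution and field of view used here are assumed values for illustration, not Casia’s specifications.

```python
# How small is a 10-ft object at 1,600 ft? The 4K sensor width and 60-degree
# field of view below are assumed values, not Casia's actual specifications.
import math

target_diameter_ft = 10.0
range_ft = 1600.0

# Angular size of the target as seen from the camera.
angle_deg = math.degrees(2 * math.atan((target_diameter_ft / 2) / range_ft))

# Assume a 3840-pixel-wide sensor spanning a 60-degree horizontal field of view.
pixels_on_target = angle_deg * (3840 / 60.0)

print(f"Angular size: {angle_deg:.2f} deg "
      f"(~{pixels_on_target:.0f} px on an assumed 4K, 60-deg camera)")
```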

Iris Automation co-founders Alex Harmsen, left, and James Howard. Photo courtesy of Iris Automation.

Despite a more limited maximum potential detection range, the system has proved capable of getting a drone out of the way, even in high-speed, head-on encounters that present the greatest challenge. An FAA-organized test in Kansas included such intercept scenarios, and at no point did the drone come closer than about 100 vertical feet to a manned aircraft passing straight overhead. “That was probably the most extreme testing that we’ve done,” Harmsen said.

Harmsen declined to name a specific price for Casia, partly because much depends on the customer’s particular needs. He said the hardware is relatively inexpensive on a unit-cost basis, and sold with a service subscription that includes regular software updates and a licensing agreement. He said the price per unit therefore depends on the size of the drone fleet a customer wants to equip. Casia has been fitted to drones as small as a DJI Matrice 100 quadcopter, and is designed to add minimal weight (about 300 grams) and minimal power demands, enabling it to be fitted to the broadest possible range of drones.

“I’m comfortable saying that it’s between $5,000 and $50,000,” Harmsen said of Casia’s typical cost, adding, “it’s on the lower end of that.”

The purchase price also includes Iris Automation's support with regulatory applications, so when a customer applies for an FAA waiver to conduct flights beyond visual line of sight, the company will provide data to support that application.

The startup firm, with offices in San Francisco and Reno, hopes to scale up to mass production sooner rather than later.

“What we really want to do is we want to sell a couple million of these systems over the next couple of years,” Harmsen said. “That’s really the vision that we’re trying to unlock.”

Harmsen said the system’s potential utility for manned as well as unmanned aircraft has not been lost on pilots who have been introduced to it at trade shows and conferences.

“It’s definitely a market we’re looking at,” Harmsen said, though the company is focused on equipping unmanned aircraft in the near term.

Adapting Casia for manned aircraft may require some refinement, depending on whether it is integrated with an automated system able to manipulate flight controls to avoid an object, or configured to provide warnings to the human pilot, who must then maneuver accordingly.

Casia is not the first drone-mounted system capable of detecting noncooperative aircraft to reach the market, though it is the first such system that utilizes computer vision. Echodyne developed a drone-sized radar system that has also been tested in conjunction with the federal drone integration program, while PrecisionHawk is testing another approach entirely, using microphones to detect nearby aircraft. Radar can penetrate clouds more reliably than imaging systems tuned to visible or infrared wavelengths, giving it one important advantage if instrument flight is required. Whether a radar array can match a computer vision system’s field of view at comparable size, weight, power consumption, and cost remains unclear. Tradeoffs like this help explain why most experts believe a combination of technologies will be required to achieve the necessary level of safety in all operating conditions.

This image simulates Casia’s view of a Cessna 172. Image courtesy of Iris Automation.
Jim Moore

Managing Editor-Digital Media
Digital Media Managing Editor Jim Moore joined AOPA in 2011 and is an instrument-rated private pilot, as well as a certificated remote pilot, who enjoys competition aerobatics and flying drones.
Topics: Drone, Aircraft Regulation, Advocacy
