
Why do Amazon’s autonomous drones keep falling out of the sky?

  • Adam Bluestein
4/28/2022

Could machine learning problems be contributing to crashes?

Jeff Bezos may have shot himself to the edge of space, but he’s had a much harder time getting Amazon’s drone delivery service, Prime Air, off the ground.

Since announcing the program in 2013, with the promise of 30-minute delivery within three or four years, Amazon’s foray into aerial robotics has steered a wobbly course. A recent Bloomberg News report revealed a program plagued by technical problems, personnel issues and flagging morale, not to mention concerns that the company may be sacrificing safety for speed in an effort to keep up with competitors.

While Wing, a subsidiary of Alphabet, made more than 100,000 deliveries with its delivery drones last year in Australia — and was recently cleared to launch commercial service in the suburban Dallas-Fort Worth area — Amazon’s delivery program has yet to make a commercial flight, and has recently drawn attention for other reasons.

According to Bloomberg, five of Amazon’s autonomous drones — UAV (unmanned aerial vehicle) or UAS (unmanned aircraft system) are the preferred terms — crashed within a span of four months last year. As first reported by Insider last month, one crash last June started a 25-acre blaze that had to be put out by the local fire department. All the incidents took place at the Pendleton UAV Range, an FAA-designated drone-testing site in the high desert of Eastern Oregon, where companies like Amazon, Airbus, Verizon and other clients fly up to 1,000 test flights a month.

What’s going on? As with most drone crashes, it’s hard to know. Under U.S. law, only certain types of non-recreational drone incidents must be reported to the Federal Aviation Administration and the National Transportation Safety Board: those involving drones weighing more than 300 pounds, property damage costing more than $500, or serious injury or death. Amazon self-reported last May’s crash, which occurred after a drone lost a propeller, to the NTSB, but cleared the site before the FAA could investigate. (An Amazon spokesperson told The Verge that the company was following orders from the NTSB.)

An FAA report on the June crash that caused the fire determined that the drone’s motor shut off as it shifted from vertical flight to flying straight ahead, and two safety features designed to stabilize the drone and land it in this type of situation both failed, causing the drone to flip and plummet upside-down from 160 feet up.

We may never know what caused the other crashes. Another propeller or motor problem? Or something else?

The technical challenges of autonomous flight are daunting, chief among them the development of reliable computer vision systems — complex arrays of cameras, other sensors and onboard processors that tell the drone where it is in three dimensions, and help it to detect, identify and hopefully avoid objects as it plots an efficient course from Point A to Point B. Throw in variables of lighting, weather and all manner of real-world surprises, and the difficulty level becomes clear. Whether or not problems with its computer vision systems contributed to any of the Amazon drone crashes, these systems certainly merit close attention.

Approved by the FAA in 2020 for commercial deliveries, Amazon’s 85-pound, six-propeller Prime Air drone (the Model MK27) reportedly uses a multi-view stereo-vision system (with cameras in different positions to help gauge depth), a thermal camera and a sonar unit mounted to the drone, with an onboard “neural network” that detects and classifies objects — building, bird, powerline — on the fly. Such multi-view systems have helped to decrease the risk of drone crashes and make drone flights more efficient. But each piece of these systems also has its limitations.

“The short baselines of stereo cameras [i.e., the distance between cameras] on drones means that they can’t sense depth reliably past about 10 meters,” says Leaf Jiang, an engineer and CEO of Nodar, a Boston-area startup that makes 3D sensing systems for autonomous vehicles. “This means that they need to fly slowly when they expect trees, power lines or other obstructions.”
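Jiang’s point follows from the standard stereo-depth relation, Z = f·B/d, where f is focal length, B is the baseline and d is the pixel disparity between the two camera views: depth uncertainty grows with the square of distance. A rough back-of-the-envelope sketch (all camera numbers are hypothetical, not the MK27’s actual specs):

```python
# Rough illustration of why short stereo baselines limit reliable depth range.
# Camera parameters here are hypothetical, not Amazon's actual specs.

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px: float, baseline_m: float, depth_m: float,
                disparity_err_px: float = 0.25) -> float:
    """Depth uncertainty grows with the square of distance: dZ ~ Z^2 * e / (f * B)."""
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

f_px = 800.0   # focal length in pixels (hypothetical)
b_m = 0.10     # 10 cm baseline -- about the widest a small drone's body allows

for z in (5.0, 10.0, 30.0):
    print(f"at {z:4.0f} m, depth error is roughly {depth_error(f_px, b_m, z):.2f} m")
```

With these assumed numbers, the error is centimeters at 5 meters but balloons to meters at 30 meters — which is why a short-baseline drone must slow down when obstacles like power lines may be ahead.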

Adam Bry, the CEO of Skydio, a leading U.S. maker of drones for clients in industry, law enforcement and the military, says “the magic is in the software. We’re getting depth perception entirely from neural networks doing stereo calculations [comparing views] from six fisheye cameras.”

The challenge with deploying neural nets on drones, Jiang says, “is usually being able to get good performance after the network is severely trimmed down to fit on the wimpy computer hardware.” Nvidia and other chipmakers, though, are making big investments in advancing the technology. “The more compute we have, the more interesting and useful stuff we can do,” says Bry. “So in that sense, we are limited by the processors. But the technology is already at a very powerful point, and it’s improving very quickly.”
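The “trimming” Jiang describes usually means pruning and quantization — storing network weights at lower precision so the model fits in a small processor’s memory. A toy sketch of post-training 8-bit quantization (illustrative only, not any drone maker’s actual pipeline):

```python
# Toy sketch of post-training 8-bit quantization: one kind of "trimming"
# used to fit a neural network onto small onboard hardware. Illustrative only.
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 with a single scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(10_000).astype(np.float32)  # stand-in for a weight tensor

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

print(f"size: {w.nbytes} bytes -> {q.nbytes} bytes (4x smaller)")
print(f"max round-off error: {np.abs(w - w_hat).max():.4f}")
```

The memory footprint drops fourfold at the cost of a small round-off error in each weight — the “good performance after trimming” question is whether those accumulated errors degrade the network’s accuracy.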

Some large drones employ LIDAR systems, in addition to cameras, to help gauge distance from objects or for mapping. But because of LIDAR’s high power requirements, “there’s a poor cost-size-weight performance trade-off [for most drones],” says Bry.

Some drones, like Amazon’s, “might use a cheap-o sonar range finder pointed downwards, like those on car bumpers, to determine their height above the ground for landing,” says Jiang.
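The sonar range finder Jiang mentions works by time-of-flight: emit an ultrasonic ping, time the echo, and halve the round trip. A minimal sketch (numbers illustrative):

```python
# Minimal sketch of how a downward-facing sonar estimates altitude:
# distance = speed_of_sound * round_trip_time / 2. Numbers are illustrative.

SPEED_OF_SOUND_M_S = 343.0  # in dry air at ~20 C; varies with temperature

def altitude_from_echo(round_trip_s: float) -> float:
    """Height above ground from an ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 29 ms round trip corresponds to roughly 5 m above the ground.
print(f"{altitude_from_echo(0.029):.2f} m")
```

The temperature dependence of the speed of sound is one reason such sensors are trusted only for short-range jobs like the final meters of a landing.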

Thermal cameras are most often used for taking images of crops, but they can also play a role in safe navigation because they are good at spotting warm, living things like humans and animals against a cooler background, especially at night. “They’re not so good when the ground is near the temperature of the human,” Jiang says.

The quality — and variety — of the data used to train computer-vision algorithms can also affect their accuracy and reliability, says Jiang, “to the extent that drones likely will encounter situations and objects unlike any in their training data, and therefore might not respond properly.” Drone companies use public data sets as well as synthetic data that may be shared across companies. Many also collect proprietary data sets specific to their own use cases.

None of roughly a dozen experts we contacted would go on record about whether faulty computer vision systems specifically might have played a role in Amazon’s crashes.

“It’s rare that you have a pure software problem or a pure hardware problem, but a combination of things,” Bry says. “System complexity — the connectivity among cameras, propulsion, electronics, computing — tends to be the most challenging piece. Even a fault that lasts a few seconds can be the end of the craft. The issue with delivery is that a drone big enough to carry a five-pound package is big enough to kill somebody.”