The week of February 1, 2015

Why the drone revolution can’t get off the ground

By Adam Rothstein

You’ve seen the prototype videos of drones delivering packages or humanitarian supplies, buzzing around sports fields and bicyclists, and maybe even participating in search-and-rescue missions. You can buy a drone for the price of a nice video camera, but when you buy one online, it still arrives by regular postal van. So what’s keeping drones out of the skies—or at least outside of the sunny shots of viral concept videos?

If you ask industry groups like the Small UAV Coalition, the main barrier is government regulation.

The Federal Aviation Administration (FAA) has reached its Congress-mandated 2015 deadline for making rules for drone use in the U.S. National Airspace (NAS), and we've yet to see what those rules will entail. In the meantime, only groups that have obtained one of two special certificates, or hobbyists who qualify under limits on aircraft weight, location, range of flight, altitude, and non-commercial status, may fly. In other words, companies looking to experiment with drone delivery currently sit grounded on their charging docks.

Of course, whatever the rules turn out to be, commercial interests will likely still be unhappy. Amazon has been especially vociferous, threatening to take its drone business to other countries if U.S. regulations remain unfavorable. Likewise, tech entrepreneur Marc Andreessen has called for dropping "all legal barriers to flying unmanned aerial vehicles," at least within certain areas, to encourage the sort of skittish investment that prefers to operate outside governmental safety and commercial rules.

But is it true that the government is standing in the way of a technological takeoff? Is the tech sitting ready on the launch pad, just waiting for some government bureaucrat to hit the big red button?

Unfortunately, while the technology is ready to leap into the sky, it is also ready to come plummeting back down. The hurdles are technical, not governmental. Drone technology can do amazing things, but even as it inspires us with potential, the current generation of drones has shown us just how far they are from living up to our science-fiction dreams.

• • •

Drones are evolving like any new technology, and as with any evolving species, some branches will not live to reproduce. What makes this particular species of tech so vital to watch is that the drones that don't make the cut aren't just cast-off smartwatches and unappealing tablets languishing in our drawers and overstock warehouses.

Drones are aircraft. They are meant to fly through the national airspace with airliners and cargo jets, over the buildings, streets, power lines, and human heads below. In a 2012 report, the Government Accountability Office (GAO), the nonpartisan agency that audits and investigates on behalf of Congress, identified a number of problems that must be resolved before drones can join the NAS without significant issues. The NAS is a crowded place, filled with air traffic moving rapidly in three dimensions. That the United States' air traffic has the best safety record in the world is due to the FAA's diligence in safety rules, procedures, and inspections since the agency was established in 1958 and reorganized into its modern form in 1967, after a succession of earlier agencies and laws proved insufficient to keep the skies safe as aviation technology advanced. Tossing unpiloted aircraft into the mix is not something to be done lightly, especially as the technology evolves rapidly in its early days of adoption outside the highly regulated space of military aviation.

Military drones have had a less-than-perfect safety record, to say the least. The Washington Post recently reported that more than 400 large drones have crashed since 2001. And a 2006 FAA study of military drone accidents found that unpiloted aircraft suffer a "disproportionately large number of mishaps relative to manned aircraft."

The first major stumbling block to drones entering the NAS seamlessly, according to the GAO, is what aviation jargon calls "sense and avoid" (SA). Sense and avoid is what every pilot of a piloted aircraft learns to do: stay constantly aware through all of their senses, on the lookout for other aircraft or problems with the equipment, controls, and instruments. In a drone, all of these senses must be replicated for the pilot through sensors, which puts the controller at a degree of separation from their own senses and introduces new systems that can fail, leaving the controller senseless yet still nominally in control. For example, one of the first signs of engine trouble in a piloted aircraft is the sound of the engines. But pilots controlling an aircraft remotely cannot hear its engines unless microphones relay the sound to them on the ground. Additionally, studies have shown that compressing what would otherwise be multimodal sensory input (vision, hearing, and haptic feedback from the controls) into purely visual interfaces, as most drones do, puts the human brain at a disadvantage.

It is not enough to simply flash a warning on the screen when the operator is already watching so many visual instruments; taking full advantage of a human being's attention requires engaging more of the senses. But the cheaper the drone, the more limited the sensory input from the aircraft.

The second problem identified by the GAO is "command and control" (CAC). Command and control is a problem of the drone's own senses. All control of the aircraft passes from the controller to the drone by radio waves, and the GPS data that most drones rely on for automatic navigation also arrives by radio. If either of these transmissions fails, the drone reverts to its programming, cut off from any overriding human control. Whether it is the crowded 2.4 GHz frequency (the same as Wi-Fi) or a military satellite connection that breaks, a "lost link" with a drone produces a dangerous situation in an airspace built on human contact with air traffic controllers.
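Lost-link behavior amounts to a small state machine: as long as control packets keep arriving, the ground pilot has authority; once they stop for long enough, the aircraft falls back on whatever it was preprogrammed to do. A minimal sketch in Python (the three-second timeout and the mode names are invented for illustration, not taken from any real autopilot):

```python
class Drone:
    """Toy model of a lost-link failsafe. Purely illustrative: the
    mode names and the 3-second timeout are invented for this sketch,
    not taken from any real autopilot."""

    LOST_LINK_TIMEOUT = 3.0  # seconds without a control packet

    def __init__(self):
        self.last_packet_time = 0.0
        self.mode = "manual"

    def tick(self, now, packet_received):
        """Advance the failsafe logic by one control cycle."""
        if packet_received:
            self.last_packet_time = now
            self.mode = "manual"  # the ground pilot has authority
        elif now - self.last_packet_time > self.LOST_LINK_TIMEOUT:
            # Link is gone: the drone falls back on preprogrammed
            # behavior (here, flying home on its own), and no human
            # can override it until the link returns.
            self.mode = "return_to_launch"
        return self.mode
```

The unsettling part is the window after the timeout fires: the aircraft is still in the sky, executing code, while the air traffic controllers on the ground have no one to talk to.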


To fully grasp how big an issue SA and CAC are, consider the proposed task of an Amazon delivery drone. The drone is meant to solve the "last mile" problem in logistics: the most expensive part of the trip, which typically requires human workers to drive out in a truck, read an address, and carry the package to a doorstep. The most difficult part of this task for a human is the easiest part for a drone: interpreting an address and navigating to its coordinates is something a mapping program does with ease. The difficult part for the drone, on the other hand, is everything that is easy for a human in a truck: finding its way along navigational waypoints without hitting anything, dodging traffic, children, animals, and everything else that might make a person stutter-step in an urban environment.

Drones use systems called "simultaneous localization and mapping," or SLAM, to figure out where they are, what obstacles are nearby, and how to reach the next waypoint. GPS is the cheapest way of localizing, but it depends on a clear view of the GPS satellites, and it cannot detect obstacles in the drone's path. Obstacle detection requires adding visual cameras, infrared cameras, sonar, or laser ranging. But the more sensors a drone carries, the more it costs and the heavier it becomes, and complicated software is required to interpret the data and plot the drone's course. Hazards like power lines and tree branches, which a truck driver never has to worry about but which would be catastrophic for a small drone, might require a laser sensor costing tens of thousands of dollars.
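The gap between waypoint-following and obstacle avoidance can be shown on a toy grid: GPS supplies only a direction toward the waypoint, and any awareness of what is in the way has to come from extra sensing bolted on top. A deliberately crude sketch (real SLAM fuses camera or laser data into a map; nothing here resembles production code, and the sidestep rule is invented for illustration):

```python
def step_toward(pos, goal):
    """One grid step toward the goal: what pure GPS waypoint
    navigation gives you, i.e. direction but no obstacle awareness."""
    x, y = pos
    gx, gy = goal
    return (x + (gx > x) - (gx < x), y + (gy > y) - (gy < y))

def navigate(start, goal, obstacles, max_steps=50):
    """Greedy navigation with a crude 'sense and avoid' layer: a
    range sensor checks the next cell and sidesteps if it is blocked."""
    pos = start
    path = [pos]
    for _ in range(max_steps):
        if pos == goal:
            return path
        nxt = step_toward(pos, goal)
        if nxt in obstacles:  # the sensor reports an obstacle ahead
            x, y = pos
            # try a neighboring cell instead (invented detour rule)
            for cand in ((x, y + 1), (x, y - 1), (x + 1, y), (x - 1, y)):
                if cand not in obstacles and cand != pos:
                    nxt = cand
                    break
        pos = nxt
        path.append(pos)
    return path
```

Without the sensing layer, the drone flies straight through whatever sits between it and the waypoint, whether that is a power line, a tree branch, or a person; and even this toy detour logic assumes a sensor good enough to register the obstacle at all.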

Currently, most drones deal with these difficulties by having a human pilot constantly monitor the flight path, either by keeping the drone directly within line of sight or by watching through its camera. This does not remove the human from the loop—in fact, it keeps them in it. It is a "modification, rather than an elimination of the role of humans," in the words of the FAA's researchers.

Keeping humans in the loop doesn't remove drones' problems—it compounds them. Human factors are the third item on the GAO's list, and a 2004 FAA study attributed over 50 percent of drone accidents to them. We are just not that good at controlling drones, and drones are still largely incapable of flying without our help. We tend to think of algorithmic control as a magic bullet, stabilizing everything from our Web searches to our cars' fuel efficiency. But bugs occur all the time; thankfully, in most situations they just don't cause catastrophic failures. Most—not all.

Consider this story, recounted in an FAA study: A Global Hawk, an aircraft programmed to fly fully automated by waypoint from taxi to landing, was given an instruction to always maintain a speed above 155 knots when changing altitude, to prevent stalling. To make the drone taxi down the runway, two waypoints were created that happened, according to their GPS coordinates, to sit at slightly different altitudes: a possibility the programmers had not foreseen. The aircraft, following its programming to the letter, throttled up to reach 155 knots, couldn't handle the turn it had been instructed to make, rolled off the runway, and crashed. The crash was ruled a human error, because the aircraft had done exactly what it was programmed to do, just not what the pilots expected. These sorts of bugs will eventually be discovered and fixed. But with such a wide variety of drone software being placed in the hands of so many new operators without training or testing, the opportunities for such accidents in the NAS, where the safety stakes are highest, are limitless.
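The failure reads like a unit test waiting to be written. A toy re-enactment of the rule as described (the 155-knot figure comes from the account above; the taxi speed and altitude numbers are invented):

```python
TAXI_SPEED = 15        # knots; a plausible ground speed, invented here
MIN_CLIMB_SPEED = 155  # knots; the rule: stay above this when changing altitude

def taxi_speed_between(wp_a, wp_b):
    """Choose a speed between two waypoints, applying the climb-speed
    rule exactly as written. Waypoints are (x, y, altitude_ft)."""
    changing_altitude = wp_a[2] != wp_b[2]
    if changing_altitude:
        # The software cannot tell a two-foot GPS altitude difference
        # from a genuine climb, so it throttles up, far too fast for
        # a turn on the ground.
        return MIN_CLIMB_SPEED
    return TAXI_SPEED
```

Two taxi waypoints whose recorded GPS altitudes differ by even a couple of feet trip the rule and command 155 knots on the runway: exactly what the program specified, and exactly what nobody intended.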

The human interfaces for drone control are complicated: this is no video game. In fact, most drones require several humans to operate them. Studies have shown that in emergency situations, two humans per unpiloted vehicle is the minimum safe ratio, and the number required climbs from there with the complexity of the situation and the particular drone. Even with limited, highly automated systems, another study found that operators could barely manage two unmanned systems at once, because maintaining operational control demands a sustained, high level of visual and cognitive attention.

The fourth GAO bullet point is the unreliable performance of drones, and the fifth is the lack of technical and operational standards for them; the two issues go hand in hand. The drones that are less complex than military systems, cheaper, and most likely to kickstart the new "drone economy" the Small UAV Coalition favors do not meet the high standards of aerospace engineering, training, and maintenance. The small, rapid movements of a disruptive startup simply do not mesh with the established protocols of aerospace companies like Boeing and Lockheed.

For small aircraft like drones, many of the relevant aeronautical issues are new and unresearched. Small drones, for example, are susceptible to turbulence in ways that larger, fixed-wing craft are not, and bringing a small aircraft into an urban environment means flying in conditions no pilot has ever flown in. When the U.S. Army Research Lab studied the effects of turbulence on small drones in an urban environment, the researchers discovered not only that the micro-turbulence was unlike anything typically modeled for piloted aircraft but also that their dataset, one of the most extensive ever created, was still measured at a resolution coarser than the smallest drones. To solve these problems, a drone company would likely need a research and development effort as large as any major commercial jet program. The field of micro-aeronautics today is almost wholly uninvented.

To launch a second fleet of aircraft into the NAS at a time of extreme evolution in the technologies that would sustain it is to put untested, unproven technology into conflict with current air traffic and the lives it safely carries. The reason the FAA currently has no rules for drones (the sixth GAO hurdle) is that the technology is less than 20 years old. Consumer GPS only became mass-market in the early 1990s; Wi-Fi followed in the late 1990s, and hobbyist quadrotors in the early 2000s. Satellite bandwidth that could support drone CAC phased in around the same time. The microelectronic sensors and computers that keep drones in the air are artifacts of the smartphone market, itself less than 10 years old. Combining all of these systems into an aircraft platform is not just a new market waiting to blossom but a host of untested issues entering the NAS at the very moment its air-traffic-control system is upgrading its own outdated technology in a long-planned process called "NextGen," which is the GAO's seventh hurdle.

There are other, non-aviation-related issues that drones must overcome as well. If drones are collecting vast amounts of data from public spaces, what will the privacy rules for those spaces be? Current U.S. legal precedent holds that there is a reduced expectation of privacy from helicopter surveillance at 400 feet. But many drones fly below that altitude, can look into windows as easily as onto open spaces, collect data with more than just visual cameras, and can collate it all into large databases. No U.S. agency currently has a mandate to make privacy rules for drones, even as other countries adopt much stricter ones.

The public reaction to drones is also unresolved. Despite the enthusiasm of some, many people do not like drones and are as likely to shoot at them as to accept packages from them. The link between commercial drones and the military drones they derive from is still fresh in the public mind—and with good reason. From an aviation perspective, a quadrotor and a Predator are miles apart. But from a data perspective they are very similar, and police interest in drones, like the military's use of small drones, proves it.

• • •

Drones are a new technology, and they come freighted with human dreams and aspirations. We want flying robots to bring us the things we buy online, to magically save people in search-and-rescue missions, and to protect us from harm. But drones are not ready to fulfill most of those dreams.

The best thing that drones have going for them is the diversity of their technology, and the diversity of tasks in which they might play a role. Drones’ hardware has advanced by leaps and bounds over the last 20 years, and it will continue to do so, creating cheap aerial platforms for cameras and other sensors and payloads. They will find a place in society, along with our other technology, but not immediately, and not without years of design and technical evolution.

In the meantime, they fly through our dreams.

Adam Rothstein writes about politics, media, art, and technology wherever he can get a signal. Drone, his installment in Bloomsbury’s academic Object Lessons series, is now available.  

Illustration and GIF by J. Longo