'Siri, Kill That Guy': Drones Might Get Voice Controls

A Predator ground control station. Photo: U.S. Air Force

The next decade could see a huge shift in the way armed drones and their human controllers interact, with potentially profound effects on future battlefields. At the heart of this change: two-way voice controls for autonomous systems, just like your iPhone's Siri app. Also, vibrating controls like an Xbox controller. A drone operator could literally talk to a drone, the drone could talk right back, and it could even alert its human operator with a sensation similar to touch.

Today, human drone operators rely on clunky interfaces composed of computer screens, keyboards and joysticks to steer their robot charges, which might be thousands of miles from their virtual cockpits. The operator's input is limited to keystrokes plus mouse and joystick movements transmitted via satellite. The drone responds solely with streams of data or visual images from its onboard cameras. "It's a desktop-type environment similar to an office," explains Mike Patzek, a senior engineer working for the Air Force Research Laboratory in Ohio.
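For a sense of how narrow that channel is, here's a minimal sketch in Python of the kind of one-way loop Patzek describes. The `link`, `StickInput` and `Telemetry` names are invented for illustration, not part of any real ground-station software:

```python
# Hypothetical sketch of today's one-way control loop: operator input goes
# out as command packets over the satellite link, and the drone answers
# only with telemetry and imagery. No name here maps to a real system.
from dataclasses import dataclass

@dataclass
class StickInput:
    pitch: float     # joystick forward/back, -1.0 to 1.0
    roll: float      # joystick left/right, -1.0 to 1.0
    throttle: float

@dataclass
class Telemetry:
    altitude_m: float
    heading_deg: float
    frame: bytes     # compressed camera imagery

def control_cycle(link, stick: StickInput) -> Telemetry:
    """One round trip: send stick positions, read back sensor data."""
    link.send({"pitch": stick.pitch, "roll": stick.roll,
               "throttle": stick.throttle})
    # The drone never 'speaks' -- it only streams data back.
    return Telemetry(**link.receive())
```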

In the next decade or so, much more sophisticated controls -- what the Air Force calls "man-machine interfaces" -- could replace the desktops, Patzek tells Danger Room. In addition to the Siri-style two-way voice exchange, Patzek says the next-gen controls could include smarter, easier-to-interpret computer displays and tactile feedback from the drone to the operator, much in the way an Xbox controller vibrates to alert a player he's taking damage within a game.
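What would it take to close that loop with speech and touch? A crude sketch, assuming a keyword-spotting speech front end and injected `speak` and `vibrate` helpers; every intent and function name below is hypothetical:

```python
# A minimal sketch of a Siri-style command channel: speech-to-text
# transcripts are matched against known intents, the drone's replies go to
# a speech synthesizer, and warnings trigger vibration -- like an Xbox
# controller rumbling when a player takes damage. All names are invented.
INTENTS = {
    "evasive action": "EVADE",
    "inspect target": "INSPECT",
    "return to base": "RTB",
}

def parse_command(transcript: str) -> str | None:
    """Map a speech-to-text transcript onto a drone command, if any."""
    text = transcript.lower()
    for phrase, command in INTENTS.items():
        if phrase in text:
            return command
    return None

def handle_drone_message(msg: dict, speak, vibrate) -> None:
    """Route the drone's replies: speech for questions, touch for alerts."""
    if msg["kind"] == "alert":
        vibrate(intensity=1.0)   # rumble the controller or shake the seat
    speak(msg["text"])           # two-way voice: the drone talks back
```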

Imagine an Air Force drone operator sitting in front of a single, large computer screen elegantly displaying select data from the distant robot in an intuitive graphical format -- say, bits of information laid over a hyper-realistic three-dimensional moving picture stitched together from multiple visual and infrared sensors. The operator simply sits and watches until the robot literally asks for advice, perhaps on which suspicious objects -- as determined by its sensors and algorithms -- to check out more closely.

At that point the human 'bot-wrangler states his recommendation and the drone swoops down to do its master's bidding. If the robot detects incoming enemy gunfire, it alerts its boss by causing his chair to shake. The operator can call out, "Evasive action!" and the drone banks sharply.
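Strung together, the scenario might reduce to an event loop like this hypothetical sketch, in which the drone initiates the conversation and the operator answers by voice; every event name and helper here is invented for illustration:

```python
# Hypothetical supervisory loop for the scenario above: the drone asks,
# the operator answers by voice, and threats arrive as haptic alerts.
def supervise(drone, speak, listen, shake_chair):
    for event in drone.events():          # drone-initiated, not polled
        if event.kind == "advice_request":
            # The robot literally asks which suspicious object to check out.
            speak(f"Tracking {len(event.objects)} suspicious objects. Which one?")
            reply = listen()              # e.g. operator says "2"
            drone.inspect(event.objects[int(reply)])
        elif event.kind == "incoming_fire":
            shake_chair()                 # tactile warning, no screen needed
            if "evasive" in listen().lower():
                drone.bank_sharply()
```

The point of the sketch: the human never flies the aircraft stroke by stroke. He or she only answers questions and reacts to alerts, which is exactly the supervisory role Draper describes below.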

That's just one hypothetical scenario. Exactly how the interfaces evolve depends on the progress of the Air Force's research -- and its funding. This year the flying branch's enthusiasm for advanced drones has cooled somewhat. The Air Force has reduced its purchases of state-of-the-art MQ-9 Reapers and canceled development of its ambitious MQ-X jet-powered attack drone. Still, no one disputes that flying robots will play an important role in future U.S. air power.

"The fundamental issue is that the [robotic] systems are going to be more capable and have more automation," says Mark Draper, a research psychologist for the Air Force Research Laboratory. "The trick is, how do you keep the human who is located in a different location understanding what that system is doing, monitoring and intervening when he or she needs to?"

Perhaps the same way people communicate with each other. By using touch ... and their voices.