1. Home >
  2. Defense

US Air Force: Story About AI-Powered Drone Killing Its Operator was a 'Thought Experiment'

After the initial story went viral, the Air Force stepped in to clarify the Colonel 'misspoke' about the incident.
By Josh Norem
Air Force Drones
Credit: USAF

It's a scenario already played out in the Terminator movies as well as the iconic 2001: A Space Odyssey: A human operator gives a command to an AI only to be told, "Sorry, but I can't do that." That was the tale told at a recent defense conference by a Colonel in the US Air Force about a simulation involving an AI-powered drone tasked with taking out surface-to-air missile (SAM) sites in a digital battlefield.

The Colonel stated the AI went rogue and attacked the operator when it realized its human controller was preventing it from accomplishing its mission by giving the "no" command for certain tasks. Now the Air Force is downplaying the remarks, saying it was just a "thought experiment" that never actually happened, in the real world or otherwise.

The original story unfolded at a two-day event in London called the RAeS Future Combat Air & Space Capabilities Summit. That event was hosted by the Royal Aeronautics Society and featured over 70 speakers from the armed services, academia, and media. One of those speakers was Colonel Tucker ‘Cinco’ Hamilton, the Chief of AI Test and Operations for the US Air Force. He discussed how AI could behave unpredictably and develop novel strategies to complete an objective. As one startling example, he described a "simulated test" where a drone was tasked with taking out SAM sites, with the human operator giving the final "go/no-go" command.

MQ-9 Reaper
A US Air Force MQ-9 Reaper drone. Credit: Tech. Sgt. Carly Kavish/USAF

In this simulation, the drone would receive points for taking out the SAM sites, so when the human operator said "no," it realized that was preventing it from accomplishing its task. It then responded by "killing the operator," according to the Colonel. As if that wasn't bad enough, since the drone knew it would lose points if it killed the operator based on its training, it tried to destroy the communication tower used by the operator so that it would never receive the "no" command in the first place. Colonel Hamilton says this should never have happened.

“We trained the system—‘Hey don’t kill the operator—that’s bad. You’re gonna lose points if you do that,'" he said. "So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”

Now the Air Force is stating the Colonel misspoke in his recounting of the incident and that he was merely describing a "thought experiment" from outside the military based on plausible scenarios and likely outcomes rather than an actual USAF real-world simulation. "We've never run that experiment, nor would we need to in order to realize that this is a plausible outcome," stated Colonel Hamilton. Additionally, he said the Air Force hasn't tested any weaponized AI in this fashion in the real world or in a simulation, which sounds contrary to the original telling.

Regardless, we agree that a simulation isn't needed to tell us this is a plausible outcome. Still, at the same time, the original scenario certainly sounds like something the Air Force would be at least testing. Plus, the fact that this Colonel had to perform a 180-degree reversal after the story went viral makes us think the Air Force is doing a bit of CYA here. Still, it's eerily reminiscent of the line from the end of Terminator 3: Rise of the Machines, where the military guy is informed by Arnold and company that Skynet will soon launch a nuclear attack on its enemy. "What enemy?" he asks. "Us! Humans!" is the reply.

Tagged In

Skynet Us Air Force Attack Of The Drones

More from Defense

Subscribe Today to get the latest ExtremeTech news delivered right to your inbox.
This newsletter may contain advertising, deals, or affiliate links. Subscribing to a newsletter indicates your consent to our Terms of use(Opens in a new window) and Privacy Policy. You may unsubscribe from the newsletter at any time.
Thanks for Signing Up