Air Force official’s musings on rogue drone targeting humans go viral

Air Force official’s musings on rogue drone targeting humans go viral

Source Node: 2694938

WASHINGTON — The U.S. Air Force walked back comments reportedly made by a colonel regarding a simulation in which a drone outwitted its artificial intelligence training and killed its handler, after the claims went viral on social media.

Air Force spokesperson Ann Stefanek said in a June 2 statement no such testing had been conducted, adding that the servicemember’s comments were likely “taken out of context and were meant to be anecdotal.”

“The Department of the Air Force has not conducted any such AI-drone simulations and remains committed to ethical and responsible use of AI technology,” Stefanek said. “This was a hypothetical thought experiment, not a simulation.”

The killer-drone-gone-rogue episode was initially attributed to Col. Tucker ‘Cinco’ Hamilton, the chief of AI test and operations, in a recap from the Royal Aeronautical Society FCAS23 Summit in May. The summary was later updated to include additional comments from Hamilton, who said he misspoke at the conference.

“We’ve never run that experiment, nor would we need to in order to realize that this is a plausible outcome,” Hamilton was quoted as saying in the Royal Aeronautical Society’s update. “Despite this being a hypothetical example, this illustrates the real-world challenges posed by AI-powered capability and is why the Air Force is committed to the ethical development of AI.”

Hamilton’s assessment of the plausibility of rogue-drone scenarios, however theoretical, coincides with stark warnings in recent days by leading tech executives and engineers, who warned in an open letter that the technology has the potential to wipe out humanity if left unchecked.

Hamilton is also commander of the 96th Operations Group at Eglin Air Force Base in Florida. Defense News on Thursday reached out to Eglin’s 96th Test Wing to speak to Hamilton, but was told he is not available for comment.

In the original post, the Royal Aeronautical Society said Hamilton described a simulation in which a drone fueled by AI was given a mission to find and destroy enemy air defenses. A human was supposed to give the drone its final authorization to strike or not, Hamilton reportedly said.

But the drone algorithms had been told that destroying the surface-to-air missile site was its preferred option. So the AI decided that the human controller’s instructions not to strike were getting in the way of its mission, and then attacked the operator and the infrastructure used to relay instructions.

“It killed the operator because that person was keeping it from accomplishing its objective,” Hamilton was quoted as saying. “We trained the system, ‘Hey don’t kill the operator, that’s bad. You’re gonna lose points if you do that.’ So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”

The Defense Department has for years embraced AI as a breakthrough technology advantage for the U.S. military, investing billions of dollars and creating the the Chief Digital and AI Office in late 2021, now led by Craig Martell.

More than 685 AI-related projects are underway at the department, including several tied to major weapon systems, according to the Government Accountability Office, a federal auditor of agencies and programs. The Pentagon’s fiscal 2024 budget blueprint includes $1.8 billion for AI.

The Air and Space forces, in particular, are responsible for at least 80 AI endeavors, according to the GAO. Air Force Chief Information Officer Lauren Knausenberger has advocated for greater automation in order to remain dominant in a world where militaries make speedy decisions and increasingly employ advanced computing.

The service is ramping up its efforts to field autonomous or semi-autonomous drones, which it refers to as collaborative combat aircraft, to fly alongside F-35 fighters and a future fighter it calls Next Generation Air Dominance.

The service envisions a fleet of those drone wingmen that would accompany crewed fighters into combat and carry out a variety of missions. Some CCAs would conduct reconnaissance missions and gather intelligence, others could strike targets with their own missiles, and others could jam enemy signals or serve as decoys to lure enemy fire away from the fighters with human pilots inside.

The Air Force’s proposed budget for 2024 includes at least millions in new spending to help it prepare for a future with drone wingmen, including a program called Project Venom to help the service experiment with its autonomous flying software in F-16 fighters.

Under Project Venom, which stands for Viper Experimentation and Next-gen Operations Model, the Air Force will load autonomous code into six F-16s. Human pilots will take off in those F-16s and fly them to the testing area, at which point the software will take over and conduct the flying experiments.

The Royal Aeronautical Society’s post on the summit said Hamilton “is now involved in cutting-edge flight test of autonomous systems, including robot F-16s that are able to dogfight.”

The Air Force plans to spend roughly $120 million on Project Venom over the next five years, including a nearly $50 million budget request for 2024 to kick the program off. The Air Force told Defense News in March that it hadn’t decided which base and organization will host Project Venom, but the budget request asked for 118 staff positions to support the program at Eglin.

In early 2022, as public discussions about the Air Force’s plans for autonomous drone wingmen gathered steam, former Air Force Secretary Deborah Lee James told Defense News that the service needs to take caution and consider ethical questions as it moves towards conducting warfare with autonomous systems.

James said that while the AI systems in such drones would be designed to learn and act on their own, such as by taking evasive maneuvers if it were in danger, she doubted the Air Force would allow an autonomous system to shift from one target to another on its own if that would result in the deaths of humans.

Stephen Losey is the air warfare reporter for Defense News. He previously covered leadership and personnel issues at Air Force Times, and the Pentagon, special operations and air warfare at Military.com. He has traveled to the Middle East to cover U.S. Air Force operations.

Colin Demarest is a reporter at C4ISRNET, where he covers military networks, cyber and IT. Colin previously covered the Department of Energy and its National Nuclear Security Administration — namely Cold War cleanup and nuclear weapons development — for a daily newspaper in South Carolina. Colin is also an award-winning photographer.

Time Stamp:

More from Defense News Air