‘Collaborative, Portable Autonomy’ Is the Future of AI for Special Operations

For a future fight against a near-peer military, U.S. special operators say they need smart, networked sensors and drones that can work together in contested environments with little human supervision. But as “collaborative autonomy” comes within technical reach, just how independent should these things get?

“We are going to use a lot of sensors, whether they’re unmanned aerial systems, unmanned ground systems, unmanned maritime systems, unattended sensors, all working together, and what our goal is to have those working together collaboratively and autonomously,” SOCOM’s top acquisition executive, James Smith, said at NDIA’s SOFIC conference in Tampa, Florida, last week.

SOCOM has “a specific line of effort where we’re focused on what we’re calling ‘collaborative autonomy,’” said David Breede, who runs a program executive office at SOCOM. That “line of effort” is concerned with such questions as “How do I get an unattended ground sensor talking to an unmanned aircraft and having an unmanned aircraft react based on the information that it got from that unmanned ground sensor? Not only collaboration across technologies and capabilities, but collaboration across program offices, right?” Breede said. In other words, special operations forces need sensors on the ground, in the air, and in space constantly working together to autonomously detect changes and sound the alert.

But SOCOM doesn’t just want networks of sensors to collaborate better. It also wants the underlying autonomy software to work the same on everything from a 3-D printed drone to a $10,000 drone.

“We have a goal of what we’re calling ‘portable autonomy,’ so being able to port software, virtual algorithms across different classes” of small drones, Breede said. “We would have an autonomy developer actually have their software algorithms on a payload and then integrate that onto a third-party platform and demonstrate the ability to control that platform without talking to that third-party platform provider.”

Among the obstacles: battlefield radio communications are expected to become much more difficult. SOCOM’s Col. Paul Weizer said the command is trying to “untether” itself from the radio.

“So how do I operate completely in an untethered way, whether it’s with unattended ground sensors or whether it’s unmanned vehicles or otherwise?” he asked.

Part of the answer is to put more information and computing power on the battlefield instead of counting on being able to “reach back” for it. The military, and SOCOM in particular, have been trying to bring cloud capabilities much closer to the battlefield, an effort exemplified by the Army’s XVIII Airborne Corps working with Amazon Web Services to create a tactical cloud environment.

That will also make battlefield decision-making much easier, said Quentin Donnellan, the president of the Space and Defense division of AI company Hypergiant, which is working with AWS and the Army on the effort.

“If I turn on my radios, people are going to know where I’m at. So I don’t want to turn on my radios, right?” Donnellan said. “So the idea for these use-cases is ‘How do we deploy AI and machine learning out tactically where I can make those decisions’” in a communications-denied environment. “If I’ve got the tools that allow me to, like, leverage AI to put it out to the edge, I should be able to do my job even if my cloud connection is denied.”

One example of tactical cloud use is integrating radar sensor data for air defense in the field, closer to the threat, rather than waiting for an alert from a headquarters. “That’s kind of a really tactical and specific example of, ‘Hey, if you deploy AI out there, [you could] potentially leverage weather or ground-based radar to be able to do things like object detection and classification, but not relying on the connectivity back down,’” Donnellan said.

Shield AI co-founder Brandon Tseng said his company, known for drones that navigate without GPS, is working with SOCOM on “portable autonomy” to operate ever-larger drone swarms. Since 2018, Shield AI has been developing a software-based autonomy product called Hivemind for drone piloting; the company is integrating it onto V-BAT drones to develop swarming and maritime domain awareness capabilities.

The company is working closely with the U.S. military to figure out how to penetrate enemy air defenses with drone swarms, he said. “Something that we’re super excited about is operationalizing swarms of three V-BAT aircraft in 2023, four craft in [20]24, eight aircraft in [20]25, and 16 aircraft in [20]26 that are working as a highly intelligent team together…. I think it’s adjacent to where SOCOM is and it definitely plays into their interests. But we’re also integrating it on fighter jets and we expect to have it running on an F-16 later this year.”

But the technology aspect of portable, collaborative autonomy isn’t actually the hardest part of the challenge; the larger policy and ethics questions are.

Take the Switchblade from AeroVironment, the small kamikaze drone that has helped the Ukrainian military push back Russian forces. The drone sends video directly to a nearby operator, so the feed doesn’t have to travel long distances over radio links.

Brett Hush, vice president of tactical mission systems at AeroVironment, said his company is experimenting with artificial intelligence for automatic target recognition. “Those capabilities are in development. We’ve demonstrated with the DOD our ability to do that to identify like… 32 tanks” and potentially strike them with no need for communication with an operator.

“Now, fielding that capability is where we’re gonna cross, you know, policy,” he said. “Today, everything that’s done with our loitering missiles, there’s a man on the loop. Once we go to field the [automatic target recognition] and with more autonomy, we’ve got to really, as a country, think through where would that be allowed and not allowed.”