The Army Research Laboratory is exploring new applications of AI designed to better enable forward-operating robot “tanks” to acquire targets, discern and organize war-crucial information, surveil combat zones and even fire weapons when directed by a human.
“For the first time, the Army will deploy manned tanks that are capable of controlling robotic vehicles able to adapt to the environment and act semi-independently. Manned vehicles will control a number of combat vehicles, not small ones but large ones. In the future we are going to be incorporating robotic systems that are larger, more like the size of a tank,” Brandon Perelman, Ph.D., scientist and engineer, Army Research Laboratory, Combat Capabilities Development Command, Army Futures Command, told Warrior in an interview at Aberdeen Proving Ground, Md.
The concept is aligned with ongoing research into new generations of AI being engineered not only to gather and organize information for human decision-makers but also to advance networking between humans and machines. Drawing upon advanced algorithms, computer technology can organize and disseminate otherwise disaggregated pools of data in seconds -- or even milliseconds. AI-empowered sensors can bounce incoming images, video or data off a seemingly limitless existing database to assess comparisons and differences and perform near real-time analytics.
At the speed of the most advanced computer processing, various AI systems can simultaneously organize and share information, perform analyses and solve certain problems otherwise impossible for humans to address within any comparable timeframe. At the same time, many key attributes, faculties and problem-solving abilities remain unique to human cognition. The optimal approach, according to Perelman, is to simultaneously leverage the best of both.
“We will use the power of human intelligence and the speed of AI to get novel interactions,” Perelman added.
This blending or synthesis of attributes between mind and machine is expected to evolve quickly in the coming years, increasingly giving warzone commanders combat-sensitive information much faster and more efficiently.
“You can take risks you would never do with a manned platform. A robotic system with weapons does not need to account for crew protection,” Perelman said.
For instance, a forward-operating robotic “wingman” vehicle could identify a target that might otherwise escape detection, and instantly analyze the data in relation to terrain, navigational details, previous missions in the area or a database of known threats.
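To make that cross-referencing step concrete, the following is a minimal, purely illustrative sketch in Python. It is not based on any actual Army system; the threat names, feature vectors and distance threshold are all invented for illustration. It shows the general lookup-and-score pattern an aided target recognition pipeline might use to compare a detected signature against a database of known threats.

```python
# Hypothetical sketch: matching a detected signature against a small
# known-threat database. All names, signatures and the threshold value
# are invented for illustration only.
from dataclasses import dataclass


@dataclass
class ThreatRecord:
    name: str
    signature: tuple  # simplified feature vector: (length, width, heat)


# Invented example database of known threat profiles.
KNOWN_THREATS = [
    ThreatRecord("tracked_vehicle", (7.0, 3.5, 0.9)),
    ThreatRecord("wheeled_transport", (6.0, 2.5, 0.6)),
    ThreatRecord("dismounted_group", (1.8, 0.5, 0.4)),
]


def match_threat(detected: tuple, threshold: float = 1.0):
    """Return the closest known threat, or None if nothing is close enough."""
    def distance(a, b):
        # Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    best = min(KNOWN_THREATS, key=lambda t: distance(detected, t.signature))
    return best if distance(detected, best.signature) <= threshold else None


# A reading near the tracked-vehicle profile matches; an unfamiliar one does not.
hit = match_threat((7.1, 3.4, 0.85))   # close to "tracked_vehicle"
miss = match_threat((20.0, 20.0, 5.0))  # matches nothing in the database
```

A real system would of course fuse many more sensor modalities and use learned classifiers rather than a fixed distance threshold; the sketch only illustrates the basic compare-against-known-profiles step described above.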
“You have an AI system that is not better than a human but different than a human. It might be faster and it might be more efficient at processing certain kinds of data. It will deal with threats in concert with human teammates that are completely different than the way we do things today,” Perelman said.
With these goals in mind, the ARL is now working on mock-up interfaces intended to go into the Army's emerging family of Next Generation Combat Vehicles. Smaller robots such as IED-clearing PackBots have existed for more than a decade; many of them have integrated software packages enabling various levels of "semi-autonomy," allowing them to perform a growing range of tasks without human intervention. Current ARL efforts venture well beyond these advances, engineering much greater levels of autonomy into much larger robots, such as those the size of tanks.
Bringing this kind of manned-unmanned teaming to fruition introduces new strategic and tactical nuances to combat, offering war commanders a wider and more immediate sphere of options.
“Commanders will be able to view a target through vehicle sensor packages, or if there is an aided target recognition technology or some kind of AI to spot targets, they might see battlespace target icons pop up on the map indicating the location of that target,” Perelman said.
AI-oriented autonomous platforms can greatly shorten sensor-to-shooter time, enabling war commanders to quickly respond to, and attack, fast-emerging moving targets or incoming enemy fire.
“Everything that a soldier does today ... shooting, moving, communicating ... will be different in the future because you do not just have human to human teammates, you have humans working with AI-teammates,” Perelman said.
Enabling robots to understand and properly analyze humans is yet another challenging element of this complex equation. “When you have two humans, they know when the other is cold and tired, but when you bring in an AI system you don’t necessarily have that shared understanding,” Perelman said.
Various kinds of advanced autonomy, naturally, already exist, such as self-guiding aerial drones and the Navy's emerging "ghost fleet" of coordinated unmanned surface vessels operating in tandem. Most autonomous air and sea vehicles confront fewer operational challenges than ground autonomy does. Ground warfare incorporates many fast-changing variables, terrain features and maneuvering enemy forces, at times to a greater degree than air and sea conditions, fostering a need for even more advanced algorithms in some cases. Nevertheless, the concepts and developmental trajectories of air, sea and ground autonomy have distinct similarities: each is engineered to operate as part of a coordinated group of platforms able to share sensor information, gather targeting data and forward-position weapons -- all while remaining networked with human decision-makers.