Members of CMU’s NREC set up equipment during the data collection event at Fort Hunter Liggett, Jan. 13. The partnership between CMU and the AITF focuses on modernizing the Army and its processes through AI by giving Soldiers the tools they need to succeed on the future battlefield.
by Patrick Ferraris
When thinking about modern warfare, it may be easy to conjure up mental images of Soldiers actively engaged on the battlefield: rounds firing, sweat dripping, and commands being directed with urgency. One might not, however, immediately think of the amount of preparation that goes into ensuring Soldiers are battlefield-ready. An essential element of this preparation is using gathered intelligence to properly define the battlefield environment, determine the threat, and then develop proper courses of action. Soldiers will always need this intelligence to succeed; what will change is how they acquire it, as modern technologies and capabilities develop rapidly to meet the demands of future warfare.
The Army’s Artificial Intelligence Task Force (AITF) is using its technical expertise and proficiency with emerging technology to work on a project that could radically transform how the U.S. military prepares for and conducts battlefield operations. It’s called Aided Threat Recognition from Mobile Cooperative and Autonomous Sensors (ATR-MCAS), and it was the project focus for the AITF and Carnegie Mellon University’s National Robotics Engineering Center (CMU NREC) team that participated in a data collection event at Fort Hunter Liggett, Calif., Jan. 13-17.
ATR-MCAS is an AI-enabled system of networked, state-of-the-art air and ground vehicles that leverages sensors and edge computing. The vehicles carry sensors that enable them to navigate within areas of interest and to identify, classify, and geo-locate entities, obstacles, and potential threats, reducing the cognitive load on Soldiers. The system can also aggregate and distribute the target data, which can then be used to make recommendations and predictions based on the combined threat picture.
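To make the idea concrete, the sketch below shows the kind of detection record such a networked sensor platform might publish. The field names and values are illustrative assumptions, not the actual ATR-MCAS message format.

```python
# Hypothetical sketch of a geo-located detection report from one sensor
# platform; fields are assumptions for illustration only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Detection:
    sensor_id: str        # which air or ground vehicle produced the report
    object_class: str     # e.g., "vehicle", "personnel", "obstacle"
    confidence: float     # classifier confidence, 0.0-1.0
    latitude: float       # geo-location of the detected entity
    longitude: float
    timestamp: datetime   # when the detection was made

# Example: a ground vehicle reporting a possible threat it has geo-located.
report = Detection(
    sensor_id="ugv-02",
    object_class="vehicle",
    confidence=0.87,
    latitude=35.975,
    longitude=-121.235,
    timestamp=datetime.now(timezone.utc),
)
print(report)
```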
“This project pushes the existing limits of artificial intelligence and machine learning used for image classification and autonomous navigation,” stated Lt. Col. Chris Lowrance, Autonomous Systems Lead with the AITF. “ATR-MCAS is different than existing autonomous system efforts because it is not limited to specific-use cases. It can be used to perform reconnaissance missions across the area of operations, or maintain a fixed position while performing area defense surveillance missions.” ATR-MCAS capabilities also extend to other ground warfare missions such as route reconnaissance, screening missions, or the verification of high-value targets.
This ability to adapt to multiple mission types increases situational awareness and speeds Soldiers’ decision-making. It also increases Soldier lethality and survivability by enabling Soldiers to find, identify, and track targets on the battlefield more swiftly.
Once the autonomous sensors identify threats, basic information about them is relayed back to Soldiers through a mobile ad hoc network. Threats are published to a common operating picture (COP) that provides an aggregated view of the battlefield. An AI-enabled decision support agent then processes this COP information and can make recommendations, such as prioritizing threats, for Soldiers to act on. This information sharing is achieved not through static data standards but through robust data mediation, which allows for greater synergy and improved interoperability across ground and air systems.
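As a rough illustration of what such a decision support agent might do, the sketch below ranks detection reports aggregated in a COP by a simple priority score. The scoring rule, threat weights, and field names are assumptions made for this example, not the fielded algorithm.

```python
# Hypothetical sketch of threat prioritization over an aggregated COP.

# Detection reports relayed over the network from multiple sensors.
cop = [
    {"id": "t1", "object_class": "vehicle",   "confidence": 0.91, "range_m": 800},
    {"id": "t2", "object_class": "personnel", "confidence": 0.64, "range_m": 300},
    {"id": "t3", "object_class": "obstacle",  "confidence": 0.88, "range_m": 150},
]

# Assumed relative threat weights per object class.
THREAT_WEIGHT = {"vehicle": 1.0, "personnel": 0.7, "obstacle": 0.2}

def priority(report):
    """Higher score = recommend attention first: confident, threatening, close."""
    weight = THREAT_WEIGHT.get(report["object_class"], 0.5)
    proximity = 1000.0 / max(report["range_m"], 1)  # closer threats score higher
    return report["confidence"] * weight * proximity

# Recommend a prioritized order for Soldiers to review.
for report in sorted(cop, key=priority, reverse=True):
    print(f'{report["id"]}: {report["object_class"]} (score={priority(report):.2f})')
```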
In applications such as ATR, data mediation focuses on shared awareness at the tactical edge, which is critical to obtaining accurate information on the threat or object of interest. Processing image data from many sensors with artificial intelligence and machine learning (AI/ML) techniques requires significant computational power at the tactical edge, but performing that processing close to the sensors gives Soldiers more immediate access to the data.
“Data collection events like this one are important because data is the precursor and an essential ingredient to building an AI/ML classification or prediction model,” Lowrance added. “The more opportunities we take to collect good, realistic data, the more effective our systems will be in identifying and classifying similar objects in the future.”
The data collected during this event will be used to train the system to recognize and classify objects in the field, improving its accuracy and usability for future operations. The collected images will be labeled as specific types of objects to further train the model to identify the same or similar objects of interest. Achieving greater shared awareness at the edge facilitates collaboration between sensors, systems, and Soldiers.
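The sketch below shows, in minimal form, how labeled imagery from a collection event could be used to fine-tune an image classifier, assuming the labeled images are organized into per-class folders. The directory layout, model choice, and hyperparameters are illustrative assumptions only, not the project’s actual training pipeline.

```python
# Hypothetical sketch: fine-tune a pretrained classifier on labeled field imagery.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.models import resnet18, ResNet18_Weights

# Collected images organized by label, e.g., data/vehicle/*.jpg, data/personnel/*.jpg
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("data", transform=preprocess)
loader = DataLoader(dataset, batch_size=16, shuffle=True)

# Start from a pretrained backbone and retrain the final layer for our classes.
model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```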
Developing intelligent and adaptive tools like this supports the Army’s modernization efforts and provides Warfighters with additional situational awareness, keeping them safer and enabling them to make smarter, more informed decisions on the battlefield.
Subscribe to Army AL&T News – the premier online news source for the Army Acquisition Workforce.