New features include the creation of virtual environments that match real-world gunnery test courses.

The US Army Research Laboratory (ARL), US Army Tank Automotive Research, Development and Engineering Center (TARDEC), DCS Corp., and Naval Surface Warfare Center Dahlgren Division (NSWCDD) worked together to advance the capabilities of a software-in-the-loop (SIL) simulation environment in support of the larger TARDEC-Wingman Joint Capabilities Technology Demonstration (JCTD).

Figure: Wingman SIL's detailed software connections

The Wingman program began in fiscal year 2014 to advance robotic technology and experimentation that increase the autonomous capabilities of manned and unmanned combat-support vehicles. A major goal of the program is to advance manned-unmanned teaming initiatives by iteratively defining and narrowing the gap between autonomous vehicle control and the required level of human interaction. Outcomes of the joint effort to develop this SIL support the design of a robotic-system user interface and enhance communication among manned-unmanned team members, both of which are critical to achieving the Training and Doctrine Command 6+1 required capabilities for robotics and autonomous systems.

The Army's Robotic Wingman program currently pairs a single manned M1151 High Mobility Multipurpose Wheeled Vehicle (HMMWV) with a single unmanned robotic M1097 HMMWV in a joint gunnery task. The manned-vehicle crew comprises a driver; a commander; a gunner, who is also responsible for target detection and lasing for the unmanned vehicle; a robot-vehicle operator, who monitors or controls mobility; and a robot-vehicle gunner, who monitors and assists with target acquisition and firing. The project's main goal at present is to attain direct-fire weapon proficiency by delivering fire on targets and qualifying under the Table VI guidelines on gunnery-target ranges, as described in the US Army Training and Doctrine Command's Training Circular 3-20.31 (TRADOC 2015). Future work envisions the single manned vehicle working cooperatively with multiple unmanned vehicles, supporting manned-unmanned teaming (MUM-T) initiatives in complex, uncertain environments.

The current software includes the Robotic Technology Kernel (RTK) for autonomous mobility, the Autonomous Remote Engagement System (ARES) for autonomous targeting and weapons-system control, and the Wingman Warfighter Machine Interface (WMI), which provides individually customized interactive displays for the Wingman commander, robot-vehicle driver, and robot-vehicle gunner. The accompanying figure depicts the detailed software connections and the required integration with two simulation systems: the Unity3D game engine and Quantum Signal's Autonomous Navigation and Virtual Environment Laboratory (ANVEL).
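The division of labor among these components can be illustrated with a small sketch. The publish/subscribe bus, topic names, and message fields below are purely hypothetical (the article does not describe the actual middleware), but they show in miniature how RTK mobility data and ARES targeting data might fan out to role-specific WMI displays:

```python
# Hypothetical sketch only: a minimal topic-based message bus connecting
# RTK, ARES, and WMI stand-ins. Topic names and payload fields are invented
# for illustration and do not reflect the program's real interfaces.
from collections import defaultdict

class MessageBus:
    """Minimal publish/subscribe hub connecting simulated components."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()

# Each WMI display subscribes only to the topics its crew role needs.
commander_view, gunner_view = [], []
bus.subscribe("rtk/pose", commander_view.append)           # mobility status
bus.subscribe("ares/target_track", commander_view.append)  # targeting data
bus.subscribe("ares/target_track", gunner_view.append)

# RTK and ARES publish their outputs; the bus fans them out to displays.
bus.publish("rtk/pose", {"x_m": 10.0, "y_m": 4.2, "heading_deg": 87.0})
bus.publish("ares/target_track", {"target_id": 1, "range_m": 850.0})

print(len(commander_view))  # 2: pose + target track
print(len(gunner_view))     # 1: target track only
```

A decoupled routing layer of this kind is one plausible way to let each display stay individually customized while all components share the same data streams.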

ANVEL was developed as a simulation tool for studying robotic assets in varied environments with a variety of sensors. Integration with the RTK vehicle-mobility software was achieved through ANVEL's plugin interface, which supports rapid testing of current and potential mobility capabilities with minimal integration effort. The Unity3D game engine was integrated into the SIL because it provides a customizable, realistic virtual environment that supports complex interactions with terrain and dynamic events: it stimulates the ARES sensors (e.g., with camera and laser range finder [LRF] data), actuates ARES output (e.g., weapon commands), and simulates physical effects such as wind and bullet fly-outs. All software, including video output from the simulation systems, is used to update the information on the different WMI displays.
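The data exchange this federation implies can be sketched as a single simulation tick. All classes and values below are invented stand-ins (neither ANVEL nor Unity exposes this API); the sketch only shows the loop in which the mobility simulator updates the vehicle pose and the environment simulator uses that pose to stimulate a sensor:

```python
# Hypothetical sketch only: one federation loop between an ANVEL-style
# mobility simulator and a Unity-style environment. Classes, fields, and
# numbers are illustrative assumptions, not the real SIL interfaces.

class MobilitySim:
    """Stands in for ANVEL + RTK: integrates simple 1-D vehicle motion."""
    def __init__(self):
        self.x_m = 0.0

    def step(self, speed_mps, dt_s):
        self.x_m += speed_mps * dt_s
        return {"x_m": self.x_m}

class EnvironmentSim:
    """Stands in for Unity: uses the pose to simulate an LRF return."""
    TARGET_X_M = 100.0  # fixed target placed in the virtual scene

    def sense(self, pose):
        # Laser range finder: distance from the vehicle to the target.
        return {"lrf_range_m": self.TARGET_X_M - pose["x_m"]}

mobility = MobilitySim()
environment = EnvironmentSim()

# Federation loop: mobility updates the pose, the environment stimulates
# the sensors, and (in the real SIL) ARES would consume the sensor data.
for _ in range(10):
    pose = mobility.step(speed_mps=5.0, dt_s=0.1)
    sensors = environment.sense(pose)

print(pose["x_m"])             # 5.0 m traveled after 10 ticks
print(sensors["lrf_range_m"])  # 95.0 m remaining to the target
```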

Combining the two simulation packages allowed the SIL to exploit the strengths of each without developing duplicate capabilities. ANVEL's main strength is its ability to accurately simulate the dynamics of the robotic vehicle and all of its sensors in real time. Unity's strengths are its flexibility in adding elements and scenarios to a scene, its high-quality video rendering for target tracking and acquisition, and its ability to incorporate dynamic, customizable interactions with the virtual environment. Adding elements such as weapon fire to ANVEL's physics simulation would have required extensive modifications, and adding the necessary robotic sensors and vehicle dynamics to the Unity simulation would have required developing or integrating new systems; hence, federation was the ideal approach.

This work was done by Kristin E. Schaefer, Ralph W. Brewer, E. Ray Pursel, Anthony Zimmermann, and Eduardo Cerame for the Army Research Laboratory. ARL-0210