Teams of robots and humans are envisioned for use in defense applications and in search-and-rescue operations.

A human-robotic system is under development that can map an unknown environment, as well as discover, track, and neutralize several static and dynamic objects of interest. In addition, the robots can coordinate their individual tasks with one another without overly burdening a human operator. The testbed utilizes the Segway RMP platform, with lidar, vision, inertial, and GPS sensors. The software draws from autonomous systems research, specifically in the areas of pose estimation, target detection and tracking, motion and behavioral planning, and human-robot interaction.

The robot is based on the Segway RMP-50 platform, with modifications such as a steel space-frame, 16
Environmental sensing includes SICK and Hokuyo laser range finders, and three EO cameras. Localization sensing is provided by an inertial measurement unit (IMU), wheel odometry, and a dual GPS system. A distributed Real-time Data Distribution Network was developed, allowing accurate time synchronization of data for key algorithms such as sensor fusion. The pose (position and attitude) estimator uses a custom Bayesian filter, supplemented with lidar-based SLAM methods to ensure robust localization in complex terrain (e.g., during wheel slippage) and during indoor operations.
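The paper does not detail the custom Bayesian filter; a minimal sketch of the underlying idea, assuming a one-dimensional pose state, an odometry increment as the process input, and GPS as an absolute correction (all variable names and noise values here are illustrative, not from the testbed):

```python
def kf_predict(x, P, u, q):
    """Propagate a 1-D pose estimate x (variance P) by an odometry increment u.

    q is the process-noise variance added by dead reckoning.
    """
    return x + u, P + q

def kf_update(x, P, z, r):
    """Fuse an absolute position measurement z (e.g., GPS) with variance r."""
    k = P / (P + r)                      # Kalman gain
    return x + k * (z - x), (1.0 - k) * P

# Example: odometry drifts; periodic GPS corrections pull the estimate back.
x, P = 0.0, 1.0
for u, z in [(1.0, 1.1), (1.0, 2.0), (1.0, 2.9)]:
    x, P = kf_predict(x, P, u, q=0.1)
    x, P = kf_update(x, P, z, r=0.5)
```

The same predict/update structure generalizes to the full pose vector, and the lidar-based SLAM corrections mentioned above would enter as additional measurement updates when GPS is degraded.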

An estimate of the world state, defined as both the static environment and dynamic targets, is maintained locally as well as globally (collaboratively). Static terrain maps are developed using probabilistic grid fusion methods, extended to a collaborative, decentralized system. An important characteristic of the static mapping is maintaining both local and global representations of the maps for different functions, such as path and mission planning.
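Probabilistic grid fusion is commonly implemented with log-odds occupancy cells, where independent observations (including those from other robots) combine by simple addition; a minimal sketch under that assumption (the cell layout and measurement probabilities are illustrative):

```python
import math

def logodds(p):
    """Convert an occupancy probability to log-odds form."""
    return math.log(p / (1.0 - p))

def fuse(grid, cell, p_meas):
    """Bayesian update of one cell: add the measurement's log-odds."""
    grid[cell] += logodds(p_meas)

def to_prob(grid):
    """Convert log-odds cells back to occupancy probabilities."""
    return [1.0 / (1.0 + math.exp(-l)) for l in grid]

# Two robots' local maps combine by summing log-odds, assuming independent scans.
grid = [0.0] * 4            # log-odds 0 == prior p = 0.5 everywhere
fuse(grid, 1, 0.8)          # robot A observes cell 1 as likely occupied
fuse(grid, 1, 0.7)          # robot B agrees from its own scan
probs = to_prob(grid)
```

Because the update is a sum, a decentralized system can exchange compact log-odds increments rather than full maps, which suits the collaborative mapping described above.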

The problem of autonomous target identification is addressed using computer vision detection methods fused with lidar in a formal Bayesian tracking and classification estimator. The expert human operator validates the targets and removes false positives. Collaborative behaviors are managed at the global level, with connections to the Human-Machine Interface (HMI). A flexible interface has been designed that is capable of both Playbook- and tablet-style interaction.
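The classification side of such an estimator can be sketched as a Bayes update that fuses per-class likelihoods from the vision detector and the lidar, assumed conditionally independent (the class names and likelihood values below are hypothetical, not the testbed's):

```python
def bayes_classify(prior, vision_lik, lidar_lik):
    """Fuse independent vision and lidar likelihoods over candidate classes.

    Returns the normalized posterior probability for each class.
    """
    post = {c: prior[c] * vision_lik[c] * lidar_lik[c] for c in prior}
    z = sum(post.values())
    return {c: p / z for c, p in post.items()}

prior  = {"target": 0.5, "clutter": 0.5}
vision = {"target": 0.9, "clutter": 0.2}   # assumed detector likelihoods
lidar  = {"target": 0.7, "clutter": 0.4}   # assumed shape-based likelihoods
post = bayes_classify(prior, vision, lidar)
```

A low-confidence posterior is exactly the case the pipeline routes to the human operator, who validates the track or rejects it as a false positive.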

The robot is designed to mount various sensors, electrical components, two mini-ITX computers, and two LiPo batteries. Each robot is equipped with one SICK LMS-291 and one Hokuyo URG-04LX laser range finder, three PGR Firefly MV FireWire cameras, one MicroStrain Inertia-Link Inertial Measurement Unit (IMU), and two Septentrio AsteRX1 GPS receivers. In order to provide a more accurate timestamp for all of the sensors on the robot, Cornell Skynet's real-time data network technology is leveraged. Each sensor measurement packet obtains its timestamp from a microcontroller.