Current chemical-protection gear for warfighters on the ground inhibits electronic communication via keyboards, cell phones, and remote-control devices. To improve communications for warfighters wearing protective gear in hazardous environments, a series of eGloves has been developed to free the warfighter from the need to type on a keyboard while wearing a Mission-Oriented Protective Posture (MOPP) suit. The eGloves can transmit hand and finger gestures from within the protective gear, or they can transmit encoded ASCII characters.
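The brief does not specify the eGlove's character code table, but the idea of transmitting encoded ASCII characters from finger postures can be sketched as a simple lookup. The 5-tuple of flex-sensor states and the specific posture-to-character assignments below are hypothetical, for illustration only:

```python
# Hypothetical sketch: encoding finger postures as ASCII characters.
# Each key is a 5-tuple of flex-sensor states for thumb..pinky
# (1 = extended, 0 = flexed). The mapping itself is illustrative,
# not the eGlove's actual code table.
GESTURE_TO_ASCII = {
    (0, 1, 0, 0, 0): "A",   # index finger extended
    (0, 1, 1, 0, 0): "B",   # index and middle extended
    (1, 1, 1, 1, 1): " ",   # open hand -> space
}

def decode_posture(flex_states):
    """Return the ASCII character for a finger posture, or None if unmapped."""
    return GESTURE_TO_ASCII.get(tuple(flex_states))

# A sequence of postures decodes to a character stream:
message = [decode_posture(p) for p in
           [(0, 1, 0, 0, 0), (1, 1, 1, 1, 1), (0, 1, 1, 0, 0)]]
```

In practice the glove CPU would also need debouncing and posture-hold timing so that transitions between postures are not decoded as spurious characters.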
Gesture-Based Sensor-Information Fusion (GBSIF) refers to the fusing of sensor data collected from the environment with data from motion sensors on the eGlove. The eGlove features a CPU that fuses hand and finger motions and positions into gestures; the same CPU can fuse additional data from the environment. In GBSIF, the operator transports the sensor array but, apart from the sensors mounted on the eGlove, takes no active role in selecting which sensors participate in the fusion or which target subjects the data are collected about.
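One way to picture the fusion step on the glove CPU is timestamp alignment: each recognized gesture event is paired with the environmental reading closest to it in time. This is a minimal sketch under that assumption; the record layout and nearest-timestamp pairing are illustrative, not the brief's actual fusion algorithm:

```python
from bisect import bisect_left

def fuse_streams(gesture_events, env_readings):
    """Pair each (timestamp, gesture) event with the environmental
    (timestamp, value) reading nearest in time.

    Both inputs are assumed sorted by timestamp (seconds)."""
    env_times = [t for t, _ in env_readings]
    fused = []
    for t, gesture in gesture_events:
        i = bisect_left(env_times, t)
        # The nearest reading is either just before or just after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(env_readings)]
        j = min(candidates, key=lambda k: abs(env_times[k] - t))
        fused.append((t, gesture, env_readings[j][1]))
    return fused
```

Real fusion would weigh sensor latency, confidence, and units, but the pairing step above is the core of combining two asynchronous data streams on one node.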
Data are collected from the environment and from the glove sensors, and these data can be fused and integrated on a network node other than the user's. Thus, gesture sensor data and environmental data are collected, fused, and integrated where appropriate. However, the gestures themselves are not the primary driving force in selecting information sources and controlling the fusion process.
In contrast, Gesture-Directed Sensor-Information Fusion (GDSIF) includes GBSIF but extends it to the active participation of the eGlove operator to initiate sensor-information fusion. The concept of operation of the GDSIF is that the warfighter would point to a platform or another object in the battlespace using a gesture while wearing a GDSIF-equipped eGlove. The eGlove would be linked to reference sensors to determine orientation and azimuth of the operator’s arm. The eGlove also would use GPS to determine the operator’s geographic location. Gestures would cue sensors to send their data to the eGlove, where these data would be fused with the gesture that prompted the data collection. Fusion would be accomplished in the CPU mounted on the eGlove.
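The designation step (combining the operator's GPS position with the arm's azimuth to decide which sensor is being pointed at) can be sketched as a bearing match. The sketch below simplifies to two dimensions, expresses sensor positions as local east/north offsets in meters from the operator, and uses a 15-degree tolerance; the sensor names and tolerance are illustrative assumptions, not values from the brief:

```python
import math

def designate_sensor(arm_azimuth_deg, sensors, tolerance_deg=15.0):
    """Return the name of the sensor whose bearing from the operator
    best matches the arm azimuth, or None if none is within tolerance.

    sensors maps name -> (east_m, north_m) offset from the operator,
    as would be derived from GPS positions of operator and sensors."""
    best_name, best_err = None, tolerance_deg
    for name, (east, north) in sensors.items():
        # Bearing measured clockwise from north, as a compass azimuth.
        bearing = math.degrees(math.atan2(east, north)) % 360.0
        # Smallest angular difference, wrapped to [-180, 180].
        err = abs((bearing - arm_azimuth_deg + 180.0) % 360.0 - 180.0)
        if err <= best_err:
            best_name, best_err = name, err
    return best_name

sensors = {"chem-01": (100.0, 0.0),   # 100 m due east of the operator
           "rad-02":  (0.0, 250.0)}   # 250 m due north
```

A fielded system would add elevation from the reference sensors on the arm and resolve ambiguity when two sensors lie along nearly the same bearing, but the azimuth match above is the essential geometry.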
Simple gestures can be used to communicate information to improve situational awareness, send commands to personnel and to robots, and send commands to CBRN and other sensors in the battlespace. For example, the gesture used most often for information fusion would be pointing at a sensor-data source in the battlespace with the index finger extended and the other fingers touching the palm (to distinguish it from similar gestures that use the whole hand to point). This pointing gesture, when recognized, would signal the sensor and trigger a data stream or a single reading from the designated sensor to the local common-data backbone. Successful transmission from the sensor would trigger haptic feedback on the operator's glove, indicating that the data set has been sent to the network. Continuing the example, the warfighter could repeat the pointing process with a second sensor and then make a second gesture; for example, a fist with the arm held straight down would trigger a pre-determined sensor-information fusion process. Using the fusion-fist gesture in this manner would distinguish it from other gestures that employ a closed fist with the arm extended, which in some command contexts means "stop." It also would avoid confusion with gestures in which the fist is held close to the chest.
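The gesture sequence described above (point at a first sensor, receive haptic confirmation, point at a second sensor, then make the fusion-fist) amounts to a small state machine. This sketch assumes gesture recognition and sensor addressing already exist; the gesture labels, class name, and haptic callback are illustrative placeholders:

```python
class GestureFusionController:
    """Tracks pointing designations and triggers fusion on the fist gesture."""

    def __init__(self, haptic_ack):
        self.designated = []
        # Called with the sensor name once its data reach the network,
        # standing in for the glove's haptic-feedback actuator.
        self.haptic_ack = haptic_ack

    def on_gesture(self, gesture, target=None):
        if gesture == "point" and target is not None:
            self.designated.append(target)   # cue the sensor to send its data
            self.haptic_ack(target)          # confirm successful transmission
            return None
        if gesture == "fusion_fist" and len(self.designated) >= 2:
            sources, self.designated = self.designated, []
            return ("fuse", sources)         # trigger the pre-determined fusion
        return None
```

Keeping the fusion trigger conditional on at least two designated sources mirrors the example in the text: a lone fist gesture does nothing, which also reduces confusion with the "stop" fist in other command contexts.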
This work was done by Marion G. Ceruti, Jeffrey Ellen, Gary Rogers, Sunny Fugate, Nghia Tran, Hoa Phan, Daniel Garcia, Bryan Berg, Emily Medina, and LorRaine Duffy of the Space and Naval Warfare Systems Center Pacific. For more information, download the Technical Support Package (free white paper) at www.defensetechbriefs.com/tsp under the Physical Sciences category. NRL-0040
This Brief includes a Technical Support Package (TSP): "Gesture-Directed Sensor-Information Fusion Gloves" (reference NRL-0040), available for download from the TSP library.