Eric Bickford, U.S. Army CERDEC engineer, compares the current prototype vision-aided navigation system with a 3-D rendition of a planned smaller, more soldier-wearable system. CERDEC engineers are analyzing the systems as possible emerging and alternative navigation technologies for future combat operations. (Photo: U.S. Army RDECOM)

The Army Materiel Command’s Communications-Electronics Research, Development and Engineering Center, or CERDEC, is using miniature cameras to create vision-aided navigation capabilities in GPS-denied situations.

“Vision-aided navigation works by using cameras with rapid frame rates to take pictures of objects in view and then comparing the object’s features in each frame to determine how far, and in what direction, the camera has moved in relation to the object,” said Eric Bickford, an engineer in CERDEC’s Command, Power and Integration Directorate’s Positioning, Navigation and Timing Division, or CP&ID PNTD. The camera catches even the slightest movement through feature detection, which allows users to leverage the camera’s data to track a person’s relative position and movement over a given trajectory or path, Bickford said.
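The frame-to-frame comparison Bickford describes can be sketched in a few lines. In this illustrative example (the coordinates and the simple averaging step are assumptions, not CERDEC's algorithm), matched feature locations from two consecutive frames are compared; because static scene features appear to move opposite to the camera, negating their average displacement gives the camera's motion in pixel units.

```python
import numpy as np

# Hypothetical matched feature locations (pixel coordinates) of the
# same three objects, detected in two consecutive camera frames.
features_frame1 = np.array([[100.0, 200.0], [300.0, 150.0], [250.0, 400.0]])
features_frame2 = np.array([[ 95.0, 200.0], [295.0, 150.0], [245.0, 400.0]])

def estimate_camera_shift(f1, f2):
    """Average per-feature displacement between frames; a static
    scene appears to move opposite to the camera, so the negated
    mean displacement approximates the camera's own motion."""
    displacement = (f2 - f1).mean(axis=0)
    return -displacement  # camera motion, in pixel units

shift = estimate_camera_shift(features_frame1, features_frame2)
print(shift)  # every feature drifted left, so the camera moved right
```

A real system would track hundreds of features at rapid frame rates and reject outliers such as independently moving objects, but the core idea is the same comparison of feature positions across frames.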

Vision-aided navigation is part of the Army’s overarching goal to provide uninterrupted PNT capabilities to soldiers. While still in the early development phase, CERDEC plans to transition vision-aided navigation solutions to the Army’s Direct Reporting Program Manager Positioning, Navigation and Timing, which was chartered in 2015 to address PNT capabilities across Army portfolios.

“The availability of GPS on the battlefield has significantly enhanced soldiers’ navigational capabilities, but it is susceptible to interference,” said Christopher Manning, acting director for CERDEC CP&ID. “As the Army’s R&D lead for soldier and ground platform PNT needs, we’re using our science and technology investments to support PM PNT by investigating and developing alternate navigation solutions that will address the PNT challenges our soldiers face in various tactical environments.”

A monocular, or single, camera acts as the foundation of the vision-aided navigation system. It captures rotation and translation but not depth; in other words, it shows how much a person rotated and moved along a path, but not how far away he or she is. The “aided” component incorporates inertial measurement units (IMUs), which are composed of sensors such as gyroscopes and accelerometers; when properly combined with the camera, they provide motion and direction information simultaneously.

“IMUs allow us to determine approximately how the camera is moving; thus, the motion of the camera can be mathematically compared with the motion detected from features tracked visually by the camera,” CERDEC engineer Gary Katulka said. “With calibrated cameras, quality IMUs, a well-tuned navigation algorithm and other supporting components, a person equipped with a vision-aided navigation system can achieve GPS-like navigational performance.”

Over time, errors from the IMUs will accumulate and cause some sensor “drift,” but data from the camera serves to limit these errors for a more accurate combined-sensors navigation solution known as sensor fusion, Katulka said.
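The drift-versus-correction behavior Katulka describes can be illustrated with a toy one-dimensional simulation. All of the numbers here (the IMU bias, camera noise level and filter weight) are invented for illustration, and the simple complementary filter stands in for the far more sophisticated fusion algorithms a fielded system would use.

```python
import random

random.seed(0)

# Toy 1-D scenario: true velocity is a constant 1.0 m/s, but each
# IMU reading carries a small bias, so pure dead reckoning drifts.
true_velocity, imu_bias, dt = 1.0, 0.05, 0.1
alpha = 0.9  # filter weight: trust the IMU short-term

imu_position = fused_position = true_position = 0.0
for step in range(100):
    true_position += true_velocity * dt
    # IMU-only dead reckoning: the bias accumulates into drift.
    imu_position += (true_velocity + imu_bias) * dt
    # Camera-derived position fix: noisy, but unbiased over time.
    camera_position = true_position + random.gauss(0.0, 0.02)
    # Complementary filter: IMU prediction, nudged by the camera.
    fused_position = (alpha * (fused_position + (true_velocity + imu_bias) * dt)
                      + (1 - alpha) * camera_position)

imu_error = abs(imu_position - true_position)
fused_error = abs(fused_position - true_position)
print(imu_error, fused_error)  # the fused error stays far smaller
```

Running the loop shows the IMU-only position drifting steadily away from truth while the camera-corrected estimate stays bounded, which is the essence of the sensor-fusion benefit described above.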

The first iteration of vision-aided navigation will likely be vehicle-mounted. CERDEC tested this concept by inserting a system into a standard vehicle and driving along a major highway. The camera’s feature detection capability accurately captured everything in its path — other cars, exit signs, and trees — even at high speeds.

“The system understood that the cars ahead of us were going nearly the same speed as we were because those cars never appeared to change in size,” Bickford said.
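Bickford's observation follows from the pinhole-camera model: an object's apparent size in the image shrinks in proportion to its distance, so a car whose image size never changes is holding a constant gap. A small sketch (the focal length and car dimensions are made-up values):

```python
def apparent_size(true_size_m, distance_m, focal_length_px=800.0):
    """Pinhole model: image size is inversely proportional to range."""
    return focal_length_px * true_size_m / distance_m

# A car 1.8 m wide observed in two consecutive frames:
same_gap = apparent_size(1.8, 20.0) == apparent_size(1.8, 20.0)
closing  = apparent_size(1.8, 18.0) >  apparent_size(1.8, 20.0)
print(same_gap, closing)  # constant gap -> same size; shrinking gap -> bigger image
```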

Two-camera vision-aided navigation, also known as “stereo,” could be a viable option as long as Size, Weight, Power and Cost, or SWAP-C, are not limiting factors. The two-camera solution operates much like a person alternately opening one eye while closing the other: objects appear to shift between the two views.

“That shift tells you how far away things are: an object will appear to shift more if it is closer to you, but if you look into the distance, the shift yields very little movement,” Bickford said. “This also gives us the required distance or depth information.”
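The shift Bickford describes is called disparity, and the standard stereo relationship converts it to depth: range equals focal length times camera baseline divided by disparity. A minimal sketch, with a hypothetical rig (the focal length and baseline values are assumptions for illustration):

```python
# Hypothetical stereo rig parameters.
focal_length_px = 800.0  # focal length, in pixels
baseline_m = 0.12        # separation between the two cameras, meters

def depth_from_disparity(disparity_px):
    """Nearby objects shift more between the two views (large
    disparity); distant objects barely shift at all."""
    return focal_length_px * baseline_m / disparity_px

print(depth_from_disparity(48.0))  # large shift -> close object (~2 m)
print(depth_from_disparity(4.8))   # small shift -> distant object (~20 m)
```

The same inverse relationship is why very distant objects yield almost no shift: as disparity approaches zero, the recoverable depth estimate becomes increasingly uncertain.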

The Army’s science and technology community is investigating approaches for multi-purpose cameras, including vision-aided navigation. With that in mind, CERDEC CP&ID is teaming with additional organizations, including its sister organization, the Night Vision and Electronic Sensors Directorate, to leverage existing technologies such as its thermal imaging camera, which will allow vision-aided navigation in less optimal and low-light situations. In the future, vision-aided navigation systems could be integrated with soldier-wearable devices. During tests in urban environments, CERDEC’s soldier-mounted prototype allowed the user to stay nearly on the exact trajectory of the path generated from GPS and other sources.