The United States shares 5,525 miles of land border with Canada, and 1,989 miles with Mexico. Monitoring these borders, which is the responsibility of U.S. Customs and Border Protection (CBP), is an enormous task. Detecting and responding to illegal activity while facilitating lawful commerce and travel is made more difficult by the expansive, rugged, diverse, and thickly vegetated geography that spans both often-crossed borders. To help mitigate the challenges to border surveillance, a group of researchers at MIT Lincoln Laboratory is investigating whether an airborne ladar system capable of imaging objects under a canopy of foliage could aid in the maintenance of border security by remotely detecting illegal activities.

Figure 1. AOSTB DHC-6 Twin Otter aircraft on the tarmac at Hanscom Air Force Base.

Requisite for effective border protection is timely, actionable information on areas of interest. Leveraging the Laboratory’s experience in building imaging systems that exploit microchip lasers and Geiger-mode avalanche photodiodes, the research team developed and tested two concepts of operations (CONOPS) for using airborne ladar systems to detect human activity in wooded regions.

“For any new technology to be effectively used by CBP, an emerging sensor must bring with it a sensible deployment architecture and concept of operation,” said John Aldridge, technical staff member from the Laboratory’s Homeland Protection Systems Group, who worked with a multidisciplinary, cross-divisional team. The CONOPS the team focused on were cued examination of a localized area and uncued surveillance of a large area. To demonstrate the approach, the team conducted proof-of-concept experiments with the Airborne Optical Systems Testbed (AOSTB), a Twin Otter aircraft outfitted with an onboard ladar sensor.

AOSTB is reconfigurable, allows for roll-on/roll-off capability, and can accommodate multiple sensors. AOSTB mission areas include wide-area, down-looking, high-resolution imaging; side-looking and up-looking laser ranging and tracking; and sensor fusion with cameras and hyperspectral payloads. The airborne 3D ladar utilizes a single-photon-sensitive avalanche photodiode (APD) array and short-pulse laser. The detector technology is based on arrays of short-wave infrared (SWIR) APDs operating in Geiger mode. The laser technology is based on diode-pumped passively Q-switched lasers.

For cued surveillance, the use of an airborne ladar sensor platform (whether a piloted aircraft or an unmanned aircraft system) might be prompted by another persistent sensor that indicates the presence of activity in a localized area at or near the border. The area of coverage for cued surveillance may be in the 1 km2 to 10 km2 range, and the Laboratory has already developed and demonstrated sensor technology that can achieve this coverage in minutes, according to the researchers.

Uncued, wide-area surveillance sorties might be flown long distances and over timelines of days or weeks to establish typical activity patterns, and to discover emerging paths and structures in high-interest regions. “The area coverage required under such a CONOPS may reach as high as 300 to 800 km of border, depending on the Border Patrol Sector and vegetation density,” Aldridge explained, adding, “Although the current AOSTB’s area coverage rate is limited by the aircraft’s airspeed, the sensor can image such a region in a matter of hours in a single sortie.”
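The “matter of hours” claim is consistent with simple arithmetic. A back-of-envelope sketch, using the Twin Otter’s 100-knot ground speed cited later in the article and assuming a single pass along the border (swath width and turn time are ignored):

```python
# Back-of-envelope sortie time for the uncued wide-area CONOPS.
# Assumed: 100-knot ground speed (from the article); single pass,
# ignoring swath width and turn-around time.
KNOT_TO_KMH = 1.852

ground_speed_kmh = 100 * KNOT_TO_KMH  # ~185 km/h

for border_km in (300, 800):
    hours = border_km / ground_speed_kmh
    print(f"{border_km} km of border: ~{hours:.1f} h per pass")
# 300 km -> ~1.6 h, 800 km -> ~4.3 h
```

The 800 km case brushes against the aircraft’s stated 4-hour endurance, which is one reason the quote qualifies the claim by Border Patrol Sector.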

AOSTB Technology

The 3D ladar concept is straightforward. Light from a short-pulse laser illuminates a scene of interest. The reflected light is imaged onto a two-dimensional (2D) grid of detectors. Rather than measuring intensity, as a conventional camera does, these detectors measure photon time of flight and therefore object distance. With each pixel encoded with range information, photon-counting ladars can produce an angle-angle-range, or 3D, image from a single laser pulse. Because the receiver uses a Geiger-mode avalanche photodiode (GMAPD) array, which is sensitive to single photons, the requirement on transmitted laser power can be relaxed, reducing the overall system size, weight, and power (SWaP) while still closing the optical link budget. This reduction is important for airborne applications, where payloads are constrained by numerous practical factors. When coupled with a fast scanning beam-pointing mechanism, photon-counting ladar systems can provide area-coverage rates in excess of 100 km2/hr.

The AOSTB leverages technologies and systems developed for the Airborne Ladar Imaging Research Testbed (ALIRT) open-terrain mapping system and the Multi-look Airborne Collector for Human Encampment and Terrain Extraction (MACHETE) FOPEN 3D ladar. The AOSTB engineering team took advantage of important improvements in detector technology and implemented unique scanning modalities, resulting in a relatively low-cost airborne ladar system. The hardware components have a flexible roll-on/roll-off capability, and the testbed is suitable for operation as a single sensor or as part of a fused-sensor suite that can combine down- and side-looking ladar and various passive imaging modalities. The Twin Otter aircraft platform offers flight endurance in excess of 4 hours at ground speeds of 100 knots, and all the necessary space and power for various research and development activities. Figure 1 shows the AOSTB on the tarmac at Hanscom Air Force Base (HAFB) in Bedford, MA.

Field Tests

Figure 2. (Left) This example of a 3D ladar point cloud shows a fully foliated field site. Color indicates elevation above the ground, with red tones indicating a higher elevation value (for example, treetops and vegetation), and blue tones indicating lower elevation value (ground-level vegetation and the ground itself). (Right) This ladar point cloud is from the same area as the image at left. The ladar data have been processed by subtracting ground elevation and filtering out foliage by height above ground, making objects under the foliage canopy discernible. In the center, two trucks are visible; in the lower right corner, the shape of a tarp canopy can be seen; and to the left of the tarp are two circular shapes that represent a campfire ring and picnic table (closest to the tarp), and a domed tent (brighter circle).

As a start to their field tests to assess the CONOPS, the team flew data collection runs over several local sites identified as representative of the northern U.S. border environment. The sites contained a variety of low-growing brush, thin ground vegetation, tall coniferous trees, and leafy deciduous trees. For the tests, the team positioned vehicles, tents, and other camp equipment in the woods to serve as targets of interest. They made 40 passes at an altitude of 7,500 feet to achieve a spatial resolution of about 25 centimeters. Between passes, the concealed items were moved so that post-processing analysis could be performed for change and motion detection (Figure 2).

In this post-processing stage, the team members enhanced the data captured during the flights so that human analysts could then inspect the ladar imagery. They digitally removed ground-height data to reveal the 3D ladar point cloud above ground, and then digitally thresholded the height (erased 3D points above a certain height) to eliminate the foliage cover. The resulting images gave analysts a starting point for approximating the locations of both the planted objects and objects that were already on scene.
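The two steps described here, subtracting ground elevation and thresholding height above ground, can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the Laboratory’s code: the `ground_z` callable stands in for a digital terrain model, and the 0.3 m and 3.0 m thresholds are placeholder values, not figures from the article.

```python
import numpy as np

def height_filter(points, ground_z, low=0.3, high=3.0):
    """Keep returns whose height above local ground lies in [low, high) m.

    points   : (N, 3) array of x, y, z ladar returns
    ground_z : callable (x, y) -> local ground elevation, e.g. interpolated
               from a gridded digital terrain model (assumed available)
    """
    # Subtract ground elevation to get height above ground (HAG)
    hag = points[:, 2] - ground_z(points[:, 0], points[:, 1])
    # Threshold: drop ground returns below `low` and canopy above `high`
    keep = (hag >= low) & (hag < high)
    out = points[keep].copy()
    out[:, 2] = hag[keep]  # store HAG rather than absolute elevation
    return out
```

With flat terrain (`ground_z = lambda x, y: np.zeros_like(x)`), a truck-height return at 1.5 m survives the filter while ground litter at 5 cm and treetops at 10 m are removed.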

Searching through vast quantities of ladar data to spot areas for careful inspection is a labor-intensive task, even for experienced analysts who can recognize subtle cues that direct them to the possible presence of objects in the imagery. For the ladar data to be efficiently mined, an automated method of identifying areas of interest is needed. One way to alert analysts to potential targets is to track changes in the 3D temporal data. Changes caused by vehicle movements or alterations in a customary scene can indicate uncharacteristic activity.

To begin a change detection approach to the discovery of potential targets of interest, the research team registered the before and after ladar data, and then subtracted the before data from the after dataset. This process allowed some improvement in the visual identification of vehicles that appeared where there had been none before; however, even a skilled human analyst would find it difficult to spot the small changes that signaled the presence of a vehicle.
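The register-then-subtract step can be illustrated with a simple occupancy-grid differencing sketch. This is an assumption-laden stand-in for the team’s method: it presumes the before and after point clouds are already registered in a common nonnegative coordinate frame, and the 1 m cell size and 5-return threshold are illustrative.

```python
import numpy as np

def occupancy(points, cell=1.0, extent=100.0):
    """Rasterize registered x, y returns into a 2D count grid (cell in m)."""
    n = int(extent / cell)
    ij = np.clip((points[:, :2] / cell).astype(int), 0, n - 1)
    grid = np.zeros((n, n))
    np.add.at(grid, (ij[:, 0], ij[:, 1]), 1)  # unbuffered histogramming
    return grid

def change_map(before, after, cell=1.0, extent=100.0, thresh=5):
    """Flag cells whose return count changed by more than `thresh`."""
    diff = occupancy(after, cell, extent) - occupancy(before, cell, extent)
    return np.abs(diff) > thresh
```

Adding a cluster of 50 returns at one location between the before and after collections lights up exactly one cell in the change map; in real data, as the next paragraph explains, foliage clutter produces many spurious cells that a practical detector must suppress.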

A change detection approach, therefore, must compensate for the clutter in the ladar data. This clutter arises from the nature of ladar collection in densely foliated environments. As light travels through gaps in the foliage, it bounces off surfaces such as leaves, the ground, or manmade objects, and the returned light is collected by the ladar sensor to form the 3D point cloud. Because the motion of the flying platform causes each ladar scan to pass through different configurations of gaps between leaves, different parts of the canopy and shrubbery are sensed on each scan. Much of the clutter in the change detection output comes from these differing levels of canopy detected from scan to scan.

Conclusion

Looking forward, the team hopes to make automated 3D change detection more robust to natural temporal changes in foliage, expand the number of automatically detected object classes, and extend automated detection to full 3D point clouds. CBP has expressed the need to apply advanced technology solutions to border management. Continued development of Lincoln Laboratory’s automated approach to using a low-cost ladar system for surveillance of foliated regions may in the future offer another tool that the Department of Homeland Security’s CBP can deploy to monitor the growing volume of land border activity.

This article was contributed by MIT Lincoln Laboratory, Lexington, MA.


This article first appeared in the September 2017 issue of Aerospace & Defense Technology Magazine.
