Many challenges still persist in autonomous (and even semi-autonomous) navigation for unmanned ground vehicles (UGVs). One challenge is detecting and classifying obstacles for avoidance and path planning. The use of laser-based sensors such as lidar has become quite common for this task; however, lidar systems may be too expensive for certain applications, and because they are active rather than passive sensors, they may not be desirable in some missions. Lidar is also adversely affected by smoke, dust, fog, and rain. Therefore, the use of passive camera sensors, such as typical color and infrared (IR) cameras, has become an important research topic in UGV navigation.
One of the greatest challenges in using a stereo pair of color and/or IR cameras is accurately determining the extrinsic calibration parameters between the cameras. For color cameras, this has historically been solved using a checkerboard pattern of black and white squares. This does not necessarily work out of the box for IR stereo cameras, because an IR sensor registers emitted thermal radiation rather than reflected visible light: the black and white squares must differ thermally to produce high- and low-intensity pixels. For instance, on a cold and cloudy day, an IR sensor will register very little difference between the black and white squares on a piece of paper. Therefore, more care and preparation are required to calibrate stereo IR cameras.
The first challenge is the calibration board itself. Unlike the board for color stereo cameras, which can be a simple printout of a black and white checkerboard and still yield highly accurate calibration, the board for two IR stereo cameras must be carefully selected, designed, and/or manufactured.
The second challenge is the calibration pattern. Starting with the classic black and white checkerboard pattern, the dynamic range between the white and black squares was not sufficient for the calibration routine to detect the pattern; even when the size of the calibration board was increased, the detection algorithm was unsuccessful.
The successful methodology for calibrating the IR stereo cameras incorporated a calibration board made from Dibond, a lightweight, rigid, and durable aluminum composite material. The pattern printed onto the boards was a 3 x 5 asymmetric grid of circles, each 17 cm in diameter and spaced 17 cm apart. By using this large asymmetric circle pattern on a warm day with little to no wind (with the board left in direct sunlight), the detection results improved dramatically. Additionally, simple pre-processing techniques were used to increase the accuracy of the IR stereo calibration.
In the first method, a median blur filter with a 5-pixel window was applied to each IR image. In the second method, a thresholding function was used to truncate pixel values above an intensity of 50. The third method combined the first two.
For experiments with the color cameras in the system, checkerboard patterns, symmetric circle patterns, and asymmetric circle patterns were all successfully used in the calibration routine. For experiments with the IR stereo cameras, only the asymmetric circle patterns were successfully used in calibrating the cameras.
To evaluate the stereo calibration results numerically, OpenCV was used to calculate the stereo re-projection error for each of the calibration patterns used for both the color stereo cameras and the IR stereo cameras.
This work was done by Josh Harguess of the Space and Naval Warfare Systems Center Pacific and Shawn Strange of Leidos. SPAWAR-0002