The goal of this research was to understand insect flight for purposes of improving the agility, autonomy, robustness, and integrated sensing and processing of unmanned aerial vehicles. This goal was approached with a comparative methodology aimed at understanding: general principles of insect flight across diverse species; environmental variables that affect the natural flight of insects; how insects recover from flight perturbations; and the connection between flight, sensor capability, neural processing, and muscular control.

Insects are existence proofs for agile, robust, autonomous flight that minimizes size, weight, and power requirements, all qualities desirable in human-engineered systems. To learn design principles for improved sensors and guidance/control algorithms, AFRL studies insect sensors and flight. The current research effort attempts to connect environmental information with insect flight and to relate both to insect sensors and processing.

Indoor and outdoor flights of insects were recorded by high-speed cameras at frame rates of 500–1000 Hz. Indoor flights were recorded in a flight chamber measuring 2 m × 1 m × 1 m and lined with different optic flow patterns. Outdoor flights were recorded by releasing freshly captured insects in front of the high-speed cameras and allowing them to initiate escape flight. The goal was to compare the kinematics of each flight inside the laboratory with flight outside in the natural world.

This effort required automating the tracking of the insect in the video frames because the captured dataset is extremely large. With just two cameras recording at 1000 frames per second, the study cameras can capture 8 seconds of data per trial; the insect's position would then need to be located and recorded in 16,000 frames for a single behavior capture.
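The tracking and storage burden implied by these numbers can be sketched with simple arithmetic. The camera count, frame rate, and duration below are the figures quoted above; the 1-megapixel, 8-bit grayscale frame size is an assumption for illustration, not a specification of the study cameras.

```python
# Back-of-envelope budget for the two-camera recording setup described above.
CAMERAS = 2
FPS = 1000                  # frames per second, per camera
DURATION_S = 8              # seconds of recording per trial
FRAME_BYTES = 1024 * 1024   # assumed 1 MP, 8-bit grayscale frame

frames_to_track = CAMERAS * FPS * DURATION_S
raw_bytes = frames_to_track * FRAME_BYTES

print(frames_to_track)        # 16000 frames per behavior capture
print(raw_bytes / 2**30)      # 15.625 GiB of raw video per trial
```

At tens of gibibytes of raw video per trial, clicking through frames by hand is clearly impractical, which motivates the automated tracking effort described next.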

Figure 2. Outdoor recording screenshot of a local robber fly (Diogmites) after capture. Lighting conditions are challenging because of changing weather. The sky is used as a background in an attempt to increase the contrast between the insect and its surroundings. The robber fly is directly above the release point in this screenshot.

Work was started to automate this process based on Ty Hedrick's algorithms (Hedrick, 2008). Figure 1 illustrates the challenges of adapting these techniques indoors. Also shown in Figure 1 are early behavior recordings of Green Darner dragonflies (Anax junius) in this flight chamber. The goal was to establish repeatable protocols for eliciting flights in insects large enough to carry a telemetry recording chip, so that flight kinematics, responses to optic flow stimuli, and muscle potentials could be correlated.
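As a rough illustration of what such automated tracking involves (a minimal sketch, not Hedrick's published implementation), a background-subtraction tracker for one camera view might look like the following. The function name and threshold are hypothetical, and grayscale frames are assumed to arrive as NumPy arrays:

```python
import numpy as np

def track_insect(frames, background, thresh=30):
    """Locate a small moving target in each frame by background subtraction.

    frames:     iterable of 2-D uint8 arrays (one camera view)
    background: 2-D uint8 array of the empty scene
    Returns a list of (row, col) centroids, or None where nothing is found.
    """
    positions = []
    bg = background.astype(np.int16)
    for frame in frames:
        diff = np.abs(frame.astype(np.int16) - bg)
        mask = diff > thresh            # pixels that changed vs. the background
        if not mask.any():
            positions.append(None)      # no target detected in this frame
            continue
        rows, cols = np.nonzero(mask)
        positions.append((rows.mean(), cols.mean()))  # centroid of the blob
    return positions
```

Running both camera views through a tracker like this yields per-frame 2-D centroids, which a calibrated multi-camera reconstruction (such as the DLT-based approach in Hedrick's toolbox) can then triangulate into 3-D flight trajectories.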

Outdoor flight recordings have their own challenges, as illustrated in Figure 2. It is also likely that the objective of capturing completely natural kinematics is not being reached, because the insects are manipulated beforehand. Future efforts will move towards completely natural conditions, capturing flight from insects that have not experienced any interference from the research team.

Gaze stabilization is also of interest, but, as can be seen in Figure 1, the head of the insect is not easily discernible in free flight. In addition, it would be difficult to induce precise behaviors to initiate a gaze stabilization response. Therefore, efforts were started toward characterizing gaze stabilization in tethered insects. The stimulus is a rotating horizon line produced by UV and green LEDs.
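The rotating-horizon stimulus can be sketched as follows. The arena dimensions, LED layout, and roll-about-center geometry are assumptions for illustration, since the text does not specify the display hardware:

```python
import math

def horizon_pattern(theta_deg, n_cols=96, n_rows=16):
    """Render a horizon line, rolled by theta_deg, onto an LED grid.

    Returns an n_rows x n_cols grid of 'U' (UV LEDs = sky) and
    'G' (green LEDs = ground). Geometry here is illustrative only.
    """
    theta = math.radians(theta_deg)
    grid = []
    for r in range(n_rows):
        row = []
        for c in range(n_cols):
            # signed distance of this LED from the tilted horizon line,
            # measured about the center of the display
            x = c - (n_cols - 1) / 2
            y = r - (n_rows - 1) / 2
            above = (y * math.cos(theta) - x * math.sin(theta)) < 0
            row.append('U' if above else 'G')
        grid.append(row)
    return grid
```

Stepping `theta_deg` over time produces the rotating horizon; the tethered insect's compensatory head roll against that rotation is the gaze stabilization response of interest.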

This work was done by Jennifer Talley, PhD for the Air Force Research Laboratory. AFRL-0289

This Brief includes a Technical Support Package (TSP). The TSP titled "Agile Robust Autonomy: Inspired by Connecting Natural Flight and Biological Sensors" (reference AFRL-0289) is currently available for download from the TSP library.
