Tech Briefs

Response Surface Mapping Technique Aids Warfighters

When weaponeering a target, military planners pinpoint a detonation location that will produce the desired damage to the entire target, or even to a particular area within it. The warfighter then selects the most suitable delivery combination (aircraft, weapon, guidance package, release altitude, and speed) for inflicting the appropriate damage. Determining the proper combination of variables capable of producing the desired effect on a hardened target requires the warfighter to understand the weapon's penetration dynamics; it also relies on the individual's ability to adjust the variables within his or her control as necessary. In a scenario where the destruction of a specific target must be coupled with the mitigation of collateral damage, it is imperative that the warfighter make proper weapon-selection decisions. AFRL scientists, collaborating with other Department of Defense agencies, applied innovative data mining and visualization methods to improve warfighter efficiency and effectiveness in making these choices.

Posted in: Briefs, Information Technology, Cartography, Data acquisition and handling, Imaging, Terrain, Military vehicles and equipment

Microelectromechanical Systems Inertial Measurement Unit Flight Test

AFRL and Boeing engineers conducted successful flight tests of microelectromechanical systems (MEMS) inertial measurement units (IMU) on the Joint Direct Attack Munition (JDAM). They collected flight data and validated the MEMS IMU technology's capability to provide stable navigation performance and accurate weapon guidance, both with and without Global Positioning System (GPS) updates. Researchers will use this flight data to further refine MEMS IMU technology to enhance future capabilities of air-launched munitions.

Posted in: Briefs, Mechanical Components, Microelectromechanical devices, Navigation and guidance systems, Flight tests

Fruit Flies

He refers to them as "nature's fighter jets" and has devoted his life's work and an entire lab to monitoring their every move. Such is the relationship between Dr. Michael Dickinson and the objects of his attention: fruit flies. Career pursuits aside, Dr. Dickinson's connection to the insects is one he predicts will eventually lead to the development of flying robots capable of performing various covert tasks, such as spying and surveillance.

Posted in: Briefs, Mechanical Components, Surveillance, Biological sciences, Robotics

Active Flow Control Demonstrated on “Airborne Wind Tunnel”

AFRL engineers, collaborating with aerospace manufacturers and other Air Force groups, recently demonstrated the first-ever airborne active flow control system when they manipulated the airflow behind an F-16 external pod. They significantly altered the turbulent wake using small, electrically controlled, piezoelectric synthetic jet (PESJ) actuators. This demonstration is just one part of AFRL's multiphase Aeroelastic Load Control program aimed at reducing the weight, complexity, and signature of air vehicles through the introduction of active control technologies.

Posted in: Briefs, Mechanical Components, Aerodynamics

Coordination of Autonomous Unmanned Air Vehicles

Future autonomous unmanned air vehicles (UAV) will need to work in teams to share information and coordinate activities in much the same way as current manned air systems. Funded by AFRL, Professor Hugh Durrant-Whyte and his research staff (see Figure 1) at the Australian Research Council's Centre of Excellence for Autonomous Systems have been developing mathematical models and simulation studies to understand, and ultimately provide, this future UAV capability.

The team's research focuses on coordination and cooperation for teams of autonomous UAVs engaged in information gathering and data fusion tasks, including cooperative tracking, ground picture compilation, area exploration, and target search. The underlying mathematical model for coordination and cooperation employs quantitative probabilistic and information-theoretic models of platform and sensor abilities. This information-theoretic approach builds on established principles for distributed data fusion in sensor networks, extending these ideas to problems in the distributed control of sensing resources. The researchers have made substantial progress towards formulating, solving, and demonstrating these methods for multi-UAV systems. In particular, they have developed distributed algorithms that enable UAV team-based search and exploration operations. These search and exploration algorithms can incorporate realistic constraints on platforms and sensors, a priori constraints from the environment, and weak information from external sources. To date, Prof Durrant-Whyte's research team has successfully demonstrated these algorithms on a flight simulator of mid-level fidelity (see Figure 2).
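The information-theoretic coordination described above can be illustrated with a minimal sketch; this is not the team's actual algorithm, and all parameter values are illustrative. Each UAV is greedily assigned the unclaimed search cell whose next sensor look yields the largest expected reduction in the entropy of the target-presence belief, assuming a simple detector with fixed hit and false-alarm rates:

```python
import math

def entropy(p):
    """Binary entropy (bits) of a target-presence probability."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def expected_info_gain(p, pd=0.9, pfa=0.1):
    """Expected entropy reduction from one sensor look at a cell,
    for a detector with hit rate pd and false-alarm rate pfa."""
    p_det = pd * p + pfa * (1 - p)          # chance the sensor reports "detect"
    post_det = pd * p / p_det                # Bayes posterior after "detect"
    post_nod = (1 - pd) * p / (1 - p_det)    # Bayes posterior after "no detect"
    expected_post = p_det * entropy(post_det) + (1 - p_det) * entropy(post_nod)
    return entropy(p) - expected_post

def assign_cells(belief, n_uavs):
    """Greedy assignment: each UAV takes the unclaimed cell with the
    largest expected information gain."""
    ranked = sorted(belief, key=lambda c: expected_info_gain(belief[c]),
                    reverse=True)
    return ranked[:n_uavs]

# Belief grid: probability that a target occupies each cell
belief = {(0, 0): 0.5, (0, 1): 0.9, (1, 0): 0.1, (1, 1): 0.45}
print(assign_cells(belief, 2))   # -> [(0, 0), (1, 1)]
```

The most uncertain cells (probabilities near 0.5) yield the largest expected gain, so the two UAVs are sent there first; a distributed implementation would run the same computation on each platform over a shared, fused belief.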

Recently, the team presented its findings to AFRL researchers at Wright-Patterson Air Force Base (AFB), Ohio, and Eglin AFB, Florida. Based on the promising results of this research effort, AFRL is funding two additional projects to further explore the mathematical aspects of the technology and facilitate real-world application through demonstrations. The first round of demonstrations will involve high-fidelity, hardware-in-the-loop simulations, culminating in a large-scale demonstration involving a UAV fleet operated by the University of Sydney (see Figure 3). AFRL's two research projects will provide significant scientific and technical advancement in the cooperative control of autonomous systems.

The availability of autonomous UAV teams capable of complex cooperative behavior will enable warfighters to execute highly complex missions effectively while safely removed from harm's way (i.e., remotely). In addition to providing these advantages, the UAV technology's imaging and atmospheric sampling capabilities have the potential to support both homeland security emergency scenarios and real-time forest fire monitoring tasks.

Dr. Tae-Woo Park, of the Air Force Research Laboratory's Air Force Office of Scientific Research, and Prof Hugh Durrant-Whyte, of the University of Sydney (Australian Research Council Federation Fellow), wrote this article. For more information, contact TECH CONNECT at (800) 203-6451, referencing document OSR-H-05-06.

Posted in: Briefs, Information Technology, Mathematical models, Simulation and modeling, Unmanned aerial vehicles

Surface-Emitting Laser Arrays Bring Light to the Top

Laser diodes are an integral part of everyday life, incorporated into commonplace items as diverse in function as laser pointers, fiber-optic communications systems, and DVD players. Manufacturers make most laser diodes by layering specially doped semiconductor materials on a wafer. By slicing tiny chips from these wafers to attain two perfectly smooth, parallel edges, they create very thin (tens of microns) waveguides. These waveguides define a resonating cavity in which stimulated light combines coherently, producing the lasing action. Although this process represents a highly successful and well-engineered means for producing semiconductor lasers, the lasers do not produce an optimum beam. Beam emission occurs from the small rectangular opening at the end of the chip, a configuration that results in an elliptically distorted beam as well as the loss of output efficiency. In addition, the output aperture's relatively small size can lead to destruction of the cleaved and polished end facet during the laser's high-power operation. Laser diodes produced using this process are also susceptible to substantial fluctuations in output wavelength and beam quality as a function of temperature. Furthermore, since the chip emits beam output from an edge instead of its top or bottom surface, manufacturers experience difficulty both in packaging various diode configurations and in combining the output beams of multiple laser diodes.

Posted in: Briefs, Photonics, Fiber optics, Lasers


Visualization of geospatially correct, remotely sensed data is a key element of many government and commercial applications. It enables a user to analyze and assess ground activities and other conditions of interest. Because remotely sensed data can span a diversity of data types and formats, users may experience difficulty visualizing and interpreting it due to data structure complexity. In addition, important supplemental information often accompanies the data. This supplemental information, or metadata, may include information of significant value to the user regarding where, when, and how data collection occurred. Whereas some applications require metadata to support geospatial analysis functions such as positioning and measurement, many others cannot interpret such metadata, so it may go unnoticed. Multiband data and motion imagery further compound the task of visualization with spectral components and complex video streams interlaced with other geospatial information.
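As one concrete example of how positioning metadata supports geospatial analysis, many raster formats carry a six-parameter affine geotransform (the convention used by GDAL-style metadata) that maps pixel indices to map coordinates. A minimal sketch, with illustrative values:

```python
def pixel_to_geo(geotransform, col, row):
    """Map a pixel (col, row) to map coordinates using a six-parameter
    affine geotransform: origin, pixel sizes, and rotation/skew terms."""
    x0, dx, rx, y0, ry, dy = geotransform
    x = x0 + col * dx + row * rx
    y = y0 + col * ry + row * dy
    return x, y

# Illustrative north-up image: origin at (-120.0, 45.0) degrees,
# 0.001-degree pixels, no rotation; dy is negative because row
# indices increase downward while latitude decreases.
gt = (-120.0, 0.001, 0.0, 45.0, 0.0, -0.001)
print(pixel_to_geo(gt, 100, 200))   # -> (-119.9, 44.8)
```

Without this metadata, the same pixel array is just an image; with it, a viewer can position, measure, and overlay the data against other geospatial layers.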

Posted in: Briefs, Software, Data acquisition and handling, Imaging