A program of research has built a technological foundation for further development of systems that could be characterized, variously, as "smart" cameras or adaptive optoelectronic eyes. A system according to this concept would capture images optimally under a variety of lighting conditions and would be capable of semi-autonomous recognition of objects.
The approach taken in this research was to combine advances in understanding of the mechanisms of biological vision systems with advances in hybrid electronic/photonic packaging technology in order to develop algorithms, architectures, and optical and electronic hardware components and subsystems essential to building artificial, biologically inspired vision systems.
A primary goal of this research was to enable the development of a sensor/processor module (1) in which the sensor and processor would be intimately coupled and (2) that would have architectural characteristics and capabilities similar to those of the multilayer retina and early stages of vision in a mammalian visual system. The module would be a compact, multilayer, vertically integrated structure containing very-large-scale integrated (VLSI) circuit chips. It would perform both sensing (image acquisition) and processing functions (for example, feature extraction and object recognition).
The upper part of the figure illustrates the basic architectural concept for a densely integrated hybrid electronic/photonic multichip module. Dense arrays of fan-out and fan-in interconnections between layers would be implemented with the help of diffractive optical elements. Each layer other than the sensor layer would comprise or include a two-dimensional array of processing elements corresponding to the pixels of the sensor layer. Each processing element might include, for example, (1) analog neuron-like signal-processing circuitry, (2) analog-to-digital converter (sample-and-hold) circuitry, and/or (3) digital circuitry (for local memory, communication between neighboring pixels, and/or control). Circuitry in each layer might also include (1) analog, digital, or hybrid analog/digital processors; (2) shifting circuitry for lateral scrolling functions; and (3) in the case of the sensor layer, photodetectors and preamplifier circuitry.
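The per-pixel functions described above can be sketched in software. The following is a minimal, purely illustrative model (all function names, parameters, and the specific nonlinearity are assumptions, not part of the original design): each pixel applies a neuron-like analog response to its photocurrent, exchanges information with its four nearest neighbors, and then quantizes the result as a sample-and-hold A/D stage would.

```python
import numpy as np

def process_layer(photocurrents, n_bits=8, gain=0.9):
    """Illustrative sketch of one processing layer (names/parameters
    are hypothetical): neuron-like analog stage, nearest-neighbor
    communication, then n-bit sample-and-hold quantization."""
    # (1) Analog neuron-like stage: saturating response to photocurrent.
    analog = np.tanh(gain * photocurrents)
    # (2) Local communication: blend each pixel with its 4 neighbors.
    padded = np.pad(analog, 1, mode="edge")
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    smoothed = 0.5 * analog + 0.5 * neighbors
    # (3) A/D conversion: quantize the [0, 1) analog value to n bits.
    levels = 2 ** n_bits
    return np.clip((smoothed * levels).astype(int), 0, levels - 1)

# Uniform illumination: every pixel should settle to the same code.
codes = process_layer(np.ones((4, 4)))
```

In the real module these three stages would be parallel circuitry within each pixel, not a sequential loop over an array; the sketch only shows the data flow.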
The lower part of the figure presents an enlarged view of the portion highlighted by the rectangle in the upper part of the figure, showing an example of dense optical fan-out/fan-in interconnections between layers in the multichip module. Each pixel of the lower silicon VLSI chip would contain both processing electronic circuitry and a photodetector illuminated with image-bearing or other information from the layer above it. An array of multiple-quantum-well (MQW) modulators is shown flip-chip bonded to the silicon pixel array. An optical power bus (an integrated optical component) would deliver a readout beam to each modulator element in the array by means of an array of rib waveguides containing vertical-outcoupling gratings. Each diffractive optical element would provide weighted fan-out connections from the modulator element of the corresponding pixel in the upper VLSI chip to the photodetectors of several neighboring pixels on the lower silicon VLSI chip.
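Mathematically, the weighted fan-out described above acts like a two-dimensional convolution: each modulator pixel spreads its modulated readout beam onto a small neighborhood of photodetectors below, with weights set by the diffractive element's pattern. The sketch below models that behavior; the 3x3 weight pattern and all names are hypothetical, chosen only to illustrate the interconnection geometry.

```python
import numpy as np

def optical_fan_out(modulator_plane, kernel):
    """Illustrative model of diffractive fan-out: each modulator pixel's
    intensity is distributed onto neighboring photodetectors on the
    chip below, weighted by the diffractive pattern `kernel`."""
    H, W = modulator_plane.shape
    ph, pw = kernel.shape[0] // 2, kernel.shape[1] // 2
    detectors = np.zeros((H, W))
    for y in range(H):
        for x in range(W):
            for dy in range(-ph, ph + 1):
                for dx in range(-pw, pw + 1):
                    ty, tx = y + dy, x + dx
                    if 0 <= ty < H and 0 <= tx < W:
                        detectors[ty, tx] += (modulator_plane[y, x]
                                              * kernel[dy + ph, dx + pw])
    return detectors

# Hypothetical center-weighted 3x3 diffractive pattern (weights sum to 1).
kernel = np.array([[0.05, 0.10, 0.05],
                   [0.10, 0.40, 0.10],
                   [0.05, 0.10, 0.05]])
plane = np.zeros((5, 5))
plane[2, 2] = 1.0  # a single active modulator
out = optical_fan_out(plane, kernel)
```

A single active modulator thus illuminates nine photodetectors at once, which is what lets each layer compute local, receptive-field-like operations optically rather than through wired interconnects.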
This work was done by Armand R. Tanguay Jr., B. Keith Jenkins, Christoph von der Malsburg, Bartlett Mel, Irving Biederman, John O'Brien, and Anupam Madhukar of the University of Southern California for the Army Research Laboratory. For more information, download the Technical Support Package (free white paper) at www.defensetechbriefs.com/tsp under the Photonics category. ARL-0013
This Brief includes a Technical Support Package (TSP), Adaptive Optoelectronic Eyes (reference ARL-0013), currently available for download from the TSP library.