A Comparative Evaluation of the Detection and Tracking Capability Between Novel Event-Based and Conventional Frame-Based Sensors

This research establishes a fundamental understanding of the characteristics of event-based sensors and applies it to the development of a detection and tracking algorithm.

The detection and tracking of moving objects in free space is an area of computer vision that has benefited greatly from years of research and development. Many different algorithms are available to date, with new and improved revisions developed on a regular basis. Given this maturity, it is not surprising that these algorithms provide very effective solutions in a wide range of applications. There are, however, select scenarios in which traditional detection and tracking algorithms break down. In many cases the failure is attributable not to the algorithm itself, but to the fundamental operation of the frame-based sensor.

Despite extensive research, traditional frame-based algorithms remain tied to predefined frame rates, which introduce image artifacts such as motion blur, and to sensor characteristics such as low dynamic range, limited speed, and the need to process large data files that are often filled with redundant data. A one-megapixel sensor streaming grayscale frames at 100 frames per second, for example, produces on the order of 100 megabytes of data per second regardless of whether anything in the scene has changed.

Event-based sensors, also known as silicon retinas or neuromorphic sensors, are revolutionary optical sensors that operate fundamentally differently to traditional frame-based sensors and offer a potentially novel solution to these challenges. Inspired by the functionality of a biological retina, these sensors are driven by changes in light intensity rather than by artificial frame rates and control signals. Within these sensors, each pixel behaves both asynchronously and independently, enabling events to be generated with microsecond resolution in response to localized optical changes as they occur.
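To make this pixel-level behavior concrete, the sketch below (illustrative Python, not code from the report; the Event fields, the contrast threshold value, and the pixel_events helper are assumptions) models how a single event-based pixel emits a timestamped, signed event each time its log intensity crosses a contrast threshold, independently of any frame clock.

import math
from dataclasses import dataclass

# Hypothetical contrast threshold; real sensors expose this as a tunable bias.
CONTRAST_THRESHOLD = 0.15


@dataclass
class Event:
    """A single event: microsecond timestamp, pixel address, and polarity."""
    t_us: int
    x: int
    y: int
    polarity: int  # +1 for an intensity increase, -1 for a decrease


def pixel_events(samples, x, y):
    """Emit events whenever the log intensity at one pixel moves by more than
    the contrast threshold since the last event. Each pixel runs this logic
    independently and asynchronously; `samples` is an iterable of
    (t_us, intensity) pairs observed at this pixel."""
    events = []
    it = iter(samples)
    _, i0 = next(it)
    ref = math.log(i0)  # reference log intensity at the last emitted event
    for t_us, intensity in it:
        delta = math.log(intensity) - ref
        while abs(delta) >= CONTRAST_THRESHOLD:  # one event per crossing
            polarity = 1 if delta > 0 else -1
            events.append(Event(t_us, x, y, polarity))
            ref += polarity * CONTRAST_THRESHOLD
            delta = math.log(intensity) - ref
    return events


# A brightening pixel yields a short burst of positive events, one per
# threshold crossing, rather than a stream of redundant full frames.
print(pixel_events([(0, 100.0), (250, 130.0), (500, 210.0)], x=3, y=7))

In this model a pixel viewing a static scene emits nothing at all, which is the source of the data reduction described above.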

As noted, event-based sensors specifically aim to mimic the biological retina and the subsequent vision processing of the brain. While the retina is the photosensitive tissue of the eye, the entire eyeball is needed for it to work properly. Figure (a) shows a schematic cross-section of a typical eye. Light entering the eye passes through the cornea and into the first of two humors. The aqueous humor is a clear mass that connects the cornea with the lens, helping to maintain the shape of the cornea. Between the aqueous humor and the lens lies the iris, a colored ring of muscle fibers. The iris forms an adjustable aperture, called the pupil, which is actively adjusted so that a relatively constant amount of light enters the eye at all times.

While the cornea has a fixed curvature, the shape of the lens can be actively moulded to adjust the eye's focal length as needed. Behind the lens is the vitreous humor, which again helps to maintain the shape of the eye. Light entering the pupil then forms an inverted image on the retina. The retina contains the photosensitive rods and cones and is a relatively smooth, curved layer with two distinct points: the fovea and the optic disc. Densely populated with cone cells, the fovea is positioned directly opposite the lens and is largely responsible for color vision. The optic disc, a blind spot in the eye, is where the axons of the ganglion cells leave the eye to form the optic nerve.

From the photoreceptors, neural responses pass through a series of linking cells, called bipolar, horizontal, and amacrine cells (see Figure (b)). These cells combine and compare the responses from individual photoreceptors before transmitting the signals to the retinal ganglion cells. The linkage between neighboring cells provides a mechanism for spatial and temporal filtering, facilitating relative, rather than absolute, judgments of intensity and emphasizing edges and temporal changes in the visual field. It is this network of photoreceptors, horizontal cells, bipolar cells, amacrine cells, and ganglion cells that discriminates between useful information to be passed to the brain and redundant information that is best discarded immediately, as sketched below.
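As a rough computational analogy (an illustrative sketch, not a model from the report; the function names and the Gaussian widths are assumptions), the center-surround interaction provided by the horizontal cells can be approximated with a difference of Gaussians, and the change-sensitive temporal pathway with simple differencing of successive intensity fields:

import numpy as np
from scipy.ndimage import gaussian_filter

def center_surround(frame, sigma_center=1.0, sigma_surround=3.0):
    # Difference-of-Gaussians stand-in for lateral inhibition: the response
    # reflects local contrast rather than absolute intensity, so uniform
    # regions are suppressed and edges are emphasized.
    return gaussian_filter(frame, sigma_center) - gaussian_filter(frame, sigma_surround)

def temporal_change(previous, current):
    # Crude stand-in for the change-sensitive bipolar/amacrine pathway:
    # only pixels whose intensity has changed produce a response.
    return current - previous

# A uniform field maps to (near) zero; only the step edge survives filtering.
frame = np.zeros((32, 32))
frame[:, 16:] = 1.0  # vertical step edge
print(center_surround(frame)[16, 13:20].round(3))

Both operations discard exactly the kind of information the retina discards as redundant: absolute brightness and unchanging structure.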

This work was done by James P. Boettiger of the Royal Australian Air Force (RAAF) for the Air Force Institute of Technology. For more information, download the Technical Support Package below (reference AFRL-0299).


