The performance of people who monitor surveillance video, whether live or recorded, can vary with their level of expertise. It is reasonable to suppose that some of these performance differences are due, at least in part, to differences in how experts and novices scan a visual scene. For example, experts might scan a scene more systematically or efficiently than novices do. Even within the same person, video surveillance performance can vary with factors such as fatigue, and differences in how the eyes scan a scene might again account for some of the variation. Full Motion Video (FMV) “eyes-on” intelligence analysts, in particular, actively scan video scenes for items of interest for long periods of time.
To better understand the scanning behavior of Intelligence, Surveillance, and Reconnaissance (ISR) analysts, it is important to track eye movement characteristics. It is relatively simple to model eye characteristics, such as pupil dilation or eyelid opening, over time, and it is also common to characterize different types of eye movement over time.
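As an illustration of how simple such per-sample characteristics are to handle, the sketch below smooths a pupil-diameter time series with a moving average while skipping blink samples. The window length and the blink sentinel value (0.0 from the tracker) are assumptions for illustration, not properties of any particular eye-tracking system.

```python
# Illustrative smoothing of a pupil-diameter time series (mm).
# Assumption: the tracker records blinks/eyelid closures as 0.0 samples.

def smooth_pupil(samples, window=5):
    """Centered moving average that ignores blink samples (0.0)."""
    out = []
    half = window // 2
    for i in range(len(samples)):
        lo = max(0, i - half)
        win = [s for s in samples[lo:i + half + 1] if s > 0.0]
        out.append(sum(win) / len(win) if win else 0.0)
    return out
```

Because blink samples are excluded from each window rather than averaged in, brief eyelid closures do not drag the smoothed dilation estimate toward zero.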
However, comparing two sequences of eye data is much more challenging, and most commercial eye-tracking systems do not come equipped with analytic measures for comparing trajectory morphologies. Nevertheless, such comparisons are useful in a variety of ISR research applications:
Comparing eye scanning movements before and after critical events, such as detecting a target;
Comparing scanpaths between two different analysts;
Comparing scanpaths of analysts over time (e.g. early in the day vs. later in an analyst’s shift);
Comparing scanpaths across different surveillance tasks;
Comparing scanpaths in simple versus complex surveillance scenarios.
This research explores some common metrics for quantifying the similarity between two eye movement scanpaths, with an emphasis on ScanMatch, a Matlab toolbox that computes a similarity score between two scanpaths. ScanMatch was applied both to pseudodata with known conclusions and to experimental data from Piaseki (2016).
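ScanMatch's core idea is to quantize fixations into spatial bins, encode each scanpath as a sequence of bin labels, and score the two sequences with a Needleman-Wunsch global alignment. ScanMatch itself is a Matlab toolbox with its own scoring options; the Python sketch below shows the same idea with illustrative choices of screen size, grid, and match/mismatch/gap scores.

```python
# Sketch of a ScanMatch-style scanpath similarity score.
# Grid size, screen size, and the scoring parameters are illustrative
# assumptions, not the toolbox's defaults.

def quantize(fixations, screen=(1920, 1080), grid=(8, 6)):
    """Map (x, y) fixation centers to grid-cell indices (the 'letters')."""
    seq = []
    for x, y in fixations:
        col = min(int(x / screen[0] * grid[0]), grid[0] - 1)
        row = min(int(y / screen[1] * grid[1]), grid[1] - 1)
        seq.append(row * grid[0] + col)
    return seq

def align_score(a, b, match=1.0, mismatch=-1.0, gap=-0.5):
    """Needleman-Wunsch global alignment score between two sequences."""
    n, m = len(a), len(b)
    score = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[n][m]

def similarity(path1, path2):
    """Alignment score normalized to [0, 1] against a perfect self-match."""
    a, b = quantize(path1), quantize(path2)
    best = float(max(len(a), len(b)))  # score of a perfect match
    if best == 0:
        return 1.0
    return max(align_score(a, b), 0.0) / best
```

Identical scanpaths score 1.0 and scanpaths that never visit the same grid cell score 0.0, which mirrors how ScanMatch-style scores support before/after and analyst-to-analyst comparisons.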
In research on human analyst performance in ISR tasks, it is useful to collect eye-tracking metrics of analysts as they perform search operations. In particular, information about fixations and saccades (i.e., the jumps from one fixation to another) has been useful for yielding information regarding attention.
Fixation and saccade locations can serve as markers for attention because where an analyst is looking on the screen is highly correlated with what he or she is attending to. Highly correlated, however, does not mean perfectly correlated with conscious attention, as a well-known inattentional blindness study by Drew, Vo, and Wolfe (2013) demonstrates. In that study, radiologists examining lung CT scans often fixated on a gorilla figure embedded in the scan, and made repeated backtracks or re-fixations to it, yet did not consciously notice the anomaly. Inattentional blindness and the associated re-fixation and backtrack patterns are therefore important for understanding ISR analyst performance.
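The re-fixation and backtrack patterns described above can be counted directly from a sequence of fixation centers. The sketch below flags a fixation as a re-fixation when it lands near an earlier fixation after the gaze has moved away; the pixel tolerance is an illustrative assumption, not a standard threshold.

```python
# Illustrative re-fixation counter: a return to a previously fixated
# location after the gaze has left it. The 50 px tolerance is an assumption.

def count_refixations(fixations, tolerance=50.0):
    """Count fixations within `tolerance` px of any earlier fixation,
    excluding immediate repeats (continued dwells on the same spot)."""
    refix = 0
    visited = []
    for x, y in fixations:
        # visited[:-1] excludes the immediately preceding fixation,
        # so dwelling in place is not counted as a backtrack.
        if any((x - px) ** 2 + (y - py) ** 2 <= tolerance ** 2
               for px, py in visited[:-1]):
            refix += 1
        visited.append((x, y))
    return refix
```

On a scanpath that leaves a location and later returns to it, the counter registers one re-fixation, which is the pattern the Drew, Vo, and Wolfe study observed around the embedded gorilla.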
Among the many variables collected in eye-tracking data, one that centers on the temporal aspects of fixation patterns, but whose analysis is frequently neglected because of its complexity, is the scanpath. A scanpath is the temporal sequence of point-by-point (x, y) screen coordinates of where a person is looking on the screen. The accompanying figure shows three example notional scanpaths, although the temporal direction is not shown. At a minimum, a scanpath encompasses one full fixation-saccade-fixation sequence. Scanpaths therefore capture the fixation, re-fixation, and backtrack patterns that reveal an analyst's attention, conscious or otherwise.
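Under the definition above, a scanpath can be represented as a time-ordered list of fixation centers, with the saccades being the vectors between successive fixations. The minimal sketch below derives saccade amplitude and direction from such a list; the representation and function name are illustrative.

```python
import math

# A scanpath as a time-ordered list of (x, y) fixation centers; each
# fixation-to-fixation jump is a saccade. Illustrative representation only.

def saccades(scanpath):
    """Return (amplitude_px, direction_deg) for each jump between
    successive fixations in the scanpath."""
    out = []
    for (x0, y0), (x1, y1) in zip(scanpath, scanpath[1:]):
        amp = math.hypot(x1 - x0, y1 - y0)          # Euclidean jump length
        ang = math.degrees(math.atan2(y1 - y0, x1 - x0))  # jump direction
        out.append((amp, ang))
    return out
```

A scanpath of n fixations thus yields n - 1 saccades, matching the minimum fixation-saccade-fixation unit described above.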
This work was done by Dr. Mary E. Frame for the Air Force Research Laboratory. For more information, download the Technical Support Package (free white paper) below. AFRL-0303
This Brief includes a Technical Support Package (TSP).
Quantifying Eye Movement Trajectory Similarity for Use in Human Performance Experiments in Intelligence, Surveillance, and Reconnaissance (ISR) Research
(reference AFRL-0303) is currently available for download from the TSP library.