Automated Filtering of Eye Movements Using Dynamic AOI in Multiple Granularity Levels

Gavindya Jayawardena, Sampath Jayarathna
DOI: 10.4018/IJMDEM.2021010104

Abstract

Eye-tracking experiments involve areas of interest (AOIs) for the analysis of eye gaze data. While there are tools to delineate AOIs to extract eye movement data, they may require users to manually draw boundaries of AOIs on eye tracking stimuli or to use markers to define AOIs. This paper introduces two novel techniques to dynamically filter eye movement data from AOIs for the analysis of eye metrics at multiple levels of granularity. The authors incorporate pre-trained object detectors and object instance segmentation models for offline detection of dynamic AOIs in video streams. This research presents the implementation and evaluation of object detectors and object instance segmentation models to find the best model to integrate into a real-time eye movement analysis pipeline. The authors filter gaze data that falls within the polygonal boundaries of detected dynamic AOIs and apply an object detector to find bounding boxes in a public dataset. The results indicate that the dynamic AOIs generated by object detectors capture 60% of eye movements, and object instance segmentation models capture 30% of eye movements.

1. Introduction

Eye tracking can reveal objective and quantifiable information about the quality, predictability, and consistency of the underlying covert process of the human brain when carrying out cognitively demanding tasks (McCarley and Kramer 2008; Radach et al. 2003; Van der Stigchel et al. 2007). According to the eye-mind hypothesis (Just and Carpenter 1980), observers attend where their eyes are fixating. Thus, eye tracking measurements enable us to investigate cognitive behavior when visually exploring a stimulus. With the advancement of eye tracking technology, gaze tracking measurements have become reliable and accurate.

Eye gaze measurements include various metrics relevant to oculomotor control (Komogortsev et al. 2013), such as saccadic trajectories and fixations, along with related measures including velocity, duration, amplitude, and pupil dilation (Krejtz et al. 2018). Studies have shown that pupil diameter correlates with task complexity (Kosch et al. 2018), enabling the use of pupillary behavior as a biomarker of mental workload while completing a task. Several studies (Gehrer et al. 2018; Jayawardena et al. 2020) have incorporated eye tracking to gain insights into underlying covert processes. As a standard practice in the community, upon successful completion of a study, user performance is measured, traditional positional gaze metrics and advanced gaze metrics are calculated, and the statistical significance of the computed metrics is evaluated (Gehrer et al. 2018; Jayawardena et al. 2020).
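
As a minimal sketch of how such metrics can be derived from a raw gaze stream, the following assumes samples of (timestamp, x, y) in seconds and degrees of visual angle; the velocity threshold and the I-VT-style detection are illustrative assumptions, not the method used in this work:

```python
import numpy as np

def saccade_metrics(t, x, y, velocity_threshold=30.0):
    """I-VT-style sketch: samples whose point-to-point velocity exceeds a
    threshold (deg/s) are labeled saccadic; report amplitude, peak velocity,
    and duration per detected saccade."""
    t, x, y = np.asarray(t), np.asarray(x), np.asarray(y)
    vel = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)  # deg/s per interval
    metrics, start = [], None
    for i, fast in enumerate(vel > velocity_threshold):
        if fast and start is None:
            start = i                                    # saccade onset
        elif not fast and start is not None:             # saccade offset
            metrics.append({
                "amplitude_deg": float(np.hypot(x[i] - x[start], y[i] - y[start])),
                "peak_velocity_deg_s": float(vel[start:i].max()),
                "duration_s": float(t[i] - t[start]),
            })
            start = None
    return metrics
```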

Eye tracking experiments utilize areas of interest (AOIs) to aid the analysis process by extracting eye gaze metrics within predefined AOIs. An AOI is a region of the stimulus used to study eye gaze metrics and to link eye movement measures to a particular part of the stimulus (Hessels et al. 2016). Studies in visual attention and eye movements (Noton and Stark 1971; Privitera and Stark 2000) have shown that humans attend to only a few AOIs in a given stimulus. Analysis of eye gaze metrics within AOIs can provide important cumulative clues to the underlying physiological functions supporting the allocation of visual attention resources. For instance, in the context of user interface interaction, the number of fixations within an AOI (e.g., a user interface component) indicates the efficiency of finding that component among others, whereas the maximum and average fixation duration within that AOI indicates the informativeness of that component (Goldberg and Kotval 1999). In addition, the fixation frequency and blink frequency within AOIs can indicate cognitive workload when interacting with a particular component of the user interface (Van Orden et al. 2001).
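
A minimal sketch of such per-AOI summaries, assuming fixations are given as (x, y, duration) tuples and the AOI as an axis-aligned rectangle (the function name and inputs are hypothetical, not from this work):

```python
def aoi_fixation_stats(fixations, aoi):
    """Summarize fixations falling inside one rectangular AOI.
    fixations: iterable of (x, y, duration_s); aoi: (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = aoi
    durations = [d for (x, y, d) in fixations if x0 <= x <= x1 and y0 <= y <= y1]
    return {
        "fixation_count": len(durations),
        "max_duration_s": max(durations, default=0.0),
        "mean_duration_s": sum(durations) / len(durations) if durations else 0.0,
    }
```

For example, aoi_fixation_stats(fixations, (100, 50, 300, 120)) would report the fixation count and duration summary for a button-sized screen region.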

The analysis of eye movements in dynamic AOIs, such as those in video sequences, is not new (Marchant et al. 2009; Goldstein et al. 2007; Crossland et al. 2002; Timberlake et al. 2005). Studies focused on detecting fixation sequences within identified AOIs have used clustering techniques to group gaze locations into AOIs (Nguyen et al. 2004) and various image processing algorithms to identify AOIs automatically (Privitera and Stark 2000). For the analysis of cognitive workload and the allocation of visual attention resources, many studies (Santella et al. 2006; Shanmuga Vadivel et al. 2015; Khosravan et al. 2016; Wang et al. 2019) have utilized saliency models of viewers in conjunction with visual information from video frames. In addition, computer vision techniques (Weibel et al. 2012) have been applied to generate AOI-mapped gaze coordinates using a template of the desired object derived from a single frame of the eye tracking stimulus video. However, this approach only works for pre-recorded eye tracking stimuli and requires the AOI template to be specified manually beforehand.
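
To illustrate the clustering idea, the sketch below substitutes scikit-learn's DBSCAN for the various clustering techniques used in the cited studies; the eps_px and min_samples parameters are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def infer_aois_from_gaze(points, eps_px=40.0, min_samples=20):
    """Group raw gaze points (N x 2 array, pixels) into dense clusters and
    return each cluster's bounding box as a data-driven AOI."""
    points = np.asarray(points, dtype=float)
    labels = DBSCAN(eps=eps_px, min_samples=min_samples).fit_predict(points)
    aois = []
    for k in sorted(set(labels) - {-1}):      # label -1 marks noise samples
        cluster = points[labels == k]
        aois.append((cluster[:, 0].min(), cluster[:, 1].min(),
                     cluster[:, 0].max(), cluster[:, 1].max()))
    return aois
```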

To overcome these challenges, we propose a computer vision-based deep neural network approach to identify AOIs in video streams in real time and to filter gaze locations that fall within the identified AOIs for the analysis of both positional and advanced eye gaze metrics. We focus on two filtering techniques to dynamically generate AOI-mapped gaze locations on video streams (a sketch of the gaze-filtering step follows the list):

  • Pre-trained object detectors to identify bounding boxes of dynamic AOIs in video sequences; and

  • Object instance segmentation models for offline detection of dynamic AOIs via precise pixel-wise masks.
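
The following is a minimal sketch of the bounding-box variant, using torchvision's COCO-pretrained Faster R-CNN as a stand-in for the detectors evaluated in this work; the model choice, score threshold, and function names are assumptions for illustration:

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Load a COCO-pretrained detector once (stand-in for the evaluated models).
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

@torch.no_grad()
def gaze_in_dynamic_aois(frame_rgb, gaze_points, score_threshold=0.7):
    """Detect bounding-box AOIs in one video frame and keep only the gaze
    points that fall inside a sufficiently confident detection.
    frame_rgb: HxWx3 uint8 numpy array; gaze_points: iterable of (x, y) pixels."""
    tensor = torch.from_numpy(frame_rgb).permute(2, 0, 1).float() / 255.0
    detections = model([tensor])[0]
    boxes = detections["boxes"][detections["scores"] > score_threshold].tolist()
    kept = []
    for gx, gy in gaze_points:
        if any(x0 <= gx <= x1 and y0 <= gy <= y1 for (x0, y0, x1, y1) in boxes):
            kept.append((gx, gy))
    return kept
```

A mask-based (pixel-wise) variant would instead test each gaze point against the segmentation mask returned by an instance segmentation model such as Mask R-CNN, e.g., checking masks[i, int(gy), int(gx)] > 0.5 rather than a rectangular bound.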
