Eye tracking is a non-invasive method to probe information processing. This article describes how eye tracking can be used to study gaze behavior during a flight simulation emergency task in low-time pilots (i.e., <350 flight hours).
Eye tracking has been used extensively as a proxy to gain insight into the cognitive, perceptual, and sensorimotor processes that underlie skill performance. Previous work has shown that traditional and advanced gaze metrics reliably demonstrate robust differences in pilot expertise, cognitive load, fatigue, and even situation awareness (SA).
This study describes the methodology for using a wearable eye tracker and gaze mapping algorithm that captures naturalistic head and eye movements (i.e., gaze) in a high-fidelity motionless flight simulator. The method outlined in this paper describes the area of interest (AOI)-based gaze analyses, which provide more context about where participants are looking, and dwell time, which indicates how efficiently they are processing the fixated information. The protocol illustrates the utility of a wearable eye tracker and computer vision algorithm to assess changes in gaze behavior in response to an unexpected in-flight emergency.
Representative results demonstrated that gaze was significantly impacted when the emergency event was introduced. Specifically, attention allocation, gaze dispersion, and gaze sequence complexity significantly decreased and became highly concentrated on looking outside the front window and at the airspeed gauge during the emergency scenario (all p values < 0.05). The utility and limitations of employing a wearable eye tracker in a high-fidelity motionless flight simulation environment to understand the spatiotemporal characteristics of gaze behavior and its relation to information processing in the aviation domain are discussed.
Humans predominantly interact with the world around them by first moving their eyes and head to focus their line of sight (i.e., gaze) toward a specific object or location of interest. This is particularly true in complex environments such as aircraft cockpits where pilots are faced with multiple competing stimuli. Gaze movements enable the collection of high-resolution visual information that allows humans to interact with their environment in a safe and flexible manner1, which is of paramount importance in aviation. Studies have shown that eye movements and gaze behavior provide insight into underlying perceptual, cognitive, and motor processes across various tasks1,2,3. Moreover, where we look has a direct influence on the planning and execution of upper limb movements3. Therefore, gaze behavior analysis during aviation tasks provides an objective and non-invasive method, which could reveal how eye movement patterns relate to various aspects of information processing and performance.
Several studies have demonstrated an association between gaze and task performance across various laboratory paradigms, as well as complex real-world tasks (i.e., operating an aircraft). For instance, task-relevant areas tend to be fixated more frequently and for longer total durations, suggesting that fixation location, frequency, and dwell time are proxies for the allocation of attention in neurocognitive and aviation tasks4,5,6. Highly successful performers and experts show significant fixation biases toward task-critical areas compared to less successful performers or novices4,7,8. Spatiotemporal aspects of gaze are captured through changes in dwell time patterns across various areas of interest (AOIs) or measures of fixation distribution (i.e., Stationary Gaze Entropy: SGE). In the context of laboratory-based paradigms, average fixation duration, scan path length, and gaze sequence complexity (i.e., Gaze Transition Entropy: GTE) tend to increase due to the increased scanning and processing required to problem-solve and elaborate on more challenging task goals/solutions4,7.
Conversely, aviation studies have demonstrated that scan path length and gaze sequence complexity decrease with task complexity and cognitive load. This discrepancy highlights the fact that understanding the task components and the demands of the paradigm being employed is critical for the accurate interpretation of gaze metrics. Altogether, research to date supports that gaze measures provide meaningful, objective insight into task-specific information processing that underlies the differences in task difficulty, cognitive load, and task performance. With advances in eye tracking technology (i.e., portability, calibration, and cost), examining gaze behavior in 'the wild' is an emerging area of research with tangible applications toward advancing occupational training in the fields of medicine9,10,11 and aviation12,13,14.
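The two entropy measures discussed above can be computed directly from a time-ordered sequence of AOI-labelled fixations. The sketch below is not the authors' analysis code; it implements SGE as the Shannon entropy of the stationary fixation distribution and GTE as the conditional entropy of AOI-to-AOI transitions, following the definitions commonly used in the gaze entropy literature:

```python
import math
from collections import Counter

def stationary_gaze_entropy(aoi_sequence):
    """Shannon entropy (bits) of the fixation distribution across AOIs (SGE)."""
    counts = Counter(aoi_sequence)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def gaze_transition_entropy(aoi_sequence):
    """Conditional entropy (bits) of AOI-to-AOI transitions (GTE),
    weighted by the stationary probability of the source AOI."""
    transitions = Counter(zip(aoi_sequence, aoi_sequence[1:]))
    from_counts = Counter(src for src, _ in transitions.elements())
    n_trans = sum(transitions.values())
    gte = 0.0
    for (src, dst), n in transitions.items():
        p_src = from_counts[src] / n_trans   # stationary probability of source AOI
        p_cond = n / from_counts[src]        # P(dst | src)
        gte -= p_src * p_cond * math.log2(p_cond)
    return gte
```

Because both values depend on how many AOIs are defined and where their boundaries fall, the AOI scheme must be reported alongside any entropy values, as noted below.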
The current work aims to further examine the utility of using gaze-based metrics to gain insight into information processing by specifically employing a wearable eye tracker during an emergency flight simulation task in low-time pilots. This study expands on previous work that used a head-stabilized eye tracker (i.e., EyeLink II) to examine differences in gaze behavior metrics as a function of flight difficulty (i.e., changes in weather conditions)5. The work presented in this manuscript also extends other work that described the methodological and analytical approaches for using eye tracking in a virtual reality system15. Our study used a higher-fidelity motionless simulator and reports additional analysis of eye movement data (i.e., entropy). This type of analysis has been reported in previous papers; however, a limitation in the current literature is the lack of standardization in reporting the analytical steps. For example, reporting how areas of interest are defined is of critical importance because it directly influences the resultant entropy values16.
To summarize, the current work examined traditional and dynamic gaze behavior metrics while task difficulty was manipulated via the introduction of an in-flight emergency scenario (i.e., unexpected total engine failure). It was expected that the introduction of an in-flight emergency scenario would provide insight into gaze behavior changes underlying information processing during more challenging task conditions. The study reported here is part of a larger study examining the utility of eye tracking in a flight simulator to inform competency-based pilot training. The results presented here have not been previously published.
The following protocol can be applied to studies involving a wearable eye tracker and a flight simulator. The current study involves eye-tracking data recorded alongside complex aviation-related tasks in a flight simulator (see Table of Materials). The simulator was configured to be representative of a Cessna 172 and was used with the necessary instrument panel (steam gauge configuration), an avionics/GPS system, an audio/lights panel, a breaker panel, and a Flight Control Unit (FCU) (see Figure 1). The flight simulator device used in this study is certifiable for training purposes and is used by the local flight school to train the skillsets required to respond to various emergency scenarios, such as engine failure, in a low-risk environment. Participants in this study were all licensed; therefore, they had previously experienced the engine failure scenario in the simulator during their training. This study was approved by the University of Waterloo's Office of Research Ethics (43564; Date: Nov 17, 2021). All participants (N = 24; 14 males, 10 females; mean age = 22 years; flight hours range: 51-280 h) provided written informed consent.
Figure 1: Flight simulator environment. An illustration of the flight simulator environment. The participant's point of view of the cockpit replicated that of a pilot flying a Cessna 172, preset for a downwind-to-base-to-final approach to Waterloo International Airport, Breslau, Ontario, CA. The orange boxes represent the ten main areas of interest used in the gaze analyses. These include the (1) airspeed, (2) attitude, (3) altimeter, (4) turn coordinator, (5) heading, (6) vertical speed, and (7) power indicators, as well as the (8) front, (9) left, and (10) right windows. This figure was modified from Ayala et al.5. Please click here to view a larger version of this figure.
1. Participant screening and informed consent
2. Hardware/software requirements and start-up
3. Data collection
NOTE: Repeat these steps for each trial. It is recommended that the laptop is placed on the bench outside the cockpit.
4. Data processing and analysis
Term | Definition |
Success (%) | Percentage of successful landing trials |
Completion time (s) | Duration of time from the start of the landing scenario to the plane coming to a complete stop on the runway |
Landing Hardness (fpm) | The rate of descent at the point of touchdown |
Landing Error (°) | The difference between the center of the plane and the center of the 500 ft runway marker at the point of touchdown |
Table 1: Simulator performance outcome variables. Aircraft performance-dependent variables and their definitions.
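As a simple illustration of the units in Table 1, a touchdown descent rate logged by a simulator in m/s can be converted to the landing hardness metric in feet per minute, and the success measure is just the proportion of safe landings. The helper names below are hypothetical, not part of the published protocol:

```python
def landing_hardness_fpm(touchdown_vspeed_mps):
    """Convert touchdown vertical speed (m/s, negative = descending)
    to descent rate in feet per minute (1 ft = 0.3048 m)."""
    return abs(touchdown_vspeed_mps) * 60.0 / 0.3048

def success_rate(outcomes):
    """Percentage of successful landing trials (True = landed safely)."""
    return 100.0 * sum(outcomes) / len(outcomes)
```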
Figure 2: Landing scenario flight path. Schematic of (A) the landing circuit completed in all trials and (B) the runway with the 500 ft markers that were used as the reference point for the landing zone (i.e., center orange circle). Please click here to view a larger version of this figure.
Figure 3: Area of Interest mapping. An illustration of the batch script demonstrating a window for frame selection. The selection of an optimal frame involves choosing a video frame that includes most or all areas of interest to be mapped. Please click here to view a larger version of this figure.
Figure 4: Generating Area of Interest mapping βin-screenβ coordinates. An illustration of the batch script demonstrating a window for βin-screenβ coordinates selection. This step involves the selection of a square/rectangular region that remains visible throughout the recording, is unique to the image, and remains static. Please click here to view a larger version of this figure.
Figure 5: Identifying Area of Interest to be mapped. An illustration of the batch script window that allows for the selection and labelling of areas of interest. Abbreviation: AOIs = areas of interest. Please click here to view a larger version of this figure.
Figure 6: Batch script processing. An illustration of the batch script processing the video and gaze mapping the fixations made throughout the trial. Please click here to view a larger version of this figure.
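Once the batch script has mapped gaze into "in-screen" coordinates, each fixation can be assigned to an AOI by testing its mapped position against the AOI bounding boxes. The sketch below illustrates that assignment step only; the pixel coordinates are invented placeholders, not the actual AOI definitions used in the study:

```python
# Hypothetical AOI bounding boxes in mapped "in-screen" pixel coordinates:
# (x_min, y_min, x_max, y_max)
AOIS = {
    "airspeed": (100, 400, 180, 470),
    "attitude": (190, 400, 270, 470),
    "front_window": (0, 0, 1280, 350),
}

def classify_fixation(x, y, aois=AOIS):
    """Return the label of the first AOI whose box contains (x, y), else None."""
    for label, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return None
```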
Term | Definition |
Dwell time (%) | Percentage of the sum of all fixation durations accumulated over one AOI relative to the sum of fixation durations accumulated over all AOIs |
Average fixation duration (ms) | Average duration of a fixation over one AOI from entry to exit |
Blink rate (blinks/s) | Number of blinks per second |
SGE (bits) | Fixation dispersion |
GTE (bits) | Scanning sequence complexity |
Number of Bouts | Number of cognitive tunneling events (>10 s) |
Total Bout Time (s) | Total time of cognitive tunneling events |
Table 2: Eye tracking outcome variables. Gaze behavior-dependent variables and their definitions.
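Given AOI-labelled fixations with durations, the dwell time percentages and cognitive tunneling bouts in Table 2 can be computed as sketched below. This assumes fixations arrive as time-ordered (AOI label, duration in ms) pairs, which is an assumption about the data format rather than the study's actual pipeline:

```python
def dwell_time_pct(fixations):
    """fixations: time-ordered list of (aoi_label, duration_ms).
    Returns % of total fixation time accumulated on each AOI."""
    totals = {}
    for aoi, dur in fixations:
        totals[aoi] = totals.get(aoi, 0.0) + dur
    grand = sum(totals.values())
    return {aoi: 100.0 * t / grand for aoi, t in totals.items()}

def tunneling_bouts(fixations, threshold_s=10.0):
    """Count runs of consecutive same-AOI fixations whose summed duration
    exceeds threshold_s (cognitive tunneling), and their total time (s)."""
    bouts, total_time = 0, 0.0
    run_aoi, run_dur = None, 0.0
    for aoi, dur in fixations + [(None, 0.0)]:  # sentinel flushes the last run
        if aoi == run_aoi:
            run_dur += dur
        else:
            if run_aoi is not None and run_dur / 1000.0 > threshold_s:
                bouts += 1
                total_time += run_dur / 1000.0
            run_aoi, run_dur = aoi, dur
    return bouts, total_time
```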
The impact of task demands on flight performance
The data were analyzed based on successful landing trials across basic and emergency conditions. All measures were subjected to a paired-samples t-test (within-subject factor: task condition (basic, emergency)). All t-tests were performed with an alpha level set at 0.05. Four participants crashed during the emergency scenario trial and were not included in the main analyses because the sparse data does not allow meaningful conc...
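The within-subject comparison described above reduces to a paired-samples t-test on each measure. In practice one would use a statistics package (e.g., scipy.stats.ttest_rel), but the t statistic itself is straightforward to compute from the per-pilot condition differences, as this stdlib-only sketch shows (the data layout is illustrative):

```python
import math
from statistics import mean, stdev

def paired_t(basic, emergency):
    """Paired-samples t statistic and degrees of freedom for one measure,
    given one (basic, emergency) value pair per pilot."""
    diffs = [b - e for b, e in zip(basic, emergency)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1
```

The resulting |t| is compared against the critical value for n - 1 degrees of freedom at the chosen alpha (0.05 here) to decide significance.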
The eye tracking method described here enables the assessment of information processing in a flight simulator environment via a wearable eye tracker. Assessing the spatial and temporal characteristics of gaze behavior provides insight into human information processing, which has been studied extensively using highly controlled laboratory paradigms4,7,28. Harnessing recent advances in technology allows the generalization of eye t...
No competing financial interests exist.
This work is supported in part by the Canadian Graduate Scholarship (CGS) from the Natural Sciences and Engineering Research Council (NSERC) of Canada, and the Exploration Grant (00753) from the New Frontiers in Research Fund. Any opinions, findings, conclusions, or recommendations expressed in this material are of the author(s) and do not necessarily reflect those of the sponsors.
Name | Company | Catalog Number | Comments |
flight simulator | ALSIM | AL-250 | fixed fully immersive flight simulation training device |
laptop | HP | Lenovo | eye tracking data collection laptop; requirements: Windows 10 and python 3.0 |
portable eye-tracker | AdHawk | MindLink | eye tracking glasses (250 Hz, <2° gaze error, front-facing camera); eye tracking batch script is made available with AdHawk device purchase |
Copyright © 2025 MyJoVE Corporation. All rights reserved