Method Article
VisualEyes2020 (VE2020) is a custom scripting language that presents, records, and synchronizes visual stimuli for eye movements. VE2020 provides stimuli for conjugate eye movements (saccades and smooth pursuit), disconjugate eye movements (vergence), accommodation, and combinations of each. Two analysis programs unify the data processing from the eye tracking and accommodation recording systems.
Through the purposeful stimulation and recording of eye movements, the fundamental characteristics of the underlying neural mechanisms of eye movements can be observed. VisualEyes2020 (VE2020) was developed to address the lack of customizable, software-based visual stimulation available to researchers that does not rely on motors or actuators within a traditional haploscope. This new instrument and methodology were developed for a novel haploscope configuration utilizing both eye tracking and autorefractor systems. Analysis software that enables the synchronized analysis of eye movement and accommodative responses provides vision researchers and clinicians with a reproducible environment and a shareable tool. The Vision and Neural Engineering Laboratory's (VNEL) Eye Movement Analysis Program (VEMAP) was established to process recordings produced by VE2020's eye trackers, while the Accommodative Movement Analysis Program (AMAP) was created to process the recording outputs of the corresponding autorefractor system. The VNEL studies three primary stimuli: accommodation (blur-driven changes in the convexity of the intraocular lens), vergence (inward, convergent rotation and outward, divergent rotation of the eyes), and saccades (conjugate eye movements). The VEMAP and AMAP utilize similar data flow processes, with manual operator interactions and interventions where necessary; however, these analysis platforms advance the establishment of an objective software suite that minimizes operator reliance. The graphical interface and its corresponding algorithms allow a broad range of visual experiments to be conducted with minimal prior coding experience required of the operator(s).
Concerted binocular coordination and appropriate accommodative and oculomotor responses to visual stimuli are crucial aspects of daily life. When an individual has a reduced convergence eye movement response speed, quantified through eye movement recording, double vision (diplopia) may be perceived1,2. Furthermore, a Cochrane literature meta-analysis reported that patients with oculomotor dysfunctions, while attempting to maintain normal binocular vision, commonly experience shared visual symptoms, including blurred/double vision, headaches, eye stress/strain, and difficulty reading comfortably3. Rapid conjugate eye movements (saccades), when deficient, can undershoot or overshoot visual targets, so that additional sequential saccades are required to correct the error4. These oculomotor responses can also be confounded by the accommodative system, in which the improper focusing of light by the lens creates blur5.
Tasks such as reading or working on electronic devices demand coordination of the oculomotor and accommodative systems. For individuals with binocular eye movement or accommodative dysfunctions, the inability to maintain fused (single) and focused (clear) vision diminishes their quality of life and overall productivity. By establishing a procedural methodology for quantitatively recording these systems, independently and concertedly, through repeatable instrumentation configurations and objective analysis, distinguishing characteristics of how individuals acclimate to specific deficiencies can be understood. Quantitative measurements of eye movements can lead to more comprehensive diagnoses6 compared to conventional methods, with the potential to predict the probability of remediation via therapeutic interventions. This instrumentation and data analysis suite provides insight into the mechanisms behind current standards of care, such as vision therapy, and the long-term effects therapeutic intervention(s) may have on patients. Establishing these quantitative differences between individuals with and without normal binocular vision may enable novel personalized therapeutic strategies and heighten remediation effectiveness based on objective outcome measurements.
To date, there is no single commercially available platform that can simultaneously stimulate and quantitatively record eye movements with the corresponding accommodative positional and velocity responses, processed as separate (eye movement and accommodative) data streams. Signal processing analyses of accommodative and oculomotor positional and velocity responses have established a minimum sampling requirement of approximately 10 Hz7 for the former and a suggested sampling rate between 240 Hz and 250 Hz for saccadic eye movements8,9. However, the Nyquist rate for vergence eye movements has yet to be established, though vergence is about an order of magnitude lower in peak velocity than saccadic eye movements. Nonetheless, there is a gap in the current literature regarding the integration of eye movement recording and auto-refractive instrumentation platforms. Furthermore, the ability to analyze objective eye movement responses together with synchronous accommodation responses has not yet been open-sourced. Hence, the Vision and Neural Engineering Laboratory (VNEL) addressed the need for synchronized instrumentation and analysis through the creation of VE2020 and two offline signal processing suites for analyzing eye movements and accommodative responses. VE2020 is customizable via calibration procedures and stimulation protocols for adaptation to a variety of applications, from basic science to clinical, including binocular vision research on convergence insufficiency/excess, divergence insufficiency/excess, accommodative insufficiency/excess, concussion-related binocular dysfunctions, strabismus, amblyopia, and nystagmus. VE2020 is complemented by the VEMAP and AMAP, which provide data analysis capabilities for these stimulated eye and accommodative movements.
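As a worked illustration of these sampling requirements, the Nyquist criterion states that a signal band-limited to $f_{\max}$ must be sampled at $f_s \ge 2 f_{\max}$. Assuming the band limits implied by the cited rates (the sources do not state them explicitly), accommodative responses with content below roughly 5 Hz are captured at $2 \times 5 = 10$ Hz, while saccadic dynamics extending toward roughly 120 Hz motivate the suggested $2 \times 120 = 240$ Hz.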
The study for which this instrumentation and data analysis suite was created and successfully implemented was approved by the New Jersey Institute of Technology Institutional Review Board (HHS FWA 00003246, Approval F182-13) and registered as a randomized clinical trial on ClinicalTrials.gov (Identifier: NCT03593031), funded via NIH EY023261. All participants read and signed an informed consent form approved by the university's Institutional Review Board.
1. Instrumentation setup
Figure 1: Haploscope control and recording equipment configuration. Example of the VE2020's display indexing for clockwise monitor ordering and dimensioning. Here, 1 is the control monitor, 2 is the near-left display monitor, 3 is the far-left display monitor, 6 is the calibration board (CalBoard), 4 is the far-right display monitor, and 5 is the near-right display monitor.
Table 1: BNC port map. The convention for BNC connections.
Figure 2: Breakout box switch references. Demonstration of the proper NI 2090A switch positions.
2. Visual stimulation utilizing the VE2020 visual displays and VE2020 LED targets
Figure 3: Stimulated degrees to monitor pixels. Depiction of the operator view for calibrating the VE2020. From left to right, a table of values for the recorded pixels corresponding to a known degree value is provided for a given stimulus monitor selection (stretch mode ID) with a fixed aspect ratio, given file name, background stimulus (BG), and foreground stimulus (Line).
Figure 4: Pixel to degree calibration slopes. Monocular calibration curve for known degree values and measured pixel values.
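The slopes in Figure 4 reduce to a linear fit of recorded pixel positions against known stimulus angles. A minimal MATLAB sketch of such a fit follows, with hypothetical pixel values; VE2020 performs the equivalent computation within its calibration view.

% Fit the monocular degree-to-pixel calibration from operator-entered pairs.
stimDeg = [-4 -2 0 2 4];             % known stimulus angles (degrees)
measPx  = [312 468 630 795 958];     % recorded pixel positions (hypothetical)
p = polyfit(stimDeg, measPx, 1);     % p(1) = pixels per degree, p(2) = pixel offset
targetPx = polyval(p, 1.5);          % screen pixel for a 1.5 degree target
fprintf('%.1f px/deg, offset %.1f px; 1.5 deg -> pixel %.0f\n', p(1), p(2), targetPx);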
3. LED calibration
Figure 5: Calculated degrees of rotation. Method of calculating the angular displacement for both saccadic eye movements and vergence movements with a known distance to the target (X) and inter-pupillary distance (IPD).
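The geometry of Figure 5 can be made concrete with a short MATLAB computation; the IPD, distances, and symmetrical-target assumption below are illustrative rather than the protocol's actual values.

% Angular demands from target distance (X) and inter-pupillary distance (IPD).
ipd = 6.0;                            % inter-pupillary distance (cm), hypothetical
x   = 40.0;                           % distance from the eyes to the target (cm)
vergenceDeg = 2 * atand((ipd/2) / x); % symmetrical vergence demand: each eye rotates atand((IPD/2)/X)
d = 10.0;                             % lateral offset of a saccade target (cm), hypothetical
saccadeDeg  = atand(d / x);           % conjugate rotation toward the offset target
fprintf('Vergence demand %.2f deg; saccade amplitude %.2f deg\n', vergenceDeg, saccadeDeg);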
4. Software programming
5. DC files
Table 2: DC file configuration. The table provides an overview of the DC text file format.
6. LED input file definition and stimulus library storage
Figure 6: Stimulus library. The format shown, editable with any text-editing software, identifies the port communications, baud rate, data size, and parity, as well as the library of stimulus files (.vei), providing VE2020 with the configurations and stimulus file names it needs to run successfully.
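The port, baud rate, data size, and parity fields in this file describe an ordinary serial link to the LED controller. A minimal MATLAB (R2019b+) sketch of such a connection follows; "COM3", the 9600-8-N-1 settings, and the command string are placeholders, not VE2020's actual values.

% Open the LED controller's serial port with the parameters declared in the
% stimulus library file, then issue a hypothetical command.
led = serialport("COM3", 9600, "DataBits", 8, "Parity", "none");
writeline(led, "LED 1 ON");          % placeholder command; real stimuli come from .vei files
clear led                            % clearing the object closes the port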
7. Script creation for experimental protocols
Table 3: VE2020 function syntax. VE2020 has specific syntax, as demonstrated in the table, for calling embedded functions and commenting.
8. Participant preparation and experiment initiation
9. VNEL Eye Movement Analysis Program (VEMAP)
Figure 7: Monocular calibration and correlation slopes. An example of the calibration of eye movement data from voltage values to degrees of rotation.
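The calibration in Figure 7 reduces to a linear fit of known fixation angles against the mean voltage recorded at each fixation; the VEMAP performs this internally. A minimal MATLAB sketch with hypothetical values:

% Fit the monocular voltage-to-degree calibration, then convert a raw trace.
calDeg   = [-4 -2 0 2 4];                 % known fixation angles (degrees)
calVolts = [-1.10 -0.52 0.01 0.55 1.13];  % mean recorded voltage per fixation (hypothetical)
p = polyfit(calVolts, calDeg, 1);         % p(1) = degrees per volt, p(2) = offset
S = load('leftEye.mat');                  % hypothetical file holding a raw voltage trace S.v
leftEyeDeg = polyval(p, S.v);             % voltage trace converted to degrees of rotation
fs = 500;                                 % assumed eye tracker sampling rate (Hz)
t = (0:numel(leftEyeDeg)-1) / fs;         % time base (s)
plot(t, leftEyeDeg); xlabel('Time (s)'); ylabel('Position (deg)');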
Figure 8: Eye movement software classification. Classification of the stimulated eye movement responses.
Figure 9: Eye movement response software analysis. An example of plotted convergence responses stimulated by a 4° symmetrical step change (right), with individual eye movement response metrics tabulated (left) and group-level statistics displayed below them.
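The response metrics tabulated in Figure 9 can be derived from a calibrated position trace via first-order differentiation. A minimal MATLAB sketch on a synthetic 4° convergence step follows; the sampling rate, the logistic response shape, and the 10%-of-peak latency criterion are illustrative assumptions, not the VEMAP's published algorithm.

% Compute velocity and simple response metrics for one convergence trace.
fs  = 500;                                % sampling rate (Hz), assumed
t   = (0:1/fs:2)';                        % 2 s trial time base
pos = 4 ./ (1 + exp(-(t - 0.6) * 12));    % synthetic 4-deg convergence step response
vel = gradient(pos, 1/fs);                % first-order (central-difference) velocity
[peakVel, iPk] = max(vel);                % peak velocity and its sample index
latency   = t(find(vel > 0.1 * peakVel, 1)); % first crossing of 10% of peak velocity
amplitude = pos(end) - pos(1);            % response amplitude (deg)
fprintf('Amp %.2f deg; peak vel %.1f deg/s at %.3f s; latency %.3f s\n', ...
        amplitude, peakVel, t(iPk), latency);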
10. Accommodative Movement Analysis Program (AMAP)
Figure 10: AMAP software frontend. The figure displays the main user interface for the AMAP with highlighted sections for the graphical presentation (graphical options) of data and data analysis (metric modifications).
Group-level ensemble plots of stimulated eye movements evoked by VE2020 are depicted in Figure 11 with the corresponding first-order velocity characteristics.
Figure 11: Eye movement response ensembles. The ensemble plots of vergence steps (left) and saccades (right) stimulated using the VE2020.
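Ensemble plots like those in Figure 11 are built by averaging stimulus-aligned, calibrated responses. A minimal MATLAB sketch on synthetic trials follows; the trial count, noise level, and response shape are illustrative assumptions.

% Ensemble mean +/- SD of aligned responses, with first-order ensemble velocity.
fs = 500;  t = (0:1/fs:2)';                       % assumed rate and 2 s trial window
base = 4 ./ (1 + exp(-(t - 0.6) * 12));           % idealized 4-deg vergence step
responses = repmat(base, 1, 20) + 0.1 * randn(numel(t), 20); % 20 noisy aligned trials
ensMean = mean(responses, 2);                     % ensemble mean position
ensSD   = std(responses, 0, 2);                   % across-trial standard deviation
ensVel  = gradient(ensMean, 1/fs);                % first-order velocity of the ensemble
plot(t, ensMean, 'k', t, ensMean + ensSD, 'k--', t, ensMean - ensSD, 'k--');
xlabel('Time (s)'); ylabel('Vergence (deg)');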
Applications of the method in research
Innovations from the initial VisualEyes2020 (VE2020) software include its expansibility to project one or several visual stimuli onto multiple monitors, which allows the investigation of scientific questions ranging from the quantification of the Maddox components of vergence18 to the influence of distracting targets on instructed targets19. The expansion of the haploscope system to VE2020 alon...
The authors have no conflicts of interest to declare.
This research was supported by National Institutes of Health grant R01EY023261 to T.L.A. and a Barry Goldwater Scholarship and NJIT Provost Doctoral Award to S.N.F.
Name | Company | Catalog Number | Comments |
Analog Terminal Breakout Box | National Instruments | 2090A | |
Convex-Sphere Trial Lens Set | Reichert | Portable Precision Lenses | Utilized for autorefractor calibration |
Graphics Cards | - | - | Minimum performance requirement of GTX980 in SLI configuration |
ISCAN Eye Tracker | ISCAN | ETL200 | |
MATLAB | MathWorks | v2022a | AMAP software requirement
MATLAB | MathWorks | v2015a | VEMAP software requirement |
Microsoft Windows 10 | Microsoft | Windows 10 | Required OS for VE2020 |
Plusoptix PowerRef3 Autorefractor | Plusoptix | PowerRef3 | |
Stimuli Monitors (Quantity: 4+) | Dell | Resolution 1920x1080 | All monitors should be the same brand and model to avoid differences in resolution and physical configuration
Copyright © 2025 MyJoVE Corporation. All rights reserved.