Method Article
Virtual reality (VR) experiments can be difficult to implement and require meticulous planning. This protocol describes a method for the design and implementation of VR experiments that collect physiological data from human participants. The Experiments in Virtual Environments (EVE) framework is employed to accelerate this process.
Virtual reality (VR) experiments are increasingly employed because of their internal and external validity compared to real-world observation and laboratory experiments, respectively. VR is especially useful for geographic visualizations and investigations of spatial behavior. In spatial behavior research, VR provides a platform for studying the relationship between navigation and physiological measures (e.g., skin conductance, heart rate, blood pressure). Specifically, physiological measures allow researchers to address novel questions and constrain previous theories of spatial abilities, strategies, and performance. For example, individual differences in navigation performance may be explained by the extent to which changes in arousal mediate the effects of task difficulty. However, the complexities in the design and implementation of VR experiments can distract experimenters from their primary research goals and introduce irregularities in data collection and analysis. To address these challenges, the Experiments in Virtual Environments (EVE) framework includes standardized modules such as participant training with the control interface, data collection using questionnaires, the synchronization of physiological measurements, and data storage. EVE also provides the necessary infrastructure for data management, visualization, and evaluation. The present paper describes a protocol that employs the EVE framework to conduct navigation experiments in VR with physiological sensors. The protocol lists the steps necessary for recruiting participants, attaching the physiological sensors, administering the experiment using EVE, and assessing the collected data with EVE evaluation tools. Overall, this protocol will facilitate future research by streamlining the design and implementation of VR experiments with physiological sensors.
Understanding how individuals navigate has important implications for several fields, including cognitive science1,2,3, neuroscience4,5, and computer science6,7. Navigation has been investigated in both real and virtual environments. One advantage of real-world experiments is that navigation does not require the mediation of a control interface and thus may produce more realistic spatial behavior. In contrast, virtual reality (VR) experiments allow for more precise measurement of behavioral (e.g., walking trajectories) and physiological (e.g., heart rate) data, as well as more experimental control (i.e., internal validity). In turn, this approach can result in simpler interpretations of the data and thus more robust theories of navigation. In addition, neuroscience can benefit from VR because researchers can investigate the neural correlates of navigation while participants are engaged in the virtual environment but cannot physically move. For computer scientists, navigation in VR requires unique developments in processing power, memory, and computer graphics in order to ensure an immersive experience. Findings from VR experiments can also be applied in architecture and cartography by informing the design of building layouts8 and map features9 to facilitate real-world navigation. Recently, advances in VR technology combined with a dramatic decrease in its cost have led to an increase in the number of laboratories employing VR for their experimental designs. Because of this growing popularity, researchers need to consider how to streamline the implementation of VR applications and standardize the experiment workflow. This approach will help shift resources from implementation to the development of theory and extend the existing capabilities of VR.
VR setups can range from more to less realistic in terms of displays and controls. More realistic VR setups tend to require additional infrastructure such as large tracking spaces and high-resolution displays10. These systems often employ redirected walking algorithms in order to inject imperceptible rotations and translations into the visual feedback provided to users and effectively enlarge the virtual environment through which participants can move11,12. These algorithms can be generalized in that they do not require the knowledge of environmental structure13 or predictive in that they assume particular paths for the user14. Although most research on redirected walking has used head-mounted displays (HMDs), some researchers employ a version of this technique with walking-in-place as part of a large projection system (e.g., CAVEs)15. While HMDs can be carried on the head of the participant, CAVE displays tend to provide a wider horizontal field of view16,17. However, less infrastructure is needed for VR systems using desktop displays18,19. Neuroscientific research has also employed VR systems in combination with functional magnetic resonance imaging (fMRI) during scanning20, in combination with fMRI after scanning21,22, and in combination with electroencephalography (EEG) during recording23,24. Software frameworks are needed in order to coordinate the variety of displays and controls that are used for navigation research.
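The core idea behind the redirected-walking algorithms mentioned above can be illustrated with a minimal sketch: the user's physical head rotation is scaled by a gain that is kept within perceptual detection thresholds, so the virtual environment is effectively enlarged without the manipulation being noticed. The threshold constants below are illustrative placeholders, not the values published in the cited work.

```python
# Sketch of an imperceptible rotation gain for redirected walking.
# Gains are clamped to an assumed detection-threshold range so that the
# injected rotation remains below the user's perceptual threshold.

GAIN_MIN = 0.85  # assumed lower detection threshold (illustrative)
GAIN_MAX = 1.25  # assumed upper detection threshold (illustrative)

def virtual_rotation(physical_delta_deg: float, desired_gain: float) -> float:
    """Scale a physical head rotation (degrees) by a clamped gain."""
    gain = max(GAIN_MIN, min(desired_gain, GAIN_MAX))
    return physical_delta_deg * gain

# A 10-degree physical turn rendered with a 1.2 gain yields a
# 12-degree virtual turn; a requested gain of 2.0 is clamped to 1.25.
print(virtual_rotation(10.0, 1.2))  # 12.0
print(virtual_rotation(10.0, 2.0))  # 12.5
```

Predictive variants would additionally steer the gain toward an assumed future path; the clamping step is what keeps either variant imperceptible.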
Research that incorporates VR and physiological data poses additional challenges such as data acquisition and synchronization. However, physiological data allows for the investigation of implicit processes that may mediate the relationship between navigation potential and spatial behavior. Indeed, the relationship between stress and navigation has been studied using desktop VR and a combination of different physiological sensors (i.e., heart rate, blood pressure, skin conductance, salivary cortisol, and alpha-amylase)25,26,27,28. For example, van Gerven and colleagues29 investigated the impact of stress on navigation strategy and performance using a virtual reality version of a Morris water maze task and several physiological measures (e.g., skin conductance, heart rate, blood pressure). Their results revealed that stress predicted navigation strategy in terms of landmark use (i.e., egocentric versus allocentric) but was not related to navigation performance. In general, findings from previous studies are somewhat inconsistent regarding the effect of stress on navigation performance and spatial memory. This pattern may be attributable to the separation of the stressor (e.g., the cold pressor procedure26, the Star Mirror Tracing Task25) from the actual navigation task, the use of simple maze-like virtual environments (e.g., virtual Morris water maze26, virtual radial arm maze28), and differences in methodological details (e.g., type of stressor, type of physiological data). Differences in the format of collected physiological data can also be problematic for the implementation and analysis of such studies.
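One common reduction step in the studies above is deriving heart rate from ECG. As a minimal sketch, assuming the ECG signal has already been reduced to R-peak timestamps (peak detection itself would be handled by the acquisition software or a signal-processing library), mean heart rate in beats per minute follows from the successive R-R intervals:

```python
# Mean heart rate (bpm) from R-peak timestamps in seconds.
# Assumes peak detection has already been performed upstream.

def mean_heart_rate_bpm(r_peaks_s):
    """Return mean heart rate in bpm from a sorted sequence of R-peak times."""
    if len(r_peaks_s) < 2:
        raise ValueError("need at least two R-peaks")
    rr_intervals = [b - a for a, b in zip(r_peaks_s, r_peaks_s[1:])]
    mean_rr = sum(rr_intervals) / len(rr_intervals)  # seconds per beat
    return 60.0 / mean_rr

# Four peaks spaced 0.8 s apart correspond to 75 bpm.
print(mean_heart_rate_bpm([0.0, 0.8, 1.6, 2.4]))  # 75.0
```

Standardizing such derivations across sensors and studies is one way to reduce the format inconsistencies noted above.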
The Experiments in Virtual Environments (EVE) framework facilitates the design, implementation, and analysis of VR experiments, especially those with additional peripheral devices (e.g., eye trackers, physiological devices)30. The EVE framework is freely available as an open-source project on GitHub (https://cog-ethz.github.io/EVE/). This framework is based on the popular Unity 3D game engine (https://unity3d.com/) and the MySQL database management system (https://www.mysql.com/). Researchers can use the EVE framework in order to prepare the various stages of a VR experiment, including pre- and post-study questionnaires, baseline measurements for any physiological data, training with the control interface, the main navigation task, and tests for spatial memory of the navigated environment (e.g., judgments of relative direction). Experimenters can also control the synchronization of data from different sources and at different levels of aggregation (e.g., across trials, blocks, or sessions). Data sources may be physical (i.e., connected to the user; see Table of Materials) or virtual (i.e., dependent on interactions between the participant's avatar and the virtual environment). For example, an experiment may require recording heart rate and position/orientation from the participant when that participant's avatar moves through a particular area of the virtual environment. All of these data are automatically stored in a MySQL database and evaluated with replay functions and the R package evertools (https://github.com/cog-ethz/evertools/). Evertools provides exporting functions, basic descriptive statistics, and diagnostic tools for distributions of data.
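The synchronization problem EVE addresses can be sketched as aligning a physiological stream (e.g., heart rate samples) with position samples logged at a different rate, here by nearest timestamp. The record layout below is a hypothetical stand-in for the actual MySQL tables, and the nearest-neighbor rule is only one of several alignment policies an experimenter might choose:

```python
# Align two timestamped streams by nearest timestamp.
# Record layouts are hypothetical illustrations, not EVE's actual schema.
import bisect

def align_nearest(phys, positions):
    """For each physiological sample (t, value), attach the position
    sample (x, y) whose timestamp is closest. Both inputs are sorted."""
    pos_times = [t for t, _ in positions]
    aligned = []
    for t, value in phys:
        i = bisect.bisect_left(pos_times, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(positions)]
        j = min(candidates, key=lambda k: abs(pos_times[k] - t))
        aligned.append((t, value, positions[j][1]))
    return aligned

hr = [(0.0, 72), (1.0, 75)]                      # heart rate stream
xy = [(0.1, (0, 0)), (0.9, (2, 1)), (1.4, (3, 1))]  # position stream
print(align_nearest(hr, xy))
# [(0.0, 72, (0, 0)), (1.0, 75, (2, 1))]
```

Whatever the policy, performing this alignment inside the framework rather than per-study is what keeps the resulting datasets comparable across experiments.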
The EVE framework may be deployed with a variety of physical infrastructures and VR systems. In the present protocol, we describe one particular implementation at the NeuroLab at ETH Zürich (Figure 1). The NeuroLab is a 12 m by 6 m room containing an isolated chamber for conducting EEG experiments, a cubicle containing the VR system (2.6 m x 2.0 m), and a curtained area for attaching physiological sensors. The VR system includes a 55" ultra-high definition television display, a high-end gaming computer, a joystick control interface, and several physiological sensors (see Table of Materials). In the following sections, we describe the protocol for conducting a navigation experiment in the NeuroLab using the EVE framework and physiological sensors, present representative results from one study on stress and navigation, and discuss the opportunities and challenges associated with this system.
The following protocol was conducted in accordance with guidelines approved by the Ethics Commission of ETH Zürich as part of the proposal EK 2013-N-73.
1. Recruit and Prepare Participants
2. Prepare the Experiment and Physiological Devices Using EVE
3. Experimental Procedure
4. After Each Experimental Session
From each participant in the NeuroLab, we typically collect physiological data (e.g., ECG), questionnaire data (e.g., the Santa Barbara Sense of Direction Scale or SBSOD31), and navigation data (e.g., paths through the virtual environment). For example, changes in heart rate (derived from ECG data) have been associated with changes in stress states in combination with other physiological32 and self-report measures…
In the present paper, we described a protocol for conducting experiments in VR with physiological devices using the EVE framework. These types of experiments are unique because of additional hardware considerations (e.g., physiological devices and other peripherals), the preparatory steps for collecting physiological data using VR, and data management requirements. The present protocol provides the necessary steps for experimenters that intend to collect data from multiple peripherals simultaneously. For example...
The authors have nothing to disclose.
The virtual environment was kindly provided by VIS Games (http://www.vis-games.de) to conduct research in virtual reality.
Name | Company | Catalog Number | Comments |
Alienware Area 51 Base | Dell | 210-ADHC | Computation |
138 cm 4K Ultra-HD LED-TV | Samsung | UE55JU6470U | Display |
SureSigns VS2+ | Philips Healthcare | 863278 | Blood Pressure |
PowerLab 8/35 | AD Instruments | PL3508 | Skin Conductance |
PowerLab 26T (LTS) | AD Instruments | ML4856 | Heart Rate |
Extreme 3D Pro Joystick | Logitech | 963290-0403 | HID |