Method Article
Here, we present a simplified open-source hardware and software setup for investigating mouse spatial learning using virtual reality (VR). This system displays a virtual linear track to a head-restrained mouse running on a wheel by utilizing a network of microcontrollers and a single-board computer running an easy-to-use Python graphical software package.
Head-restrained behavioral experiments in mice allow neuroscientists to observe neural circuit activity with high-resolution electrophysiological and optical imaging tools while delivering precise sensory stimuli to a behaving animal. Recent human and rodent studies have shown virtual reality (VR) environments to be an important tool for uncovering the neural mechanisms underlying spatial learning in the hippocampus and cortex, because they afford extremely precise control over parameters such as spatial and contextual cues. Setting up virtual environments for rodent spatial behaviors can, however, be costly and require an extensive background in engineering and computer programming. Here, we present a simple yet powerful system based on inexpensive, modular, open-source hardware and software that enables researchers to study spatial learning in head-restrained mice using a VR environment. This system uses coupled microcontrollers to measure locomotion and deliver behavioral stimuli while head-restrained mice run on a wheel in concert with a virtual linear track environment rendered by a graphical software package running on a single-board computer. The emphasis on distributed processing allows researchers to design flexible, modular systems to elicit and measure complex spatial behaviors in mice, in order to determine the connection between neural circuit activity and spatial learning in the mammalian brain.
Spatial navigation is an ethologically important behavior by which animals encode the features of new locations into a cognitive map, which is used for finding areas of possible reward and avoiding areas of potential danger. Inextricably linked with memory, the cognitive processes underlying spatial navigation share a neural substrate in the hippocampus1 and cortex, where neural circuits integrate incoming information and form cognitive maps of environments and events for later recall2. While the discovery of place cells in the hippocampus3,4 and grid cells in the entorhinal cortex5 has shed light on how the cognitive map within the hippocampus is formed, many questions remain about how specific neural subtypes, microcircuits, and individual subregions of the hippocampus (the dentate gyrus and cornu ammonis areas CA3-CA1) interact and participate in spatial memory formation and recall.
In vivo two-photon imaging has been a useful tool in uncovering cellular and population dynamics in sensory neurophysiology6,7; however, the typical necessity for head restraint limits the utility of this method for examining mammalian spatial behavior. The advent of virtual reality (VR)8 has addressed this shortcoming by presenting immersive and realistic visuospatial environments while head-restrained mice run on a ball or treadmill to study spatial and contextual encoding in the hippocampus8,9,10 and cortex11. Furthermore, the use of VR environments with behaving mice has allowed neuroscience researchers to dissect the components of spatial behavior by precisely controlling the elements of the VR environment12 (e.g., visual flow, contextual modulation) in ways not possible in real-world experiments of spatial learning, such as the Morris water maze, Barnes maze, or hole board tasks.
Visual VR environments are typically rendered on the graphical processing unit (GPU) of a computer, which handles the load of rapidly computing the thousands of polygons necessary to model a moving 3D environment on a screen in real time. These large processing demands generally require a separate PC with a GPU that renders the visual environment to a monitor, multiple screens13, or a projector14, while movement is recorded from a treadmill, wheel, or foam ball under the animal. The resulting apparatus for controlling, rendering, and projecting the VR environment is, therefore, relatively expensive, bulky, and cumbersome. Furthermore, many such environments in the literature have been implemented using proprietary software that is costly and can only be run on a dedicated PC.
For these reasons, we have designed an open-source VR system to study spatial learning behaviors in head-restrained mice using a Raspberry Pi single-board computer. This Linux computer is both small and inexpensive yet contains a GPU chip for 3D rendering, allowing the integration of VR environments with the display or behavioral apparatus in varied individual setups. Furthermore, we have developed a graphical software package written in Python, "HallPassVR", which uses the single-board computer to render a simple visuospatial environment, a virtual linear track or hallway, by recombining custom visual features selected with a graphical user interface (GUI). This is combined with microcontroller subsystems (e.g., ESP32 or Arduino) that measure locomotion and coordinate behavior, for example, by delivering other modalities of sensory stimuli or rewards to facilitate reinforcement learning. This system provides an inexpensive, flexible, and easy-to-use alternative method for delivering visuospatial VR environments to head-restrained mice during two-photon imaging (or other techniques requiring head fixation) for studying the neural circuits underlying spatial learning behavior.
All procedures in this protocol were approved by the Institutional Animal Care and Use Committee of the New York State Psychiatric Institute.
NOTE: A single-board computer is used to display a VR visual environment coordinated with the running of a head-restrained mouse on a wheel. Movement information is received as serial input from an ESP32 microcontroller reading a rotary encoder coupled to the wheel axle. The VR environment is rendered using OpenGL hardware acceleration on the Raspberry Pi GPU, which utilizes the pi3d Python 3D package for Raspberry Pi. The rendered environment is then output via a projector onto a compact wraparound parabolic screen centered on the head-restrained mouse's visual field15,16, while behavior (e.g., licking in response to spatial rewards) is measured by a second, behavior ESP32 microcontroller. Using a graphical user interface (GUI), the graphical software package enables the creation of virtual linear track environments consisting of repeated patterns of visual stimuli along a virtual corridor or hallway. This design is easily parameterized, thus allowing the creation of complex experiments aimed at understanding how the brain encodes places and visual cues during spatial learning (see section 4). Designs for the custom hardware components necessary for this system (i.e., the running wheel, projection screen, and head-restraint apparatus) are deposited in a public GitHub repository (https://github.com/GergelyTuri/HallPassVR). It is recommended to read that repository's documentation along with this protocol, as the site will be updated with future enhancements of the system.
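For illustration, a minimal pi3d sketch of the kind of rendering loop this entails is shown below. It is not the HallPassVR code: the texture file name, corridor dimensions, and the constant position increment are placeholders.

```python
# Minimal illustration of pi3d rendering a textured "hallway" wall that
# scrolls with the animal's track position. NOT the HallPassVR source;
# the texture file and dimensions below are placeholders.
import pi3d

DISPLAY = pi3d.Display.create(w=1280, h=720, frames_per_second=60)
shader = pi3d.Shader("uv_flat")            # flat textured shader
tex = pi3d.Texture("wall_pattern.png")     # placeholder wall cue image

# One wall segment of the virtual corridor: 100 units long, 10 high
wall = pi3d.Plane(w=100.0, h=10.0, z=20.0)
wall.set_draw_details(shader, [tex])

position_cm = 0.0                          # would come from the encoder
keys = pi3d.Keyboard()
while DISPLAY.loop_running():
    position_cm += 0.5                     # placeholder: constant running speed
    # Slide the wall past the fixed camera to simulate forward motion
    wall.position(50.0 - position_cm, 0.0, 20.0)
    wall.draw()
    if keys.read() == 27:                  # Esc quits
        keys.close()
        DISPLAY.destroy()
        break
```

In the actual package, the position variable is driven by the encoder counts streamed over serial from the rotary ESP32 rather than by a constant increment.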
1. Hardware setup: Construction of the running wheel, projection screen, and head-fixation apparatus
NOTE: The custom components for these setups can be easily manufactured if the user has access to 3D-printing and laser-cutting equipment, or they may be outsourced to professional manufacturing or 3D prototyping services (e.g., eMachineShop). All the design files are provided as .STL 3D files or .DXF AutoCAD files.
2. Setup of the electronics hardware/software (single board computer, ESP32 microcontrollers, Figure 2)
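As a quick sanity check of the serial link between the rotary encoder ESP32 and the single-board computer, the encoder stream can be read on the Raspberry Pi with a few lines of Python using pyserial. The port name, baud rate, and the assumption that the ESP32 prints one count per line are illustrative only; match them to the actual rotary encoder sketch in the repository.

```python
# Quick check of the rotary-encoder ESP32 serial stream from the
# Raspberry Pi. Assumes pyserial is installed and that the ESP32
# prints one encoder count per line; port and baud are placeholders.
import serial

PORT = "/dev/ttyUSB0"   # placeholder; check `ls /dev/ttyUSB*`
BAUD = 115200           # must match Serial.begin() in the ESP32 code

with serial.Serial(PORT, BAUD, timeout=1) as ser:
    for _ in range(100):                 # read ~100 samples, then stop
        line = ser.readline().decode(errors="ignore").strip()
        if line:
            print("encoder count:", line)
```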
3. Running and testing the graphical software package
NOTE: Run the graphical software package GUI to initiate a VR linear track environment, calibrate the distances on the VR software and behavior ESP32 code, and test the acquisition and online plotting of the mouse's running and licking behavior with the included Processing language sketch.
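A central step in this calibration is converting encoder counts to centimeters of travel. Assuming 4x quadrature decoding of the 256-step encoder and the 6 in diameter wheel from the materials table (both assumptions should be checked against the actual hardware and firmware), the conversion works out as follows:

```python
# Illustrative tick-to-distance conversion for calibration, assuming
# the 256-step quadrature encoder and 6" diameter wheel from the
# materials table, decoded at 4x (1024 counts per revolution).
import math

STEPS_PER_REV = 256          # encoder resolution (from materials table)
QUAD_FACTOR = 4              # assumption: 4x quadrature decoding
WHEEL_DIAM_CM = 6 * 2.54     # 6 in wheel diameter -> 15.24 cm

counts_per_rev = STEPS_PER_REV * QUAD_FACTOR        # 1024
circumference_cm = math.pi * WHEEL_DIAM_CM          # ~47.88 cm
cm_per_count = circumference_cm / counts_per_rev    # ~0.0468 cm

print(f"{cm_per_count:.4f} cm of track per encoder count")
```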
4. Mouse training and spatial learning behavior
NOTE: The mice are implanted for head fixation, habituated to head restraint, and then progressively trained to run on the wheel and lick consistently for liquid rewards ("random foraging"). Mice that achieve consistent running and licking are then trained on a spatial hidden reward task using the VR environment, in which a single reward zone is presented following a visual cue on the virtual linear track. Spatial learning is then measured as increased licking selectivity for positions immediately prior to the reward zone.
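One plausible way to quantify such selectivity is the fraction of licks falling in an anticipatory window just before the reward zone. The sketch below is illustrative only (the repository's analysis code is written in MATLAB); the 10 cm window, zone position, and data format are assumptions.

```python
# One plausible lick-selectivity metric: the fraction of licks that
# fall within an anticipatory window just before the reward zone.
# Zone position, window width, and data format are illustrative.
def lick_selectivity(lick_positions_cm, reward_start_cm, window_cm=10.0):
    """Fraction of licks within `window_cm` before the reward zone."""
    if not lick_positions_cm:
        return 0.0
    in_window = [p for p in lick_positions_cm
                 if reward_start_cm - window_cm <= p < reward_start_cm]
    return len(in_window) / len(lick_positions_cm)

# Example: reward zone starts at 150 cm on the virtual track
licks = [12.0, 88.5, 141.0, 144.2, 147.9, 150.5]
print(f"selectivity = {lick_selectivity(licks, 150.0):.2f}")  # 0.50
```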
This open-source virtual reality behavioral setup allowed us to quantify licking behavior as a read-out of spatial learning as head-restrained mice navigated a virtual linear track environment. Seven C57BL/6 mice of both sexes at 4 months of age were placed on a restricted water schedule and first trained to lick continuously at low levels while running on the wheel for random spatial rewards ("random foraging") without VR. Although their performance was initially affected when moved to the VR projection screen s...
This open-source VR system for mice will only function if the serial connections are made properly between the rotary and behavior ESP32 microcontrollers and the single-board computer (step 2), which can be confirmed using the IDE serial monitor (step 2.4.5). For successful behavioral results from this protocol (step 4), the mice must be habituated to the apparatus and be comfortable running on the wheel for liquid rewards (steps 4.3-4.5). This requires sufficient (but not excessive) water restriction, as mice given ...
Clay Lacefield is the founder and maintainer of OpenMaze.org, which provides designs for the OMwSmall PCB used in this protocol free for download.
We would like to thank Noah Pettit from the Harvey lab for the discussion and suggestions while developing the protocol in this manuscript. This work was supported by a BBRF Young Investigator Award and NIMH 1R21MH122965 (G.F.T.), in addition to NINDS R56NS128177 (R.H., C.L.) and NIMH R01MH068542 (R.H.).
Name | Company | Catalog Number | Comments |
1/4 " diam aluminum rod | McMaster-Carr | 9062K26 | 3" in length for wheel axle |
1/4"-20 cap screws, 3/4" long (x2) | Amazon.com | B09ZNMR41V | for affixing head post holders to optical posts |
2"x7" T-slotted aluminum bar (x2) | 8020.net | 1020 | wheel/animal mounting frame |
6" diam, 3" wide acrylic cylinder (1/8" thick) | Canal Plastics | 33210090702 | Running wheel (custom width cut at canalplastics.com) |
8-32 x 1/2" socket head screws | McMaster-Carr | 92196A194 | fastening head post holder to optical post |
Adjustable arm (14") | Amazon.com | B087BZGKSL | to hold/adjust lick spout |
Analysis code (MATLAB) | custom written | | file at github.com/GergelyTuri/HallPassVR/software/Analysis code |
Axle mounting flange, 1/4" ID | Pololu | 1993 | for mounting wheel to axle |
Ball bearing (5/8" OD, 1/4" ID, x2) | McMaster-Carr | 57155K324 | for mounting wheel axle to frame |
Behavior ESP32 code | custom written | | file at github.com/GergelyTuri/HallPassVR/software/Arduino code/Behavior board |
Black opaque matte acrylic sheets (1/4" thick) | Canal Plastics | 32918353422 | laser cut file at github.com/GergelyTuri/HallPassVR/hardware/VR screen assembly |
Clear acrylic sheet (1/4" thick) | Canal Plastics | 32920770574 | laser cut file at github.com/GergelyTuri/HallPassVR/hardware/VR wheel assembly |
ESP32 devKitC v4 (x2) | Amazon.com | B086YS4Z3F | microcontroller for behavior and rotary encoder |
ESP32 shield | OpenMaze.org | OMwSmall | description at www.openmaze.org (https://claylacefield.wixsite.com/openmazehome/copy-of-om2shield). ZIP gerber files at: https://github.com/claylacefield/OpenMaze/tree/master/OM_PCBs |
Fasteners and brackets | 8020.net | 4138, 3382, 3280 | for wheel frame mounts |
Goniometers | Edmund Optics | 66-526, 66-527 | optional for behavior; fine-tuning of head angle for imaging |
HallPassVR Python code | custom written | | file at github.com/GergelyTuri/HallPassVR/software/HallPassVR |
Head post holder | custom design | | 3D design file at github.com/GergelyTuri/HallPassVR/hardware/VR head mount/Headpost Clamp |
LED projector | Texas Instruments | DLPDLCR230NPEVM | or other small LED projector |
Lick spout | VWR | 20068-638 | (or ~16 G metal hypodermic tubing) |
M2.5 x 6 set screws | McMaster-Carr | 92015A097 | securing head post |
Matte white diffusion paper | Amazon.com | | screen material |
Metal headposts | custom design | | 3D design file at github.com/GergelyTuri/HallPassVR/hardware/VR head mount/head post designs |
Miscellaneous tubing and tubing adapters (1/16" ID) | | | for constructing the water line |
Optical breadboard | Thorlabs | | as per user's requirements |
Optical posts, 1/2" diam (2x) | Thorlabs | TR4 | for head fixation setup |
Processing code | custom written | | file at github.com/GergelyTuri/HallPassVR/software/Processing code |
Raspberry Pi 4B | raspberry.com, adafruit.com | | single-board computer for rendering of the HallPassVR environment |
Right angle clamp | Thorlabs | RA90 | for head fixation setup |
Rotary encoder (quadrature, 256 step) | DigiKey | ENS1J-B28-L00256L | to measure wheel rotation |
Rotary encoder ESP32 code | custom written | | file at github.com/GergelyTuri/HallPassVR/software/Arduino code/Rotary encoder |
SCIGRIP 10315 acrylic cement | Amazon.com | | |
Shaft coupler | McMaster-Carr | 9861T426 | to couple rotary encoder shaft with axle |
Silver mirror acrylic sheets | Canal Plastics | 32913817934 | laser cut file at github.com/GergelyTuri/HallPassVR/hardware/VR screen assembly |
Solenoid valve | Parker | 003-0137-900 | to administer water rewards |