January 19th, 2022
Transcript
Sensory associative learning paradigms have revealed a great deal about the function of the cerebellum and its relationship to the rest of the brain and to behavior. Using the mouse as a model organism, we present a protocol with sufficient detail for laboratories around the world to implement these powerful methods effectively and inexpensively. Our protocol will allow researchers to set up multiple cerebellum-dependent behaviors.
The core functions of our research platform can be readily modified to fit a particular research question. To begin, connect the camera serial interface cable to the camera, and the camera port on the SBC. Then download the operating system for the SBC onto the host computer, and create an empty file called ssh on the microSD card.
Eject the microSD card from the host machine. Insert it into the SBC microSD card slot, and power the SBC. Next, to prepare the SBC to accept a wired connection to the host, open a terminal, type the command ifconfig, and record the Ethernet IP address of the SBC.
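The headless-setup steps above can be sketched in the shell. The boot-partition mount point shown is an assumption for a typical Linux host; adjust it to wherever your system mounts the microSD card.

```shell
# On the host machine: an empty file named "ssh" on the microSD boot
# partition enables the SSH server when the SBC first boots.
BOOT="/media/$USER/boot"   # assumed mount point; adjust to your system
touch "$BOOT/ssh"

# Later, in a terminal on the SBC itself, record the Ethernet IP address:
ifconfig eth0
```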
Then go to the Interfaces tab of the Raspberry Pi Configuration settings, and enable the options for camera, SSH, and VNC. To establish the wired connection, connect Ethernet cables to the Ethernet ports on the SBC and the host computer, and attach the other ends of these cables to an Ethernet switch. Then, using a virtual network computing client, such as VNC, access the desktop using the SBC IP address and the default authentication.
Next, download the required software and necessary Python libraries to the SBC. To allow direct control over the micro-controller, download the micro-controller integrated development environment, or IDE software. Then open an SBC terminal window, navigate to the Downloads directory, and install the IDE.
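The download-and-install step on the SBC might look like the following; the archive name is a placeholder for whichever IDE release you actually downloaded.

```shell
cd ~/Downloads
# Archive name is a placeholder; use the IDE release you downloaded.
tar -xf arduino-*.tar.xz
cd arduino-*/
sudo ./install.sh
```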
After opening the micro-controller IDE, select Tools, followed by Manage Libraries, and install the Encoder library from Paul Stoffregen. Finally, insert a thumb drive into a USB port on the SBC, and enter the commands for mounting the USB external storage device. After connecting the SBC to the programming port of the micro-controller, open the download sketch with the micro-controller IDE, and upload it to the micro-controller.
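Mounting the thumb drive from an SBC terminal could be done roughly as follows; the device node /dev/sda1 and the mount point /media/usb are assumptions, so check lsblk for the names on your system.

```shell
# Identify the thumb drive first (the device node below is an assumption):
lsblk
# Create a mount point and mount the USB external storage device.
sudo mkdir -p /media/usb
sudo mount /dev/sda1 /media/usb
```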
Next, download and install the appropriate version of the Arduino IDE on the host computer. Then download the DTSC_US.ino sketch to the host computer.
Connect the USB A to USB B wire to the host computer and micro-controller. Open the sketch, and upload it to the micro-controller. Attach wires to the micro-controller's breadboard, LEDs, rotary encoder, stepper motor with a driver, and solenoid valve with driver.
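If you prefer the command line to the IDE for this upload step, arduino-cli can compile and flash the same sketch; this is our suggestion rather than part of the published protocol, and the board FQBN and serial port below are assumptions to substitute for your own micro-controller.

```shell
# Compile and upload the sketch directory containing DTSC_US.ino.
# FQBN and port are assumptions; list boards with `arduino-cli board list`.
arduino-cli compile --fqbn arduino:avr:uno DTSC_US
arduino-cli upload -p /dev/ttyACM0 --fqbn arduino:avr:uno DTSC_US
```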
Then wire one channel of a power supply to the positive-V and GND pins of the stepper motor driver. After turning on the power supply, set the attached channel voltage to 25 volts. Next, wire the positive lead of a power supply to the solenoid valve driver hold-voltage pin, and the other positive lead to the spike-voltage pin.
After turning on the power supply, set the channel connected to spike voltage to 12 volts, and the channel connected to the hold voltage to 2.5 volts. Then connect an air source regulated to a pressure of 20 pounds per square inch to the solenoid valve. Next, to make the running wheel, cut a three-inch wheel from a foam roller, and drill a quarter-inch hole in the exact wheel center.
Then insert a quarter-inch shaft into the wheel, and fix it in place using clamping hubs. Affix the rotary encoder to a 4.5-inch aluminum channel. Then stabilize the aluminum channel on the aluminum breadboard.
After attaching the wheel to the rotary encoder, stabilize the free side of the wheel shaft with a bearing inserted in a right-angle end clamp, installed on a breadboard-mounted optical post. Next, position the head restraints using optical posts and right-angle post clamps. Then position the conditional stimulus LED and the solenoid valve outlet for the DEC unconditional stimulus around the assembled wheel.
Next, mount the stepper motor used for the DTSC unconditional stimulus and Pi Camera on an optical post. Place the infrared light array on the same side as the Pi Camera, slightly above and directly facing where the animal's face will be positioned. Make a tactile stimulus for delayed tactile startle conditioning by taping foam to the edge of a piece of acrylic.
Mount to a quarter-inch shaft using a clamping hub, then attach the tactile stimulus to the stepper motor shaft. To implant a head plate, anesthetize the mouse, then make an incision with a scalpel along the midline of the scalp, from the back edge of the eyes to the skull. Spread the incision open, and clamp both sides with hemostats to hold it open.
Using cyanoacrylate glue, attach the head plate to the skull. Then apply a mix of dental cement powder, solvent, and catalyst to all areas of exposed bone. Suture the skin closed, behind and in front of the head plate.
Then inject postoperative analgesia, and allow the mouse to recover for at least five days. To prepare the mice for behavior sessions, allow them to habituate to the platform by mounting them in the head restraint. Before each session, ensure that the solenoid valve outlet is centered on the target eye and positioned less than one centimeter away, and that the tactile stimulus is centered on the mouse's nose, approximately 1.5 centimeters away.
For DTSC session preparation, start the GUI from an SBC terminal. Run a test session of three trials, and ensure that the logged data that prints to the terminal show a deflection of greater than 20, but less than 100 steps. To run a session, mount a mouse to the head restraint and start the GUI from the SBC terminal.
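The calibration criterion above (a deflection greater than 20 but fewer than 100 steps per test trial) can be checked with a small helper like this; the logged values shown are illustrative, not real data.

```shell
# Accept a trial only if its wheel deflection is strictly between
# 20 and 100 encoder steps, per the DTSC calibration criterion.
deflection_ok() {
  [ "$1" -gt 20 ] && [ "$1" -lt 100 ]
}

# Illustrative values for a three-trial test session:
for steps in 35 52 48; do
  if deflection_ok "$steps"; then
    echo "trial within range: $steps steps"
  else
    echo "trial out of range: $steps steps"
  fi
done
```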
To save camera recordings, hit the Stream button prior to starting the session. Input identifying information for the animal into the Animal ID field, and hit the Set button. Then input the desired experiment parameters, and hit the Upload to Arduino button.
Finally, hit Start Session to begin the session. DEC training results from a recording session with acceptable lighting are shown here. The acceptable illumination conditions resulted in a good contrast between the eye and periocular fur.
Behavioral traces from a single mouse trained for eight sessions show no conditioned response before training and robust conditioned responses once the mouse is trained. A sample trial video shows a trained mouse successfully closing its eye in response to the LED conditional stimulus, while an untrained mouse does not blink until the unconditional stimulus. The conditioned response increases in size and frequency through behavioral sessions performed across days.
Suboptimal lighting conditions severely limit the quality of data acquired. When the contrast between the eye and surrounding fur is low, slight changes in the image can significantly alter the recorded shape of the unconditioned response over a single session, and decrease the signal-to-noise ratio for detecting eyelid position. DTSC training results for a mouse trained for five sessions are presented here.
A sample trial video shows a trained mouse successfully moving the wheel backward in response to the LED conditional stimulus, while an untrained mouse fails to move the wheel until the tactile unconditional stimulus is applied. The frequency and amplitude of the conditioned response increase as training proceeds. In a cohort of animals trained with an unconditional stimulus that produced low-amplitude unconditioned responses, no animal learned to consistently produce conditioned responses after four days of training.
For successful behavior, the animal's comfort while running on the rig is critical. It is important to ensure that the wheel turns freely and evenly prior to habituating animals to the rig. Using this flexible platform, we have successfully imaged and perturbed the activity of Purkinje neurons, the output cells of the cerebellum, during learning in head-fixed animals.
We have developed a single platform to track animal behavior during two climbing fiber-dependent associative learning tasks. The low-cost design allows integration with optogenetic or imaging experiments directed towards climbing fiber-associated cerebellar activity.
Chapters in this video
0:04
Introduction
0:47
Setting Up the Single-Board Computer (SBC)
3:13
Wiring Stimulus Hardware and Assembling Stage
6:40
Preparing and Running the Behavior Experiments
8:48
Results: Delay Eyeblink Conditioning (DEC) and Delayed Tactile Startle Conditioning (DTSC) Trials
10:56
Conclusion