Method Article
Rodent skilled reaching is commonly used to study dexterous skills, but requires significant time and effort to implement the task and analyze the behavior. We describe an automated version of skilled reaching with motion tracking and three-dimensional reconstruction of reach trajectories.
Rodent skilled reaching is commonly used to study dexterous skills, but requires significant time and effort to implement the task and analyze the behavior. Several automated versions of skilled reaching have been developed recently. Here, we describe a version that automatically presents pellets to rats while recording high-definition video from multiple angles at high frame rates (300 fps). The paw and individual digits are tracked with DeepLabCut, a machine learning algorithm for markerless pose estimation. This system can also be synchronized with physiological recordings, or be used to trigger physiological interventions (e.g., electrical or optical stimulation).
Humans depend heavily on dexterous skill, defined as movements that require precisely coordinated multi-joint and digit movements. These skills are affected by a range of common central nervous system pathologies including structural lesions (e.g., stroke, tumor, demyelinating lesions), neurodegenerative disease (e.g., Parkinson's disease), and functional abnormalities of motor circuits (e.g., dystonia). Understanding how dexterous skills are learned and implemented by central motor circuits therefore has the potential to improve quality of life for a large population. Furthermore, such understanding is likely to improve motor performance in healthy people by optimizing training and rehabilitation strategies.
Dissecting the neural circuits underlying dexterous skill in humans is limited by technological and ethical considerations, necessitating the use of animal models. Nonhuman primates are commonly used to study dexterous limb movements given the similarity of their motor systems and behavioral repertoire to humans1. However, nonhuman primates are expensive and have long generation times, limiting the number of study subjects and the genetic interventions available. Furthermore, while the neuroscientific toolbox applicable to nonhuman primates is larger than that for humans, many recent technological advances are either unavailable or significantly limited in primates.
Rodent skilled reaching is a complementary approach to studying dexterous motor control. Rats and mice can be trained to reach for, grasp, and retrieve a sugar pellet in a stereotyped sequence of movements homologous to human reaching patterns2. Due to their relatively short generation time and lower housing costs, as well as their ability to acquire skilled reaching over days to weeks, it is possible to study large numbers of subjects during both learning and skill consolidation phases. The use of rodents, especially mice, also facilitates the use of powerful modern neuroscientific tools (e.g., optogenetics, calcium imaging, genetic models of disease) to study dexterous skill.
Rodent skilled reaching has been used for decades to study normal motor control and how it is affected by specific pathologies like stroke and Parkinson's disease3. However, most versions of this task are labor- and time-intensive, offsetting the benefits of studying rodents. Typical implementations involve placing rodents in a reaching chamber with a shelf in front of a narrow slot through which the rodent must reach. A researcher manually places a sugar pellet on the shelf, waits for the animal to reach, and then places another one. Reaches are scored as successes or failures either in real time or by video review4. However, simply scoring reaches as successes or failures ignores rich kinematic data that can provide insight into how (as opposed to simply whether) reaching is impaired. This problem was addressed by implementing detailed review of reaching videos to identify and semi-quantitatively score reach submovements5. While this added some data regarding reach kinematics, it also significantly increased experimenter time and effort. Further, high levels of experimenter involvement can lead to inconsistencies in methodology and data analysis, even within the same lab.
More recently, several automated versions of skilled reaching have been developed. Some attach to the home cage6,7, eliminating the need to transfer animals. This both reduces stress on the animals and eliminates the need to acclimate them to a specialized reaching chamber. Other versions allow paw tracking so that kinematic changes under specific interventions can be studied8,9,10, or have mechanisms to automatically determine if pellets were knocked off the shelf11. Automated skilled reaching tasks are especially useful for high-intensity training, as may be required for rehabilitation after an injury12. Automated systems allow animals to perform large numbers of reaches over long periods of time without requiring intensive researcher involvement. Furthermore, systems that allow paw tracking and automated outcome scoring reduce researcher time spent performing data analysis.
We developed an automated rat skilled reaching system with several specialized features. First, by using a movable pedestal to bring the pellet into "reaching position" from below, we obtain a nearly unobstructed view of the forelimb. Second, a system of mirrors allows multiple simultaneous views of the reach with a single camera, allowing three-dimensional (3-D) reconstruction of reach trajectories using a high-resolution, high-speed (300 fps) camera. With the recent development of robust machine learning algorithms for markerless motion tracking13, we now track not only the paw but individual knuckles to extract detailed reach and grasp kinematics. Third, a frame grabber that performs simple video processing allows real-time identification of distinct reaching phases. This information is used to trigger video acquisition (continuous video acquisition is not practical due to file size), and can also be used to trigger interventions (e.g., optogenetics) at precise moments. Finally, individual video frames are triggered by transistor-transistor logic (TTL) pulses, allowing the video to be precisely synchronized with neural recordings (e.g., electrophysiology or photometry). Here, we describe how to build this system, train rats to perform the task, synchronize the apparatus with external systems, and reconstruct 3-D reach trajectories.
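Because each video frame is triggered by its own TTL pulse, aligning kinematics with neural data reduces to pairing frame indices with pulse timestamps logged on the physiology system's clock. The following is a minimal sketch of that bookkeeping; the function names and the one-pulse-per-frame assumption are ours, not part of the published system.

```python
from bisect import bisect_left

def align_frames_to_ephys(ttl_times, frame_values):
    """Pair each video frame with the ephys timestamp of the TTL
    pulse that triggered it (assumes one pulse per frame, in order)."""
    if len(ttl_times) != len(frame_values):
        raise ValueError("expected one TTL pulse per frame")
    return list(zip(ttl_times, frame_values))

def frame_nearest_event(ttl_times, event_time):
    """Index of the frame whose trigger pulse is closest in time to an
    event (e.g., a stimulation pulse) recorded on the same clock."""
    i = bisect_left(ttl_times, event_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(ttl_times)]
    return min(candidates, key=lambda j: abs(ttl_times[j] - event_time))
```

With pulse times in seconds, `frame_nearest_event` can, for example, locate the video frame coinciding with an optogenetic stimulus onset.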
All methods involving animal use described here have been approved by the Institutional Animal Care and Use Committee (IACUC) of the University of Michigan.
1. Setting up the reaching chamber
NOTE: See Ellens et al.14 for details and diagrams of the apparatus. Part numbers refer to Figure 1.
2. Setting up the computer and hardware
3. Behavioral training
4. Training rats using the automated system
5. Analyzing videos with DeepLabCut
NOTE: Different networks are trained for each paw preference (right paw and left paw) and for each view (direct view and left mirror view for right-pawed rats, direct view and right mirror view for left-pawed rats). The top mirror view is not used for 3D reconstruction; it is used only to detect when the nose enters the slot, which may be useful for triggering interventions (e.g., optogenetics). Each network is then used to analyze a set of videos cropped for the corresponding paw and view.
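DeepLabCut saves its pose estimates for each analyzed video as a CSV with a three-row header (scorer, bodyparts, coords) and one (x, y, likelihood) column triple per tracked point. A minimal stdlib sketch of loading such a file downstream is shown below; the 0.9 likelihood cutoff for discarding uncertain points and the function name are our choices, not prescribed by the protocol.

```python
import csv, io

def read_dlc_csv(text, min_likelihood=0.9):
    """Parse a DeepLabCut output CSV (three header rows: scorer,
    bodyparts, coords) into {bodypart: [(x, y) or None per frame]}.
    Points below the likelihood cutoff become None."""
    rows = list(csv.reader(io.StringIO(text)))
    bodyparts = rows[1]
    tracks = {bp: [] for bp in bodyparts[1:]}  # column 0 is the frame index
    for frame in rows[3:]:
        # columns come in (x, y, likelihood) triples per bodypart
        for c in range(1, len(frame), 3):
            x, y, p = (float(frame[c + k]) for k in range(3))
            tracks[bodyparts[c]].append((x, y) if p >= min_likelihood else None)
    return tracks
```

Masking low-likelihood points before triangulation prevents occluded digits in one view from corrupting the reconstructed 3-D trajectory.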
6. Box calibration
NOTE: These instructions are used to determine the transformation matrices to convert points identified in the direct and mirror views into 3-D coordinates. For the most up to date version and more details on how to use the boxCalibration package, see the Leventhal Lab GitHub: https://github.com/LeventhalLab/boxCalibration, which includes step-by-step instructions for their use.
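The boxCalibration package itself is MATLAB code that recovers full projective transformation matrices from matched calibration points. Purely to illustrate the kind of least-squares fitting involved, here is a simplified Python sketch that estimates a 2D affine map between matched points in two views; a real mirror calibration is projective rather than affine, so this is a toy stand-in, and all names are ours.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [mr - f * mc for mr, mc in zip(M[r], M[col])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_affine(src, dst):
    """Least-squares 2D affine map dst ~ A*src + t from matched points.
    Returns (a, b, c, d, e, f) with x' = a x + b y + c, y' = d x + e y + f."""
    rows = [[x, y, 1.0] for x, y in src]
    def lstsq(targets):
        # normal equations: (R^T R) p = R^T t
        AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
        Atb = [sum(r[i] * t for r, t in zip(rows, targets)) for i in range(3)]
        return solve(AtA, Atb)
    return (*lstsq([p[0] for p in dst]), *lstsq([p[1] for p in dst]))
```

Fitting with more correspondences than unknowns, as here, averages out digitization error in the marked calibration points.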
7. Reconstructing 3D trajectories
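The actual reconstruction triangulates matched DeepLabCut points across views using the transformation matrices from step 6. As an idealized sketch of the underlying geometry only, the snippet below treats the mirror as a virtual second camera in a rectified stereo pair; the focal length and baseline values, and the function name, are hypothetical.

```python
def triangulate(xl, yl, xr, f, baseline):
    """Idealized rectified-stereo triangulation: two horizontally
    offset pinhole cameras with focal length f (pixels) and a known
    baseline (mm). Returns (X, Y, Z) in mm, camera-centered."""
    d = xl - xr                      # disparity in pixels
    if d <= 0:
        raise ValueError("point must have positive disparity")
    Z = f * baseline / d             # depth from similar triangles
    X = xl * Z / f
    Y = yl * Z / f
    return X, Y, Z
```

Applying such a mapping frame by frame to the matched paw and digit points yields the 3-D reach trajectories; the real pipeline uses the full projective calibration rather than this rectified approximation.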
Rats acquire the skilled reaching task quickly once acclimated to the apparatus, with performance plateauing in terms of both numbers of reaches and accuracy over 1–2 weeks (Figure 5). Figure 6 shows sample video frames indicating structures identified by DeepLabCut, and Figure 7 shows superimposed individual reach trajectories from a single session. Finally, in Figure 8, we illustrate what happen...
Rodent skilled reaching has become a standard tool to study motor system physiology and pathophysiology. We have described how to implement an automated rat skilled reaching task that allows: training and testing with minimal supervision, 3-D paw and digit trajectory reconstruction (during reaching, grasping, and paw retraction), real-time identification of the paw during reaching, and synchronization with external electronics. It is well-suited to correlate forelimb kinematics with physiology or to perform precisely-tim...
The authors have nothing to disclose.
The authors would like to thank Karunesh Ganguly and his laboratory for advice on the skilled reaching task, and Alexander and Mackenzie Mathis for their help in adapting DeepLabCut. This work was supported by the National Institute of Neurological Disease and Stroke (grant number K08-NS072183) and the University of Michigan.
Name | Company | Catalog Number | Comments |
clear polycarbonate panels | TAP Plastics | cut to order (see box design) | |
infrared source/detector | Med Associates | ENV-253SD | 30" range |
camera | Basler | acA2000-340kc | 2046 x 1086 CMV2000 340 fps Color Camera Link |
camera lens | Megapixel (computar) | M0814-MP2 | 2/3" 8mm f1.4 w/ locking Iris & Focus |
camera cables | Basler | #2000031083 | Cable PoCL Camera Link SDR/MDR Full, 5 m - Data Cables |
mirrors | Amazon | ||
linear actuator | Concentrics | LACT6P | Linear Actuator 6" Stroke (nominal), 110 Lb Force, 12 VDC, with Potentiometer |
pellet reservoir/funnel | Amico (Amazon) | a12073000ux0890 | 6" funnel |
guide tube | ePlastics | ACREXT.500X.250 | 1/2" OD x 1/4" ID Clear. Extruded Plexiglass Acrylic Tube x 6ft long |
pellet delivery rod | ePlastics | ACRCAR.250 | 0.250" DIA. Cast Acrylic Rod (2' length) |
plastic T connector | United States Plastic Corp | #62065 | 3/8" x 3/8" x 3/8" Hose ID Black HDPE Tee |
LED lights | Lighting EVER | 4100066-DW-F | 12V Flexible Waterproof LED Light Strip, LED Tape, Daylight White, Super Bright 300 Units 5050 LEDS, 16.4Ft 5 M Spool |
Light backing | ePlastics | ACTLNAT0.125X12X36 | 0.125" x 12" x 36" Natural Acetal Sheet |
Light diffuser films | inventables | 23114-01 | .007x8.5x11", matte two sides |
cabinet and custom frame materials | various (Home Depot, etc.) | 3/4" fiber board (see protocol for dimensions of each structure) | |
acoustic foam | Acoustic First | FireFlex Wedge Acoustical Foam (2" Thick) | |
ventilation fans | Cooler Master (Amazon) | B002R9RBO0 | Rifle Bearing 80mm Silent Cooling Fan for Computer Cases and CPU Coolers |
cabinet door hinges | Everbilt (Home Depot) | #14609 | continuous steel hinge (1.4" x 48") |
cabinet wheels | Everbilt (Home Depot) | #49509 | Soft rubber swivel plate caster with 90 lb. load rating and side brake |
cabinet door handle | Everbilt (Home Depot) | #15094 | White light duty door pull (4.5") |
computer | Hewlett Packard | Z620 | HP Z620 Desktop Workstation |
Camera Link Frame Grabber | National Instruments | #781585-01 | PCIe-1473 Virtex-5 LX50 Camera Link - Full |
Multifunction RIO Board | National Instruments | #781100-01 | PCIe-17841R |
Analog RIO Board Cable | National Instruments | SCH68M-68F-RMIO | Multifunction Cable |
Digital RIO Board Cable | National Instruments | #191667-01 | SHC68-68-RDIO Digital Cable for R Series |
Analog Terminal Block | National Instruments | #782536-01 | SCB-68A Noise Rejecting, Shielded I/O Connector Block |
Digital Terminal Block | National Instruments | #782536-01 | SCB-68A Noise Rejecting, Shielded I/O Connector Block |
24 position relay rack | Measurement Computing Corp. | SSR-RACK24 | Solid state relay backplane (Gordos/OPTO-22 type relays), 24-channel |
DC switch | Measurement Computing Corp. | SSR-ODC-05 | Solid state relay module, single, DC switch, 3 to 60 VDC @ 3.5 A |
DC Sense | Measurement Computing Corp. | SSR-IDC-05 | solid state relay module, single, DC sense, 3 to 32 VDC |
DC Power Supply | BK Precision | 1671A | Triple-Output 30V, 5A Digital Display DC Power Supply |
sugar pellets | Bio Serv | F0023 | Dustless Precision Pellets, 45 mg, Sucrose (Unflavored) |
LabVIEW | National Instruments | LabVIEW 2014 SP1, 64 and 32-bit versions | 64-bit LabVIEW is required to access enough memory to stream videos, but FPGA coding must be performed in 32-bit LabVIEW |
MATLAB | Mathworks | Matlab R2019a | box calibration and trajectory reconstruction software is written in Matlab and requires the Computer Vision toolbox |