This protocol implements a stereo-imaging camera system, calibrated using direct linear transformation, to capture the three-dimensional in-situ displacements of stretched peripheral nerves. From these displacements, the strain induced at varying degrees of stretch can be determined, informing stretch injury thresholds and advancing the science of stretch-dependent nerve repair.
Peripheral nerves undergo physiological and non-physiological stretch during development, normal joint movement, injury, and, more recently, surgical repair. Understanding the biomechanical response of peripheral nerves to stretch is critical to understanding their response to different loading conditions and thus to optimizing treatment strategies and surgical interventions. This protocol describes in detail the calibration of the stereo-imaging camera system via direct linear transformation and the tracking of the three-dimensional in-situ tissue displacement of peripheral nerves during stretch, obtained from the three-dimensional coordinates reconstructed from video files captured by the calibrated stereo-imaging camera system.
From the obtained three-dimensional coordinates, the nerve length, change in nerve length, and percent strain with respect to time can be calculated for a stretched peripheral nerve. A stereo-imaging camera system provides a non-invasive method for capturing the three-dimensional displacements of peripheral nerves during stretch, and direct linear transformation enables three-dimensional reconstruction of nerve length during stretch to measure strain. Currently, no methodology exists to study the in-situ strain of stretched peripheral nerves using a stereo-imaging camera system calibrated via direct linear transformation. Capturing the in-situ strain of stretched peripheral nerves can not only aid clinicians in understanding the underlying mechanisms of nerve damage from overstretching but also help optimize treatment strategies that rely on stretch-induced interventions. The methodology described in this paper has the potential to enhance our understanding of peripheral nerve biomechanics in response to stretch and thereby improve patient outcomes in nerve injury management and rehabilitation.
Peripheral nerves (PNs) undergo stretch during development, growth, normal joint movement, injury, and surgery1. PNs display viscoelastic properties to protect the nerve during regular movements2,3 and maintain the structural health of its nerve fibers2. Because PN response to mechanical stretch has been shown to depend on the type of nerve fiber damage4, injuries to adjacent connective tissues2,4, and testing approaches (i.e., loading rate or direction)5,6,7,8,9,10,11,12,13,14, it is essential to distinguish the biomechanical responses of PNs during normal range of motion versus non-physiological range at both slow- and rapid-stretch rates. This can further the understanding of the PN injury mechanism in response to stretch and aid in timely and optimized intervention1,4,15,16. There has been a growing trend in physical therapy to evaluate and intervene based on the relationship between nerve physiology and biomechanics17. By understanding the differences in PN biomechanics at various applied loads, physical therapists can be better prepared to modify current interventions17.
Available biomechanical data on PNs in response to stretch remain variable, and this variability can be attributed to differences in testing equipment, testing procedures, and elongation data analysis5,6,7,8,9,10,11,12,13,14,16. Furthermore, measuring three-dimensional (3D) in-situ nerve displacement remains poorly described in the currently available literature. Previous studies have used stereo-imaging techniques to maximize the accuracy of 3D reconstruction of tissue displacement of facet joint capsules18,19. The direct linear transformation (DLT) technique enables the conversion of two or more two-dimensional (2D) views to 3D real-world coordinates (i.e., in mm)20,21,22. DLT provides a high-accuracy calibration method for stereo-imaging camera systems because it enables precise reconstruction of 3D positions, accounting for lens distortion, camera parameters, and image coordinates, and permits flexibility in the stereo-imaging camera setup20,21,22. DLT-calibrated stereo-imaging camera systems are typically used to study locomotion and gait22,23. This protocol aims to offer a detailed methodology to determine the in-situ strain of PNs at varying degrees of stretch using a DLT-calibrated stereo-imaging camera system and open-source tracking software22.
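For context, the 11-parameter DLT model underlying this calibration maps a real-world point (x, y, z) (in mm) to the image coordinates (u, v) of each camera. The formulation below is the standard textbook form of DLT and is not reproduced from the cited software:

$$
u = \frac{L_1 x + L_2 y + L_3 z + L_4}{L_9 x + L_{10} y + L_{11} z + 1}, \qquad
v = \frac{L_5 x + L_6 y + L_7 z + L_8}{L_9 x + L_{10} y + L_{11} z + 1}
$$

where $L_1$ through $L_{11}$ are the DLT coefficients of that camera. Once the coefficients are known for both cameras, the left and right image coordinates of a marker provide four equations in the three unknowns (x, y, z), which are solved in a least-squares sense to reconstruct the 3D position.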
All procedures described were approved by the Drexel University Institutional Animal Care and Use Committee (IACUC). The neonatal piglet was acquired from a United States Department of Agriculture (USDA)-approved farm located in Pennsylvania, USA.
1. Stereo-imaging system setup
Figure 1: Stereo-imaging camera system. (A) Parallel stereo-imaging camera system with two cameras (left and right cameras) separated by a baseline of 63 mm. (B) Schematic of stereo-imaging camera system and stand setup.
2. Stereo-imaging system DLT calibration: digitizing the 3D control volume
Figure 2: Three-dimensional control volume and digitizer with foot pedal. (A) Schematic of 3D control volume. (B) Components of digitizer with foot pedal used to digitize 3D control volume to obtain (x, y, z) coordinates in mm. Abbreviation: 3D = three-dimensional.
3. Stereo-imaging camera system calibration: generation of direct linear transformation coefficients
Figure 3: Schematic for acquiring an image of the three-dimensional control volume using a stereo-imaging camera system for direct linear transformation calibration. (A) Attach the stereo-imaging camera system to a stand and then connect it to a laptop via a USB type-C cable. Place the 3D control volume 6 cm under the stereo-imaging camera system. (B) Using the imaging software, take an image of the 3D control volume. The output image is a combined image from the left and right cameras. (C) Using custom MATLAB code, the combined output image is separated into individual left and right images of the 3D control volume. Abbreviation: 3D = three-dimensional.
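The custom MATLAB code referenced in Figure 3C is not reproduced here; a minimal sketch of an equivalent splitting step is shown below, assuming the combined frame stores the left and right views side by side (left half first) and using placeholder file names.

```matlab
% Split a side-by-side stereo image into left- and right-camera views.
% Assumes the left view occupies the left half of the combined frame;
% file names are placeholders.
combined  = imread('calibration_combined.png');   % combined stereo image
halfWidth = floor(size(combined, 2) / 2);         % half of the image width
leftImg   = combined(:, 1:halfWidth, :);          % left-camera view
rightImg  = combined(:, halfWidth+1:end, :);      % right-camera view
imwrite(leftImg,  'calibration_left.png');
imwrite(rightImg, 'calibration_right.png');
```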
Figure 4: Schematic for generating direct linear transformation coefficients for the left and right camera views of a stereo-imaging camera system. (A) Run DLTcal5.m22, click initialize on the controls window, and select the *.csv file with the digitized (x, y, z) coordinates (in mm) of the 3D control volume. (B) Select the calibration image of the left camera view. Select the points on the image in the same order that they were digitized, then click compute coefficients to generate the DLT coefficients for the left camera view. Next, click Add camera to repeat the steps for the right camera view. (C) Select the calibration image of the right camera view. Select the points on the image in the same order that they were digitized, then click compute coefficients to generate the DLT coefficients for the right camera view. (D) Click Save Data and select the directory in which to save the DLT coefficients for the left and right camera views. Enter the name of the output file and click OK; the DLT coefficients are saved as a *.csv file. Abbreviations: 3D = three-dimensional; DLT = direct linear transformation.
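Conceptually, the coefficient computation in Figure 4 relates the digitized 3D control points to their pixel locations in one camera view through a linear least-squares problem. The sketch below illustrates that standard DLT formulation; it is not the DLTcal5.m source code, and the variable names (xyz, uv) are placeholders.

```matlab
% Compute the 11 DLT coefficients of one camera view by linear least squares.
% xyz : N x 3 digitized (x, y, z) control-point coordinates in mm
% uv  : N x 2 pixel coordinates of the same points in this camera's image
n = size(xyz, 1);
A = zeros(2*n, 11);
b = zeros(2*n, 1);
for i = 1:n
    x = xyz(i,1); y = xyz(i,2); z = xyz(i,3);
    u = uv(i,1);  v = uv(i,2);
    A(2*i-1, :) = [x y z 1 0 0 0 0 -u*x -u*y -u*z];   % row from the u equation
    A(2*i,   :) = [0 0 0 0 x y z 1 -v*x -v*y -v*z];   % row from the v equation
    b(2*i-1) = u;
    b(2*i)   = v;
end
L = A \ b;   % 11 x 1 vector of DLT coefficients for this camera view
```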
4. Data acquisition
Figure 5: Representative schematic for data acquisition of peripheral nerve stretching. (A) Attach the stereo-imaging camera system to a stand and then connect it to a laptop via a USB type-C cable. Place the stereo-imaging camera system up to 6 cm above the peripheral nerve. (B) The peripheral nerve is clamped to the mechanical setup at the distal end. Using an ink-based skin marker, place a marker on the insertion and clamp sites and an additional two to four markers along the nerve length. Saline is squirted on the peripheral nerve to keep it hydrated before, during, and after testing.
5. Data analysis: marker trajectory tracking
Figure 6: Schematic to set up a new project to begin three-dimensional trajectory tracking. (A) Run DLTdv7.m22 and click New Project to begin a new project. (B) Select 2 as the number of video files. (C) Select the Video 1 file (i.e., left camera view) and then select the Video 2 file (i.e., right camera view). (D) Select yes, as the video files come from a DLT-calibrated stereo-imaging camera system. Then, select the *.csv file containing the DLT coefficients. (E) The selected video files are now ready for tracking.
Key/Click | Description |
Left Click | Tracks trajectory of a point in the clicked frame |
(+) Key | Zooms the current video frame in around the mouse pointer |
(-) Key | Zooms the current video frame out around the mouse pointer |
(i) Key | Move point up |
(j) Key | Move point left |
(k) Key | Move point right |
(m) Key | Move point down |
Table 1: Keyboard and mouse shortcuts for tracking point trajectory.
Figure 7: Schematic to place initial points on tissue markers for Video 1 and Video 2 using DLTdv7.m22. (A) Set current point to 1. Place point 1 on the insertion marker in Video 1. Using the blue epipolar line in Video 2, place point 1 on the insertion marker. (B) Set current point to 2. Place point 2 on marker 1 in Video 1. Using the blue epipolar line in Video 2, place point 2 on marker 1. (C) Set current point to 3. Place point 3 on marker 2 in Video 1. Using the blue epipolar line in Video 2, place point 3 on marker 2. (D) Set current point to 4. Place point 4 on marker 3 in Video 1. Using the blue epipolar line in Video 2, place point 4 on marker 3. (E) Set current point to 5. Place point 5 on marker 4 in Video 1. Using the blue epipolar line in Video 2, place point 5 on marker 4. (F) Set current point to 6. Place point 6 on the clamp marker in Video 1. Using the blue epipolar line in Video 2, place point 6 on the clamp marker.
Figure 8: Schematic for tracking marker point trajectories of Video 1 using DLTdv7.m22. (A) Set frame number to 1, current point to 1, autotrack mode to auto-advance, and autotrack predictor to extended Kalman. (B) Set current point to 1. On Video 1 file, begin tracking the insertion marker (i.e., point 1) displacement by left-clicking frame-by-frame until the last frame. (C) Set frame number to 1 and current point to 2. On Video 1 file, begin tracking marker 1 (i.e., point 2) displacement by left-clicking frame-by-frame until the last frame. (D) Set frame number to 1 and current point to 3. On Video 1 file, begin tracking marker 2 (i.e., point 3) displacement by left-clicking frame-by-frame until the last frame. (E) Set frame number to 1 and current point to 4. On Video 1 file, begin tracking marker 3 (i.e., point 4) displacement by left-clicking frame-by-frame until the last frame. (F) Set frame number to 1 and current point to 5. On Video 1 file, begin tracking marker 4 (i.e., point 5) displacement by left-clicking frame-by-frame until the last frame. (G) Set frame number to 1 and current point to 6. On Video 1 file, begin tracking the clamp marker (i.e., point 6) displacement by left-clicking frame-by-frame until the last frame.
Figure 9: Schematic for tracking marker point trajectories of Video 2 using DLTdv7.m22. (A) Set frame number to 1, current point to 1, autotrack mode to auto-advance, and autotrack predictor to extended Kalman. (B) Set current point to 1. Using the blue epipolar line on Video 2 file, begin tracking the insertion marker (i.e., point 1) displacement by left-clicking frame-by-frame until the last frame. (C) Set frame number to 1 and current point to 2. Using the blue epipolar line on Video 2 file, begin tracking marker 1 (i.e., point 2) displacement by left-clicking frame-by-frame until the last frame. (D) Set frame number to 1 and current point to 3. Using the blue epipolar line on Video 2 file, begin tracking marker 2 (i.e., point 3) displacement by left-clicking frame-by-frame until the last frame. (E) Set frame number to 1 and current point to 4. Using the blue epipolar line on Video 2 file, begin tracking marker 3 (i.e., point 4) displacement by left-clicking frame-by-frame until the last frame. (F) Set frame number to 1 and current point to 5. Using the blue epipolar line on Video 2 file, begin tracking marker 4 (i.e., point 5) displacement by left-clicking frame-by-frame until the last frame. (G) Set frame number to 1 and current point to 6. Using the blue epipolar line on Video 2 file, begin tracking the clamp marker (i.e., point 6) displacement by left-clicking frame-by-frame until the last frame.
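Once a marker has been tracked in both camera views, DLTdv7.m reconstructs its 3D position from the two pixel locations and the two sets of DLT coefficients. The function below is a minimal sketch of that standard two-camera DLT reconstruction, included only to illustrate the underlying math; it is not the DLTdv7.m implementation, and all names are placeholders.

```matlab
function xyz = dltReconstruct(Lleft, Lright, uvLeft, uvRight)
% Reconstruct one 3D point (in mm) from its pixel coordinates in two
% DLT-calibrated camera views.
% Lleft, Lright   : 11 x 1 DLT coefficient vectors (left and right cameras)
% uvLeft, uvRight : [u v] pixel coordinates of the marker in each view
    coeffs = {Lleft, Lright};
    uvs    = {uvLeft, uvRight};
    A = zeros(4, 3);
    b = zeros(4, 1);
    for c = 1:2
        L = coeffs{c};
        u = uvs{c}(1);
        v = uvs{c}(2);
        A(2*c-1, :) = [L(1)-u*L(9), L(2)-u*L(10), L(3)-u*L(11)];
        A(2*c,   :) = [L(5)-v*L(9), L(6)-v*L(10), L(7)-v*L(11)];
        b(2*c-1)    = u - L(4);
        b(2*c)      = v - L(8);
    end
    xyz = A \ b;   % least-squares 3D position (x, y, z)
end
```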
6. Data analysis: strain analysis
Using the described methodology, various output files are obtained. The DLTdv7.m *_xyzpts.csv file (Supplemental File 12) contains the (x, y, z) coordinates, in millimeters, of each tracked point at each time frame, which are then used to calculate the length, change in length, and strain of the stretched PN. Representative length-time, change in length-time, and strain-time plots of a stretched PN are shown in Figure 10. The stretched PN had an insertion marker, four markers along the nerve length, and a clamp marker.
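As an illustration of this strain analysis, the sketch below computes the nerve length (the sum of the segment lengths between adjacent tracked markers), the change in length, and the percent strain for every frame of an exported *_xyzpts.csv file. It assumes the usual DLTdv7.m column layout (pt1_X, pt1_Y, pt1_Z, pt2_X, ...) and uses a placeholder file name.

```matlab
% Length, change in length, and percent strain of a stretched nerve
% from the tracked 3D marker coordinates (units: mm).
xyz     = readmatrix('trial01_xyzpts.csv');    % frames x (3 * number of points)
nFrames = size(xyz, 1);
nPoints = size(xyz, 2) / 3;
len = zeros(nFrames, 1);
for f = 1:nFrames
    pts    = reshape(xyz(f, :), 3, nPoints)';  % nPoints x 3 (x, y, z) per marker
    segs   = diff(pts, 1, 1);                  % vectors between adjacent markers
    len(f) = sum(sqrt(sum(segs.^2, 2)));       % nerve length = sum of segment lengths
end
deltaLen  = len - len(1);                      % change in length vs. first frame
strainPct = 100 * deltaLen / len(1);           % percent strain vs. first frame
```

Dividing the frame index by the video frame rate gives the corresponding time axis for the length-time, change in length-time, and strain-time plots.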
Studies reporting the biomechanical properties of peripheral nerves (PNs) in response to stretch injury vary, and that variation can be attributed to testing methodologies such as testing equipment and elongation analysis5,6,7,8,9,10,11,12,13,14,16.
The authors have no conflicts of interest to disclose.
This research was supported by funding from the Eunice Kennedy Shriver National Institute of Child Health and Human Development of the National Institutes of Health under Award Numbers R15HD093024 and R01HD104910A, and by NSF CAREER Award Number 1752513.
Name | Company | Catalog Number | Comments |
Clear Acrylic Plexiglass Square Sheet | W W Grainger Inc | BULKPSACR9 | Construct three-dimensional control volume |
Stereo-imaging camera system - ZED Mini Stereo Camera | StereoLabs Inc. | N/A | N/A |
Imaging Software - ZED SDK | StereoLabs Inc. | N/A | N/A |
Maintenance Software - CUDA 12 | StereoLabs Inc. | N/A | Download to run ZED SDK |
Camera stand - Cast Iron Triangular Support Stand with Rod | Telrose VWR Choice | 76293-346 | N/A |
MicroScribe G2 Digitizer with Immersion Foot Pedal | SUMMIT Technology Group | N/A | N/A |
Programming Software - MATLAB | Mathworks | N/A | version 2019A or newer |
DLTcal5.m | Hedrick lab | N/A | Open Source |
DLTdv7.m | Hedrick lab | N/A | Open Source |