13.8K Views
•
13:44 min
•
August 8th, 2011
The overall goal of this procedure is to integrate a robot into a virtual environment library and then use the robot for stroke therapy. This is accomplished by first creating a wrapper class that extends the haptics library and integrating that wrapper class into the library, then creating a second wrapper class that extends the graphics library and integrating it in the same way. Once the wrapper classes are complete, the H3D API interface is used to set up a finite state machine for testing stroke patients. The integrated robot is then used to help rehabilitate stroke patients. The main advantage of this technique is that you no longer have to write a different software application for every robot.
Also, these applications can be run by scientists across the globe regardless of what robot their lab uses. This method can help answer key questions in the rehabilitation, robotics, and motor control fields, such as to what degree our nervous system might benefit from error augmentation. It can provide insight into stroke impairments as well as further our understanding of healthy function.
Generally, stroke patients new to rehabilitation robotics need to overcome certain hurdles, such as learning to grip the handle. We first had the idea for this method when we realized that we needed a robot with a large operational workspace and gravity compensation. We also wanted to minimize rewriting software so that future graduate students could benefit from those who had come before without redoing the same work.
This video gives instructions for integrating a Whole Arm Manipulator, or WAM, robot with the open source code available at www.h3dapi.org. To integrate a new robot into the haptics library, the first step is to write a wrapper class for HAPI, the haptic rendering engine, which is used to control the movement of the WAM robot. Begin by creating a .cpp file and a header file.
The file names used for this example are HAPIWAM.cpp and HAPIWAM.h. Place the HAPIWAM.cpp file into the source directory, HAPI/src. Then place HAPIWAM.h into the header file directory, HAPI/include/HAPI. At the top of HAPIWAM.h, include the main header files of the robot.
Note that extern "C" is required to resolve compiler name mangling, because the included robot library is written in C while the H3D API is written in C++. In HAPIWAM.h, create the class for the robot and declare the four inherited functions: initHapticsDevice, releaseHapticsDevice, updateDeviceValues, and sendOutput. Be sure the class inherits publicly from the HAPIHapticsDevice class.
Next, create a header guard for the class. Then create the static DeviceOutput and static HapticsDeviceRegistration attributes in the HAPIWAM class, and add static member functions for the callbacks specific to your robot.
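A minimal sketch of what HAPIWAM.h might look like is shown below. The robot header name and the wam_callback signature are placeholders for the vendor's actual C API, not code from the published source.

    #ifndef __HAPIWAM_H__            // header guard
    #define __HAPIWAM_H__

    #include <HAPI/HAPIHapticsDevice.h>

    extern "C" {                     // the robot library is C; resolve C++ name mangling
    #include <wam/wam.h>             // placeholder for the robot's main header
    }

    namespace HAPI {
      class HAPIWAM : public HAPIHapticsDevice {
      public:
        HAPIWAM();
        virtual ~HAPIWAM();

      protected:
        // The four inherited functions every HAPI device implements.
        virtual bool initHapticsDevice( int thread_frequency );
        virtual bool releaseHapticsDevice();
        virtual void updateDeviceValues( DeviceValues &dv, HAPITime dt );
        virtual void sendOutput( DeviceOutput &d, HAPITime dt );

        // Static attributes described above: device output storage and the
        // registration object that lets HAPI create this device by name.
        static DeviceOutput device_output;
        static HapticsDeviceRegistration device_registration;

        // Robot-specific callback invoked by the WAM control loop (placeholder).
        static int wam_callback( void *data );
      };
    }
    #endif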
For the WAM robot, the callback function is used in the bool initHapticsDevice(int) function. Now that the prototypes are set in the header file, open HAPIWAM.cpp and include the appropriate header files and namespace. Next, define the constructor and destructor in HAPIWAM.cpp.
Then register the device in HAPIWAM.cpp. Now define the four inherited functions and the callbacks in HAPIWAM.cpp. The updateDeviceValues function tells the computer the location of the robot's arm and how much force is being generated by the robot on the environment.
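A corresponding sketch of the registration and the updateDeviceValues function in HAPIWAM.cpp could look like the following; the wam_get_position and wam_get_force calls are hypothetical stand-ins for the vendor's C API.

    #include <HAPI/HAPIWAM.h>

    using namespace HAPI;

    // Register the device so HAPI can instantiate it by the name "WAM";
    // this mirrors the registration pattern used by other HAPI devices.
    HAPIHapticsDevice::HapticsDeviceRegistration HAPIWAM::device_registration(
      "WAM",
      &(newInstance< HAPIWAM >) );

    HAPIWAM::HAPIWAM() {}            // constructor: allocate robot state here
    HAPIWAM::~HAPIWAM() {}           // destructor: release robot state here

    // Report where the robot's arm is and how much force it is generating
    // on the environment.
    void HAPIWAM::updateDeviceValues( DeviceValues &dv, HAPITime dt ) {
      HAPIHapticsDevice::updateDeviceValues( dv, dt );
      double pos[3], force[3];
      wam_get_position( pos );       // hypothetical vendor call
      wam_get_force( force );        // hypothetical vendor call
      dv.position = Vec3( pos[0], pos[1], pos[2] );
      dv.force    = Vec3( force[0], force[1], force[2] );
    }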
After creating the HAPI wrapper class, the next step is to build the wrapper into the HAPI library. The WAM depends on some libraries that the H3D API does not depend on in its raw form, so these libraries need to be added to HAPI.
Go to the H3D2.1/HAPI/build directory and edit CMakeLists.txt, adding the dependent libraries after the line that reads SET(OPTIONAL_LIBS ...). Then open a command console, navigate to H3D2.1/HAPI/build, and type, in order, cmake ., sudo make, and sudo make install to complete the integration of the new robot with the H3D open source code.
Next, write a second wrapper class for the scene-graph API, called the H3D API. This example uses a wrapper class called WAMDevice.
Navigate to the source directory, H3DAPI/src, and create the WAMDevice.cpp file, adding the standard source common to all H3D API devices. Next, place WAMDevice.h into the header file directory, H3DAPI/include/H3D.
WAMDevice.h should contain the standard header common to all H3D API devices. Now that the wrapper class has been created, rebuild the H3D API library by editing CMakeLists.txt in the directory H3DAPI/build.
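As a rough sketch, modeled on other H3D API device nodes rather than taken from the published source, WAMDevice.h could look like this:

    #ifndef __WAMDEVICE_H__
    #define __WAMDEVICE_H__

    #include <H3D/H3DHapticsDevice.h>

    namespace H3D {
      class WAMDevice : public H3DHapticsDevice {
      public:
        WAMDevice();
        // Create the underlying HAPI device (the HAPIWAM wrapper) when the
        // scene graph initializes this node.
        virtual void initialize();
        // Standard node database entry so the device can be used from X3D files.
        static H3DNodeDatabase database;
      };
    }
    #endif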
Finally, in the directory H3DAPI/build, rebuild the H3D API library using the commands cmake ., sudo make, and sudo make install. The new wrapper classes are now ready for use with a robot. Once the appropriate wrapper classes are installed, a finite state machine is created to control the experimental protocol for the targeted reaching task used with the Barrett WAM robot. The states of the state machine are start of trial, launch, target contact, and end of trial.
The start of trial state begins with the allocation of a target. Target locations may be set randomly for each trial or may be read from a file. The start of trial state ends and the launch state begins once the subject's hand has launched toward the target above a velocity threshold, typically 0.06 meters per second.
The launch state ends and the target contact state begins when the patient's cursor touches the target. Once contact is made, the target disappears, the target contact state ends, and the end of trial state begins.
The end of trial state signals the data collection software to mark the data file with a delimiter for the end of the trial. Unless the final trial has been completed, the ending of the end of trial state re-enables the start of trial state and a new target is allocated.
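A minimal sketch of this four-state machine is shown below; the names and the update function are illustrative, not taken from the published source.

    enum TrialState { START_OF_TRIAL, LAUNCH, TARGET_CONTACT, END_OF_TRIAL };

    void updateState( TrialState &state, double hand_speed,
                      bool cursor_on_target, bool final_trial_done ) {
      switch ( state ) {
        case START_OF_TRIAL:          // a target has been allocated
          if ( hand_speed > 0.06 )    // launch threshold in m/s
            state = LAUNCH;
          break;
        case LAUNCH:                  // hand moving toward the target
          if ( cursor_on_target )     // contact: the target disappears
            state = TARGET_CONTACT;
          break;
        case TARGET_CONTACT:          // mark the data file with a delimiter
          state = END_OF_TRIAL;
          break;
        case END_OF_TRIAL:
          if ( !final_trial_done )
            state = START_OF_TRIAL;   // allocate a new target
          break;
      }
    }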
The next section demonstrates a different robot, called the PHANTOM, used in the rehabilitation of stroke patients. By substituting the PHANTOM's wrapper class code into the appropriate libraries, the new robot can be integrated into the virtual environment. A three-dimensional haptic/graphic system called the Virtual Reality Robotic and Optical Operations Machine, or VRROOM, is used here. This system combines a projected, stereo, head-tracked rendering on a semi-silvered mirror overlay display with a robotic system that can record wrist position and generate a force vector. A cinema-quality digital projector displays the images, which span a five-foot-wide display, resulting in a 110-degree wide viewing angle.
Infrared emitters synchronize separate left- and right-eye images through LCD shutter glasses. Head motion is tracked with an Ascension Flock of Birds magnetic element so that the visual display is rendered with the appropriate head-centered perspective. An exotendon glove with a wrist splint assists in neutral wrist and hand alignment.
The center of the robot handle is placed on the forearm, posterior to the radiocarpal joint. This enables its forces to act at the wrist but allows free motion at the hand. The patient's arm weight is lessened using a spring-powered Wilmington Robotic Exoskeleton gravity-balanced orthosis.
The therapist, sitting next to the patient, uses a tracking device to move a cursor around the display in front of the patient, who is instructed to use the robotic handle to chase the cursor. Error augmentation is provided both visually and by forces generated by the robot.
When the patient's cursor deviates from the therapist's cursor, an instantaneous error vector E is established as the difference in position between the therapist's cursor and the patient's hand. As part of the error augmentation, the error is visually magnified and displayed as 1.5E. Additionally, an error-augmenting force of 100E is applied to the patient's arm, which is programmed to saturate at a maximum of four newtons for safety reasons, as sketched below.
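The following is a self-contained sketch of this error-augmentation law, with a minimal vector type standing in for the library's own; the function name is illustrative.

    #include <cmath>

    struct Vec3 {                     // minimal stand-in for the library's vector type
      double x, y, z;
      double length() const { return std::sqrt( x*x + y*y + z*z ); }
    };
    Vec3 operator-( const Vec3 &a, const Vec3 &b ) { return { a.x-b.x, a.y-b.y, a.z-b.z }; }
    Vec3 operator*( double s, const Vec3 &v ) { return { s*v.x, s*v.y, s*v.z }; }

    // Compute the visual and force components of the error augmentation.
    void errorAugmentation( const Vec3 &therapist_cursor, const Vec3 &patient_hand,
                            Vec3 &displayed_error, Vec3 &force ) {
      Vec3 e = therapist_cursor - patient_hand;  // instantaneous error vector E
      displayed_error = 1.5 * e;                 // visual magnification, 1.5E
      force = 100.0 * e;                         // error-augmenting force, 100E
      double f = force.length();
      if ( f > 4.0 )                             // saturate at 4 N for safety
        force = ( 4.0 / f ) * force;
    }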
The error augmentation magnifies the errors that are perceived by the patient, which enhances the relearning process. The treatment protocol calls for the patient to practice several specific movements, including forward and side reaching, shoulder-elbow coupling, and diagonal reaching across the body. For the next demonstration, the wrapper class code is swapped and the Barrett WAM robot is used.
The goal of this next task is for either healthy subjects or stroke patients to catch a virtual projectile inside a semi-transparent green sphere. If the patient's hand strays outside the sphere, the sphere turns red and the patient's arm receives error augmentation feedback from the robot, as in the check sketched below. When the patient successfully catches the projectile inside the defined space, the sphere remains green.
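A simple containment test of this kind might look as follows; the names are hypothetical, not from the published source.

    #include <cmath>

    // Returns true when the hand is inside the catch sphere; the caller keeps
    // the sphere green and suppresses error augmentation in that case.
    bool insideCatchSphere( const double hand[3], const double center[3],
                            double radius ) {
      double dx = hand[0] - center[0];
      double dy = hand[1] - center[1];
      double dz = hand[2] - center[2];
      return std::sqrt( dx*dx + dy*dy + dz*dz ) <= radius;
    }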
This figure shows the median reaching error. Each dot represents the median error measured for a two-minute block of stereotypical functional movements. While the patients showed progress across the two-week period, there was not always improvement each day.
This figure shows the resulting benefit of error augmentation in selectively reducing movement-to-movement variability. The top plots show data from the control group and the lower plots show data from the treatment group. Each color represents a different subject.
The two horizontal lines display the location of the boundaries for the task; the subject's goal was to catch the virtual projectile within these boundaries. The location of a dot shows where a given subject came closest to contacting the projectile.
The blue semi-transparent histograms overlaid on top of the raw data display the movement distribution of the group along the anterior axis. The stroke patients performed 600 trials of the catching task. No error augmentation feedback was provided for the initial 200 trials or the final 200 trials; for trials 201 to 400, the robot provided error augmentation feedback to the patient, resulting in a significant improvement in the patient's ability to catch the target within the specified boundaries.
Following this procedure, traditional rehabilitation methods can still be performed, but with the robot we can quantify where the patient's hand is at all times over many repeated trials. This technique paved the way for new research in the field of neural engineering, where we can explore simulated reaching within a region so that there are many possible solutions. After watching this video, you should have an understanding of how to extend the H3D API library for a rehabilitation robot of your choice and apply a variety of haptic/graphic interfaces for the assessment and treatment of stroke patients.
Working with robots can be potentially dangerous, so careful consideration should be given when choosing a robot for such human interactions. We chose the Barrett WAM for our most recent project because it is powerful and backdrivable, backdrivable meaning that it can feel nearly resistance-free if we program it to do so, and yet it is one of the safest options available for human-machine interaction.
A separate watchdog computer monitors the main computer and can disable the system if anything goes awry, and dynamic braking ensures that the system acts as a damper whenever it is inactive. Although many robots perform differently, many of their functions do overlap, so our technique's portability between robots may foster new sharing between laboratories.
What is clear is that training treatments such as those demonstrated in this study provide encouraging evidence for restoring function to patients, as well as for enhancing training in sports, piloting, teleoperation, music performance, surgical techniques, and any other task that requires repetitive practice.
Recently, a large number of promising human-robot interactive systems have emerged. In this article, we outline new open-source software that makes possible the rapid integration of robotic manipulandum devices with a library of interactive functions. We then outline a clinical application in neurorehabilitation.
0:05 Title
1:44 Establishing the HAPI Wrapper Class for a Robot
4:09 HAPI Library Creation
4:53 H3D Wrapper Class for the Scene-Graph API Library
6:13 Finite State Machine
7:24 Application: Rehabilitation of the Stroke Patient
10:23 Results: Motor Skill Training Using a Haptic Device
11:44 Conclusion