Our research focuses on improving dynamic hand gesture recognition using synchronized EMG and visual data. We aim to determine how accurately muscle activity maps to finger gestures across different hand positions and how this can enhance applications in prosthetic rehabilitation and human-computer interaction. Our protocol addresses a gap in hand gesture recognition by enabling the mapping of muscle activity to finger gestures across various dynamic hand positions.
Our approach collects and synchronizes EMG and visual data during dynamic movements, laying the groundwork for developing robust gesture recognition models. Unlike traditional methods with static setups, our protocol uses a wireless EMG acquisition unit and a hand tracking camera during dynamic movements, ensuring flexibility and more realistic data collection for gesture recognition studies. To begin, open the GitHub repository and follow the detailed instructions in the installation section.
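Because the EMG unit and the hand tracking camera sample at different rates, the two streams must be aligned on a common timeline before modeling. A minimal sketch of that alignment, assuming both devices share a clock and using linear interpolation (the repository's actual synchronization routine may differ):

```python
import numpy as np

def align_to_emg_timeline(emg_t, track_t, track_vals):
    """Resample hand-tracking samples onto the EMG timestamp grid.

    emg_t, track_t: 1-D arrays of timestamps in seconds (shared clock assumed).
    track_vals: 1-D array of a tracked quantity, e.g. one finger angle.
    Returns track_vals linearly interpolated at each EMG timestamp.
    """
    return np.interp(emg_t, track_t, track_vals)

# Example: EMG at 500 samples/s, hand tracking at roughly 100 samples/s
emg_t = np.arange(0, 1, 1 / 500)
track_t = np.arange(0, 1, 1 / 100)
angle = np.sin(2 * np.pi * track_t)  # synthetic finger angle trace
aligned = align_to_emg_timeline(emg_t, track_t, angle)
```

After alignment, every EMG sample has a matching finger-angle estimate, so gesture labels derived from the camera can supervise the EMG channels directly.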
Locate the primary Python file, data_collection.py, in the repository folder and prepare it to run the experiment. Use the script spectrogram.py to assess the electromyography, in short, EMG, signal quality, and the data analysis script for signal filtering and segmentation. Ensure the EMG data acquisition unit, in short, DAU, is fully charged before each session. Then, turn on the DAU.
Connect the DAU to the PC through Bluetooth using the dedicated application. Set the Bluetooth communication rate to 500 samples per second. Install and open the hand tracking camera software on the PC. Connect the hand tracking camera to the PC using a cable.
Keep the hand tracking camera software displayed on one screen at all times. To begin, instruct the participant to flex their right hand into a strong fist. As the participant flexes, gently press along their forearm to palpate the muscle and identify the spot with the most prominent activation.
Peel off the white protective layer from the EMG electrode array and carefully attach the electrodes to the identified forearm area. Place the adhesive tape close to the palm and gently tap it to secure the electrode array to the skin. Once the electrode array is attached to the skin, peel off the transparent support layer.
Next, insert the electrode array connector card into the DAU connector socket. Attach the DAU to the adhesive tape next to the electrodes. Run the custom Python spectrogram script to verify real-time signal quality.
Observe the displayed window showing raw data on the left and frequency domain data on the right for all electrodes. Verify that all electrodes are detected and functioning properly and that the signal is clean from excessive noise and 50 hertz noise. If required, unplug unnecessary equipment devices from the power and move away from the electronic devices to reduce noise, allowing time for the signal to stabilize.
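The 50 hertz check in the frequency-domain view can also be automated: if the power at 50 Hz stands far above the broadband EMG floor, the channel is contaminated. A minimal sketch using a Welch power spectral density estimate; the threshold ratio and band limits are assumptions for illustration, not values from the protocol's own spectrogram script:

```python
import numpy as np
from scipy.signal import welch

FS = 500  # samples per second

def has_line_noise(x, fs=FS, line_hz=50.0, ratio=10.0):
    """Flag a channel whose 50 Hz power far exceeds the broadband floor."""
    f, pxx = welch(x, fs=fs, nperseg=fs)            # ~1 Hz resolution
    line_power = pxx[np.argmin(np.abs(f - line_hz))]
    floor = np.median(pxx[(f >= 20) & (f <= 150)])  # typical surface-EMG band
    return bool(line_power > ratio * floor)

rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
quiet = rng.normal(0, 0.1, t.size)                  # broadband activity only
noisy = quiet + 0.5 * np.sin(2 * np.pi * 50 * t)    # with mains pickup
```

Running such a check per electrode before each session gives an objective pass/fail alongside the visual inspection of the spectrogram window.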
Next, instruct the participant to rest their elbow on the armchair and move their fingers, then ask them to relax. Ensure that a clear EMG signal is displayed, followed by static baseline noise. Close the script once signal verification is complete.
For reviewing the positions, click on finger pose estimation, followed by data acquisition, to open the images folder. Review the gesture images with the participant. Clearly explain the four hand positions to the participant.
Instruct them on how to hold their hand before each session, ensuring proper posture and positioning. For hand position one, ask the participant to stand straight, approximately one meter away from the table. Then, instruct the participant to hold their right hand down, straight and relaxed, with the palm facing the hand tracking camera.
Fix the hand tracking camera on the table using a selfie stick and direct it to face the participant's hand. Ensure the participant makes firm gestures at the onset of the beep sound, followed by relaxed palm during rest period. For hand position two, instruct the participant to sit comfortably in an armchair, positioned 40 to 70 centimeters from the monitors.
Then, ask the participant to extend their right hand forward at a 90 degree angle with the palm relaxed and facing the hand tracking camera. Use a support device if necessary to hold the hand stable. Place the hand tracking camera on the table facing upward.
Ensure the participant makes firm gestures at the onset of the beep sound, followed by relaxed palm during rest period. For hand position three, ask the participant to fold their hand upward while resting their elbow on the armchair. Ensure the palm is relaxed and facing the hand tracking camera.
Fix the hand tracking camera on the table facing the participant's hand. Ensure the participant's position is optimal for both viewing the screens and being within the camera's field of view. Ensure the participant makes firm gestures at the onset of the beep sound followed by relaxed palm during rest period.
For hand position four, ask the participant to perform the finger gestures while moving the hand freely, choosing either dynamic hand position one, dynamic hand position two, or dynamic hand position three. Turn on the computer, open Python, and load the script data_collection.py. Adjust the hand tracking camera position and angle to align with the participant's hand position.
Run the data_collection.py script. A window will appear to enter the participant's details.
Fill in the required information and press OK to start the experiment automatically. For each session, record EMG and hand tracking data, which are saved automatically. As the experiment concludes, ensure that the data is automatically saved in a folder labeled with the participant's serial number.
Verify that each session is stored in a subfolder named with "S" followed by the session number, containing four subfolders, one per hand position, labeled with "P" followed by the position number. If a participant completes multiple sessions, confirm that all data are saved in the corresponding session folders. Ensure that each hand position folder contains the EMG data saved as an EDF file, the hand tracking data saved as a CSV file, and a log file containing metadata about the session.
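The per-session folder check described above is easy to script. A minimal sketch with pathlib, assuming subfolders named P1 through P4 and a .log extension for the metadata file; these naming details are assumptions based on the protocol description, not the repository's confirmed layout:

```python
from pathlib import Path

def check_session(session_dir):
    """Return a list of problems found in one S<number> session folder.

    Expects four position subfolders (P1..P4), each holding one .edf,
    one .csv, and one .log file (assumed names for illustration).
    """
    problems = []
    for i in range(1, 5):
        pos = Path(session_dir) / f"P{i}"
        if not pos.is_dir():
            problems.append(f"missing folder {pos.name}")
            continue
        for pattern, label in [("*.edf", "EMG EDF"),
                               ("*.csv", "hand tracking CSV"),
                               ("*.log", "session log")]:
            if not list(pos.glob(pattern)):
                problems.append(f"{pos.name}: no {label} file")
    return problems
```

An empty returned list means the session folder is complete; otherwise each entry names a missing folder or file, which is useful to run before dismissing the participant.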
The EMG channels displayed increased electrical activity during abduction phases compared to rest phases, as evidenced by higher-amplitude signals across all channels, with mechanical artifacts marked by sharp spikes. Hand kinematic data demonstrated synchronized finger-angle changes corresponding to the instructed abduction gestures, with stable signal trajectories during unobstructed tracking and visible deviations in misaligned sections.
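The amplitude contrast between gesture and rest phases can be quantified with windowed RMS values around the cue onsets. A minimal sketch on a synthetic channel, assuming 1-second gesture windows and known cue times; window length and onsets are illustrative, not the protocol's timing parameters:

```python
import numpy as np

FS = 500  # samples per second

def segment_rms(x, onsets, win_s=1.0, fs=FS):
    """Mean RMS amplitude over fixed windows starting at each onset (seconds)."""
    n = int(win_s * fs)
    vals = [np.sqrt(np.mean(x[int(o * fs):int(o * fs) + n] ** 2))
            for o in onsets]
    return float(np.mean(vals))

# Synthetic channel: low-amplitude rest with bursts during gesture cues
rng = np.random.default_rng(1)
x = rng.normal(0, 0.05, 10 * FS)          # 10 s of baseline noise
for onset in (2.0, 6.0):                   # two 1 s gesture bursts
    i = int(onset * FS)
    x[i:i + FS] += rng.normal(0, 0.5, FS)

active = segment_rms(x, [2.0, 6.0])        # gesture windows
rest = segment_rms(x, [0.0, 4.0])          # rest windows
```

A clearly higher active-to-rest RMS ratio across channels is the quantitative counterpart of the amplitude increase visible in the raw traces.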