EMG dataset for gesture recognition with arm translation
Data files
Nov 19, 2024 version (3.80 GB total):
- data.zip (3.80 GB)
- DisciplineSpecificMetadata.json (8.92 KB)
- README.md (5.71 KB)
Abstract
Myoelectric control has emerged as a promising approach for a wide range of applications, including controlling limb prostheses, teleoperating robots, and enabling immersive interactions in the metaverse. However, the accuracy and robustness of myoelectric control systems are often affected by various factors, including muscle fatigue, perspiration, drifts in electrode positions, and changes in arm position. The latter has received less attention despite its significant impact on signal quality and decoding accuracy. To address this gap, we present GREAT, a novel dataset of surface electromyographic (EMG) signals captured from multiple arm positions. This dataset, comprising EMG and hand kinematics data from 8 participants performing 6 different hand gestures, provides a comprehensive resource for investigating position-invariant myoelectric control decoding algorithms. We envision this dataset serving as a valuable resource for both training and benchmarking arm position-invariant myoelectric control algorithms. Additionally, to further expand the publicly available data capturing the variability of EMG signals across diverse arm positions, we propose a novel data acquisition protocol that can be utilized for future data collection.
https://doi.org/10.5061/dryad.8sf7m0czv
Corresponding author: Iris Kyranou, email: iriskyr@gmail.com
Description of the data and file structure
Subjects
8 intact subjects.
Acquisition Setup
The sEMG data are acquired using four Trigno Quattro Sensors (https://delsys.com/trigno-quattro/), while kinematic data are acquired using a Cyberglove II data glove (http://www.cyberglovesystems.com/cyberglove-ii).
The EMG data collection process involves the following steps:
- Surface EMG (sEMG) signals sampled at 2kHz
- Skin cleaned with 70% isopropyl alcohol before electrode placement
- 16 electrodes arranged in two rows of 8 around the forearm:
  - First row: starting at the extensor carpi ulnaris muscle
  - Second row: positioned between the electrodes of the first row
- 4 reference electrodes: 2 near the elbow and 2 near the wrist
- Electrodes attached using the Delsys adhesive interface
Acquisition Protocol
Experimental Protocol:
- Participants were prompted by a Python-based computer interface built with the Axopy package, available at https://github.com/MoveR-Digital-Health-and-Care-Hub/posture_dataset_collection/tree/main
- Required to perform 6 different grasps: power, lateral, pointer, tripod, open, and rest
Arm Positioning System:
- 3x3 grid showing 9 possible arm positions (P1-P9)
- P5 = neutral position (forearm at a 90° angle to the upper arm, parallel to the ground)
- The remaining positions are reached by 45° movements of the forearm from the neutral position
- Upper arm kept stable and relaxed
Data Collection Process:
- Participants familiarized themselves with the positions and grasps before recording
- The interface shows a grasp image at a specific grid position
- Recording starts only after the participant correctly positions their arm and forms the grasp
- Data collection is initiated via keyboard to avoid capturing transition movements
The dataset was recorded over two days and includes 5 repetitions of 6 different grasps at 5 different arm positions (6 grasps × 5 positions × 5 repetitions = 150 trials per block), recorded in two sessions per day without removal of the sensors between the sessions of a given day. Subjects were shown the grasp to be performed on a monitor and held it for as long as the prompt required.
Files and variables
File: data.zip
Description: This zip file contains data from all subjects 1-8. Each subject has 4 subfolders:
- participant{N}day1block1
- participant{N}day1block2
- participant{N}day2block1
- participant{N}day2block2
Dataset variables
Each folder contains the following data:
• Raw EMG Data: The file ‘emg_data.hdf5’ contains the raw recordings from the 16 EMG sensors. It is an HDF5 binary data file indexed by trial number (150 trials in total). The stored matrix per trial has shape 16 x (2kHz*5sec). The signal is processed upon recording by applying a 4th-order Butterworth filter (20-450Hz; see the recording parameters file below).
• Raw Glove Data: The upsampled, uncalibrated signal recordings of the Cyberglove are stored in the file ‘glove_data.hdf5’. The file has a format and size similar to ‘emg_data.hdf5’: an HDF5 binary data file with an 18 x (2kHz*5sec) matrix per trial.
• Calibrated Glove Data: The data glove’s data correspond to the recordings from each of its 18 sensors. We also calculated the transformation that maps the data to a 5 x (2kHz*5sec) matrix holding the position of each of the five fingers; this information is stored in the ‘finger_data.hdf5’ file.
• Recording Parameters File: The recording parameters file (‘recording_parameters.txt’) contains both metadata and technical specifications. The metadata includes the participant identifier (numbered 1-8), configuration type (+ or x), and a pseudo-randomized sequence of 6 grasps across 5 positions, along with the recording block, day (1 or 2), and timestamp. The technical parameters specify a window size of 150ms and a 4th-order Butterworth filter with a 20-450Hz passband. The recording protocol consisted of 5 trials, each lasting 5 seconds with 3-second rest intervals, captured across 16 channels.
• Labeling Data: The file ‘trials.csv’ contains the labeling information for each recorded trial in the raw dataset. Each entry represents a single trial and includes a unique trial ID (1-150), the target position on the grid (1-9), the performed grasp type (1-6), the trial number within a block (0-4), and a block index (0-29) identifying each set of 5 trials. A minimal loading sketch combining these files is shown below.
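To get started, the sketch below loads one block’s files with h5py and pandas. This is a minimal sketch, not part of the dataset: the extraction path is hypothetical, and the exact HDF5 dataset keys and trials.csv column names are assumptions, so the code prints them before use.

```python
# Minimal loading sketch (requires h5py, pandas, numpy).
# ASSUMPTIONS: the folder layout after unzipping data.zip, the HDF5 key
# naming, and the trials.csv column names are not specified in this README -
# inspect them before relying on them.
import h5py
import pandas as pd

folder = "participant1day1block1"  # hypothetical extraction path

# Labels: one row per trial (trial ID, grid position, grasp type, ...)
trials = pd.read_csv(f"{folder}/trials.csv")
print(trials.columns.tolist())      # check the actual column names

with h5py.File(f"{folder}/emg_data.hdf5", "r") as f:
    keys = sorted(f.keys())         # one dataset per trial (150 in total)
    emg = f[keys[0]][:]             # one trial as a NumPy array
    print(emg.shape)                # expected: (16, 10000) = 16 ch x 2 kHz x 5 s

with h5py.File(f"{folder}/finger_data.hdf5", "r") as f:
    fingers = f[sorted(f.keys())[0]][:]
    print(fingers.shape)            # expected: (5, 10000), one row per finger
```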
Code/software
The data can be imported and read in MATLAB or Python; Python is the more convenient option for the HDF5 files. The folder contains files in .hdf5, .csv, and .txt formats.
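As an illustration of the stated technical parameters, the sketch below builds the 4th-order Butterworth band-pass (20-450Hz) with SciPy and computes per-channel RMS features over 150ms windows. Note that the stored EMG is already filtered upon recording; the filter here only mirrors the specification, and the non-overlapping windows and the RMS feature are illustrative assumptions, not part of the dataset.

```python
# Feature-extraction sketch using the parameters from recording_parameters.txt.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 2000                  # EMG sampling rate in Hz
WIN = int(0.150 * FS)      # 150 ms window -> 300 samples

# 4th-order Butterworth band-pass matching the stated recording parameters
sos = butter(4, [20, 450], btype="bandpass", fs=FS, output="sos")

def rms_windows(emg: np.ndarray) -> np.ndarray:
    """emg: (channels, samples) -> (n_windows, channels) RMS features."""
    n_win = emg.shape[1] // WIN
    win = emg[:, : n_win * WIN].reshape(emg.shape[0], n_win, WIN)
    return np.sqrt((win ** 2).mean(axis=2)).T

trial = np.random.randn(16, 10000)          # stand-in for one recorded trial
filtered = sosfiltfilt(sos, trial, axis=1)  # zero-phase band-pass (illustrative)
print(rms_windows(filtered).shape)          # (33, 16): 33 windows x 16 channels
```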
The code for the interface used to record the dataset can be found here: https://github.com/MoveR-Digital-Health-and-Care-Hub/posture_dataset_collection
Access information
Other publicly accessible locations of the data: