A high-resolution and whole-body dataset of hand-object contact areas based on 3D scanning method
Data files
Mar 18, 2025 version files (58.78 GB total)

- Digital_Data.zip (7.64 GB)
- Meta_Data.zip (128.40 KB)
- README.md (6.75 KB)
- Scan_Data_A.zip (4.56 GB)
- Scan_Data_B.zip (4.37 GB)
- Scan_Data_C.zip (4.11 GB)
- Scan_Data_D.zip (4.89 GB)
- Scan_Data_E.zip (4.46 GB)
- Scan_Data_F.zip (4.26 GB)
- Scan_Data_G.zip (3.95 GB)
- Scan_Data_H.zip (4.53 GB)
- Scan_Data_I.zip (3.79 GB)
- Scan_Data_J.zip (4.13 GB)
- Scan_Data_K.zip (4.18 GB)
- Scan_Data_L.zip (3.93 GB)
Abstract
Hand contact data reflects the intricate behaviours of human hands during object operation and holds significant potential for analysing hand operation patterns, guiding the design of hand-related sensors and robots, and predicting object properties. However, these applications are hindered by the low resolution and incomplete coverage of existing hand contact data. Leveraging a non-contact, high-precision 3D scanning method for surface capture, a high-resolution and whole-body hand contact dataset, named Ti3D-contact, is constructed in this work. The dataset, with an average resolution of 0.72 mm, contains 1872 sets of texture images and 3D models. During hand operation, the full contact area is painted onto gloves, which are then captured by a 3D scanner as the high-resolution original hand contact data. Reliability validation of Ti3D-contact is conducted, and hand movement classification with 95% precision is achieved using the acquired dataset. Its high resolution and whole-body coverage give the dataset promising applications in hand posture recognition and hand movement prediction.
https://doi.org/10.5061/dryad.2v6wwq003
Description of the data and file structure
The dataset comprises three folders: “Scan Data”, “Digital Data”, and “Meta Data”. The “Scan Data” folder contains the original hand contact data in the form of 3D models and texture images; the “Digital Data” folder holds the processed hand contact data, after separation and coordinate unification, as point cloud documents; and the “Meta Data” folder contains the participant data. The structure of the Ti3D-contact dataset is presented in Figure 8.
The privacy of participants is rigorously safeguarded in our Ti3D-contact dataset. Each participant is anonymized and identified by the same labels as in Table 1. Under the Digital Data folder, the data of each participant is saved in a separate folder containing 52 sub-folders, each named with a number representing a distinct grasping or manipulation action. Each of these sub-folders contains three folders denoted “Cx”, where “x” is the repetition number (e.g., “C3” holds data from the third repeated experiment), to keep the results of repeated experiments. Additionally, a dedicated folder named “coordinate system” within each sub-folder stores the default and newly created coordinate systems. In the repetition folders, the processed hand contact data is provided both with and without coordinate conversion to suit varied research requirements. The XLSX files with coordinate conversion are labeled “transformed points-x”, where “x” denotes the repetition number. The XLSX files without coordinate conversion are named “original points-x-y”, where “x” is the hand-gesture type and “y” is the repetition number. The processed hand contact data without coordinate conversion is also stored in TXT and ASE formats under the same “x-y” naming convention, mirroring the structure of the XLSX files. In each “Scan_Data_x” folder, there are 52 subfolders, each corresponding to a specific grasping or manipulation action; within each of these are subfolders for the three repeated experiments.
Each of these experimental subfolders contains the original hand contact data as 3D scanned models, presented as point clouds captured from different angles, in various file formats (3MF, ASC, JPG, STL, 3D Object, and MTL files), ensuring a comprehensive description of scan features. The classification structure of folders is depicted in Figure 8.
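The naming convention described above can be sketched as a small helper that builds the expected paths for one trial. The exact folder layout (e.g., what the zip archives extract to, and whether Scan Data repetition folders use the same “Cx” names) is an assumption here; verify against the extracted archives.

```python
from pathlib import Path

def contact_data_paths(root, participant, action, repetition):
    """Build the expected paths for one trial in the Ti3D-contact layout.

    Assumptions (not guaranteed by the dataset description): archives
    extract to "Digital_Data" and "Scan_Data_<participant>", and the
    Scan Data repetition folders reuse the "Cx" naming of Digital Data.
    """
    digital = Path(root) / "Digital_Data" / participant / str(action) / f"C{repetition}"
    scan = Path(root) / f"Scan_Data_{participant}" / str(action) / f"C{repetition}"
    return {
        "digital": digital,
        "scan": scan,
        # XLSX with coordinate conversion: "transformed points-<repetition>"
        "transformed_xlsx": digital / f"transformed points-{repetition}.xlsx",
        # XLSX without conversion: "original points-<gesture>-<repetition>"
        "original_xlsx": digital / f"original points-{action}-{repetition}.xlsx",
    }
```

For example, `contact_data_paths("/data", "A", 12, 3)` points at participant A's third repetition of action 12.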
“Digital Data” Description: The Digital Data folder includes ASE files, TXT files, and XLSX files.
In the folders denoted by “Cx”, the processed hand contact data without coordinate conversion is stored in the form of ASE files, TXT files, and XLSX files. Additionally, an XLSX file contains the processed hand contact data after coordinate conversion.
In the folder named “coordinate system”, the new coordinate origin and principal component axes are stored in an XLSX file. The original contact data obtained in the first processing step across the three repeated trials is stored as ASE and TXT files.
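Since the “coordinate system” folder stores a new origin and principal component axes, the coordinate unification can be sketched as a centroid shift plus a rotation onto PCA axes. This is a minimal illustration of that kind of transform, not the authors' exact procedure; for exact reproduction, use the origin and axes stored in the dataset's XLSX files.

```python
import numpy as np

def to_principal_axes(points):
    """Re-express a contact point cloud in a PCA-aligned frame.

    Sketch only: the new origin is taken as the centroid and the new
    axes as the principal component directions of the cloud. The
    dataset's own "coordinate system" files should be preferred.
    """
    pts = np.asarray(points, dtype=float)
    origin = pts.mean(axis=0)            # new coordinate origin (centroid)
    centered = pts - origin
    # principal axes via SVD of the centered cloud
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    transformed = centered @ vt.T        # points in the PCA-aligned frame
    return transformed, origin, vt
```

Note that PCA axes have a sign ambiguity, so two independently computed frames may differ by axis flips; the stored axes in the dataset resolve this.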
“Scan Data” Description: Because the dataset contains high-resolution, whole-body 3D models, its overall size is relatively large. For convenient uploading and downloading, the “Scan Data” folder is divided into 12 subfolders named “Scan_Data_x”, where x is the participant's unique identifier (A to L). The “Scan Data” folder holds the original hand contact data as 3D models and texture images. These files are commonly used in 3D printing and manufacturing for transferring, sharing, and storing 3D model data. Various file types are included, such as:
3MF files: These files store the high-resolution, whole-body 3D model, including geometry and texture. 3MF is a standardized format for 3D-manufacturing data, such as models, materials, and property information, that defines the shape and composition of 3D objects so they can be manufactured by a 3D printer.
ASC files: ASC files contain 3D high-resolution point cloud data of the 3D-printed hand models with gloves in text format. Each line in an ASC file typically represents the coordinates of a single point in the point cloud saved in ASCII format, including the values of X, Y, and Z coordinates, as well as optional attributes such as colour or intensity values.
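Given the line format described above (X, Y, Z, then optional attributes), an ASC file can be read with a few lines of Python. This is a generic sketch; the exact delimiter and any header lines in the dataset's ASC files should be checked before use.

```python
def parse_asc(text):
    """Parse ASC point-cloud text: one point per line, values separated
    by whitespace (commas are also tolerated).

    Returns (points, attrs): the first three values on each line are the
    X, Y, Z coordinates; any remaining values (e.g. colour or intensity)
    are kept as extra attributes.
    """
    points, attrs = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):   # skip blanks and comments
            continue
        values = [float(v) for v in line.replace(",", " ").split()]
        points.append(values[:3])
        attrs.append(values[3:])
    return points, attrs
```

In practice the file would be read with `parse_asc(Path("scan.asc").read_text())`, where the filename is hypothetical.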
JPG files: JPG files contain the colour and texture data used to enhance the visual appearance of the 3D models.
STL files: STL is a prevalent 3D model format that stores the surface geometry of the 3D hand models as a triangle mesh, without additional attributes such as colour or texture.
3D Object (OBJ) files: These files contain the geometry and topology of the 3D-printed hand models with painted gloves and can be used in applications such as computer graphics, animation, virtual reality, and 3D printing.
MTL files: MTL files describe the material and appearance properties of the 3D-printed hand models with painted areas, and are used in conjunction with the OBJ files. OBJ files contain the 3D geometry, while MTL files contain material information such as colour, texture, and transparency, defining how the model's surfaces should be rendered in 3D rendering or modelling software.
“Meta Data” Description: The metadata file, presented in XLSX format, includes key participant data aimed at providing comprehensive details of participants for dataset users:
Participant identification: The participant’s identification is defined as X, where X varies from A to L.
Gender: The participant’s gender (Male or Female).
Age: The participant’s age in years.
Basic body data: The participant’s height, in centimetres.
The purpose of the metadata is to give dataset users additional information about the participants. For instance, the age data facilitates classification and analysis in related studies. These metadata are intended to help dataset users delve deeper into the research data and provide essential background about the participants for a range of research purposes.
This work compiles a high-resolution and whole-body human hand contact dataset named Ti3D-contact. First, participants wear cotton textile gloves to grasp and manipulate different types and sizes of objects, and high-adhesion paint sprayed onto the objects marks the whole-body hand contact areas on the gloves. Then, the painted gloves are scanned by a 3D scanner to capture the original hand contact data in the form of texture images and 3D models. After extracting the painted areas on the obtained 3D models as point clouds, the processed hand contact data recording the contact areas between hands and objects is obtained. A coordinate conversion method is then employed to unify the coordinates of the processed hand contact data; unifying the coordinate systems improves the consistency of all the hand contact data, which benefits analyses of hand operation patterns. Furthermore, the Euclidean distance between adjacent points in the hand contact data is calculated to obtain the resolution of the dataset, demonstrating a high average resolution of 0.72 mm. The processed hand contact data is then voxelized to calculate its overlap, showing a similarity of 90% among the 3D models of repeated actions and thereby verifying the reliability of the hand contact data. Finally, a classifier based on a convolutional neural network (CNN) is proposed for the classification of hand movements, achieving 95% precision using the acquired hand contact data and demonstrating a potential application in real-world task scenarios. Ti3D-contact comprises 1872 interactions conducted by 12 healthy adults with 10 standardized objects, recording the entire contact areas between hands and objects at a high resolution of 0.72 mm.
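The two quantitative checks above, resolution as the mean nearest-neighbour Euclidean distance and repeat-trial similarity as voxel overlap, can be sketched as follows. This is a simplified illustration, not the authors' exact procedure: the paper's neighbour definition, voxel size, and overlap measure are not specified here, so the intersection-over-union below is an assumed stand-in.

```python
import numpy as np

def mean_resolution(points):
    """Mean nearest-neighbour Euclidean distance over a point cloud,
    used here as a resolution estimate. Brute-force O(n^2) sketch."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)          # exclude self-distances
    return d.min(axis=1).mean()

def voxel_overlap(points_a, points_b, voxel=1.0):
    """Voxelize two clouds on a grid of the given cell size and return
    their intersection-over-union (assumed similarity measure)."""
    def vox(p):
        return {tuple(np.floor(np.asarray(q) / voxel).astype(int)) for q in p}
    a, b = vox(points_a), vox(points_b)
    return len(a & b) / len(a | b)
```

With dataset point clouds loaded from the Digital Data files, `mean_resolution` would be expected to return values near the reported 0.72 mm, and `voxel_overlap` between repeated trials values near the reported 90%.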
A diverse set of 52 prescribed grasping and manipulation actions is covered, including the 33 most commonly used conventional grasping gestures summarized by the Feix team and the 9 basic manipulation gestures proposed by the Elliott team. Additionally, we introduce 10 new common manipulation gestures based on the classification method pioneered by the Bullock team, enhancing the realism of task-execution scenarios in daily life.