Human activities with videos, inertial units and ambient sensors
Data files
Jan 31, 2021 version (8.13 GB)
- HWU-USP.zip
- Readme.pdf
Abstract
Methods
The data collection gathered multimodal recordings of individuals performing activities of daily living: inertial data from wearable devices, RGB and depth videos, and data from ambient sensors. All participants were adults without incapacitating physical or cognitive impairments. The experiments were performed at the Robotic Assisted Living Testbed (RALT), Heriot-Watt University, Edinburgh Campus. A TIAGo robot, manufactured by Pal Robotics, was placed in the corner of the kitchen of the smart home and recorded RGB and depth videos, while inertial sensors placed at the participant's wrist and waist recorded their movements. The recordings also included ambient sensors, i.e., switches on the cupboards and drawers, electrical current measurements, and presence detectors. All data were synchronised to allow experiments on multimodal human activity recognition. The activities considered were "making a cup of tea", "making a sandwich", "making a bowl of cereals", "setting the table", "using a laptop", "using a phone", "reading a newspaper", "cleaning the dishes", and "tidying the kitchen."
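Because the modalities are synchronised, samples from the different streams can be aligned by timestamp. The sketch below illustrates one way to do this with pandas; the file names and column names are assumptions for illustration only, not the dataset's actual layout, which is documented in Readme.pdf inside HWU-USP.zip.

```python
# Minimal sketch of timestamp-based alignment across modalities.
# File and column names below are hypothetical; see Readme.pdf for the
# real structure of the HWU-USP recordings.
import pandas as pd

# Hypothetical per-modality tables, each with a timestamp column in seconds.
inertial = pd.read_csv("inertial_wrist.csv")   # assumed columns: timestamp, ax, ay, az
frames = pd.read_csv("video_frames.csv")       # assumed columns: timestamp, frame_id
ambient = pd.read_csv("ambient_events.csv")    # assumed columns: timestamp, sensor, state

# merge_asof requires sorted keys.
inertial = inertial.sort_values("timestamp")
frames = frames.sort_values("timestamp")
ambient = ambient.sort_values("timestamp")

# Attach to each video frame the nearest-in-time inertial sample
# and the most recent ambient sensor event.
aligned = pd.merge_asof(frames, inertial, on="timestamp", direction="nearest")
aligned = pd.merge_asof(aligned, ambient, on="timestamp", direction="backward")

print(aligned.head())
```

This nearest-neighbour alignment is only one possible strategy; resampling all streams to a common rate before fusion is another common choice for multimodal activity recognition.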