~~NOTOC~~

====== Human Activity Recognition Datasets ======

===== The data =====

^ Name ^ Scenario ^ Original purpose ^ Sensors ^ Subjects ^ Seg. / Cont. ^ Static / periodic act. ^ Sporadic act. ^ Comments ^ Authors ^ Link ^
| Skoda mini checkpoint {{ :wiki:dataset:skodaminicp:logo.jpg?direct&100 |}} | 10 manipulative gestures performed in a car maintenance scenario | Gesture recognition | 20 3D acceleration sensors (60 attributes) | 1 | Segmented and continuous recordings in dataset | - | 10: write notes, open engine hood, close engine hood, check door gaps, open door, close door, open/close two doors, check trunk gap, open/close trunk, check steering wheel.\\ 70 instances of each gesture. | [[http://dl.acm.org/citation.cfm?id=2345781|ACM TECS 2012]]\\ [[http://dl.acm.org/citation.cfm?id=1786017|EWSN 2008]]\\ [[http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4496857|ISSNIP 2007]] | [[daniel.roggen@ieee.org|Daniel Roggen]], Piero Zappi | {{:wiki:dataset:skodaminicp:skodaminicp_2015_08.zip|}} |
| BodyAttack fitness {{ :wiki:dataset:bafitness:logo.jpg?direct&100 |}} | 6 fitness activity classes, done mostly with the legs | Analyse effect of sensor displacement | 10 3D accelerometers on the leg | 1 | C | 6 activities: flick kicks; knee lifts; jumping jacks; superman jumps; high knee runs; feet back runs | - | [[http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5254652|ISWC 2009]] | [[daniel.roggen@ieee.org|Daniel Roggen]], Kilian Foerster | {{:wiki:dataset:bafitness:bafitness.zip|}} |
| HCI gestures {{ :wiki:dataset:hci:logo.jpg?direct&100 |}} | 5 gestures performed freehand or guided against a blackboard | Analyse effect of sensor displacement | 8 3D acceleration sensors (24 attributes) | 1 | Segmented and continuous recordings in dataset | - | 5 gestures: triangle up, square, circle, infinity, triangle down.\\ 10 instances of freehand, 60 instances of guided gestures | [[http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5254652|ISWC 2009]] | [[daniel.roggen@ieee.org|Daniel Roggen]], Kilian Foerster | {{:wiki:dataset:hci:hci.zip|}} |
| Daphnet Freezing of Gait Dataset in users with Parkinson's disease {{ :wiki:dataset:daphnetfog:logo.jpg?direct&100 |}} | Gait recording of PD users with occasional freeze | Detection of gait freeze | 3 3D acceleration sensors (9 attributes) | 10 | C | walk, freeze | - | - | [[daniel.roggen@ieee.org|Daniel Roggen]], Marc Baechlin, Meir Plotnik, Jeffrey M. Hausdorff, Nir Giladi | {{:wiki:dataset:daphnetfog:dataset_fog_release.zip|}}\\ [[https://archive.ics.uci.edu/ml/datasets/Daphnet+Freezing+of+Gait|Also on the UCI ML repository]] |
| Opportunity Dataset \\ {{ :wiki:dataset:opportunity:logo.jpg?direct&100 |}} | Dataset of wearable, object, and ambient sensors recorded in a room simulating a studio flat where users performed early morning cleanup and breakfast activities. The dataset comprises freely executed "activities of daily living" (ADL) and a more constrained "drill" run. | Reference benchmark dataset for human activity recognition algorithms (classification, automatic data segmentation, sensor fusion, feature extraction, etc.) | Body-worn sensors: 7 inertial measurement units, 12 3D acceleration sensors, 4 3D localization information\\ Object sensors: 12 objects with 3D acceleration and 2D rate of turn\\ Ambient sensors: 13 switches and 8 3D acceleration sensors | 4 | C | Modes of locomotion and postures | 17 gestures in the Drill runs, larger number in the ADL runs | [[http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5573462&tag=1|Dataset publication]]\\ [[http://www.sciencedirect.com/science/article/pii/S0167865512004205|Challenge publication]] | [[daniel.roggen@ieee.org|Daniel Roggen]] and colleagues (see publications) | [[https://archive.ics.uci.edu/ml/datasets/OPPORTUNITY+Activity+Recognition|Available on the UCI ML repository]] |
| Opportunity++ \\ {{ :wiki:dataset:opportunitypp:logos-opportunity-final_50p_pp_v2_xp_33p.png?direct&100 |}} | Precisely annotated dataset designed to support AI and machine learning research focused on the multimodal perception and learning of human activities. It is a significant multimodal extension of the original OPPORTUNITY Activity Recognition Dataset, adding the original video recordings as well as video-derived skeleton tracking data. | Enables a wide range of novel multimodal activity recognition research based on video data, ambient- and object-integrated sensors, and wearable sensors (classification, automatic data segmentation, sensor fusion, feature extraction, etc.) | Body-worn sensors: 7 inertial measurement units, 12 3D acceleration sensors, 4 3D localization information\\ Object sensors: 12 objects with 3D acceleration and 2D rate of turn\\ Ambient sensors: 13 switches and 8 3D acceleration sensors\\ Side-view video\\ Motion capture from video using OpenPose | 4 | C | Modes of locomotion and postures | 17 gestures in the Drill runs, larger number in the ADL runs | [[https://www.frontiersin.org/articles/10.3389/fcomp.2021.792065/full|Dataset publication]] | [[daniel.roggen@ieee.org|Daniel Roggen]] and colleagues (see publication) | [[https://ieee-dataport.org/open-access/opportunity-multimodal-dataset-video-and-wearable-object-and-ambient-sensors-based-human|Available on IEEE DataPort]] |
| HCI Tabletop Gestures {{ :wiki:dataset:hcitable:hcitable-logo.png?direct&100 |}} | 39 writing gestures using the Palm alphabet performed in 3 sizes and on several touch surfaces: using a mouse sitting and standing, using a tablet standing, using a touchtable sitting and standing | Gesture recognition | Three 9 DoF IMUs at the finger, hand and wrist; one AHRS at the wrist (9 DoF IMU + orientation as quaternion); screen coordinates (48 attributes) | 10 | Continuous recording in dataset | - | 39 Palm alphabet gestures (numbers, letters and symbols).\\ 5 instances of each gesture per size and per touch surface. | None | [[daniel.roggen@ieee.org|Daniel Roggen]] | {{:wiki:dataset:hcitable:hcitable_release_2022_02_13.zip|}} |

===== Terminology =====

  * Segmented: the recordings start and stop to comprise exactly one instance of an activity (e.g. one "drink" gesture, or one "walk").
  * Continuous: the dataset contains a continuous recording of the data delivered by the sensors, within which usually several activities take place.
  * Static/periodic activities: activities for which the sensor signals are usually static or periodic, such as when taking static postures (sit, lie, stand), during locomotion (walking, running, bicycling), or when performing some repetitive moves (e.g. jumping jacks).
  * Sporadic activities: activities which are short-lived and embedded in a null class, such as "drinking from a cup" or "toggling a light switch".

===== Other dataset repositories =====

  * [[http://www.ess.tu-darmstadt.de/datasets|Embedded Sensing Systems, TU Darmstadt]]
  * [[http://www.cse.ust.hk/~derekhh/ActivityRecognition/index.html|Derek Hao Hu's list]]
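Continuous recordings such as the ones above are typically cut into fixed-length sliding windows before feature extraction and classification. A minimal sketch in Python with NumPy, using synthetic data (the 64 Hz sampling rate, 2 s window length and 50% overlap are illustrative choices, not parameters of any dataset listed here):

```python
import numpy as np

def sliding_windows(signal: np.ndarray, window: int, step: int) -> np.ndarray:
    """Cut a continuous recording (samples x channels) into fixed-length windows."""
    n = (len(signal) - window) // step + 1
    return np.stack([signal[i * step : i * step + window] for i in range(n)])

# Synthetic stand-in for a 3-axis accelerometer stream: 10 s sampled at 64 Hz.
fs = 64
acc = np.random.randn(10 * fs, 3)

# 2 s windows with 50% overlap -> 9 windows of 128 samples x 3 axes.
windows = sliding_windows(acc, window=2 * fs, step=fs)
print(windows.shape)  # (9, 128, 3)
```

Each window is then classified independently; for segmented datasets this step is unnecessary, since every recording already contains exactly one activity instance.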