A diverse and multi-modal gait dataset of indoor and outdoor walks acquired using multiple cameras and sensors


Abstract

Gait datasets are often limited by a lack of diversity in participants, appearance, viewing angles, environments, annotations, and availability. We present a primary gait dataset comprising 1,560 annotated casual walks from 64 participants, recorded in both indoor and outdoor real-world environments. We used two digital cameras and a wearable digital goniometer to capture visual and motion-signal gait data, respectively. Because traditional gait-identification methods are often affected by the viewing angle and appearance of the participant, this dataset focuses on diversity across several aspects (e.g., participants’ attributes, background variations, and viewing angles). The dataset is captured from 8 viewing angles in 45° increments, along with alternative appearances for each participant, for example via a change of clothing. The dataset provides 3,120 videos containing approximately 748,800 image frames, with detailed annotations including approximately 56,160,000 bodily keypoint annotations (75 keypoints per video frame) and approximately 1,026,480 motion data points captured from the digital goniometer for three limb segments (thigh, upper arm, and head).

Cite (APA)

Topham, L. K., Khan, W., Al-Jumeily, D., Waraich, A., & Hussain, A. J. (2023). A diverse and multi-modal gait dataset of indoor and outdoor walks acquired using multiple cameras and sensors. Scientific Data, 10(1). https://doi.org/10.1038/s41597-023-02161-8
