4GAIT: Synchronized MoCap, video, GRF and EMG datasets: Acquisition, management and applications

Abstract

This article presents 4GAIT, a group of multimodal, high-quality Reference Human Motion Datasets. The Multimodal Human Motion Lab described in the article provides a comprehensive environment for multimodal data acquisition, management and analysis. The article introduces the proposed PJIIT Multimodal Human Motion Database (MHMD) model for multimodal data representation and storage, along with Motion Data Editor, a toolkit for multimodal motion data management, visualization and analysis. As an example, three synchronized multimodal motion datasets built on the MHMD model are described: 4GAIT-HM, 4GAIT-Parkinson and 4GAIT-MIS. Compared to currently available Human Motion Datasets, 4GAIT offers multiple data modalities presented within a unified model, higher video resolution, and a larger volume of motion data for specific medical tasks, which better meets the needs of medical studies and human body biomechanics research. © 2014 Springer International Publishing Switzerland.
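The abstract describes trials that combine MoCap, video, GRF and EMG streams under a unified data model. The sketch below is a hypothetical illustration of how such a synchronized multimodal trial record might be represented; the actual MHMD schema, field names, and sampling rates are not given in the abstract, so everything here (class name `GaitTrial`, rates, array shapes) is an assumption for illustration only.

```python
from dataclasses import dataclass, field
from typing import Dict, List
import numpy as np

@dataclass
class GaitTrial:
    """Hypothetical container for one synchronized multimodal gait trial.
    Not the MHMD model itself; field names and rates are illustrative."""
    subject_id: str
    session_id: str
    mocap: np.ndarray        # e.g. (frames, markers, 3) positions, assumed 100 Hz
    grf: np.ndarray          # ground reaction force samples, e.g. (samples, 6), assumed 1000 Hz
    emg: np.ndarray          # surface EMG, e.g. (samples, channels), assumed 1000 Hz
    video_paths: List[str]   # paths to the synchronized video recordings
    sample_rates: Dict[str, float] = field(default_factory=lambda: {
        "mocap": 100.0, "grf": 1000.0, "emg": 1000.0, "video": 25.0})

    def duration_s(self) -> float:
        """Trial duration derived from the MoCap stream length."""
        return self.mocap.shape[0] / self.sample_rates["mocap"]

    def grf_at_mocap_frame(self, frame: int) -> np.ndarray:
        """Return the GRF sample aligned with a given MoCap frame,
        assuming all streams share a common start trigger."""
        ratio = self.sample_rates["grf"] / self.sample_rates["mocap"]
        return self.grf[int(round(frame * ratio))]
```

Keeping the per-stream sampling rates alongside the raw arrays, as sketched above, is one simple way to support cross-modal alignment (e.g. pairing a force-plate sample with a MoCap frame) without resampling the data at acquisition time.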

Citation (APA)
Kulbacki, M., Segen, J., & Nowacki, J. P. (2014). 4GAIT: Synchronized MoCap, video, GRF and EMG datasets: Acquisition, management and applications. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8398 LNAI, pp. 555–564). Springer Verlag. https://doi.org/10.1007/978-3-319-05458-2_57
