Combining human action sensing of wheelchair users and machine learning for autonomous accessibility data collection


Abstract

The recent increase in the use of intelligent devices such as smartphones has strengthened the link between sensing of everyday human behavior and useful applications in ubiquitous computing. This paper proposes a novel method, inspired by personal sensing technologies, for collecting and visualizing road accessibility at lower cost than traditional data-collection methods. To evaluate the method, we recorded the outdoor activities of nine wheelchair users for approximately one hour each using an accelerometer on an iPod touch and a camcorder, labeled the supervised data by hand from the video, and estimated wheelchair actions as a measure of street-level accessibility in Tokyo. The system detected curb climbing, moving on tactile indicators, moving on slopes, and stopping with F-scores of 0.63, 0.65, 0.50, and 0.91, respectively. In addition, we conducted experiments with an artificially limited amount of training data to investigate how many samples are required to estimate the target actions.
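The pipeline the abstract describes — segmenting accelerometer readings into windows, extracting features, classifying each window as a wheelchair action, and scoring each class with an F-score — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size, the mean/standard-deviation features, and the nearest-centroid classifier are all simplifying assumptions.

```python
# Hypothetical sketch of the abstract's pipeline: triaxial accelerometer
# windows -> simple statistical features -> per-window action label ->
# per-class F-score. All names and parameters here are illustrative.
import math
from collections import defaultdict

WINDOW = 32  # samples per window (assumed; the paper's setting may differ)

def features(window):
    """Mean and standard deviation of the acceleration magnitude."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return (mean, math.sqrt(var))

def nearest_centroid_fit(X, y):
    """Compute one centroid per action label from labeled feature vectors."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (f1, f2), label in zip(X, y):
        s = sums[label]
        s[0] += f1; s[1] += f2; s[2] += 1
    return {lab: (s[0] / s[2], s[1] / s[2]) for lab, s in sums.items()}

def nearest_centroid_predict(centroids, x):
    """Assign the label whose centroid is closest in feature space."""
    return min(centroids, key=lambda lab: (x[0] - centroids[lab][0]) ** 2
                                        + (x[1] - centroids[lab][1]) ** 2)

def f_score(y_true, y_pred, positive):
    """F1 for one action class, as reported per class in the abstract."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    prec, rec = tp / (tp + fp), tp / (tp + fn)
    return 2 * prec * rec / (prec + rec)
```

In the paper's setting, each one-hour recording would yield many such windows, with labels ("curb climbing", "tactile indicator", "slope", "stop") taken from the hand-annotated video; the per-class F-scores summarize how well each action is recovered.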

Citation (APA)
Iwasawa, Y., Eguchi Yairi, I., & Matsuo, Y. (2016). Combining human action sensing of wheelchair users and machine learning for autonomous accessibility data collection. IEICE Transactions on Information and Systems, E99D(4), 1153–1161. https://doi.org/10.1587/transinf.2015EDP7278
