Intelligent wheelchair and virtual training by LabVIEW

Abstract

This paper describes the implementation of three different controllers for a wheelchair built to improve the quality of life of people whose disabilities affect muscle strength, muscle power and tone of all limbs, and endurance of all muscles of the body. Three programmed methods give the user full control of the wheelchair. By acquiring the voltage signals generated by eye movements and using neuro-fuzzy networks to differentiate them, the wheelchair can be controlled through eye movement. To accommodate different user needs and environments, the wheelchair also includes voice control for spaces where the noise level is below 40 dB. Voice commands are detected with the Speech Recognition software installed by default on the Microsoft Windows operating system. The eye and voice controls were designed in LabVIEW with the Intelligent Control Toolkit (ICTL). Virtual simulators were developed in C# so that the patient can train in advance and achieve better control and performance of the wheelchair. The system was tested on a person with cerebral palsy. © 2010 Springer-Verlag.
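The abstract notes that voice commands are handled by the speech recognition built into Microsoft Windows. As a rough illustration only (not the authors' code), the minimal C# sketch below shows how such commands could be mapped to wheelchair motions through the .NET System.Speech wrapper around that recognizer; the command vocabulary, the confidence threshold, and the SendToWheelchair handler are assumptions for the example.

```csharp
// Minimal sketch: voice command recognition via the Windows Speech Recognition
// engine exposed in .NET (System.Speech). Requires a reference to System.Speech
// and a Windows machine with a default audio input device.
using System;
using System.Speech.Recognition;

class VoiceControlSketch
{
    static void Main()
    {
        // Restrict recognition to a small command vocabulary (assumed words).
        var commands = new Choices("forward", "back", "left", "right", "stop");
        var grammar = new Grammar(new GrammarBuilder(commands));

        using (var recognizer = new SpeechRecognitionEngine())
        {
            recognizer.LoadGrammar(grammar);
            recognizer.SetInputToDefaultAudioDevice();

            recognizer.SpeechRecognized += (sender, e) =>
            {
                // Act only on confident matches; the 0.7 threshold is illustrative.
                if (e.Result.Confidence > 0.7)
                    SendToWheelchair(e.Result.Text);
            };

            recognizer.RecognizeAsync(RecognizeMode.Multiple);
            Console.WriteLine("Listening for wheelchair commands. Press Enter to quit.");
            Console.ReadLine();
        }
    }

    // Hypothetical placeholder for the link to the wheelchair's motor controller.
    static void SendToWheelchair(string command)
    {
        Console.WriteLine($"Command: {command}");
    }
}
```

Constraining the grammar to a handful of command words, rather than using free dictation, keeps recognition robust in the quiet (below 40 dB) environments the paper targets.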

CITATION STYLE

APA

Ponce, P., Molina, A., Mendoza, R., Ruiz, M. A., Monnard, D. G., & Fernández Del Campo, L. D. (2010). Intelligent wheelchair and virtual training by LabVIEW. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6437 LNAI, pp. 422–435). https://doi.org/10.1007/978-3-642-16761-4_37