Gaze-based human-SmartHome-interaction by augmented reality controls

Abstract

The use of eye-tracking systems enables people with motor disabilities to interact with computers and thus with their environment. Combined with an optical see-through head-mounted display (OST-HMD), eye tracking allows interaction with virtual objects that are attached to real objects, or to actions that can be performed in the SmartHome environment. A user can thus trigger actions of real SmartHome actuators by gazing at the virtual objects shown in the OST-HMD. In this paper we propose a mobile system that combines a low-cost commercial eye tracker with a commercial OST-HMD. The system is intended for SmartHome applications. As a proof of concept, we control an LED strip light using gaze-based augmented reality controls. We present a calibration procedure for the OST-HMD and evaluate the influence of the OST-HMD on the accuracy of the eye tracking.
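The triggering mechanism the abstract describes — a gaze dwelling on a virtual control long enough to fire the corresponding actuator action — can be sketched as a dwell-time selection loop. The paper does not publish its implementation, so the control names, coordinates, and dwell threshold below are illustrative assumptions, not the authors' code:

```python
from dataclasses import dataclass

@dataclass
class Control:
    """A virtual AR control rendered in the OST-HMD (hypothetical layout)."""
    name: str   # e.g. the actuator action it triggers, such as "led_on"
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        # Is the gaze point inside this control's screen-space rectangle?
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def dwell_select(samples, controls, dwell_s: float = 0.8):
    """Return the names of controls triggered by gaze dwell.

    samples: iterable of (timestamp_s, gaze_x, gaze_y) from the eye tracker.
    A control fires once the gaze has rested on it for at least dwell_s
    seconds (an assumed threshold; the paper does not state one).
    """
    triggered = []
    current = None   # control the gaze is currently resting on
    start = None     # timestamp when that dwell began
    for t, gx, gy in samples:
        hit = next((c for c in controls if c.contains(gx, gy)), None)
        if hit is not current:
            # Gaze moved to a different control (or off all controls): restart dwell.
            current, start = hit, t
        elif hit is not None and t - start >= dwell_s:
            # Dwell threshold reached: fire the action, then require a fresh dwell.
            triggered.append(hit.name)
            current, start = None, None
    return triggered

# Hypothetical usage: three gaze samples resting on an "led_on" control.
controls = [Control("led_on", x=0, y=0, w=10, h=10)]
samples = [(0.0, 5, 5), (0.5, 5, 5), (1.0, 5, 5)]
print(dwell_select(samples, controls))  # → ['led_on']
```

In a real deployment the triggered name would be mapped to a SmartHome actuator command (here, switching the LED strip), and the gaze coordinates would come from the eye tracker after the OST-HMD calibration the paper evaluates.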

Citation

Cottin, T., Nordheimer, E., Wagner, A., & Badreddin, E. (2017). Gaze-based human-SmartHome-interaction by augmented reality controls. In Advances in Intelligent Systems and Computing (Vol. 540, pp. 378–385). Springer Verlag. https://doi.org/10.1007/978-3-319-49058-8_41
