A Cognitive User Interface for a Multi-modal Human-Machine Interaction


Abstract

We developed a hardware-based cognitive user interface that gives inexperienced users and people with little affinity for technology easy access to smart home devices. The interface interacts with the user via speech, gestures, or a touchscreen, and adapts to each individual by learning from the user’s behavior. In contrast to most commercial products, our solution keeps all data required for operation internally and connects to other UCUI devices only via an encrypted wireless network. By design, no data ever leave the system for the servers of third-party service providers, thus ensuring the user’s privacy.

Citation (APA)

Tschöpe, C., Duckhorn, F., Huber, M., Meyer, W., & Wolff, M. (2018). A Cognitive User Interface for a Multi-modal Human-Machine Interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11096 LNAI, pp. 707–717). Springer Verlag. https://doi.org/10.1007/978-3-319-99579-3_72
