Sensing user needs: Recognition technologies and user models for adaptive user interfaces

Abstract

In this paper we introduce the notion of a Virtual User Lab (VUL) that employs virtual reality tools to simulate end users in realistic application scenarios, in order to help industrial designers and application developers create and test adaptive interfaces that evolve as users' preferences and potential handicaps are discovered. We describe the key elements of the VUL, discuss computer vision-based algorithms for facial information processing to understand user behavior, and present an email-reading scenario to highlight the system's adaptive capabilities and practical usability. © 2011 Springer-Verlag.
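For context, the abstract does not detail the facial information processing pipeline, so the following is only a minimal illustrative sketch of the kind of vision-based face sensing an adaptive interface might build on, assuming OpenCV and a webcam; the Haar-cascade model and the printed output are stand-ins, not the authors' actual method.

# Illustrative sketch only: uses OpenCV's stock Haar-cascade face detector as a
# stand-in for the facial sensing an adaptive user interface could rely on.
import cv2

# Load OpenCV's bundled frontal-face cascade (an assumption; the VUL's actual
# models are not specified in this abstract).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture(0)  # default webcam
try:
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # A real adaptive interface would feed detections into a user model
        # (e.g. attention or presence); here we only report the count.
        print(f"detected {len(faces)} face(s)")
finally:
    capture.release()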


CITATION STYLE

APA

Takacs, B., Simon, L., & Peissner, M. (2011). Sensing user needs: Recognition technologies and user models for adaptive user interfaces. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6761 LNCS, pp. 498–506). https://doi.org/10.1007/978-3-642-21602-2_54
