In this paper we introduce the notion of a Virtual User Lab (VUL) that employs virtual reality tools to simulate end-users in realistic application scenarios, helping industrial designers and application developers create and test adaptive interfaces that evolve as users' preferences and potential handicaps are discovered. We describe the key elements of the VUL, discuss computer vision-based algorithms for facial information processing to understand user behavior, and present an email-reading scenario to highlight the system's adaptive capabilities and practical usability. © 2011 Springer-Verlag.
CITATION STYLE
Takacs, B., Simon, L., & Peissner, M. (2011). Sensing user needs: Recognition technologies and user models for adaptive user interfaces. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6761 LNCS, pp. 498–506). https://doi.org/10.1007/978-3-642-21602-2_54