Iteratively adapting avatars using task-integrated optimisation


Abstract

Virtual Reality allows users to embody avatars that do not match their real bodies. Earlier work has selected changes to the avatar arbitrarily, so it remains unclear how avatars should be changed to improve users' performance. We propose a systematic approach for iteratively adapting an avatar, based on users' performance, so that it performs better for a given task. The approach is evaluated in a target selection task, where the forearms of the avatar are scaled to improve performance. Compared with the real arm lengths, the optimised forearms, scaled to 5.6 times their real length, reduce average tapping time significantly, by 18.7%. Additionally, with the adapted avatar, participants moved their real bodies and arms significantly less, and subjective measures show reduced physical demand and frustration. In a second study, we modify finger lengths for a linear tapping task to achieve a better-performing avatar, demonstrating the generalisability of the approach.
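The abstract does not specify which optimiser drives the task-integrated adaptation, so the following is only an illustrative sketch of the general idea: a single avatar parameter (here, a forearm scale factor) is adapted iteratively by measuring task performance at candidate values and narrowing in on the best one. The golden-section search, the `measure_task_time` stand-in, and its cost curve are assumptions for illustration; only the reported optimum near 5.6 comes from the paper.

```python
import math

def measure_task_time(scale):
    # Stand-in for a real measurement: average tapping time (seconds)
    # for a block of trials with the forearms scaled by `scale`.
    # This fake cost has its minimum near scale = 5.6, mirroring the
    # optimum reported in the abstract; real data would be noisy.
    return 0.65 + 0.01 * (scale - 5.6) ** 2

def optimise_scale(measure, lo=0.5, hi=8.0, iters=30):
    """Golden-section search over one avatar parameter.

    Each iteration asks for one new performance measurement and
    shrinks the bracket [lo, hi] around the best-performing scale.
    """
    phi = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618
    a, b = lo, hi
    c, d = b - phi * (b - a), a + phi * (b - a)
    fc, fd = measure(c), measure(d)
    for _ in range(iters):
        if fc < fd:
            # Minimum lies in [a, d]; reuse the measurement at c.
            b, d, fd = d, c, fc
            c = b - phi * (b - a)
            fc = measure(c)
        else:
            # Minimum lies in [c, b]; reuse the measurement at d.
            a, c, fc = c, d, fd
            d = a + phi * (b - a)
            fd = measure(d)
    return (a + b) / 2

best = optimise_scale(measure_task_time)
```

In a real study each call to `measure` would be a block of trials with a participant embodying the rescaled avatar, so a sample-efficient optimiser (e.g. Bayesian optimisation) would likely be preferred over this deterministic search.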

Citation (APA)
McIntosh, J., Zajac, H. D., Stefan, A. N., Bergström, J., & Hornbæk, K. (2020). Iteratively adapting avatars using task-integrated optimisation. In UIST 2020 - Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (pp. 709–721). Association for Computing Machinery, Inc. https://doi.org/10.1145/3379337.3415832
