Pointing gestures for a robot mediated communication interface

Abstract

This paper asked whether pointing gestures accompanying speech facilitate comprehension of spatial information in videoconference communication. Ten adults participated in the study and communicated with the experimenter over Skype (Skype Technologies, Luxembourg). The experimenter described the spatial layout of items in a room to the participants under two conditions, static and dynamic. In the static condition, a notebook computer remained stationary; in the dynamic condition, the notebook moved around, with artificial arms pointing to abstract spatial locations that represented the locations of the items in the room. The movement was achieved by mounting the notebook on a three-wheeled, Wi-Fi-enabled platform equipped with two artificial arms and controlled by the experimenter over the Internet. At the end of each description, the participants were asked to lay out the items accordingly. Reaction times and accuracy rates were recorded. The accuracy rate was higher in the dynamic condition than in the static condition, and responses were also faster in the dynamic condition. The results indicate that pointing gestures facilitated the comprehension of spatial information conveyed through speech. © 2009 Springer-Verlag Berlin Heidelberg.
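The abstract describes a within-subject comparison of the two conditions on accuracy and reaction time. The sketch below illustrates how such a paired comparison could be analyzed; the data are randomly generated placeholders and the use of a paired t-test is an assumption for illustration, since the paper's abstract reports only the direction of the effects, not the statistical test used.

```python
"""Sketch of a within-subject (paired) comparison between the static and
dynamic conditions. All values are hypothetical placeholders."""
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_participants = 10  # as in the study

# Placeholder measurements: accuracy rate (proportion correct) and
# response time (seconds) per participant, one value per condition.
acc_static  = rng.uniform(0.50, 0.80, n_participants)
acc_dynamic = rng.uniform(0.70, 0.95, n_participants)
rt_static   = rng.uniform(4.0, 7.0, n_participants)
rt_dynamic  = rng.uniform(3.0, 5.5, n_participants)

# Paired tests, since the same participants experienced both conditions.
t_acc, p_acc = ttest_rel(acc_dynamic, acc_static)
t_rt,  p_rt  = ttest_rel(rt_dynamic, rt_static)

print(f"Accuracy:      t = {t_acc:.2f}, p = {p_acc:.3f}")
print(f"Response time: t = {t_rt:.2f}, p = {p_rt:.3f}")
```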

Citation (APA)

Cabibihan, J. J., So, W. C., Nazar, M., & Ge, S. S. (2009). Pointing gestures for a robot mediated communication interface. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5928 LNAI, pp. 67–77). https://doi.org/10.1007/978-3-642-10817-4_7
