According to cognitive science research, the interaction intent of humans can be estimated by analyzing their outward behaviors. This paper applies that insight to propose a methodology for reliable analysis of human intention. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and a set of core components of human nonverbal behavior is outlined. These nonverbal behaviors are associated with recognition modules built on multimodal sensors, one per modality: localizing the sound source of the speaker in the auditory modality; recognizing the frontal face and facial expression in the visual modality; and estimating human trajectories, body pose and lean, and hand gestures in the spatial modality. As a post-processing step, temporal confidence reasoning is applied to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. As a result, an interactive robot can make informed engagement decisions and interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction. © ICROS 2013.
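The weighted integration of multi-dimensional cues into an engagement decision can be sketched as follows. This is a minimal illustration only: the cue names, weight values, and decision threshold below are assumptions for demonstration, not parameters reported in the paper.

```python
# Illustrative sketch of weighted multi-cue intention scoring.
# Cue names, weights, and threshold are hypothetical, not from the paper.

CUE_WEIGHTS = {
    "sound_source_toward_robot": 0.20,   # auditory modality
    "frontal_face": 0.20,                # visual modality
    "facial_expression_positive": 0.10,  # visual modality
    "approach_trajectory": 0.20,         # spatial modality
    "body_lean_forward": 0.15,           # spatial modality
    "hand_gesture": 0.15,                # spatial modality
}


def intention_score(cues: dict) -> float:
    """Weighted sum of per-cue confidences, each in [0, 1].

    Missing cues are treated as confidence 0.0.
    """
    return sum(w * cues.get(name, 0.0) for name, w in CUE_WEIGHTS.items())


def should_engage(cues: dict, threshold: float = 0.5) -> bool:
    """Engage the person when the integrated score passes the threshold."""
    return intention_score(cues) >= threshold
```

Because the weights sum to 1.0, the integrated score stays in [0, 1], so a fixed threshold can be compared across multiple persons to decide whom the robot should engage.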
Yun, S. S., Kim, M., Choi, M. T., & Song, J. B. (2013). Interaction intent analysis of multiple persons using nonverbal behavior features. Journal of Institute of Control, Robotics and Systems, 19(8), 738–744. https://doi.org/10.5302/J.ICROS.2013.13.1893