Quantification of Natural Multimodal Interaction Capacity


Abstract

In multimodal interaction, information is presented to users through multiple channels, e.g., sight, sound, touch, smell, and taste. Too much information delivered in a short time, however, may cause information overload that exceeds people's information-processing capacity. We summarized methods for quantifying this capacity, categorizing them by the span of storage or the speed of processing. The span of storage mainly includes short-term memory capacity, working memory capacity, and multiple-object tracking capacity. Working memory is required for many intellectual functions, and its capacity can be tested with change detection tasks, self-ordered tasks, and complex span tasks. We discussed whether different modalities have separate capacities, whether objects or features are stored, and whether capacity operates as discrete slots or a continuous resource pool. The speed of processing can be calculated as the information transfer rate from the stimulus-response matrix; entropy is used for more complex stimuli such as language. The relative capacity of multitasking, which is often involved in multimodal interaction, can be calculated with the capacity coefficient. We discussed the application of these methods to non-traditional modalities in human-computer interaction, e.g., touch, smell, and taste.
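The information transfer rate mentioned above is conventionally estimated as the mutual information between stimuli and responses in an identification experiment. The sketch below illustrates that computation on a confusion matrix; the matrix layout (rows = stimuli, columns = responses) and the example counts are illustrative assumptions, not data from the paper.

```python
import math

def mutual_information_bits(confusion):
    """Estimate transmitted information (in bits) from a
    stimulus-response confusion matrix of response counts.
    Rows index stimuli, columns index responses (an assumed layout)."""
    total = sum(sum(row) for row in confusion)
    n_s, n_r = len(confusion), len(confusion[0])
    # Marginal probabilities of stimuli and responses.
    p_s = [sum(row) / total for row in confusion]
    p_r = [sum(confusion[i][j] for i in range(n_s)) / total
           for j in range(n_r)]
    # Sum p(s, r) * log2( p(s, r) / (p(s) * p(r)) ) over nonzero cells.
    mi = 0.0
    for i in range(n_s):
        for j in range(n_r):
            p_sr = confusion[i][j] / total
            if p_sr > 0:
                mi += p_sr * math.log2(p_sr / (p_s[i] * p_r[j]))
    return mi

# Perfect identification of 4 equiprobable stimuli transmits
# log2(4) = 2 bits per stimulus.
perfect = [[25, 0, 0, 0],
           [0, 25, 0, 0],
           [0, 0, 25, 0],
           [0, 0, 0, 25]]
print(mutual_information_bits(perfect))  # → 2.0
```

Dividing the transmitted bits by the mean response time then yields a rate in bits per second, which is how processing speed is typically compared across channels.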

Citation (APA)

Zheng, J., Rau, P. L. P., & Zhao, J. (2020). Quantification of Natural Multimodal Interaction Capacity. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12192 LNCS, pp. 269–283). Springer. https://doi.org/10.1007/978-3-030-49788-0_20
