This paper aims, first, to provide a flexible framework for developing recipe guidance systems that display information step by step in sync with events recognized in the user's activity, and second, to present an example implementation built on the proposed framework. People working on a task that requires high concentration are easily distracted by interactive systems that demand explicit manipulation. In such situations, recognizing events within the task itself is a helpful alternative to explicit manipulation. The framework allows a system designer to incorporate his or her own recognizer into the guidance system. Based on this framework, we implemented a system driven by the user's grabbing and releasing of objects: a grabbed object signals the user's intention of what to do next, and releasing the object indicates its completion. In experiments using the Wizard-of-Oz (WOZ) method, we confirmed that these actions worked well as switches for the interface. We also summarize some of our efforts toward automating the system. © 2014 Springer International Publishing.
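The grab/release interaction described in the abstract can be read as a simple event-driven state machine: grabbing an object triggers display of the next step, and releasing it marks that step complete. The following is a minimal sketch of that idea, assuming a guide that holds an ordered list of instruction strings; the class and method names (`RecipeGuide`, `on_grab`, `on_release`) are illustrative, not the authors' actual API.

```python
class RecipeGuide:
    """Hypothetical sketch: advance recipe steps on grab/release events.

    A grabbed object signals the user's intention to start the next step;
    releasing the object signals its completion (per the paper's abstract).
    """

    def __init__(self, steps):
        self.steps = steps    # ordered list of instruction strings
        self.index = -1       # no step shown yet
        self.holding = False  # True while the user holds an object

    def on_grab(self, obj):
        """Show the next step when an object is grabbed."""
        if not self.holding and self.index + 1 < len(self.steps):
            self.index += 1
            self.holding = True
            return self.steps[self.index]
        return None  # ignore grabs mid-step or past the last step

    def on_release(self, obj):
        """Mark the current step complete when the object is released."""
        if self.holding:
            self.holding = False
            return f"Step {self.index + 1} done"
        return None


# Example: two recipe steps driven by grab/release events.
guide = RecipeGuide(["Chop the onions", "Heat the pan"])
print(guide.on_grab("knife"))      # → Chop the onions
print(guide.on_release("knife"))   # → Step 1 done
print(guide.on_grab("pan"))        # → Heat the pan
```

In the paper's experiments these events were supplied by a wizard (WOZ method); an automated recognizer could feed the same `on_grab`/`on_release` calls without changing the guide logic.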
CITATION STYLE
Hashimoto, A., Inoue, J., Funatomi, T., & Minoh, M. (2014). How does user’s access to object make HCI smooth in recipe guidance? In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8528 LNCS, pp. 150–161). Springer Verlag. https://doi.org/10.1007/978-3-319-07308-8_15