ERmed - Towards medical multimodal cyber-physical environments

Abstract

With new technologies for medical cyber-physical systems, such as networked head-mounted displays (HMDs) and eye trackers, new opportunities arise for real-time interaction between cyber-physical systems and their users. This leads to cyber-physical environments in which the user plays an active role inside the cyber-physical system. In our medical application, set in the context of a cancer screening programme, we combine active speech-based input, passive/active eye-tracker input, and HMD output (all devices are on-body and hands-free) in a way that is convenient for both the patient and the doctor inside such a medical cyber-physical system. In this paper, we discuss the design and implementation of the resulting Medical Multimodal Cyber-Physical Environment and focus on how situation awareness provided by the environmental sensors effectively leads to an augmented cognition application for the doctor. © 2014 Springer International Publishing Switzerland.
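As a purely illustrative aside (not taken from the paper), the kind of multimodal combination the abstract describes could be sketched as a simple fusion step that pairs a recognized speech command with a co-occurring eye-tracker fixation to drive HMD output. All event types, field names, and thresholds below are hypothetical; this is a minimal sketch under the assumption of a time-window fusion rule, not the authors' implementation.

```python
# Illustrative sketch only: fusing speech and gaze events from on-body sensors
# into a single HMD output decision, using a hypothetical time-window rule.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class SpeechEvent:
    timestamp: float   # seconds since session start
    utterance: str     # recognized command, e.g. "annotate this finding"


@dataclass
class GazeEvent:
    timestamp: float
    region_of_interest: str   # e.g. "lesion_image", "report_panel"


FUSION_WINDOW_S = 1.5  # hypothetical: pair speech with gaze seen within 1.5 s


def fuse(speech: SpeechEvent, gaze_events: List[GazeEvent]) -> Optional[str]:
    """Return an HMD display action if a gaze fixation co-occurs with the utterance."""
    recent = [g for g in gaze_events
              if abs(g.timestamp - speech.timestamp) <= FUSION_WINDOW_S]
    if not recent:
        return None
    # Use the fixation closest in time to resolve deictic references ("this").
    target = min(recent, key=lambda g: abs(g.timestamp - speech.timestamp))
    return f"HMD: overlay '{speech.utterance}' on {target.region_of_interest}"


if __name__ == "__main__":
    gaze = [GazeEvent(10.2, "lesion_image"), GazeEvent(12.8, "report_panel")]
    cmd = SpeechEvent(10.9, "annotate this finding")
    print(fuse(cmd, gaze))  # -> HMD: overlay 'annotate this finding' on lesion_image
```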

Citation (APA)

Sonntag, D. (2014). ERmed - Towards medical multimodal cyber-physical environments. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8534 LNAI, pp. 359–370). Springer Verlag. https://doi.org/10.1007/978-3-319-07527-3_34
