A touchless gestural platform for the interaction with the patients data

Abstract

Usually, during surgery, an assistant or a nurse operates the mouse and keyboard in place of the surgeon, and this indirect manipulation may lead to misinterpretation of the images, communication problems, or misunderstandings. More recently, the spread of tablets and touchscreens, including in hospitals, has represented a step forward. However, in contexts that require an absolutely sterile environment, such as the operating room, the touchscreen is not a definitive solution. For this reason, the use of touchless technology in a medical context is motivated by the need for aseptic interaction with computer systems, and it offers the advantage of greater simplicity and intuitiveness of use. The system presented in this paper uses the Microsoft Kinect as the input sensor for detecting the user's hand movements. The idea is to create an interaction modality that allows doctors to interact with the patient's data without touching any device, simply by moving a hand in free space. The interaction is based on the movements of a single hand, and specific operations are associated with the corresponding gestures. The system allows the user to browse a list of patients and select one of them, consult the patient's data, display the medical images, and interact with them by translating and zooming in/out in order to highlight specific details of the image.
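The abstract does not give implementation details, but a minimal sketch of the gesture-to-command mapping it describes could look like the code below. The sketch assumes that one-hand positions (x, y, z, in metres) are already provided by the Kinect hand-tracking stage; all names (GestureMapper, Command, the threshold values) are illustrative assumptions rather than the authors' implementation. Hand movement in the image plane is interpreted as translation of the medical image, and movement toward or away from the sensor as zooming in/out.

# Illustrative sketch only: maps successive one-hand positions (assumed to
# come from the Kinect hand tracker) to pan/zoom commands for an image viewer.
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional, Tuple

Hand = Tuple[float, float, float]  # (x, y, z) hand position in metres


@dataclass
class Command:
    action: str          # "pan" or "zoom"
    dx: float = 0.0      # horizontal translation (metres of hand movement)
    dy: float = 0.0      # vertical translation
    factor: float = 1.0  # zoom factor (>1 zooms in, <1 zooms out)


class GestureMapper:
    """Turns successive one-hand positions into pan/zoom commands."""

    def __init__(self, pan_threshold: float = 0.01, zoom_threshold: float = 0.02) -> None:
        # Thresholds (assumed values) filter out sensor jitter so that small
        # involuntary movements do not trigger commands.
        self.pan_threshold = pan_threshold
        self.zoom_threshold = zoom_threshold
        self._prev: Optional[Hand] = None

    def update(self, hand: Hand) -> Optional[Command]:
        if self._prev is None:
            self._prev = hand
            return None
        dx = hand[0] - self._prev[0]
        dy = hand[1] - self._prev[1]
        dz = hand[2] - self._prev[2]
        self._prev = hand

        # Moving the hand toward/away from the sensor is read as a zoom gesture.
        if abs(dz) >= self.zoom_threshold:
            return Command(action="zoom", factor=1.0 - dz)  # closer -> zoom in
        # Lateral/vertical movement above the jitter threshold is read as a pan.
        if abs(dx) >= self.pan_threshold or abs(dy) >= self.pan_threshold:
            return Command(action="pan", dx=dx, dy=dy)
        return None


def gestures_to_commands(hand_stream: Iterable[Hand]) -> Iterator[Command]:
    """Consume a stream of hand positions and yield viewer commands."""
    mapper = GestureMapper()
    for hand in hand_stream:
        cmd = mapper.update(hand)
        if cmd is not None:
            yield cmd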

Cite (APA)

De Paolis, L. T. (2016). A touchless gestural platform for the interaction with the patients data. In IFMBE Proceedings (Vol. 57, pp. 874–878). Springer Verlag. https://doi.org/10.1007/978-3-319-32703-7_172
