We propose a novel paradigm for clinical diagnostic software that uses a mobile multi-touch device for user interaction and dedicated monitors for image display. We present a demonstrator implementing a workflow-based breast MRI reading system tailored to multi-touch interaction, exploring the feasibility of touch input for the diagnostic reading of MRI patient cases. The patient-centric, workflow-oriented concept is arranged around a multi-touch capable hybrid input-output device. In this contribution we introduce the demonstrator's clinically useful concepts. First, a mechanism we dubbed location awareness addresses security concerns. Reading is supported by (1) a patient browser with a graphical patient history and cancer risk factors; (2) a workflow concept using hanging protocols; and (3) dedicated ROI definition, annotation, and measurement tools based on multi-touch gestures. Gesture concepts and interaction paradigms are introduced to provide an intuitive user experience while maintaining accuracy. © 2012 Springer-Verlag Berlin Heidelberg.
CITATION STYLE
Harz, M., Ritter, F., Benten, S., Schilling, K., & Peitgen, H. O. (2012). A novel workflow-centric breast MRI reading prototype utilizing multitouch gestures. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7361 LNCS, pp. 276–283). https://doi.org/10.1007/978-3-642-31271-7_36