We developed a contactless interface that uses hand gestures to control medical images in the operating room. We implemented an in-house program called GestureHook that exploits message-hooking techniques to convert gestures into specific program functions. For quantitative evaluation, we used gestures to control images of a dynamic biliary CT study and compared the task times with those of a mouse (8.54 ± 1.77 s vs. 5.29 ± 1.00 s; p < 0.001), and we measured the recognition rates of specific gestures and the success rates of tasks based on clinical scenarios. For clinical application, the program was set up in the operating room to browse images for plastic surgery. A surgeon browsed images from three different programs: CT images from a PACS program, volume-rendered images from a 3D PACS program, and surgical planning photographs from a basic image-viewing program. All programs could be seamlessly controlled by gestures and motions. This approach can control any operating room program without source code modification, giving surgeons a safe, contactless way to browse images and switch applications during surgical procedures.
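The core idea of the hooking-based keyboard and mouse-mapping technique is a lookup from recognized gestures to synthetic input events, which are then injected into whatever program has focus. The sketch below illustrates only that mapping step; the gesture names, key assignments, and the `translate_gesture` function are illustrative assumptions, not the bindings or code of the authors' GestureHook program (which injects events at the OS level, e.g. via Win32 input APIs).

```python
# Illustrative sketch of a gesture-to-input mapping (assumed names and
# bindings, not the actual GestureHook implementation).

GESTURE_TO_INPUT = {
    "swipe_left":  ("keyboard", "PAGE_UP"),    # previous image in the viewer
    "swipe_right": ("keyboard", "PAGE_DOWN"),  # next image in the viewer
    "pinch_out":   ("mouse",    "WHEEL_UP"),   # zoom in
    "pinch_in":    ("mouse",    "WHEEL_DOWN"), # zoom out
}

def translate_gesture(gesture: str):
    """Map a recognized gesture to a synthetic (device, action) event.

    In a real hooking-based system this event would be injected into the
    focused application at the OS level, so any viewer that already
    responds to keyboard/mouse input can be controlled without modifying
    its source code.
    """
    # Unrecognized gestures are simply ignored.
    return GESTURE_TO_INPUT.get(gesture)
```

Because the mapping targets ordinary keyboard and mouse events rather than a program-specific API, the same dispatcher works for a PACS viewer, a 3D PACS program, or a plain image viewer, which is what makes the approach application-agnostic.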
Citation:
Park, B. J., Jang, T., Choi, J. W., & Kim, N. (2016). Gesture-Controlled Interface for Contactless Control of Various Computer Programs with a Hooking-Based Keyboard and Mouse-Mapping Technique in the Operating Room. Computational and Mathematical Methods in Medicine, 2016. https://doi.org/10.1155/2016/5170379