Fusion of inertial measurements and vision feedback for microsurgery


Abstract

A microsurgery system that achieves real-time, micrometer-scale positioning accuracy by fusing visual information from a high-speed monovision camera mounted on an optical surgical microscope with acceleration measurements from an intelligent handheld instrument, ITrem2, is presented. The high-speed camera captures images of the ITrem2 tool tip to track its position in real time. The focus value of the tool tip in the acquired image locates the tip along the principal axis of the microscope's objective lens, and edge-based geometric template matching gives its position in pixel coordinates. ITrem2 uses four dual-axis miniature digital MEMS accelerometers to sense and update motion information. The system maintains a first-in, first-out (FIFO) queue that tracks the recent history of slow, non-drifting position estimates from the vision system and acceleration readings from the inertial sensors, together with their respective time stamps. In the proposed method, real-time visual servoing of micrometer-scale motion is achieved by accounting for the dynamic behavior of the vision feedback and performing synchronized fusion of these complementary sensors. © 2013 Springer-Verlag.
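The timestamped-FIFO fusion described above can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, the 1-D state, and the replay logic are illustrative assumptions. The idea it shows is the one in the abstract: fast accelerometer samples dead-reckon the tool-tip position between vision updates, and when a (slow, latent) vision fix arrives, the estimate is reset to the fix at its capture time and the buffered acceleration samples recorded since that time are replayed to re-propagate to the present. A full implementation would work in 3-D and also correct velocity; here only position is corrected.

```python
from collections import deque

class VisionInertialFusion:
    """Illustrative sketch (names and structure are assumptions): fuse
    slow, non-drifting vision position fixes with fast accelerometer
    readings using a timestamped FIFO queue, as outlined in the abstract."""

    def __init__(self, maxlen=100):
        # FIFO of recent (timestamp, acceleration) samples
        self.accel_fifo = deque(maxlen=maxlen)
        self.position = 0.0   # fused 1-D position estimate
        self.velocity = 0.0   # velocity from integrated acceleration
        self.last_t = None

    def add_accel(self, t, a):
        """Dead-reckon between vision updates by integrating acceleration."""
        if self.last_t is not None:
            dt = t - self.last_t
            self.position += self.velocity * dt + 0.5 * a * dt * dt
            self.velocity += a * dt
        self.last_t = t
        self.accel_fifo.append((t, a))

    def add_vision(self, t, pos, capture_t):
        """A vision fix arrives with latency: it measured the pose at
        capture_t < t. Reset position to the fix, then replay the FIFO
        samples taken after capture_t to re-propagate to the present."""
        self.position = pos  # velocity correction omitted in this sketch
        self.last_t = capture_t
        for ts, a in [s for s in self.accel_fifo if s[0] > capture_t]:
            dt = ts - self.last_t
            self.position += self.velocity * dt + 0.5 * a * dt * dt
            self.velocity += a * dt
            self.last_t = ts
```

For example, three samples of constant unit acceleration from rest over 0.2 s dead-reckon the position to 0.5 · a · t² = 0.02; a later vision fix captured at t = 0.2 s with no newer inertial samples simply replaces the estimate.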

Citation (APA)

Aye, Y. N., Zhao, S., Shee, C. Y., & Ang, W. T. (2013). Fusion of inertial measurements and vision feedback for microsurgery. In Advances in Intelligent Systems and Computing (Vol. 194 AISC, pp. 27–35). Springer Verlag. https://doi.org/10.1007/978-3-642-33932-5_4
