Automated segmentation of surgical motion for performance analysis and feedback

Abstract

Advances in technology have motivated the increasing use of virtual reality simulation-based training systems in surgical education, as well as the use of motion capture systems to record surgical performance. These systems can collect large volumes of trajectory data. The ability to analyse such motion data meaningfully is valuable for characterising and evaluating the quality of surgical technique, and for developing intelligent self-guided training systems with automated performance feedback. To this end, we propose an automatic trajectory segmentation technique, which divides surgical tool trajectories into their component movements according to spatio-temporal features. We evaluate this technique on two different temporal bone surgery tasks requiring the use of distinct surgical techniques and show that the proposed approach achieves higher accuracy compared to an existing method.
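The abstract does not describe the segmentation algorithm in detail. As a rough illustration only, and not the authors' method, the sketch below shows one generic way a tool-tip trajectory could be segmented into component movements using a spatio-temporal feature: here, dips in tool-tip speed are treated as candidate boundaries between movements. The array layout, sampling assumptions, and the `speed_threshold` value are assumptions made for this example.

```python
import numpy as np

def segment_trajectory(positions, timestamps, speed_threshold=0.01):
    """Split a tool trajectory into candidate movements at low-speed points.

    positions  : (N, 3) array of tool-tip coordinates (assumed layout)
    timestamps : (N,) array of strictly increasing sample times in seconds
    Returns a list of (start_index, end_index) segments.
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    # Finite-difference speed of the tool tip between consecutive samples.
    displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    dt = np.diff(timestamps)
    speed = displacements / dt

    # Candidate boundaries: samples where speed drops below the threshold,
    # i.e. brief pauses that may separate component movements.
    slow = speed < speed_threshold
    boundaries = [0]
    for i in range(1, len(slow)):
        if slow[i] and not slow[i - 1]:
            boundaries.append(i)
    boundaries.append(len(positions) - 1)

    # Pair consecutive boundaries into segments, dropping degenerate ones.
    return [(s, e) for s, e in zip(boundaries[:-1], boundaries[1:]) if e > s]
```

A velocity-threshold rule like this is only one possible spatio-temporal cue; the paper evaluates its own segmentation approach against an existing method on two temporal bone surgery tasks.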

Citation (APA)

Zhou, Y., Ioannou, I., Wijewickrema, S., Bailey, J., Kennedy, G., & O’Leary, S. (2015). Automated segmentation of surgical motion for performance analysis and feedback. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9349, pp. 379–386). Springer Verlag. https://doi.org/10.1007/978-3-319-24553-9_47
