Orthopaedic surgery is a complicated procedure that involves many kinds of equipment. Operating room nurses, who hand surgical instruments to the surgeon, therefore bear a heavy burden. This study aims to develop a computer-aided orthopaedic surgery (CAOS)-AI navigation system that assists operating room nurses by indicating the current progress of the procedure and the surgical instruments expected next. This paper proposes a method for recognizing the current phase of an orthopaedic procedure from images captured by a surgeon-worn video camera; the method plays a fundamental role in the CAOS-AI navigation system. The proposed method is based on a convolutional neural network combined with a long short-term memory (LSTM) network. We also investigate which CNN backbone among competitive models such as VGG16, DenseNet, and ResNet yields the best recognition accuracy. Experimental results on unicompartmental knee arthroplasty (UKA) surgeries showed that the proposed method achieved phase recognition accuracies of 48.2%, 41.2%, and 53.6% using VGG16, DenseNet, and ResNet, respectively.
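To illustrate the kind of architecture the abstract describes, the following is a minimal sketch of a CNN-LSTM phase recognizer: a frame-level CNN backbone feeds per-frame features into an LSTM that models the temporal order of surgical phases. The backbone choice (ResNet-18), hidden size, and number of phases are placeholder assumptions for illustration, not values reported in the paper.

```python
# Hedged sketch of a CNN-LSTM surgical phase recognizer (not the authors' code).
import torch
import torch.nn as nn
from torchvision import models

class PhaseRecognizer(nn.Module):
    def __init__(self, num_phases: int = 8, hidden_size: int = 256):
        super().__init__()
        # Frame-level CNN feature extractor (ResNet-18 here; the paper also
        # evaluates VGG16 and DenseNet backbones).
        self.backbone = models.resnet18()
        self.backbone.fc = nn.Identity()          # expose 512-d frame features
        # LSTM models the temporal ordering of the surgical phases.
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden_size,
                            batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_phases)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, time, 3, H, W) frames from the wearable camera
        b, t, c, h, w = clips.shape
        feats = self.backbone(clips.reshape(b * t, c, h, w)).reshape(b, t, -1)
        temporal, _ = self.lstm(feats)
        return self.classifier(temporal)          # per-frame phase logits

# Example: classify each frame of a 16-frame clip into one of 8 assumed phases.
model = PhaseRecognizer()
logits = model(torch.randn(2, 16, 3, 224, 224))   # -> (2, 16, 8)
```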
Citation: Nishio, S., Hossain, B., Nii, M., Yagi, N., Hiranaka, T., & Kobashi, S. (2020). Surgical phase recognition with wearable video camera for computer-aided orthopaedic surgery-AI navigation system. International Journal of Affective Engineering, 19(2), 137–143. https://doi.org/10.5057/ijae.ijae-d-19-00018