Surgical tool segmentation using a hybrid deep CNN-RNN auto encoder-decoder


Bibliographic Details
Published in: 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 3373-3378
Authors: Attia, Mohamed; Hossny, Mohammed; Nahavandi, Saeid; Asadi, Hamed
Format: Conference proceedings
Language: English
Published: IEEE, 01.10.2017
Subjects:
Online access: Full text
Description
Summary: Surgical tool segmentation is used for detection, tracking, and pose estimation of tools in surgical scenes. It is an essential task in surgical phase recognition and flow identification. Surgical flow identification remains an unresolved task in the domain of context-aware surgical systems, which are used extensively in computer-assisted intervention (CAI). CAI supports staff assignment, automated guidance during intervention, surgical alert systems, automatic indexing of surgical video databases, and optimisation of real-time operating-room scheduling. Semantic segmentation is used for accurate delineation of surgical tools from the background: each pixel is assigned to a class, either tool or background. In this work, we applied a hybrid method combining recurrent and convolutional networks to achieve higher surgical tool segmentation accuracy. The proposed method was trained and tested on the public MICCAI 2016 Endoscopic Vision Challenge Robotic Instruments dataset ("EndoVis"). Compared to state-of-the-art methods benchmarked on the same dataset, the proposed method achieved better performance: a balanced accuracy of 93.3% and a Jaccard index of 82.7%.
DOI: 10.1109/SMC.2017.8123151
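The abstract's core idea, labelling each pixel as tool or background by combining convolutional feature extraction with recurrent context aggregation, can be illustrated with a minimal NumPy sketch. This is not the authors' network: the kernel, recurrent weights, threshold, and row-wise scan direction below are all illustrative assumptions.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D convolution of a single-channel image with a small kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def rnn_rows(feat, w_in=0.9, w_rec=0.5):
    """Toy recurrent pass: each pixel's state mixes its local feature with
    the state of the pixel to its left (tanh activation), so context
    propagates along each row."""
    state = np.zeros_like(feat)
    for i in range(feat.shape[0]):
        prev = 0.0
        for j in range(feat.shape[1]):
            prev = np.tanh(w_in * feat[i, j] + w_rec * prev)
            state[i, j] = prev
    return state

def segment(img, kernel, threshold=0.5):
    """Per-pixel binary labels: 1 = tool, 0 = background."""
    feat = conv2d(img, kernel)     # convolutional feature extraction
    ctx = rnn_rows(feat)           # recurrent context aggregation
    return (ctx > threshold).astype(int)

# Toy example: a bright vertical "tool" on a dark background.
img = np.zeros((6, 6))
img[:, 2:4] = 1.0                  # the "tool" pixels
kernel = np.ones((3, 3)) / 9.0     # simple averaging filter
mask = segment(img, kernel)
print(mask)
```

In the output mask, the columns covering the bright stripe are labelled 1, with some rightward bleed from the left-to-right recurrent carry-over; a real CNN-RNN encoder-decoder would learn these weights and scan in multiple directions.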