Following the shift from time-based medical education to a competency-based approach, a computer-assisted training platform could help relieve some of the new time burden placed on physicians. A vital component of such platforms is the computation of competency metrics based on surgical tool motion. Recognizing the class and motion of surgical tools is one step in the development of a training platform, and tool recognition can be achieved with object detection. While previous literature has reported on tool recognition in minimally invasive surgeries, open surgeries have not received the same attention. Open Inguinal Hernia Repair (OIHR), a common surgery that general surgery residents must learn, is one such example. We present an object-detection method for recognizing surgical tools in simulated OIHR. Images were extracted from six video recordings of OIHR performed on phantoms, and the tools were labelled with bounding boxes. A YOLOv3 object-detection model was trained to recognize the tools used in OIHR. The per-class Average Precision scores and the mean Average Precision (mAP) were reported to benchmark the model's performance. The mAP across tool classes was 0.61, with individual Average Precision scores reaching up to 0.98. Tools with poor visibility or similar shapes, such as the forceps or scissors, achieved lower precision scores. With an object-detection network that can identify tools, research can proceed on tool-tissue interactions to achieve workflow recognition, which would allow a training platform to detect the tasks performed in hernia repair surgeries.
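For readers unfamiliar with the reported metrics, the sketch below shows one standard way per-class Average Precision is computed from detections (PASCAL-VOC-style all-point interpolation, matching predicted boxes to ground truth by intersection over union). This is a minimal illustration, not the authors' code; the IoU threshold, data layout, and helper names are assumptions.

```python
# Minimal sketch of per-class Average Precision (AP); mAP is the mean of
# per-class APs over the tool classes. Data layout here is illustrative.
import numpy as np

def iou(box_a, box_b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def average_precision(detections, ground_truths, iou_thresh=0.5):
    """AP for one class.
    detections: list of (image_id, confidence, box).
    ground_truths: dict mapping image_id -> list of boxes."""
    detections = sorted(detections, key=lambda d: d[1], reverse=True)
    matched = {img: [False] * len(b) for img, b in ground_truths.items()}
    n_gt = sum(len(b) for b in ground_truths.values())
    tp = np.zeros(len(detections)); fp = np.zeros(len(detections))
    for i, (img, _, box) in enumerate(detections):
        gts = ground_truths.get(img, [])
        ious = [iou(box, g) for g in gts]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= iou_thresh and not matched[img][best]:
            tp[i] = 1; matched[img][best] = True  # first match to a GT box
        else:
            fp[i] = 1  # duplicate, low-IoU, or background detection
    recall = np.cumsum(tp) / max(n_gt, 1)
    precision = np.cumsum(tp) / (np.cumsum(tp) + np.cumsum(fp) + 1e-9)
    # All-point interpolation of the precision-recall curve.
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    for k in range(len(mpre) - 2, -1, -1):
        mpre[k] = max(mpre[k], mpre[k + 1])
    idx = np.where(mrec[1:] != mrec[:-1])[0]
    return float(np.sum((mrec[idx + 1] - mrec[idx]) * mpre[idx + 1]))
```

Averaging `average_precision` over the tool classes yields the reported mAP; per-class scores expose which tools (e.g., forceps, scissors) the detector struggles with.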
As medical education adopts a competency-based training approach, assessment of skills and timely provision of formative feedback are required. Providing such assessment and feedback places a substantial time burden on surgeons. To reduce this burden, we aim to develop a computer-assisted training platform that provides both instruction and feedback to residents learning open Inguinal Hernia Repairs (IHR). To provide feedback on residents' technical skills, we must first establish a method of workflow recognition for the IHR. We therefore aim to recognize and distinguish between the workflow steps of an open IHR based on the presence and frequencies of the different tool-tissue interactions occurring during each step. Using ground-truth tissue segmentations and tool bounding boxes, we identify the tissues visible within each bounding box, which gives an estimate of the tissues a tool is interacting with. The presence and frequencies of these interactions during each step are then compared to determine whether this information can distinguish between steps. Based on the ground-truth tool-tissue interactions, the presence and frequencies of interactions during each step of the IHR show clear, distinguishable patterns. Given these distinct differences between steps, tool-tissue interaction profiles offer a viable method of step recognition for an open IHR performed on a phantom.
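As an illustration of the interaction estimate described above, the following sketch reads which tissue labels fall inside a tool's bounding box and accumulates per-step interaction frequencies. The tissue label values, class names, pixel threshold, and per-frame data layout are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch: estimate tool-tissue interactions from a tissue
# segmentation mask and a tool bounding box, then build a per-step
# interaction frequency profile. Label map below is hypothetical.
import numpy as np
from collections import Counter

TISSUE_NAMES = {1: "skin", 2: "fat", 3: "fascia", 4: "hernia_sac"}  # assumed

def tissues_in_box(seg_mask, box, min_pixels=50):
    """seg_mask: (H, W) integer tissue label map (0 = background).
    box: (x1, y1, x2, y2) tool bounding box in pixel coordinates.
    Returns the set of tissue names visible inside the box, ignoring
    tissues with fewer than min_pixels pixels to suppress edge noise."""
    x1, y1, x2, y2 = box
    roi = seg_mask[y1:y2, x1:x2]
    labels, counts = np.unique(roi, return_counts=True)
    return {TISSUE_NAMES[l] for l, c in zip(labels, counts)
            if l in TISSUE_NAMES and c >= min_pixels}

def step_interaction_profile(frames):
    """frames: iterable of (seg_mask, tool_name, box) for one workflow step.
    Returns a Counter of (tool, tissue) interaction frequencies."""
    profile = Counter()
    for seg_mask, tool, box in frames:
        for tissue in tissues_in_box(seg_mask, box):
            profile[(tool, tissue)] += 1
    return profile
```

Comparing the (tool, tissue) frequency profiles across steps, e.g., after normalizing by the number of frames in each step, is one way the distinguishable per-step patterns described above could be put to use for step recognition.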