Jasmine Winter Beatty, MBBS, BSc, MRCS1, Karen Kerr, PhD2, Imanol Luengo, PhD2, Petros Giatagas, PhD2, Danail Stoyanov, PhD2, James Kinross, PhD, FRCS1, Sanjay Purkayastha, MD, FRCS1. 1Imperial College London, 2Digital Surgery Ltd
Objective of Digital Surgery’s Ecosystem: Surgery is delivered by surgeons with different skill and experience levels, trained through apprenticeship models with non-standardized methods, across different teams and hospitals. Innovations towards achieving competent, effective and efficient surgical teams are required on a global scale to meet the demands of increased case volumes and safer surgery (Meara et al., 2015; WHA, 2015), while dealing with rapid technological advancements and reduced training opportunities (Simpson et al., 2011; Debas et al., 2005).
Digital Surgery’s technology: Building on its validated training software, Touch Surgery (Chidambaram et al., 2018; Bunogerane et al., 2018), Digital Surgery’s new ecosystem uses digital technologies and artificial intelligence (AI) to improve the delivery of surgical care and patient outcomes. An essential component of this ecosystem is novel AI software that analyzes surgical procedures. This is used to develop best-practice surgical workflows to train the Operating Room (OR) team, optimize team performance, monitor and assess operative competence, and improve efficiency and safety.
The technology specifically:
– digitizes surgical processes – creating workflows of individual surgeons’ techniques;
– provides surgical training software – supporting learning before surgery and in real time during surgery;
– provides induction software to OR staff – offering advanced knowledge of operative phases and necessary equipment;
– facilitates communication and teamwork between OR staff;
– provides insights into surgical performance.
The ecosystem enables surgical teams to create bespoke workflows combining operative steps and equipment details, promoting surgical standardization and offering a unique tool to support the entire OR team.
Workflows are presented in the OR on two auxiliary screens, one for the lead surgeon and one for the scrub nurse. The scrub nurse’s view concomitantly displays the instruments required for the current and the subsequent step, reducing both delays from waiting for equipment and waste from incorrect equipment being opened.
During early studies, progression through the workflow was controlled via a foot pedal. An AI algorithm has since been developed that detects operative steps in real time from the laparoscopic video feed and automatically progresses the information displayed, ensuring the workflow remains synchronized with the procedure.
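The pedal-free progression described above can be illustrated with a minimal sketch. A hypothetical per-frame step classifier feeds a sliding window of recent predictions, and the displayed workflow step advances only when a stable majority of the window votes for the next step; the names and thresholds below (`WorkflowTracker`, `window=30`) are illustrative assumptions, not Digital Surgery’s actual implementation.

```python
from collections import Counter, deque

class WorkflowTracker:
    """Advances the displayed workflow step from noisy per-frame
    step predictions (illustrative sketch, not the real system)."""

    def __init__(self, steps, window=30):
        self.steps = steps                  # ordered operative steps
        self.current = 0                    # index of the displayed step
        self.recent = deque(maxlen=window)  # sliding window of predictions

    def update(self, predicted_step):
        """Feed one per-frame prediction; advance only when the window's
        majority vote is the next step in the workflow."""
        self.recent.append(predicted_step)
        vote, count = Counter(self.recent).most_common(1)[0]
        # Require a stable majority over the full window size, so a few
        # misclassified frames cannot skip the display ahead.
        if (self.current + 1 < len(self.steps)
                and vote == self.steps[self.current + 1]
                and count > self.recent.maxlen // 2):
            self.current += 1
        return self.steps[self.current]
```

Smoothing over a window rather than trusting each frame keeps the on-screen workflow monotonic and robust to transient misclassifications in the video feed.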
Preliminary results: Over 240,000 laparoscopic sleeve gastrectomy video frames were segmented and labeled with operative steps and surgical instruments; these labels were then used to train the AI algorithm.
This algorithm was used to analyze 100 laparoscopic videos to identify the instruments associated with each operative step. The AI was then able to reproduce this both on Digital Surgery’s video-based platform (>70 cases) and in real time in the OR at St Mary’s Hospital, London (>10 patients). The algorithm identified the correct operative step from the live laparoscopic video feed and displayed the appropriate workflow information with 90–95% accuracy.
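The step–instrument association described above can be approximated from frame-level labels by simple co-occurrence counting: for each operative step, keep the instruments that appear in a sufficient fraction of that step’s frames. The function and threshold below are illustrative assumptions, not the method the authors report.

```python
from collections import Counter, defaultdict

def instruments_per_step(frame_labels, min_frac=0.2):
    """Given (step, instruments) labels for each video frame, return the
    instruments seen in at least `min_frac` of each step's frames.
    Illustrative sketch of step-instrument association by co-occurrence."""
    counts = defaultdict(Counter)  # step -> instrument -> frame count
    totals = Counter()             # step -> total frame count
    for step, instruments in frame_labels:
        totals[step] += 1
        counts[step].update(set(instruments))  # count each instrument once per frame
    return {
        step: sorted(inst for inst, c in counts[step].items()
                     if c / totals[step] >= min_frac)
        for step in totals
    }
```

Thresholding by frequency filters out instruments that appear only briefly in a step, which is one simple way such per-step equipment lists could be derived from labeled frames.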
Conclusion: We believe this technology will allow the digitization of surgical procedures, leading to standardization, improved training, optimized efficiency and, most importantly, increased safety and better outcomes for patients. The integration of AI within this intelligent OR will further support surgical teams globally in meeting the ever-increasing need for safer surgery.
Presented at the SAGES 2017 Annual Meeting in Houston, TX.
Abstract ID: 98899
Program Number: ET012
Presentation Session: Emerging Technology Session
Presentation Type: Podium