An Autonomous Intervention Strategy for Robotic Soft Endoscope Guided by Anatomical Features
JIANG Wei1,2,3, WANG Chongyang1,2,4, YAN Bin5, HE Xiao1,2,4, CUI Huanqi1,2, PENG Lihua5, YANG Yunsheng5, LIU Hao1,2,3,4
1. State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China; 2. Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang 110169, China; 3. University of Chinese Academy of Sciences, Beijing 100049, China; 4. Key Laboratory of Minimally Invasive Surgical Robot, Liaoning Province, Shenyang 110016, China; 5. Department of Gastroenterology and Hepatology, Chinese PLA General Hospital, Beijing 100853, China
Abstract: An autonomous intervention strategy for a soft endoscope in the upper digestive tract is studied, based on the previously designed YunSRobot endoscopic robot. The anatomical features of the digestive tract are extracted from endoscopic images using the Faster R-CNN (faster region-based convolutional neural network) algorithm together with mathematical-morphology image processing. An autonomous endoscope manipulation and orientation strategy is then designed to perform autonomous endoscope intervention in the upper digestive tract. Comparative experiments are carried out on a high-fidelity phantom of the upper digestive tract, and the results show that the intervention success rate of the proposed method is 100%. Compared with master-slave intervention by a non-professional operator, the average duration of the autonomous intervention is reduced from 262.01 s to 197 s, and the maximum insertion force is decreased from 11.8 N to 9.6 N. Compared with master-slave intervention by a professional operator, the maximum insertion force is almost the same, and the average intervention duration is only 34.33 s longer. The proposed autonomous intervention strategy can also be applied to soft endoscope intervention in other natural orifices, such as the respiratory and urinary tracts.
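As an illustrative sketch only: the paper's actual pipeline relies on a trained Faster R-CNN detector, which is not reproduced here; the thresholds, function names, and the simple centroid-based steering rule below are all assumptions added for illustration. The morphological part of the idea — segment the dark lumen region, clean it with a morphological opening, and steer the endoscope tip toward its centroid — can be outlined as:

```python
import numpy as np

def erode(mask):
    """3x3 binary erosion: a pixel survives only if its whole neighbourhood is set."""
    h, w = mask.shape
    m = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out &= m[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def dilate(mask):
    """3x3 binary dilation: a pixel is set if any neighbour is set."""
    h, w = mask.shape
    m = np.pad(mask, 1, constant_values=False)
    out = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= m[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
    return out

def lumen_direction(gray, thresh=60):
    """Locate the dark lumen in a grayscale frame and return a steering
    vector (dx, dy) from the image centre toward the lumen centroid."""
    mask = gray < thresh                # lumen appears as the darkest region
    mask = dilate(erode(mask))         # morphological opening removes speckle noise
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                    # no lumen visible; caller must fall back
    h, w = gray.shape
    return (xs.mean() - w / 2, ys.mean() - h / 2)
```

In an actual system the bending command would be proportional to this vector, driving the lumen centroid toward the image centre; here the opening is the textbook erosion-then-dilation, which discards isolated dark pixels (e.g. shadows or specular artifacts) smaller than the structuring element while preserving the lumen blob.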