VR-Caps: A Virtual Environment for Capsule Endoscopy

Kağan İncetan, Institute of Biomedical Engineering, Bogazici University, Istanbul, Turkey.
Ibrahim Omer Celik, Department of Computer Engineering, Bogazici University, Istanbul, Turkey.
Abdulhamid Obeid, Institute of Biomedical Engineering, Bogazici University, Istanbul, Turkey.
Guliz Irem Gokceler, Institute of Biomedical Engineering, Bogazici University, Istanbul, Turkey.
Kutsev Bengisu Ozyoruk, Institute of Biomedical Engineering, Bogazici University, Istanbul, Turkey.
Yasin Almalioglu, Computer Science Department, University of Oxford, Oxford, UK.
Richard J. Chen, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA; Department of Biomedical Informatics, Harvard Medical School, Boston, MA, USA.
Faisal Mahmood, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA; Cancer Data Science, Dana Farber Cancer Institute, Boston, MA, USA; Cancer Program, Broad Institute of Harvard and MIT, Cambridge, MA, USA.
Hunter Gilbert, Department of Mechanical and Industrial Engineering, Louisiana State University, Baton Rouge, LA, USA.
Nicholas J. Durr, Department of Biomedical Engineering, Johns Hopkins University (JHU), Baltimore, MD, USA.
Mehmet Turan, Institute of Biomedical Engineering, Bogazici University, Istanbul, Turkey. Electronic address: mehmet.turan@boun.edu.tr.

Abstract

Current capsule endoscopes and next-generation robotic capsules for the diagnosis and treatment of gastrointestinal diseases are complex cyber-physical platforms that must orchestrate sophisticated software and hardware functions. The desired tasks for these systems include visual localization, depth estimation, 3D mapping, disease detection and segmentation, automated navigation, active control, path realization, and optional therapeutic modules such as targeted drug delivery and biopsy sampling. Data-driven algorithms promise to enable many advanced functionalities for capsule endoscopes, but real-world data is challenging to obtain. Physically realistic simulations that provide synthetic data have emerged as a solution for developing such data-driven algorithms. In this work, we present a comprehensive simulation platform for capsule endoscopy operations and introduce VR-Caps, a virtual active capsule environment that simulates a range of normal and abnormal tissue conditions (e.g., inflated, dry, wet), varied organ types, capsule endoscope designs (e.g., mono, stereo, dual, and 360° camera), and the type, number, strength, and placement of internal and external magnetic sources that enable active locomotion. VR-Caps makes it possible to develop, optimize, and test medical imaging and analysis software, independently or jointly, for current and next-generation endoscopic capsule systems. To validate this approach, we train state-of-the-art deep neural networks to accomplish various medical image analysis tasks using simulated data from VR-Caps and evaluate the performance of these models on real medical data. Results demonstrate the effectiveness of the proposed virtual platform for developing algorithms that quantify fractional coverage, estimate camera trajectory, reconstruct 3D maps, and classify disease. All code, pre-trained weights, and the created 3D organ models of the virtual environment, together with detailed instructions on how to set up and use the environment, are publicly available at https://github.com/CapsuleEndoscope/VirtualCapsuleEndoscopy, and a video demonstration can be seen in the supplementary videos (Video-I).
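
To make the validation workflow concrete, the following is a minimal sketch (not the authors' released code) of the sim-to-real pattern the abstract describes: train a disease classifier on synthetic frames rendered by VR-Caps, then evaluate it on real capsule endoscopy frames. The directory names ("vrcaps_synthetic/", "real_capsule/") and the ResNet-18 backbone are illustrative assumptions; any ImageFolder-style layout with one subfolder per class works.

# Sim-to-real sketch: train on synthetic VR-Caps frames, test on real frames.
# Paths and backbone are hypothetical; see the project README for the
# actual pipelines and pre-trained weights.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Synthetic training set rendered by VR-Caps; real evaluation set.
train_ds = datasets.ImageFolder("vrcaps_synthetic/", transform=tf)
test_ds = datasets.ImageFolder("real_capsule/", transform=tf)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
test_dl = DataLoader(test_ds, batch_size=32)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))
model = model.to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Train on synthetic frames only.
for epoch in range(5):
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# Sim-to-real evaluation: classification accuracy on real frames.
model.eval()
correct = total = 0
with torch.no_grad():
    for x, y in test_dl:
        x, y = x.to(device), y.to(device)
        correct += (model(x).argmax(1) == y).sum().item()
        total += y.numel()
print(f"real-data accuracy: {correct / total:.3f}")

The same train-on-synthetic, test-on-real structure applies to the other tasks mentioned above (depth estimation, 3D mapping, coverage estimation), with the classifier head and loss swapped for task-appropriate ones.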