Abstract
Background
We applied a new concept of “image overlay surgery”, integrating virtual reality (VR) and augmented reality (AR) technology, in which dynamic 3D images were superimposed on the patient’s actual body surface, and evaluated it as a reference for surgical navigation in gastrointestinal, hepatobiliary and pancreatic surgery.
Methods
We carried out seven surgeries, including three cholecystectomies, two gastrectomies and two colectomies. A Macintosh running the DICOM workstation OsiriX was used in the operating room for image analysis. Raw preoperative patient data obtained via MDCT were reconstructed into volume-rendered images and projected onto the patient's body surface during surgery. For accurate registration, OsiriX was first set to reproduce the patient's body surface, and the positional coordinates of the umbilicus, left and right nipples, and inguinal region were fixed as physiological markers on the body surface to reduce positional error.
Results
The registration process was non-invasive and markerless, and was completed within 5 min. Image overlay navigation aided three-dimensional anatomical understanding of the surgical target in the gastrointestinal, hepatobiliary and pancreatic anatomies. The surgeon was able to minimize gaze movement and could utilize the image assistance without interfering with the forceps operation, reducing the gap between the real and virtual views. Unexpected organ injury was avoided in all procedures. In biliary surgery, the virtual cholangiogram projected on the abdominal wall allowed the procedure to advance safely with identification of the bile duct. For early gastric and colorectal cancer, small tumors and blood vessels, which usually cannot be found on the gastric serosa under laparoscopic view, were simultaneously detected on the body surface by carbon dioxide-enhanced MDCT. This provided accurate reconstructions of the tumor and involved lymph nodes, directly linked with optimization of the surgical procedures.
Conclusions
Our non-invasive markerless registration using physiological markers on the body surface reduced logistical effort. The image overlay technique is a useful tool for highlighting hidden structures, providing additional information to the surgeon.
Introduction
Surgical navigation, which utilizes virtual reality (VR) to assist surgical procedures in much the same way that automobile navigation assists driving, is attracting much attention. However, when a surgeon consults an image displayed on a monitor, a gap emerges between that image and the actual visual field of the patient’s operative field; hence, a technology for dynamic 3D images that fuses the actual and virtual spaces has become necessary. Augmented reality (AR) is the superimposition of VR reconstructions onto images of the real patient, in real time [1]. This results in the visualization of internal structures through overlying tissues, providing a virtually transparent view of the surgical anatomy [2]. Although its application is at a preliminary stage, convincing evidence shows that it is an effective teaching tool for training residents, according to Shuhaiber’s report [3]. Its widespread use and the universal transfer of such technology will remain limited until registration and ergonomics are better understood. Mixed reality (MR) was defined by Milgram et al. [4] in 1994 as anywhere along the virtuality continuum (VC), which extends from the completely real to the completely virtual environment, with AR and augmented virtuality ranging in between. MR refers to the merging of real and virtual fields to produce new environments and visualizations in which physical and digital objects co-exist and interact in real time.
We have applied a new concept of “image overlay surgery”, integrating VR, AR and MR technology, in which computer-generated dynamic 3D images were superimposed on the actual space in front of the surgeon, on the patient’s operative field or the surface of the abdomen, and evaluated such a system as a reference for surgical navigation using OsiriX software (OsiriX Foundation, Geneva, Switzerland) [5–7]. In this paper, we evaluate the usefulness of image-assisted surgery using image overlay processed via OsiriX in laparoscopic surgery, natural orifice translumenal endoscopic surgery (NOTES) and hybrid procedures.
Subjects and methods
We carried out seven gastrointestinal and hepato-pancreato-biliary surgeries, including three cholecystectomies, two gastrectomies and two colectomies, at the Department of Surgery, Teikyo University Chiba Medical Center, Chiba, Japan. For surgery, an 8-core Mac Pro and a MacBook Pro (Apple Inc., CA, USA), running the DICOM workstation OsiriX (OsiriX Foundation, Geneva, Switzerland), were placed in the operating theater and employed for image analysis. Raw data obtained via a multidetector-row CT (MDCT) LightSpeed Ultra 16 scanner (GE Healthcare, Waukesha, WI; 500-ms rotation time, 0.625-mm slice thickness and mean scanning time of approximately 24 s) were used for preoperative patient information. Images were constructed by multidimensional real-time image reconstruction via OsiriX processing in the operating theater.
With the information obtained from conventional MDCT scans, it is difficult to simultaneously display the stomach, large intestines, small intestines, etc., with the courses of the blood vessels, owing to the characteristics of the imaging technique. For accurate image overlay projection of the abdominal organs on the body surface, the target organ and blood vessels must be simultaneously displayed for image assistance. We have devised a method to overcome this imaging deficiency, injecting carbon dioxide into the gastrointestinal tract and pancreatobiliary duct, and at the same time administering a rapid intravenous injection of contrast agent, then scanning, enabling the simultaneous display of the gastrointestinal tract and pancreatobiliary duct with associated blood vessels. Setting an appropriate Color Look Up Table (CLUT) in OsiriX, we made the skin of the image transparent, and the display showed the patient’s intraperitoneal anatomy (Fig. 1).
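Conceptually, the CLUT step maps each voxel's Hounsfield value to a color and an opacity, so that low-density tissue such as skin and fat becomes transparent while contrast-enhanced vessels and the walls of gas-distended lumina remain visible. The following sketch shows a piecewise-linear opacity ramp of the kind such a table encodes; the thresholds are illustrative assumptions, not the CLUT settings used in the study.

```python
import numpy as np

def opacity_transfer(hu, lo=150.0, hi=400.0):
    """Piecewise-linear opacity ramp over Hounsfield units (HU).

    Voxels below `lo` (air, fat, muscle, skin) are fully transparent;
    voxels above `hi` (contrast-filled vessels, bone) are fully opaque;
    opacity ramps linearly in between. Thresholds are illustrative only.
    """
    return np.clip((hu - lo) / (hi - lo), 0.0, 1.0)

# Illustrative voxel values: air, fat, muscle, contrast-enhanced vessel, bone
hu = np.array([-1000.0, -100.0, 40.0, 275.0, 1000.0])
alpha = opacity_transfer(hu)
```

With these thresholds, the skin and muscle voxels receive zero opacity and drop out of the rendered volume, which is what makes the "transparent abdomen" view possible.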
The image overlay technique was carried out as follows. Volume rendering of the voxel data was projected onto the patient’s body surface from an EMP-1715 projector system (Epson, Tokyo, Japan) located above the operating table after calculating the distance and installation angle between the operating table and the projector and correcting the projection angle. For registration, OsiriX was first set to reproduce the patient virtual body surface mode, and the positional coordinates of the umbilicus, left and right nipples, and inguinal region were fixed as physiological markers on the body surface (Fig. 2). Then the bone mode was set, and the xiphoid process in the epigastric region, the left and right costal arches, iliac crests, and pubic bones were superimposed over the corresponding areas in the patient to reduce the positional error. During surgery, we processed 3D volume-rendering reconstructions of the gastrointestinal tract, hepatobiliary pancreatic duct, blood vessels and soft tissue, and projected them on the patient’s abdomen. Registration was done with consistent anatomical landmarks as viewed visually and with 3D MDCT imaging (Fig. 3).
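The landmark-based alignment described above amounts to a rigid point-set registration between landmark coordinates in CT space and the same landmarks located on the patient. A common closed-form solution is the Kabsch (orthogonal Procrustes) algorithm; the sketch below is a generic illustration of that technique, not the authors' implementation.

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch): find R, t with R @ src_i + t ~ dst_i.

    src, dst: (N, 3) arrays of paired landmarks, e.g. the umbilicus,
    nipples, xiphoid process and iliac crests located both in the CT
    volume and on the patient's body surface.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: transform six landmarks by a known rotation and
# translation, then recover that transform from the point pairs alone.
rng = np.random.default_rng(0)
src = rng.standard_normal((6, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
dst = src @ R_true.T + t_true
R, t = rigid_register(src, dst)
```

In practice the residual after such a fit gives a direct estimate of the registration error that the physiological markers leave behind.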
Results
This registration process was non-invasive and completed within 5 min. By virtue of its excellent functionality and user-friendliness, not only surgeons and assistants but also other staff could easily consult 3D image navigation using OsiriX during surgery. Taking account of the dissociation between the field of vision and actual movement, an idiosyncratic feature of endoscopic surgery, we arranged the monitors side by side, one displaying the laparoscopic image and the other the virtual 3D image, with both deployed within the surgeon’s field of vision in the actual surgical setting (Fig. 4). The surgeon was able to minimize gaze movement and could utilize the image assistance without interfering with operation of the forceps, reducing the gap between the real and virtual views. By further extending this technology, we projected the 3D information simultaneously onto the patient’s abdomen, producing a new visual field. The difference in perceived position between real and virtual organs can be made negligible (within 10 mm on average) by correcting the sizes of the virtual objects so that they appear identical to the real organs.
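Scaling the virtual objects so that they appear the same size as the real anatomy is essentially a similar-triangles calculation from the projector geometry: the projected footprint grows linearly with throw distance. The sketch below is a hypothetical calibration helper; all parameter names and values are invented for illustration, since the paper reports only that the distance and projection angle were calculated and corrected.

```python
def corrected_size_px(real_size_mm, throw_mm, ref_throw_mm, ref_width_mm, width_px):
    """Pixels needed so a projected feature measures `real_size_mm` on the body.

    Pinhole-projector assumption: the projected image width grows linearly
    with throw distance, so the mm-per-pixel footprint at the body surface
    scales with (throw_mm / ref_throw_mm). All parameters are illustrative.
    """
    width_mm = ref_width_mm * (throw_mm / ref_throw_mm)  # image width at the body
    mm_per_px = width_mm / width_px
    return real_size_mm / mm_per_px

# Hypothetical numbers: a 100-mm vessel segment, projector at 2 m, with a
# projector spec of an 800-mm-wide image at 1 m and 1024-px native width.
px = corrected_size_px(100.0, 2000.0, 1000.0, 800.0, 1024)
```

A tilted projector additionally needs a keystone (perspective) correction, which is the "projection angle" adjustment mentioned in the Methods.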
The presented method provided an easy way to determine safety margins for patient setup errors in the registration of bony anatomy. The important role of the volume-rendered image was clearly demonstrated. In particular, by accurately reproducing the anatomical positions of the target organ and the abdominal wall, unexpected organ injury could be avoided when the ports were inserted (Fig. 5). In cases of re-operation, there were local adhesions behind the abdominal wall; in such cases, use of the image overlay allowed the originally planned, unsuitable port positions to be altered (Fig. 6). The operative times for these cases were shorter than in standard cases without the image overlay.
Our new system incorporating port placement planning and intraoperative navigation in minimally invasive surgery was established to aid the operative workflow. A significant reduction of operation time through improved planning and intraoperative support is anticipated. Significantly fewer intraoperative injuries and less bleeding were encountered with use of the image overlay. After laparoscopic insufflation, the loss of registration was within 5 mm; there were no problems with the navigation or surgical procedures.
In laparoscopic biliary surgery with image overlay navigation, we projected a reconstructed image of the virtual cholangiography onto the abdominal wall. This allowed the anatomy of the gallbladder, cystic duct and common bile duct around Calot’s triangle to be predicted, and the procedure could advance safely from identification of the cystic duct to its detachment. The virtual cholangiography with surrounding vessel anatomy on the abdominal wall reduced the number of ports and allowed safe cholecystectomy (Fig. 7).
For laparoscopic gastric and colorectal surgery in particular, when carbon dioxide was injected into the stomach or colon before MDCT scanning, the alimentary tract and blood vessels were simultaneously reconstructed using OsiriX, and this was used as the reference for image overlay navigation, displayed on the monitor and simultaneously on the patient’s abdomen. This provided accurate reconstructions of the site of the malignant tumors, and the involved lymph nodes were directly linked with optimization of the surgical procedures and techniques. Views of the abdominal vascular system were effective for lymph node dissection along the arteries (Fig. 4). For patients with early stage gastric or colonic cancer, minimally invasive laparoscopic surgery is well suited. By distending the whole gastric cavity with an intragastric injection of carbon dioxide, small early stage tumors, which usually cannot be found from outside the serosa, were detected through the rendered transparent stomach (Fig. 8).
Discussion
As surgical procedures become less invasive, the use of endoscopic surgery has become widespread, and avoiding problems such as incidental events and complications is regarded as a high priority for improving the safety of endoscopic surgery. In minimally invasive laparoscopic surgery and NOTES, because the surgeon is likely to have less tactile feedback than in the open surgical approach, image assistance can be increasingly helpful for three-dimensional anatomical understanding of the surgical target.
There is great demand for navigational guidance to assist surgical procedures in the special environment of endoscopic surgery. However, since existing commercially available systems are expensive, difficult to operate and require lengthy registration, a methodology to overcome these problems is much needed. Preoperative recognition together with intraoperative image consultation is also important, because numerous variations and anomalies exist, for example in the biliary tract and vascular system, and such consultation is effective for avoiding injuries to these structures and damage to other organs. The enhanced intraoperative orientation may lead to a further decrease in operation time and has the continuing ability to improve quality [8]. Potential advantages of image overlay in general surgery include the delineation of dissection planes or resection margins and the avoidance of injury to invisible structures. Image overlay has the potential to facilitate radical surgical therapy by minimizing dissection and resection of neighboring tissues and organs [2]. Image overlay has been used in interventions [9–12] and widely applied in neurosurgery [13–15], which has the clear advantage of minimal organ motion in a relatively fixed surgical field within a bony reference, facilitating the registration of the virtual image onto the real data. In contrast, the deformation of abdominal organs due to the heartbeat, ventilation or laparoscopic insufflation has limited the application of AR in general surgery [3, 16] and in laparoscopic and endoscopic surgery [2, 17–19].
Recently, Nakamoto et al. [20] reported a new method of 3D ultrasound-based navigation in laparoscopic liver surgery. They described a calibration method for intraoperative magnetic distortion using a magneto-optic hybrid tracker that can be applied to laparoscopic 3D ultrasound data acquisition. They suggested that their method could correct for magnetic field distortion inside the patient’s abdomen within a clinically permissible time, as well as enabling an accurate 3D ultrasound reconstruction that can be superimposed onto live endoscopic images.
Compared with the utility and costs of conventional AR or MR equipment, OsiriX is distributed free of charge and runs on a commercially available personal computer, so the cost is reasonable. OsiriX is a highly functional DICOM viewer designed specifically for Mac OS X and distributed as free software; it can be installed free of charge via the OsiriX and Apple Inc. web sites (http://osirix-viewer.com/). Applying image overlay technology to surgical assistance requires massive amounts of patient volume data and an environment capable of real-time volume rendering on the client terminal alone. OsiriX was an ideal software package to this end: it is simple to operate, and images can be easily reconstructed on a commercially available Macintosh computer. It was exceptionally easy to use, permitting various high-level functions with minimal user training. We have undertaken research aimed at developing systems that do not require patient immobilization and that enable the registration procedure to be carried out in a short time. We accordingly devised a method in which patient volume data acquired via preoperative MDCT are reconstructed during surgery using OsiriX into VR 3D images, which are projected optically onto the patient’s abdomen or operative field. This OsiriX image overlay method is highly practical from the perspective of user-friendliness. Considering that surgical operations to date have proceeded from the surgeon’s knowledge, experience and mental images, the advantage of utilizing image overlay technology in gastroenterological surgery is great, even allowing for minor errors caused by positional detection or respiratory changes.
Image registration is one of the most important problems in image overlay systems. Registration using natural features has many advantages [21–25]. It is simple, as no predefined fiducials or markers are required. It is robust, because it remains effective as long as at least six natural features are tracked during the entire augmentation, guaranteeing the existence of the corresponding projective matrices in the live video. Virtual objects can still be superimposed on the specified areas even if parts of those areas are occluded during the process. Markerless registration is a new procedure that may reduce logistical effort and possibly also the radiation load on the patient prior to a computer-assisted intervention. Tracking is also a very important research subject in a real-time AR context; the main requirements for trackers are high accuracy and low latency at a reasonable cost. For our non-invasive registration, we used physiological markers on the body surface. OsiriX was first set to reproduce the patient's body surface, and the positional coordinates of the umbilicus, left and right nipples, and inguinal region were fixed; then the bone mode was set, and the xiphoid process in the epigastric region, the left and right costal arches, iliac crests, and pubic bones were superimposed over the corresponding areas in the patient to reduce positional error. Our registration process was completed within about 5 min.
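Registration from tracked natural features reduces to estimating a projective transform from point correspondences. For the plane-to-plane case, a homography can be recovered in closed form by the direct linear transform (DLT) from four or more correspondences in general position; the six features mentioned above would give an overdetermined system. The numpy sketch below is a generic illustration of the technique, not the cited authors' implementation.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src (homogeneous) by DLT.

    src, dst: (N, 2) arrays of N >= 4 point correspondences, e.g. natural
    features matched between the preoperative model and the live video.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H's entries.
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)        # null-space vector = homography entries
    return H / H[2, 2]              # fix the overall scale

# Check against a known projective transform with six correspondences
H_true = np.array([[1.2, 0.1, 5.0],
                   [-0.05, 0.9, -3.0],
                   [1e-4, 2e-4, 1.0]])
src = np.array([[0, 0], [100, 0], [100, 100], [0, 100], [50, 25], [25, 75]], float)
pts = np.c_[src, np.ones(len(src))] @ H_true.T
dst = pts[:, :2] / pts[:, 2:]
H = homography_dlt(src, dst)
```

With noisy video features, the same least-squares machinery absorbs the redundancy that makes tracking six or more features robust.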
To further reduce the invasiveness of peritoneal access, the next logical step is to eliminate the abdominal wall incision by using natural orifices [26–28]. Widespread use of these NOTES techniques will depend on providing the physician with adequate visual feedback, clear indicators of instrument location and orientation, and support in recognizing anatomic structures. NOTES is thus another opportunity for image overlay guidance, and the image overlay technique may be a useful tool for resolving these limitations. It is anticipated that such augmentation will make interventional techniques easier to master and use in practice, and thus more likely to be widely adopted [29]. Endoscopic ultrasound-guided navigation for NOTES has been reported: the system reduced the mental burden of probe navigation and enhanced the operator’s ability to interpret the ultrasound image. Using an initial rigid body registration, the misregistration error between the ultrasound image and the reformatted CT plane was less than 5 mm, which is sufficient to enable novice users of endoscopic systems to approach the performance of expert users [30]. Highlighting hidden structures or overlays in the endoscope will give more information in difficult situations and enhance operation quality.
This system should improve safety and decrease operating time, but a problem remains: the display is little more than a projection of a preoperative image that does not change during the course of an operation. The surgical anatomy of the patient is continuously moved and altered, so the rigid image rapidly becomes inaccurate. Our technology may therefore be most useful for planning port placement at the beginning of laparoscopic surgery; once sensing technology that detects operative deformation and respiratory motion becomes available in the clinical setting, it could allow more accurate correspondence between the actual anatomy and the navigation.
Conclusions
In the pursuit of safety and minimal invasiveness in surgical procedures, the demand for surgical devices and surgery assistance systems is extremely high, and a system that permits effective visual assistance during surgery would be ideal. Conversely, given the rigorous demands for positional accuracy and safety in the medical arena, superimposed data need not be superior in image quality to be tolerated, provided the data are useful. Therefore, our simple OsiriX-based image overlay system, which takes into consideration technology, cost and installation requirements, has the potential to contribute greatly to reproducing surgical plans prepared before surgery and to reducing unexpected events during surgery. Further research is needed to evaluate its long-term clinical impact on patients, surgeons and hospital administrators. We anticipate that this system will continue to evolve for application in medical care in the near future, reducing invasiveness, improving patients’ quality of life and enhancing medical education, among other uses.
References
Tang SL, Kwoh CK, Teo MY, Sing NW, Ling KV. Augmented reality systems for medical applications. IEEE Eng Med Biol Mag. 1998;17:49–58.
Marescaux J, Rubino F, Arenas M, Mutter D, Soler L. Augmented-reality-assisted laparoscopic adrenalectomy. JAMA. 2004;292:2214–5.
Shuhaiber JH. Augmented reality in surgery. Arch Surg. 2004;139:170–4.
Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst. 1994;E77-D:1321–9.
Rosset A, Spadola L, Ratib O. OsiriX: an open-source software for navigating in multidimensional DICOM images. J Digit Imaging. 2004;17:205–16.
Rosset C, Rosset A, Ratib O. General consumer communication tools for improved image management and communication in medicine. J Digit Imaging. 2005;18:270–9.
Rosset A, Spadola L, Pysher L, Ratib O. Informatics in radiology (infoRAD): navigating the fifth dimension: innovative interface for multidimensional multimodality image navigation. Radiographics. 2006;26:299–308.
Bauernschmitt R, Feuerstein M, Traub J, Schirmbeck EU, Klinker G, Lange R. Optimal port placement and enhanced guidance in robotically assisted cardiac surgery. Surg Endosc. 2007;21:684–7. (Epub 2006 Dec 16).
Fichtinger G, Deguet A, Masamune K, Balogh E, Fischer GS, Mathieu H, et al. Image overlay guidance for needle insertion in CT scanner. IEEE Trans Biomed Eng. 2005;52:1415–24.
Das M, Sauer F, Schoepf UJ, Khamene A, Vogt SK, Schaller S, et al. Augmented reality visualization for CT-guided interventions: system description, feasibility, and initial evaluation in an abdominal phantom. Radiology. 2006;240:230–5. (Epub 2006 May 23).
Nicolau SA, Pennec X, Soler L, Ayache N. A complete augmented reality guidance system for liver punctures: first clinical evaluation. Med Image Comput Comput Assist Interv. 2005;8:539–47.
Deutschmann H, Steininger P, Nairz O, Kopp P, Merz F, Wurstbauer K, et al. “Augmented reality” in conventional simulation by projection of 3-D structures into 2-D images: a comparison with virtual methods. Strahlenther Onkol. 2008;184:93–9.
Iseki H, Masutani Y, Iwahara M, Tanikawa T, Muragaki Y, Taira T, et al. Volumegraph (overlaid three-dimensional image-guided navigation): clinical application of augmented reality in neurosurgery. Stereotact Funct Neurosurg. 1997;68:18–24.
Masutani Y, Dohi T, Yamane F, Iseki H, Takakura K. Augmented reality visualization system for intravascular neurosurgery. Comput Aided Surg. 1998;3:239–47.
Lovo EE, Quintana JC, Puebla MC, Torrealba G, Santos JL, Lira IH, et al. A novel, inexpensive method of image coregistration for applications in image-guided surgery using augmented reality. Neurosurgery. 2007;60:366–71. discussion 371–2.
Ackerman JD, Keller K, Fuchs H. Real-time anatomical 3D image extraction for laparoscopic surgery. Stud Health Technol Inform. 2001;81:18–22.
Kawamata T, Iseki H, Shibasaki T, Hori T. Endoscopic augmented reality navigation system for endonasal transsphenoidal surgery to treat pituitary tumors: technical note. Neurosurgery. 2002;50:1393–7.
Falk V, Mourgues F, Vieville T, Jacobs S, Holzhey D, Walther T, et al. Augmented reality for intraoperative guidance in endoscopic coronary artery bypass grafting. Surg Technol Int. 2005;14:231–5.
Ukimura O, Gill IS. Imaging-assisted endoscopic surgery: Cleveland clinic experience. J Endourol. 2008;22:803–10.
Nakamoto M, Nakada K, Sato Y, Konishi K, Hashizume M, Tamura S. Intraoperative magnetic tracker calibration using a magneto-optic hybrid tracker for 3-D ultrasound-based navigation in laparoscopic surgery. IEEE Trans Med Imaging. 2008;27:255–70.
Yuan ML, Ong SK, Nee AY. Registration using natural features for augmented reality systems. IEEE Trans Vis Comput Graph. 2006;12:569–80.
Marmulla R, Lüth T, Mühling J, Hassfeld S. Markerless laser registration in image-guided oral and maxillofacial surgery. J Oral Maxillofac Surg. 2004;62:845–51.
Comport AI, Marchand E, Pressigout M, Chaumette F. Real-time markerless tracking for augmented reality: the virtual visual serving framework. IEEE Trans Vis Comput Graph. 2006;12:615–28.
Marmulla R, Mühling J, Wirtz CR, Hassfeld S. High-resolution laser surface scanning for patient registration in cranial computer-assisted surgery. Minim Invasive Neurosurg. 2004;47:72–8.
Uenohara M, Kanade T. Vision-based object registration for real-time image overlay. Comput Biol Med. 1995;25:249–60.
Sugimoto M, Yasuda H, Koda K, Suzuki M, Yamazaki M, Tezuka T, et al. Evaluation for transvaginal and transgastric NOTES cholecystectomy in human and animal natural orifice translumenal endoscopic surgery. J Hepatobiliary Pancreat Surg. 2009;16:255–60. (Epub 2009 Apr 10).
Sugimoto M. Natural orifice translumenal endoscopic surgery (NOTES) for innovation in hepatobiliary and pancreatic surgery: preface. J Hepatobiliary Pancreat Surg. 2009;16:247–8. (Epub 2009 Apr 14).
Sugimoto M, Yasuda H, Koda K, Suzuki M, Yamazaki M, Tezuka T, et al. Rendezvous gastrotomy technique using direct percutaneous endoscopic gastrostomy for transgastric cholecystectomy in hybrid natural orifice translumenal endoscopic surgery. J Hepatobiliary Pancreat Surg. 2009. doi:10.1007/s00534-009-0143-1. (Epub 2009 Jul 15).
Vosburgh KG, San José Estépar R. Natural orifice transluminal endoscopic surgery (NOTES): an opportunity for augmented reality guidance. Stud Health Technol Inform. 2007;125:485–90.
Estépar RS, Stylopoulos N, Ellis R, Samset E, Westin CF, Thompson C, et al. Towards scarless surgery: an endoscopic ultrasound navigation system for transgastric access procedures. Comput Aided Surg. 2007;12:311–24.
Acknowledgments
This study was partially supported by a grant-in-aid from the Scientific Research Japan Society for the Promotion of Science, a JFE grant from the Japanese Foundation for Research and Promotion of Endoscopy, and the Teikyo University Tomoko Fujii memorial research grant for young medical researchers.
Sugimoto, M., Yasuda, H., Koda, K. et al. Image overlay navigation by markerless surface registration in gastrointestinal, hepatobiliary and pancreatic surgery. J Hepatobiliary Pancreat Sci 17, 629–636 (2010). https://doi.org/10.1007/s00534-009-0199-y