1 Introduction

Robot-assisted surgery is a treatment method that uses a surgical support system to enable safe, minimally invasive surgery with 3D vision and forceps offering seven degrees of freedom. In the US, robot-assisted radical prostatectomy (RARP) is performed in approximately 80% of prostate cancer cases treated with radical prostatectomy (RP). Compared with conventional surgery, RARP aims at functional preservation, including curability and urinary continence, but complications such as bleeding and anastomotic failure during the early learning curve after its introduction remain problematic. In addition, accurate preservation of the nerves around the prostate is desired to retain postoperative function (urinary continence and sexual function). For small renal cancers, the importance of preserving renal function as well as achieving cure has been emphasized and partial nephrectomy has become widespread; robot-assisted partial nephrectomy (RAPN), which minimizes treatment invasiveness and shortens the ischemic time, has also spread. In these robot-assisted urological surgeries, navigation can be a very important support technique for increasing surgical accuracy.

2 Augmented Reality for Surgical Navigation

Augmented reality (AR) is a technology that uses a computer to present additional information within the real environment perceived by humans. In the medical field, information that cannot be perceived directly, such as the extent of tumor spread within an organ or the distribution of tumor-feeding blood vessels, is superimposed in real time on the endoscopic image so that it can be recognized visually in the surgical field; this is regarded as a surgical support technique aimed at better achieving the objectives of surgery (Marescaux et al. 2004). Additional information is generally presented as 3-dimensional images matched to the spatial arrangement of the target, but explanatory text or sound related to the target may also be used; any means of augmenting the information perceptible in reality qualifies (Ukimura and Gill 2009a).

Surgical navigation requires the real-time superimposed display, on the endoscopic image, of additional information about the target acquired by computer processing, which in turn requires real-time acquisition of the target's spatial position. In car navigation, for example, the current position of the car on the earth is identified using the Global Positioning System (GPS), a satellite-based global position-measurement system. The car's movement is traced by satellite in real time, and the current position along candidate routes to a preset destination, together with the estimated time of arrival, is displayed as additional information. Surgical navigation similarly requires an organ-tracking system that acquires the spatial position of the target in real time, analogous to satellite GPS.

3 Intraoperative Navigation in Robot-Assisted Urological Surgery

Transrectal ultrasound (TRUS) navigation is used in open and laparoscopic radical prostatectomy, and its usefulness has been reported (Ukimura et al. 2006; Okihara et al. 2009). The da Vinci system used for RARP has a function, termed TilePro™, that displays ultrasound and CT/MRI images alongside the surgical field on the 3D endoscope screen, so real-time ultrasound images can easily be provided to the surgeon through a cable connection. Using this function, information on the exact transection plane and the posterior surface of the organ can be provided (Fig. 38.1), facilitating navigation for safe and reliable surgery.

Fig. 38.1 Intraoperative transrectal ultrasound (TRUS) images of a prostate in RARP

Laparoscopic partial nephrectomy (LPN) has spread with the aims of preserving renal function and reducing invasiveness, but it is a difficult procedure because resection, hemostasis, and suturing must be performed laparoscopically within a limited ischemic time. Robot-assisted partial nephrectomy (RAPN) has therefore spread in the expectation of overcoming this technical difficulty, reducing invasiveness, and shortening the ischemic time. Before partial nephrectomy, 3-dimensional images of the tumor location and the renal arterial and venous distributions are prepared using software such as OsiriX (Pixmeo, Switzerland) or the Synapse Vincent system (FUJIFILM, Japan) (Fig. 38.2) and projected in the console during surgery using TilePro™. In the standard procedure, before tumor resection, an ultrasound probe is placed in the body cavity and a resection line is marked around the tumor by coagulation with monopolar curved scissors held in the right hand while the ultrasound images are displayed on the console screen using the TilePro™ multi-input display (Fig. 38.3).

Fig. 38.2 3D-reconstructed CT images by OsiriX®

Fig. 38.3 Intraoperative ultrasound and 3D-reconstructed CT images on the TilePro™ multi-input display in RAPN

4 Development of AR Navigation in Our Department

To our knowledge, the world's first AR navigation in the urology field demonstrated in public was the live laparoscopic partial nephrectomy performed by our team at the 2006 World Congress of Endourology annual meeting held in Cleveland (Ukimura and Gill 2008). Our AR system comprises a computerized workstation and surgical instruments (a rigid laparoscope and laparoscopic surgical instruments) tracked by an infrared optical position-tracking camera (Polaris, Northern Digital, Waterloo, Canada) (Fig. 38.4). We first developed AR navigation for radical prostatectomy, using models of the cancer and neurovascular bundle visualized by transrectal ultrasound imaging at the operating table immediately before surgery.

For partial nephrectomy, a 3-dimensional model of the kidney and renal tumor (and, in the current version, the tumor vessels) was prepared from contrast-enhanced CT data acquired at a slice thickness of 1 mm or less on the day before surgery. Characteristic points on the hilar vessels and on the curved kidney surface in the surgical field were identified with a positional sensor probe, and registration to the corresponding points and surface of the 3-dimensional model was performed with the Iterative Closest Point (ICP) method to realize AR by superimposed display. The intrarenal extension of the tumor, invisible in the conventional endoscopic view, could thus be observed as if the organ were transparent, supporting the surgery.

During this development we also built a system that tracks the tip and direction of the laparoscopic scissors with the optical tracking system and uses the AR function to calculate the distance between the tumor and the tool tip. With this system we proposed the concept of a “Surgical Radar” that corrects the ideal direction of resection in real time to achieve negative resection margins (Ukimura and Gill 2009b). To define the spatial position and direction of the scissors, we designed a color-map display, resembling a weather map, that shows in real time the direction pointed by the scissors and their distance from the tumor: the “Surgical Radar” can display the regions 5 and 10 mm from the outer margin of the tumor, for example presenting distances up to 5 mm in yellow and 5–10 mm in green (Fig. 38.5). We subsequently developed a result-predictive AR function with which the operator can explore on the “Surgical Radar”, in real time, the distance between the current resection site and the tumor and the direction in which to cut next, and reported this as a concept leading toward future automation of robotic surgery, analogous to the currently popular topic of autonomous cars (Ukimura and Gill 2008). Aiming at non-ischemic partial nephrectomy, we also designed a new 3D reno-vascular tumor model in which the kidney tumor and the intrarenal vascular distribution can be recognized as if the interior of the organ were transparent, and realized AR using this model (Fig. 38.6) (Ukimura et al. 2012; Nakamoto et al. 2012).
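
As a concrete illustration of the registration step, the following is a minimal sketch of point-to-point ICP in Python/NumPy. It is not our clinical software: the brute-force nearest-neighbor search, the convergence tolerance, and the function names are illustrative assumptions, and a practical system would use an accelerated nearest-neighbor structure and typically a point-to-plane error metric.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) carrying src onto dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                 # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, dst_c - R @ src_c

def icp(model_pts, probed_pts, iters=50, tol=1e-6):
    """Align the preoperative model (N, 3) to sparse points probed on the
    organ surface (M, 3); returns the accumulated rotation and translation."""
    src = model_pts.copy()
    R_tot, t_tot = np.eye(3), np.zeros(3)
    prev = np.inf
    for _ in range(iters):
        # Match each probed point to the closest point of the moved model.
        d = np.linalg.norm(probed_pts[:, None, :] - src[None, :, :], axis=2)
        closest = src[d.argmin(axis=1)]
        # Fit and apply the rigid motion carrying those matches onto the probe.
        R, t = best_rigid_transform(closest, probed_pts)
        src = src @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
        err = np.mean(np.linalg.norm(closest @ R.T + t - probed_pts, axis=1))
        if abs(prev - err) < tol:
            break
        prev = err
    return R_tot, t_tot  # maps model coordinates into the surgical field
```

ICP converges only locally, so in practice the probed points and the model are brought into rough initial alignment (for example from anatomical landmarks) before iterating.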

Fig. 38.4 Augmented reality system using an optical tracking camera and marker-attached instruments (Modified from a figure in Ukimura and Gill (2009a))

Fig. 38.5 Augmented reality with the predictive “Surgical Radar” function (Modified from a figure in Ukimura and Gill (2009b))
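
The color-banded display itself reduces to a simple computation once the tumor model and the tracked tool tip share a registered coordinate frame. The sketch below (Python/NumPy) maps the tip-to-tumor-surface distance onto the yellow and green bands described above; the function names, the point-cloud tumor representation, and the spherical test model are illustrative assumptions, not our clinical implementation.

```python
import numpy as np

# Distance bands from the text: up to 5 mm in yellow, 5-10 mm in green.
BANDS = [(5.0, "yellow"), (10.0, "green")]

def radar_color(tool_tip, tumor_surface_pts):
    """Return (distance_mm, color) for a tracked scissors tip.

    tool_tip: (3,) tip position in the registered frame (mm).
    tumor_surface_pts: (N, 3) points sampled on the tumor outer margin.
    color is None when the tip is outside the 10-mm radar range."""
    dist = np.min(np.linalg.norm(tumor_surface_pts - tool_tip, axis=1))
    for limit, color in BANDS:
        if dist <= limit:
            return dist, color
    return dist, None

# Hypothetical check: a 20-mm-radius spherical "tumor" at the origin and a
# tip 24 mm from the center, i.e. about 4 mm from the surface -> "yellow".
theta, phi = np.meshgrid(np.linspace(0, np.pi, 60),
                         np.linspace(0, 2 * np.pi, 120))
surface = 20.0 * np.stack([(np.sin(theta) * np.cos(phi)).ravel(),
                           (np.sin(theta) * np.sin(phi)).ravel(),
                           np.cos(theta).ravel()], axis=1)
print(radar_color(np.array([24.0, 0.0, 0.0]), surface))
```

In the actual display the resulting color is rendered on the endoscopic image around the tool tip, but the thresholding logic would be the same.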

Fig. 38.6 Augmented reality in “zero-ischemia” partial nephrectomy for a completely intrarenal hilar tumor (Modified from a figure in Nakamoto et al. (2012))

Unlike in neurosurgery and orthopedic surgery, in urological surgery it is difficult to match the surgical field to the positional information of previously acquired images, because the target organ deforms, moves with respiration, and shifts with changes in posture. With recent progress in computing and medical engineering, however, techniques that trace or correct organ deformation and respiratory movement in real time during surgery have been introduced. To track the spatial movement and distortion of the organ in real time, we implanted three wireless magnetic markers used in radiotherapy (Calypso 4D Localization System) around the kidney tumor as a “body GPS”. CT images of the kidney including the markers were acquired, and a 3-dimensional model of the kidney tumor was prepared before surgery. We showed that dynamic AR (4-dimensional AR) can be realized: with the surgical field placed in the magnetic field, the implanted markers automatically transmit their spatial coordinates to the magnetic position-tracking system in real time, and based on this information the 3-dimensional tumor model remains superimposed on the endoscopic image, following the organ even as it is moved dynamically by the surgical manipulation (Nakamoto et al. 2008). We recently reported that the postoperative renal function corresponding to a given resection line can be predicted before surgery (Isotani et al. 2015).
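
Under the simplifying assumption that the kidney moves rigidly between tracker readings, the pose update behind this 4-dimensional AR can be sketched as a least-squares fit between the three marker positions recorded at CT acquisition and those currently reported by the tracking system. The Python/NumPy sketch below is illustrative only (the function names are ours); the actual system must also cope with tissue deformation and measurement noise.

```python
import numpy as np

def pose_from_markers(ref, cur):
    """Rigid transform (R, t) taking the three reference marker positions
    (3 x 3 array, one row per marker, from the preoperative CT) to their
    currently sensed positions, via the same Kabsch least-squares fit."""
    ref_c, cur_c = ref.mean(axis=0), cur.mean(axis=0)
    U, _, Vt = np.linalg.svd((ref - ref_c).T @ (cur - cur_c))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    return R, cur_c - R @ ref_c

def follow_organ(model_pts, ref_markers, cur_markers):
    """Re-pose the preoperative tumor model so that the AR overlay keeps
    following the organ as the tracker streams new marker coordinates."""
    R, t = pose_from_markers(ref_markers, cur_markers)
    return model_pts @ R.T + t
```

Three non-collinear markers are the minimum that determines a unique rigid pose, which is consistent with three transponders being implanted around the tumor.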

In 2009, Teber et al. in Germany reported that AR is feasible in laparoscopic partial nephrectomy and radical prostatectomy by puncturing the organ with multiple needles and using the needle tips as visual markers for coordinate recognition (Teber et al. 2009; Simpfendörfer et al. 2011). In the same year, Su et al. in Florida introduced a 3-dimensional camera, identified spatial position by capturing the characteristic curved surface of the target kidney, and realized AR by superimposing this information on the preoperative kidney tumor model (Su et al. 2009). Calculating spatial coordinates from the images collected by a 3-dimensional camera appears to be a simple and ideal approach because it requires no external organ-tracking system; unfortunately, however, the registration accuracy achievable from images collected by a single 3-dimensional camera is still inferior to that of the current registration methods using external markers, and improvements are being pursued in various directions. In 2013, Müller et al. in Germany developed an AR application for PCNL displayed on an iPad tablet (Müller et al. 2013), and in 2014 Schneider et al. in Switzerland reported that they improved the accuracy of AR to an error of 2.1 ± 1.2 mm in a pig model (Schneider et al. 2014). In 2015, Edgcumbe et al. in Canada reported that registration of the kidney with an error of only about 1.5 mm is possible even with an image-based position-tracking system, by introducing through the laparoscope a light source that projects a checkerboard grid pattern and imaging the 3-dimensional curved surface of the organ (Edgcumbe et al. 2015). In 2016, Lanchon et al. in France constructed a 3-dimensional image of the prostate by transurethral ultrasonography and realized AR in laparoscopic radical prostatectomy (Lanchon et al. 2016). AR based on ultrasound images acquired in the surgical field reduces the risk of errors in the registration process, but in many cases high-resolution preoperative CT and MRI remain more appropriate for modeling the cancers and nerves that are the targets of visualization, so many problems remain to be solved. In the same year, Teber's group in Germany proposed a system in which the entire procedure can be performed intraoperatively without preoperative CT acquisition: using intraoperative cone-beam CT, they acquired 3D volume data after puncturing the organ with multiple needles and realized AR using the needle tips as visual markers for coordinate recognition, as in their work since 2009 (Simpfendörfer et al. 2016).

5 Future Prospects of Intraoperative Navigation

Curability is the highest priority in cancer surgery, but low invasiveness and functional preservation are also considered important, which has led to the current procedures including robot-assisted surgery. Functional preservation after radical prostatectomy means conservation of urinary continence and erectile function, and after partial nephrectomy it means conservation of renal function. To achieve both cancer cure and functional preservation, precise control of the resection margins and nerve preservation are necessary, for which appropriate navigation and monitoring can be very useful.

With improvements in imaging software, preoperative construction of detailed 3-dimensional models has become possible (Komai et al. 2016; Shin et al. 2016); the true value of such a model lies in whether the surgeon prepares it with a clear sense of purpose. Calculation software for organ deformation models has also progressed. Organ deformation depends on the shape, properties, and stiffness of the organ, the condition of adjacent organs, and the position of the supporting tissue that fixes it, and studies using various computational models are under way. It has also recently become possible to produce soft 3-dimensional models with a 3D printer, using materials that reflect the properties of the constituent tissue, and application to hologram display using laser light is expected (Hartl et al. 2016).

The progress of these simulation techniques can in turn be fed back into AR. In the future, we expect new medical AR combined with sensors that detect the characteristics and physical properties of the target: not only superimposed image display but also explanatory text and sound updating the target's condition in real time, systems that inform the operator of safety and risk through color codes and alarm sounds, and the complementation and improvement of surgical judgment through the introduction of artificial intelligence (AI) (Epp et al. 1988).