Abstract
Robotic surgery is an accepted adjunct to minimally invasive surgery, but training is restricted to console time. Virtual-reality (VR) simulation has been shown to be effective for laparoscopic training, so we sought to validate a novel VR robotic simulator. The American Urological Association (AUA) Office of Education approved this study. Subjects enrolled in a robotics training course at the 2007 AUA annual meeting underwent skills training in a da Vinci dry-lab module and a virtual-reality robotics module that included a three-dimensional (3D) VR robotic simulator. Demographic and acceptability data were obtained, and performance metrics from the simulator were compared between experienced and nonexperienced roboticists for a ring transfer task. Fifteen subjects participated: four with previous robotic surgery experience and 11 without. Nine subjects were still in urology training, and nearly half of the group reported playing video games. The overall performance of the da Vinci system and of the simulator was deemed acceptable, with Likert-scale (0–6) ratings of 5.23 and 4.69, respectively. Experienced subjects outperformed nonexperienced subjects on the simulator on three metrics: total task time (96 s versus 159 s, P < 0.02), economy of motion (1,301 mm versus 2,095 mm, P < 0.04), and time the telemanipulators spent outside the center of the platform’s workspace (4 s versus 35 s, P < 0.02). This is the first demonstration of face and construct validity of a virtual-reality robotic simulator. Further studies assessing predictive validity are ultimately required to support incorporation of VR robotic simulation into training curricula.
Introduction
The use of simulation technology for teaching and evaluating surgical skills acquisition is driven by the demand for improved quality of care and accountability in surgical outcomes, increasing restrictions on the use of animal models, dwindling resident case logs, medicolegal pressures, and fiscal mandates for cost-effective performance. Laparoscopy and endoscopy have proven ideally suited to virtual-reality simulation training, which allows for virtual mentorship, task deconstruction, and the study of dynamic, comprehensive surgical metrics [1, 2]. With the introduction of the surgical robot in 2001, an additional laparoscopic technological adjunct became available that arguably requires a shorter learning curve for attaining appropriate surgical skills. Robotic surgical training remains restricted, however, because robotic simulators are only now becoming commercially available, and using the actual robot for training is difficult: the robot is heavily utilized during the week for actual surgery, purchasing one specifically for training is expensive, and a dedicated training robot requires substantial space.
Built to train robotic telesurgical skills, the dV-Trainer (MIMIC Technologies, Inc., Seattle, WA) is a novel portable offline virtual-reality (VR) robotic simulator platform comprised of a binocular three-dimensional visual output, two finger telemanipulators similar to the da Vinci robotic system’s finger cuff effectors (Intuitive Surgical, Sunnyvale, CA), and an optional three-dimensional (3D) projection screen [3]. The movements of the telemanipulators utilize a cable-driven system to appreciate movements of the surgeon through space and apply force feedback when appropriate (Fig. 1). Dry-lab tasks used to train learners with a da Vinci system can be digitally recreated for the dV-Trainer system with the same motion scaling afforded by the robot.
In order to assess basic aspects of validity of the dV-Trainer, the VR platform was introduced during a postgraduate pediatric robotic-assisted laparoscopic (RAL) urology course at the AUA meeting in Anaheim, June 2007. The goals of the pilot design were to assess whether the subjects’ actual robotic OR experience correlates with performance metrics on the beta release version of the dV-Trainer VR platform, whether the trainer’s components were ‘acceptable’, and whether learners deemed the content of its exercises to have educational value. This was measured through participant post-course evaluation and electronic performance metrics collected on the VR platform. The study’s objective was to examine for evidence of validity to support the integration of robotic simulation tools into continuing medical education (CME) courses and surgical residency training.
Materials and methods
This study was approved by the AUA Office of Education, and each course enrollee completed a consent form detailing the study goals at the beginning of the course. Fifteen learners in the course rotated through a didactics station covering robotic surgery applications in pediatric urology, a da Vinci-S dry-lab training station, and a virtual-reality robotics training station. The instructors were blinded to the experience of each learner, and each enrollee was given the opportunity to opt out of the study with no change in their skills-lab time. Learners were first surveyed about their perceptions of the value of simulation training and were asked to rate the acceptability of the dry-lab da Vinci-S console and the offline VR platform. Performance metrics were recorded on the dV-Trainer for a ring transfer module that was simulated in both VR and the dry-lab skills station (Fig. 2). Subjects were given 2–3 min to develop familiarity with the platform and then 5 min for the task. The module involved grasping three successive rings from a series of pegs on the wall of the module, transferring each from one instrument to the other, and placing it on an upright peg on the floor of the module. The purpose of the module was to teach proper telemanipulator arm and camera clutching, object transfer from one instrument to another, and object placement. Class participants filled out course evaluations and acceptability questionnaires.
Performance metric results were divided into those obtained from learners with prior robotic experience and learners completely new to any robotics platform (experienced versus nonexperienced). We deliberately avoided describing learners as ‘experts’ or ‘nonexperts’, since to date there have been no standardized qualifications for rating a robotic surgeon with either label. Complete performance metrics recorded from the dV-Trainer were the time (s) to place the first three rings on the floor peg, economy of motion as measured by the distance traveled (mm) by each instrument during the task, peak ring strain (a surrogate for tissue deformation), number of instrument collisions, time each instrument spent out of view (s), and time the master telemanipulators were out of center (s). A Likert scale was used to assess platform performance, where a rating of 0 corresponded to totally unacceptable and 6 to totally acceptable. Because the two groups were independent and of unequal size, means were compared with an unpaired t test with unequal variances (Welch’s t test) using Stata SE 9.2 (StataCorp, College Station, TX), with statistical significance set at P < 0.05.
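The group comparison described above amounts to Welch’s unequal-variance t test. A minimal Python sketch follows; the task-time samples are hypothetical values chosen only so that the group means match the reported 96 s and 159 s, not the study’s actual measurements:

```python
import math

def welch_t(a, b):
    """Welch's unequal-variance t statistic and its
    Welch-Satterthwaite degrees of freedom."""
    n1, n2 = len(a), len(b)
    m1, m2 = sum(a) / n1, sum(b) / n2
    v1 = sum((x - m1) ** 2 for x in a) / (n1 - 1)  # sample variance, group a
    v2 = sum((x - m2) ** 2 for x in b) / (n2 - 1)  # sample variance, group b
    se2 = v1 / n1 + v2 / n2                        # squared standard error
    t = (m1 - m2) / math.sqrt(se2)
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

# Hypothetical task times (s); group means equal the reported 96 s and 159 s
experienced = [88, 94, 99, 103]
nonexperienced = [150, 172, 145, 168, 160, 155, 149, 175, 158, 152, 165]
t, df = welch_t(experienced, nonexperienced)
```

The resulting t statistic is then compared against a t distribution with the computed degrees of freedom to obtain the P value, in practice via a statistics package such as Stata or SciPy.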
Results
Demographics
Four enrollees (27%) reported prior robotic experience, whereas 11 (73%) had none; no subject had previously seen or used the dV-Trainer simulator. Almost half of the learners reported playing video games, and all were right-handed (Table 1).
An overwhelming majority of learners (88%) believed there is a role for computerized simulation in robotic surgery training, and almost half believed simulation should be used for accreditation. In the needs assessment survey, participants confirmed that a VR platform for robotic surgical training would be beneficial (Table 2). Users were asked to rate the overall acceptability of the simulator as a whole, as well as its components, on a 0–6 Likert scale, whereby scores greater than 3 indicate acceptability for training (Fig. 3). The overall performance ratings for the offline trainer and the dry lab were 4.69 and 5.23, respectively, demonstrating face validity. Experienced learners were more critical in all categories of acceptability, including of the actual da Vinci-S robot master, rating the overall performance of the dry lab and the offline trainer at 4.75 and 3.75, respectively; nonexperienced learners rated these training modalities at 5.44 and 5.11, respectively.
Construct validity
Surgeons with da Vinci-S experience performed better than inexperienced users. Of the six metrics evaluated, task time, economy of motion, and time the master telemanipulators spent outside the center of the interface workspace differed significantly between those with and without robot experience. The construct that experience correlates with performance on the trainer therefore has validity when these three metrics are used for performance evaluation [4] (Table 3).
Discussion
We undertook this study to assess the performance and acceptability of a novel robotic VR simulator. Since most medical institutions do not have the financial or space allowances for a da Vinci system dedicated solely to training, a validated simulation platform for robotic surgical skills acquisition would be advantageous. In addition, for an institution to recover the costs of a da Vinci system, the robot must be in near-constant clinical use, precluding training time on it during regular working hours. This is particularly challenging when trying to adhere to the Accreditation Council for Graduate Medical Education (ACGME) guidelines for resident duty hours. The training of laparoscopic skills through simulation has been shown to improve operative performance in residents [5, 6], yet robotic simulators have not undergone validation studies as legitimate adjuncts to robotic surgical training.
The results of the acceptability ratings demonstrate that both simulation and dry-lab training are considered acceptable modalities for robotic surgical skills acquisition. Class participants rated simulation training at a level comparable to dry-lab training, which is evidence that access to the actual robot system may not be necessary to teach robotic skills. Recognizing that enrollment in the course was voluntary, we cannot ignore the potential for sampling bias, since participants who signed up for the course are likely early adopters of technology who already view robotics or simulation training as useful. We also acknowledge that, although a learner may not perceive the need to be taught on the da Vinci system to learn robotic skills, acquisition of skills on a VR trainer does not necessarily translate into OR proficiency.
Overall, it was interesting to note that the surgeons who had used the robot before this course were much more critical of the dry-lab and VR simulation training than were novice users. This finding is consistent with that of Lin et al., who, testing another robotic simulator with experts and nonexperts, observed that nonexperts rated the robotic simulation as reflecting clinical skill better than experts did [7].
The six metrics chosen to evaluate the course enrollees are similar to those tested in prior laparoscopic simulation training studies [8–10]. Economy of motion (EOM) was chosen as a metric because studies have shown that EOM improves with increasing proficiency [9, 11], and our data demonstrated a discernible difference. Another metric that has been linked to RAL skill proficiency is the number of instrument collisions; although fewer occurred in the experienced group, the difference did not reach statistical significance, which could be explained by the small sample size or the short task duration. The two metrics involving telemanipulator and instrument positioning are critical to the small-space surgery typical of pediatric procedures, where errors can translate into unwanted tissue trauma and awkward ergonomics for the roboticist. The nonexperienced participants tended to keep their instruments out of view longer than the experienced participants, though not to a statistically significant degree, yet their master telemanipulators spent a significantly longer time out of the center of the workspace. We would expect experienced roboticists to keep instruments within the visual field, so expanding the task time might yield significant differences between the groups. The dramatic between-group difference in time the telemanipulators spent outside the center of the workspace may reflect the differing impact of unfamiliarity with the dV-Trainer on the two groups: there may be an inherent understanding of telemanipulator function in the experienced group that was lacking in the nonexperienced group. This has implications for training beginners on a dV-Trainer prior to receiving robot console time; new learners who first encounter this technology through simulation may understand telemanipulator function better and attain robotic skills faster.
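Economy of motion, as discussed above, is essentially the total path length traced by each instrument tip. A minimal sketch of how such a metric can be computed from sampled 3D tip positions follows; the coordinates are illustrative, not simulator output:

```python
import math

def path_length(samples):
    """Total distance traveled (mm) by an instrument tip,
    summed over consecutive sampled 3D positions."""
    return sum(math.dist(p, q) for p, q in zip(samples, samples[1:]))

# Illustrative tip positions (mm): two straight segments of 5 mm and 12 mm
tip = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.0), (3.0, 4.0, 12.0)]
print(path_length(tip))  # prints 17.0
```

A lower total for the same completed task indicates more economical movement, which is why experienced roboticists score better on this metric.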
Despite the low power of this study, we demonstrated statistical significance in three performance metrics, supporting construct validity. These results support the initiation of a larger trial involving subjects across many different experience levels.
Conclusion
Our study represents an initial demonstration of the acceptability of a VR simulator for the da Vinci surgical robot system and gives insight into the potential value of offline training for robotic skills acquisition. Acceptability and preliminary face and content validity were demonstrated by the learners in this cohort, and we show evidence that the construct of experience correlated with key performance metrics. These data suggest that the dV-Trainer may be appropriate for integration into continuing medical education and residency robotic training curricula. Prospective studies of a larger pool of learners with varying degrees of robotic and laparoscopic surgical skill are needed to determine the utility of integrating this simulator into the surgical curriculum, and further studies assessing predictive validity are ultimately required to confirm that VR robotic simulation training translates into improved surgical outcomes in patients.
References
Watterson JD, Beiko DT, Kuan JK et al (2002) A randomized prospective blinded study validating acquisition of ureteroscopy skills using a computer based virtual reality endourological simulator. J Urol 168:1928–1932. doi:10.1016/S0022-5347(05)64265-6
Sweet R, Kowalewski T, Oppenheimer P et al (2004) Face, content and construct validity of the University of Washington virtual reality transurethral prostate resection trainer. J Urol 172:1953–1957. doi:10.1097/01.ju.0000141298.06350.4c
Rashid HH, Berkley J, Vollenweider M et al (2006) Creating a patient specific interactive virtual reality model for robotic prostatectomy. J Endourol 20(Suppl 1):A326. doi:10.1089/end.2006.20.326
Gallagher AG, Ritter EM, Satava RM (2003) Fundamental principles of validation, and reliability: rigorous science for the assessment of surgical education and training. Surg Endosc 17:1525–1529. doi:10.1007/s00464-003-0035-4
Seymour NE, Gallagher AG, Roman SA et al (2002) Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 236:458–463. doi:10.1097/00000658-200210000-00008
Korndorffer JR Jr, Dunne JB, Sierra R et al (2005) Simulator training for laparoscopic suturing using performance goals translates to the operating room. J Am Coll Surg 201:23–29. doi:10.1016/j.jamcollsurg.2005.02.021
Lin DW, Romanelli JR, Thompson RE et al (2007) Computer-based laparoscopic and robotic surgical simulators: performance characteristics and perceptions of new users, SAGES Meeting Abstract S077: robotics
Gunther S, Rosen J, Hannaford B et al (2007) The red DRAGON: a multi-modality system for simulation and training in minimally invasive surgery. Stud Health Technol Inform 125:149–154
Figert PL, Park AE, Witzke DB et al (2001) Transfer of training in acquiring laparoscopic skills. J Am Coll Surg 193:533
Gallagher AG, Ritter EM, Champion H et al (2005) Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg 241:364–372. doi:10.1097/01.sla.0000151982.85062.80
Hassan I, Maschuw K, Rothmund M et al (2006) Novices in surgery are the target group of a virtual reality training laboratory. Eur Surg Res 38:109–113. doi:10.1159/000093282
Acknowledgments
We would like to thank the AUA Office of Education for allowing us to perform this study. We give special thanks to Intuitive Surgical and MIMIC Technologies, Inc. for providing us with da Vinci robots and the VR platforms, respectively.
Cite this article
Lendvay, T.S., Casale, P., Sweet, R. et al. Initial validation of a virtual-reality robotic simulator. J Robotic Surg 2, 145–149 (2008). https://doi.org/10.1007/s11701-008-0099-1