Abstract
Despite the increased dexterity and precision of robotic surgery, like any new surgical technology it is still associated with a learning curve that can impact patient outcomes. The use of surgical simulators outside of the operating room, in a low-stakes environment, has been shown to shorten such learning curves. We present a multidisciplinary validation study of a robotic surgery simulator, the da Vinci® Skills Simulator (dVSS). Trainees and attending faculty from the University of Toronto, Departments of Surgery and Obstetrics and Gynecology (ObGyn), were recruited to participate in this validation study. All participants completed seven different exercises on the dVSS (Camera Targeting 1, Peg Board 1, Peg Board 2, Ring Walk 2, Match Board 1, Thread the Rings, Suture Sponge 1) and, using the da Vinci S Robot (dVR), completed two standardized skill tasks (Ring Transfer, Needle Passing). Participants were categorized as novice robotic surgeons (NRS) or experienced robotic surgeons (ERS) based on the number of robotic cases performed. Statistical analysis was conducted using independent t tests and non-parametric Spearman’s correlations. A total of 53 participants were included in the study: 27 urology, 13 ObGyn, and 13 thoracic surgery (Table 1). Most participants (89 %) either had no prior console experience or had performed <10 robotic cases, while one (2 %) had performed 10–20 cases and five (9 %) had performed ≥20 robotic surgeries. The dVSS demonstrated excellent face and content validity, and 97 and 86 % of participants agreed that it was useful for residency training and post-graduate training, respectively. The dVSS also demonstrated construct validity, with NRS performing significantly worse than ERS on most exercises with respect to overall score, time to completion, economy of motion, and errors (Table 2).
Excellent concurrent validity was also demonstrated as dVSS scores for most exercises correlated with performance of the two standardized skill tasks using the dVR (Table 3). This multidisciplinary validation study of the dVSS provides excellent face, content, construct, and concurrent validity evidence, which supports its integrated use in a comprehensive robotic surgery training program, both as an educational tool and potentially as an assessment device.
Introduction
Surgical training has traditionally been based on an apprenticeship-style model, whereby clinical acumen and surgical skills are acquired under the supervision of an experienced mentor. With the introduction of mandated regulations for clinical duties and responsibilities, as well as the integration of innovative new surgical technologies such as robotics, the Halstedian model of training has been found insufficient to meet the needs of many contemporary trainees [1, 2].
Surgical simulation, when properly integrated into a comprehensive curriculum, has proved to be a valid educational tool to address this training gap [3–5]. Simulation-based training not only has the benefit of increased trainee exposure to content, but it allows for deliberate practice in a low-stakes environment that does not compromise patient safety [6]. In addition, formative and summative assessments can be made of the trainee using simulation-based devices, ensuring competency-based training of surgical trainees.
Outside of the United States, robotic surgery remains a relatively novel surgical technology, including in countries such as Canada. Until a critical mass of expertise and clinical volume develops, clinical opportunities for trainees will remain limited, as there is a “trickle-down” effect from novice faculty surgeons who continue to work through their respective learning curves. Robotic simulators may provide trainees with an opportunity to develop basic skills during this adoptive phase of technology, addressing the unavoidable training gap mentioned earlier. Due to the somewhat prohibitive costs associated with integrating such simulators into a robotic surgery training curriculum, validity evidence of various forms must first be demonstrated. Face validity concerns the realism of a simulator and is determined by novices and non-experts. Content validity involves a judgment made by experts on whether a simulator actually teaches or assesses the content material of importance. Construct validity evidence relates to a simulator’s ability to accurately distinguish “content novices” from “content experts”, and is critical to any valid simulator or simulation.
In addition to construct validity evidence, simulators that are to be used as an assessment tool should also demonstrate criterion validity. For example, concurrent validity, a form of criterion validity, concerns whether an assessment made using the simulator correlates with assessments made using accepted “gold standard” evaluative tools.
The aim of this study is to determine whether a commercially available robotic simulator, the da Vinci® Skills Simulator (dVSS), demonstrates validity evidence for both training and assessment purposes in the context of multi-disciplinary surgical trainees.
Methods
As part of a larger, more comprehensive 4-week robotic surgery basic skills training curriculum, residents and faculty members from the University of Toronto Divisions of Urology and Thoracic Surgery and Department of Obstetrics and Gynecology (ObGyn) were included in this validation study. Prior to testing on the dVSS, all subjects were provided with an introduction to the da Vinci robot (dVR) that included both a discussion and a demonstration of robot set-up, docking, instrument exchange, camera navigation, instrument clutching, suturing and knot tying, and object manipulation. This introduction also included approximately 10 min of hands-on basic skills training using the dVR and various inanimate part-task training models. Each subject was then assessed on their performance of two standardized skill tasks: Ring Transfer (RT) and Needle Passing (NP). Time to completion and number of errors were recorded for both tasks by two trained faculty educators, with errors being defined as dropped objects, unintentional instrument collisions, and excessive force on the model.
One week after the introductory session with the dVR, each subject was given a brief standardized introduction to the dVSS. Subjects were first permitted to complete a practice exercise (“pick and place”) to gain familiarity with the dVSS functionality, after which they performed at least seven different exercises on the dVSS: Camera Targeting 1, Peg Board 1, Peg Board 2, Ring Walk 2, Match Board 1, Thread the Rings, and Suture Sponge 1. Using the built-in Mimic® scoring algorithm, each subject was assessed on overall score, time to completion, economy of motion, and number of errors for each exercise.
Participants who had performed more than 20 robotic surgical cases were categorized as experienced robotic surgeons (ERS), while all others were classified as novice robotic surgeons (NRS). Statistical analysis was conducted using SPSS® v21 software, with independent t tests used to compare mean scores for construct validity evidence and non-parametric Spearman’s correlations calculated to determine concurrent validity evidence.
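The two analyses can be sketched in a few lines. The following is an illustrative example only, not the authors’ SPSS analysis: an independent t test compares NRS and ERS group means (construct validity), and Spearman’s rho correlates simulator scores with time to completion on a dVR task (concurrent validity). All data values below are synthetic and chosen purely for demonstration.

```python
from scipy import stats

# Hypothetical overall scores (%) on one dVSS exercise (synthetic data)
nrs_scores = [67, 70, 62, 75, 68, 71, 66, 73]   # novice robotic surgeons
ers_scores = [92, 88, 95, 90, 91]               # experienced robotic surgeons

# Construct validity: do ERS scores differ significantly from NRS scores?
t_stat, p_value = stats.ttest_ind(ers_scores, nrs_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Concurrent validity: does a higher simulator score track with a faster
# (lower) time to completion on a dVR task? Spearman's rho is rank-based,
# so it does not assume normally distributed data.
sim_scores = [67, 70, 62, 75, 68, 92, 88, 95]           # dVSS overall score (%)
dvr_times_s = [180, 160, 200, 150, 170, 70, 80, 60]     # dVR task time (s)
rho, p_rho = stats.spearmanr(sim_scores, dvr_times_s)
print(f"rho = {rho:.2f}, p = {p_rho:.4f}")  # a negative rho: higher score, faster task
```

In the paper’s setting, a significant positive t statistic (ERS over NRS) supports construct validity, while a significant negative correlation between simulator score and dVR task time supports concurrent validity.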
Results
Demographics
A total of 53 participants were enrolled in this multi-disciplinary dVSS validation study: 27 from urology, 13 from ObGyn, and 13 from thoracic surgery (Table 1). The majority of subjects (89 %) had either no prior robotic console experience or had performed <10 robotic cases. Only five subjects (9 %) had performed ≥20 robotic surgical procedures, though four of them had performed >50 cases each.
Face and content validity
Overall, most subjects (97 %) agreed that the dVSS demonstrated acceptable realism in comparison to the dVR. More specifically, no less than 92 % of subjects agreed that the dVSS was a good simulation of the dVR with respect to each of camera navigation, clutch functionality, EndoWrist manipulation, and needle driving. Only 64 and 42 % of participants felt that the dVSS accurately simulated the dVR in regard to knot tying and dissection/cautery, respectively. Overall, 89 % of all participants felt that the dVSS was as effective at basic robotic skills training as using the dVR with inanimate models. All five surgeons (100 %) with significant robotic experience felt the dVSS was a valid educational tool for novice robotic surgery trainees.
Construct validity
ERS were found to perform significantly better than NRS on five of the seven dVSS exercises with respect to overall score: Camera Targeting 1 (92 vs. 67 %, p = 0.008), Peg Board 1 (92 vs. 77 %, p = 0.004), Match Board 1 (85 vs. 68 %, p = 0.028), Thread the Rings (90 vs. 72 %, p = 0.011), and Suture Sponge 1 (86 vs. 73 %, p = 0.042). Only the Ring Walk 2 (88 vs. 73 %, p = 0.086) and Peg Board 2 (92 vs. 83 %, p = 0.082) exercises did not demonstrate evidence of construct validity (Table 2).
ERS also outperformed NRS on both dVR standardized tasks: RT time (65 vs. 172 s, p = 0.001), RT errors (0.7 vs. 3.3, p = 0.004), and NP time (90 vs. 226 s, p < 0.001). There was no difference between ERS and NRS, however, for NP errors (1.8 vs. 4.0, p = 0.08).
Concurrent validity
Participants’ overall scores on all but one (Peg Board 2) of the seven exercises selected for validation correlated with time to completion on both the RT and NP tasks (p < 0.05). The other dVSS performance metrics (time to completion, economy of motion, and number of errors) for five of the seven exercises correlated only with time to completion of the NP task (p < 0.05). None of the seven overall scores, however, correlated with the number of errors on either the RT or NP task (Table 3).
Discussion
This study is the first to examine the validity evidence of the dVSS as both an instructional tool and assessment device in a multi-disciplinary cohort of trainees. The seven different dVSS exercises selected all seem to demonstrate acceptable face, content, construct, and concurrent validity evidence, supporting the integration of the dVSS when developing comprehensive, competency-based basic robotic skills training curricula. While advanced robotic surgical training will require subspecialty-specific training content and procedure-specific instructional methods, this multi-disciplinary study demonstrates that basic robotic skills can be taught and assessed through a common curriculum, using a common surgical simulator.
The potential benefits of robotic surgery have been well documented [7–12], and while the integration of robotics into clinical practice has been widespread, the development of validated training curricula and certification policies has not. Utilization of surgical robotics has also become a multi-disciplinary endeavour, with subspecialties such as urology, gynecology, cardiothoracic surgery, general surgery, and otolaryngology all adopting the technology [7]. The availability and use of robotic simulators has the potential to significantly improve the initial learning curve associated with the adoption of any new technology, such as robotics, by permitting both educational and assessment opportunities. Surgical simulators are, however, associated with significant capital costs, so it is imperative that proper validity evidence be provided before such instructional methods are integrated. In addition, given the various surgical disciplines now using the surgical robot, it is incumbent on educators to develop a multi-disciplinary curriculum that is applicable to trainees from various backgrounds, at least for basic skills training, rather than “reinventing the wheel” for each subspecialty.
Several studies have found similar validity evidence for the dVSS as a training tool [13–15]. Liss et al. [13] demonstrated acceptable content and construct validity evidence for the dVSS in a cohort of urology trainees and faculty members. Similarly, Kelly and colleagues demonstrated excellent face, content, and construct validity among a multi-disciplinary group of surgical trainees and faculty [14].
To date, only one other study has found validity evidence of the dVSS as a potential assessment device [16]. Hung and colleagues demonstrated that among a cohort of urologists, the dVSS demonstrated excellent concurrent and predictive validity evidence. In addition, the authors found that simulation-based training on the dVSS was particularly beneficial for “weaker” robotic surgeons.
There have been several other studies evaluating the validity evidence for other robotic surgical simulators such as the dV-Trainer™, RoSS®, and ProMIS® [17–20], many with similar findings. While there have been limited head-to-head comparisons between robotic simulator platforms, differences are likely to be of minimal educational significance as all have the common benefits of providing learners with opportunities for deliberate practice, content exposure, and even feedback.
There are several limitations to this study. While study participants were drawn from multiple surgical specialties, this was a single-institution study, potentially impacting the generalizability of the results. Further multi-centre validation studies are required to confirm the robustness of the concurrent validity evidence. The two faculty raters were trained specifically regarding the definition of errors during the performance of the RT and NP tasks; however, each participant was rated by only one faculty educator. As such, reliability scores for the number of errors made during the RT and NP tasks are not available, potentially compromising validity. Finally, the cohort of participants was a relatively inexperienced group of robotic surgeons, with <10 % of participants having performed more than 20 robotic cases. The evidence in support of utilizing the dVSS as both a training tool and assessment device may therefore be limited to a relatively novice audience.
Conclusions
This multidisciplinary validation study of the dVSS provides excellent face, content, construct, and concurrent validity evidence. This supports its integrated use in a comprehensive basic robotic surgery training curriculum, both as an educational tool and potentially as an assessment device.
References
Lee JY, Mucksavage P, Sundaram CP, McDougall EM (2011) Best practices for robotic surgery training and credentialing. J Urol 185(4):1191–1197
McDougall EM (2007) Validation of surgical simulators. J Endourol 21(3):244–247
Palter VN, Graafland M, Schijven MP, Grantcharov TP (2012) Designing a proficiency-based, content validated virtual reality curriculum for laparoscopic colorectal surgery: a Delphi approach. Surgery 151(3):391–397
Stefanidis D, Korndorffer JR Jr, Markley S, Sierra R, Heniford BT, Scott DJ (2007) Closing the gap in operative performance between novices and experts: does harder mean better for laparoscopic simulator training? J Am Coll Surg 205(2):307–313
Seymour NE (2007) VR to OR: a review of the evidence that virtual reality simulation improves operating room performance. World J Surg 32(2):182–188
Ericsson KA (2008) Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med 15(11):988–994
Orvieto MA, Marchetti P, Castillo OA, Coelho RF, Chauhan S, Rocco B et al (2011) Robotic technologies in surgical oncology training and practice. Surg Oncol 20(3):203–209
Mucksavage P, Kerbl DC, Pick DL, Lee JY, McDougall EM, Louie MK (2011) Differences in grip forces among various robotic instruments and da Vinci surgical platforms. J Endourol 25(3):523–528
Aboumarzouk OM, Stein RJ, Eyraud R, Haber G-P, Chlosta PL, Somani BK et al (2012) Robotic versus laparoscopic partial nephrectomy: a systematic review and meta-analysis. Eur Urol 62(6):1023–1033
Ahmed K, Ibrahim A, Wang TT, Khan N, Challacombe B, Khan MS et al (2012) Assessing the cost effectiveness of robotics in urological surgery—a systematic review. BJU Int 110(10):1544–1556
Patel VR, Coelho RF, Chauhan S, Orvieto MA, Palmer KJ, Rocco B et al (2010) Continence, potency and oncological outcomes after robotic-assisted radical prostatectomy: early trifecta results of a high-volume surgeon. BJU Int 106(5):696–702
Panumatrassamee K, Autorino R, Laydner H, Hillyer S, Khalifeh A, Kassab A et al (2012) Robotic versus laparoscopic partial nephrectomy for tumor in a solitary kidney: a single institution comparative analysis. Int J Urol. doi:10.1111/j.1442-2042.2012.03205.x
Liss MA, Abdelshehid C, Quach S, Lusch A, Graversen J, Landman J et al (2012) Validation, correlation, and comparison of the da Vinci trainer™ and the da Vinci surgical skills simulator™ using the Mimic™ software for urologic robotic surgical education. J Endourol 26(12):1629–1634
Kelly DC, Margules AC, Kundavaram CR, Narins H, Gomella LG, Trabulsi EJ et al (2012) Face, content, and construct validation of the da Vinci skills simulator. Urology 79(5):1068–1072
Finnegan KT, Meraney AM, Staff I, Shichman SJ (2012) da Vinci skills simulator construct validation study: correlation of prior robotic experience with overall score and time score simulator performance. Urology 80(2):330–336
Hung AJ, Patil MB, Zehnder P, Cai J, Ng CK, Aron M et al (2012) Concurrent and predictive validation of a novel robotic surgery simulator: a prospective, randomized study. J Urol 187(2):630–637
Lee JY, Mucksavage P, Kerbl DC, Huynh VB, Etafy M, McDougall EM (2012) Validation study of a virtual reality robotic simulator—role as an assessment tool? J Urol 187(3):998–1002
Abboudi H, Khan MS, Aboumarzouk O, Guru KA, Challacombe B, Dasgupta P et al (2013) Current status of validation for robotic surgery simulators—a systematic review. BJU Int 111(2):194–205
Korets R, Mues AC, Graversen JA, Gupta M, Benson MC, Cooper KL et al (2011) Validating the use of the Mimic dV-trainer for robotic surgery skill acquisition among urology residents. Urology 78(6):1326–1330
Perrenot C, Perez M, Tran N, Jehl J-P, Felblinger J, Bresler L et al (2012) The virtual reality simulator dV-Trainer® is a valid assessment tool for robotic surgical skills. Surg Endosc 26(9):2587–2593
Conflict of interest
Authors Kirsten Foell, R. John D’A. Honey, Kenneth T. Pace, and Jason Y. Lee declare that they have no conflict of interest. Author Alexander Furse is a robotic surgical technician employed by Minogue Medical Inc., the sole distributor for Intuitive Surgical in Canada. His duties and capacity, however, in no way involve sales or marketing for the da Vinci robot.
Foell, K., Furse, A., Honey, R.J.D. et al. Multidisciplinary validation study of the da Vinci Skills Simulator: educational tool and assessment device. J Robotic Surg 7, 365–369 (2013). https://doi.org/10.1007/s11701-013-0403-6