Abstract
Although previous chapters indicated the potential and benefits of training arthroscopic skills in simulated environments, training needs to be continued in the operating room to achieve the necessary proficiency. Based on the theory on learning strategies in Chap. 4, it is posed that if residents indeed acquire the basic skills before they enter the operating room, the focus in the operating room can be on more complex tasks. This requires the formulation of guidelines that determine the level that qualifies proficiency. For the actual cases in the operating room, this is a difficult task as the level of complexity of the procedure plays an important role, and proficiency is not necessarily defined as the summation of several part-task skills, but rather requires a holistic approach.
Keywords
- Anterior Cruciate Ligament
- Surgical Performance
- Global Rating Scale
- Video Feedback
- Arthroscopic Rotator Cuff Repair
Take-Home Messages
- The definition of standardised benchmarks is required to define arthroscopic competency.
- Measuring surgical performance comes with challenges, but new developments such as affordable tracking systems and video analysis software can facilitate structural implementation.
- Objective monitoring of resident learning curves is feasible using global rating scales.
- ASSET and BAKSSS global rating scales are validated most extensively and suggested to be used in clinical practice, where ASSET offers potential for summative assessment of arthroscopic skills.
1 Introduction
Generally, the complexity of an arthroscopy is divided into two levels: basic (removal) and advanced (reconstruction), e.g. meniscectomy vs. anterior cruciate ligament (ACL) reconstruction (Morris et al. 1993; O’Neill et al. 2002). For elbow arthroscopy, five levels of complexity have been defined (Savoie 2007). To cope with the complexity and support the holistic judgment, faculty members from recognised institutions who have themselves performed a substantial number of procedures (>250) qualify to judge proficiency (Morris et al. 1993; O’Neill et al. 2002) – a method that is applied in many residency curricula. Despite arthroscopy being performed frequently, consensus has yet to be reached on the exact definition of arthroscopic competence and the number of procedures required to achieve it (Hodgins and Veillette 2013; O’Neill et al. 2002).
As little to no evidence is available on the transfer validity of arthroscopic simulator training, and many residency curricula have yet to implement simulator training, the first section focuses on measuring surgical performance in the operating theatre. Measuring surgical performance is not only useful in training, but also has direct applications in the quantification and monitoring of operative quality, patient safety and workflow optimisation. Tools and methods from these areas are presented, which could be applied to verify proficiency in basic arthroscopic skills. Additionally, work is presented to set reference baselines for comparing surgical performance.
As mentioned, training in the OR follows the apprenticeship model, where the resident initially watches the teaching surgeon performing an operation and gradually takes over (Pedowitz et al. 2002). As modern medicine offers reduced time for residents to develop their arthroscopic skills, it is worthwhile to optimise the learning effect per operation. General educational theories indicate that feedback on one’s performance and the stimulation of active learning contribute significantly to a more effective learning process (Prince 2004). For surgery, it has been demonstrated that direct feedback on performance improves the resident’s individual skills (Harewood et al. 2008; O’Connor et al. 2008). We present tools that are suitable to monitor this form of teaching and that respect the holistic judgment model needed to assess the more complex tasks.
2 Measuring Surgical Performance and Baseline References
Measuring surgical performance is not an easy task: patient care is the first priority, patient privacy and the sterile operating zone must be respected, and the operating theatre cannot be transformed into an experimental set-up. Besides, interpretation of the data is complex. That is why attention is also paid to the registration of baseline reference data of procedures currently performed in the operating theatre. Two categories of tools are defined: sensors that can measure psychomotor skills similarly to simulated environments, and video and audio registrations that can capture overall surgical performance. Each is elucidated with examples.
2.1 Sensors
The first parameter to be discussed is, not surprisingly, the operation time. It is easy to measure and often used to track operative planning and workflow. Its value derives from the well-established fact that experts execute surgical actions more efficiently than novices (Bridges and Diamond 1999). Farnworth and co-workers demonstrated that residents are significantly slower in performing ACL reconstructions than orthopaedic surgeons, which can also have financial consequences (Farnworth et al. 2001).
Psychomotor skills can also be monitored in the operating theatre by motion-tracking systems. Such systems use (infrared) cameras that track optical or reflective markers attached to the hands of the surgeon or to the instruments, or electromagnetic systems with active markers. In surgical practice, such tracking systems are commonly used in computer-aided surgery for accurate positioning of orthopaedic implants (Fig. 13.1) (Matziolis et al. 2007; Moon et al. 2012; Rosenberger et al. 2008). Tracking can also be performed with normal video cameras and digital image-processing tools that recognise markers or other features in the image. Examples are presented by Doignon and co-workers (Blum et al. 2010; Doignon et al. 2005), who detected surgical instruments in the endoscopic video based on their metal-coloured features, and by Bouarfa and co-workers, who labelled various instruments with coloured markers at the tip to improve robustness (Fig. 13.2) (Bouarfa et al. 2012). Tracking of instrument motions provides insight into surgical performance and the flow of the procedure (Aggarwal et al. 2007; Dosis et al. 2005). It does require careful data interpretation.
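To illustrate the principle of camera-based marker tracking, the sketch below locates a coloured marker in each video frame by colour distance and derives a simple motion metric, the tip path length. It is a minimal illustration with a synthetic green marker and a hypothetical colour tolerance, not the tracking pipeline of the cited studies:

```python
import numpy as np

def marker_centroid(frame, marker_rgb, tol=30):
    """Return the (row, col) centroid of pixels close to the marker colour,
    or None if the marker is not visible in this frame."""
    dist = np.abs(frame.astype(int) - np.array(marker_rgb)).sum(axis=2)
    rows, cols = np.nonzero(dist < tol)
    if rows.size == 0:
        return None
    return rows.mean(), cols.mean()

def path_length(centroids):
    """Total marker travel (in pixels) over the tracked frames."""
    pts = [c for c in centroids if c is not None]
    return sum(np.hypot(a[0] - b[0], a[1] - b[1])
               for a, b in zip(pts, pts[1:]))

# Synthetic demo: a green marker moving 10 px to the right per frame.
frames = []
for step in range(3):
    frame = np.zeros((100, 100, 3), dtype=np.uint8)
    frame[50:54, 20 + 10 * step: 24 + 10 * step] = (0, 255, 0)
    frames.append(frame)

track = [marker_centroid(f, (0, 255, 0)) for f in frames]
print(round(path_length(track), 1))  # 20.0
```

From such per-frame positions, metrics known from simulator studies (path length, speed, smoothness) can be computed, although real arthroscopic footage would require far more robust segmentation.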
Another set of parameters that have been measured in the operating room are the forces and torques executed during knee arthroscopy (Chami et al. 2006). Chami and co-workers showed that force parameters can indeed discriminate between novices and experts (Chami et al. 2008).
2.2 Video and Audio
Video recordings of a procedure offer a tool that allows a holistic type of feedback with easily interpretable illustrations. However, the few studies that we could find on using video feedback to improve surgical training did not find significant differences (Backstein et al. 2004; Backstein et al. 2005). Drawbacks of video recordings are that replaying an entire operation is time-consuming and that, without post-processing, they do not provide objective measures. A similar line of reasoning applies to audio recordings. Still, with post-processing techniques, video and audio recordings reveal useful cues that could be used to monitor surgical performance. We present some examples related to arthroscopic training.
Time-action analysis is a quantitative method to determine the number and duration of actions. It represents the relative timing of different events and the duration of the individual events. In the medical field, time-action analysis has proven its value in objectifying and quantifying surgical actions (den Boer et al. 2002; Minekus et al. 2003; Sjoerdsma et al. 2000). For training, patient safety and workflow monitoring, time-action analysis can be used to detect and analyse deviations from the normal flow of the operation. This requires the documentation of reference data sets through analysis of procedures performed by expert orthopaedic surgeons. We have performed such analyses for a set of predominantly meniscectomies with the intended purpose of investigating the effectiveness of arthroscopic pump systems (Tuijthof et al. 2007, 2008). To do so, the operations were divided into four phases – (1) creation of portals, (2) joint inspection with or without a probe, (3) cutting and (4) shaving – and their share in the operation time was quantified with the time-action analysis. Comparing the mean duration of each phase with that of a trainee can indicate whether the trainee performs according to normal workflow or needs substantially more time for a certain phase. By also analysing the number of instrument exchanges, repeated actions or the percentage of disturbed arthroscopic view, trainees can receive detailed objective feedback on the skills they need to improve. Other parameters that have been analysed are the prevalence of instrument loss, triangulation time and the prevalence of lookdowns, which showed a high correlation with global rating scale scores and motion analysis (Alvand et al. 2012).
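The core computation of such a time-action analysis, the share of each phase in the total operation time, can be sketched in a few lines. The event log below is hypothetical; in practice it would come from manual or automatic tagging of the video:

```python
from collections import Counter

# Hypothetical tagged event log: (phase, start_s, end_s), as produced by
# time-action analysis of a recorded meniscectomy.
events = [
    ("portal creation", 0, 120),
    ("inspection", 120, 300),
    ("cutting", 300, 540),
    ("inspection", 540, 600),
    ("shaving", 600, 780),
]

# Accumulate the total duration spent in each phase.
durations = Counter()
for phase, start, end in events:
    durations[phase] += end - start

total = sum(durations.values())
for phase, secs in durations.items():
    print(f"{phase}: {secs} s ({100 * secs / total:.0f} % of operation time)")
```

Comparing a trainee's phase shares against expert reference values computed the same way would flag the phases that take disproportionately long.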
As these early time-action analyses were initially performed manually by replaying the video frame by frame (den Boer et al. 2002; Minekus et al. 2003; Sjoerdsma et al. 2000; Tuijthof et al. 2007, 2008), implementation of this method for training purposes is unrealistic, as it is too time-consuming. However, efforts have been made to perform such analyses automatically using image-processing techniques (Doignon et al. 2005; Tuijthof et al. 2011) or specific tracking systems (Bouarfa et al. 2012). When combined with statistical models, such as Markov models, one can even predict the flow of the operation peroperatively (Bouarfa et al. 2011; Bouarfa and Dankelman 2012; Padoy et al. 2012). Such methods could lead to tools that provide real-time objective feedback to a trainee during the operation.
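The idea behind such statistical workflow models can be sketched with a first-order Markov chain, far simpler than the models in the cited work: transition counts are estimated from previously analysed phase sequences (hypothetical here), and the most likely next phase is predicted from the current one:

```python
from collections import Counter, defaultdict

# Hypothetical phase sequences from previously analysed procedures.
sequences = [
    ["portal", "inspect", "cut", "inspect", "shave"],
    ["portal", "inspect", "cut", "shave"],
    ["portal", "inspect", "inspect", "cut", "shave"],
]

# Estimate first-order transition counts: how often each phase follows another.
transitions = defaultdict(Counter)
for seq in sequences:
    for cur, nxt in zip(seq, seq[1:]):
        transitions[cur][nxt] += 1

def predict_next(current):
    """Most likely next phase under the fitted Markov chain, or None."""
    counts = transitions[current]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("inspect"))  # cut
```

A real system would work on automatically detected phases and richer models (e.g. hidden Markov models), but the peroperative prediction step is essentially this lookup.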
Another feasible approach to implement time-action analysis techniques for training purposes is derived from the training of high-performance athletes. In this field, it is becoming daily practice to record training activities on video. To cope with the huge amount of data, sports video analysis software has been developed, which makes it easier to tag events, assign events to categories, make annotations and perform quantitative analyses. Examples of commercial video analysis software packages are Utilius (CCC software, Leipzig, Germany, www.ccc-software.de), MotionViewTM (AllSportSystems, Willow Springs, USA, www.allsportsystems.com) and SportsCode Gamebreaker Plus (Sportstec, Sydney, Australia, www.sportstec.com). We present an example of applying such software for the analysis of verbal feedback during arthroscopic training in our university hospital. During supervised training of arthroscopy, verbal communication is mainly used to guide the resident through the procedure. This suggests that the training process can be monitored through verbal communication. To investigate whether current training in the operating room involves sufficient feedback and/or questioning to stimulate active learning, verbal communication was objectified and quantified.
Within two 3-month periods, 18 arthroscopic knee procedures were recorded with a special capturing system consisting of two video cameras – one capturing the arthroscopic view and one the hands of the resident (digital CCD camera, 21CW, Sony CCD, Tokyo, Japan) – and a tie-clip microphone (ECM-3003, Monacor, Bremen, Germany) that was mounted on the supervising surgeon. The video images were combined by a colour quad processor (GS-C4CQR, Golden State Instrument Co., Tustin, USA) and digitised simultaneously with the sound by an A/D converter (ADVC 110, GV Thomson, Paris, France). The operations were performed by four residents, each supervised by one of two participating surgeons. Communication events were tagged with Utilius VS 4.3.2 (CCC-software, Leipzig, Germany) and assigned to categories for the type and content of communication (Fig. 13.3). Four communication types were adopted from Blom et al. (2007): explaining, questioning, commanding and miscellaneous (Table 13.1). As this study specifically focuses on training, one category was added, feedback, which reflects the judgment of the teaching surgeon on the actions of the resident. Six categories for communication content were defined: operation method (with an accent on steps to be taken in the near future, e.g. start creating the second portal), anatomy and pathology, instrument handling and tissue interaction (e.g. open punch, reposition instrument, stress joint, increase portal size, push meniscus backwards), visualisation (e.g. move scope, irrigation, focus), miscellaneous (general or private) and indefinable (Table 13.1). The frequency of events as a percentage of total events in each of the categories was determined (Table 13.1). A multivariable linear regression analysis was performed to determine whether the teaching surgeon and the experience of the residents significantly influenced the frequency of communication events per minute (p < 0.05).
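The frequency computation described above can be sketched as follows. The tagged events and recording duration are hypothetical, not the study data; video analysis packages export similar tagged logs:

```python
from collections import Counter

# Hypothetical tagged communication events: (type, content).
events = [
    ("commanding", "instrument handling and tissue interaction"),
    ("explaining", "anatomy and pathology"),
    ("commanding", "visualisation"),
    ("explaining", "anatomy and pathology"),
    ("questioning", "operation method"),
    ("commanding", "instrument handling and tissue interaction"),
]
duration_min = 2.0  # length of the analysed recording in minutes

# Events per minute and the share of each communication type.
by_type = Counter(t for t, _ in events)
print(f"{len(events) / duration_min:.1f} events per minute")
for etype, n in by_type.most_common():
    print(f"{etype}: {100 * n / len(events):.0f} % of events")
```

The same tabulation per content category, and per surgeon or resident, yields the inputs for the regression analysis.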
On average, 6.0 (SD 1.8) communication events took place every minute. The communication types explaining and commanding showed a considerable frequency compared to questioning and feedback (Table 13.1). The explaining events were primarily on anatomy and pathology, followed by instrument handling and tissue interaction. The commanding events were primarily on instrument handling and tissue interaction and on visualisation, which in general were the most frequent communication content categories (Table 13.1). A difference in mean events per minute was found between the two teaching surgeons (p < 0.05). No significant correlation was found between the frequency of events and the experience of the residents.
The results highlight distinctive communication patterns. The relatively high frequency of the types explaining and commanding as opposed to questioning and feedback is noticeable, as the latter two generally stimulate active learning. Additionally, explaining on the contents anatomy and pathology and instrument handling and tissue interaction is considerable. These items are particularly suitable for training outside the operating room. If they are trained beforehand, more room is left to focus on other learning goals. As a clear difference was present in the frequency of events per minute between the surgeons, and no correlation was found with the experience of the residents, we cannot confirm that this method is suitable as an objective evaluation tool for new training methods. Additional research with a larger group of residents is recommended to minimise the effect of outliers.
3 Monitoring Complex Tasks and Assessing Learning Curves
To respect the holistic assessment model, expert surgeons are needed to assess the more complex tasks. This type of assessment is sensitive to the subjective opinion of the assessor, which might compromise fair judgment (Mabrey et al. 2002). To overcome this issue, education theories recommend the formulation of rubrics, which describe clear evaluation criteria and various levels of competence. In surgical training, such rubrics are called global rating scales (GRS). The GRS suggested for arthroscopic skills are elucidated, as well as their validation and examples of their use to assess learning curves.
Within this section, we loosely follow Hodgins and Veillette who reviewed assessment tools for arthroscopic competency (Hodgins and Veillette 2013). Recently, various GRS have been developed specifically for structured, objective feedback during training of arthroscopies (Table 13.2):
1. Orthopaedic Competence Assessment Project (OCAP) (Howells et al. 2008)
2. Basic Arthroscopic Knee Skill Scoring System (BAKSSS) (Insel et al. 2009)
3. Arthroscopic Skills Assessment (ASA) (Elliott et al. 2012)
4. Objective Assessment of Arthroscopic Skills (OAAS) (Slade Shantz et al. 2013)
5. Arthroscopic Surgery Skill Evaluation Tool (ASSET) (Koehler et al. 2013)
The actual forms are available in Appendices 13.A, 13.B, 13.C, 13.D and 13.E. Notably, all arthroscopic GRS except ASA have a similar structure with 7–10 items that need to be scored on a 5-point Likert scale. At least 3 of the 5 points are explicitly described, which should support uniform assessment. Many of the items are also similar, such as instrument handling, flow of operation, efficiency and autonomy. OCAP and BAKSSS are also recommended to be used with task-specific checklists, whereas ASA solely focuses on knee arthroscopy with such a checklist. Analysing these GRS, one can conclude that a certain level of consensus exists on the arthroscopic skills that a resident should be able to demonstrate in the operating theatre and the required level to qualify as competent.
OCAP has not been tested specifically, but its items are derived from the well-established OSATS GRS, which has been validated extensively (Martin et al. 1997; Reznick et al. 1997). The four other GRS have been validated for construct, content and concurrent validity as well as internal consistency, interrater and test-retest reliability (Table 13.2). The results indicate that they meet the requirements and show a high correlation with year of residency. Note that none of the validation study designs are the same; thus, a one-to-one comparison is not possible. The ASSET has also been evaluated for summative assessment in a pass-fail examination, which was confirmed with a high rater agreement (ICC = 0.83) (Koehler and Nicandri 2013).
For OCAP and BAKSSS, we determined whether they reflect the learning curve during arthroscopic training in the operating room and what their discriminative level is. Seventy-five arthroscopic procedures performed by 15 residents in their fourth, fifth or sixth year of residency were assessed by their supervising surgeon.
Pearson correlation coefficients were calculated between year of residency and the normalised sum scores of both GRS questionnaires. The normalised sum score consisted of all points scored on each of the items, normalised to a 100-point scale. The Pearson correlation was significant for both BAKSSS (R = 0.73) and OCAP (R = 0.70). A linear regression analysis demonstrated a significant increase of the GRS sum score of 9.2 points (95 % CI 6.2–12.1) for BAKSSS and 9.5 points (95 % CI 6.5–12.5) for OCAP. We conclude that both GRS are suitable to monitor overall arthroscopic skills progression in the operating theatre.
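The normalised sum score and its correlation with residency year can be computed as sketched below, with illustrative Likert item scores rather than the study data:

```python
import statistics

def normalised_sum_score(item_scores, max_per_item=5):
    """Sum of Likert item scores rescaled to a 100-point scale."""
    return 100 * sum(item_scores) / (max_per_item * len(item_scores))

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Illustrative data: residency year vs normalised GRS score per procedure.
years = [4, 4, 5, 5, 6, 6]
scores = [normalised_sum_score(s) for s in
          [[2, 3, 2, 3], [3, 3, 2, 2], [3, 4, 3, 3],
           [4, 3, 3, 4], [4, 4, 5, 4], [5, 4, 4, 5]]]
print(round(pearson_r(years, scores), 2))  # 0.99
```

With real GRS forms the item count would be 7–10 per the scales above, and significance of R would be tested against the sample size.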
Now that the tools for monitoring surgical performance in the operating theatre have been summarised, this section focuses on the application of these tools to assess learning curves. As the number of studies is quite limited, all are briefly described. The learning curve of arthroscopic rotator cuff repair was determined using operation time as the metric (Guttmann et al. 2005). Using blocks of ten operations for comparison, a significant decrease in operation time was found between the first two blocks, but not between consecutive blocks. This indicates that learning took place in the first ten procedures. The learning curve for hip arthroscopy was determined by measuring operation time but also by determining the complication rate (Hoppe et al. 2014). Improvement was seen between early and late experience, with 30 patient cases being the most common cut-off. A similar study design was used to assess the learning curve for arthroscopic Latarjet procedures, which showed a significant decrease in operation time and complication rate between the first 15 patient cases and the consecutive 15 patient cases (Castricini et al. 2013). Van Oldenrijk and co-workers, who used time-action analysis to assess the learning curve for minimally invasive total hip arthroplasty, found that learning took place in the first five to ten patient cases (Van Oldenrijk et al. 2008). This was quantified by the number of repetitions, waiting and additional actions executed during the operation.
4 Discussion
In this chapter, monitoring tools to measure surgical performance and training progression were presented. Operation time is easy to measure and, as shown, capable of reflecting learning curves. Still, using the operation time as a measure for training purposes is less useful, since it does not give the trainee clues on what to improve, and it reflects many more factors than surgical performance alone, such as the complexity of the patient case. This is also acknowledged in the global rating scales. The tracking systems that have been used in research studies are quite expensive and require preoperative installation and calibration, which could explain the absence of studies performed in the operating room to determine learning curves. However, in the entertainment and gaming industry, motion-tracking developments are growing fast, from which the surgical training field could benefit. For example, Wii controllers are affordable and their accuracy is continuously being improved. Measuring forces as presented by Chami requires a specific measurement set-up and modification of the instruments (Chami et al. 2008). Furthermore, attention needs to be paid to the manner of feedback using force parameters, as the feedback should make sense to the trainee. Overall, these metrics are used in simulated environments and are strong in monitoring confined, less complex tasks or actions. However, video monitoring seems to reflect the required holistic judgment model needed to assess more complex cognitive tasks. The challenge is to cope with the huge amounts of data that video registration yields. In that perspective, automatic detection with image-based tracking algorithms would be a perfect alternative tool, as the arthroscopic view is available anyhow. However, until now these algorithms have lacked robustness due to continuously changing lighting conditions in the view.
With this future perspective, video analysis software as applied in athlete training might be a good alternative in the short term, especially if supervising surgeons define critical phases of the procedure that will be the focus of the learning experience, since this would limit the video recordings to those events only. A major advantage of video analysis is that it can provide highly comprehensive feedback to the trainee. Another alternative is the use of global rating scales. These scales structure and objectify the feedback of the supervising surgeons, but are not as illustrative as video feedback. Furthermore, it is recommended that assessors using the scales are trained to attain uniform assessment. However, they are truly easy to implement in residency curricula, have been demonstrated to reflect the learning curve of residents and could also be used for self-assessment. Summarising, quite a number of tools have been presented, and validation of GRS for arthroscopic skills has been performed. This offers feasible tools to continue arthroscopic skills monitoring in an objective, structured and comprehensive manner, that is, formative assessment. Still, more research is required to determine which of the tools could be used for summative assessment.
Bibliography
Aggarwal R, Grantcharov T, Moorthy K, Milland T, Papasavas P, Dosis A, Bello F, Darzi A (2007) An evaluation of the feasibility, validity, and reliability of laparoscopic skills assessment in the operating room. Ann Surg 245(6):992–999, available from: PM:17522527
Alvand A, Khan T, Al-Ali S, Jackson WF, Price AJ, Rees JL (2012) Simple visual parameters for objective assessment of arthroscopic skill. J Bone Joint Surg Am 94(13):e97, available from: PM:22760398
Alvand A, Logishetty K, Middleton R, Khan T, Jackson WF, Price AJ, Rees JL (2013) Validating a global rating scale to monitor individual resident learning curves during arthroscopic knee meniscal repair. Arthroscopy 29(5):906–912, available from: PM:23628663
Backstein D, Agnidis Z, Regehr G, Reznick R (2004) The effectiveness of video feedback in the acquisition of orthopedic technical skills. Am J Surg 187(3):427–432, available from: PM:15006577
Backstein D, Agnidis Z, Sadhu R, MacRae H (2005) Effectiveness of repeated video feedback in the acquisition of a surgical technical skill. Can J Surg 48(3):195–200, available from: PM:16013622
Blom EM, Verdaasdonk EG, Stassen LP, Stassen HG, Wieringa PA, Dankelman J (2007) Analysis of verbal communication during teaching in the operating room and the potentials for surgical training. Surg Endosc 21(9):1560–1566, available from: PM:17285367
Blum T, Feussner H, Navab N (2010) Modeling and segmentation of surgical workflow from laparoscopic video. Med Image Comput Comput Assist Interv 13(Pt 3):400–407, available from: PM:20879425
Bouarfa L, Akman O, Schneider A, Jonker PP, Dankelman J (2012) In-vivo real-time tracking of surgical instruments in endoscopic video. Minim Invasive Ther Allied Technol 21(3):129–134, available from: PM:21574828
Bouarfa L, Dankelman J (2012) Workflow mining and outlier detection from clinical activity logs. J Biomed Inform 45(6):1185–1190, available from: PM:22925724
Bouarfa L, Jonker PP, Dankelman J (2011) Discovery of high-level tasks in the operating room. J Biomed Inform 44(3):455–462, available from: PM:20060495
Bridges M, Diamond DL (1999) The financial impact of teaching surgical residents in the operating room. Am J Surg 177(1):28–32, available from: PM:10037304
Castricini R, De Benedetto M, Orlando N, Rocchi M, Zini R, Pirani P (2013) Arthroscopic Latarjet procedure: analysis of the learning curve. Musculoskelet Surg 97(Suppl 1):93–98, available from: PM:23588833
Chami G, Ward J, Wills D, Phillips R, Sherman K (2006) Smart tool for force measurements during knee arthroscopy: in vivo human study. Stud Health Technol Inform 119:85–89, available from: PM:16404020
Chami G, Ward JW, Phillips R, Sherman KP (2008) Haptic feedback can provide an objective assessment of arthroscopic skills. Clin Orthop Relat Res 466(4):963–968, available from: PM:18213507
den Boer KT, Bruijn M, Jaspers JE, Stassen LP, Erp WF, Jansen A, Go PM, Dankelman J, Gouma DJ (2002) Time-action analysis of instrument positioners in laparoscopic cholecystectomy. Surg Endosc 16(1):142–147, available from: PM:11961625
Doignon C, Graebling P, de Mathelin M (2005) Real-time segmentation of surgical instruments inside the abdominal cavity using a joint hue saturation color feature. Real Time Imaging 11:429–442
Dosis A, Aggarwal R, Bello F, Moorthy K, Munz Y, Gillies D, Darzi A (2005) Synchronized video and motion analysis for the assessment of procedures in the operating theater. Arch Surg 140:293–299
Elliott MJ, Caprise PA, Henning AE, Kurtz CA, Sekiya JK (2012) Diagnostic knee arthroscopy: a pilot study to evaluate surgical skills. Arthroscopy 28(2):218–224, available from: PM:22035780
Farnworth LR, Lemay DE, Wooldridge T, Mabrey JD, Blaschak MJ, DeCoster TA, Wascher DC, Schenck RC Jr (2001) A comparison of operative times in arthroscopic ACL reconstruction between orthopaedic faculty and residents: the financial impact of orthopaedic surgical training in the operating room. Iowa Orthop J 21:31–35, available from: PM:11813948
Guttmann D, Graham RD, MacLennan MJ, Lubowitz JH (2005) Arthroscopic rotator cuff repair: the learning curve. Arthroscopy 21(4):394–400, available from: PM:15800517
Harewood GC, Murray F, Winder S, Patchett S (2008) Evaluation of formal feedback on endoscopic competence among trainees: the EFFECT trial. Ir J Med Sci 177(3):253–256, available from: PM:18584274
Hodgins JL, Veillette C (2013) Arthroscopic proficiency: methods in evaluating competency. BMC Med Educ 13:61, available from: PM:23631421
Hoppe DJ, de Sa D, Simunovic N, Bhandari M, Safran MR, Larson CM, Ayeni OR (2014) The learning curve for hip arthroscopy: a systematic review. Arthroscopy, available from: PM:24461140
Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL (2008) Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg Br 90(4):494–499, available from: PM:18378926
Insel A, Carofino B, Leger R, Arciero R, Mazzocca AD (2009) The development of an objective model to assess arthroscopic performance. J Bone Joint Surg Am 91(9):2287–2295, available from: PM:19724008
Koehler RJ, Amsdell S, Arendt EA, Bisson LJ, Braman JP, Butler A, Cosgarea AJ, Harner CD, Garrett WE, Olson T, Warme WJ, Nicandri GT (2013) The arthroscopic surgical skill evaluation tool (ASSET). Am J Sports Med 41(6):1229–1237, available from: PM:23548808
Koehler RJ, Nicandri GT (2013) Using the arthroscopic surgery skill evaluation tool as a pass-fail examination. J Bone Joint Surg Am 95(23):e1871–e1876, available from: PM:24306710
Appendices
Appendix 13.A Orthopaedic Competence Assessment Project
Skill | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 |
---|---|---|---|---|---|
Follows protocol | Unsatisfactory | | Adequate. Occasional need for guidance and help | | Excellent adherence to agreed protocol. No prompts. No mistakes |
Handles tissue well | Careless. Potential to cause damage | | Adequate. No tissue damage. Occasional need for increased care | | Excellent tissue handling. Precise and delicate |
Appropriate and safe use of instruments | Dangerous. Risk to patient and assistant. Potential for damage to equipment | | Adequate use of instruments and scope. Occasional guidance to ensure instruments remain within field of vision | | Excellent use of instruments. Good control of arthroscope. Instruments constantly within field of vision |
Appropriate pace with economy of movement | Erratic pace and movements. Overly rushing or inappropriately slow | | Adequate economy of movement. Majority of movements controlled and careful. Occasional erratic movement | | Excellent fluidity and economy of movement. Procedure performed at appropriate pace without erratic movements |
Acts calmly and effectively with untoward events | Unable to deal with adverse events. Panic and inability to respond | | Remains calm. Remains safe. Takes advice from supervisor. Unable to cope independently | | Excellent ability to cope with adverse events. Remains calm. Deals with complication independently |
Appropriate use of assistant | Fails to involve assistant appropriately. Resultant poor positioning. Poor rapport | | Asks for appropriate joint position at appropriate times. Unable to suggest alternative positions to improve view/access | | Excellent use of assistant. Good rapport. Able to constantly modify input of assistant to best advantage throughout procedure |
Communicates with scrub nurse | Inappropriate communication resulting in confusion or operative delay | | Appropriate communication with scrub nurse. Occasional need for clarification from supervisor | | Excellent rapport with scrub nurse. Clear and effective communication, maximising procedural efficiency |
Clearly identifies common abnormalities | Unable to identify common abnormalities. Confusion over basic anatomy | | Adequate identification of common pathology. Occasional mistake. Unsure of precise classifications | | Excellent knowledge of pathology of common abnormalities. Clear understanding of classification of injuries |
Protects the articular surface | Inability to protect articular surface appropriately. Potential to cause damage | | Awareness of need to protect articular surface. Adequate care taken. Occasional prompt from supervisor required | | Excellent awareness of articular surfaces. High degree of care maintained throughout the procedure |
Appendix 13.B Basic Arthroscopic Knee Skill Scoring System
Skill | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 |
---|---|---|---|---|---|
Dissection | Appeared excessively hesitant, caused trauma to tissues, did not dissect into correct anatomical plane | | Controlled and safe dissection into correct anatomical plane, caused minimal trauma to tissues | | Superior and atraumatic dissection into the correct anatomical plane |
Instrument handling | Repeatedly makes tentative or awkward movements with instruments | | Competent use of instruments, although occasionally appeared stiff or awkward | | Fluid moves with instruments and no awkwardness |
Depth perception | Constantly overshoots target, slow to correct | | Some overshooting or missing of target | | Accurately directs instruments in the correct plane to target |
Bimanual dexterity | Noticeably awkward with non-dominant hand, poor coordination between hands | | Uses both hands but does not maximise interaction between hands | | Expertly uses both hands in complementary manner to provide optimum performance |
Flow of operation and forward planning | Frequently stopped operating or needed to discuss next move | | Demonstrated ability for forward planning with steady progression of operative procedure | | Obviously planned course of operation with effortless flow from one move to the next |
Knowledge of instruments | Frequently asked for the wrong instrument or used inappropriate instrument | | Knew the names of most instruments and used appropriate instrument for the task | | Obviously familiar with the instruments required and their names |
Efficiency | Many unnecessary, inefficient movements. Constantly changing focus or persisting without progress | | Slow, but planned movements are reasonably organised with few unnecessary or repetitive movements | | Confident, clear economy of movement and maximum efficiency |
Knowledge of specific procedure | Deficient knowledge, needed specific instruction at most operative steps | | Knew all important aspects of the operation | | Demonstrated familiarity with all aspects of the operation |
Autonomy | Unable to complete entire task, even with verbal guidance | | Able to complete task safely with moderate guidance | | Able to complete task independently without prompting |
Quality of final product | Very poor | | Competent | | Clearly superior |
Appendix 13.C Arthroscopic Skills Assessment
Start time | Stop time | Total time |
---|---|---|
Landmark | To be visualised | Score |
---|---|---|
Suprapatellar pouch | View all areas of pouch | (3) |
Patella | View medial facet | (3) |
 | View lateral facet | (3) |
Trochlea | View trochlear surface | (4) |
Medial recess | View medial gutter/assess meniscal synovial junction | (4) |
Lateral recess | View lateral gutter/assess meniscal junction/popliteus | (4) |
Medial compartment | Assess condyle for chondral lesions | (5) |
 | Meniscus/view anterior, middle, posterior | (5) |
 | Probe superior and inferior surface | (10) |
Intercondylar notch | View and inspect ACL | (5) |
 | View and inspect PCL | (5) |
Lateral compartment | Assess condyle for chondral lesions | (5) |
 | Meniscus/view anterior, middle, posterior | (5) |
 | Probe superior and inferior surface | (10) |
 | View popliteus tendon | (4) |
Missed items | Scope score | Time | Time penalty | Total time score | Total score |
---|---|---|---|---|---|
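The checklist above awards a fixed number of points per visualised or probed item (the weights sum to 75) and offsets the scope score with a time penalty to reach the total. As a minimal sketch of how such a tally could be computed (the chapter does not specify the time-penalty rule, so the `allowed_minutes` and `penalty_per_min` parameters below are hypothetical placeholders):

```python
# Scoring sketch for the Appendix 13.C diagnostic-arthroscopy checklist.
# Item weights are taken from the table above; the time-penalty rule is
# NOT given in the chapter, so allowed_minutes/penalty_per_min are
# hypothetical parameters, not part of the published instrument.

CHECKLIST = {
    "suprapatellar pouch": 3,
    "patella: medial facet": 3,
    "patella: lateral facet": 3,
    "trochlea": 4,
    "medial recess": 4,
    "lateral recess": 4,
    "medial compartment: condyle": 5,
    "medial meniscus: view": 5,
    "medial meniscus: probe": 10,
    "ACL": 5,
    "PCL": 5,
    "lateral compartment: condyle": 5,
    "lateral meniscus: view": 5,
    "lateral meniscus: probe": 10,
    "popliteus tendon": 4,
}  # weights sum to 75


def scope_score(completed):
    """Sum the weights of the landmarks actually visualised/probed."""
    return sum(CHECKLIST[item] for item in completed)


def total_score(completed, minutes, allowed_minutes=15, penalty_per_min=2):
    """Scope score minus a linear overtime penalty (hypothetical rule)."""
    overtime = max(0.0, minutes - allowed_minutes)
    return scope_score(completed) - penalty_per_min * overtime
```

With every item completed inside the allowed time, `total_score` reproduces the checklist maximum of 75; any other penalty scheme can be substituted without changing the item tally.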
Appendix 13.D Objective Assessment of Arthroscopic Skills
Skill | Novice | Advanced beginner | Competent | Proficient | Expert |
---|---|---|---|---|---|
Examining/manipulating joint | Did not examine joint or position to give improved visualisation during procedure | Examined joint without diagnostic abilities and lacked ability to facilitate view by positioning | Positioned knee appropriately after some difficulty with visualisation | Used common positioning to facilitate view during arthroscopy | Used accepted and novel positioning to perform the arthroscopy effortlessly |
Triangulating instruments | Could not insert instruments into ports and maintain them in view. Unable to locate instrument tips without difficulty | Unable to maintain instrument in field of view consistently | Found instruments with delay. Field of view wandered from operative site but returned | Found instruments quickly and began work. Occasionally delayed in orienting camera to afford better visualisation | Immediately located instruments and began work without delay. Kept instrument in field of view at all times |
Controlling fluid flow and joint distension | Under-/overdistended joint consistently due to inappropriate matching of suction and flow. | Achieved proper distension after delays. Some extravasation into tissue due to overdistension | Distended joint adequately after initial loss of pressure during suction | Joint distended appropriately through control of flow and suction | Minimal fluid extravasated with constantly maintained field of view |
Maintaining field of view | Often disoriented. Was unable to adjust scope to improve visualisation | Maintained field of view part of the time | Maintained and adjusted arthroscope to provide maximal view with some difficulty | Maintained field of view in same portal | Changed portals quickly to improve visualisation |
Controlling instruments | Was unable to perform tasks with provided instruments. Caused cartilage damage | Repeatedly made tentative or awkward moves with instruments | Competently used instruments although occasionally appeared stiff or awkward | Used instruments appropriately and efficiently | Made fluid moves with instruments and used some instruments in novel ways to increase efficiency |
Economising time and planning forward | Was unable to complete any portion of the procedure | Was able to complete components of the procedure, but needed to discuss next move | Completed all components of the operation with some unnecessary moves | Was efficient, but continued discovering new time saving motions | Showed economy of movement and maximum efficiency |
Overall | Possessed rudimentary arthroscopic skills with only basic anatomical and mechanical understanding | Knew basic steps of procedure and performed some independently | Performed the procedure independently | Performed procedure with changes to improve efficiency | Performed the procedure with minimal chance to improve efficiency |
Complexity | No difficulties | Slightly difficult | Moderately difficult | Considerable difficulty | Critical |
Appendix 13.E Arthroscopic Surgical Skill Evaluation Tool
Skill | Score 1 | Score 2 | Score 3 | Score 4 | Score 5 |
---|---|---|---|---|---|
Safety | Significant damage to articular cartilage or soft tissue | | Insignificant damage to articular cartilage or soft tissue | | No damage to articular cartilage or soft tissue |
Field of view | Narrow field of view, inadequate arthroscope or light source positioning | | Moderate field of view, adequate arthroscope and light source positioning | | Expansive field of view, optimal arthroscope and light source positioning |
Camera dexterity | Awkward or graceless movements, fails to keep camera centred and correctly oriented | | Appropriate use of camera, occasionally needs to reposition | | Graceful and dexterous throughout procedure with camera always centred and correctly oriented |
Instrument dexterity | Overly tentative or awkward with instruments, unable to consistently direct instruments to targets | | Careful, controlled use of instruments, occasionally misses targets | | Confident and accurate use of all instruments |
Bimanual dexterity | Unable to use both hands or no coordination between hands | | Uses both hands but occasionally fails to coordinate movement of camera and instruments | | Uses both hands to coordinate camera and instrument positioning for optimal performance |
Flow of procedure | Frequently stops operating or persists without progress, multiple unsuccessful attempts prior to completing tasks | | Steady progression of operative procedure with few unsuccessful attempts prior to completing tasks | | Obviously planned course of procedure, fluid transition from one task to the next with no unsuccessful attempts |
Quality of procedure | Inadequate or incomplete final product | | Adequate final product with only minor flaws that do not require correction | | Optimal final product with no flaws |
Autonomy | Unable to complete procedure even with intervention(s) | | Able to complete procedure but required intervention(s) | | Able to complete procedure without intervention |
Complexity | No difficulty | | Moderate difficulty (mild inflammation or scarring) | | Extreme difficulty (severe inflammation or scarring, abnormal anatomy) |
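A common way to use a global rating scale of this kind is to sum the eight skill-domain ratings into a single score (possible range 8 to 40), while recording case complexity separately, since complexity describes the case rather than the trainee. The sketch below assumes that convention; the domain identifiers and the validation logic are illustrative, not part of the published instrument:

```python
# Aggregation sketch for an ASSET-style global rating scale.
# Assumption: the eight skill domains are each rated 1-5 and summed,
# while "complexity" is recorded separately as a case descriptor.

ASSET_DOMAINS = (
    "safety", "field_of_view", "camera_dexterity", "instrument_dexterity",
    "bimanual_dexterity", "flow_of_procedure", "quality_of_procedure",
    "autonomy",
)


def asset_total(ratings):
    """Sum the eight domain ratings (possible range 8-40) after validation."""
    missing = [d for d in ASSET_DOMAINS if d not in ratings]
    if missing:
        raise ValueError(f"unrated domains: {missing}")
    for d in ASSET_DOMAINS:
        if not 1 <= ratings[d] <= 5:
            raise ValueError(f"{d}: rating must be 1-5, got {ratings[d]}")
    return sum(ratings[d] for d in ASSET_DOMAINS)
```

Keeping the total as a plain sum of equally weighted domains makes repeated assessments directly comparable, which is what allows a resident's learning curve to be plotted over successive cases.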
© 2015 ESSKA
Tuijthof, G.J.M., Sierevelt, I.N. (2015). Monitoring Performance and Progression in the Operating Theatre. In: Karahan, M., Kerkhoffs, G., Randelli, P., Tuijthof, G. (eds) Effective Training of Arthroscopic Skills. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44943-1_13
Print ISBN: 978-3-662-44942-4
Online ISBN: 978-3-662-44943-1