Abstract
Purpose
Simulation-based medical education (SBME) is provided by all anesthesiology residency programs in Canada. The purpose of this study was to characterize SBME in Canadian anesthesiology residency training programs.
Methods
We administered a 21-question survey to the simulation director/coordinator for all 17 Canadian academic departments of anesthesiology from October 2019 to January 2020. The survey consisted of questions pertaining to the characteristics of the simulation centres, their faculty, learners, curriculum, and assessment processes.
Results
All 17 residency training programs participated in the survey and reported large variability in the number and formal training of simulation faculty and in content delivery. Five programs (29%) did not provide faculty recognition for curriculum design and running simulation sessions. Most programs offered one to four simulation sessions per academic year for each year of residency. All programs offered mannequin-based and part-task trainers for teaching technical and nontechnical skills. Fourteen programs (82%) offered interprofessional and interdisciplinary simulation sessions, and ten programs (59%) did not include in situ simulation training. Commonly reported barriers to faculty involvement were lack of protected time (12 programs, 71%), lack of financial compensation (ten programs, 59%), and lack of appreciation for SBME (seven programs, 41%).
Conclusion
Large variability exists in the delivery of SBME in Canadian anesthesiology residency simulation programs, in part because of differences in financial/human resources and educational content. Future studies should explore whether training and patient outcomes differ between SBME programs and, if so, whether additional standardization is warranted.
Résumé
Objectif
La formation médicale par simulation est offerte par tous les programmes de résidence en anesthésiologie au Canada. L’objectif de cette étude était de déterminer l’état actuel de la formation médicale par simulation dans les programmes canadiens de résidence en anesthésiologie.
Méthode
D’octobre 2019 à janvier 2020, nous avons administré un sondage comportant 21 questions aux directions et équipes de coordination de la simulation des 17 départements universitaires d’anesthésiologie canadiens. L’enquête comportait des questions portant sur les caractéristiques des centres de simulation, le corps professoral, les apprenants et apprenantes, le programme d’études et les processus d’évaluation.
Résultats
Les 17 programmes de résidence ont tous participé à l’enquête et ont fait état d’une grande variabilité dans le nombre et la formation officielle du corps professoral en simulation ainsi que dans la prestation de contenu. Cinq programmes (29 %) n’ont pas reconnu le corps professoral en charge de la conception des programmes d’études et de l’organisation des séances de simulation. La plupart des programmes offraient une à quatre séances de simulation par année universitaire à chaque année de résidence. Tous les programmes disposaient de simulateurs d’entraînement pour tâches partielles et de mannequins pour enseigner des compétences techniques et non techniques. Quatorze programmes (82 %) offraient des séances de simulation interprofessionnelles et interdisciplinaires, et dix programmes (59 %) ne comportaient pas de formation par simulation in situ. Les obstacles les plus fréquemment signalés à la participation du corps professoral étaient le manque de temps protégé (12 programmes, 71 %), le manque de compensation financière (dix programmes, 59 %) et le manque d’appréciation de la formation médicale par simulation (sept programmes, 41 %).
Conclusion
Il existe une grande variabilité dans la prestation de formation médicale par simulation dans les programmes de simulation pendant la résidence en anesthésiologie au Canada, causée en partie par des différences dans les ressources financières et humaines et par le contenu de la formation. Des études futures devraient déterminer si la formation et les issues pour les patient·es diffèrent d’un programme de formation médicale par simulation à l’autre et, dans l’affirmative, si une normalisation supplémentaire est justifiée.
Competency-based medical education (CBME) has dramatically changed medical education in Canadian anesthesiology programs. Competency-based medical education is an outcomes-based approach to the design, implementation, assessment, and evaluation of medical education.1,2 Society expects educational programs to ensure physicians deliver safe and effective care. The transition to CBME was intended to ensure that all graduating residents achieve the necessary competencies before independent practice.3 In an ideal situation, all the required competencies can be observed or assessed during routine clinical practice. In reality, anesthesiology training is unpredictable and heterogeneous, and many high-stakes situations may not be amenable or appropriate for trainee assessment.4 Simulation-based medical education (SBME), which is defined as any educational activity that uses simulation to replicate clinical scenarios, can be used to complement traditional methods of medical education and allow learners to practice and reinforce their knowledge, skills, and attitudes, especially in low-frequency, high-stakes clinical events.5,6,7
Simulation-based medical education was widely adopted in anesthesiology training programs well before the CBME model was introduced, primarily in response to decreasing tolerance for medical errors and greater emphasis on patient safety.8 Simulation-based medical education is already being used to teach and assess several anesthesiology competencies that are difficult to acquire, including procedural and communication skills, interprofessional learning, nontechnical skills (e.g., situational awareness, leadership, teamwork), and management of rare and emergent clinical situations.5,9 Simulation-based assessments have also been shown to provide valid competency assessments that correlate with clinical performance.10,11 In Canada, all anesthesiology residents must successfully complete five mandatory standardized mannequin-based simulation scenarios, called the Canadian National Anesthesiology Simulation Curriculum (CanNASC) scenarios, prior to certification by the Royal College of Physicians and Surgeons of Canada.12
In response to the implementation of CBME in anesthesiology, the demand for simulation-based training and assessment is anticipated to increase. Thus, there needs to be a greater understanding of how SBME is currently being used for teaching and assessment, the available resources, and any potential barriers to the use of this educational modality. We conducted this study to assess the content and assessment criteria of SBME in Canadian academic departments of anesthesiology.
Methods
Ethics approval for this study was obtained from the University of Alberta Research Ethics Board (Edmonton, AB, Canada; approval number, 93102).
Survey development
We developed an English-language survey in accordance with established guidelines.13 A search of MEDLINE, PubMed®, and Google Scholar identified four prior surveys of simulation use and resource studies.14,15,16,17,18 We created the initial survey questions based on themes and topics described in these articles, and modified the questions based on gaps identified within our Canadian context. Our collective experience includes graduate training in education, fellowship training in simulation and medical education, and experience developing and implementing both summative and formative simulation curricula in Canadian residency programs.
The survey captured demographic information, simulation centre characteristics, simulation faculty characteristics, training program characteristics, simulation content, and learner assessment. The investigators held group discussions, during which items were removed, added, or modified until consensus was achieved on survey question content, survey length, and appropriate response options. We used checkboxes for all question responses as a previous report has shown that closed-ended questions result in fewer incomplete questionnaires than open-ended formats.19 We piloted the survey with an anesthesiology faculty member with expertise in SBME, who gave feedback about the flow, content, clarity, time required for completion, and user-friendliness. J. W. B., a methodologist with expertise in conducting survey-based research, reviewed the survey. The resulting feedback was reviewed and incorporated into the final version of the survey, which consisted of 21 questions (see Electronic Supplementary Material eAppendix).
Study population
All 17 Canadian academic departments of anesthesiology were invited to take part in the study. The simulation director/coordinator at each site was identified and either they or their delegate completed the survey.
Survey distribution
We administered and collected surveys from October 2019 to January 2020. An electronic consent form, which described the background and intent of the study, the survey content and length, and study investigator contact information, was provided to each institution’s designated SBME contact. A link to the survey, created using Google Forms (Alphabet Inc., Mountain View, CA, USA), was attached to each e-mail invitation. The survey was made available online for ten weeks, with two reminder e-mails sent during the third and sixth weeks after the initial invitation. Involvement in the study was voluntary. All data were stored within a password-protected university-affiliated Google account. No incentives to participate were provided.
Statistical analysis
All survey responses were downloaded from Google Forms and analyzed using Microsoft® Excel version 16 (Microsoft Corporation, Redmond, WA, USA). Demographic data and characteristics of educational programs, simulation content, and simulation assessment were presented as frequencies with percentages and median with range where appropriate.
Results
Demographics
The response rate for the survey was 17/17 (100%). Among the respondents, 9/17 (53%) were simulation directors, while the remaining eight respondents were the local CanNASC leads of their respective simulation programs.
Simulation centre characteristics
Eleven simulation centres (65%) were funded by their university’s faculty of medicine. The second and third most common funding sources were the provincial health authority (5 of 17; 29%) and the department of anesthesiology (4 of 17; 24%). Approximately half of the simulation centres were accredited, most commonly by the Royal College of Physicians and Surgeons of Canada (Table 1).
Simulation faculty characteristics
There was large variability regarding simulation faculty number, roles, and formal training across programs (Table 2). The median number of faculty members involved in simulation at each site was eight (range, 4–20). Three programs (18%) reported no fellowship-trained faculty and eight (47%) had only one to two fellowship-trained faculty. Most programs reported faculty who had completed a simulation instructor course or were trained by other faculty members. Table 3 reports the types of compensation received by simulation faculty. Of note, five programs (29%) did not offer any form of faculty compensation for those involved in curriculum design and development, or for running and debriefing simulation sessions. Seven of the programs (41%) were accredited by the Royal College of Physicians and Surgeons of Canada. The three most reported barriers to faculty involvement in SBME were lack of protected time (12 programs; 71%), lack of financial compensation (ten programs; 59%), and lack of appreciation for SBME (seven programs; 41%) (Table 4).
Simulation program characteristics
Fourteen programs (82%) provided a structured rotating annual simulation curriculum. Ten simulation programs (59%) were designed in collaboration with the Residency Program Committee, and eight (47%) were integrated into existing academic curriculums. Fifteen programs (88%) offered an introductory first-year intensive simulation course designed to help transition the learner into residency training. Every program offered both mannequin-based simulators and part-task trainers (Table 5).
Most residents received one to four days of simulation sessions per year during their first, second, third, and fifth years of residency (Table 6). Two programs (12%) offered 13–16 days of simulation for first-year trainees. Residents in their fourth year of training received five to eight days of simulation sessions per year. Outside of CanNASC, three programs (18%) did not offer any simulation sessions to their fourth- and fifth-year residents.
Simulation curricular content
More than 90% of the programs included technical or procedural skills training, mannequin-based management of rare perioperative problems, and crisis resource management training during residency training (Table 7). Three programs (18%) did not provide interdisciplinary or interprofessional training or pediatric simulation. Ten programs (59%) did not include in situ simulation training. Only two programs (12%) did not offer training in interpersonal communication skills, and four programs (24%) did not offer simulation sessions related to ethical considerations in anesthesiology practice.
With regard to part-task trainers, all programs provided advanced airway simulation-based training. The provision of simulation training with other part-task trainers, such as those for echocardiography, invasive access or monitoring, bronchoscopy, and regional anesthesia skills, ranged from six (35%) to 13 (76%) programs (Table 8).
Resident assessment
CanNASC is a mandatory summative simulation-based examination that all senior anesthesiology residents must successfully complete during their training; consequently, all programs use this form of summative simulation-based assessment. Outside of CanNASC, only four programs (24%) used simulation for summative assessment. Among these four programs, the format of assessment varied between mannequin-based (3/4; 75%), objective structured clinical examinations (2/4; 50%), and part-task trainers (1/4; 25%) (Table 9). The purpose underlying summative assessment also varied, with three programs (75%) focusing on the assessment of CanMEDS roles and promotion of residents to the next level of seniority.
Discussion
Simulation-based medical education is a core component of anesthesiology residency training; however, our survey found important variability in the design, content, and experience of simulation faculty across the 17 programs in Canada. Whether this variability affects the clinical competencies of trainees or, more importantly, patient outcomes, is uncertain.
With respect to simulation faculty characteristics, most programs we surveyed had engaged three to five faculty members to create scenarios and to run and debrief simulation sessions, which is similar to centres in the USA.15 Also, as was the case in our study, inadequate financial compensation and time for SBME faculty were reported in three prior international surveys and one Canadian survey exploring barriers to the use of simulation in anesthesiology.15,18,20,21 Many programs also reported a lack of appreciation or recognition for SBME among their nonsimulation faculty. Reasons may include inadequate faculty involvement in simulation activities, lack of awareness of the evidence supporting SBME, and an absence of local champions/leaders advocating for SBME.22 In our study, 18% of programs did not have any faculty who had completed simulation fellowships. Faculty who have completed formal fellowships in simulation education may champion and improve the profile of SBME within their department, and fellowship training has been shown to improve career satisfaction and scholarly success.23
A previous Canadian survey of anesthesiology residents from 2010 found perceived variability in the way simulation was delivered at each training program.16 Our survey of Canadian simulation directors or their proxies has confirmed these impressions. Our findings are also consistent with an American survey that found large variations in simulation-based training and assessment programs.15 The clinical experiences and exposures of residents during their training period can vary, and relying solely on real patient interactions may not guarantee that residents will be exposed to all the necessary training requirements laid out by published national residency curriculums. Thus, simulation is an effective tool to complement our current educational programs to ensure the necessary competencies are achieved.5,6 In addition, integrating SBME into a standardized educational curriculum may support the sustainability and quality of programs.24 Recognizing the importance of SBME, Scottish junior doctors and North American emergency medicine programs have already attempted to standardize and implement national simulation curriculums for teaching and/or assessment.25,26,27,28
Nevertheless, in light of the monetary costs and resource demands of simulation training, more research is needed on the optimal method of SBME delivery in academic centres before it becomes standard.
A 2010 Canadian survey explored residents’ experiences and attitudes towards SBME. This survey found that junior and senior residents received a median of two simulation sessions per year,16 whereas our survey found that many programs had significantly increased the number of simulation sessions over the subsequent ten years. In the 2010 survey, 81% of the residents agreed that an introductory simulator course focused on management of common intraoperative emergencies should be available, and we found that 88% of programs currently provide such a course.16 These encouraging findings highlight the progress SBME has made over the years, which may indicate a greater acceptance of SBME, less acceptance of patient risk during training, transformation of medical education delivery (i.e., CBME), and reduced costs of purchasing and maintaining simulation equipment.24
Eighty-two percent of the programs we surveyed offered interprofessional or interdisciplinary team training, which is a substantial increase compared with the 2010 Canadian survey,16 in which 76% of residents “rarely” or “never” practiced with individuals from other programs or specialties. Practicing within a team is important during residency training as team training has been shown to improve crisis resource management skills and is associated with greater patient safety.27,28,29 Furthermore, team training within authentic clinical environments, also referred to as in situ simulation training, has been associated with increased detection of latent safety threats and improved patient outcomes.30,31,32,33 Team training requires participation by several disciplines and allied health care professionals and often requires endorsement and support from organizational leaders.33 Our survey found that team-based simulation training has become increasingly common, which suggests that greater acceptance of SBME has occurred in fields outside of anesthesiology.
Seventy-six percent of programs did not use simulation for summative assessments outside of the CanNASC scenarios. Simulation is highly stressful for learners, and simulation instructors have traditionally prioritized formative (learning) purposes and confidentiality as a way of maintaining a safe learning environment.34,35 These concerns, compounded by the resource-intensive nature of SBME, may partially explain why programs have been slow to adopt summative formats. In response to these resource constraints, a recent study by Fleming et al. found that the optimal numbers of raters and scenarios needed for competency-based assessments in a simulated setting were two and four, respectively.36 Such innovative studies are needed to help balance our current resource supply with educational demand.
The advent of CBME in anesthesiology is likely to shift the traditional paradigm of SBME and accelerate the adoption of summative-based SBME, as residents will require more frequent contextualized assessments of their level of competence, especially with rare clinical presentations.12,36 As depicted in Miller’s pyramid, simulation is an ideal tool for assessing a learner’s “shows how” level of competence, thus reflecting a more accurate picture of how a trainee behaves in their actual clinical setting.37 Nevertheless, before widespread adoption, more research is needed regarding proper implementation and validity evidence for summative simulation-based assessment.38
Limitations
This study has several limitations. First, we only surveyed academic centres and focused primarily on SBME offered to residents; our findings may not be representative of SBME offered through community centres or to other learners, such as staff, fellows, or medical students. Many residents rotate through community centres during their training, and simulation training in these centres is often run independently of the residents’ formal simulation curriculum. Second, we relied on self-report and did not confirm the information provided, which may have been affected by positive response bias. Third, not all respondents were local simulation directors. Recruitment was conducted through our national CanNASC working group, and in some centres, the CanNASC lead is not the simulation director/coordinator. In those instances, the CanNASC lead had the opportunity to clarify or seek answers from their local simulation faculty, including their simulation director. Lastly, our study was conducted prior to the onset of the SARS-CoV-2 pandemic and the postpandemic surgical backlog. The pandemic imposed drastic changes and necessary adaptations in the delivery of medical education, including simulation-based training.39,40,41,42 Many centres had to divert simulation resources (space and personnel) toward team preparation and personal protective equipment training during the pandemic. In addition, strict workplace restrictions were imposed that made running routine simulation sessions impossible. In response to these changes, the simulation community began shifting more toward distance and remote simulation techniques.39,40,41,42 Furthermore, with our current national shortage of anesthesiologists and surgical backlog, a greater demand for clinical services may divert time and resources away from nonclinical activity, including simulation-based medical education.43 It is unclear how these factors have influenced our findings.
Conclusion
We found significant progress and innovation in the use of simulation within academic Canadian anesthesiology programs over the past ten years. Substantial variations in these programs, including faculty support, financial and human resources, educational content, and delivery of curricula still exist. Future work is needed to establish optimal strategies to integrate simulation within the CBME framework, explore the return on investment of simulation activities, and gather more validity evidence for SBME.
References
Lockyer J, Carraccio C, Chan MK, et al. Core principles of assessment in competency-based medical education. Med Teach 2017; 39: 609–16. https://doi.org/10.1080/0142159x.2017.1315082
Frank JR, Snell L, Englander R, Holmboe ES, ICBME Collaborators. Implementing competency-based medical education: moving forward. Med Teach 2017; 39: 568–73. https://doi.org/10.1080/0142159x.2017.1315069
Touchie C, Kinnear B, Schumacher D, et al. On the validity of summative entrustment decisions. Med Teach 2021; 43: 780–7. https://doi.org/10.1080/0142159x.2021.1925642
Holmboe ES. Realizing the promise of competency-based medical education. Acad Med 2015; 90: 411–3. https://doi.org/10.1097/acm.0000000000000515
Dupre J, Naik VN. The role of simulation in high-stakes assessment. BJA Educ 2021; 21: 148–53. https://doi.org/10.1016/j.bjae.2020.12.002
Sydor DT, Sherbino J, Frank JR. Chapter 10: Simulation and competency-based medical education: “showing how.” In: Sherbino J, Frank JR (Eds.). Educational Design: A CanMEDS Guide for the Health Professions. Ottawa: Royal College of Physicians and Surgeons of Canada; 2011: 65–9.
Al-Elq AH. Simulation-based medical teaching and learning. J Fam Community Med 2010; 17: 35–40. https://doi.org/10.4103/1319-1683.68787
Leblanc VR. Simulation in anesthesia: state of the science and looking forward. Can J Anesth 2012; 59: 193–202. https://doi.org/10.1007/s12630-011-9638-8
Weller JM, Naik VN, San Diego RJ. Systematic review and narrative synthesis of competency-based medical education in anaesthesia. Br J Anaesth 2020; 124: 748–60. https://doi.org/10.1016/j.bja.2019.10.025
Kealey A, Naik VN. Competency-based medical training in anesthesiology: has it delivered on the promise of better education? Anesth Analg 2022; 135: 223–9. https://doi.org/10.1213/ane.0000000000006091
Isaak RS, Chen F, Martinelli SM, et al. Validity of simulation-based assessment for Accreditation Council for Graduate Medical Education milestone achievement. Simul Healthc 2018; 13: 201–10. https://doi.org/10.1097/sih.0000000000000285
Chiu M, Tarshis J, Antoniou A, et al. Simulation-based assessment of anesthesiology residents’ competence: development and implementation of the Canadian National Anesthesiology Simulation Curriculum (CanNASC). Can J Anesth 2016; 63: 1357–63. https://doi.org/10.1007/s12630-016-0733-8
Artino AR Jr, Durning SJ, Sklar DP. Guidelines for reporting survey-based research submitted to academic medicine. Acad Med 2018; 93: 337–40. https://doi.org/10.1097/acm.0000000000002094
Isaak RS, Chen F, Arora H, Martinelli SM, Zvara DA, Stiegler MP. A descriptive survey of anesthesiology residency simulation programs: how are programs preparing residents for the new American Board of Anesthesiology APPLIED certification examination? Anesth Analg 2017; 125: 991–8. https://doi.org/10.1213/ane.0000000000002189
Rochlen LR, Housey M, Gannon I, Tait AR, Naughton N, Kheterpal S. A survey of simulation utilization in anesthesiology residency programs in the United States. A A Case Rep 2016; 6: 335–42. https://doi.org/10.1213/xaa.0000000000000304
Price JW, Price JR, Pratt DD, Collins JB, McDonald J. High-fidelity simulation in anesthesiology training: a survey of Canadian anesthesiology residents’ simulator experience. Can J Anesth 2010; 57: 134–42. https://doi.org/10.1007/s12630-009-9224-5
Russell E, Hall AK, Hagel C, Petrosoniak A, Dagnone JD, Howes D. Simulation in Canadian postgraduate emergency medicine training—a national survey. CJEM 2018; 20: 132–41. https://doi.org/10.1017/cem.2017.24
Savoldelli GL, Naik VN, Hamstra SJ, Morgan PJ. Barriers to use of simulation-based education. Can J Anesth 2005; 52: 944–50. https://doi.org/10.1007/bf03022056
Griffith L, Cook DJ, Guyatt GH, Charles CA. Comparison of open and closed questionnaire formats in obtaining demographic information from Canadian general internists. J Clin Epidemiol 1999; 52: 997–1005. https://doi.org/10.1016/s0895-4356(99)00106-7
Savoldelli GL, Østergaard D. Simulation-based education and training in anaesthesia during residency in Europe: where are we now? A survey conducted by the European Society of Anaesthesiology and Intensive Care Simulation Committee. Eur J Anaesthesiol 2022; 39: 558–61. https://doi.org/10.1097/eja.0000000000001667
Morgan PJ, Cleave-Hogg D. A worldwide survey of the use of simulation in anesthesia. Can J Anesth 2002; 49: 659–62. https://doi.org/10.1007/bf03017441
Ferguson J, Astbury J, Willis S, Silverthorne J, Schafheutle E. Implementing, embedding and sustaining simulation‐based education: what helps, what hinders. Med Educ 2020; 54: 915–24. https://doi.org/10.1111/medu.14182
Hughes PG, Brito JC, Ahmed RA. Training the trainers: a survey of simulation fellowship graduates. Can Med Ed J 2017; 8: e81–9. https://doi.org/10.36834/cmej.36865
Harrison NM, Dennis A. Developing an integrated national simulation-based educational programme for Scottish junior doctors through structured, multistep action research cycles. BMJ Open 2022; 12: e059229. https://doi.org/10.1136/bmjopen-2021-059229
Binstadt ES, Walls RM, White BA, et al. A comprehensive medical simulation education curriculum for emergency medicine residents. Ann Emerg Med 2007; 49: 495–504. https://doi.org/10.1016/j.annemergmed.2006.08.023
Dagnone JD, McGraw R, Howes D, et al. How we developed a comprehensive resuscitation-based simulation curriculum in emergency medicine. Med Teach 2016; 38: 30–5. https://doi.org/10.3109/0142159X.2014.976187
Weile J, Nebsbjerg MA, Ovesen SH, Paltved C, Ingeman ML. Simulation-based team training in time-critical clinical presentations in emergency medicine and critical care: a review of the literature. Adv Simul (Lond) 2021; 6: 3. https://doi.org/10.1186/s41077-021-00154-4
Palaganas JC, Epps C, Raemer DB. A history of simulation-enhanced interprofessional education. J Interprof Care 2014; 28: 110–5. https://doi.org/10.3109/13561820.2013.869198
Navedo A, Pawlowski J, Cooper JB. Multidisciplinary and interprofessional simulation in anesthesia. Int Anesthesiol Clin 2015; 53: 115–33. https://doi.org/10.1097/aia.0000000000000077
Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: a systematic review. J Gen Intern Med 2013; 28: 1078–89. https://doi.org/10.1007/s11606-012-2264-5
Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: a systematic review and meta-analysis. Acad Med 2015; 90: 246–56. https://doi.org/10.1097/acm.0000000000000549
Josey K, Smith ML, Kayani AS, et al. Hospitals with more-active participation in conducting standardized in-situ mock codes have improved survival after in-hospital cardiopulmonary arrest. Resuscitation 2018; 133: 47–52. https://doi.org/10.1016/j.resuscitation.2018.09.020
Kurup V, Matei V, Ray J. Role of in-situ simulation for training in healthcare: opportunities and challenges. Curr Opin Anaesthesiol 2017; 30: 755–60. https://doi.org/10.1097/aco.0000000000000514
Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simul Gaming 2001; 32: 175–93. https://doi.org/10.1177/104687810103200206
Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007; 2: 115–25. https://doi.org/10.1097/sih.0b013e3180315539
Fleming M, McMullen M, Beesley T, Egan R, Field S. Simulation-based evaluation of anaesthesia residents: optimising resource use in a competency-based assessment framework. BMJ Simul Technol Enhanc Learn 2020; 6: 339–43. https://doi.org/10.1136/bmjstel-2019-000504
Miller GE. The assessment of clinical skills/competence/performance. Acad Med 1990; 65: S63–7. https://doi.org/10.1097/00001888-199009000-00045
Buléon C, Mattatia L, Minehart RD, et al. Simulation-based summative assessment in healthcare: an overview of key principles for practice. Adv Simul (Lond) 2022; 7: 42. https://doi.org/10.1186/s41077-022-00238-9
Alves Bastos e Castro M, Lucchetti G. Simulation in healthcare education during and after the COVID-19 pandemic. Simul Healthc 2020; 15: 298–9. https://doi.org/10.1097/sih.0000000000000492
Buléon C, Caton J, Park YS, et al. The state of distance healthcare simulation during the COVID-19 pandemic: results of an international survey. Adv Simul (Lond) 2022; 7: 10. https://doi.org/10.1186/s41077-022-00202-7
Daniel M, Gordon M, Patricio M, et al. An update on developments in medical education in response to the COVID-19 pandemic: a BEME scoping review: BEME Guide No. 64. Med Teach 2021; 43: 253–71. https://doi.org/10.1080/0142159x.2020.1864310
Gordon M, Patricio M, Horne L, et al. Developments in medical education in response to the COVID-19 pandemic: a rapid BEME systematic review: BEME Guide No. 63. Med Teach 2020; 42: 1202–15. https://doi.org/10.1080/0142159x.2020.1807484
McVicar JA, Orser BA, Wilson CR. No community left behind: advancing rural anesthesia, surgery, and obstetric care in Canada. Can J Anesth 2022; 69: 1443–8. https://doi.org/10.1007/s12630-022-02340-y
Author contributions
Yuqi Gu, Marshall Tenenbein, Linda Korz, Jason Busse, and Michelle Chiu contributed substantially to all aspects of this manuscript, including study conception and design; acquisition, analysis, and interpretation of data; and drafting of the manuscript.
Acknowledgement
We sincerely thank all the simulation faculty who gave their time to complete our survey.
Disclosures
None declared.
Funding statement
This was an unfunded study. Jason Busse is funded, in part, by a CIHR Canada Research Chair in Prevention & Management of Chronic Pain. Drs Chiu and Gu were supported by The Ottawa Hospital Anesthesia Alternate Funds Association.
Editorial responsibility
This submission was handled by Dr. Stephan K. W. Schwarz, Editor-in-Chief, Canadian Journal of Anesthesia/Journal canadien d’anesthésie.
Cite this article
Gu, Y., Tenenbein, M., Korz, L. et al. Simulation-based medical education in Canadian anesthesiology academic institutions: a national survey. Can J Anesth/J Can Anesth (2024). https://doi.org/10.1007/s12630-024-02720-6