To the Editor,

Competence to practice is the ultimate goal for all specialty education and certification programs at the Royal College of Physicians and Surgeons of Canada (RCPSC). The RCPSC created the CanMEDS competency-based framework for educating its Fellows across 69 specialties and subspecialties.1 CanMEDS evolved throughout the 1990s until it reached the current version consisting of seven CanMEDS roles for all competent physicians: medical expert (as a central role), communicator, collaborator, manager, health advocate, scholar, and professional. This framework has become a popular standard for medical education internationally.1

The RCPSC recently mandated that all specialty training programs revise their curricula to align with the CanMEDS competency framework. Most anesthesiology programs structure their daily evaluations according to this framework, and the final in-training evaluation report is mapped to the CanMEDS competencies.2 These CanMEDS-mapped objectives of training are accessible to trainees on the RCPSC website.2

Given that their curriculum is structured to follow the CanMEDS framework, one would expect anesthesiology graduates to possess, if not demonstrate, detailed knowledge of the seven roles by the time they are ready for certification of competence. In 2009, the final written examination for anesthesiology certification introduced a simple short-answer question asking candidates to list the seven CanMEDS roles. The results were rather disappointing: only 70 of the 155 candidates correctly identified all seven CanMEDS roles [mean (SD) 3.7 (2.3) roles]. We recognize that these results do not reflect candidates’ understanding or practice of the CanMEDS competencies but rather their inability to recall the labels assigned to the demonstrable roles.

The content of the examination should reflect the objectives of training in our curriculum. Given that the curriculum is mapped to CanMEDS and that the results of the CanMEDS question on the 2009 written examination were poor, we suggest that the curriculum may not be the only driver of preparation for this examination.

To test this, 174 candidates in the 2010 RCPSC anesthesiology written examination were again asked to list the seven CanMEDS roles. The mean score on this question was significantly higher than in 2009 [mean (SD) 5.8 (1.8); P < 0.0001], and more candidates correctly identified all seven CanMEDS roles. If the examination results reflect the objectives of training, did the curriculum change to produce this improvement? We suggest not.
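The summary statistics reported above are sufficient to reproduce this comparison. The sketch below assumes an unpaired Welch’s t-test on the two cohorts; the test actually used is not stated here, so this is an illustrative check only, built from the published means, standard deviations, and group sizes.

import math
from scipy import stats

# Reported results for the "list the seven CanMEDS roles" question:
# (mean score, standard deviation, number of candidates)
mean_2009, sd_2009, n_2009 = 3.7, 2.3, 155
mean_2010, sd_2010, n_2010 = 5.8, 1.8, 174

# Welch's unpaired t-test computed from summary statistics; the choice
# of test is an assumption, not stated in the letter.
t_stat, p_value = stats.ttest_ind_from_stats(
    mean_2010, sd_2010, n_2010,
    mean_2009, sd_2009, n_2009,
    equal_var=False,
)

se = math.sqrt(sd_2009**2 / n_2009 + sd_2010**2 / n_2010)
print(f"difference = {mean_2010 - mean_2009:.1f} roles (SE {se:.2f})")
print(f"t = {t_stat:.2f}, P = {p_value:.1e}")  # t is about 9.1, P far below 0.0001

Under these assumptions, the 2.1-role improvement is roughly nine standard errors wide, consistent with the reported P < 0.0001.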

Stimulus-response psychology may explain the candidates’ interest in previous examinations. Medical curricula are difficult to change, and reform is often delayed when curricula are challenged by a variety of perceived and unperceived barriers.3 If we assume that the anesthesiology curriculum does not change from one year to the next, perhaps the introduction of new questions to the examination acts as the stimulus for reform. That stimulus reinforces the learning of the specific new content introduced to the assessment each year.4 Both in-training assessment for learning (i.e., formative assessment) and final examination of learning (i.e., summative assessment) have been implicated as driving forces behind the theory of “assessment-driven learning”.5 In essence, every assessment has an effect on learning.

Formative and summative assessments exist to measure competence to practice. Their unintended effects include providing a framework for studying for high-stakes summative assessments. Candidates study more thoughtfully when they anticipate a certain examination format, and changes in either format or content can shift the focus of their learning.6,7 In the case of the RCPSC anesthesiology certification examination, candidates use previously asked questions to guide the consolidation of their knowledge.

Blew et al. documented that creating a valid examination question is very expensive: expertise, time, and effort are needed to generate, edit, and approve each one.8,9 Consequently, in any given year, 50% of the anesthesiology written examination questions are repeated from a reviewed bank of old questions.8 In high-stakes examinations, up to 80% of candidates receive information about the examination from their peers in advance of the test date.10 This may explain the candidates’ collective practice of compiling old questions each year as a major part of their examination preparation. In essence, the summative assessment is shaping a self-directed curriculum that is more specific than the generalized training objectives.

In our view, this example shows that assessment may be driving an “unwritten” anesthesia curriculum. The RCPSC certification examination will continue to serve as a surrogate curriculum for learning.