INTRODUCTION

In 1999, the Accreditation Council for Graduate Medical Education (ACGME) instituted a radical change in the way it accredited residency programs in the United States. It defined 6 competency goals that all residents should meet, 2 of which—Systems-Based Practice (SBP) and Practice-Based Learning and Improvement (PBLI)1—reflected the ACGME’s conviction that medical graduates need to understand the system in which they practice and to apply systematic methods to identify and address system problems.

When faced with integrating new curricular content into any training program, educators may encounter several impediments: programs may already be saturated with equally important content; stakeholders may not understand the new content or perceive its importance; and there may be a shortage of faculty experts to teach the new content.2,3

To overcome these problems, we designed a program, Achieving Competence Today (ACT), to help residency programs teach SBP/PBLI and develop new SBP/PBLI curricula. The design of ACT addresses these challenges by: (a) creating a self-directed, web-based curriculum that eliminates the need for expert faculty while also training faculty; (b) developing learning activities to be carried out in the local clinical setting with learners’ own patients; and (c) engaging learners in teaching and curriculum development in collaboration with their faculty. At the same time, we encouraged ACT participants to adapt the materials flexibly to their own local systems and individual needs and interests. We hoped that residents would find the activities engaging and important to their care of patients, that their work would stimulate interest in SBP/PBLI among other residents and faculty, and that their teaching and collaboration with preceptors on development of a curriculum would reinforce their learning as well as jump-start the change process.

In this paper, we describe the ACT program and report the extent to which it changed residents’ and faculty members’ knowledge of, sense of competency in, and attitude toward SBP/PBLI training. We also examine the elements of ACT that residency training directors believed best facilitated residents’ learning.

METHODS

A Guided Curriculum for a Clinical Elective in SBP/PBLI

The 4-week course in SBP and PBLI engaged the residents in active, self-directed learning and real work. We designed the curriculum centrally and delivered it via the web, but the residents’ experience took place in their own hospitals and local health systems. Moreover, we designed the curriculum so that residents could undertake this intensive experience with no requirement for content expertise on the part of their faculty, and so that the course would promote faculty–resident colearning, when needed. The web-delivered materials included basic information, readings, web links, cases, and exercises, as well as detailed step-by-step instructions for the exercises and for the work products to be submitted to preceptors and the ACT office.

We designed the content of the course to cover all aspects of the 2 domains, as defined by ACGME1. Central to the course were exercises that allowed residents to follow the flow of money through the system to illustrate the complex relationships among patients, providers, medical centers, insurers, and purchasers (Table 1). These exercises introduced the residents to basic principles of health care economics and financing, including the economic levers held by each stakeholder in their system, provided them with a broad foundation in systems and quality improvement, and encouraged them to think about problems from a systems perspective.

Table 1 The 4-week Curriculum

The course immersed the residents in practical experiences within their local hospital to help them understand systems of care and to develop skills in practice improvement. Residents identified a system problem within their institution as well as a patient who exemplified that problem. They then applied a variety of tools or strategies (e.g., root cause analysis, the Plan-Do-Study-Act (PDSA) cycle, effort-yield tables, and run charts) that helped them understand system complexity and how to identify and capitalize upon opportunities for improvement. By using a patient as a lens through which to view a system problem, residents took an approach to change that was patient-centered as well as methodical and analytical.
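
To make one of these tools concrete, the sketch below shows a minimal run chart in Python, plotting hypothetical weekly counts of a tracked system problem against the median. The curriculum did not prescribe any particular software, so the code and data here are purely illustrative.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical data: weekly count of the tracked system problem over 12 weeks.
weeks = np.arange(1, 13)
events = np.array([9, 8, 10, 7, 8, 6, 7, 5, 6, 4, 5, 4])

# A run chart is simply the measure over time with a median reference line.
plt.plot(weeks, events, marker="o")
plt.axhline(np.median(events), linestyle="--",
            label=f"median = {np.median(events):.1f}")
plt.xlabel("Week")
plt.ylabel("Events per week")
plt.title("Run chart of a tracked system problem (illustrative)")
plt.legend()
plt.show()
```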

Last, to reinforce learning, residents taught other residents some aspect of SBP/PBLI in the months after the course.

Evaluation

ACT was a 2-year initiative, repeated for 2 cohorts of residents. In the hope of creating a national model for change, we identified 2 groups of residency training directors who were influential nationally. First, we identified top-tier Internal Medicine residency programs through a nomination and snowballing method. Specifically, we invited directors of 2 top-rated residency programs in Boston to nominate their top competitors. We then asked these competitors the same question, ending with a list of 36 programs. We invited all 36 programs to apply for an ACT grant. We also invited 50 program directors involved in Partnerships for Quality Education initiatives to apply.

From among these, 18 programs (11 from the first group and 7 from the second) submitted proposals that met our criteria. Criteria for inclusion were that the program directors and hospital leaders were committed to finding a way to meet the ACGME requirements comprehensively; that the environment supported the project (internally within departments and hospitals and externally among insurers); and that residents, faculty, curriculum time, and technical support were immediately available.

The local principal investigators (PIs) invited their residents to take the course during an elective period in fall 2003 or 2004, and then selected a convenience sample of 2 or more second- or third-year residents from among the volunteers. In total, 78 residents participated in ACT and 72 of their peers served as controls. The study was granted exempt status by the Institutional Review Board at Harvard Pilgrim Health Care.

We evaluated the effectiveness of the ACT program in increasing residents’ and preceptors’ knowledge of SBP/PBLI, sense of competency in SBP/PBLI skills, and attitude toward the value of learning SBP/PBLI during residency. We sought to answer the following:

  1. Did residents and preceptors (a) gain knowledge about SBP/PBLI, (b) change attitudes toward the value of learning SBP/PBLI during residency, and (c) perceive themselves to be more competent in SBP/PBLI after participating in ACT?

  2. How did residents’ prior experiences with aspects of SBP/PBLI affect their knowledge, attitudes, and sense of competency?

  3. How did the extent of residents’ involvement in ACT, as evidenced by the number of deliverables completed, affect their knowledge, attitudes, and sense of competency?

  4. What aspects of ACT did PIs consider most effective in enhancing residents’ learning?

Measures

We used a before–after cross-comparison of participating residents (n = 78) with their peers within the same residency programs (n = 72). We used a before–after comparison of participating faculty (n = 42) with no control group. All groups—ACT residents, controls, and faculty preceptors—responded to the following surveys before the 4-week course.

  (a) 8-item survey of prior experiences with SBP/PBLI-related activities (developed by the authors);

  (b) 50-item test of knowledge (developed by the Tufts Health Care Institute (THCI) in consultation with the authors). During field testing in April 2003, THCI administered the instrument to a sample of residents at its own institution. In analyses of 200 completed assessments, THCI examined 2 psychometric properties: internal consistency and test–retest reliability. Internal consistency was respectable, with a Cronbach’s alpha coefficient of .75. For a subsample of 15 respondents, a test–retest reliability coefficient of .89 (p < .01) was obtained (a computational sketch of these reliability checks appears after this list);

  (c) 15-item self-assessment of competency in SBP/PBLI (“Considering your prior experiences, rate your current degree of competence in PBLI and SBP.” Scale: 1 = not competent to 5 = highly competent);

  (d) 19-item survey of attitudes toward residents’ learning SBP/PBLI (“Rate each competency in terms of how important it is for Internal Medicine residents to achieve competency in the area before completion of residency.” Scale: 1 = not important to 5 = essential). (Both (c) and (d) were adapted from a survey created by Yedidia et al.4)
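
The reliability statistics reported in (b) can be reproduced computationally. The following is a minimal sketch in Python using hypothetical item-level responses; THCI’s actual data and test items are not reproduced here, so the printed values will not match the reported coefficients.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]                         # number of items (50 in the THCI test)
    item_vars = items.var(axis=0, ddof=1)      # per-item variance across respondents
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical data: 200 respondents, 50 dichotomously scored items.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(200, 50)).astype(float)
print(f"internal consistency: alpha = {cronbach_alpha(responses):.2f}")

# Test-retest reliability: Pearson correlation of total scores across
# 2 administrations (here, a simulated second sitting for 15 respondents).
retest = np.clip(responses[:15] + rng.normal(0, 0.3, size=(15, 50)), 0, 1)
r = np.corrcoef(responses[:15].sum(axis=1), retest.sum(axis=1))[0, 1]
print(f"test-retest: r = {r:.2f}")
```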

All measurement tools were content valid in that their items paralleled the program content, which paralleled ACGME’s defined SBP/PBLI subcompetencies. (An example of a subcompetency is “Analyze practice experience and perform practice-based improvement activities using a systematic methodology.” The ACGME website provides complete definitions1.) With the exception of THCI’s knowledge test, surveys are available upon request.

At the end of the 2003–2004 academic year (6 months or more after the elective), participating and control residents again took the knowledge test, assessed their own competency, and responded to the attitude survey. This timing allowed us to assess the impact of the entire program, not just the 4-week elective. This procedure was repeated for the 2004–2005 cohort of ACT and comparison residents. Faculty responded to postprogram surveys at the end of the second year, or at the end of the first year if they handed off precepting responsibilities to a colleague. We also recorded the number of deliverables (0–4) that each ACT resident submitted.

Last, at the end of the ACT program, PIs identified the aspects of ACT that were most and least effective in enhancing residents’ learning by: (a) rating the effectiveness of each component on a scale of 1–10; and (b) noting which aspect was most effective and which was least. To test our assumption that faculty lacked expertise in SBP/PBLI, we asked PIs to rate 5 factors (1–10 scale), including “lack of faculty trained in systems and practice improvement,” as barriers to curricular change (the other factors were the limited time that could be freed up for the residents, lack of institutional support, lack of resident interest, and the quality of the curriculum).

Analyses

Descriptive statistics (frequencies, means, medians, and standard deviations) were computed after each administration of the surveys; change scores between pre- and posttests were also computed annually. Finding no significant differences between cohorts of residents, we combined groups.

The major outcome variables were the self-assessed competence, knowledge, and attitude scores, with all 3 scores normalized to a 100-point scale and treated as continuous. Missing values were replaced with mean values. Mean scores for the aggregate ACT intervention residents and the aggregate control residents were compared using Student’s t test. In addition, change scores were computed from pre- and posttest results and compared using Student’s t test. Pearson correlation coefficients were computed to determine relationships among the 3 outcome measures.
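
As an illustration only, the sketch below shows in Python how such a pipeline could be assembled: rescaling to a 100-point scale, mean imputation of missing values, Student’s t tests on change scores, and a Pearson correlation. The data and variable names are hypothetical; this is not the study’s actual analysis code.

```python
import numpy as np
from scipy import stats

def normalize_100(raw: np.ndarray, max_raw: float) -> np.ndarray:
    """Rescale raw scores to a 100-point scale."""
    return 100.0 * raw / max_raw

def impute_mean(scores: np.ndarray) -> np.ndarray:
    """Replace missing values (NaN) with the mean of observed values."""
    return np.where(np.isnan(scores), np.nanmean(scores), scores)

# Hypothetical pre/post knowledge scores, already on the 100-point scale.
rng = np.random.default_rng(1)
act_pre,  act_post  = rng.normal(60, 10, 78), rng.normal(64, 10, 78)
ctrl_pre, ctrl_post = rng.normal(55, 10, 72), rng.normal(53, 10, 72)

# Compare change scores across groups with an independent-samples t test.
act_change, ctrl_change = act_post - act_pre, ctrl_post - ctrl_pre
t, p = stats.ttest_ind(act_change, ctrl_change)
print(f"change-score comparison: t = {t:.2f}, p = {p:.4f}")

# Pearson correlation between 2 outcome measures (e.g., attitude vs competency).
attitude, competency = rng.normal(70, 12, 78), rng.normal(65, 12, 78)
r, p_r = stats.pearsonr(attitude, competency)
print(f"attitude vs competency: r = {r:.2f}, p = {p_r:.4f}")
```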

RESULTS

A total of 150 Internal Medicine residents participated in the evaluation of ACT over the 2-year period: 78 residents participated in the ACT program, 37 in 2003–2004 and 41 in 2004–2005; 72 residents served as controls, 36 in each cohort. Forty-two faculty members served as preceptors. Response rates for the online assessments were high in the preintervention phase: 90% of the ACT and control residents and 83% of faculty completed the attitude and self-assessed competence surveys; 80% of the ACT residents, 66% of the controls and 88% of faculty completed the test of knowledge. In the postintervention phase, 80% of the ACT residents completed the attitude and self-assessed competency surveys and 60% completed the knowledge survey. Among control residents, 61% completed the attitude and self-assessed competence surveys and 38% completed the knowledge survey. Forty-three percent of faculty completed postprogram attitude and competency surveys; 69% completed the knowledge survey.

Knowledge, skill, and attitude

Across all 3 outcome measures, before the intervention ACT and control residents differed significantly only in knowledge (Table 2). After ACT, knowledge, self-assessed competency, and attitudes were all significantly higher among participants than controls. However, when change scores were computed, ACT residents’ gains in knowledge (4.4) and self-assessed competency (11.3) remained greater than controls’ (−1.9 and −8.0), but their gains in attitude were no longer significantly different from controls’ (ACT = 7.2, controls = 2.6, p > .05). ACT residents’ gains in self-assessed competency were greater than their gains in knowledge.

Table 2 Residents’ Self-assessed Competence, Knowledge, and Attitude Scores

Based on post-ACT scores, the correlation between ACT residents’ attitudes and self-assessed competency was r = .53 (p < .0001); the correlation between controls’ attitudes and self-assessed competency was not significant (r = 0.22, p = .07). There was no significant correlation between residents’ knowledge and either attitudes or self-assessed competency (data not tabled). Based on an analysis of ACT residents’ change scores, the number of deliverables completed had no significant effect on any of the 3 outcome measures. Finally, we examined whether prior SBP/PBLI experience accounted for significant changes and found no significant difference between ACT and control residents on any outcome measure (p values for all effect estimates were >.60).

The mean faculty scores on the preprogram tests of knowledge and self-assessed competency were moderate and did not change significantly by the end of the program (Table 3). Attitudes, however, did become significantly more positive over time (54.0 pre-ACT vs 69.8 post-ACT, p < .001). Lack of faculty expertise was confirmed through PI ratings of barriers to change: a majority of the PIs believed that a lack of faculty expertise in the 2 domains had been an important barrier to curricular change (median rating = 8.0); in fact, 57% of them rated this as the most important barrier to change.

Table 3 Preceptors’ Knowledge, Self-assessed Competence and Attitude Scores

PIs’ perceptions of ACT

PIs rated 7 of 8 components of ACT as effective methods to enhance residents’ learning, with mean ratings of 7.0 or higher on a 10-point scale (Table 4). PIs found the ready-to-use curriculum particularly effective (mean rating 8.5). As the ratings of the most and least effective components of ACT demonstrate, PIs varied in terms of which aspects of the program worked best for their residents. For example, some considered precepting to be the most effective factor in promoting learning, whereas others felt it was the least effective. In open-ended comments, PIs reported that almost all residents found the program valuable, but some residents found it frustrating to plan a quality improvement project without carrying it out.

Table 4 Principal Investigator Ratings of Effectiveness of ACT Components

DISCUSSION

Achieving Competence Today successfully provided residency programs a framework and a curriculum for teaching systems-based practice and practice-based learning and improvement. As intended, PIs and participating residents used the materials flexibly, adapting them to their own environments. On 2 independent measures, ACT residents made significant gains: they improved their knowledge of the 2 domains and assessed their competency at a higher level than at baseline. ACT residents and controls both expressed positive attitudes toward learning SBP/PBLI during residency training.

ACT helped programs begin to teach SBP/PBLI. Between the time of the ACGME directive and the implementation of ACT, medical educators had made little progress in developing curricula to teach SBP/PBLI.2 Many had been hampered by a lack of understanding of the domains, lack of time, and a shortage of faculty content experts.5,6 We believe that the combination of 3 factors enabled ACT residency programs to make progress: first, an easily imported, ready-made design to overcome the high barrier of creating a program where both the director’s time and expertise were limited; second, the use of a self-directed course, which did not require a faculty with content expertise; and third, the active engagement of learners in real work growing from a locally implemented experience, which created relevance and engagement similar to that of a clinical rotation.

As demonstrated elsewhere, residency program directors welcome ready-made programs that allow them to build on others’ expertise.7 Before ACT, such shared materials were limited to lists of learning objectives, general suggestions of teaching activities and assessment tools.7,8 ACGME has now provided an instructional toolbox.9 Recent publications offer a variety of successful approaches to teaching discrete SBP/PBLI subcompetencies (e.g., quality improvement projects,10–13 journal clubs,14 group analyses of errors,15 web-based instruction16,17). In contrast, ACT provided a comprehensive curriculum with extensive teaching materials and guidelines, which reduced the effort required of program directors and helped overcome their perceived barrier of lack of faculty expertise. We would also note that ACT residents’ quality improvement projects and curriculum were invariably well designed and useful to the institution, ranging from a check-in redesign that reduced delays in patients’ arrival to a method that allows an entire residency program to meet the 80-hour work week requirement.

ACT was educationally robust. We based the instructional design and content on the principles of active, adult learning. Competency is most likely to be achieved when education is grounded in real-life experiences, stimulates active, hands-on learning, and fosters self-assessment and independence.5 In a review of methods used to teach aspects of PBLI, the authors concluded that the development of competence in PBLI is “a skill-based activity with important theoretical and methodological foundations.”5

There is evidence that competence is founded on both content knowledge and practice.18 Therefore, we constructed a curriculum that introduced residents to core concepts around health care systems, health economics, and quality improvement, and included exercises such as interviewing various stakeholders in the local system. Based on assumptions about adult learners, we took a patient-centered perspective on systems improvement to gain residents’ interest and to link their ACT work directly to their day-to-day clinical responsibilities.19 Follow-up activities such as teaching others (a PBLI1 subcompetency) served to reinforce learning and maintain residents’ interest and commitment.

This study has several limitations. First, although the test of knowledge had both content and discriminant validity with respect to the formal ACT curriculum, the variation in residents’ experiences may be reflected in the modest gains in knowledge. Second, selection bias was possible, both among the intervention residents and in the assumption that peer controls were well matched. We do not know why ACT residents’ preprogram knowledge scores were higher than controls’, especially as their prior experiences with SBP/PBLI-related tasks did not differ. In addition, although response rates to most surveys were acceptable, the proportion of control residents who took the postprogram knowledge test was low. Finally, turnover of preceptors between cohorts and faculty’s low response rates to both sets of measures undermined the program design and our assessment of the degree to which inexpert faculty could learn alongside residents. Moreover, because we had no faculty control group, faculty’s increasingly positive attitudes, like residents’, might have been no greater than those of unmeasured peers.

In an era when safety and outstanding performance are expected of our teaching institutions, academic health centers should be doing more than just getting by in the teaching of SBP/PBLI. We believe that the need to create improved systems of care is so great that devoting 1 elective—3% of total training time—to the topic is as important as any other advanced training. ACT is an adaptable model for training residents in SBP/PBLI, and is now available on CD-ROM (ACT2@harvardpilgrim.org). It can be scaled up for use with all residents because it is a prepackaged curriculum and does not depend on what is usually a major barrier to change, namely, a lack of faculty expertise in the given area.