INTRODUCTION

The Accreditation Council for Graduate Medical Education’s (ACGME) Core Program Requirements mandate that faculty members annually pursue development designed to enhance their educator skills.1 Despite this requirement, the quality of teaching provided by physician educators continues to vary, both from the perspective of residents and according to more objective measures.2,3,4,5 Reported variability spans teaching styles, the expectations attendings hold for residents, and how feedback is delivered.2,3,4 It remains unclear how the numerical scores residents give attendings characterize the quality of physician educators’ skills.

High-quality, engaged precepting by attendings is associated with improved overall resident performance and wellness5,6,7 and has been shown to influence learners’ ultimate choice of medical specialty8. Less often reported is the quality of attending educators according to the ACGME’s six core competencies: (1) patient care, (2) medical knowledge, (3) practice-based learning and improvement, (4) systems-based practice, (5) interpersonal and communication skills, and (6) professionalism9. Guerrero et al. analyzed 1378 responses from residents in 12 different specialties across training years; between 80% and 97% rated their training for the ACGME competencies as adequate, with patient care activities and observations of attending physicians and peers rated as most helpful.10 Lee et al. assessed the effectiveness of a faculty development program designed to build the teaching and assessment skills needed for the ACGME competencies and showed that clinical instructors could successfully apply the skills learned.11

Few studies have focused on identifying and improving the quality of teaching provided by attending physicians during residency education. The Division of Hospital Medicine at Oregon Health & Science University (OHSU) recently transitioned to a Core Competency-based resident assessment of attending physicians. Our study used these assessments to determine the characteristics that residents associate with high teaching quality.

METHODS

Study Setting

OHSU is a 576-bed teaching hospital. Its medical teaching service consisted of 46 attending physicians, all hospitalists who provided care and medical consultation to hospitalized patients, taught residents, conducted research, and co-led the division; however, only 25 attendings met the inclusion criterion of teaching for at least 3 years during the study period. The Internal Medicine (IM) Residency Program includes 111 residents across three years of training, with 104 (93.7%) categorical and 5 (4.5%) preliminary residents. The OHSU IM program includes an inpatient wards rotation that typically spans 3 weeks within a 3+1 schedule.

Instrument Development and Implementation

A 15-item assessment instrument with two to four variables per core ACGME competency was developed in 2015 as part of a larger competency-based redesign of all trainee and faculty assessments. The assessment was adapted from an existing validated evaluation tool for assessing clinical teachers.12 The evaluation is routinely completed by all residents at the conclusion of their inpatient internal medicine rotation at OHSU. The scale contained six response options (1=never/rarely, 2=occasionally, 3=frequently, 4=consistently, 5=exceptional, and N/A). The instrument included space for comments after each competency section. Assessment data were anonymous and captured via MedHub13 between 7/1/2015 and 6/30/2021; faculty names were replaced with a study identifier during analyses. We also sent a five-question survey to attending physicians to characterize their demographics and how long they had been precepting trainees. OHSU’s Institutional Review Board reviewed study activities, which were considered quality improvement efforts and deemed not human subjects research (IRB #25005).
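Because the N/A option must be excluded from all score calculations, the encoding convention matters for reproducibility. Below is a minimal sketch in Python (pandas) of one way the response options could be encoded, with N/A treated as missing; the item responses shown are hypothetical, not actual study data.

    import numpy as np
    import pandas as pd

    # Map the six response options to numeric values; "N/A" is coded
    # as missing so it drops out of means and reliability estimates.
    RESPONSE_SCALE = {
        "never/rarely": 1,
        "occasionally": 2,
        "frequently": 3,
        "consistently": 4,
        "exceptional": 5,
        "N/A": np.nan,
    }

    # Hypothetical responses to a single item:
    item = pd.Series(["exceptional", "consistently", "N/A"]).map(RESPONSE_SCALE)
    print(item.mean())  # 4.5 -- the N/A response is ignored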

Data Analyses

For quantitative analyses, we calculated descriptive statistics for each faculty-educator assessment variable, including frequencies and percentiles. Means, standard deviations, and ranges were calculated for each variable and as summary scores for each core ACGME competency. We then normalized scores to a 0–100 scale to identify which ACGME Core Competencies residents rated highest and lowest. Cronbach’s alpha was calculated to measure the internal consistency of the rating scale for each ACGME core competency. To identify high- and low-performing attendings, we calculated a summary score for each resident assessment and stratified these scores into tertiles of high, medium, and low performance.
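To make the scoring steps concrete, the following is a minimal sketch in Python (pandas) of the normalization and reliability calculations as described above; it assumes the 1–5 ratings are mapped linearly onto 0–100, and the function names and data layout are illustrative rather than our actual analysis code.

    import pandas as pd

    def normalize_0_100(mean_rating: float) -> float:
        # Linearly rescale a mean rating on the 1-5 scale to 0-100
        # (assumed normalization; e.g., a mean of 4.59 maps to ~89.8).
        return (mean_rating - 1) / (5 - 1) * 100

    def cronbach_alpha(items: pd.DataFrame) -> float:
        # Internal consistency for the items of one ACGME competency;
        # columns are items, rows are assessments, and any assessment
        # with a missing (N/A) response is dropped.
        items = items.dropna()
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances / total_variance)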

For qualitative analyses, we used a positive deviance approach, which assists in explaining causes of variation14, posing the research question, “What characteristics distinguish high-quality from low-quality attending physician educators?” We excluded attending faculty who had fewer than 15 resident assessments and retained comments from the highest and lowest tertiles for qualitative analysis. We used classical content analysis15 to analyze residents’ comments, an iterative process of open and axial coding; sharing and discussing the codes in consensus meetings; separating and/or collapsing codes and writing descriptions to characterize themes; and selecting exemplars that best reflected those themes. Because findings were similar across certain competencies, we grouped the ACGME Core Competencies for presentation into (1) Patient Care and Medical Knowledge, (2) Systems-based Practice and Practice-based Learning and Improvement, and (3) Interpersonal and Communication Skills and Professionalism.
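As one illustration of the exclusion, stratification, and retention steps described above, the sketch below (Python/pandas, with hypothetical column names) cuts per-assessment summary scores into tertiles, classifies each attending by the tertile in which they most often scored, and keeps comments only for attendings in the lowest and highest tertiles.

    import pandas as pd

    def select_comments_for_coding(assessments: pd.DataFrame) -> pd.DataFrame:
        # assessments: one row per resident assessment, with hypothetical
        # columns "attending_id", "summary_score", and "comments".

        # Exclude attendings with fewer than 15 resident assessments.
        n = assessments.groupby("attending_id")["summary_score"].transform("size")
        eligible = assessments[n >= 15].copy()

        # Stratify individual assessment summary scores into tertiles.
        eligible["tertile"] = pd.qcut(
            eligible["summary_score"], 3, labels=["low", "medium", "high"])

        # Classify each attending by their most frequent tertile.
        modal = eligible.groupby("attending_id")["tertile"].agg(
            lambda t: t.mode().iloc[0])

        # Retain comments only from the lowest and highest tertiles.
        keep = modal[modal.isin(["low", "high"])].index
        return eligible.loc[eligible["attending_id"].isin(keep),
                            ["attending_id", "comments"]]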

RESULTS

Quantitative Findings

Twenty-five attending educators were assessed by residents during the study period, producing 724 individual attending physician assessments. Four of the 25 attendings had fewer than 15 assessments and were not included in further analysis. Eighteen attendings completed the demographics survey (72.0% response rate). On average, attendings were 44 years old and had been precepting for about 13 years; most were female, white, and non-Hispanic (Table 1). Mean item ratings ranged from 4.10 (SD=1.10; range 1–5) for “Demonstrates incorporation of cost awareness principles” to 4.59 (SD=0.59; range 2–5) for “Displays enthusiasm for teaching” (Table 2). By ACGME core competency, attending educators were rated highest for Interpersonal and Communication Skills (normalized score=90.1), closely followed by Medical Knowledge and Professionalism (normalized scores=90.0), and lowest for Systems-based Practice (normalized score=85.2) (Table 2).

Table 1 Characteristics of Attending Physician Educator Participants
Table 2 Assessment Ratings for Variables According to ACGME Core Competency (n=724 Assessments for 25 Physician Attending Educators)

The number of resident assessments for the remaining 21 faculty ranged from 15 to 63 (mean=33.8). All but one attending had assessments that spanned all three tertiles (Table 3). The summary mean score in the lowest tertile was 58 (SD=5.0; range=37–62); attendings #1, #7, #8, #16, #19, #22, and #23 (n=7) most often scored in this tertile (Table 3). The summary mean score in the middle tertile was 67.4 (SD=2.6; range=63–72), which included attendings #5, #6, #10, #11, and #25 (n=5). The summary mean score in the highest tertile was 74.4 (SD=0.92; range=73–75), which included attendings #3, #4, #13, #14, #15, #17, #18, #20, and #21 (n=9). Based on these findings, qualitative comments for the 16 physician attending educators in the first and third tertiles were included in analyses.

Table 3 Tertile Determination for Positive Deviance Qualitative Analysis Approach

Qualitative Findings

Under Patient Care and Medical Knowledge, six themes emerged as characteristics of high-quality attendings: balance (e.g., balancing supervision and autonomy); role modeling; engaging, or knowing when and how to attract and involve learners; availability to learners and team members; compassion toward trainees as well as patients and families; and excellent teaching (Table 4).

Table 4 Emergent Themes on High-Performing Faculty Educators as Assessed by Internal Medicine Residents

Under Systems-based Practice (SBP) and Practice-based Learning and Improvement (PBLI), two themes emerged. First was guided coordination of patient care involving multiple team members or complex medical and psychosocial issues. Second was the ability to both deliver and receive meaningful, actionable feedback.

One emergent theme reflecting low-quality attending physician educators for Patient Care and Medical Knowledge (PC&MK) was inefficiency on rounds, which caused stress and kept learners from completing their work in a timely manner (Table 5). Under SBP and PBLI, two themes emerged. First was team-based communication, or a desire for interactions that build better connections between attendings and the care team. Second was role on rounds, or the sense that attending leadership and resident roles during rounds could be improved. Lastly, under Interpersonal and Communication Skills and Professionalism (ICS&P), a single theme emerged: a desire for more feedback to aid resident development.

Table 5 Emergent Themes on Low-Performing Faculty Educators as Assessed by Internal Medicine Residents

DISCUSSION

Our study is novel in that we reviewed 724 individual attending physician assessments using a mixed-methods approach to characterize high- and low-quality physician attending educators. Interestingly, the majority of attendings scored across all three tertiles, and no attending scored solely in the top or bottom tertile. Average assessment scores fell within a relatively narrow band, from 4.10 to 4.59 on a scale of 1 to 5. This narrow range may be attributed to several factors. There may have been social response bias among residents, leading to consistently positive assessments. Resident evaluators may have held preconceived positive expectations about attending physicians, stemming from previous experiences, reputation, or general perceptions, that influenced their evaluations. The evaluation instrument itself may not have been sensitive enough to capture subtle variations in attending performance; we plan to continue refining and validating our evaluation tools to ensure they accurately reflect the diverse aspects of performance. Residents may also feel pressure to provide positive evaluations to avoid potential conflicts or repercussions, leading to artificially high scores. Our quantitative data showed that interpersonal and communication skills were rated highest and systems-based practice lowest. This may reflect confusion about SBP and/or suggest that communication skills are easier to evaluate or more important to residents.

Qualitative data suggested that high-quality attending physicians allow residents autonomy to lead the team, which is important for confidence building yet challenging to balance with being supportive and approachable. Other qualities included a calming presence with the team and patients, and empathy during sensitive discussions. Existing literature on physician role modeling found that “teacher/supervisor” role modeling was closely associated with professional attitudes toward residents, providing feedback, and shaping the learning climate.16 We found that high-quality educators had an impressive knowledge base and were willing to share medical and practical knowledge, which was especially well received when delivered in an engaging manner. Valuable practical knowledge included navigating hospital systems, optimizing care resources, and communicating effectively and professionally with the multidisciplinary care team. Providing feedback was also a strength among high-performing attendings: residents expressed appreciation for frequent check-ins followed by actionable feedback tailored to the learner, with positive reinforcement when appropriate.

A number of recurring themes emerged among faculty educators with lower ratings. Inefficiency was one: evaluations noted that prolonged rounding or protracted teaching points delayed patient care and created stress for residents attending to other commitments. Poor time management has also been noted in prior studies examining narrative feedback on clinical teachers.17 Communication was another theme, especially regarding patient care, with several residents voicing a preference for direct communication over attendings placing orders without notifying the team or communicating through the chart. Given that residents value autonomy, it is not surprising that attendings who deprived learners of the opportunity to formulate a differential diagnosis or communicate daily plans to patients scored lower. Attendings who did not seem invested or did not provide assistance when the team was struggling were also rated lower. Residents also voiced a desire for more frequent and direct feedback on performance, indicating that while feedback was a strength for some faculty, it remains a weakness for others. Unfortunately, it is unclear how best to provide frequent feedback, a finding that has also been reported elsewhere.17,18

While resident perceptions of the quality of attending physician educators are valuable, several caveats apply when interpreting assessment data. Assessments typically contain personal or social biases that influence the review process. Although resident responses were anonymous, the complicated power dynamics of rating a supervisor may limit negative feedback, for fear that it will affect future interactions with attendings. Similarly, implicit biases related to gender, race, and age may influence assessment language and scoring. Residents often lack training or experience in providing high-quality assessments and may complete them weeks to months after interacting with attendings, limiting their depth and scope. Thus, resident assessments must be interpreted with caution and combined with other feedback sources to fully characterize attending educator quality.

Utilizing a mixed-methods approach allowed for both qualitative and quantitative analyses, providing breadth and depth to this study. The positive deviance approach to identifying the highest- and lowest-rated attendings provided a framework, organized around the core ACGME competencies, that can inform future educational strategies to help low performers improve.

This study was conducted at a single institution with an institution-specific evaluation form, which may limit generalizability. However, the analysis included over 700 assessments collected over six years, and the assessment tool was highly reliable as a measure of the ACGME core competencies (Cronbach’s alpha=0.92).

Future research should explore how to improve evaluative processes and clinical teaching effectiveness. More robust evaluative processes would allow for objective comparison of attendings across programs and institutions. Future research on multi-source feedback, including peer evaluations, learning specialist reviews, patient comments, and self-assessments, could identify new areas for improvement. Research should also explore factors influencing attending physicians’ receptiveness to feedback and strategies to promote a culture of continuous improvement. As technology advances, opportunities will emerge to develop innovative tools, including simulation-based assessments, machine learning algorithms, and real-time feedback systems.