INTRODUCTION

The educational environment for house staff has evolved substantially since the Accreditation Council for Graduate Medical Education (ACGME) first limited duty hours in 2003.1 A 2003 systematic review of the literature on house staff work demonstrated that house staff spent 36 % of their time on patient care and 35 % in activities of marginal educational value; 15 % was considered to be a teaching or learning activity.2 The average work week in that study was 84 hours.2

Since then, house staff appear to spend fewer hours in the hospital,3,4 consistent with the ACGME rules. However, there are limited data to inform our understanding of how the 2003 duty hour rules have changed the actual composition of their work. A national survey of internal medicine house staff concluded that the majority spent more than 4 hours per day on documentation tasks, more time than they spent in face-to-face patient care.5 As the authors note, it is unclear how much of this documentation time has educational value. Another multi-site study revealed that most house staff spend less than 1 hour per day on independent reading,6 a finding unchanged from the pre-2003 time period.7

Such studies raise the question of what the appropriate allocation of house staff time should be, an issue that is important but not well described in the literature.8 Schwartz et al. describe a pie chart model of the components of house staff work, but they do not assign an ideal time allocation to those components.8 The pie pieces are protected sleep, rest on call, patient care with little educational value, patient care with high educational value, education without patient care, and administrative tasks. An important idea in this model is the overlap between education and patient care, which is corroborated in a position paper from the Association of Program Directors in Internal Medicine (APDIM).9 Building on these ideas, a conceptual model of house staff time allocation should devote the largest proportion of time to patient care, with priority placed on patient care of high educational value. This time could be spent in formal education directly related to patient care, or in independent learning directly related to patient care (akin to Schwartz’s patient care with high educational value). Other learning and other patient care would occur outside of this overlap, as would administrative work and rest/downtime.

Eight years later, house staff work schedules have again been revised, as the newest iteration of the duty hour rules went into effect in July 2011. It is therefore imperative that we understand the details of house staff work as they pertain to these issues. While we wait for new time-motion studies from the post-2011 era, we have been working with data from the intermediate period (2003–2011) to consider how the house staff experience might be reworked to maximize patient care-related education. Because on-call periods are rich with opportunities to spend time with patients and to learn experientially about their medical problems, we chose to focus on these periods. Therefore, the aim of this project was to determine how interns in 2010 spent their time on call.

METHODS

Study Design

This was a prospective observational time-motion study.

Setting

This study was approved by the Milwaukee VA Medical Center (VAMC) Institutional Review Board. During the study, the Milwaukee VAMC had four general internal medicine ward teams staffed by residents and interns from the Medical College of Wisconsin’s internal medicine residency program. Each team was typically composed of one 2nd or 3rd year resident, two interns, and three medical students. Teams admitted patients from approximately 3:00 PM to 7:00 AM every fourth day. The house staff typically evaluated and initiated the work-ups for the newly admitted patients without direct attending supervision. On these “call nights,” the admitting team also covered the patients of the other three teams, which were not in the hospital overnight. This structure is similar to that found in many academic hospitals that do not utilize a night float system. The study was conducted between May and October 2010.

Participants

Interns who were rotating on the general medicine ward services during the study period (May-October 2010) were eligible for inclusion. Those who were willing to participate provided written informed consent, and were given a $20 gift card. There were no exclusion criteria. Each participant was shadowed on only one call night.

Data Collection Procedures

Trained research assistants (RAs) shadowed interns while they were on call. Data collection began at approximately 1:00 PM, before the intern would typically begin to accept new admissions (usually 3:00 PM on weekdays), and ended at 5:00 AM the following day, by which time most admitting was finished. The RA followed the participant throughout the hospital and accompanied the intern into direct patient encounters, but left for sensitive parts of the encounter (such as the physical examination). The RA did not observe the intern during sleep periods.

The RA operated a data-logging laptop computer running custom data collection software that has been refined over the last 14 years and has been demonstrated to be effective at measuring the effects of a variety of performance-shaping factors.10–13 The software contained all possible tasks performed by the intern, based on a pilot study14 (task list available online). The observer recorded each of the intern’s tasks by clicking the appropriate task button on the computer display (Fig. 1). Each individual task, its task group, and the event markers (specific events, such as patient #1 care start/end or attending absent/present, that allow tasks to be associated with those events) were rigorously defined. For example, there were six different tasks that could be recorded as a teaching/learning task, and each was defined separately. The teaching/training task was defined as “clinical teaching or training others,” and the supervising task was defined as “when a subject observes and supervises/oversees/watches someone else conducting patient care activities (e.g. performing a procedure, conducting a physical exam, taking a patient history). This does not include instances when a subject is teaching another clinician.”

Figure 1. Screen shot of the software used by observers to record tasks. Note that the tasks are categorized for ease of observer recording.
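As a rough illustration of how such a categorized task list and its event markers might be organized, the sketch below uses hypothetical group names, task names, and markers; the study’s actual task list and definitions (available online) differ.

```python
# Hypothetical sketch of a categorized task list with event markers for an
# observation tool of this kind; not the study's actual task list.
TASK_GROUPS = {
    "clinical computer work": ["write notes", "write orders", "review documents"],
    "communication":          ["talk with team", "talk with nurse", "sign-out discussion"],
    "direct patient care":    ["history/physical exam", "procedure", "supervising"],
    "teaching/learning":      ["teaching/training", "clinical reading"],
}

# Event markers bracket stretches of work so that tasks can be associated
# with a specific patient or with attending presence/absence.
EVENT_MARKERS = [
    "patient #1 care start", "patient #1 care end",
    "attending present", "attending absent",
]

# Each task carries a rigorous definition so that observers code consistently;
# the two definitions below are paraphrased from the study.
DEFINITIONS = {
    "teaching/training": "clinical teaching or training others",
    "supervising": ("observing and overseeing someone else conducting patient care "
                    "activities; does not include teaching another clinician"),
}
```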

The software automatically logged the time each task was initiated. When interns performed two or more tasks simultaneously, the observer held down the “command” key to mark the multi-tasking period.11,12,15 To account for the time spent on each simultaneously performed task (i.e., each task within the multi-tasking markers), the total time for the case (the denominator used for the percentage-of-time calculations) was extended by the duration of the overlapping tasks. In addition, the observational software included an annotation feature that allowed observers to write brief notes about their observations and the data. These annotations allowed observers to correct task entry errors (e.g., move a task back 5 seconds if it was selected late), to capture any observed user/subject errors and inefficiencies, and to place the task and event data in the proper clinical context. The RAs asked for clarification as needed.
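The effect of this denominator extension on the percentage-of-time metric can be shown with a short sketch. The interval format, task names, and times below are hypothetical, and Python is used purely for illustration; this is not the observation software itself.

```python
# Minimal sketch of the percentage-of-time calculation, assuming a hypothetical
# log of (task, start, end, multitasked) intervals. Illustration only.
intervals = [
    ("write notes",        "13:00", "13:25", False),
    ("review documents",   "13:25", "13:40", False),
    ("phone call - nurse", "13:30", "13:40", True),   # overlaps the chart review
    ("write orders",       "13:40", "14:00", False),
]

def minutes(t):
    h, m = map(int, t.split(":"))
    return 60 * h + m

durations = [(task, minutes(end) - minutes(start), multi)
             for task, start, end, multi in intervals]

# Wall-clock span of the observation (13:00 to 14:00 here).
span = minutes(intervals[-1][2]) - minutes(intervals[0][1])

# The denominator is extended by the duration of the multi-tasked work, so that
# each simultaneous task is fully credited and the percentages still sum to 100.
overlap = sum(d for _, d, multi in durations if multi)
denominator = span + overlap   # 60 + 10 = 70 minutes

for task, d, _ in durations:
    print(f"{task:20s} {d:3d} min  {100 * d / denominator:5.1f} %")
# write notes           25 min   35.7 %
# review documents      15 min   21.4 %
# phone call - nurse    10 min   14.3 %
# write orders          20 min   28.6 %
```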

The RAs underwent extensive training to become accurate and reliable observers. This training consisted of observing team rounds, observing doctors on call, and 2-hour practice sessions during which they observed a doctor working on the wards. The RAs were specifically trained not to engage with the teams during the observation period and were taught to see themselves as “flies on the wall,” in order to minimize the study’s influence on the interns’ work. We did not share the task lists with the interns prior to the study, to reduce the likelihood that interns would perform certain tasks because they thought the study team wanted them to do so. During task list development, pilot observations, and RA training, the team reached consensus on how to categorize the full spectrum of clinical situations and tasks. The RAs worked in overlapping shifts of 4–6 hours.

Physician Demographics

Upon enrollment, interns filled out a brief demographic survey that included age, sex and number of months in training.

Data Analysis

The RAs saved the data from each observation shift in a text file. Each line in the text file contained the task and its start time; time-stamped annotations also appeared in the text files. The data from each shift were pasted into Excel files so that the task times flowed continuously. For periods of overlap between RA shifts, we used the data from the observer who was beginning their observation shift. The data in the Excel files were then audited and cleaned. We used a custom data analysis program written in Visual Basic for Applications (Microsoft Corp., Redmond, WA) to analyze each observation log. For each consented intern, the program automatically calculated and collated the minutes and percentage of time spent on each task and task category, the number of times each task was observed (task occurrence), and the duration of each occurrence (task duration). The times for all occurrences of each task were summed to generate the total time spent on that task for each subject. We combined the tasks into work categories reached by consensus within the study team: clinical computer work, non-patient communication, direct patient care, downtime, transit, and teaching/learning. We then summed the time each intern spent on the tasks assigned to each work category and calculated descriptive statistics across cases, including means and standard deviations of the time spent on each task and work category. To compare the amount of time spent on the three most common work categories (clinical computer work, non-patient communication, and direct patient care), we used repeated measures analysis of variance (ANOVA) to account for within-subject correlation, with subsequent pairwise t-tests to further examine significant results. All tests were two-tailed, with significance at the p < 0.05 level.
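To make these aggregation and comparison steps concrete, the sketch below re-implements them in Python with hypothetical interns, tasks, category assignments, and minute values; the study itself used a custom Visual Basic for Applications program and the full task-to-category mapping described above.

```python
# Illustrative re-implementation of the aggregation and comparison steps
# (the study used a custom VBA program); all data below are hypothetical.
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# One row per intern-task total: intern id, task, total minutes on that task.
task_minutes = pd.DataFrame({
    "intern":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "task":    ["write notes", "talk to nurse", "bedside exam"] * 3,
    "minutes": [320, 240, 110, 360, 270, 95, 350, 255, 100],
})

# Map each task to its consensus work category.
category_of = {
    "write notes":   "clinical computer work",
    "talk to nurse": "non-patient communication",
    "bedside exam":  "direct patient care",
}
task_minutes["category"] = task_minutes["task"].map(category_of)

# Total minutes per intern per work category (summing the assigned tasks).
per_intern = (task_minutes
              .groupby(["intern", "category"], as_index=False)["minutes"]
              .sum())

# Descriptive statistics (mean, SD) across interns for each category.
print(per_intern.groupby("category")["minutes"].agg(["mean", "std"]))

# Repeated measures ANOVA with work category as the within-subject factor.
print(AnovaRM(per_intern, depvar="minutes", subject="intern",
              within=["category"]).fit())

# Post-hoc pairwise paired t-tests between the three categories.
wide = per_intern.pivot(index="intern", columns="category", values="minutes")
categories = list(wide.columns)
for i in range(len(categories)):
    for j in range(i + 1, len(categories)):
        t, p = stats.ttest_rel(wide[categories[i]], wide[categories[j]])
        print(f"{categories[i]} vs {categories[j]}: t = {t:.2f}, p = {p:.4f}")
```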

RESULTS

Twenty-five of 36 eligible interns (69 %) participated; the other 11 declined. Eleven (44 %) were men and 14 (56 %) were women. Mean age was 28.6 (SD 2.4) years. The mean number of months in training at the time of observation was 4 (SD 3.7), with a range of 1–11 months. The mean census per intern at the start of the day was 2.6 (SD 1.6), and the mean number of admissions per intern was 3.9 (SD 1.8). The mean number of patients cross-covered overnight was 27.7 (SD 12.1). The mean observation period was 14.3 hours.

The distribution of intern time into general work categories is presented in Table 1, and time spent on specific tasks of interest is shown in Table 2. Overall, interns spent a mean of 344 ± 74 minutes (40 % of total observation time) on clinical computer work, consisting mainly of writing notes, reviewing patient documents, and writing orders. Interns spent 253 ± 66 minutes (30 %) on non-patient communication tasks, which included clinical conversations with team members, other physicians, and nurses, as well as socializing with other doctors and sign-out related discussions. Interns spent 104 ± 39 minutes (12 %) performing direct patient care (bedside) activities for their own and cross-cover patients. Downtime activities (e.g., sleeping, eating, and emailing/surfing the internet) accounted for 93 ± 100 minutes (11 %). Interns spent small amounts of time on transit and educational activities, and a mean of 32 ± 22 minutes multi-tasking during the observation period. ANOVA confirmed that the times spent on clinical computer work, direct patient care, and non-patient communication differed significantly from each other (p < 0.0001), with all post-hoc pairwise comparisons also significant (p < 0.0001).

Table 1 Time Spent on General Categories of Work
Table 2 Time Spent on Selected Tasks of Interest

Twelve interns (48 %) slept for at least part of the time they were on call. During 16 of the 25 observed call shifts (64 %), an attending physician interacted in person with the observed intern.

DISCUSSION

We conducted a time-motion study to determine how interns in 2010 spent their time while on call. We demonstrated that the majority of interns’ time was spent in indirect patient care, usually in the form of clinical documentation tasks performed at a computer. Only a small proportion of their time was spent at the bedside of patients, and even less was spent in overt educational activities.

Interns spent the greatest proportion of their time in front of a computer. This is consistent with a recent review of inpatient work studies that found approximately 50 % of inpatient physician time is spent in indirect patient care.16 Our study was conducted at a VA, which has an integrated electronic medical record. Therefore, the large fraction of time spent at the computer may be partially driven by the fact that order entry, test results, chart review and all documentation occur at the computer. At hospitals with less well-developed electronic medical records, other clinical administrative and documentation tasks (e.g., searching for paper charts, handwriting notes) likely consume at least a comparable amount of interns’ time. While these documentation tasks are often maligned as busy work, it is possible that interns are also thinking and integrating information while in front of the computer. It is difficult to capture cognitive work in an observational study, but we hypothesize that at least some of the cognitive work that occurs during the admission process occurs while notes and orders are being written.

In a conceptual model of house staff time allocation, this time on clinical computer work could fall into the category of independent education related to patient care, or into non-educational patient care. Our data do not allow us to determine how much of the computer work would qualify as non-educational patient care time. In the past, “scut work” was the epitome of non-educational patient care; examples included activities such as placing IVs or drawing blood.17 In our study, no intern time was spent on these tasks.

Time at the bedside is an important aspect of the experiential learning that occurs during residency.18 In our study, interns spent on average only 12 % of their time with patients. Other studies corroborate that direct patient care consumes less time than indirect patient care.16 In the only comparable US study since 2003, a single internal medicine intern was observed, and that individual spent 20 % of their time at the bedside.19 It is notable that we began our observations in the afternoons, after normal rounds would be over, but we were able to capture the time during which most admitting and cross-cover occurred. Presumably, if we had observed the interns for an entire 24-hour period, we would have captured their morning visits to patients, and any subsequent visits with supervising residents or attendings on rounds.

The ideal amount of time at the bedside for internal medicine house staff is not defined in the literature. We believe that the on-call period is a time when bedside work should be prominent, given that the admitting history and physical require more time than daily follow-up visits. With the VA EMR, it is possible that the past medical and social histories on new patients are obtained from the computer rather than from the patients, which could shorten the bedside interactions considerably, partially explaining our finding.

Only 20 minutes (2 %) per call night were spent in overt educational activities, and only 14 minutes on clinical reading. In the 2003 review by Boex et al., 15 % of resident time was estimated to be spent in teaching/learning activities.2 Because we focused on the on-call period, we did not capture teaching from attending rounds or planned educational sessions, such as our daily noon conferences. In addition, it is possible that some educational activities were not categorized as such by our research assistants because they were not overtly educational (e.g., an intern asking a resident for advice on antibiotic therapy may not have been coded as education). However, we gave the research assistants definitions of teaching/learning activities to increase the chance that they would correctly identify them.

Time spent on call is a substantial proportion of the overall house staff experience, and this time includes opportunities for both independent learning and formal teaching during patient care. Most of the formal teaching will occur during attending rounds and noon conferences. The on-call period, however, is rife with opportunities for direct supervision of interns by more senior residents, yet this supervision does not appear to happen frequently. As house staff shift lengths become further compressed with progressive limitations of duty hours, the opportunity for such direct supervision may be waning.

The results of our study should be considered in the context of its limitations. First, the results from one VA hospital may not generalize to other non-VA programs. However, many teaching programs include VA rotations, and the call structure at our VA utilizes a fairly standard schedule. Second, we observed each intern only one time, which means that the results for each intern may only reflect the situational factors of the night he/she was observed, rather than an estimate of his/her typical work pattern. We chose to do this to limit the burden on individual interns, and also to increase the generalizability of the study. Third, the observation period in this study did not cover the main educational conferences or attending rounds; therefore, many educational activities would not have been recorded by our observers. Time spent on teaching/learning would have increased by at least an hour (based solely on the noon conferences), and possibly by more, depending on how much teaching would have occurred on attending rounds. Finally, we did not calculate inter-rater reliability for the observations.

How house staff spend their time on call has been studied periodically over time.17,20–22 Even so, it is important to revisit their work distribution, as major changes in the graduate medical education system are occurring. Our results suggest a need to proactively consider strategies to increase intern time spent with patients and in formal and informal teaching activities when on call. It is only by knowing this information that the medical education community can maximize the house staff clinical experience in the newest era of duty hour reform.