ABSTRACT
BACKGROUND
There have been recent calls for improved internal medicine outpatient training, yet assessment of clinical and educational variables within existing models is lacking.
OBJECTIVE
To assess the impact of clinic redesign from a traditional weekly clinic model to a 50/50 outpatient–inpatient model on clinical and educational outcomes.
DESIGN
Pre-intervention and post-intervention study intervals, comparing the 2009–2010 and 2010–2011 academic years.
PARTICIPANTS
Ninety-six residents in a Primary Care Internal Medicine site of a large academic internal medicine residency program who provide care for > 13,000 patients.
INTERVENTION
Continuity clinic redesign from a traditional weekly clinic model to a 50/50 model characterized by 50 % outpatient and 50 % inpatient experiences scheduled in alternating 1 month blocks, with twice weekly continuity clinic during outpatient months and no clinic during inpatient months.
MAIN MEASURES
1) Clinical outcomes (panel size, patient visits, adherence with chronic disease and preventive service guidelines, continuity of care, patient satisfaction, and perceived safety/teamwork in clinic); 2) Educational outcomes (attendance at teaching conference, resident and faculty satisfaction, faculty assessment of resident clinic performance, and residents’ perceived preparedness for outpatient management).
RESULTS
Redesign was associated with increased mean panel size (120 vs. 137.6; p ≤ 0.001), decreased continuity of care (63 % vs. 48 % from provider perspective; 61 % vs. 51 % from patient perspective; p ≤ 0.001 for both; team continuity was preserved), decreased missed appointments (12.5 % vs. 10.9 %; p ≤ 0.01), improved perceived safety and teamwork (3.6 vs. 4.1 on 5-point scale; p ≤ 0.001), improved mean teaching conference attendance (57.1 vs. 64.4; p ≤ 0.001), improved resident clinic performance (3.6 vs. 3.9 on 5-point scale; p ≤ 0.001), and little change in other outcomes.
CONCLUSION
Although this model requires further study in other settings, these results suggest that a 50/50 model may allow residents to manage more patients while enhancing the climate of teamwork and safety in the continuity clinic, compared to traditional models. Future work should explore ways to preserve continuity of care within this model.
The continuity clinic should be a cornerstone of internal medicine residency, where residents develop longitudinal relationships with patients through an outpatient practice. There have been concerns voiced over the past decade that residents are not prepared for future outpatient practice, and a national consensus has emerged that improvement to the internal medicine outpatient training environment is warranted to provide an experience that better develops resident competence in the comprehensive and coordinated care of ambulatory patients.1–8
This charge has prompted significant innovation to the structure of continuity clinic among internal medicine residency programs across the country. While there is substantial heterogeneity, the majority of continuity clinic designs fall into one of two broad structures: 1) Traditional weekly continuity clinic experience, where residents see their outpatients one or two half days per week during most of their rotations, and 2) Block model, where residents engage in sustained continuity clinic time for several sessions per week, alternating with sustained time away from continuity clinic.
A particularly novel and rigorously evaluated example of the block model is the 12 month “long block” of high intensity continuity clinic experience, which has demonstrated improved resident satisfaction and knowledge, patient satisfaction, and patient-relevant outcomes compared with a traditional model.9,10 Several other block models have been described,11,12 but assessment is limited by a lack of available outcome measures. Outcomes that link educational processes to patient relevant outcomes are particularly important in assessing impact of medical education interventions.13
Through participation in the Accreditation Council for Graduate Medical Education (ACGME) Educational Innovation Project (EIP),14 our residency program implemented continuity clinic redesign at the start of the 2010–2011 academic year. Redesign was prompted by a desire to mitigate the “training gap” between the inpatient focus of many residency programs and the fact that the majority of healthcare is delivered in the outpatient setting among increasingly complex patients.15,16 The goals of the redesign were to minimize conflicts between inpatient and outpatient duties, increase resident exposure to continuity clinic, protect time for education, enhance longitudinal relationships between residents and patients, develop patient-centered resident care teams, and enhance longitudinal relationships between residents and faculty. In this study, we assess the impact on clinical and educational outcomes of continuity clinic redesign from a traditional weekly clinic model to a 50/50 outpatient–inpatient model, characterized by 50 % outpatient and 50 % inpatient experiences scheduled in alternating 1 month blocks, with twice weekly continuity clinic during outpatient months and no continuity clinic during inpatient months.
METHODS
Study Setting and Participants
The study was conducted in the Mayo Clinic–Rochester Internal Medicine Residency Program among the 96 residents from all three post-graduate years (PGY) who care for approximately 13,000 patients from Olmsted County at the Primary Care Internal Medicine sites for their continuity clinic. There are an additional 48 residents in the program who provide continuity care to patients from surrounding regional counties; these residents received the same programmatic intervention, but their data were not included in the study due to multiple missing data points for clinical outcomes. This study was deemed exempt by the Mayo Clinic Institutional Review Board.
Study Design
We examined clinical and educational outcomes before and after continuity clinic redesign. The pre-intervention study interval was the 2009–2010 academic year; the post-intervention study interval was the 2010–2011 academic year.
Intervention
Key components of the continuity clinic redesign are shown in Table 1. First, in the new 50/50 model, residents’ inpatient and outpatient experiences were completely separated, such that continuity clinics were scheduled during outpatient rotations only, which alternated every other month with inpatient rotations. This significant structural change made the additional components of the intervention possible. Second, there were more total clinic days in the new model. This was achieved through increasing clinic days from once to twice per week, and by increasing focused continuity clinic rotations (4–5 half days of clinic per week) from 2 to 3 months (1 month per academic year). Third, clinic days were scheduled to maximize patient access to care. This was achieved through even distribution of clinic days among resident care teams throughout the week, and even staggering of clinic days across inpatient and outpatient rotations. Residents work in care teams of six resident providers (two providers from each post-graduate year) who care for each other’s outpatients for acute issues when the primary resident physician is not in the clinic. There are four resident care teams per firm; each of the six firms is overseen by a faculty firm chief and a pool of seven additional faculty preceptors. Finally, because clinic scheduling is no longer tied to call schedules, more consistent resident–faculty continuity was achieved in the new model.
Demographic Measures
In order to describe and compare the pre-intervention and post-intervention resident cohorts, the following data were obtained on all residents: age on July 1 of each study interval, gender, medical school characteristics (US allopathic, US osteopathic, international), and pre-matriculation performance measures (US Medical Licensing Examination [USMLE] scores). Data on planned career choice were obtained on the subset of residents who completed the Learners’ Perceptions Survey described below.
Clinical Outcomes Measures
Data retrieval and reporting of clinical outcome measures were developed through collaboration with clinical informaticists and information technology staff who support Mayo Clinic Rochester’s data aggregation software system, which is based on the Microsoft Amalga platform. This platform includes all existing institutional database feeds, aggregating and integrating operational, clinical, and administrative data at the individual patient level. Each patient is electronically tagged to a provider; in this case, a resident physician. Patients are then grouped into care teams, and applications generate population management reports for adult preventive services and chronic diseases. The system defines patient eligibility for, and completion of, preventive and disease-specific tests by retrieving completion of services and results from all-source data systems. This information system can be queried to determine the completion rates of any service for any demographic population.17 From the same Amalga system, quality and utilization reports are generated for each individual physician and care team.
Continuity of care was assessed from the perspective of the physician (proportion of visits conducted by residents in which residents saw their own patients; Continuity for Physician [PHY]),18 and from the perspective of the patient (proportion of patient visits in which each patient was seen by their assigned physician; Usual Provider Continuity [UPC]).19 Quality of care was assessed at the level of the physician across the following domains: 1) Percent of panel patients with hypertension at goal blood pressure (< 140/90 mmHg); 2) Quality of diabetes care (percent of patients with hemoglobin A1C < 8 %, blood pressure < 140/90 mmHg, LDL cholesterol < 100 mg/dL, and urine microalbumin checked within 1 year); 3) Percent of eligible patients who receive recommended preventive care services: cervical cancer screening within 3 years for women ages 21–65 years, bone mineral density testing for women age 65 years or older, lipid screening within 5 years for women ages 45–75 years and men ages 35–75 years.
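For concreteness, the two continuity proportions defined above can be sketched in code. This is an illustrative calculation only, assuming a hypothetical visit log of (patient, assigned physician, physician seen) records; the data structure and names are ours, not the study's.

```python
from collections import Counter

def continuity_metrics(visits):
    """Compute PHY and UPC continuity proportions from a visit log.

    `visits` is a list of (patient_id, assigned_physician, seeing_physician)
    tuples -- a hypothetical structure for illustration.
    """
    # Continuity for Physician (PHY): of all visits a resident conducts,
    # the proportion in which the resident saw their own panel patient.
    phy_num = Counter()   # visits where physician saw own patient
    phy_den = Counter()   # all visits conducted by physician
    # Usual Provider Continuity (UPC): of all visits a patient makes,
    # the proportion in which the patient was seen by their assigned physician.
    upc_num = Counter()
    upc_den = Counter()
    for patient, assigned, seen_by in visits:
        phy_den[seen_by] += 1
        upc_den[patient] += 1
        if assigned == seen_by:
            phy_num[seen_by] += 1
            upc_num[patient] += 1
    phy = {doc: phy_num[doc] / n for doc, n in phy_den.items()}
    upc = {pt: upc_num[pt] / n for pt, n in upc_den.items()}
    return phy, upc
```

For example, if patient p1 (assigned to physician A) is seen once by A and once by covering physician B, physician B's PHY counts only B's visits with B's own patients, while p1's UPC is the share of p1's visits conducted by A.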
Clinical process measures were evaluated, including resident panel size, number of patient visits, and proportion of missed appointments. Patient satisfaction was assessed at the resident level using the American Board of Internal Medicine Patient Assessment Module questions.20 We used the teamwork and safety domains of the Safety Attitudes Questionnaire (SAQ)21 to assess residents’ perceptions of the teamwork and safety climate of the continuity clinic four times yearly pre-intervention and post-intervention. Substantial validity evidence supports use of the SAQ as a measure of safety climate in healthcare settings,21–25 including with residents.26 SAQ items are structured on 5-point scales (higher scores indicate greater teamwork and safety). We used a shortened form of the safety and teamwork domains, based on a published factor analysis in which items with the highest factor loading were retained.21
Educational Outcome Measures
Educational outcomes for each resident related to continuity clinic were retrieved from the existing Integrated Scheduling Evaluation System, an electronic database containing educational assessment data for which there is good validity evidence for each domain (Cronbach α range for internal consistency = 0.94–0.97; weighted kappa range for inter-rater reliability = 0.08–0.4).27 These domains included resident satisfaction with continuity clinic, faculty satisfaction with continuity clinic, and resident performance in continuity clinic, as assessed by the pool of eight teaching faculty in each resident’s firm.
The Learners’ Perceptions Survey, a tool to assess resident satisfaction with continuity clinic,28 was included to augment institutional satisfaction measures with a more granular assessment of clinic domains. Additional survey items assessed residents’ perceived preparedness to manage outpatients (one item on a 5-point Likert scale)29 and to manage common outpatient conditions (eight items/conditions on a 5-point Likert scale;8 a score of 5 represents highest preparedness on both scales). Residents received this survey electronically during the last 2 months of each study interval.
Attendance at program-wide teaching conferences on a variety of internal medicine teaching topics, held from 12:15 PM to 1:00 PM, 4 days per week, was recorded during both study intervals through a card swipe mechanism. Rotation or site-specific teaching session attendance records (e.g., ambulatory morning report, hospital morning report, etc.) were not included.
Data Analysis
Demographics of the resident cohorts were reported with descriptive statistics; comparison between the two groups was performed using 2-sample t tests for means and Fisher’s Exact Test for percentages. Changes in the outcome measures outlined above were assessed using generalized linear models with identity link function and normally distributed errors, estimated by the generalized estimating equation (GEE) method for correlated responses within residents. For each outcome, means and standard errors were reported for each time period, and p values for chi-square tests were used to assess changes seen with the new continuity clinic model. A conservative alpha level of 0.01 was used to account for multiple comparisons. All analyses were performed using SAS statistical software (version 9.3; SAS Institute Inc, Cary, North Carolina).
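The GEE analysis described above can be written out as a marginal model; the notation here is our illustrative sketch, not the authors' published specification:

```latex
% Outcome y_{ij} for resident i at observation j, identity link,
% normally distributed errors; x_{ij} indicates the study interval
% (x_{ij} = 0 pre-intervention, x_{ij} = 1 post-intervention).
E[y_{ij}] = \beta_0 + \beta_1 x_{ij}
```

Here \(\beta_1\) captures the change associated with the redesign; GEE estimation with a working within-resident correlation structure accounts for repeated measures on the same resident, and the reported chi-square p values correspond to testing \(H_0: \beta_1 = 0\) at the conservative alpha of 0.01.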
RESULTS
No significant differences in cohort demographics were seen between study intervals within post-graduate years (PGY) (all p values > 0.11). The overall mean (standard deviation) age at the start of each study interval was 27.3 (2.3), 28.8 (2.7), and 29.7 (2.8) for PGY 1, 2, and 3, respectively; 42 % were female, 84 % were US allopathic medical school graduates, and 82 % were considering pursuit of subspecialty training after residency at the time of the last survey. The overall mean (standard deviation) USMLE Step 1 and 2 scores were 232.6 (16.0) and 243.2 (16.2), respectively.
Clinical Outcomes
Clinical outcomes in the pre-intervention and post-intervention intervals are shown in Table 2. Mean panel size increased significantly (120 vs. 137.6; p ≤ 0.001). Individual physician and patient continuity of care declined (63 % vs. 48 % from the physician perspective; 61 % vs. 51 % from the patient perspective; p ≤ 0.001 for both), while care team continuity was unchanged. The proportion of missed appointments decreased (12.5 % vs. 10.9 %; p ≤ 0.01). Perceived safety and teamwork in the outpatient environment improved (3.6 vs. 4.1 on a 5-point scale; p ≤ 0.001). There was no consistent difference for the remainder of clinical outcomes, and no difference in patient satisfaction.
Educational Outcomes
Attendance at teaching conferences improved significantly (57.1 vs. 64.4; p ≤ 0.001), as did resident clinic performance as assessed by faculty (3.6 vs. 3.9 on 5-point scale; p ≤ 0.001). There were no differences in resident or faculty satisfaction with clinic (Table 3).
There were 56 Learners’ Perceptions Surveys completed in the pre-intervention interval (response rate = 58 %) and 55 surveys completed in the post-intervention interval (response rate = 57 %). The Learners’ Perceptions Survey confirmed no overall change in resident satisfaction. However, the two items that assessed the ability to focus on clinic and perceived inpatient/outpatient balance both improved significantly (30 % vs. 85 %, 27 % vs. 71 % very or somewhat satisfied, respectively, both p < 0.0001). There were no changes in perceived preparedness to manage outpatients or common outpatient conditions.
DISCUSSION
In this study, we found that continuity clinic redesign from a traditional weekly continuity clinic model to a 50/50 model, highlighted by separation of the inpatient and outpatient experience, more total clinic days, more faculty–resident continuity, and clinic care teams designed to maximize patient appointment access, was associated with increased panel size and patient visits, decreased continuity of care, decreased missed appointments, improved perceived safety and teamwork, improved teaching conference attendance, improved clinic evaluation scores of residents by faculty, and little change in other clinical or educational outcomes. This adds to the literature by reporting a comprehensive evaluation of clinical and educational outcomes influenced by redesign that has not been previously described.
Of the multiple components of this intervention, the de-linking of the inpatient–outpatient experiences was the most structurally significant. In a previous multi-institutional survey, the vast majority of residents and program directors felt that the absence of conflict between inpatient and outpatient responsibilities is important for outpatient training,30 and this conflict is associated with inability to focus on clinic31 and low resident and patient satisfaction.24,32 Indeed, concern about this conflict has prompted recent changes to accreditation requirements for internal medicine that now mandate that “programs must develop models and schedules for ambulatory training that minimize conflicting inpatient and outpatient responsibilities.”33 In our study, residents reported significant improvement in their ability to focus on clinic days and perceived inpatient/outpatient balance in the 50/50 model compared with the traditional model. Not surprisingly, our finding of increased conference attendance suggests that de-linking inpatient–outpatient experiences may also protect time for education.
Increased resident exposure to clinic through more clinic sessions and associated visits has the potential to improve continuity of care, thereby providing the opportunity to enhance longitudinal coordinated care of complex medical patients.34 Further, more concentrated clinic experiences in the new model (twice weekly clinic and dedicated continuity clinic months) may lay the foundation for burst continuity, i.e., allowing more frequent monitoring during acute illness.35 Unfortunately, these structural changes did not demonstrate increased continuity of care as we had hoped. Instead, continuity of care in our model decreased at the individual physician and patient level. This may be due, in part, to two factors.
First, an unintended consequence of this structural change was a disproportionate increase in average resident panel size; more clinic days triggered an opening of resident panels to meet clinical demand. It is likely that these larger panel sizes contributed to lower individual physician continuity of care by leaving relatively fewer appointment slots per patient.36 Therefore, future applications of this model should strive to maintain stable panel sizes to prevent a lapse in continuity.
Second, the fact that each resident has fewer total months with clinic availability may have negatively impacted continuity of care. Patients with acute medical needs during a month when their resident is working in the hospital will not have a continuity experience with their resident for that encounter. One-month intervals between clinic days may be too long to optimize continuity. Clinic models with more rapid “cycling” of inpatient/outpatient experiences may improve continuity while maintaining the inpatient/outpatient split.12 Alternatively, adding back a limited number of clinic sessions (1–2) to inpatient months for chronic disease management, follow-up, or sub-acute visits with the resident’s own patients may improve continuity within our model.
Of note, while individual continuity of care decreased in this model, team-based continuity was preserved. The ability to work effectively in interdisciplinary healthcare teams is essential for effective outpatient care in a patient-centered medical home.37 Further, development of effective resident care teams can improve the sense of continuity, office efficiency, and team collaboration in continuity clinics.38 In our study, residents reported significantly higher teamwork and safety climate in the continuity clinic environment in the new model. These perceptions may reflect increased familiarity and comfort with clinic processes and infrastructure, as well as increased opportunities to interact with colleagues in interdisciplinary teams in the clinic setting. Further, our finding of decreased missed appointments in the new model suggests that patients may be responding to this more cohesive care team process. Finally, the fact that patient satisfaction did not decrease despite a drop in continuity with their own physician suggests that team-based continuity may be acceptable for patients.
This study has limitations. Most significantly, the lack of a concurrent control group precludes attributing causation to the intervention. Because this was a multi-component intervention, future work should determine the relative impact of relevant covariates on clinical and educational outcomes. Further, though the sample size was relatively large, results of this single-institution study may not be generalizable to other settings.
In summary, this study reports a detailed evaluation of clinical and educational outcomes in a traditional and 50/50 model of continuity clinic. Although this model requires further study in other settings, these results suggest that a 50/50 outpatient-inpatient continuity clinic structure may allow residents to manage more patients and enhance the climate of teamwork and safety in the continuity clinic, compared to traditional models. Future work should explore ways to preserve continuity of care and enhance quality of care within this model.
REFERENCES
1. Weinberger SE, Smith LG, Collier VU. Redesigning training for internal medicine. Ann Intern Med. 2006;144:927–32.
2. Kassirer JP. Redesigning graduate medical education—location and content. N Engl J Med. 1996;335:507–9.
3. Meyers FJ, Weinberger SE, Fitzgibbons JP, Glassroth J, Duffy FD, Clayton CP. Redesigning residency training in internal medicine: the consensus report of the alliance for academic internal medicine education redesign task force. Acad Med. 2007;82:1211–9.
4. Fitzgibbons JP, Meyers FJ. Redesigning training for internal medicine. Ann Intern Med. 2006;145:865–6. author reply 6.
5. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training. A report from the society of general internal medicine’s task force for residency reform. J Gen Intern Med. 2005;20:1165–72.
6. Bowen JL, Salerno SM, Chamberlain JK, Eckstrom E, Chen HL, Brandenburg S. Changing habits of practice. Transforming internal medicine residency education in ambulatory settings. J Gen Intern Med. 2005;20:1181–7.
7. Darer JD, Hwang W, Pham HH, Bass EB, Anderson G. More training needed in chronic care: a survey of US physicians. Acad Med. 2004;79:541–8.
8. Wiest FC, Ferris TG, Gokhale M, Campbell EG, Weissman JS, Blumenthal D. Preparedness of internal medicine and family practice residents for treating common conditions. JAMA. 2002;288:2609–14.
9. Warm EJ, Schauer DP, Diers T, et al. The ambulatory long-block: an accreditation council for graduate medical education (ACGME) educational innovations project (EIP). J Gen Intern Med. 2008;23:921–6.
10. Warm EJ. Interval examination: the ambulatory long block. J Gen Intern Med. 2010;25:750–2.
11. Hoskote S, Mehta B, Fried ED. The six plus-two ambulatory care model: a necessity in today’s internal medicine residency program. J Med Educ Perspect. 2012;1:16–9.
12. Mariotti JL, Shalaby M, Fitzgibbons JP. The 4:1 schedule: a novel template for internal medicine residencies. J Grad Med Educ. 2010;2:541–7.
13. Swing SR, Clyman SG, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1(2):278–86.
14. Accreditation Council for Graduate Medical Education. Educational innovation project. (Accessed December 11, 2012, at http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/140_EIP_PR205.pdf.)
15. Arora V, Guardiano S, Donaldson D, Storch I, Hemstreet P. Closing the gap between internal medicine training and practice: recommendations from recent graduates. Am J Med. 2005;118:680–5.
16. Green LA, Fryer GE Jr, Yawn BP, Lanier D, Dovey SM. The ecology of medical care revisited. N Engl J Med. 2001;344:2021–5.
17. Chaudhry R, Scheitel SM, McMurtry EK, et al. Web-based proactive system to improve breast cancer screening: a randomized controlled trial. Arch Intern Med. 2007;167:606–11.
18. Darden PM, Ector W, Moran C, Quattlebaum TG. Comparison of continuity in a resident versus private practice. Pediatrics. 2001;108:1263–8.
19. Breslau N, Reeb KG. Continuity of care in a university-based practice. J Med Educ. 1975;50:965–9.
20. Webster G. Final report on the patient satisfaction questionnaire project. Philadelphia: American Board of Internal Medicine Committee on Evaluation of Clinical Competence; 1989.
21. Sexton JB, Helmreich RL, Neilands TB, et al. The safety attitudes questionnaire: psychometric properties, benchmarking data, and emerging research. BMC Health Serv Res. 2006;6:44.
22. Sexton JB, Holzmueller CG, Pronovost PJ, et al. Variation in caregiver perceptions of teamwork climate in labor and delivery units. J Perinatol. 2006;26:463–70.
23. Siassakos D, Fox R, Hunt L, et al. Attitudes toward safety and teamwork in a maternity unit with embedded team training. Am J Med Qual. 2011;26:132–7.
24. Carney BT, West P, Neily JB, Mills PD, Bagian JP. Improving perceptions of teamwork climate with the veterans health administration medical team training program. Am J Med Qual. 2011;26:480–4.
25. Sexton JB, Thomas EJ, Helmreich RL, et al. Frontline assessments of healthcare culture: safety attitudes questionnaire norms and psychometric properties. Austin: The University of Texas Center of Excellence for Patient Safety Research and Practice; 2004. Technical Report No. 04–01.
26. O’Leary KJ, Wayne DB, Haviley C, Slade ME, Lee J, Williams MV. Improving teamwork: impact of structured interdisciplinary rounds on a medical teaching unit. J Gen Intern Med. 2010;25:826–32.
27. Beckman TJ, Mandrekar JN, Engstler GJ, Ficalora RD. Determining reliability of clinical assessment scores in real time. Teach Learn Med. 2009;21:188–94.
28. Keitz SA, Holland GJ, Melander EH, Bosworth HB, Pincus SH. The veterans affairs learners’ perceptions survey: the foundation for educational quality improvement. Acad Med. 2003;78:910–7.
29. Blumenthal D, Gokhale M, Campbell EG, Weissman JS. Preparedness for clinical practice: reports of graduating residents at academic health centers. JAMA. 2001;286:1027–34.
30. Thomas KG, West CP, Popkave C, et al. Alternative approaches to ambulatory training: internal medicine residents’ and program directors’ perspectives. J Gen Intern Med. 2009;24:904–10.
31. Salerno SM, Faestel PM, Mulligan T, Rosenblum MJ. Disruptions and satisfaction in internal medicine resident continuity clinic differ between inpatient and outpatient rotations. Teach Learn Med. 2007;19:30–4.
32. Feddock CA, Hoellein AR, Griffith CH, et al. Are continuity clinic patients less satisfied when residents have a heavy inpatient workload? Eval Health Prof. 2005;28:390–9.
33. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in internal medicine. 2009. (Accessed December 11, 2012, at http://www.acgme.org/acgmeweb/Portals/0/PFAssets/ProgramRequirements/140_internal_medicine_07012009.pdf).
34. American College of Physicians. Joint principles for the medical education of physicians as preparation for practice in the patient-centered medical home. (Accessed December 11, 2012, at http://www.acponline.org/running_practice/delivery_and_payment_models/pcmh/understanding/educ-joint-principles.pdf).
35. American Medical Association. Redesigning residency: new models for internal medicine residency programs. (Accessed December 11, 2012, at http://www.ama-assn.org/amednews/2006/10/23/prsa1023.htm).
36. Francis MD, Zahnd WE, Varney A, Scaife SL, Francis ML. Effect of number of clinics and panel size on patient continuity for medical residents. J Grad Med Educ. 2009;1:310–5.
37. Sevin C, Moore G, Shepherd J, Jacobs T, Hupke C. Transforming care teams to provide the best possible patient-centered, collaborative care. J Ambul Care Manag. 2009;32:24–31.
38. Hern T, Talen M, Babiuch C, Durazo-Arvizu R. Patient care management teams: improving continuity, office efficiency, and teamwork in a residency clinic. J Grad Med Educ. 2009;1:67–72.
Acknowledgements
This work was supported by the Mayo Clinic Department of Medicine and by the Mayo Clinic Internal Medicine Residency Office of Educational Innovations, as part of the ACGME Educational Innovations Project. The views expressed by the authors are their own and should not be considered policy statements of any organizations with which any of the authors may be affiliated. This work was presented at the Society of General Internal Medicine 2012 Annual Meeting.
Conflict of Interest
Mark Wieland has no conflict of interest. Andrew Halvorsen has no conflict of interest. Rajeev Chaudhry is an employee of Mayo Clinic and the inventor of the population management software referenced in this publication. Mayo Clinic has licensed this technology to a commercial entity (Vital-Health Software), but has received no royalties to date. Dr. Chaudhry receives no royalties from the licensing of this technology. Darcy Reed has no conflict of interest. Furman McDonald has no conflict of interest. Kris Thomas has no conflict of interest.
Wieland, M.L., Halvorsen, A.J., Chaudhry, R. et al. An Evaluation of Internal Medicine Residency Continuity Clinic Redesign to a 50/50 Outpatient–Inpatient Model. J GEN INTERN MED 28, 1014–1019 (2013). https://doi.org/10.1007/s11606-012-2312-1