Introduction

In the literature, there is a paucity of descriptive studies explaining how undergraduate medical education (UME) programs are tackling formalized continuous quality improvement (CQI) requirements. Since the 1990s, higher education has adopted more formalized improvement systems and processes to monitor growing and complex organizations [1]. Roffe argues that traditional CQI methodologies that are practical in other sectors (e.g., manufacturing and industry) are more challenging to implement in higher education [2]. However, these issues are not insurmountable, and good practices can lead to successful outcomes [2]. For medical schools, the Liaison Committee on Medical Education (LCME) has charged UME programs with implementing a CQI process. Blouin et al. contend that a major outcome of medical school accreditation is that leaders are encouraged to establish a true CQI model that relies on recurring reviews of the program [3]. In July 2015, the LCME officially transitioned to a new set of accreditation standards with associated “elements.” Notably, Element 1.1 evolved from the “old” standard Institutional Setting 1 (IS-1), which stated that “an institution that offers a medical education program must engage in a planning process that sets the direction for its program and results in measurable outcomes” [4]. The Element now states that a medical school must engage “in ongoing planning and continuous quality improvement processes that establish short and long-term programmatic goals, result in the achievement of measurable outcomes that are used to improve programmatic quality, and ensure effective monitoring of the medical education program’s compliance with accreditation standards” [5]. Thus, in July 2015, LCME standards officially required a formal CQI process for the first time.

The restructuring of Element 1.1 led many UME programs to reconsider their processes for monitoring formal accreditation standards. Barzansky et al. contend that medical schools should review compliance with accreditation standards internally and at regular intervals, seeking to establish a CQI culture [6]. Interim reviews should be established to supplement the self-study and onsite review by the LCME that occurs on a typical 8-year cycle [6]. Since the formal adoption of Element 1.1, UME programs have been left to chart a way forward and establish their own processes. As noted in the LCME white paper, “Implementing a System for Monitoring Performance in LCME Accreditation Standards,” programs must adopt a guiding policy, have dedicated personnel, and designate resources (including software or hardware) [7]. Furthermore, schools must choose the elements for monitoring, the timing of reviews, and goals for remediating issues (or remaining compliant), and ultimately repeat the process to ensure continuous compliance. Some medical educators have argued that genuine CQI processes may enhance the medical education environment, which may ultimately translate to improved clinical competence and, in turn, better patient care [8]. These authors further suggest that schools rely on validated instruments borrowed from fields outside higher education [8]. Shroyer et al., for instance, have presented their practical approach to creating and maintaining a CQI process, including a comprehensive dashboard system [9]. According to these authors, institutions are seeking practical tools and methods to formally track and organize educational performance metrics [9].

In general, the development of a thorough CQI process requires serious planning: identifying stakeholders and gaining buy-in from faculty and administrators, determining standards to monitor, defining key program performance indicators and benchmarks for compliance, developing a data management system, and ultimately reviewing standards. The LCME white paper establishes basic expectations but, by design, does not dictate a process. Therefore, 3 years after the reformatting of Element 1.1, medical educators are looking to peers, seeking reassurance that their processes are in step with current thinking and trends. Administrators and schools may be unsure of how to proceed, wanting to allocate appropriate resources and do what is needed for compliance. Some members of the Association of American Medical Colleges’ (AAMC) Southern Group on Educational Affairs (SGEA) CQI Special Interest Group (SIG) sought to better understand what their peer institutions are doing to solve practical issues encountered while developing a CQI model. The purpose of this descriptive study is to detail how ten medical schools are implementing CQI processes, providing directions that other medical schools may adopt or adapt.

Methods

To answer some of the most fundamental questions, we compiled information about how the participating schools’ CQI processes were similar and different. We sought to answer basic questions, including the following: (1) Which individuals typically hold primary responsibility for the CQI process? (2) Are schools creating “offices” for CQI management? (3) How many schools have charged a CQI committee, and if so, what is the membership makeup of the committee? (4) How many LCME standards does this committee monitor annually? (5) What software are schools using for data collection and data management? (6) Are institutions using a formal CQI or quality assurance process (e.g., Plan-Do-Study-Act “PDSA,” Baldrige Excellence Framework, Balanced Scorecard, ISO 9000)? (7) Are institutions using consultants or engaging the LCME secretariat for guidance? (8) What lessons have been learned from LCME site visits, and what are programs’ greatest CQI successes and failures?

We entered the characteristics of each school’s CQI process and opinions about the LCME standard into a REDCap electronic data capture tool [10]. The ten medical schools included in this analysis are Emory University School of Medicine, Florida International University Herbert Wertheim College of Medicine, Mercer University School of Medicine, Texas A&M College of Medicine, Tulane University School of Medicine, University of Mississippi School of Medicine, University of Tennessee Health Science Center College of Medicine, University of Texas Rio Grande Valley School of Medicine, Wake Forest School of Medicine, and West Virginia University School of Medicine. Participants included three directors, three assistant deans, three associate deans, and one vice-dean.

Results and Discussion

CQI White Paper, Assignment of Staff, and CQI Responsibility

The first question asked respondents to rate the clarity of the LCME CQI white paper with options that included “Did not read” (no value), “Not at all clear” (1), “Somewhat clear” (2), “Clear” (3), and “Very clear” (4). The mean response was 2.7, with four respondents indicating “somewhat clear,” five indicating “clear,” and one selecting “very clear.” Next, respondents were asked if their institution had assigned an individual with primary responsibility for CQI management; nine institutions replied “yes” and one replied “no.” Five institutions further indicated that this individual was administratively located in the school’s “dean’s office or equivalent,” while four indicated a “department of medical education or equivalent.” Four respondents indicated that their institutions had established a CQI office with assigned staff: one institution reported three full-time employees, one reported two to three, one reported two, and one reported one full-time employee (see Table 1 for an overview of responses).
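As a worked check, the mean rating follows directly from the reported distribution (all ten respondents provided a rating, so the no-value “did not read” option does not enter the calculation):

$$\bar{x} = \frac{4(2) + 5(3) + 1(4)}{10} = \frac{27}{10} = 2.7$$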

Table 1 Overview of continuous quality improvement processes across medical schools (schools listed in no specific order)

CQI Committee

The majority (nine of ten) reported the creation of a committee to oversee the CQI process at their school. Six of those nine committees operate independently and are not subordinate to any other committee at the school. Of the three committees that are subordinate to another committee, two report to a curriculum committee and one to a faculty committee. On average, 11 individuals serve on these committees, with a range of 5 to 30 members. Five committees formed in academic year 2017–2018, three formed in the previous academic year (2016–2017), and one school’s committee formed in 2015–2016. Membership (see Table 2) across the committees is highly variable but includes the following participation in some capacity (number reporting membership in parentheses): dean of education (8); dean of student affairs (7); curriculum committee leader (7); faculty development (6); assessment (5); students (5); faculty affairs (5); staff-level CQI manager (5); finance representative (3); information technology (3); the dean (2); admissions (2); library services (2); financial aid (1); facilities/space management (1); joint academic programs (e.g., MD/PhD) (1); medical affairs (affiliate hospitals) (1); student inclusion and diversity (1); dean of administration (1); clinical CQI manager (1); and at-large faculty (1). One school planned to add elected faculty members to the committee after an LCME site visit recommended broader faculty participation in committees in general. The workload of the committees is also mixed. Respondents were asked to indicate the number of LCME elements reviewed annually: two institutions reported reviewing fewer than 20 elements, three reported 21–30, three reported 31–40, and one reported reviewing all LCME elements.

Table 2 Committee makeup at each school (school’s assigned letter corresponds with letter in Table 1)

CQI Software

Respondents were also asked to indicate the software used to help manage their CQI data or process. CQI data may include performance metrics in various areas related to LCME standards; the data may be quantitative and qualitative and may inform decisions about whether long-term and short-term programmatic goals are being met. Eight of the ten respondents indicated that they utilize standard spreadsheet software. Two of these also utilize a second program: one school noted utilizing a popular cloud-based system, indicating in the comments a need for workflow management beyond what standard spreadsheet software offers; the other noted utilizing specific accreditation management software, which they indicated was not currently meeting the school’s needs for CQI. One of the two schools that did not report using standard spreadsheet software specified that their institution was also utilizing specific accreditation management software; however, the respondent noted that they had just begun using it and could not yet give an adequate review. Two schools utilizing standard spreadsheet software and one school not utilizing any software indicated they were currently searching for management software. One also indicated that they had opted to move to accreditation management software before their next site visit but had not yet done so. Of the eight schools using standard spreadsheet software, five indicated that the program was meeting their needs for CQI management. Some commented anecdotally that an advantage of standard spreadsheet software is familiarity: users may be unfamiliar with other systems that could meet their needs but are already comfortable with common software. Others noted that standard spreadsheet software is not specific to the CQI process, is not customizable, is not searchable, requires many sheets, and needs better “dashboarding.”
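To make the kind of tracking described above concrete, the following is a minimal sketch (in Python) of the element-monitoring “dashboard” that schools approximate in spreadsheet software. The element numbers, owners, statuses, and dates are hypothetical placeholders, not data from any participating school:

```python
# Minimal sketch of an element-tracking "dashboard" of the kind schools
# approximate in spreadsheet software. All values below are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class ElementReview:
    element: str          # LCME element identifier, e.g., "1.1"
    owner: str            # person or office responsible
    status: str           # e.g., "compliant", "monitoring", "action needed"
    last_reviewed: date
    next_review: date

reviews = [
    ElementReview("1.1", "CQI Office", "compliant",
                  date(2018, 1, 15), date(2019, 1, 15)),
    ElementReview("8.5", "Curriculum Committee", "action needed",
                  date(2018, 3, 1), date(2018, 9, 1)),
]

# A simple "dashboard" view: flag elements due for review or needing action.
today = date(2018, 6, 1)
for r in reviews:
    if r.status == "action needed" or r.next_review <= today:
        print(f"Element {r.element} ({r.owner}): {r.status}, "
              f"next review {r.next_review}")
```

Even this toy version shows why respondents want searchable, customizable tools: each added field or filter in a spreadsheet typically means another sheet or manual formula.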

Formally Recognized CQI Methodology

Of the ten respondents, three indicated their program was utilizing a formal CQI methodology (e.g., PDSA, Balanced Scorecard, Baldrige Excellence Framework, ISO 9000). All three noted the use of Plan-Do-Study-Act (PDSA), citing particular benefits of the model: PDSA makes sense to people, is easily understandable, and provides a simple model for communicating expectations to faculty and other stakeholders. Two challenges remain for the optimal application of PDSA to medical education CQI: medical schools must (1) define key metrics for LCME elements and (2) establish robust protocols for “closing the loop.” Nevertheless, all three respondents indicated that they would recommend the process to others, despite one of the three noting that it was not meeting their needs. None of the remaining seven respondents indicated using a formal theoretical framework for developing a CQI methodology.
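As an illustration of the two challenges above, the following sketch expresses one PDSA pass over a single element. The function, metric, and benchmark are hypothetical; a real process would define these per element and document the “Act” step so the loop is actually closed:

```python
# One hypothetical Plan-Do-Study-Act (PDSA) pass for a single LCME element.
# The metric and benchmark are placeholders; defining them per element is
# the first challenge noted above.

def pdsa_cycle(metric_value: float, benchmark: float) -> str:
    # Plan: the measurable goal is to meet or exceed the benchmark.
    # Do: collect the data (assumed already gathered as metric_value).
    # Study: compare the observed metric against the benchmark.
    goal_met = metric_value >= benchmark
    # Act: standardize current practice or plan a corrective change,
    # "closing the loop" before the next cycle (the second challenge).
    return ("standardize current practice" if goal_met
            else "plan corrective action")

# Hypothetical example: an 82% mid-clerkship feedback rate vs. a 90% benchmark.
print(pdsa_cycle(0.82, 0.90))  # -> plan corrective action
```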

Resources for CQI: Consultants, the LCME Secretariat, and LCME Visits

Of the ten respondents, one indicated engaging consultants to review their school’s CQI process and considered it helpful. Two respondents specified that they had engaged the LCME secretariat (outside of monthly conference calls) for CQI guidance; both rated the guidance as “helpful” (from the options “not at all helpful,” “somewhat helpful,” “helpful,” and “very helpful”). Furthermore, three respondents indicated that they had undergone review by the LCME since the establishment of their CQI process. One noted that their process was considered acceptable. Another noted that future LCME visits would expect schools to meet all requirements specified in the LCME white paper (including established policy, personnel, and resources). The third indicated that the reviewers were more interested in CQI of elements/standards and less interested in “true” CQI. All ten respondents noted that, if offered, they would be interested in participating in a monthly or quarterly conference call to discuss CQI-related issues or to present individual CQI programs.

Notable Successes and Challenges

Finally, respondents commented on successes and challenges of implementing a CQI process at their school, and several themes emerged. First, some schools noted that the CQI process requires an ongoing commitment to remaining in compliance with LCME standards. This portends a culture shift at many institutions, where the collective feeling has been that the school must comply with LCME standards only every 8 years; with an accountable CQI process, that logic is no longer tenable. Furthermore, some celebrated simply gaining the attention of school leadership and faculty in general. Others indicated marked improvements in areas where they had previously seen difficulty. One respondent noted “a system-wide approach to improvements in our learning environment, career advising and academic advising.” Another described a success as, “addressing several of the concerns students brought up in previous [Graduation Questionnaires]: observation of history and physicals, mid-clerkship feedback, clinical skills, all improved dramatically with our CQI process.” Finally, one respondent noted that the adoption of a formal CQI process helped to formalize communication.

A few of the ten schools also indicated challenges with the development of their CQI process. Simply put, medical schools are building the CQI bridge as they cross it. Notably, for some, implementation has been slow. Practical issues include designating faculty, staff, and administrative support, as well as committing to a software program and appointing or hiring individuals for the task of data management. One respondent indicated that a challenge was appropriately documenting improvement efforts. Another noted that a challenge for their institution was clearly designating roles and responsibilities (i.e., who is doing what). Likewise, one individual cited the difficulty of developing a process that coincides with the natural periods of data collection (i.e., avoiding duplication of effort). Another response noted that there is a learning curve associated with the process, writing that a challenge was “keeping faculty and staff informed of expectations of the CQI process, even when we, ourselves, may need clarity on what exactly the expectation is for monitoring certain standards.”

Limitations

We recognize a few limiting factors of this descriptive study. First, we did not collect the amount of financial resources each school allocated to establishing and managing a CQI process, including designating faculty time, purchasing software, engaging consultants, and/or employing CQI staff managers. Second, only ten schools participated; the results therefore serve as a snapshot of a small number of institutions. Finally, not every school’s CQI process has been evaluated through an LCME site visit, so it is not possible to be certain that the processes established at each school would satisfy the requirements of LCME Element 1.1.

Conclusion and Future Directions

Since the LCME shifted to requiring a formal CQI process in July 2015, it is evident that many programs have taken the change seriously and are considering how to embrace Element 1.1. Many of the ten medical schools that participated have earnestly reflected on a way forward. As with any major shift in process, paralysis can set in, leaving some unsure of how to proceed. This paper is not intended to dispel or dismiss myths and rumors about the expectations of Element 1.1; rather, it seeks to glean basic information from programs establishing a CQI model. At least for our ten programs, schools are engaging in a CQI process and determining policy, personnel, and resources. The process, for some, includes designating individuals (faculty and/or staff) as well as newly formed committees. Many schools are relying on standard spreadsheet software for data management, although not exclusively; likewise, some are following established CQI processes (e.g., PDSA), but again, not exclusively. In the future, the CQI processes utilized at UME programs should be appraised and studied. Such studies should define commonalities across CQI models as well as differences. Furthermore, which components of CQI models may negatively affect accreditation compliance? Are there “worst practices” to avoid? Which LCME elements are most commonly identified for CQI, and what are the successes and struggles in addressing those elements? What are the identifiable challenges of using standard spreadsheet software and engaging information technology for support? What costs are schools accumulating to establish a CQI process? How can we engage students to be more involved in the CQI process? Finally, how do these major shifts to a formalized CQI process impact the educational experience?