Introduction

Sustainability refers to the properties of a system that lead to endurance; sustainment refers to the actual persistence of the system over time. In the field of evidence-based practices, many national and statewide initiatives have had limited long-term impact (Drake et al. 2008; McFarlane et al. 2001). Similarly, large-scale initiatives to expand evidence-based practices have rarely been sustained beyond initial enthusiasm and grant funding (Hunter et al. 2015).

Despite enthusiasm for the adoption and maintenance of evidence-based practices in mental health, little research exists on sustainment. In one national study of several evidence-based practices, 80% of sites sustained their programs over a 2-year period (Swain et al. 2010). Other studies have reported lower 2-year program sustainment rates, ranging from 59 to 76% (Shaver 2015).

Research on sustainability factors also remains in its early stages, despite a proliferation of theories and models (Wiltsey Stirman et al. 2012). Drawing on a synthesis of the implementation literature, Torrey et al. (2012) developed a conceptual framework of the domains of activity necessary for implementation of an evidence-based practice in community mental health settings. The framework comprised prioritization, leadership, workforce, workflow, and reinforcement domains. (See Table 1 for a description of these domains.) In a 2-year follow-up study that used this same model to examine the sustainment of evidence-based practices, program leaders identified state support, practice proficiency, practice evaluation, agency leadership, and staff support as reasons for sustaining or not sustaining the program (Swain et al. 2010). A subsequent 6-year follow-up added financial and client compatibility domains to the sustainment framework (Bond et al. 2014). At 6 years, program leaders reported finances, poor workforce skills, and low prioritization as predictors of discontinuation.

Table 1 Operational definitions for the nine sustainability domains

As part of a prospective study of the 2-year sustainment of Individual Placement and Support (IPS; described elsewhere in this special section), this study examined team leaders’ perspectives on the key barriers and facilitators to sustainment of their IPS programs.

Methods

This study examined 2 years of sustainment between 2012 and 2014 among IPS programs participating in the IPS learning community. The study followed the principles outlined in the Declaration of Helsinki and was approved by the institutional review board of Dartmouth College.

Program

The IPS learning community includes the IPS Center, 20 state Departments of Mental Health and Vocational Rehabilitation, over 250 participating community mental health centers, and thousands of clients and families (see Johnson-Kwochka et al. and Bond et al. in this special section for a full description). At the time this study began in 2012, the learning community was smaller: 13 states and 129 community mental health center IPS programs. Members of the learning community support each other by sharing resources, experiences, and data in order to improve the quality and outcomes of employment services. The learning community also helps other states and programs to implement and expand IPS services.

This learning community followed many of the principles outlined in Schouten et al. (2008). Whereas a learning collaborative is short-term and aimed at successful implementation, the goals of a learning community continue beyond successful implementation toward sustainment of programs. In an earlier report, we identified quality improvement activities (including on-site technical assistance and fidelity reviews); funding through vocational rehabilitation services, Medicaid, and the State; and diverse sources of funding as factors promoting sustainment of these IPS programs (Bond et al. 2016). We found that fidelity assessment, participation in training, use of available technical assistance, participation in conference calls, and sharing of strategies to access funding were all consistent with the learning community’s philosophy and were widely adopted.

Study Group

The study group consisted of IPS team leaders from 129 sites in 13 states. As of 2012, these 129 sites had provided IPS employment services for an average of 4.5 years (SD = 2.7).

Procedures

We conducted initial interviews with team leaders of all IPS programs in the learning community that were actively serving clients as of January 2012. To prepare interviewers, we held a daylong training aimed at clarifying the procedures and standardizing the interview process. A senior methodologist led the training, which included an overview of rigorous interviewing methods used to reduce interviewer bias, and led weekly supervision calls for problem-solving. For the initial team leader interviews, between February and May of 2012, 10 interviewers knowledgeable about IPS conducted telephone interviews with the IPS team leader at each of the 129 IPS programs. The telephone interviews averaged 1 h in length. Interviewers recorded responses verbatim.

Between February and July of 2014, a team of seven interviewers (including six who were interviewers in 2012) conducted the 2-year follow-up interviews with the same sites (100% participation), including those that had not sustained IPS. We determined sustainment status in 2014 from the team’s submission of quarterly outcome data to state leaders as part of their participation in the IPS learning community. As described elsewhere (Bond et al. 2016), 122 IPS sites had sustained their IPS programs, two sites had merged their IPS programs and continued, and five sites had discontinued IPS services. We interviewed the program leaders for the two programs that were independently operated in 2012 before the merger of their parent agencies. For the five discontinued sites, we interviewed the former IPS team leader, a clinical director, or another staff member who knew the history of the program and the reasons for discontinuation.

Interview Protocols

We developed the initial (2012) interview protocol by modifying one used in an earlier study of sustainment of evidence-based practices (Bond et al. 2014). The modifications made it more specific to IPS. (See Online Appendix for interview guide.) The follow-up (2014) interview protocol was a shortened version of the initial 2012 interview with the key questions unchanged. For the five discontinuing sites we constructed a brief semi-structured interview tailored to their specific circumstances, aimed at understanding the reasons for discontinuation.

Data Analysis

Coding of Barriers and Facilitators

We identified team leaders’ perceptions of the reasons for sustainment through responses to two open-ended questions and several closed-ended questions on the perceived barriers and facilitators to sustaining their IPS programs. One open-ended question addressed barriers to sustainability: “…tell me three factors that have worked against sustaining IPS supported employment at your agency.” The second open-ended question addressed facilitators: “What are three factors that you think have been critical in sustaining IPS supported employment at your agency?” We also identified team leaders’ concerns about discontinuation of their IPS program with the question, “Do you have any worries about IPS being discontinued in the next year?”

The 2012 interview responses were entered into ATLAS.ti qualitative software, which facilitates systematic coding and analysis of qualitative data (Atlas.ti 2.0 2002). Two members of the research team coded the responses. Beginning with a conceptual framework developed in a previous study examining the sustainability of five evidence-based practices (Bond et al. 2014), the coders collaboratively developed a coding scheme to describe the content of responses based on a 10% subsample of the database. The a priori domains were augmented by inductive review (Braun and Clarke 2006). The final codebook included 10 content domains (agency prioritization, funding, agency workforce, workflow, external prioritization, reinforcement, client factors, local community, agency size, and miscellaneous) and 52 subdomains. (Operational definitions and examples are shown in Table 1.) Coders used the 52 subdomains to code responses, and the coding process allowed for multiple codes per quotation. Agreement between the two coders on barriers and facilitators across all responses for the 129 sites was acceptable (kappa = 0.60). The coders reached final coding decisions through consensus.

Two researchers coded the 2014 responses using the codebook developed in 2012. As coding progressed, coders added one content domain, leadership, which we constructed using subdomains in the 2012 codebook, and 10 new subdomains, yielding 11 content domains and 62 subdomains. Because coded data were organized differently in 2014 compared to 2012, we calculated kappa for each of the 11 content domains separately rather than based on agreement of subdomain coding. For barriers, kappa ranged from 0.85 to 0.97; for facilitators, kappa ranged from 0.81 to 0.95.
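To make the agreement statistic concrete, the sketch below (written in Python purely for illustration; the ratings are hypothetical and this is not the study's analysis code) computes Cohen's kappa for two coders' presence/absence judgments of a single content domain across a set of sites.

```python
# Illustrative sketch only: Cohen's kappa for two coders' binary
# presence/absence ratings of one content domain. The ratings are hypothetical.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(ratings_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from each rater's marginal frequencies
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(ratings_a) | set(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings: 1 = domain coded as present, 0 = absent
coder1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
coder2 = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.8 for these hypothetical data
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it is preferred over simple agreement rates for coding reliability.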

Statistical Analyses

We collapsed the 2012 subdomain coding into the 11 content domains. For each interview, we examined the barrier codings to determine which of the 11 content domains were present and did the same for the facilitator codings, ignoring multiple codings of the same barrier/facilitator code within an interview. We excluded two low-frequency domains (miscellaneous and agency size) from subsequent analyses, resulting in nine content domains. We then rank-ordered both the barrier and facilitator domains and focused on high-frequency domains (i.e., those identified by ≥30% of the team leaders).
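As a minimal sketch of this tallying step (the subdomain codes, mapping, and interviews below are hypothetical, not the study's data or analysis code), the following Python fragment collapses subdomain codes into content domains, counts each domain at most once per interview, and rank-orders domains by the share of team leaders citing them, flagging those at or above the 30% threshold.

```python
# Illustrative sketch with hypothetical data: collapse subdomain codes into
# content domains, count each domain once per interview, then rank by frequency.
from collections import defaultdict

# Hypothetical mapping from subdomain codes to content domains
SUBDOMAIN_TO_DOMAIN = {
    "medicaid_cuts": "funding",
    "state_general_funds": "funding",
    "staff_turnover": "agency workforce",
    "no_public_transit": "local community",
    "weak_referrals": "agency prioritization",
}

# Hypothetical barrier codings: one list of subdomain codes per interview
interviews = [
    ["medicaid_cuts", "staff_turnover"],
    ["state_general_funds", "medicaid_cuts"],  # duplicate domain counted once
    ["no_public_transit"],
    ["weak_referrals", "staff_turnover"],
]

domain_counts = defaultdict(int)
for codes in interviews:
    # Ignore multiple codings of the same domain within one interview
    for domain in {SUBDOMAIN_TO_DOMAIN[c] for c in codes}:
        domain_counts[domain] += 1

n_leaders = len(interviews)
ranked = sorted(domain_counts.items(), key=lambda kv: kv[1], reverse=True)
for domain, count in ranked:
    share = count / n_leaders
    flag = " (high-frequency)" if share >= 0.30 else ""
    print(f"{domain}: {count}/{n_leaders} = {share:.0%}{flag}")
```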

Results

Perceived Barriers and Facilitators in Sustained Programs

Barriers

As shown in Table 2, over one-third of team leaders in the 122 agencies that sustained IPS services identified inadequate funding, local community factors (such as lack of public transportation and a struggling economy), lack of agency prioritization, and workforce issues (such as inability to hire IPS specialists) as major barriers in 2012 and/or 2014. Funding barriers were diverse, including Medicaid restrictions, loss of state general funds, delays in milestone payments, access to only one source of funding, cuts to Medicaid funding, insufficient funding to hire staff such as peers and employment specialists, and lack of supplemental funding for events to showcase champions.

Table 2 Rank order of the number of sites identifying each domain as a barrier and facilitator in 2012 and 2014

Interview responses illustrated these common barriers. One team leader described the effects of limited funding on staff and their ability to serve their clients: “Zero-exclusion increases our referrals, but we don’t have the money to hire additional staff, so we have trouble bringing in the clients that are on the waiting list.” Another team leader highlighted staff turnover as a specific workforce barrier to sustainment: “Retention of employment specialists has been challenging. It may give a perception that it is not a desirable position or that it is a starter position, not something that people would want to do long-term. These positions are so different. It is hard to hire for these positions. It has not affected our fidelity scores, though.” Another team leader described some of the causes and consequences of poor agency prioritization: “… [a] lack of strong commitment and knowledge from my management above me. When we did our fidelity scale, the director of our health department said, ‘I know nothing about supported employment.’ My direct supervisor—it’s all about money for him.”

Local community barriers centered on limited public transportation (typically among rural agencies) and the struggling economy, leading to limited job availability. One team leader highlighted a relationship between local community barriers and funding: “Transportation is a big factor in how large or small we are able to offer our services. We have been resourceful about getting transportation dollars through grants. We live in a small town where so much of the town is rural, and our small transit company doesn’t go very far and it is expensive. So getting people to jobs, interviews, and around can limit what we are doing.” Leadership was the least endorsed barrier to sustainability over the 2-year period. In contrast, leadership was endorsed as a top facilitator for sustainability.

Facilitators

At least one-third of team leaders identified adequate funding, agency prioritization, leadership, workforce competence, and workflow factors (such as minimal paperwork and collaboration with clinical, vocational rehabilitation, and state mental health support services) as facilitators of sustainment at both time points. The relative importance of facilitator content domains remained fairly consistent from 2012 to 2014. Team leader comments about funding as a facilitator fell into two main categories: either the agency had several sources of funding or specific funding sources were critical (e.g., Medicaid, Department of Rehabilitation Services, Department of Mental Health). Some team leaders reported reorganizing agency funds to support their program as a solution to limited funding: “Programs within the agency help to support each other financially. If the residential program is doing well, it offsets losses in the supported employment program” and “… They allocate funding from indigent funds and general funds when needed. They believe in IPS.”

Agency prioritization included the philosophy that employment is paramount for people with mental illness, a constant stream of referrals, buy-in from case managers and psychiatrists, celebrations recognizing success, and the agency staff members working together as a team. One team leader expressed the recovery-oriented philosophy of the organization, “Employment is kept as an absolutely critical mission of the agency and is viewed as recovery.” Another team leader described their agency’s support as “…agency-wide backing. The agency leadership buys into the notion that employment is everyone’s business, and it is reinforced through all the programs.”

Leadership facilitators included support from the administration at the agency, direct involvement of the executive directors, senior administrators dedicated to the program, and state mental health and vocational rehabilitation leaders advocating for the program. One team leader described their leader’s advocacy for their employment program: “Our agency’s director of community relationships has helped to bring a lot of exposure to our supported employment program. He has helped to get an article about the program in the local newspaper and a client success story. He put together a public access program that advertised our services to employers.” Team leaders reported skilled employment specialists, strong advocacy from supervisors, integrated clinical services, and collaboration with vocational rehabilitation services as workforce and workflow facilitators. One team leader described his/her team as “…a knowledgeable team, they are go-getters, want to get it done.” Another team leader specified improvements in workflow through his/her relationship with the clinical teams: “I have very supportive clinical and case management teams that facilitate integration of IPS and mental health and who make referrals and ensure that IPS runs smoothly.”

Worries About Discontinuation of IPS Services

Between 2012 and 2014, the percentage of team leaders reporting worries about discontinuation decreased from 24% (n = 29) to 14% (n = 17) (McNemar’s test, N = 121, p = .06). In most cases, the reason team leaders gave for being worried about program discontinuation was a concern about long-term funding: 76% (n = 22) in 2012 and 65% (n = 11) in 2014.
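For illustration, the sketch below runs McNemar's test on a hypothetical paired 2 × 2 table. The row and column totals match the reported 29 (2012) and 17 (2014) worried leaders out of 121, but the split of the discordant cells is an assumption, so the computed p-value only approximates the reported result.

```python
# Illustrative sketch: McNemar's test for a change in the paired proportion of
# team leaders reporting worries about discontinuation (2012 vs. 2014).
# The discordant cells are hypothetical; only the margins (29 worried in 2012,
# 17 worried in 2014, N = 121) match the values reported in the text.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

table = np.array([
    [5, 24],   # worried in 2012: still worried in 2014 / no longer worried
    [12, 80],  # not worried in 2012: newly worried / still not worried
])

result = mcnemar(table, exact=True)  # exact binomial test on discordant pairs
print(f"p = {result.pvalue:.3f}")
```

McNemar's test uses only the discordant pairs (leaders whose answer changed between 2012 and 2014), which is why paired, rather than independent, proportions are compared here.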

Discontinued Programs

During the 2-year follow-up, only 4% of agencies (five agencies in three states) discontinued IPS services. At three sites, respondents identified financial pressures as a major factor in discontinuation. Two of these also identified the demands of meeting IPS fidelity standards and maintaining associated program records as factors in the decision to discontinue. (Both states required agencies to meet fidelity standards to receive enhanced Medicaid reimbursement rates.) The third agency experienced the loss of its agency director, a strong IPS advocate, and was subsequently acquired by a large behavioral health organization with different priorities. At the fourth agency, the site contact indicated that the IPS program had never officially started, that the agency was undergoing accreditation review, and that leaders did not want to overextend by pursuing implementation of a high-fidelity IPS program. At the fifth site, the site contact indicated that their rural medical school hospital setting was not suited to offering IPS; for their single IPS employment specialist, travel time had been a challenge.

Discussion

Team leaders at sustaining sites identified several barriers: funding, local community factors, prioritization of the program by the agency, and characteristics of the workforce. Team leaders who were worried about impending discontinuation of IPS services typically indicated funding as a primary concern, paralleling the concerns of administrators at discontinued sites. Nevertheless, team leaders reported that factors related to financial support, agency prioritization of the program, and the agency’s workforce had helped to sustain their programs. Leadership and a streamlined workflow were also seen as facilitators; notably, leadership was rarely endorsed as a barrier.

Because nearly all sites in this study sustained IPS, we believe that the facilitators were much stronger than the barriers. Accomplishing this high sustainment rate would have been difficult in the face of active opposition from agency or state leadership. Most team leaders focused on program- and agency-level actions and may not always have recognized the crucial (and often behind-the-scenes) work of state and national leadership. Nevertheless, some IPS team leaders did note the role of state leadership in overcoming barriers, as the following quote illustrates: “We had a reduction in our usual mental health state funding, but the state subsequently made an investment, a new funding mechanism through Medicaid dollars. That was an improvement, bringing in more revenue.” This quote illustrates that while a shortfall in usual funding was a barrier, the new funding stream proactively developed by state leaders was a facilitator. Good leaders prioritized IPS, found ways to overcome financial challenges, and aligned workflow and workforce factors to sustain IPS.

Strong backing from senior leadership is not always present when IPS is disseminated on a wide scale. For example, in a multi-site study of implementation of IPS within the Department of Veterans Affairs’ national network of hospitals, staff identified a lack of leadership as a primary barrier (Pogoda et al. 2011). An alternative interpretation of the high sustainment rate found in the current study is that the sustainment of all of these programs may have derived from participation in the IPS learning community, which conferred multifarious benefits on the participating states and programs (Bond et al. 2016). These benefits include fidelity and outcome monitoring, staff training through an interactive online course on IPS, collaboration with local vocational counselors, and supervision and field mentoring (Becker et al. 2014). State leaders and the learning community’s leadership help these sites use these strategies by publicly advocating for IPS, establishing policies that facilitate funding for IPS, providing technical assistance to ensure a trained workforce, and conducting fidelity reviews to promote adherence to the IPS model (Bond et al. 2016). In all likelihood, the actions of the learning community’s leadership, state governments, and local programs meshed in synergistic fashion to sustain IPS programs.

Our finding of a high rate of sustainment is atypical, but the identified factors, which together may constitute sustainability, were more typical. They align with previous empirical studies, which have identified leadership and funding as crucial factors in sustainment (Bond et al. 2014; Swain et al. 2010; Wiltsey Stirman et al. 2012), and with the literature on conceptual models (Aarons et al. 2011). However, research on the adoption, implementation, and sustainment of evidence-based practices in mental health is in its infancy, especially with regard to sustainment. Several general conceptual frameworks have been proposed to guide measurement of sustainability factors (e.g., Chambers et al. 2013; Damschroder et al. 2009; Schell et al. 2013), but none has been validated. The model used in the IPS learning community postulates that interventions at several levels are important: (1) state mental health and vocational rehabilitation agencies collaborate on policy; the State provides funding, training opportunities, and fidelity monitoring, advocates for Medicaid waivers, and collects employment outcomes for planning with local agencies; (2) local agencies provide training, supervision, team-based care, and fidelity monitoring; the agencies track outcomes and collaborate with local vocational rehabilitation services; (3) consumers and families are involved in service planning, program monitoring, and advocacy for services; and (4) the IPS Employment Center provides in-person and online training, technical assistance, and educational materials, collects data for outcome monitoring, holds telephone conferences and annual meetings, and provides research opportunities. The IPS model overlaps with and encompasses a variety of other models; however, our main goal for the current study was to examine several factors that may influence the sustainment of IPS programs.

The current study had several limitations. First, statistical comparisons between discontinued sites and sustained sites were not feasible due to the small number of discontinued sites; a longer follow-up period likely would have yielded more discontinued sites, which may have made such comparisons possible. Second, we examined a single evidence-based practice, precluding comparisons of sustainment rates between different practices that could identify practice-specific factors. Third, we interviewed a single respondent at each site, raising the issue of respondent bias; an alternative approach would be to use a web-based questionnaire and recruit multiple respondents from each site. Fourth, our assumption that the IPS team leader was the most knowledgeable informant may have been incorrect and may have introduced systematic biases in the reported barriers and facilitators to sustainment; other staff, such as the center director or state trainer, may have different insights. Fifth, our methods addressed perceived barriers and facilitators, not actual barriers and facilitators. Sixth, our sampling strategy limited sites to those in a learning community, with no comparison group outside a learning community. The learning community may have influenced the types of barriers and facilitators to sustainment of IPS programs; however, further experimental or quasi-experimental research is warranted given the lack of conclusive evidence on this topic to date.

Conclusions

This study of barriers and facilitators showed that program leaders of the 122 sustaining IPS programs, within the context of an ongoing learning community, overcame common barriers. The few non-sustained sites reported discontinuing because of funding problems and idiosyncratic reasons. We conclude that, within the context of an active multi-state learning community, secure funding, agency support for the IPS program, a strong workforce, a structured workflow, and strong agency and state leadership helped to sustain evidence-based IPS supported employment services. We recommend further study, using controlled designs, of whether these findings generalize to IPS programs outside the learning community or to other evidence-based practices.