Introduction

Research continues to produce a steady stream of innovations that can improve routine care for youth with behavioral health problems, such as anxiety, depression, trauma, and disruptive behavior problems (Weisz and Kazdin 2017). Despite the promise of such research, these findings often are not successfully translated into everyday service settings in which youth naturally exist (Dingfelder and Mandell 2011; Owens et al. 2014). Implementation research across different service sectors has shown that without deliberate efforts to bridge the science-to-practice gap through the use of implementation strategies, there is likely to be uneven uptake, use, and sustainment of research findings as part of routine practice (Proctor et al. 2013; Powell et al. 2015). In fact, research from the broader field of implementation science has estimated that two thirds of implementation efforts fail (Burnes 2004; Damschroder et al. 2009) and most have no impact on service recipient outcomes (Powell et al. 2014).

There has been a strong push among researchers and policymakers to strategically increase the availability of evidence-based practices (EBP) as part of routine service delivery in the main settings in which youth function (Fixsen et al. 2013). Schools continue to be one of these settings, as they are the primary venue in which youth receive behavioral health supports (Farmer et al. 2003). Due to greater access, reduced stigma, and the availability of professionals who can deliver needed services, schools are an ideal setting to integrate behavioral health services with academic supports (Owens et al. 2014). Researchers have developed and evaluated numerous EBP that span multiple tiers of prevention (universal, targeted, and intensive) for implementation in schools. For example, school-wide positive behavior intervention and supports (Bradshaw et al. 2010) and social–emotional learning curricula (Brackett and Rivers 2013) prevent behavioral health problems and promote success-enabling factors (Cook et al. 2015). Moreover, targeted small group interventions grounded in cognitive behavior therapy have been shown to decrease mental health problems and promote better academic-related outcomes (e.g., Lochman and Wells 2002). Last, more intensive forms of school-based treatment, such as individualized function-based behavior intervention plans (Walker et al. 2017) and therapeutic interventions (e.g., Morina et al. 2016), have been linked to reduced problem behavior and improvements in social, emotional, and academic functioning among high-risk youth. For these reasons, schools are under immense pressure from policymakers and stakeholders to deliver a continuum of EBP aimed at preventing and ameliorating behavioral health problems (Bruns et al. 2016).

Implementation Gap

Schools are confronted with an implementation gap, with the slow adoption of EBP into routine practice ultimately limiting their effects on youth outcomes (Gottfredson and Gottfredson 2002; Owens et al. 2014; Ringwalt et al. 2009). Even when EBP are selected for adoption in schools, they are infrequently implemented with fidelity or sustained over time (Gottfredson and Gottfredson 2002; Ringwalt et al. 2004). This is problematic given the demonstrated link between high-quality implementation and changes in youth social, emotional, and academic outcomes (Durlak and DuPre 2008; St. Peter Pipkin et al. 2010). Addressing the extant gap between research and practice represents a critical aspect of translational prevention science to move beyond development and efficacy studies to dissemination and implementation research that seeks to realize the potential public health impact of prevention science (Fishbein et al. 2016).

Implementation Strategies

Successful implementation efforts depend on the strategic use of implementation strategies, defined as methods or techniques used to facilitate the adoption, use, and sustainment of EBP (Proctor et al. 2013). These methods and techniques target putative contextual and individual-level mechanisms that influence implementation processes and outcomes (e.g., acceptability, appropriateness, fidelity, penetration) (Lewis et al. 2018). Implementation outcomes, the targets of implementation strategies, are distinct from service outcomes, reflect the primary dependent variables in implementation research, and are defined as the effects of deliberate and purposeful efforts to influence implementation (Proctor et al. 2009). Increasingly, implementation strategies are being developed and tested to promote the adoption, delivery, and sustainment of EBP in routine service settings (Powell et al. 2017).

Implementation research and frameworks point to a wealth of strategies that are pertinent to different phases (e.g., exploration, preparation, implementation, and sustainment; Aarons et al. 2011) and across multiple levels (e.g., outer setting, inner setting, individual implementers, the innovation/practice itself) of the implementation process (Leeman et al. 2017). For example, high-quality professional development that involves dynamic training and follow-up consultation/coaching has been shown to successfully increase providers’ delivery of EBP (Herschell et al. 2010; Sholomskas et al. 2005). Moreover, assessing readiness by examining barriers to and facilitators of implementation can inform strategic planning that targets specific implementation outcomes, such as appropriateness (i.e., the suitability or fit of a particular practice to the context) and acceptability (i.e., satisfaction with a particular practice; Weiner et al. 2017). Additionally, monitoring implementation and providing data-driven performance-based feedback can serve as an effective means for continuous improvement of implementation outcomes, such as intervention fidelity and reach (McHugh and Barlow 2010). Moreover, there is a general consensus among implementation scientists that a core aspect of implementation practice is the selection and tailoring of implementation strategies to address the barriers present within a given service setting (Powell et al. 2017). Tailoring implementation strategies typically involves an assessment of determinants that are likely to influence implementation outcomes, such as features of the intervention or practice (e.g., Good Behavior Game as a classroom management practice; Barrish et al. 
1969), context-specific determinants associated with the school setting in which the practice will be implemented (e.g., supportive leadership, protected time, connection between practice and performance evaluations, etc.), or individual-level factors associated with those expected to implement the practice (e.g., self-efficacy, beliefs and attitudes, intentions to implement). Prior research has established guides to inform the assessment of these factors (Flottorp et al. 2013; Wensing and Grol 2005), with resulting data informing the selection and tailoring of implementation strategies to context-specific barriers associated with a given setting (Wensing et al. 2011).

Keeping track of all of the available implementation strategies represents an information management problem that is likely to stall both research and practice, especially when inconsistent terminology and inadequate definitions are used in research (Proctor et al. 2013). In response, researchers have focused on identifying and refining a taxonomy of implementation strategies that could inform future implementation research, as well as real-world implementation practice efforts focused on bridging the science-to-practice gap.

Implementation research in the healthcare sector is more advanced than in other service sectors, including schools (e.g., Sanetti et al., manuscript in preparation). In fact, organized efforts have been undertaken to generate a taxonomy of implementation strategies that could be utilized within healthcare research. The Expert Recommendations for Implementing Change (ERIC) project (Waltz et al. 2015) built upon work conducted by a smaller group of implementation researchers who systematically developed an initial taxonomy of implementation strategies (Powell et al. 2012). This list was refined via a larger panel of implementation experts (Powell et al. 2015) and analyzed to examine the feasibility and importance of each implementation strategy (Waltz et al. 2015). The ERIC compilation (Powell et al. 2015) has provided a much-needed common language for implementation researchers and practitioners and has allowed for better tracking and reporting of implementation strategies within and across studies (Proctor et al. 2013).

As it stands, no comparable effort has occurred to support implementation in schools. Given that the education sector has a number of unique implementation challenges—including educational timelines, professional characteristics, policies, and organizational constraints (Forman et al. 2013; Owens et al. 2014)—it is likely that strategies designed to support clinical practice in more traditional healthcare settings (e.g., primary care, specialty mental health) will require adaptation for use in schools. In fact, Waltz et al. (2015, pp. 4–5) have advocated that there is a need to adapt strategies via expert consensus to ensure “a common nomenclature for implementation strategy terms, definitions, and categories that can be used to guide implementation research and practice.”

Adaptation to the School Context

Adaptation is a process of making changes to a method, program, practice, or finding to increase its suitability for use with a particular target population (e.g., school-based researchers and practitioners) or within a given organizational context (e.g., the educational sector; McKleroy et al. 2006). Adaptation is critical to improving the appropriateness or contextual fit of a particular innovation (Proctor et al. 2013), which has led some researchers to conclude that implementation does not occur without adaptation (Lyon and Bruns 2019). In intervention science, multiple models have been proposed to facilitate the adaptation of EBP, including making cultural and contextual changes to EBP to improve the appropriateness and relevance of the practice to the service recipients (Bernal et al. 2009). Most of these models share common features with regard to the level or depth of adaptation made to an existing practice (Barrera and González-Castro 2006; Bernal et al. 1995; Leong 1996). One level of adaptation involves surface changes that alter the label, referents, terminology, and/or examples used to describe the practice to ensure the language facilitates comprehension, contextual appropriateness, and usability by the intended end users of the innovation who operate in a specific context (Resnicow et al. 1990). In the education sector, these end users include school-based implementation researchers and practitioners whose work focuses on the translation of EBP into routine service delivery through the use of implementation strategies. Another level of adaptation refers to deeper changes made to the substance of the practice that alter its meaning in a way that departs from the original content to increase its relevance and appropriateness within the specific context in which it will be deployed (McKleroy et al. 2006; Resnicow et al. 1990).
Although many implementation strategies are generic and applicable across contexts, including schools, the educational sector is a unique service setting with its own nomenclature for communicating information, contextual constraints (e.g., professional roles, scheduling), and features (e.g., teacher unions, school boards) that render particular strategies more or less relevant and appropriate. Considering the above, to enhance the comprehension, contextual appropriateness, and utility of the ERIC strategy compilation in the educational sector, there is a need to engage in an iterative adaptation process.

Purpose of This Study

The purpose of this study was to initiate the School Implementation Strategies, Translating ERIC Resources (SISTER) Project by iteratively adapting the ERIC strategy compilation to derive a taxonomy of implementation strategies with relevance to the education sector. Consistent with the initial study procedures used to inform the eventual development of the ERIC compilation (Powell et al. 2012), a small group of implementation experts convened over multiple occasions to systematically and iteratively adapt the existing compilation for use in schools. The aim was to produce a SISTER strategy compilation that could inform subsequent research examining and comparing the feasibility and importance of the implementation strategies for use in the school context, as well as investigations exploring the mechanisms through which particular strategies work (Lewis et al. under review). Additionally, a sub-aim of this study was to model a process that could be used by implementation researchers working in other sectors to successfully leverage existing implementation products and resources and adapt them to their targeted setting.

Method

Expert Participants

Consistent with the process used to generate the original implementation strategy compilation (Powell et al. 2012) that informed the development of the refined ERIC compilation (Powell et al. 2015), this study included a small subset of implementation experts to develop a school-adapted taxonomy of implementation strategies that is applicable to the education sector. The participants in this study included three PhD-level experts with extensive experience conducting implementation research in schools and two of the lead researchers from the ERIC project. These five experts engaged in an iterative adaptation process, with multiple rounds of revisions and feedback. The three school-based implementation experts took the lead on making changes to the original ERIC strategy compilation to enhance the comprehensibility and appropriateness of the strategies, while the two lead ERIC researchers provided feedback on the changes made to the implementation strategies to maintain conceptual consistency with the original strategies. This process was repeated three times until consensus was reached.

Original ERIC Strategy Compilation

The refined ERIC strategy compilation (Powell et al. 2015) was used to develop a school-adapted taxonomy of implementation strategies—the SISTER compilation. The revised ERIC compilation was generated based on input from an expert panel of implementation researchers and practitioners, with nearly two thirds of the experts affiliated with the Veterans Administration healthcare system. A modified Delphi approach involving three rounds of iterative revision was applied to the published 68-strategy taxonomy of Powell et al. (2012) to develop a revised compilation based on expert consensus. Consistent with the Delphi approach, experts engaged in structured conversations across multiple rounds to iteratively refine and reach consensus on revisions to the original compilation (Dalkey and Helmer 1963). This process resulted in the expert panel reaching consensus on a final compilation of 73 implementation strategies. These 73 implementation strategies span multiple levels of the service delivery context (inner setting, outer setting) and stages/phases of the implementation process (exploration, preparation, implementation, sustainment) and target different stakeholders involved in the uptake and use of EBP (leaders, implementers, recipients, other stakeholders). The revised ERIC compilation has informed subsequent research examining the feasibility and importance of implementation strategies for use in particular service sectors, the classification of strategies into conceptual categories, and the linking of strategies to specific barriers to advance tailored implementation (Powell et al. 2017; Waltz et al. 2015).

Procedure

Prior to conducting this study, IRB approval was sought, and the university IRB determined that this study was exempt. An iterative adaptation process was developed to systematically examine and make changes to the revised ERIC compilation of 73 implementation strategies to create the SISTER strategy compilation. A key aspect of the adaptation process included the recruitment and participation of two of the developers from the ERIC project to serve as independent experts who provided feedback at specific points. All changes to extant ERIC strategies considered the common language and unique constraints and features of the school context to increase comprehension, contextual appropriateness, and utility for school-based implementation researchers and practitioners. The adaptation protocol proceeded systematically along a series of seven sequential steps: (1) review of existing implementation strategies by school-based implementation experts to revise the language, referents, and terminology to be consistent with the school context; (2) modification or expansion of examples to increase comprehension regarding how each strategy is applicable to EBP implementation in the school context; (3) removal of implementation strategies determined to be contextually inappropriate to the school context or redundant with other strategies as they manifest in schools; (4) addition of novel implementation strategies not included in the ERIC compilation that have evidence to enhance EBP implementation in schools; (5) review of and feedback on the school-adapted compilation by ERIC investigators to ensure conceptual consistency with the original strategies; (6) additional revision, based on feedback from the ERIC developers, to ensure conceptual consistency with the original strategies and increase comprehension, contextual appropriateness, and utility to the school context; and (7) re-review by the ERIC developers and finalization of an initial set of school-adapted implementation strategies.

The analytic approach consisted of categorizing the nature of revisions made to each of the strategies as either no change, surface change, or deep change. Further, we recorded the specific features of the strategy that were modified, including (a) changes to the label of the strategy, (b) changes to the referents used to contextualize the strategy (e.g., replacing agency with school or district or replacing clinician with teacher), (c) changes to the terminology used within the definition of the strategy, and/or (d) changes to the examples used to illustrate the strategy. No change referred to strategies that remained unaltered and, therefore, included the same label, referents, terminology, and examples as the original ERIC strategy. Surface-level changes reflected relatively minor changes that did not depart from the meaning of the original strategy but were made to increase contextual appropriateness for school-based researchers and practitioners. Specific surface-level changes were recorded, which included changes to the strategy label, referents (e.g., school personnel instead of clinician or school instead of clinic or agency), terminology (e.g., new practice instead of clinical innovation), or parenthetical and nonparenthetical examples in the strategy description. Deep change was used to categorize strategies that underwent more substantial modifications that altered the meaning of the strategy in a way that departed from the original ERIC strategy. For all strategies that underwent deep changes, the specific reason for the deep change was recorded.
Additionally, strategies that were deleted from or added to the ERIC taxonomy were recorded in order to tabulate the number of strategies deemed irrelevant or inappropriate to the school context, as well as those novel strategies that were not included in the ERIC compilation but that educational research has identified as methods or procedures that could impact the successful uptake and use of EBP in schools. After completing the iterative adaptation process, to examine patterns in the types of modifications, we synthesized the different changes (no change, surface, deep, deleted, added) according to each of the strategy categories derived from Waltz et al. (2015). Waltz et al. (2015) used expert ratings and concept mapping (Kane and Trochim 2007) to derive the following nine strategy categories: (1) use evaluative and iterative strategies; (2) provide interactive assistance; (3) adapt and tailor to context; (4) develop stakeholder relationships; (5) train and educate stakeholders; (6) support educators (the word “clinicians” from the original was replaced with “educators”); (7) engage consumers; (8) use financial strategies; and (9) change infrastructure. These categories were used to organize a side-by-side comparison of the ERIC and SISTER compilations, as well as to examine patterns in the types of modifications made to the original strategies.

Results

The results of the adaptation process are depicted in Tables 1, 2, 3, 4, 5, 6, 7, 8, and 9, which include a side-by-side comparison of the ERIC strategy compilation and the adapted SISTER compilation for each of the nine conceptual categories. The strategies are organized in alphabetical order within each of the conceptual categories. After applying the iterative adaptation process to the ERIC strategy compilation, the final SISTER compilation included a total of 75 unique implementation strategies. Below is a detailed account of the adaptations that were made to generate the 75 strategies included in the SISTER compilation.

Table 1 Adaptations to strategies falling under use evaluative and iterative strategies
Table 2 Adaptations to strategies falling under provide interactive assistance
Table 3 Adaptations to strategies falling under adapt and tailor to context
Table 4 Adaptations to strategies falling under develop stakeholder interrelationships
Table 5 Adaptations to strategies falling under train and educate stakeholders
Table 6 Adaptations to strategies falling under support clinicians
Table 7 Adaptations to strategies falling under engage consumers
Table 8 Adaptations to strategies falling under use financial strategies
Table 9 Adaptations to strategies falling under change infrastructure

Strategy Adaptation

No Change

Out of the 73 ERIC strategies, 11 remained unaltered with no surface-level changes made to the label, referents, terminology, and/or examples. Representative example strategies that were deemed to generalize well to the educational sector in their current form included the following: access new funding: access new or existing money to facilitate the implementation (strategy no. 60); visit other sites: visit sites where a similar implementation effort has been considered successful (strategy no. 36); and develop academic partnerships: partner with a university or academic unit for the purposes of shared training and bringing research skills to an implementation project (strategy no. 24).

Overall Changes

Results from the coding indicated that changes were made to 57 (78%) of the original ERIC strategies. Changes included the following: (a) 28 strategies with label changes, (b) 39 strategies with changes to the referent (e.g., teacher instead of clinician or school instead of agency), (c) 50 strategies with changes to terminology used to describe the strategy, and (d) 17 strategies with changes to the examples used to illustrate the strategy.

Surface Change

For 52 of the 57 adapted strategies, only surface-level changes were made. In total, 147 unique surface-level changes were made to the labels, referents, terminology, or examples to increase the comprehension and appropriateness of these 52 ERIC strategies. On average, there were roughly 2.5 surface-level changes to each of the adapted strategies, with a range of one surface-level change (e.g., terminology change to strategy no. 45 shadow other experts) to four surface-level changes (e.g., label, referent, terminology, and example changes to strategy no. 50 facilitate relay of clinical data to providers). Specifically, changes to the label were made to 33 of the strategies, with examples including changing “Remind clinicians” to “Remind school personnel” and changing “Facilitate relay of clinical data to providers” to “Facilitate relay of intervention fidelity and student data to school personnel.” Further, changes to the referents (implementers, service recipients, or service setting) in the strategy were made to 40 ERIC strategies. The most common referent changes consisted of replacing “clinician” with “school personnel” (13 times), “sites or organizations” with “school or district” (25 times), and “consumer/patient” with “students and/or families” (25 times). Out of all the surface-level changes, the most common consisted of modifications to the terminology used to describe the strategy, with a total of 55 of the ERIC strategies undergoing terminology changes. Changes to terminology included using “new practice” instead of “clinical innovation” and adding terminology that represents common language used by school-based researchers and practitioners. Last, changes or additions to examples in the definition (parenthetical and nonparenthetical) were applied to 19 of the ERIC strategies to increase understanding of how the strategy could be applied in the school context. For example, for strategy no.
38, an expanded parenthetical example was provided to better describe the type of trained person who could conduct an educational outreach visit to support the implementation of a new practice.

Deep Change

Deep changes were made to five of the ERIC strategies and resulted in modifications that altered the core meaning of the adapted strategy in a way that departed from the original. These deep changes were made in addition to the surface-level changes described above. Three of these deep changes were made to strategies involving the use of financial mechanisms to influence implementation outcomes, which the school-based implementation experts agreed were inappropriate to the school context. However, these financial strategies had parallels to the school context and, thus, underwent deep changes. For example, develop disincentives (strategy no. 63) was preserved but altered to remove reference to financial penalties and instead describe disincentives that are more appropriate to the school context, such as a write-up in one’s professional file, a meeting with the administrator to discuss insufficient implementation, and participation in additional professional development for failure to implement or use the new practices. Moreover, make billing easier (strategy no. 65) was maintained but substantially altered to make implementation easier by removing burdensome documentation tasks, as the latter is a more contextually appropriate strategy to reduce burdens that impede educators’ implementation efforts. Another strategy underwent deep change because it involved changing liability laws (strategy no. 67), which currently do not exist in education to the extent they do in healthcare (e.g., there is no educational malpractice statute like there is in medicine). Thus, the strategy was altered to reflect changes in ethical and professional standards of practice, which represents an implementation strategy that is more appropriate to how schools operate. Last, the ERIC strategy develop and implement tools for quality monitoring (strategy no.
7) had deep changes made to it because it included a diffuse set of recommendations (changes to language, protocols, algorithms, standards, and measures of processes, patient/consumer outcomes, and implementation outcomes) that would limit its appropriateness and usability in the school context. Thus, deep changes were made to narrow the focus of the strategy and make it more appropriate to the school context.

Deleted

A total of five ERIC strategies were deleted and not included in the final SISTER strategy compilation due to consensus that they were not appropriate to the school context. Three of the five strategies were deleted because they involved methods or techniques targeting the manipulation of financial structures to facilitate implementation outcomes. Due to the unique constraints of educational settings, such as school boards, compulsory attendance, and educational policy, financial strategies such as fee-for-service, use capitated payments, and use other payment schemes are neither applicable nor appropriate to the school context (Lyon et al. 2018). One strategy, use an implementation advisor, was deleted due to redundancy, given its overlap with and lack of distinction from other strategies. Last, revising professional roles was removed from inclusion in the SISTER compilation because of the contextual inappropriateness of revising educators’ roles in the context of schools. Teachers, for example, have highly prescriptive roles and credentials that prohibit shifting or revising their roles with other educators (e.g., with a school counselor, or a special education teacher with a general education teacher; Herlihy and Corey 2006; Urbach et al. 2015).

Added

A deliberate scan of the ERIC compilation to identify missing strategies resulted in a total of seven new strategies being added to the SISTER compilation: (a) develop local policy that supports implementation (strategy no. 72), (b) improve implementers’ buy-in (strategy no. 51), (c) peer-assisted learning (strategy no. 13), (d) pre-correction prior to implementation (strategy no. 52), (e) pruning competing initiatives (strategy no. 74), (f) targeting/improving implementer well-being (strategy no. 54), and (g) test-drive and select practices (strategy no. 18). These strategies were included based on knowledge of findings from school-based research on different methods and techniques used across multiple levels (e.g., policy to individual implementers) to facilitate implementation. Expanded definitions of each of these newly added strategies are included in the tables.

Strategy Changes by Category

The types of modifications made for each of the nine conceptual strategy categories of Waltz et al. (2015) are depicted in Table 10. Proportionally, the category of financial strategies underwent the most significant modifications, with two thirds of the strategies undergoing deep changes that modified their meaning (n = 3, 33%) or being deleted from the SISTER compilation (n = 3, 33%). Strategies were deleted from only three of the nine categories (Develop stakeholder relationships, Support educators, and Financial strategies), while new strategies were added to four of the nine categories (Provide interactive assistance, Adapt and tailor to context, Support educators, Change infrastructure). Three of the nine categories included strategies that required deep changes that altered their meaning from the original ERIC strategy (Develop stakeholder relationships n = 1, 5%; Support educators n = 1, 20%; Financial strategies n = 3, 33%).

Table 10 Types of modifications according to established conceptual strategy categories (Powell et al. 2015)

Discussion

The identification, deployment, and testing of implementation strategies are critical to advancing implementation science and practice. This study iteratively adapted the refined ERIC strategy compilation (Powell et al. 2015) for use by school-based implementation researchers and practitioners. Application of the iterative adaptation process resulted in 11 of the 73 ERIC strategies requiring no modification, 52 undergoing surface-level changes only, and five needing deep changes. Five strategies were deleted and seven new strategies were added, resulting in a total of 75 unique school-based implementation strategies.

Dissemination of this study’s findings is important to ensure that school-based implementation researchers and practitioners become aware of the full range of implementation strategies available to support the uptake, delivery, and sustainment of EBP, given that the majority of efforts to change routine practice fail (Burnes 2004; Damschroder et al. 2009). Dissemination of implementation strategies is critical to establish a common nomenclature among prevention scientists engaged in school-based research and to develop a generalizable knowledge base to answer key questions, such as “What strategy worked under what conditions, and how did it work?” Akin to intervention science, clear labels and definitions of implementation strategies will facilitate more precise assessment and reproducibility in research and practice (Proctor et al. 2013). For example, the SISTER compilation may enable prevention scientists to more accurately identify and track the core implementation strategies they deploy in efficacy studies (e.g., conduct ongoing training, provide local technical assistance, provide ongoing consultation) to support the successful uptake and delivery of EBP with fidelity, strategies that otherwise go unreported, resulting in a greater likelihood of replication across studies and investigative groups (Boyd et al. 2017; Bunger et al. 2017). Further, capturing the types of strategies that are needed to promote effective implementation (e.g., identify and prepare champions, alter and provide system- and individual-level incentives, provide practice-specific supervision) will be critical to support both indigenous school personnel (e.g., school psychologists, social workers) and EBP purveyors (e.g., external organizations that provide training and technical assistance on a given EBP) in facilitating the successful translation of EBP into everyday practice when strict oversight and control by researchers is lessened or not available (i.e., effectiveness research).

Emerging Patterns by Strategy Category

When examining patterns in the types of modifications made to strategies according to the conceptual categories of Waltz et al. (2015), several interesting findings emerged. First, consistent with the above, the strategy category with the most substantial modifications was financial strategies, with two thirds of the strategies (six out of nine) either being deeply modified or deleted from the SISTER compilation. Financial strategies are largely inappropriate for use in schools due to unique policies, collective bargaining arrangements (i.e., unions and contracts), and compensation schemes (Lyon et al. 2019). These findings suggest that certain types of implementation strategies may be more bound to a specific service sector and, thus, less transmittable across contexts that differ in how services are accessed (e.g., fee for service) and how providers are incentivized to implement new practices. Some of the financial strategies had parallels, however, in the school context. For example, although financial disincentives are inappropriate for use in schools, the broader notion of creating disincentives for lackluster implementation is appropriate for application in schools. Indeed, creating situations that educators want to avoid (e.g., a meeting with the site administrator at an inconvenient time to discuss lackluster implementation) as a way of promoting greater uptake and delivery of EBP has been found to be effective in schools (DiGennaro et al. 2005).

Second, there were four strategy categories (provide interactive assistance, adapt and tailor to context, train and educate stakeholders, and engage consumers) that underwent minimal modifications to increase their comprehension, contextual appropriateness, and utility for implementation researchers and practitioners operating in schools. Strategies that fall under these categories may be agnostic to the service delivery context and, therefore, more generalizable to a variety of implementation scenarios, settings, and providers. For example, there is consensus among researchers and practitioners across different service sectors that the category of train and educate stakeholders is relevant and necessary whether one is functioning within the context of healthcare, behavioral health, or education (Beidas and Kendall 2010; Grol 2001; Lyon et al. 2017; Stahmer et al. 2015), as stakeholders need to have knowledge of the underlying reasons why the EBP is needed, what the EBP entails, and what implementation looks like. Moreover, it is clear across service contexts, including schools, that providing interactive assistance, ongoing support via technical assistance, facilitation, and supervision, is critical to promote frontline providers’ (e.g., nurses, mental health providers, or teachers) uptake and delivery of EBP (Cook and Odom 2013; Lyon et al. 2017; Stetler et al. 2006).

Last, the seven newly generated strategies were classified into only four of the nine conceptual strategy categories, with most additions falling under supporting educators (n = 3) and changing infrastructure (n = 2). This finding speaks to the overall representativeness of the refined ERIC strategy compilation (Powell et al. 2015), as relatively few new strategies were generated, and these were classified into a small subset of the conceptual categories. This finding also indicates that certain strategy categories, like supporting educators and changing infrastructure, may have greater room for innovation regarding the generation of additional individual- and contextual-level strategies to support implementation. The generation of additional strategies for inclusion in strategy compilations should continue to be guided by consensus-driven procedures using the best available evidence, including efforts to classify new strategies under existing conceptual categories to clarify how they fit within the more comprehensive collection of strategies.

Addition of New Strategies

The rationale for including additional unique strategies in schools that were missing from the ERIC compilation warrants further discussion. Develop local policy that supports implementation was added based on research findings related to universal prevention efforts, such as school-wide positive behavior intervention and supports, suggesting that changes to school discipline policy lead to changes in adult behavior regarding how educators effectively respond to problem behavior (Horner et al. 2017). Improve implementers’ buy-in was included based on emerging evidence linking changes in educator beliefs and attitudes to implementation intentions and behaviors (Cook et al. 2015). Peer-assisted learning was added in light of research suggesting that peer learning networks or collaborative frameworks facilitate reflective practice and provide educators with a form of peer accountability that enhances the implementation of academic and behavioral supports (Kohler et al. 1997; Vescio et al. 2008). Pre-correction prior to implementation was generated based on evidence that antecedent strategies delivered immediately before an implementation opportunity facilitate educators’ successful delivery of an EBP (Cook et al. 2017a, b). Pruning competing initiatives reflects strategic de-adoption practices to offset the potential for implementation overload and was included as a strategy to make room for frontline providers to prioritize the implementation of a new program or practice (Abrahamson 2004; Nadeem and Ringle 2016). Targeting/improving implementer well-being has recently emerged as an implementation strategy, with findings showing that reductions in stress and burnout lead to improved intentions to implement and actual use of EBP by teachers (Cook et al. 2017a, b; Larson et al., under review). Last, test-drive and select practices incorporates implementer choice/preference in the selection of an EBP and has shown promise as a technique for improving fidelity among educators who are initially resistant to adopting and delivering a new practice (Dart et al. 2012; Johnson et al. 2014).

Although these additions were identified with the school context in mind, most of them are likely to be applicable to other service sectors. For example, efforts to promote implementer buy-in prior to and during an implementation effort are likely facilitative of implementation outcomes across other service sectors focused on promoting youth behavioral health outcomes, such as healthcare, child welfare, juvenile justice, and public health (e.g., Russ et al. 2014). Moreover, stress and burnout among implementers are not unique barriers to implementation in schools (e.g., Khamisa et al. 2013). Thus, efforts targeting stress and burnout reduction are likely to help promote providers’ well-being and may serve to increase their intentions to adopt and deliver clinical innovations (Damian et al. 2017). In the multidisciplinary spirit of implementation science, strategies facilitative of implementation outcomes in one context may ultimately be appropriate and have utility beyond the setting in which they were originally developed.

Implications

This study has notable implications for prevention scientists dedicated to improving youth access to high-quality behavioral health services in schools. First, although implementation science is far less advanced in the educational sector than in other fields (Sanetti et al., manuscript in preparation), lagging behind other sectors can be viewed as an opportunity for strategic adaptation of established implementation tools and resources. Service sectors with lagging research, such as education, are well-positioned to take advantage of extant findings from other service sectors, such as healthcare, by strategically adapting those findings for use in a novel context. As highlighted in this study, such adaptation involves capitalizing on the trailblazing work of implementation scientists and practitioners operating in other service sectors to generalize their findings to a novel service setting, such as schools. To support these advancements, school-based prevention scientists must strive to keep informed of implementation research outside of their own discipline to identify existing findings that could be strategically adapted for use in their specific context. As an example, in the area of measurement, researchers in child welfare and youth mental health have developed pragmatic tools to assess key factors of the inner organizational context (i.e., the microsystem in which implementation happens) that are most proximal to providers’ implementation behaviors (Aarons et al. 2014; Ehrhart et al. 2014, 2015), and these measures have been adapted for use in school-based implementation research and practice (Lyon et al. 2019).

Establishing an adapted compilation of implementation strategies has implications for deepening understanding of which strategies are most commonly needed, feasible to deploy, and effective across implementation efforts. Existing implementation strategies are not necessarily equal, as some may require more resources (i.e., time, money, and energy) to deploy, some may be more or less effective, and some may be needed more frequently. Thus, there is a need to examine pragmatic dimensions of strategies that impact their likely use among implementation practitioners. Otherwise, school-based implementation research risks replicating the very divide it seeks to address between what research indicates works and what gets adopted in everyday service settings (Lyon et al. 2019). Similar to the work undertaken with the ERIC compilation (Waltz et al. 2015), researchers should examine experts’ and practitioners’ perceptions of the feasibility and impact of strategies to identify those that are low burden to deploy yet likely to influence EBP implementation.

The SISTER strategy compilation, as well as other published taxonomies, has implications for identifying the subset of strategies that are most frequently needed by implementation practitioners within a given service setting. One starting place is to link strategies to the most commonly encountered malleable determinants (i.e., barriers or facilitators) that impact successful EBP implementation, as one approach to tailored implementation involves targeting strategies to specific barriers identified in a given context. Pareto’s law of the Vital Few (Bookstein 1990), which captures the natural distribution of problems for particular phenomena in order to distill them to a core set, suggests that a smaller subset of barriers (e.g., 20%) likely accounts for the majority of implementation issues encountered (e.g., 80%). This may prove quite useful, given that 601 plausible determinants of implementation have been identified (Krause et al. 2014). If this law holds true, then researchers and practitioners need to identify the vital determinants that account for the majority of implementation failures. Barriers that are frequently encountered and malleable could be ideal targets for developing more pragmatic approaches to tailoring strategies to a given setting (Locke et al. 2016). Researchers have identified four methodologies that could help inform tailoring implementation strategies to context: concept mapping, group model building, conjoint analysis, and intervention mapping (Powell et al. 2017). These methodologies can provide greater guidance on how to link implementation strategies more precisely to (a) stages of the implementation process (e.g., exploration, preparation, implementation, and sustainment; Aarons et al. 2011), (b) determinants that serve as barriers to implementation (e.g., insufficient knowledge of or motivation to implement the new innovation), and (c) measures to monitor specific implementation outcomes (e.g., appropriateness, intervention fidelity, penetration/reach) to inform data-driven improvement decisions. Further streamlining of implementation strategies may come from emerging efforts to detail the mechanisms through which strategies influence implementation outcomes (e.g., Lewis 2017; Williams 2016). Similar to the push to identify mechanisms of action in intervention science (Kazdin 2007), identifying and testing implementation mechanisms holds promise for eliminating strategies (or strategy components) that do not operate through the strongest pathways of action. Researchers also have begun to outline methodologies for developing and testing specific strategy–mechanism–outcome linkages (e.g., Lyon et al. 2016), which have relevance to work in the education sector. Research focused on determining how to tailor implementation strategies to a given context will hopefully yield more efficient and effective approaches to implementation.

We believe that existing taxonomies, like the original ERIC, need to be adapted to the specific service sector in which they will be used, as adaptation helps ensure that products and ideas are comprehensible and appropriate to stakeholders (e.g., researchers, practitioners, and policymakers) operating in that sector (Bernal et al. 2009). In the area of children’s mental and behavioral health, education, community mental health, juvenile justice, primary care, and child welfare represent the main child-serving sectors in which children receive services. Thus, we anticipate that the number of adapted compilations would mirror the number of child-serving sectors. It is also important that findings stemming from adaptation efforts, like SISTER, be fed back to the original source to potentially expand and refine the ERIC compilation.

Limitations and Directions for Future Research

This study has several limitations. First, this initial study did not include as comprehensive a group of experts as the original ERIC effort, which included a total of 71 implementation research and practitioner experts. Future research on the SISTER compilation will seek to expand the representativeness of the research and practitioner experts who provide input on the compilation and offer recommendations to inform its pragmatic use as part of real-world implementation efforts. This would ideally include input from implementation practitioners or intermediaries (e.g., external organizations or individuals who are EBP champions and use the science of implementation to support real-world implementation efforts; Franks and Bory 2015) working in real-world educational settings. Second, the adaptation process employed was not predicated on a widely established approach. Rather, the seven-step adaptation process was constructed for the purposes of this study due to the lack of a widely accepted approach for adapting existing research findings for use in novel contexts. Researchers may use the adaptation process in this study as a starting point to establish a more rigorous approach through expert consensus-driven procedures. Third, although there are systematic reviews of the school-based literature examining the use and effects of consultation and coaching on implementation, there are no comprehensive reviews of implementation strategies. Such reviews will be an important follow-up to the work presented in this paper. Last, this study provides no guidance to facilitate decision-making regarding the selection and use of strategies in response to particular implementation scenarios. This lack of empirical guidance is notable in school-based behavioral health relative to other service sectors (Novins et al. 2013), as there are few experimental studies or comparisons of implementation strategies.

Conclusion

Implementation strategies are essential to effectively incorporate EBP into school-based behavioral health service delivery and improve outcomes for youth. This study established the initial school-adapted SISTER strategy compilation, which will hopefully provide a common language and stimulate future implementation research in the education sector. The SISTER compilation provides a useful starting place to move school-based behavioral health forward. Eventually, we hope to arrive at a place of greater understanding among implementation researchers and practitioners regarding when and how to select implementation strategies for new circumstances. Nevertheless, it is unlikely that the current SISTER compilation reflects the full set of relevant and useful implementation strategies in the education sector. As our research and collaborations in this area, and the field of implementation science more generally, continue to advance, we anticipate that further revisions will be made to this list. Moreover, we are hopeful that prevention scientists will scale out this work by adapting it to novel child-serving sectors for use by researchers and practitioners seeking to advance EBP implementation.