Abstract
Schools, like other service sectors, are confronted with an implementation gap, with the slow adoption and uneven implementation of evidence-based practices (EBP) as part of routine service delivery, undermining efforts to promote better youth behavioral health outcomes. Implementation researchers have undertaken systematic efforts to publish taxonomies of implementation strategies (i.e., methods or techniques that are used to facilitate the uptake, use, and sustainment of EBP), such as the Expert Recommendations for Implementing Change (ERIC) Project. The 73-strategy ERIC compilation was developed in the context of healthcare and largely informed by research and practice experts who operate in that service sector. Thus, the comprehensibility, contextual appropriateness, and utility of the existing compilation to other service sectors, such as the educational setting, remain unknown. The purpose of this study was to initiate the School Implementation Strategies, Translating ERIC Resources (SISTER) Project to iteratively adapt the ERIC compilation to the educational sector. The seven-step adaptation process resulted in 75 school-adapted strategies. Surface-level changes were made to the majority of the original ERIC strategies (52 out of 73), while five of the strategies required deeper modifications for adaptation to the school context. Five strategies were deleted and seven new strategies were added based on existing school-based research. The implications of this study’s findings for prevention scientists engaged in implementation research (e.g., creating a common nomenclature for implementation strategies) and limitations are discussed.
Introduction
Research continues to produce a steady stream of innovations that can improve routine care for youth with behavioral health problems, such as anxiety, depression, trauma, and disruptive behavior problems (Weisz and Kazdin 2017). Despite the promise of such research, these findings often are not successfully translated into everyday service settings in which youth naturally exist (Dingfelder and Mandell 2011; Owens et al. 2014). Implementation research across different service sectors has shown that without deliberate efforts to bridge the science-to-practice gap through the use of implementation strategies, there is likely to be uneven uptake, use, and sustainment of research findings as part of routine practice (Proctor et al. 2013; Powell et al. 2015). In fact, research from the broader field of implementation science has estimated that two thirds of implementation efforts fail (Burnes 2004; Damschroder et al. 2009) and most have no impact on service recipient outcomes (Powell et al. 2014).
There has been a strong push among researchers and policymakers to strategically increase the availability of evidence-based practices (EBP) as part of routine service delivery in the main settings in which youth function (Fixsen et al. 2013). Schools continue to be one of these settings, as they are the primary venue in which youth receive behavioral health supports (Farmer et al. 2003). Due to greater access, reduced stigma, and the availability of professionals who can deliver needed services, schools are an ideal setting to integrate behavioral health services with academic supports (Owens et al. 2014). Researchers have developed and evaluated numerous EBP that span multiple tiers of prevention (universal, targeted, and intensive) for implementation in schools. For example, school-wide positive behavior intervention and supports (Bradshaw et al. 2010) and social–emotional learning curricula (Brackett and Rivers 2013) prevent behavioral health problems and promote success-enabling factors (Cook et al. 2015). Moreover, targeted small group interventions grounded in cognitive behavior therapy have been shown to decrease mental health problems and promote better academic-related outcomes (e.g., Lochman and Wells 2002). Last, more intensive forms of school-based treatment, such as individualized function-based behavior intervention plans (Walker et al. 2017) and therapeutic interventions (e.g., Morina et al. 2016), have been linked to reduced problem behavior and improvements in social, emotional, and academic functioning among high-risk youth. For these reasons, schools are under immense pressure from policymakers and stakeholders to deliver a continuum of EBP that target preventing and ameliorating behavioral health problems (Bruns et al. 2016).
Implementation Gap
Schools are confronted with an implementation gap, with the slow adoption of EBP into routine practice ultimately limiting their effects on youth outcomes (Gottfredson and Gottfredson 2002; Owens et al. 2014; Ringwalt et al. 2009). Even when EBP are selected for adoption in schools, they are infrequently implemented with fidelity or sustained over time (Gottfredson and Gottfredson 2002; Ringwalt et al. 2004). This is problematic given the demonstrated link between high-quality implementation and changes in youth social, emotional, and academic outcomes (Durlak and DuPre 2008; St. Peter Pipkin et al. 2010). Addressing the extant gap between research and practice represents a critical aspect of translational prevention science to move beyond development and efficacy studies to dissemination and implementation research that seeks to realize the potential public health impact of prevention science (Fishbein et al. 2016).
Implementation Strategies
Successful implementation efforts depend on the strategic use of implementation strategies (i.e., methods or techniques that are used to facilitate the adoption, use, and sustainment of EBP; Proctor et al. 2013). These methods and techniques target putative contextual and individual-level mechanisms that influence implementation processes and outcomes (e.g., acceptability, appropriateness, fidelity, penetration) (Lewis et al. 2018). Implementation outcomes, the targets of implementation strategies, are distinct from service outcomes, reflect the primary dependent variables in implementation research, and are defined as the effects of deliberate and purposeful efforts to influence implementation (Proctor et al. 2009). Increasingly, implementation strategies are being developed and tested to promote the adoption, delivery, and sustainment of EBP in routine service settings (Powell et al. 2017).
Implementation research and frameworks point to a wealth of strategies that are pertinent to different phases (e.g., exploration, preparation, implementation, and sustainment; Aarons et al. 2011) and across multiple levels (e.g., outer setting, inner setting, individual implementers, the innovation/practice itself) of the implementation process (Leeman et al. 2017). For example, high-quality professional development that involves dynamic training and follow-up consultation/coaching has been shown to successfully increase providers’ delivery of EBP (Herschell et al. 2010; Sholomskas et al. 2005). Moreover, assessing readiness by examining barriers to and facilitators of implementation can inform strategic planning that targets specific implementation outcomes, such as appropriateness (i.e., the suitability or fit of a particular practice to the context) and acceptability (i.e., satisfaction with a particular practice; Weiner et al. 2017). Additionally, monitoring implementation and providing data-driven performance-based feedback can serve as an effective means for continuous improvement of implementation outcomes, such as intervention fidelity and reach (McHugh and Barlow 2010). Finally, there is general consensus among implementation scientists that a core aspect of implementation practice is the selection and tailoring of implementation strategies to address the barriers present within a given service setting (Powell et al. 2017). Tailoring implementation strategies typically involves an assessment of determinants that are likely to influence implementation outcomes, such as features of the intervention or practice (e.g., Good Behavior Game as a classroom management practice; Barrish et al. 1969), context-specific determinants associated with the school setting in which the practice will be implemented (e.g., supportive leadership, protected time, connection between practice and performance evaluations, etc.), or individual-level factors associated with those expected to implement the practice (e.g., self-efficacy, beliefs and attitudes, intentions to implement). Prior research has established guides to inform the assessment of these factors (Flottorp et al. 2013; Wensing and Grol 2005), with resulting data informing the selection and tailoring of implementation strategies to context-specific barriers associated with a given setting (Wensing et al. 2011).
Keeping track of all of the implementation strategies represents an information management problem that is likely to stall both research and practice, especially when inconsistent terminology and inadequate definitions are used in research (Proctor et al. 2013). Researchers have focused on identifying and revising a taxonomy of implementation strategies that could inform future implementation research, as well as real-world implementation practice efforts focused on bridging the science-to-practice gap.
Implementation research in the healthcare sector is more advanced than other service sectors, including schools (e.g., Sanetti et al., manuscript in preparation). In fact, organized efforts have been undertaken to generate a taxonomy of implementation strategies that could be utilized within healthcare research. The Expert Recommendations for Implementing Change (ERIC) project (Waltz et al. 2015) was built upon work conducted by a smaller group of implementation researchers who systematically developed an initial taxonomy of implementation strategies (Powell et al. 2012). This list was refined via a larger panel of implementation experts (Powell et al. 2015) and analyzed to examine the feasibility and importance of each implementation strategy (Waltz et al. 2015). The ERIC compilation (Powell et al. 2015) has provided a much-needed common language for implementation researchers and practitioners and allowed for better tracking and reporting implementation strategies within and across studies (Proctor et al. 2013).
As it stands, no comparable effort has occurred to support implementation in schools. Given that the education sector has a number of unique implementation challenges—including educational timelines, professional characteristics, policies, and organizational constraints (Forman et al. 2013; Owens et al. 2014)—it is likely that strategies designed to support clinical practice in more traditional healthcare settings (e.g., primary care, specialty mental health) will require adaptation for use in schools. In fact, Waltz et al. (2015, pp. 4–5) have advocated that there is a need to adapt strategies via expert consensus to ensure “a common nomenclature for implementation strategy terms, definitions, and categories that can be used to guide implementation research and practice.”
Adaptation to the School Context
Adaptation is a process of making changes to a method, program, practice, or finding to increase its suitability for use with a particular target population (e.g., school-based researchers and practitioners) or within a given organizational context (e.g., educational sector; McKleroy et al. 2006). Adaptation is critical for improving the appropriateness or contextual fit of a particular innovation (Proctor et al. 2013). This has resulted in some researchers concluding that implementation does not occur without adaptation (Lyon and Bruns 2019). In intervention science, multiple models have been proposed to facilitate the adaptation of EBP, including making cultural and contextual changes to EBP to improve appropriateness and relevance of the practice to the service recipients (Bernal et al. 2009). Most of these models share common features with regard to the level or depth of adaptation made to an existing practice (Barrera and González-Castro 2006; Bernal et al. 1995; Leong 1996). One level of adaptation represents surface changes to alter the label, referents, terminology, and/or examples used to describe the practice to ensure the language facilitates comprehension, contextual appropriateness, and usability by the intended end users of the innovation who operate in a specific context (Resnicow et al. 1990). In the education sector, this involves school-based implementation researchers and practitioners whose work focuses on the translation of EBP into routine service delivery through the use of implementation strategies. Another level of adaptation refers to deeper changes made to the substance of the practice that involve altering the meaning in a way that departs from the original content of the practice to increase its relevance and appropriateness within the specific context it will be deployed (McKleroy et al. 2006; Resnicow et al. 1990).
Although many implementation strategies are generic and are applicable across contexts, including schools, the educational sector is a unique service setting with its own nomenclature used to communicate information, contextual constraints (e.g., professional roles, scheduling), and features (e.g., teacher unions, school boards), rendering particular strategies more or less relevant and appropriate. Considering the above, to enhance the comprehension, contextual appropriateness, and utility of the ERIC strategy compilation in the educational sector, there is a need to engage in an iterative adaptation process.
Purpose of This Study
The purpose of this study was to initiate the School Implementation Strategies, Translating ERIC Resources (SISTER) Project by iteratively adapting the ERIC strategy compilation to derive a taxonomy of implementation strategies with relevance to the education sector. Consistent with the initial study procedures used to inform the eventual development of the ERIC compilation (Powell et al. 2012), a small group of implementation experts convened over multiple occasions to systematically and iteratively adapt the existing compilation for use in schools. The aim was to produce a SISTER strategy compilation that could inform subsequent research examining and comparing the feasibility and importance of the implementation strategies for use in the school context, as well as investigations exploring the mechanisms through which particular strategies work (Lewis et al. under review). Additionally, a sub-aim of this study was to model a process that could be used by implementation researchers working in other sectors to successfully leverage existing implementation products and resources and adapt them to their targeted setting.
Method
Expert Participants
Consistent with the process used to generate the original implementation strategy compilation (Powell et al. 2012) that informed the development of the refined ERIC compilation (Powell et al. 2015), this study included a small subset of implementation experts to develop a school-adapted taxonomy of implementation strategies that is applicable to the education sector. The participants in this study included three PhD-level experts with extensive experience conducting implementation research in schools and two of the lead researchers from the ERIC project. These five experts engaged in an iterative adaptation process, with multiple rounds of revisions and feedback. The three school-based implementation experts took the lead on making changes to the original ERIC strategy compilation to enhance the comprehensibility and appropriateness of the strategies, while the two lead ERIC researchers provided feedback on the changes made to the implementation strategies to maintain conceptual consistency with the original strategies. This process was repeated three times until consensus was reached.
Original ERIC Strategy Compilation
The refined ERIC strategy compilation (Powell et al. 2015) was used to develop a school-adapted taxonomy of implementation strategies—the SISTER compilation. The revised ERIC compilation was generated based on input from an expert panel of implementation researchers and practitioners, with nearly two thirds of the experts being affiliated with the Veterans Affairs healthcare system. A modified Delphi approach involving three rounds of iterative revision was applied to the published 68-strategy taxonomy of Powell et al. (2012) to develop a revised compilation based on expert consensus. Consistent with the Delphi approach, experts engaged in structured conversations across multiple rounds to iteratively adapt and reach consensus on adaptations to the original compilation (Dalkey and Helmer 1963). This process resulted in the expert panel reaching consensus on a final compilation of 73 implementation strategies. These 73 implementation strategies span multiple levels of the service delivery context (inner setting, outer setting) and stages/phases of the implementation process (exploration, preparation, implementation, sustainment) and target different stakeholders involved in the uptake and use of EBP (leaders, implementers, recipients, other stakeholders). The revised ERIC compilation has informed subsequent research examining the feasibility and importance of implementation strategies for use in particular service sectors and classification of strategies into conceptual categories and linking strategies to specific barriers to advance tailored implementation (Powell et al. 2017; Waltz et al. 2015).
Procedure
Prior to conducting this study, IRB approval was sought, and the university IRB determined that this study was exempt. An iterative adaptation process was developed to systematically examine and make changes to the revised ERIC compilation of 73 implementation strategies to create the SISTER strategy compilation. A key aspect of the adaptation process included the recruitment and participation of two of the developers from the ERIC project to serve as independent experts who provided feedback at specific points. All changes to extant ERIC strategies considered the common language and unique constraints and features of the school context to increase comprehension, contextual appropriateness, and utility for school-based implementation researchers and practitioners. The adaptation protocol proceeded systematically along a series of seven sequential steps: (1) review of the existing implementation strategies by school-based implementation experts to revise the language, referents, and terminology to be consistent with the school context; (2) modification or expansion of examples to increase comprehension regarding how each strategy is applicable to EBP implementation in the school context; (3) removal of implementation strategies determined to be contextually inappropriate to the school context or redundant with other strategies as they manifest in schools; (4) addition of novel implementation strategies not included in the ERIC compilation that have evidence to enhance EBP implementation in schools; (5) review and feedback by ERIC investigators on the school-adapted compilation to ensure conceptual consistency with the original strategies; (6) additional revision, based on feedback from ERIC developers, to ensure conceptual consistency with original strategies and increase the comprehension, contextual appropriateness, and utility to the school context; and (7) re-review by ERIC developers and finalization of an initial set of school-adapted implementation strategies.
The analytic approach consisted of categorizing the nature of revisions made to each of the strategies as either no change, surface change, or deep change. Further, we recorded the specific features of the strategy that were modified, including (a) changes to the label of the strategy, (b) changes to the referents used to contextualize the strategy (e.g., replacing agency with school or district, or replacing clinician with teacher), (c) changes to the terminology used within the definition of the strategy, and/or (d) changes to the examples used to illustrate the strategy. No change referred to strategies that remained unaltered and, therefore, included the same label, referents, terminology, and examples as the original ERIC strategy. Surface-level changes reflected relatively minor changes to the strategy that did not depart from the meaning of the original strategy, but were made to increase contextual appropriateness for school-based researchers and practitioners. Specific surface-level changes were recorded, which included changes to the strategy label, referents (e.g., school personnel instead of clinician or school instead of clinic or agency), terminology (e.g., new practice instead of clinical innovation), or parenthetical and nonparenthetical examples in the strategy description. Deep change was used to categorize strategies that underwent more substantial modifications that altered the meaning of the strategy in a way that departed from the original ERIC strategy. For all strategies that underwent deep changes, the specific reason for the deep change was recorded.
Additionally, strategies that were deleted from or added to the ERIC taxonomy were recorded in order to tabulate the number of strategies that were deemed irrelevant or inappropriate to the school context, as well as those novel strategies that were not included in the ERIC compilation but that educational research has identified as methods or procedures that could impact the successful uptake and use of EBP in schools. After completing the iterative adaptation process, to examine patterns in the types of modifications, we synthesized the different changes (no change, surface, deep, deleted, added) according to each of the strategy categories derived from Waltz et al. (2015). Waltz et al. (2015) used expert ratings and concept mapping (Kane and Trochim 2007) to derive the following nine strategy categories: (1) use evaluative and iterative strategies; (2) provide interactive assistance; (3) adapt and tailor to context; (4) develop stakeholder relationships; (5) train and educate stakeholders; (6) support educators (the word “clinicians” from the original was replaced with “educators”); (7) engage consumers; (8) financial strategies; and (9) change infrastructure. These categories were used to organize a side-by-side comparison of the ERIC and SISTER compilations, as well as examine patterns in the types of modifications made to the original strategies.
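The synthesis step described above, tallying change types within each Waltz et al. (2015) strategy category, amounts to a simple cross-tabulation. Below is a minimal sketch in Python; the strategy names, category assignments, and change codes are illustrative placeholders rather than the study’s actual coding data.

```python
from collections import Counter

# Hypothetical coding records: (strategy, Waltz et al. category, change type).
# Category assignments here are illustrative only.
records = [
    ("Access new funding", "Financial strategies", "no change"),
    ("Develop disincentives", "Financial strategies", "deep"),
    ("Use capitated payments", "Financial strategies", "deleted"),
    ("Remind school personnel", "Support educators", "surface"),
    ("Shadow other experts", "Train and educate stakeholders", "surface"),
    ("Peer-assisted learning", "Provide interactive assistance", "added"),
]

def tally_changes_by_category(rows):
    """Cross-tabulate change types (no change / surface / deep /
    deleted / added) within each strategy category."""
    table = {}
    for _strategy, category, change in rows:
        table.setdefault(category, Counter())[change] += 1
    return table

table = tally_changes_by_category(records)
```

With such a table in hand, proportions like “two thirds of financial strategies were deep-changed or deleted” follow directly by dividing each cell by the category’s row total.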
Results
The results of the adaptation process are depicted in Tables 1, 2, 3, 4, 5, 6, 7, 8, and 9, which include a side-by-side comparison of the ERIC strategy compilation and the adapted SISTER compilation for each of the nine conceptual categories. The strategies are organized in alphabetical order within each of the conceptual categories. After applying the iterative adaptation process to the ERIC strategy compilation, the final SISTER compilation included a total of 75 unique implementation strategies. Below is a detailed account of the adaptations that were made to generate the 75 strategies included in the SISTER compilation.
Strategy Adaptation
No Change
Out of the 73 ERIC strategies, 11 remained unaltered with no surface-level changes made to the label, referents, terminology, and/or examples. Representative example strategies that were deemed to generalize well to the educational sector in their current form included the following: access new funding: access new or existing money to facilitate the implementation (strategy no. 60); visit other sites: visit sites where a similar implementation effort has been considered successful (strategy no. 36); and develop academic partnerships: partner with a university or academic unit for the purposes of shared training and bringing research skills to an implementation project (strategy no. 24).
Overall Changes
Results from the coding indicated that changes were made to 57 (78%) of the original ERIC strategies. Changes included the following: (a) 28 strategies with label changes, (b) 39 strategies with changes to the referent (e.g., teacher instead of clinician or school instead of agency), (c) 50 strategies with changes to terminology used to describe the strategy, and (d) 17 strategies with changes to the examples used to illustrate the strategy.
Surface Change
For 52 of the 57 adapted strategies, only surface-level changes were made. In total, 147 unique surface-level changes were made to the labels, referents, terminology, or examples to increase the comprehension and appropriateness of these 52 ERIC strategies. On average, there were roughly 2.8 surface-level changes to each of the adapted strategies, with a range of one surface-level change (e.g., terminology change to strategy no. 45 shadow other experts) to four surface-level changes (e.g., label, referent, terminology, and examples changes to strategy no. 50 facilitate relay of clinical data to providers). Specifically, changes to the label were made to 33 of the strategies, with examples including changing “Remind clinicians” to “Remind school personnel” and changing “Facilitate relay of clinical data to providers” to “Facilitate relay of intervention fidelity and student data to school personnel.” Further, changes to the referents (implementers, service recipients, or service setting) in the strategy were made to 40 ERIC strategies. The most common referent changes consisted of replacing “clinician” with “school personnel” (13 times), “sites or organizations” with “school or district” (25 times), and “consumer/patient” with “students and/or families” (25 times). Out of all the surface-level changes, the most common consisted of modifications to the terminology used to describe the strategy, with a total of 55 of the ERIC strategies undergoing terminology changes. Changes to terminology included using “new practice” instead of “clinical innovation” and adding terminology that represents common language used by school-based researchers and practitioners. Last, changes or additions to examples in the definition (parenthetical and nonparenthetical) were applied to 19 of the ERIC strategies to increase understanding of how the strategy could be applied in the school context. For example, for strategy no. 38, an expanded parenthetical example was provided to better describe the type of trained person who could conduct an educational outreach visit to support the implementation of a new practice.
Deep Change
Deep changes were made to five of the ERIC strategies and resulted in modifications that altered the core meaning of the adapted strategy in a way that departed from the original. These deep changes were made in addition to the surface-level changes described above. Three of these deep changes were made to strategies involving the use of financial mechanisms to influence implementation outcomes, which the school-based implementation experts agreed were inappropriate to the school context. However, these financial strategies had parallels to the school context and, thus, underwent deep changes. For example, develop disincentives (strategy no. 63) was preserved but altered to remove reference to financial penalties and instead include description of disincentives that are more appropriate to the school context, such as a write-up in one’s professional file, meeting with the administrator to discuss insufficient implementation, and participating in additional professional development for failure to implement or use the new practices. Moreover, make billing easier (strategy no. 65) was maintained but substantially altered to make implementation easier by removing burdensome documentation tasks, as the latter is a more contextually appropriate strategy to reduce burdens that impede educators’ implementation efforts. Another strategy underwent deep change because it involved changing liability laws (strategy no. 67), which currently do not exist in education to the extent they do in healthcare (e.g., there is no educational malpractice statute like there is in medicine). Thus, the strategy was altered to reflect changes in ethical and professional standards of practice, which represents an implementation strategy that is more appropriate to how schools operate. Last, the ERIC strategy develop and implement tools for quality monitoring (strategy no. 7) underwent deep changes because it included a diffuse set of recommendations (changes to language, protocols, algorithms, standards, and measures of processes, patient/consumer outcomes, and implementation outcomes) that would limit its appropriateness and usability in the school context. Thus, deep changes were made to narrow the focus of the strategy and make it more appropriate to the school context.
Deleted
A total of five ERIC strategies were deleted and not included in the final SISTER strategy compilation due to consensus that they were not appropriate to the school context. Three of the five strategies were deleted because they involved methods or techniques targeting the manipulation of financial structures to facilitate implementation outcomes. Due to the unique constraints of educational settings, such as school boards, compulsory attendance, and educational policy, financial strategies such as fee-for-service, use capitated payments, and use other payment schemes are neither applicable nor appropriate to the school context (Lyon et al. 2018). One strategy was deleted due to redundancy given the overlap with and lack of distinction from other strategies: use an implementation advisor. Last, revising professional roles was removed from inclusion in the SISTER compilation because of the contextual inappropriateness of revising educators’ roles in the context of schools. Teachers, for example, have highly prescriptive roles and credentials that prohibit shifting or exchanging their roles with other educators (e.g., a teacher with a school counselor, or a special education teacher with a general education teacher; Herlihy and Corey 2006; Urbach et al. 2015).
Added
A deliberate scan of the ERIC compilation to identify missing strategies resulted in a total of seven new strategies being added to the SISTER compilation: (a) develop local policy that supports implementation (strategy no. 72), (b) improve implementers’ buy-in (strategy no. 51), (c) peer-assisted learning (strategy no. 13), (d) pre-correction prior to implementation (strategy no. 52), (e) pruning competing initiatives (strategy no. 74), (f) targeting/improving implementer well-being (strategy no. 54), and (g) test-drive and select practices (strategy no. 18). These strategies were included based on knowledge of findings from school-based research on different methods and techniques used across multiple levels (e.g., policy to individual implementers) to facilitate implementation. Expanded definitions of each of these newly added strategies are included in the tables.
Strategy Changes by Category
The types of modifications made for each of the nine conceptual strategy categories of Waltz et al. (2015) are depicted in Table 10. Proportionally, the category of financial strategies underwent the most substantial modifications, with two thirds of the strategies undergoing deep changes that modified meaning (n = 3, 33%) or being deleted from the SISTER compilation (n = 3, 33%). Strategies were deleted from only three of the nine categories (Develop stakeholder relationships, Support educators, and Financial strategies), while new strategies were added to four of the nine categories (Provide interactive assistance, Adapt and tailor to context, Support educators, Change infrastructure). Three of the nine categories included strategies that required deep changes altering their meaning from the original ERIC strategies (Develop stakeholder relationships, n = 1, 5%; Support educators, n = 1, 20%; Financial strategies, n = 3, 33%).
Discussion
The identification, deployment, and testing of implementation strategies are critical to advancing implementation science and practice. This study iteratively adapted the refined ERIC strategy compilation (Powell et al. 2015) for use by school-based implementation researchers and practitioners. Application of the iterative adaptation process resulted in 11 of the 73 ERIC strategies requiring no modification, 52 undergoing surface-level changes only, and five needing deep changes. Five strategies were deleted and seven new strategies were added, resulting in a total of 75 unique school-based implementation strategies.
Dissemination of this study’s findings is important to ensure that school-based implementation researchers and practitioners become aware of the full range of implementation strategies available to support the uptake, delivery, and sustainment of EBP, given that the majority of efforts to change routine practice fail (Burnes 2004; Damschroder et al. 2009). Dissemination of implementation strategies is also critical to establish a common nomenclature among prevention scientists engaged in school-based research and to develop a generalizable knowledge base to answer key questions, such as: What strategy worked, under what conditions, and how did it work? Akin to intervention science, clear labels and definitions of implementation strategies will facilitate more precise assessment and reproducibility in research and practice (Proctor et al. 2013). For example, the SISTER compilation may enable prevention scientists to more accurately identify and track the core implementation strategies they deploy in efficacy studies (e.g., conduct ongoing training, provide local technical assistance, provide ongoing consultation) to support the successful uptake and delivery of EBP with fidelity, strategies that otherwise go unreported. Such tracking increases the likelihood of replication across studies and investigative groups (Boyd et al. 2017; Bunger et al. 2017). Further, capturing the types of strategies needed to promote effective implementation (e.g., identify and prepare champions, alter and provide system- and individual-level incentives, provide practice-specific supervision) will be critical to supporting both indigenous school personnel (e.g., school psychologists, social workers) and EBP purveyors (e.g., external organizations that provide training and technical assistance on a given EBP) as they facilitate the successful translation of EBP into everyday practice when strict oversight and control by researchers is lessened or unavailable (i.e., effectiveness research).
Emerging Patterns by Strategy Category
When examining patterns in the types of modifications made to strategies according to the conceptual categories of Waltz et al. (2015), several interesting findings emerged. First, consistent with the above, the strategy category with the most substantial modifications was financial strategies, with two thirds of the strategies (six out of nine) either being deeply modified or deleted from inclusion in the SISTER compilation. Financial strategies are largely inappropriate for use in schools due to unique policy, collective bargaining arrangements (i.e., unions and contracts), and compensation schemes (Lyon et al. 2019). These findings suggest that certain types of implementation strategies may be more bound to a specific service sector and, thus, less transmittable across contexts that have different organizational constraints regarding how services are accessed (e.g., fee for service) and providers are incentivized to implement new practices. Some of the financial strategies had parallels, however, to the school context. For example, although financial disincentives are inappropriate for use in schools, the broader notion of creating disincentives for lackluster implementation is appropriate for application in schools. Indeed, creating situations that educators want to avoid (e.g., teacher meeting with the site administrator to discuss lackluster implementation at an inconvenient time) as a way of promoting greater uptake and delivery of EBP is a strategy that has been found to be effective in schools (DiGennaro et al. 2005).
Second, four strategy categories (provide interactive assistance, adapt and tailor to context, train and educate stakeholders, and engage consumers) underwent minimal modifications to increase their comprehensibility, contextual appropriateness, and utility for implementation researchers and practitioners operating in schools. Strategies that fall under these categories may be agnostic to the service delivery context and, therefore, more generalizable to a variety of implementation scenarios, settings, and providers. For example, there is consensus among researchers and practitioners across different service sectors that the category of train and educate stakeholders is relevant and necessary whether one is functioning within the context of healthcare, behavioral health, or education (Beidas and Kendall 2010; Grol 2001; Lyon et al. 2017; Stahmer et al. 2015), as stakeholders need to know why the EBP is needed, what the EBP entails, and what implementation looks like. Moreover, it is clear across service contexts, including schools, that providing interactive assistance is critical to support frontline providers (e.g., nurses, mental health providers, or teachers) with ongoing technical assistance, facilitation, and supervision to promote their uptake and delivery of EBP (Cook and Odom 2013; Lyon et al. 2017; Stetler et al. 2006).
Last, the seven newly generated strategies were classified into only four of the nine conceptual strategy categories, with most additions falling under supporting educators (n = 3) and changing infrastructure (n = 2). This finding speaks to the overall representativeness of the refined ERIC strategy compilation (Powell et al. 2015), as relatively few new strategies were generated, and those that were fell into a small subset of the conceptual categories. It also indicates that certain strategy categories, like supporting educators and changing infrastructure, may have greater room for innovation regarding the generation of additional individual- and contextual-level strategies to support implementation. The generation of additional strategies for inclusion in strategy compilations should continue to be guided by consensus-driven procedures using the best available evidence, including efforts to classify new strategies under existing conceptual strategy categories to facilitate understanding of how the new ones fit within the more comprehensive collection of other strategies.
Addition of New Strategies
The rationale for including additional unique strategies in schools that were missing from the ERIC compilation warrants further discussion. Develop local policy that supports implementation was added based on research findings related to universal prevention efforts, such as school-wide positive behavior intervention supports, suggesting that changes to school discipline policy lead to changes in adult behavior regarding how educators effectively respond to problem behavior (Horner et al. 2017). Improve implementers’ buy-in was included based on emerging evidence linking changes in educator beliefs and attitudes to implementation intentions and behaviors (Cook et al. 2015). Peer-assisted learning was added in light of research suggesting that peer learning networks or collaborative frameworks are facilitative of reflective practice and provide educators with a form of peer accountability to enhance the implementation of academic and behavioral supports (Kohler et al. 1997; Vescio et al. 2008). Pre-correction prior to implementation was generated based on evidence that antecedent strategies delivered immediately before an implementation opportunity facilitate educators’ successful delivery of an EBP (Cook et al. 2017a, b). Pruning competing initiatives reflects strategic de-adoption practices to offset the potential for implementation overload and was included as a strategy to make room for frontline providers to prioritize the implementation of a new program or practice (Abrahamson 2004; Nadeem and Ringle 2016). Targeting/improving implementer well-being has recently emerged as an implementation strategy, with findings showing that reductions in stress and burnout lead to improved intentions to implement and actual use of EBP by teachers (Cook et al. 2017a, b; Larson et al., under review). Last, test-drive and select practices is a way of incorporating implementer choice/preference in the selection of an EBP and has shown promise as a technique for improving fidelity among educators who are initially resistant to adopting and delivering a new practice (Dart et al. 2012; Johnson et al. 2014).
Although these additions were identified with the school context in mind, most of them are likely to be applicable to other service sectors. For example, efforts to promote implementer buy-in prior to and during an implementation effort are likely facilitative of implementation outcomes across other service sectors focused on promoting youth behavioral health outcomes, such as healthcare, child welfare, juvenile justice, and public health (e.g., Russ et al. 2014). Moreover, stress and burnout among implementers are not unique barriers to implementation in schools (e.g., Khamisa et al. 2013). Thus, efforts targeting stress and burnout reduction are likely to help promote providers’ well-being and may serve to increase their intentions to adopt and deliver clinical innovations (Damian et al. 2017). In the multidisciplinary spirit of implementation science, strategies facilitative of implementation outcomes in one context may ultimately be appropriate and have utility beyond the setting in which they were originally developed.
Implications
This study has notable implications for prevention scientists dedicated to improving youth access to high-quality behavioral health services in schools. First, although implementation science is far less advanced in the educational sector than in other fields (Sanetti et al., manuscript in preparation), lagging behind other sectors can be viewed as an opportunity for strategic adaptation of established implementation tools and resources. Service sectors with lagging research, such as education, are well positioned to take advantage of extant findings from other service sectors, such as healthcare, by strategically adapting those findings for use in a novel context. As highlighted in this study, strategic selection and adaptation of existing resources involves capitalizing on the trailblazing work of implementation scientists and practitioners operating in other service sectors. To support these advancements, school-based prevention scientists must strive to keep informed of implementation research outside of their own discipline to identify findings that could be strategically adapted for use in their specific context. For example, in the area of measurement, researchers in child welfare and youth mental health have developed pragmatic tools to assess key factors of the inner organizational context (i.e., the microsystem in which implementation happens) that are most proximal to providers’ implementation behaviors (Aarons et al. 2014; Ehrhart et al. 2014, 2015), and these measures have been adapted for use in school-based implementation research and practice (Lyon et al. 2019).
Establishing an adapted compilation of implementation strategies has implications for deepening understanding of which strategies are most commonly needed, feasible to deploy, and effective across implementation efforts. Existing implementation strategies are not necessarily equal: some may require more resources (i.e., time, money, and energy) to deploy, some may be more or less effective, and some may be needed more frequently. Thus, there is a need to examine pragmatic dimensions of strategies that affect their likely use among implementation practitioners. Ultimately, school-based implementation research should guard against replicating the very divide it seeks to address between what research indicates works and what gets adopted in everyday service settings (Lyon et al. 2019). Similar to the work undertaken with the ERIC compilation (Waltz et al. 2015), researchers should examine experts’ and practitioners’ perceptions of the feasibility and impact of strategies to identify those that are low burden to deploy yet likely to influence EBP implementation.
The SISTER strategy compilation, as well as other published taxonomies, has implications for identifying the subset of strategies that are most frequently needed by implementation practitioners within a given service setting. One useful starting place is to link strategies to the most commonly encountered malleable determinants (i.e., barriers or facilitators) that impact successful EBP implementation, as one approach to tailored implementation involves targeting strategies to specific barriers identified in a given context. Pareto’s law of the Vital Few (Bookstein 1990), which captures the natural distribution of problems for particular phenomena in order to distill them to a core set, suggests that a smaller subset of barriers (e.g., 20%) likely accounts for the majority of implementation issues encountered (e.g., 80%). This may prove quite useful, given that 601 plausible determinants of implementation have been identified (Krause et al. 2014). If this law holds true, then researchers and practitioners need to identify the vital determinants that account for the majority of implementation failures. Barriers that are frequently encountered and malleable could be ideal targets for developing more pragmatic approaches to tailoring strategies to a given setting (Locke et al. 2016). Researchers have identified four methodologies that could help inform tailoring implementation strategies to context: concept mapping, group model building, conjoint analysis, and intervention mapping (Powell et al. 2017). These methodologies can provide greater guidance on how to link implementation strategies to more precise (a) stages of the implementation process (e.g., exploration, preparation, implementation, and sustainment; Aarons et al. 2011), (b) determinants that serve as barriers to implementation (e.g., insufficient knowledge of or motivation to implement the new innovation), and (c) measures to monitor specific implementation outcomes (e.g., appropriateness, intervention fidelity, penetration/reach) to inform data-driven improvement decisions. Further streamlining of implementation strategies may come from emerging efforts to detail the mechanisms through which strategies influence implementation outcomes (e.g., Lewis 2017; Williams 2016). Similar to the push to identify mechanisms of action in intervention science (Kazdin 2007), identifying and testing implementation mechanisms holds promise for eliminating strategies (or strategy components) that do not operate through the strongest pathways of action. Researchers also have begun to outline methodologies that could be used to develop and test specific strategy–mechanism–outcome linkages (e.g., Lyon et al. 2016), which have relevance to work in the education sector. Research focused on determining how to tailor implementation strategies to a given context will hopefully yield more efficient and effective approaches to implementation.
We believe that existing taxonomies, like the original ERIC compilation, need to be adapted to the specific service sector in which they will be used, as adaptation helps ensure that products and ideas are comprehensible and appropriate to stakeholders (e.g., researchers, practitioners, and policymakers) operating in that sector (Bernal et al. 2009). In the area of children’s mental and behavioral health, education, community mental health, juvenile justice, primary care, and child welfare represent the main sectors in which children receive services. Thus, we anticipate that the number of adapted compilations would mirror the number of child-serving sectors. It is also important that findings stemming from adaptation efforts, like SISTER, be fed back to the original source to potentially expand and refine the ERIC compilation.
Limitations and Directions for Future Research
This study has several limitations. First, this initial study did not include as comprehensive a group of experts as the original ERIC project, which included a total of 71 implementation research and practice experts. Future research on the SISTER compilation will seek to expand the representativeness of research and practice experts who provide input on the compilation and recommendations to inform its pragmatic use as part of real-world implementation efforts. This would ideally include input from implementation practitioners or intermediaries (e.g., external organizations or individuals who are EBP champions and use the science of implementation to support real-world implementation efforts; Franks and Bory 2015) working in real-world educational settings. Second, the adaptation process employed was not based on an established approach; rather, the seven-step process was constructed for this study because no widely accepted method exists for adapting research findings for use in novel contexts. Researchers may use the adaptation process in this study as a starting point to establish a more rigorous approach through expert consensus-driven procedures. Third, although there are systematic reviews of the school-based literature examining the use and effects of consultation and coaching on implementation, there are no comprehensive reviews of implementation strategies. Such research will be an important follow-up to the work presented in this paper. Lastly, this study provides no guidance to facilitate decision-making with regard to the selection and use of strategies in response to particular implementation scenarios. This lack of empirical guidance is especially noteworthy in school-based behavioral health relative to other service sectors (Novins et al. 2013), as there are few experimental studies or comparisons of implementation strategies.
Conclusion
Implementation strategies are essential to effectively incorporate EBP into school-based behavioral health service delivery and improve outcomes for youth. This study established the initial school-adapted SISTER strategy compilation, which will hopefully provide common language and stimulate future implementation research in the education sector. The SISTER compilation provides a useful starting place to move school-based behavioral health forward. Eventually, we hope to arrive at a place of greater understanding among implementation researchers and practitioners regarding when and how to select implementation strategies for new circumstances. Nevertheless, it is unlikely that the current SISTER compilation reflects the full set of potentially relevant and useful implementation strategies in the education sector. As our research and collaborations in this area, and the field of implementation science more generally, continue to advance, we anticipate further revisions to this list. Moreover, we are hopeful that prevention scientists will scale out this work by adapting it to novel child-serving sectors for use by researchers and practitioners seeking to advance EBP implementation.
References
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in child welfare. Administration and Policy in Mental Health and Mental Health Services Research, 38, 4–23.
Aarons, G. A., Ehrhart, M. G., & Farahnak, L. R. (2014). The implementation leadership scale (ILS): Development of a brief measure of unit level implementation leadership. Implementation Science, 9, 45.
Abrahamson, E. (2004). Change without pain: How managers can overcome initiative overload, organizational chaos, and employee burnout. Boston: Harvard Business School Press.
Barrera, M., & González-Castro, F. (2006). A heuristic framework for the cultural adaptation of interventions. Clinical Psychology: Science and Practice, 13, 311–316.
Barrish, H. H., Saunders, M., & Wolf, M. M. (1969). Good behavior game: Effects of individual contingencies for group consequences on disruptive behavior in a classroom. Journal of Applied Behavior Analysis, 2, 119–124.
Beidas, R. S., & Kendall, P. C. (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice, 17, 1–30.
Bernal, G., Bonilla, J., & Bellido, C. (1995). Ecological validity and cultural sensitivity for outcome research: Issues for the cultural adaptation and development of psychosocial treatments with Hispanics. Journal of Abnormal Child Psychology, 23, 67–82.
Bernal, G., Jimenez-Chafey, M. I., & Domenech-Rodríguez, M. M. (2009). Cultural adaptation of treatments: A resource for considering culture in evidence-based practice. Professional Psychology: Research and Practice, 40, 361–368.
Bookstein, A. (1990). Informetric distributions, part I: Unified overview. Journal of the American Society for Information Science, 41, 368–375.
Boyd, M. R., Powell, B. J., Endicott, D., & Lewis, C. C. (2017). A method for tracking implementation strategies: An exemplar implementing measurement-based care in community behavioral health clinics. Behavior Therapy. https://doi.org/10.1016/j.beth.2017.11.012.
Brackett, M. A., & Rivers, S. E. (2013). Transforming students’ lives with social and emotional learning. In R. Pekrun & L. Linnenbrink-Garcia (Eds.), International handbook of emotions in education (pp. 368–388). New York: Taylor & Francis.
Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of schoolwide positive behavioral interventions and supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12, 133–148.
Bruns, E. J., Duong, M. T., Lyon, A. R., Pullmann, M. D., Cook, C. R., Cheney, D., & McCauley, E. (2016). Fostering SMART partnerships to develop an effective continuum of behavioral health services and supports in schools. American Journal of Orthopsychiatry, 86, 156–170.
Bunger, A. C., Powell, B. J., Robertson, H. A., MacDowell, H., Birken, S. A., & Shea, C. (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15, 1–12.
Burnes, B. (2004). Emergent change and planned change: Competitors or allies? The case of XYZ construction. International Journal of Operations & Production Management, 24, 886–902.
Cook, B. G., & Odom, S. L. (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79, 135–144.
Cook, C. R., Lyon, A. R., Kubergovic, D., Wright, D. B., & Zhang, Y. (2015). A supportive beliefs intervention to facilitate the implementation of evidence-based practices within a multi-tiered system of supports. School Mental Health, 7, 49–60.
Cook, C. R., Miller, F., Fiat, A., Renshaw, T. L., Frye, M., & Joseph, G. (2017a). Promoting secondary teachers’ well-being and intentions to implement evidence-based practices: Randomized evaluation of the ACHIEVER Resilience Curriculum. Psychology in the Schools, 54, 13–28.
Cook, C. R., Pauling, S., McCaslin, S., Larson, M., Thayer, A. J., & Fiat, A. (2017b). Brief reminders as a low threshold strategy to facilitate teacher delivery of evidence-based classroom practices. Manuscript in preparation.
Dalkey, N. C., & Helmer, O. (1963). An experimental application of the Delphi method to the use of experts. Management Science, 9, 458–467.
Damian, J., Gallo, J., Leaf, P., & Mendelson, T. (2017). Organizational and provider level factors in implementation of trauma-informed care after a city-wide training: An explanatory mixed methods assessment. BMC Health Services Research, 17, 750.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 50.
Dart, E., Cook, C. R., Gresham, F. M., & Chenier, J. (2012). Test driving to increase treatment integrity and student outcomes. School Psychology Review, 41, 467–481.
DiGennaro, F. D., Martens, B. K., & McIntyre, L. L. (2005). Increasing treatment integrity through negative reinforcement: Effects on teacher and student behavior. School Psychology Review, 34, 220–231.
Dingfelder, H. E., & Mandell, D. S. (2011). Bridging the research-to-practice gap in autism intervention: An application of diffusion of innovation theory. Journal of Autism and Developmental Disorders, 41, 597–609.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2014). Assessing the organizational context for EBP implementation: The development and validity testing of the Implementation Climate Scale (ICS). Implementation Science, 9, 157.
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2015). Going above and beyond for implementation: The development and validity testing of the Implementation Citizenship Behavior Scale (ICBS). Implementation Science, 10, 65.
Farmer, E. M., Burns, B. J., Phillips, S. D., Angold, A., & Costello, E. J. (2003). Pathways into and through mental health services for children and adolescents. Psychiatric Services, 54, 60–66.
Fishbein, D. H., Ridenour, T. A., Stahl, M., & Sussman, S. (2016). The full translational spectrum of prevention science: Facilitating the transfer of knowledge to practices and policies that prevent behavioral health problems. Translational Behavioral Medicine, 6, 5–16.
Fixsen, D., Blase, K., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79, 213–232.
Flottorp, S. A., Oxman, A. D., Krause, J., et al. (2013). A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implementation Science, 8, 1–11. https://doi.org/10.1186/1748-5908-8-35.
Forman, S. G., Shapiro, E. S., Codding, R. S., Gonzales, J. E., Reddy, L. A., Rosenfield, S. A., et al. (2013). Implementation science and school psychology. School Psychology Quarterly, 28, 77–100.
Franks, R., & Bory, T. (2015). Who supports the successful implementation and sustainability of evidence-based practices? Defining and understanding the roles of intermediary and purveyor organisations. New Directions for Child and Adolescent Development, 149, 41–56.
Gottfredson, D., & Gottfredson, G. (2002). Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime & Delinquency, 39, 3–35.
Grol, R. (2001). Successes and failures in the implementation of evidence-based guidelines for clinical practice. Medical Care, 39, 46–54.
Herlihy, B., & Corey, G. (2006). Boundary issues in counseling: Multiple roles and responsibilities. Alexandria: American Counseling Association.
Herschell, A. D., Kolko, D. J., Baumann, B. L., & Davis, A. C. (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30, 448–466.
Horner, R. H., Sugai, G., & Fixsen, D. L. (2017). Implementing effective educational practices at scales of social importance. Clinical Child & Family Psychology Review, 20, 25–35.
Johnson, L. D., Wehby, J. H., Symons, F. J., Moore, T. C., Maggin, D. M., & Sutherland, K. S. (2014). An analysis of preference relative to teacher implementation of intervention. The Journal of Special Education, 48, 214–224.
Kane, M., & Trochim, W. M. K. (2007). Concept mapping for planning and evaluation. Thousand Oaks: Sage Publications.
Kazdin, A. E. (2007). Mediators and mechanisms of change in psychotherapy research. Annual Review of Clinical Psychology, 3, 1–27.
Khamisa, N., Peltzer, K., & Oldenburg, B. (2013). Burnout in relation to specific contributing factors and health outcomes among nurses: A systematic review. International Journal of Environmental Research and Public Health, 10, 2214–2240.
Kohler, F., Crillery, K., Shearer, D., & Good, G. (1997). Effects of peer coaching on teacher and student outcomes. Journal of Educational Research, 90, 240–250.
Krause, J., Van Lieshout, J., Klomp, R., Huntink, E., Aakhus, E., Flottorp, S., et al. (2014). Identifying determinants of care for tailoring implementation in chronic diseases: An evaluation of different methods. Implementation Science, 9, 102.
Leeman, J., Birken, S. A., Powell, B. J., Rohweder, C., & Shea, C. M. (2017). Beyond “implementation strategies”: Classifying the full range of strategies used in implementation science and practice. Implementation Science, 12, 125–134.
Leong, F. T. (1996). Toward an integrative model of cross-cultural counseling and psychotherapy. Applied and Preventive Psychology, 5, 189–209.
Lewis, C. C. (2017). Systematic review of D & I Mechanisms in health. Keynote address at 4th biennial Society for Implementation Research Collaboration Conference.
Lewis, C. C., Klasnja, P., Powell, B. J., Lyon, A. R., Tuzzio, L., Jones, S., et al. (2018). From classification to causality: advancing understanding of mechanisms of change in implementation science. Frontiers of Public Health, 6, 1–6.
Lochman, J. E., & Wells, K. C. (2002). Contextual social–cognitive mediators and child outcome: A test of the theoretical model in the Coping Power program. Development and Psychopathology, 14, 945–967.
Locke, J., Beidas, R. S., Marcus, S., Stahmer, A., Aarons, G. A., Lyon, A. R., et al. (2016). A mixed methods study of individual and organizational factors that affect implementation of interventions for children with autism in public schools. Implementation Science, 11, 135.
Lyon, A. R., & Bruns, E. J. (2019). From evidence to impact: Joining our best school mental health practices with our best implementation strategies. School Mental Health, 11, 106–114.
Lyon, A. R., Lewis, C. C., Melvin, A., Boyd, M., Nicodimos, S., Liu, F. F., & Jungbluth, N. (2016). Health Information Technologies—Academic and Commercial Evaluation (HIT-ACE) methodology: Description and application to clinical feedback systems. Implementation Science, 11, 128.
Lyon, A. R., Pullmann, M. D., Walker, S. C., & D’Angelo, G. (2017). Community-sourced intervention programs: Review of submissions in response to a statewide call for “promising practices.” Administration and Policy in Mental Health and Mental Health Services Research, 44, 16–28.
Lyon, A., Cook, C. R., Brown, E., Locke, J., Ehrhart, M., Davis, C., & Aarons, G. (2018). Assessing the organizational implementation context in the education sector: Confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science, 13, 5.
Lyon, A. R., Comtois, K. A., Kerns, S. E. U., Landes, S. J., & Lewis, C. C. (2019). Closing the science-practice gap in implementation before it widens. In A. Shlonsky, R. Mildon, & B. Albers (Eds.), The science of implementation. Springer (in press).
McHugh, R. K., & Barlow, D. H. (2010). The dissemination and implementation of evidence-based psychological treatments: A review of current efforts. American Psychologist, 65, 73–84.
McKleroy, V. S., Galbraith, J. S., Cummings, B., Jones, P., Harshbarger, C., et al. (2006). Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Education & Prevention, 18, 59–73.
Morina, N., Koerssen, R., & Pollet, T. V. (2016). Interventions for children and adolescents with posttraumatic stress disorder: A meta-analysis of comparative outcome studies. Clinical Psychology Review, 47, 41–54.
Nadeem, E., & Ringle, V. (2016). De-adoption of an evidence-based trauma intervention in schools: A retrospective report from an urban school district. School Mental Health, 8, 132–143.
Novins, D. K., Green, A. E., Legha, R. K., et al. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52, 1009–1025.
Owens, J., Lyon, A. R., Brandt, N. E., Warner, M., Nadeem, E., Spiel, C., & Wagner, M. (2014). Implementation science in school mental health: Key constructs and a proposed research agenda. School Mental Health, 6, 99–111.
Powell, B. J., McMillen, J. C., Proctor, E. K., Carpenter, C. R., Griffey, R. T., Bunger, A. C., et al. (2012). A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review, 69, 123–157.
Powell, B. J., Proctor, E. K., & Glass, J. E. (2014). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24, 192–212.
Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., et al. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10, 1–14.
Powell, B. J., Beidas, R., Lewis, C. C., Aarons, G. A., McMillen, J. C., Proctor, E. K., & Mandell, D. S. (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44, 177–194.
Proctor, E. K., Landsverk, J., Aarons, G., Chambers, D., Glisson, C., & Mittman, B. (2009). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36, 24–34.
Proctor, E. K., Powell, B. J., & McMillen, J. C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8, 1–11.
Resnicow, K., Baranowski, T., Ahluwalia, J., & Braithwaite, R. (1999). Cultural sensitivity in public health: Defined and demystified. Ethnicity and Disease, 9, 10–21.
Ringwalt, C. L., Vincus, A., Ennett, S., Johnson, R., & Rohrbach, L. A. (2004). Reasons for teachers’ adaptation of substance use prevention curricula in schools with non-white student populations. Prevention Science, 5, 61–67.
Ringwalt, C. R., Vincus, A., Ennett, S. T., Hanley, S., Bowling, J. M., & Rohrbach, L. A. (2009). The prevalence of evidence-based substance use prevention curricula in U.S. middle schools in 2005. Prevention Science, 10, 33–40.
Russ, S. J., Sevdalis, N., Moorthy, K., et al. (2014). A qualitative evaluation of the barriers and facilitators toward implementation of the WHO surgical safety checklist across hospitals in England: Lessons from the “surgical checklist implementation project”. Annals of Surgery, 261, 81–91.
Sholomskas, D. E., Syracuse-Siewert, G., Rounsaville, B. J., Ball, S. A., Nuro, K. F., & Carroll, K. M. (2005). We don’t train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting and Clinical Psychology, 73, 106–115.
St. Peter Pipkin, C., Vollmer, T. R., & Sloman, K. N. (2010). Effects of treatment integrity failures during differential reinforcement of alternative behavior: A translational model. Journal of Applied Behavior Analysis, 43, 47–70.
Stahmer, A. C., Reed, S., & Lee, E. (2015). Training teachers to use evidence-based practices for autism: Examining procedural implementation fidelity. Psychology in the Schools, 52, 181–195.
Stetler, C. B., Legro, M. W., Rycroft-Malone, J., Bowman, C., Curran, G., & Guihan, M. (2006). Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science, 1, 23.
Urbach, J., Moore, B. A., Klinger, J. K., Galman, S., Haager, D., Brownell, M. T., & Dingle, M. (2015). “That’s my job”: Comparing the beliefs of more and less accomplished special educators related to their roles and responsibilities. Teacher Education and Special Education, 38, 323–336.
Vescio, V., Ross, D., & Adams, A. (2008). A review of research on the impact of professional learning communities on teaching practice and student learning. Teaching and Teacher Education, 24, 80–91.
Walker, V. L., Chung, Y.C., & Bonnet, L. K. (2017). Function-based intervention in inclusive school settings: A meta-analysis. Journal of Positive Behavior Interventions.
Waltz, T. J., Powell, B. J., Matthieu, M. M., Damschroder, L. J., Chinman, M. J., Smith, J. L., et al. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10, 1–8.
Weiner, B. J., Lewis, C. C., Stanick, C. F., Powell, B. J., Dorsey, C. N., Clary, A. S., Boynton, M. H., & Halko, H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12, 1–12.
Weisz, J., & Kazdin, A. (2017). Evidence-based psychotherapies for children and adolescents. New York: Guilford Press.
Wensing, M., & Grol, R. (2005). Methods to identify implementation problems. In R. Grol, M. Wensing, & M. Eccles (Eds.), Improving patient care: The implementation of change in clinical practice (pp. 109–120). Edinburgh: Elsevier.
Wensing, M., Oxman, A., Baker, R., et al. (2011). Tailored implementation for chronic diseases (TICD): A project protocol. Implementation Science, 6, 1–8.
Williams, N. J. (2016). Multilevel mechanisms of implementation strategies in mental health: Integrating theory, research, and practice. Administration and Policy in Mental Health and Mental Health Services Research, 43, 783–798.
Funding
This publication was supported in part by funding from the Institute of Education Sciences (R305A160114 PIs - Lyon and Cook; R305A170292 PIs – Cook and Lyon). BJP was supported by K01MH113806, R25MH080916, and UL1TR001111. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or Institute of Education Sciences.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Ethical Approval
Exempt status was obtained from the university IRB prior to conducting the study.
Informed Consent
No informed consent was necessary as part of this study.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Cook, C.R., Lyon, A.R., Locke, J. et al. Adapting a Compilation of Implementation Strategies to Advance School-Based Implementation Research and Practice. Prev Sci 20, 914–935 (2019). https://doi.org/10.1007/s11121-019-01017-1