Evidence-based practices (EBPs) are supported by research and data documenting their effectiveness in improving student outcomes (Burns et al., 2017; Forman et al., 2014; Kelly, 2012). Despite the identification and dissemination of EBPs, there remains a well-documented research-to-practice gap in school psychology (Reddy et al., 2017; Reinke et al., 2011; Sanetti & Collier-Meek, 2019). School staff, including school psychologists, often continue to implement practices that lack scientific support for improving student outcomes. A promising strategy for bridging this research-to-practice gap is de-implementation science, which aims to remove practices unsupported by research and thereby clear the way for implementation science to put EBPs in place. Of note, implementation science has also been under-examined in school psychology scholarship (Forman et al., 2014; Rosenfield, 2000), which has likely impeded effective implementation practices. Research in de-implementation science has come primarily from medicine, and although this is a relatively new area of research, preliminary evidence suggests that de-implementation strategies can reduce low-value and unsupported practices in that field (Sypes et al., 2020).

School psychology is a unique field with its own professional contexts; at the same time, it is important for school psychology to use knowledge created in other fields, modifying it to meet the field’s specific needs. De-implementation strategies center on behavior change, which is squarely within the scope of school psychology (Patey et al., 2018). Implementation science matters in school psychology because of the significant organizational and cultural considerations that practicing school psychologists must take into account when encouraging systems to implement EBPs (Forman & Selman, 2011). Three of the 2020 NASP domains of practice pertain to systems-level services: school-wide practices to promote learning, preventive and responsive services, and family-school collaboration services (NASP, 2020). The clear emphasis within the NASP domains on systems-level support demonstrates the need for and importance of de-implementation and implementation science within school psychology, as both primarily occur at the systems level.

Consequently, the purpose of this paper is to use research, primarily from medicine, to describe de-implementation science and provide a framework in relation to current research in implementation science in school psychology. We use this framework to describe how de-implementation processes could be applied to two areas relevant to school psychology: reading instruction and mental health services. Although the focus is on de-implementation, we include information on implementation as well because the two are so closely related. We end by describing future research directions for de-implementation science in school psychology and applications for closing the research-to-practice gap.

De-Implementation Science

In schools, it is important to put practices in place that provide effective and beneficial services to students. However, we often need to first remove practices that are not effective or beneficial in order to make room for the EBP. De-implementation involves intentionally identifying and eliminating practices that do not have scientific support. In many cases, de-implementation must be carried out in conjunction with implementation, as typically there is not a practice vacuum but rather a practice that needs to be changed (McKay et al., 2018; Patey et al., 2018; Upvall & Bourgault, 2018). School psychology scholarship has actively engaged in developing EBPs and implementation research, but we found no research related to de-implementation in school psychology.

An important component of creating a culture of EBPs and a willingness to move away from ineffective past practice is to encourage school staff, including administrators, teachers, and school psychologists, to actively question the evidence base of current practices. This evidence base can come both from broader research and from data related to implementation in the specific school or district. Through deliberate questioning and reflective practice, school staff can improve practice and student outcomes (Upvall & Bourgault, 2018). When practices with limited evidence are in place, practitioners may feel that the larger system is limiting their capacity to de-implement ineffective practices and implement effective ones. This highlights the need for systems-level change and an increased focus on raising the perceived value of evidence relative to the weight of current practice and beliefs (Montini & Graham, 2015; Prasad & Ioannidis, 2014).

De-implementation can be particularly difficult because it often requires abandoning a practice that has been used for a long period of time (van Bodegom-Vos et al., 2016). Some educators struggle to de-implement practices due to cognitive biases and misapplied heuristics that can interfere with clinical reasoning, as well as competing demands on the time and effort it takes to change practice (VanDerHeyden, 2018; Wilcox & Schroeder, 2015). Relevant examples include overconfidence bias (confidence in decisions that are actually wrong), belief perseverance (holding onto a belief or practice in spite of evidence to the contrary), and sunk costs (sticking with a practice because of how much we have invested in it; Lilienfeld et al., 2012; Wilcox & Schroeder, 2015). Consequently, removing a practice can be challenging even when evidence suggests that it is not producing the intended results.

Effective de-implementation in school systems is likely fostered through strong consultation skills and ongoing relationships (an area of expertise for school psychologists), especially for overcoming cognitive biases, rather than through a single session explaining why a specific practice has low value or is even harmful (Shaw, 2021). As a field, we are well situated to serve as translational scientists, using research to inform practice in schools. In addition to consultation, the skills of a translational scientist that are strong in school psychology training include thinking about systems, skillfully communicating with multiple groups, innovatively addressing barriers, and actively working across disciplines (Petscher et al., 2020).

Shaw (2021) describes multiple approaches to changing ineffective practice specific to school psychology. The first, knowledge transmission, works by providing evidence supporting a practice or evidence that a practice is not useful. The second, bottom-up models, changes practice through professional training programs. One challenge with these models is that they assume training alone successfully stops teaching practices that are not evidence-based; as we will note in the example of reading decoding, this is not always the case. Finally, in top-down models, policies strongly encourage removing specific practices and implementing others. However, top-down models run the risk of reducing professional autonomy in practice decision making. An example of school psychology as a field using a top-down model is its major role in the conception of Response to Intervention (RTI), which is now included in federal regulation.

De-implementation occurs in stages, with widespread de-implementation of a non-EBP as the overarching goal. There are multiple de-implementation frameworks, all of which include similar steps (Davidson et al., 2017; Grimshaw et al., 2020; Nilsen et al., 2020). In this paper, we will follow Davidson et al.’s (2017) iterative nine-phase process for de-implementation. Although this framework is based in the field of medicine, its main ideas are transferable to school psychology. The nine phases in the Davidson et al. (2017) framework are as follows. First, identify the practice that needs to be de-implemented. Second, document the current prevalence of its use. Third, determine the factors that maintain the practice, including context, beliefs, and reinforcements. Fourth, review relevant methods for extinguishing the practice in the system. Fifth, choose the method that will best address the factors currently maintaining the practice. Sixth, execute the de-implementation method experimentally. Seventh, evaluate the effectiveness of the de-implementation method. Eighth, collect evidence of savings and improved outcomes from the successful de-implementation. Finally, determine the next practice for de-implementation.
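For teams that want to track such an effort explicitly, the minimal sketch below encodes the nine Davidson et al. (2017) phases as a simple checklist in Python; the class, method, and phase names are hypothetical scaffolding of our own, not part of any published instrument.

```python
# Minimal sketch (hypothetical scaffolding): tracking a de-implementation
# effort through the nine phases of Davidson et al. (2017).
from dataclasses import dataclass, field
from enum import Enum


class Phase(Enum):
    IDENTIFY_PRACTICE = 1            # name the practice to de-implement
    DOCUMENT_PREVALENCE = 2          # how widely is it used?
    ANALYZE_MAINTAINING_FACTORS = 3  # context, beliefs, reinforcements
    REVIEW_METHODS = 4               # candidate methods for extinguishing it
    SELECT_METHOD = 5                # best match to the maintaining factors
    RUN_EXPERIMENT = 6               # execute de-implementation experimentally
    EVALUATE_EFFECTIVENESS = 7       # did the method work?
    DOCUMENT_SAVINGS = 8             # savings and improved outcomes
    SELECT_NEXT_TARGET = 9           # choose the next practice


@dataclass
class DeImplementationProject:
    practice: str
    current_phase: Phase = Phase.IDENTIFY_PRACTICE
    evidence: dict = field(default_factory=dict)

    def complete_phase(self, note: str) -> None:
        """Record the evidence gathered in the current phase, then advance."""
        self.evidence[self.current_phase] = note
        if self.current_phase is not Phase.SELECT_NEXT_TARGET:
            self.current_phase = Phase(self.current_phase.value + 1)


project = DeImplementationProject(practice="whole language reading instruction")
project.complete_phase("Identified as ineffective by decades of research (NRP, 2000).")
print(project.current_phase)  # Phase.DOCUMENT_PREVALENCE
```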

Davidson et al.’s (2017) iterative de-implementation process mirrors the steps of school psychology’s often-used problem-solving model (problem identification, problem analysis, plan implementation, and plan evaluation). As a result, school psychologists should readily find connections between the de-implementation stages and their current practices. For example, school psychologists use the problem-solving model when consulting and collaborating with teachers regarding a student’s behavioral challenges and when engaging in ethical problem solving.

De-implementation of ineffective interventions in favor of EBPs will benefit a district in multiple ways. Districts will save money and resources by replacing ineffective interventions with EBPs, ensuring that students make progress and creating better outcomes. De-implementation models require ongoing progress monitoring, which is often missing from ineffective practices. Ongoing progress monitoring will also strengthen interventions, providing information on effectiveness through visual analysis and allowing school personnel to engage in the problem-solving method (Stahmer et al., 2018).
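As one concrete illustration of the ongoing progress monitoring described above, the minimal sketch below fits a simple least-squares trend line to weekly curriculum-based measurement scores to supplement visual analysis; the scores, units, and aim line are invented for illustration.

```python
# Minimal sketch: a least-squares trend line over weekly progress-monitoring
# scores to supplement visual analysis. All values are invented.
# Requires Python 3.10+ for statistics.linear_regression.
from statistics import linear_regression

weeks = [1, 2, 3, 4, 5, 6, 7, 8]
scores = [42, 44, 43, 47, 49, 52, 51, 55]  # e.g., words read correctly per minute

slope, intercept = linear_regression(weeks, scores)
aim_line_slope = 1.5  # hypothetical goal: 1.5 words/minute of growth per week

print(f"Observed growth: {slope:.2f} units/week (aim line: {aim_line_slope})")
if slope < aim_line_slope:
    print("Growth is below the aim line; revisit the intervention.")
else:
    print("Growth meets or exceeds the aim line; continue and keep monitoring.")
```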

Causes and Consequences of Ineffective Intervention and Research-to-Practice Gap

School personnel are expected to implement scientifically based interventions and to evaluate their effectiveness, with the goal of having more students achieve academic proficiency (No Child Left Behind [NCLB], 2002). In response to this charge, school psychologists have researched and evaluated many EBPs (Sanetti & Collier-Meek, 2019). Nonetheless, there remains a research-to-practice gap, with the vast majority of Nationally Certified School Psychologists reporting that they rarely or never implement EBPs in their practice (Hicks et al., 2014). More school psychologists reported using personal experience than reference books or journal articles to inform their professional practice (Bramlett et al., 2002). In the Hicks et al. (2014) survey, the majority of school psychologists reported that they did not receive adequate training in EBPs, which likely contributes to lower rates of EBP implementation. Additionally, school psychologists receive limited training in implementation science from either a research or a practice perspective, making it challenging for them to effectively use their skills and knowledge to change practice in schools (Sanetti & Collier-Meek, 2019).

Barriers to De-Implementation and Implementation

Many researchers have examined the barriers to changing practice through de-implementation, which suggests both the importance and the complexity of de-implementing non-EBPs (e.g., Ennett et al., 2003; Gottfredson & Gottfredson, 2002). Identified barriers occur at both the individual and the organizational/systems levels. At the individual level in schools, the skills, attitudes, and beliefs of school staff and other stakeholders affect the success of both de-implementation and implementation. Specifically, teachers struggled to implement EBPs effectively when they did not receive adequate training in those instructional techniques (Ennett et al., 2003), and they were less willing to implement them when they were add-ons rather than an integrated component of the curriculum or school environment (Gottfredson & Gottfredson, 2002). When administrators and school staff have positive beliefs about a non-EBP that is currently in place, they may resist assistance offered by the school psychologist to de-implement the non-EBP and replace it with an EBP (Durlak & DuPre, 2008; Forman et al., 2009, 2013; Kelly, 2012). Educators may also be reluctant to change longstanding ineffective practices in favor of an EBP, particularly if they believe the EBP requires more effort (Lovett & Harrison, 2021).

Barriers to de-implementation at the organizational or systems level include politics, policy, too many competing demands, lack of time, and lack of funding (Durlak & DuPre, 2008; Forman et al., 2009; Hicks et al., 2014). Funding in particular has been found to be a major concern (Reinke et al., 2011), with 54% of intervention developers themselves reporting money as the primary barrier to successful implementation (Forman et al., 2009). As a result, even when the current practice is harmful, continuing with the ineffective practice is often viewed as more manageable than attempting to implement a new EBP. Another organizational barrier to de-implementation relates to the cultural norms of the school system and the ability of the school psychologist to influence and shape the educational environment. Successful de-implementation requires alignment of the intervention with school philosophy, goals, policies, and existing programs, as well as buy-in from teachers and administrators (Forman et al., 2009, 2013). The school’s perception of school psychologists as leaders and facilitators of systems-level support also contributes to how well a new intervention is received. Consequently, without effective communication about the importance of schoolwide de-implementation, it can become difficult to de-implement effectively at the school or systems level (Lovett & Harrison, 2021).

Consequences of Maintaining Ineffective Practices and the Research to Practice Gap

Using non-EBPs can result in poorer outcomes for students in a number of ways. For example, when students do not receive adequate reading instruction, many of them do not acquire expected reading skills. Only 34% of eighth-grade students in the USA read at or above proficiency (National Assessment of Educational Progress [NAEP], 2015), and 17% of Canadians cannot read above a below-basic level (Heisz et al., 2016). As a result, these students have fewer educational and employment opportunities (Cree & Steward, 2012). As another example, although 70% of adults with a mental health diagnosis received their diagnosis before they turned 18 (Kessler et al., 2005), 70% and 80% of students with mental health needs do not have access to services in the USA and Canada, respectively (U.S. Surgeon General, 1999; UNICEF Canada, 2007). When EBPs in the prevention and early intervention of mental health disorders are not provided in schools, students are less likely to succeed academically and more likely to have negative long-term consequences (Leadbeater & Gladstone, 2016).

Frustratingly, but perhaps unsurprisingly, under-resourced schools, which tend to serve marginalized or minoritized groups, are less likely to bridge the research-to-practice gap (Mulé et al., 2014). This means that students from marginalized or minoritized groups (e.g., non-White, lower income, ELL, rural) are less likely to receive effective intervention services in their schools. Additionally, school psychologists often know little about the effectiveness of many EBPs with diverse populations in varying family and community contexts and may not be equipped to modify their practices for culturally and linguistically diverse students (Forman et al., 2013). The field is further limited in understanding a wide range of individual needs, as research on historically marginalized populations has primarily focused on African-American and Hispanic/Latino youth (Mak et al., 2007), leaving other historically marginalized groups largely absent from the literature (Forman et al., 2013). In an effort to reduce this discrepancy, cultural adaptation of EBPs has been explored, and several aspects of intervention have been identified and targeted in preliminary research. Specific adaptations have included language, cultural content, cultural resonance, treatment goals, and the integration of stakeholders into the adaptation process (Domenech-Rodriguez et al., 2011).

Examples of Challenges with De-Implementation and Implementation Science

Reading Instruction

Whole Language

Whole language is a constructivist philosophy rather than an instructional approach, so any instructional activities designed through the lens of this philosophy are whole language approaches (Goodman, 1986). In the constructivist view, direct instruction is antithetical to the idea that all knowledge is co-constructed (Seidenberg, 2017). One primary tenet of whole language is that reading is a “psycholinguistic guessing game” (Goodman, 1967, p. 127), in which students use cues to guess the meaning of unknown words. A second tenet is that children gain literacy through authentic, meaningful interactions with text (Goodman, 1967). Castles and colleagues (2018) described the error in this assertion by noting that

“[T]hese theorists observed rapid construction of meaning for texts in skilled adult readers and concluded that instruction should focus on these skills. But such a conclusion is analogous to observing skilled concert pianists and concluding that piano instruction should involve putting a child in front of a Tchaikovsky score. The missing piece of the puzzle here is how these processes develop in children. . .” (p. 19)

Although whole language remains popular in both teacher training programs and practice, decades of research have indicated that it is not an effective strategy for reading instruction (Castles et al., 2018; National Reading Panel [NRP], 2000), making it a strong candidate for de-implementation (Upvall & Bourgault, 2018). Additionally, research has clearly identified direct, explicit, systematic instruction in the Big Five of reading (phonemic awareness, alphabetic principle, fluency, vocabulary, and comprehension), as well as in related areas including language (NRP, 2000), morphology (Castles et al., 2018; Gold & Rastle, 2007), and orthography (Castles et al., 2018; Kaefer, 2016), as effective for reading instruction, making such instruction a strong candidate for implementation (Forman et al., 2013; Kelly, 2012).

In a survey of literacy experts, including teachers and post-secondary professionals from multiple countries including the USA and Canada, 60% of respondents indicated that teachers are not prepared to provide effective reading instruction (International Literacy Association [ILA], 2020). Not surprisingly, respondents were divided on the importance of phonics instruction, with 31% noting that it deserved more attention and 24% noting that it deserved less (ILA, 2020). A national survey of primary school teachers’ reading instructional practices found that over 40% reported not using a specific reading program (Kretlow & Helf, 2013). While a majority of teachers reported teaching all components of the Big Five, they used an “eclectic” rather than a systematic approach (Kretlow & Helf, 2013, p. 177). Additionally, over half had either never received professional development related to reading instruction or had not received any in the last decade (Kretlow & Helf, 2013). Unfortunately, ineffective reading instruction leaves students with weaker reading skills, resulting in greater need for reading intervention and poorer reading outcomes (Jamieson, 2006; NAEP, 2015).

De-Implementation and Implementation of Evidence-Based Reading Instruction

In the area of reading, school psychologists have contributed significant research on evidence-based reading instruction (e.g., Torgesen et al., 1994), publishing these results and sharing them at conferences and within their schools (Sanetti & Collier-Meek, 2019). Unfortunately, we have been less successful in producing widespread change in practice, despite the broad implementation of Response to Intervention (RTI) and Multi-Tiered Systems of Support (MTSS).

Davidson et al.’s (2017) iterative process provides guidance on de-implementation that can be applied to whole language instruction. Through decades of research, whole language has clearly been identified as a practice that does not serve students well, which accomplishes the first phase of de-implementation (Davidson et al., 2017); explicit, direct instruction in phonemic awareness and phonics is the evidence-based alternative for early reading instruction. In the second phase (identifying prevalence), it is clear that this practice is quite prevalent across North America (Moats, 2000; Slavin et al., 2011), though the prevalence of whole language instruction in the individual systems where it will be de-implemented should be determined by local researchers or practitioners within those systems. Phase 3 involves investigating the context, reinforcements, and beliefs that maintain the practice. Addressing these will require training, confronting the challenges posed by the face validity of whole language, and using school data to compare the reading performance of students taught with whole language against that of students receiving evidence-based reading instruction, with consultation skills supporting the process. Because whole language is an entrenched practice with significant buy-in from educators, this phase, with its focus on outcomes for their students, will be important in increasing teacher willingness to abandon whole language for systematic, explicit, direct instruction. Phase 4, determining how to change teacher/interventionist behaviors, requires significant administrative support in deciding how to reinforce the move away from whole language (closely related to phase 3). In a qualitative study on translating evidence-based reading strategies into practice, teachers noted several components that supported their adoption of these practices (Grima-Farrell et al., 2019), including a focus on addressing the needs of students in teachers’ own classrooms, ensuring that the strategies are usable in classroom contexts, and giving agency to teachers, especially in the context of university-school research partnerships.

In phase 5, the school would select the method that best addresses these maintaining factors. Phase 6 then involves conducting a de-implementation experiment, removing the practice from one school and implementing the EBP in reading: explicit, direct instruction. Phases 7 and 8 involve evaluating the difference in student outcomes and related cost savings, and phase 9 involves determining whether another non-EBP should be de-implemented next (Davidson et al., 2017).
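To make the evaluation in phases 7 and 8 concrete, the minimal sketch below compares end-of-year oral reading fluency between a hypothetical school that de-implemented whole language and a comparison school, summarizing the difference as a standardized effect size; all values are invented and are not drawn from any cited study.

```python
# Minimal sketch: summarizing a de-implementation experiment with a
# standardized mean difference (Cohen's d). All values are invented.
from statistics import mean, stdev

# End-of-year oral reading fluency (words correct per minute)
explicit_school = [62, 58, 71, 66, 60, 74, 69, 63]    # de-implemented whole language
comparison_school = [51, 48, 60, 55, 50, 57, 53, 49]  # retained prior practice


def cohens_d(a: list, b: list) -> float:
    """Mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5


print(f"Effect size d = {cohens_d(explicit_school, comparison_school):.2f}")
```

A real evaluation would, of course, require a defensible design and larger samples; the point here is only that the phase 7 comparison can be quantified rather than left to impression.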

Although the specifics will likely differ for each school, research on implementing evidence-based reading instruction in schools can inform other schools engaged in the process. Lessons from one study of a multi-year collaboration between university researchers and a school indicated that in-depth support beyond typical professional development resulted in longer-lasting changes in practice (Greenwood et al., 2003). A key challenge with researcher-led models was that, despite improved student reading outcomes, systems-level work did not extend beyond the school or outlast the researchers’ involvement.

Strategies to de-implement whole language instruction and implement explicit, direct instruction will require work at the systems level, including state/provincial and federal policy, as these dictate the training teachers receive and curricular expectations (Montini & Graham, 2015). Moreover, evidence alone is often not enough to change practice; if it were, reading instruction would have already changed, as research on efficacious reading instruction has been available for several decades (see Castles et al., 2018, for a summary). To change practice, teachers and school administrators need to not only know the science supporting evidence-based reading instruction, but researchers and school psychologists also need to connect that knowledge to teachers’ beliefs and goals in order to increase engagement with evidence-based reading practices (Solari et al., 2020). Additionally, reading research needs to provide further support for its effectiveness in real settings: much of the reading research has demonstrated efficacy, but how well these practices translate to the messy reality of classrooms is also important, as noted in the Greenwood et al. (2003) study, especially considering the diversity of classroom environments (Forman et al., 2013). Other factors that contribute to the practices schools choose to implement, including inertia and professional stance, must be addressed as well (Prasad & Ioannidis, 2014).

Finally, school psychologists, through their training and professional development, need a strong understanding of reading development, evidence-based reading interventions, and the translation of research into practice in order to effectively support de-implementation of unsupported practices and implementation of supported ones (Solari et al., 2020). Unfortunately, over half of practicing school psychologists in one study reported that they did not have a strong understanding of the scientific basis of reading interventions, making this an area of focus for school psychology training as well (Nelson & Machek, 2007). School psychologists also need more specific training in the translational sciences of implementation and de-implementation in order to make meaningful differences in practice (Forman et al., 2013).

School-Based Mental Health Services

As noted previously, although most adults with a mental health diagnosis received it before age 18 (Kessler et al., 2005), most students do not have access to services. This lack of services results in lost opportunities for early intervention, which is often more successful than intervention implemented later (UNICEF Canada, 2007; U.S. Surgeon General, 1999). When students do receive mental health services, they are most likely to receive them in schools (Rones & Hoagwood, 2000), providing school psychologists and other school-based mental health service providers a unique opportunity to deliver EBPs for mental health (Shernoff et al., 2017).

Despite the research base for EBPs in mental health, surveys of practicing school psychologists suggest limited use of these practices. One reason for this gap may be a lack of training; unsurprisingly, school psychologists are less committed to implementing an EBP when they have not been trained in it, and in one survey of practicing school psychologists, “less than half of participants reported having had training in standard cognitive-behavioral interventions” (Forman et al., 2012, p. 216). Even practicing school psychologists who had recently completed a course on EBPs in mental health reported using generic counseling strategies or other mental health interventions without an evidence base (Farmer et al., 2002; Forman et al., 2009). Regardless of the reason for the limited use of EBPs for mental health in schools, the consequences for students can be severe, as noted above (Leadbeater & Gladstone, 2016).

De-Implementation and Implementation of Evidence-Based Mental Health Services

Davidson et al.’s (2017) iterative process for de-implementation can also be applied to EBPs in mental health services. First, school psychologists identify the practice that needs to be de-implemented. Here, practicing school psychologists may recognize that they, or their schools, are using a generic or non-evidence-based counseling strategy they found online and that their client is not improving. Both components (a non-EBP and a lack of student improvement) are crucial to identifying a practice that needs to be de-implemented, as a practice with limited research behind it may nonetheless be beneficial to a particular student or group of students, especially in the case of newer interventions with a developing research base.

Non-EBPs exist at all three tiers of MTSS. An example of a Tier 1 intervention that is not supported by research is Drug Abuse Resistance Education (DARE) for reducing drug use (Lilienfeld et al., 2012; Rosenbaum & Hanson, 1998). School-based mental health interventions are often provided in a small-group or individual setting at Tier 2 or 3. An example of a non-EBP at Tier 2 that could be considered for de-implementation, if it is not helping students’ mental health, is an unstructured lunch bunch group; despite the widespread use of lunch bunch groups in schools, particularly at the elementary level, a review of the literature shows limited to no research evaluating their effectiveness. An example of a non-EBP at Tier 3 that could be considered for de-implementation, if it is not helping the student’s mental health, is the use of board games for therapy. While commonly used to build rapport or engage students’ interest in sessions, the successful use of board games to support children’s and adolescents’ mental health has been largely unsubstantiated in the literature (Matorin & McNamara, 1996).

In the second phase, school psychologists would determine the current prevalence of the non-EBP’s use. As cited above, there is considerable research indicating that school psychologists are not using EBPs for mental health and that this can have negative effects on students. Unfortunately, because of the scarcity of literature on lunch bunch groups and therapeutic board games, the current prevalence of either practice cannot be accurately determined. School psychologists may also examine how frequently they use the non-EBP in their own practice or how often it is used in their building or district by other mental health service providers. The authors recognize that, in their own graduate training (including practicum and internship placements) and school-based practice, both lunch bunch groups and board games were frequently used or recommended.

In phase 3, school psychologists would determine what factors maintain the practice of the non-EBP. This may involve many variables, including comfort/familiarity with the practice, lack of training in EBPs, or lack of funding to purchase manualized programs. Given the potential widespread use of lunch bunch groups and board games, comfort or familiarity with these practices may be a factor maintaining their use. Nonetheless, school psychologists should examine which factors are relevant to their own or their school’s/district’s practice. Fourth, school psychologists review relevant methods for extinguishing the practice in the system. Here, school psychologists may consider other EBPs in mental health that can replace the current practice (see phase 6 below). Alternatively, if the non-EBP is being used by another practitioner, school psychologists can use consultative and collaborative skills to help the practitioner understand the problem and support them in moving toward a solution.

The fifth phase requires school psychologists to choose how to best address the factors that are currently maintaining the non-EBP. For example, school psychologists may consider engaging in professional development on current EBPs in mental health services to inform their own practice. This professional development could involve reviewing databases such as What Works Clearinghouse (Institute of Education Sciences, n.d.) or the Substance Abuse and Mental Health Services Administration’s (SAMHSA) Evidence-Based Practices Resource Center (SAMHSA, n.d.), reading NASP Best Practices chapters, or attending conferences (such as the NASP annual convention).

School psychologists execute the de-implementation method experimentally in phase 6, de-implementing their current practice and replacing it with an EBP. For example, lunch bunch groups often focus on promoting social skills, and many evidence-based social skill intervention programs exist that could be implemented instead (e.g., Second Step, PATHS, Positive Action Program; Frey et al., 2014). For the use of board games in individual counseling, the student’s presenting problem would need to be considered before a replacement EBP could be selected. As an example, if the child is demonstrating symptoms of anxiety, the school psychologist may consider a program such as Coping Cat (Kendall & Hedtke, 2006).

In the seventh and eighth phases, school psychologists evaluate the effectiveness of the de-implementation method and collect evidence of improved outcomes from the successful de-implementation. They may do this by gathering data to determine effectiveness for the client after removing the current practice and replacing it with the EBP. These data would likely be collected on an individual basis and could include measures such as self-report rating scales, teacher or parent rating scales, number of office referrals, or even progress toward IEP goals. Lastly, they would determine the next practice for de-implementation (e.g., non-EBPs for behavioral disorders such as ADHD).
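As one hedged illustration of phases 7 and 8 at the individual level, the sketch below computes a reliable change index (RCI; Jacobson & Truax, 1991) from pre/post rating-scale scores; the framework does not prescribe this metric, and the scores and psychometric constants shown are assumed for illustration.

```python
# Minimal sketch: a reliable change index (RCI) for pre/post rating-scale
# scores. The scale's standard deviation and reliability would come from its
# manual; the values below are assumed for illustration.
from math import sqrt


def reliable_change_index(pre: float, post: float, sd: float, reliability: float) -> float:
    """(post - pre) divided by the standard error of the difference."""
    se_measurement = sd * sqrt(1 - reliability)
    se_difference = sqrt(2) * se_measurement
    return (post - pre) / se_difference


# Hypothetical anxiety rating-scale T-scores (lower = fewer symptoms)
rci = reliable_change_index(pre=68, post=57, sd=10, reliability=0.90)
print(f"RCI = {rci:.2f}")  # magnitudes beyond 1.96 suggest reliable change
```

A negative RCI here reflects symptom reduction; on scales where higher scores indicate improvement, the sign would flip.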

Unlike in the previous example, de-implementation of generic or non-evidence-based mental health practices and implementation of EBPs can likely be accomplished more readily by school psychologists themselves. Because school-based mental health services are often considered a school psychologist’s area of practice (Shernoff et al., 2017), the decision of which strategies to use in individual or group counseling sessions is more likely to be left to the school psychologist. Certainly, after successful individual de-implementation, school psychologists may broaden their scope to systems-level de-implementation with other mental health service providers in the building or district or with Tier 1 preventive programs. The individual approach has the benefit of more immediate effects, which can then be used to support broader de-implementation efforts. This may be an appealing entry point into de-implementation, as school psychologists can engage in the process without some of the institutional barriers a systems-level approach will encounter.

Discussion

Suggestions for De-Implementation to Support Implementation Practice in School Psychology

One barrier to incorporating de-implementation science into school psychology research, practice, and training is the lack of knowledge and competence that school psychologists have in this area (Forman et al., 2013). As asserted by Sanetti and Collier-Meek (2019), we can address this barrier as a field by providing graduate programs, school psychology researchers, and practitioners with additional training and support in de-implementation and implementation science, as well as training in EBPs, as school psychologists report inadequate knowledge of EBPs in reading and mental health services (Forman et al., 2012; Nelson & Machek, 2007). Generally, school psychology training programs place a stronger emphasis on evidence-based assessment than on evidence-based intervention in coursework and practicum experiences, suggesting that this is an area of growth for training programs seeking to support de-implementation. In practice, many school psychologists fill the narrow, assessment-focused role of psychometrician. However, role expansion has broadened the scope of practice for school psychologists in many places (Fagan & Wise, 2000). Central emerging tasks for school psychologists include using data in decision making, advocating for systems-level improvements, and consultation, through which we can support effective de-implementation and implementation of EBPs.

In addition to training in EBPs, then, school psychologists may also benefit from training in data-based decision making and research that specifically homes in on critical evaluation of their local school practices. De-implementing non-EBPs and replacing them with EBPs requires that school psychologists intentionally develop and use scientific skills to critically evaluate their practice and the research, including awareness of cognitive bias in their own practice and attention to warning signs of pseudoscience (Lilienfeld et al., 2012). Graduate training programs may consider emphasizing practice-based research skills such as action research (Song et al., 2014), which involves a “systematic inquiry into a practice problem with the goal of improving it (and their school clients) by implementing actions” (p. 257).

As noted above, de-implementation and implementation often take place at the systems level. As a result, school psychologists must take into account significant organizational and cultural considerations and complexities when encouraging systems to de-implement non-EBPs and implement EBPs. Nonetheless, the continued calls for advocacy as a core role and function of school psychologists (Oyen & Wollersheim-Shervey, 2019) press current practitioners and trainers of future practitioners to actively advocate for EBPs that promote student achievement and well-being. Another emerging role many school psychologists fill is providing staff development and in-service training. This is an opportunity for school psychologists to use their consultative skills to train staff in basic scientific literacy, to support them in de-implementing practices, and to train them in implementing an EBP (Shaw, 2021). Consultative skills can be honed to develop staff, administrator, and other stakeholder support, as well as to provide good training and ongoing support (Forman et al., 2014).

Although researchers have a good understanding of evidence-based practices in their field, they likely know less about the practices actually implemented in schools and the barriers to successful implementation. In research, then, school psychologists should collaborate with schools to study the practices currently in place and potential steps to remove barriers at the systems and individual school levels. With this knowledge, researchers would be equipped to work through the first two phases of the de-implementation process (identifying practices that lack evidence, and therefore warrant de-implementation, and documenting their prevalence) as well as the additional steps needed to move EBPs forward in schools. All of these tasks center on the basic problem-solving model. It is up to school psychologists, whether they are practitioners, trainers, or researchers, to identify and define the problem (non-EBPs), to design and implement interventions (de-implementation and implementation), and to re-evaluate whether the problem has been adequately addressed.

Further Directions

School psychology has played an important role in several areas of implementation change in school systems, most notably in implementing RTI and MTSS. However, as a field, we have taken a less active role in de-implementation, which has limited the effectiveness of our implementation gains. The difficulty of de-implementing ineffective reading instruction and mental health services in schools provides an example of this gap. It is important for school psychologists to take a more active and intentional role in closing the research-to-practice gap to promote improved educational outcomes for all students. One way to fill this role is by supporting de-implementation science in conjunction with implementation science research in schools. As Forman et al. (2013) suggested, in order to support school-based implementation science, school psychology researchers and practitioners will need to receive training in implementation science and to build stronger collaborative relationships with school boards. Forman et al. (2013) do not mention de-implementation practices; however, attending to them is also necessary for effectively implementing evidence-based practice. Given their training in data-based decision making and consultation, school psychology researchers and practitioners engaged in local research are uniquely positioned to conduct this research and to interpret and disseminate the results on the ground. Moving forward, school psychology researchers should seek to understand de-implementation science as it relates to the field of school psychology, how de-implementation is practiced in applied settings, and its impact on students.