Systematic and dynamic continuous improvement processes are foundational to high-quality school mental health programs that maximize impacts for rural youth (Owens, Watabe, & Michael, 2013). The need for effective and efficient school mental health (SMH) programs has been well documented (Armbruster, Gerstein, & Fallon, 1997; Flaherty & Weist, 1999; Jennings, Pearson, & Harris, 2000; Nabors & Reynolds, 2000). This scholarship highlights persistent gaps in mental health services for children and adolescents, the negative impact of those gaps on short- and long-term outcomes among youth, and the promise of SMH programs in addressing gaps and improving outcomes for all young people (Armbruster et al., 1997; Diala et al., 2001; Nabors & Reynolds, 2000; Weist, Myers, Hastings, Ghuman, & Han, 1999). A growing body of evidence suggests the need for high-quality and effective SMH programs may be particularly acute in rural communities. Research points to higher prevalence rates for some mental health outcomes among rural youth (Eberhardt et al., 2001; Havens, Young, & Havens, 2011) along with barriers to accessing mental health services, including increased stigma, that reduce mental health service utilization in rural populations (Calloway, Fried, Johnsen, & Morrissey, 1999; Gamm, Stone, & Pittman, 2010; Holzer, Goldsmith, & Ciarlo, 1998; Owens et al., 2013). Responding to these needs, a limited but growing body of research supports the positive impacts of rural SMH programs (Evans, Radunovich, Cornette, Wiens, & Roy, 2008; Morsette et al., 2009).

While limited research may have slowed advances in rural SMH, a literature on best practices for rural SMH programs is emerging (Owens et al., 2013). Primary among these are best practice processes for planning, implementing, and improving rural SMH programs. Using data for quality assurance and improvement purposes is recognized as a critical feature of effective and efficient SMH programs (Barrett, Eber, & Weist, 2013; Wandersman, 2003; Wandersman et al., 2008; Weist et al., 2005). The rapid growth of processes, models, systems, tools, measures, and technology specifically intended to facilitate improvement in SMH programs illustrates the importance of effective data use to plan, implement, evaluate, and improve programs and services within the field (Barrett et al., 2013; Horner, Sugai, & Anderson, 2010; National Center on Response to Intervention, 2010; Wandersman, 2003; Wandersman et al., 2008). The knowledge and resources generated through this work can inform best practices for improvement in rural SMH programs. To maximize this potential, key stakeholders in rural SMH must be equipped to navigate the complexities of continuous improvement in schools and school mental health.

Drawing upon the rich and evolving work on continuous improvement in SMH, we aim to provide the rural SMH workforce and other stakeholders with background, knowledge, and resources to plan, implement, and improve rural SMH programs. First, we briefly describe the current context of continuous improvement in SMH programs. We then highlight several “best practice” elements of an improvement system for SMH through a description of tiered systems of support widely implemented in schools. A case study illustrates how these best practices might be applied to a hypothetical rural school mental health program. Finally, we discuss key contextual factors of rural SMH programs that create barriers and opportunities for continuous improvement with a focus on local and broader implications for practice, policy, and research.

The Context of Continuous Improvement in SMH Programs

Over the last 15 years, federal policy and funding priorities reshaped the educational landscape by emphasizing the use of evidence to make educational decisions, a reliance on research-based educational practices, and accountability for schools based on student performance outcomes (Every Student Succeeds Act, 2015; Individuals with Disabilities Education Improvement Act, 2004; Mandinach, 2012; No Child Left Behind, 2002). The impacts of this cultural shift in data use are apparent in the daily practice of school personnel who participate in data teams, access student data via school-based databases, visualize those data using data walls or data dashboards, engage in data-based decision-making, and select best practices based on specific and measurable goals framed as student learning outcomes. Local SMH programs are inextricably linked to school culture, and broader scholarship on “what works” in SMH evolved during this era of accountability in schools.

The importance of data use and evaluation is echoed across disciplines in SMH. This can be seen in the standards set forth by the fields of school social work, school psychology, and school counseling. The National Association of School Social Workers’ Standards for School Social Work Practice (2012), the American School Counselor Association School Counselor Competencies (2012), and the National Association of School Psychologists’ Model for Comprehensive School Psychological Services (2010) all identify data use as a key competency in SMH practice and require practitioners to be able to collect, understand, and use data in their work. While a comprehensive review of current trends in school improvement is beyond the scope of this chapter, facilitating continuous improvement in SMH programs is clearly embedded within the broader context of data use in schools.

In defining ten key principles for Best Practice in Expanded School Mental Health, Weist et al. (2005) highlighted the importance of using data to continuously evaluate and improve school mental health programs: “Quality assessment and improvement activities continually guide and provide feedback to the program” (Principle 5, p. 9). This principle reflects a shift within the broader school context and the field of education toward performance improvement processes more common in other industries such as healthcare and manufacturing (APQC, 2014; Park, Hironaka, Carver, & Nordstrum, 2013). These cross-industry influences are evident within the realm of school improvement and reform, including the philosophy, methodologies, and scholarship of Improvement Science (Bryk, Gomez, Grunow, & LeMahieu, 2015).

Planning or improvement cycles are essential parts of data-driven decision-making originally developed in the for-profit world and now frequently used in education and social services (Lozeau, Langley, & Denis, 2002). In essence, planning or improvement cycles are change-management processes wherein groups of individuals or stakeholders work together to develop a defensible plan for addressing mutually agreed upon needs or demands within a community or organization. Planning and improvement cycles have some paradoxical characteristics; they are structured yet flexible, they are linear yet iterative, and they are simple yet complicated.

At first glance, the simple, structured, linear nature of all planning or improvement cycles is apparent. Each is expressed in terms of simple steps or stages from the most basic PIE (Plan, Implement, Evaluate; Flaspohler et al., 2003; Wandersman et al., 2003) and PDSA (Plan, Do, Study, Act; Deming, 1982) to the ten steps of GTO (Getting to Outcomes; Chinman, Imm, & Wandersman, 2004).
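
The structured-yet-iterative character of these cycles can be illustrated with a small sketch. The following Python fragment is a minimal, hypothetical rendering of a PDSA loop, not an implementation from any cited framework; the goal, baseline, and `run_intervention` callable are placeholders a team would replace with its own measures.

```python
# A minimal, hypothetical rendering of a PDSA loop. This sketches the cycle's
# structure only; it is not a published improvement tool.

def pdsa_cycle(goal, baseline, run_intervention, max_cycles=4):
    """Repeat Plan-Do-Study-Act until the goal is met or cycles run out."""
    current = baseline
    for cycle in range(1, max_cycles + 1):
        # Plan: set a measurable target for this cycle (here, close half the gap).
        target = current + (goal - current) / 2
        # Do: implement the intervention and collect outcome data.
        observed = run_intervention(target)
        # Study: compare observed results against the target.
        print(f"Cycle {cycle}: target={target:.1f}, observed={observed:.1f}")
        # Act: adopt the result and repeat, or stop once the goal is reached.
        current = observed
        if current >= goal:
            break
    return current

# Hypothetical usage: an intervention whose outcome lands just short of each target.
final_value = pdsa_cycle(goal=95.0, baseline=80.0, run_intervention=lambda t: t - 1.0)
```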

Embedded in each simple step lies the complex process of completing the step; as Wandersman has said, improvement cycles are common sense but not common practice (Fetterman & Wandersman, 2005). For example, planning necessarily entails identifying goals, objectives, processes, outcomes, impacts, and measurements. Each of these tasks requires specific skills and abilities to complete. Like other organizations, schools often lack the readiness or capacity to use planning or improvement cycles as a part of the school mental health program (Flaspohler, Meehan, Maras, & Keller, 2012; Maras, Splett, Reinke, Stormont, & Herman, 2014). For example, insufficient training and support, limited resources, and competing priorities are all barriers to planning, implementing, and evaluating school mental health programs (Lendrum, Humphrey, & Wigelsworth, 2012). Most proponents of any particular planning or improvement cycle affirm that organizations should modify existing structures of the planning process to meet their own organizational needs and desires. The iterative or cyclical nature of these steps is often expressed through circular depictions emphasizing that the steps are intended to be repeated as part of an ongoing cycle of continuous improvement.

Given gaps in schools’ capacity, complex frameworks or models for data use in school mental health may overwhelm schools, particularly rural schools that struggle with fewer resources and different challenges in terms of staff capacity and turnover (Maiden & Stearns, 2007). Before reviewing common frameworks for data use currently employed in schools in more detail, we first describe how an improvement cycle is activated by an interdisciplinary team using diverse sources of data. Fundamentally, interdisciplinary teams use diverse sources of school data as part of ongoing improvement cycles to plan, implement, and evaluate effective school mental health programs.

Problem-Solving Teams

There is widespread agreement that interdisciplinary planning teams are an essential component of effective school mental health programs (Anderson-Butcher & Ashton, 2004; Markle, Splett, Maras, & Weston, 2014). The basic premise of such teams is that engaging diverse stakeholders in shared decision-making results in better programs and outcomes for students. Many schools have some kind of school mental health team that serves this function to some degree even as teams are multi-purposed, include diverse members, and have different titles (Nellis, 2012). While certainly not exhaustive, such a team may be called: Care Team, Student Intervention Team (SIT), Problem-solving Team (PST), Student Assistance Team (SAT), Pre-referral Intervention Team (PIT), PBS team (referring to Positive Behavior Supports, discussed in more detail below), RtI team (referring to Response to Intervention, discussed in more detail below), interagency team, System of Care team (or SoC team), or wellness team (Burns, Peters, & Noell, 2008). While differences in terminology can be challenging for nonschool personnel, these teams share many common characteristics. We use the term problem-solving team (PST) throughout the rest of this chapter.

A best practice in the continuous improvement of school mental health programs is that schools should have a PST (Markle et al., 2014), and many schools use these teams (Algozzine, Newton, Horner, Todd, & Algozzine, 2012; Nellis, 2012). PSTs are interdisciplinary and have a variety of functions ranging from student referral and evaluation to planning service delivery to implementing evidence-based practices to enacting systems change (Bahr & Kovaleski, 2006; Bahr, Whitten, & Dieker, 1999; Nellis, 2012).

Research suggests PSTs produce benefits at both the student and school levels. At the student level, PSTs are associated with increased student attendance and academic achievement (Oppenheim, 1999), less student misconduct (Smith, Armijo, & Stowitschek, 1997), and fewer referrals for special education evaluation and placement (Kovaleski & Glew, 2006). Further, in a review of nine studies examining the effectiveness of pre-referral intervention teams, Burns and Symington (2002) found an overall effect size of 1.15 for student outcomes like time on task, task completion, scores on behavior rating scales, and observations of target behaviors. At the school level, research findings suggest a 0.90 effect size for system outcomes including referrals to special education, new placements in special education, percentage of referrals diagnosed with a disability, number of students retained in a grade, and increases in consultative activity by school psychologists (Burns & Symington, 2002). Overall, it appears PSTs are associated with positive outcomes for both students and schools.

Research suggests that in order to effectively improve student outcomes, PSTs must closely follow evidence-based problem-solving procedures (Kovaleski, Gickling, Morrow, & Swank, 1999). Given the importance of adhering to evidence-based practices, it is important to consider best practices for PSTs, with a specific focus on the ways in which teams engage in the problem-solving process. Based on an extensive review of the literature and years of field-based experience, Markle et al. (2014) suggest the following best practices for PSTs: secure teacher and administrator buy-in regarding the importance and benefits of PSTs; recruit an interdisciplinary team who have clearly defined roles; clearly and collaboratively decide upon the purpose of the PST and the procedures the team will follow (i.e., how often the team will meet, ground rules, agendas); use a systematic planning or improvement cycle, often referred to as a “problem-solving process”; and provide ongoing professional development with a specific focus on data-based decision-making, sharing practice, and evaluating team progress and effectiveness.

Although all of these best practices are important, one practice in particular is worthy of further consideration: use of a systematic problem-solving process. This process is critical to the effectiveness of a PST because it dictates how and when decisions are made; having a clear problem-solving process is associated with greater team satisfaction as well as success in generating useful, step-by-step intervention plans (Chalfant & Pysh, 1989; Safran & Safran, 1996). To be clear, these processes mimic the planning or improvement cycles described above, but a unique research base for PST processes exists. There are a number of best practices for the problem-solving process: emphasizing the use of data-based decision-making and evidence-based interventions (Doll et al., 2005), emphasizing problem-solving rather than problem identification (Burns et al., 2008), being efficient with time (Doll et al., 2005), defining problems in measurable terms (Safran & Safran, 1996), and exploring various options (Etscheidt & Knestling, 2007).

One model for the problem-solving process includes five steps: (1) identify and describe the problem, (2) analyze the problem, (3) develop a plan/possible solution, (4) implement the plan, and (5) evaluate the results (Flaspohler, Ledgerwood, & Andrews, 2007). In step one, it is suggested that PSTs begin by developing a clear, objective, and measurable description of the problem. In step two, PSTs should analyze the problem using relevant data in order to determine the source of the problem, gathering more data if needed. This step should end with the generation of a hypothesis statement about the problem. In step three, PSTs focus on developing a plan by selecting a measurable goal, determining specific and feasible strategies, interventions, and/or supports, deciding how progress will be monitored, and delegating responsibilities for implementation. In step four, PSTs move to implementing the plan they outlined, collecting data on fidelity to strategies, interventions, and/or supports, and monitoring progress. In the final step, PSTs evaluate the progress monitoring data and fidelity to the intervention plan and determine next steps based on these data. This problem-solving process can be used by any interdisciplinary team to strategically use data to plan, implement, and evaluate supports for students. Given the obvious importance of data within a data-based decision-making model, the following section reviews types of data commonly found in schools, as well as other data drivers in schools.
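
As an illustration only, the five steps might be captured in a simple structured record like the Python sketch below; the field names and values are hypothetical assumptions for demonstration, not a standardized PST form.

```python
# An illustrative record of one referral moving through the five steps.
# Field names and values are hypothetical, not a standardized PST form.

from dataclasses import dataclass, field

@dataclass
class ProblemSolvingRecord:
    # Step 1: identify and describe the problem in measurable terms.
    problem_description: str
    # Step 2: analyze the problem and state a hypothesis.
    hypothesis: str
    # Step 3: develop a plan with a measurable goal, strategies, and roles.
    goal: str
    strategies: list = field(default_factory=list)
    responsibilities: dict = field(default_factory=dict)
    # Step 4: implement; log fidelity checks and progress-monitoring data.
    fidelity_checks: list = field(default_factory=list)
    progress_data: list = field(default_factory=list)
    # Step 5: evaluate the data and record the team's decision.
    decision: str = "continue monitoring"

record = ProblemSolvingRecord(
    problem_description="Student completes 40% of math tasks (peers average 85%)",
    hypothesis="Task avoidance maintained by escape from difficult work",
    goal="80% task completion within 6 weeks",
    strategies=["chunked assignments", "check-in/check-out"],
    responsibilities={"check-in/check-out mentor": "school counselor"},
)
record.progress_data += [42, 55, 63, 71]  # weekly percent task completion
```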

School Data

Schools use a variety of data sources to inform decisions about students’ academic and behavioral health needs as well as school climate. In the following section, we will discuss these various sources of data and their specific uses in schools. Particular attention will be paid to how rural schools collect and use these sources of data.

Academic Data Used in Schools

In terms of academic data, most data is connected to individual student performance and is used to improve students’ academic program and performance. Academic data might include assessment results, grades, work samples, and teacher report. For instance, researchers sampled seven school districts and found that most teachers reported using assessments both to monitor student progress and as a way to improve student scores on state tests (Shepard, Davidson, & Bowman, 2011). Specifically, teachers in these school districts used assessment results to determine what material to re-teach and to guide classroom instruction (Means, Padilla, Debarger, & Bakia, 2009).

In general, research suggests that the effective use of data impacts student performance. For example, in a study that involved three rural schools, researchers found that data was used to help children with disabilities reach their highest potential (Kerr, Marsh, Ikemoto, Darilek, & Barney, 2006). In the study, the teachers made decisions about which topics to cover at parent teacher conferences and how students with disabilities should be taught. Another school, in Milwaukee, used reading scores to hire specialists and offer help to students who were struggling with reading (Mason, 2002). This particular school collected data to ensure they had the appropriate staff to tutor students who were struggling in reading. A literature review of rural schools’ implementation of data-driven intervention frameworks showed that all eleven studies examined demonstrated enhanced academic achievement for at-risk students (Dexter, Hughes, & Farmer, 2008). In general, there is evidence to suggest meaningful use of data may improve academic performance and student success.

Nonacademic Data

Data use in schools is not strictly relegated to academic data; schools also collect numerous other types of data. For example, attendance data is often collected in schools, even in schools without a strong culture of data use. This is not surprising considering that poor attendance has been linked to lower academic achievement (Balfanz & Byrnes, 2013) and increased risk of drop-out (Rumberger & Thomas, 2000). In addition, many states base a portion of school funding on student attendance (Epstein & Sheldon, 2002).

Office discipline referral (ODR) data is commonly used in schools to make decisions about behavioral interventions (Elliott, 2008; Fuchs & Fuchs, 2006; Sandomierski, Kincaid, & Algozzine, 2007). One of the most efficient ways to collect ODR data is via the School-Wide Information System (SWIS) program, an online database that school staff can easily access (Irvin et al., 2006). For each incident, school staff input the name of the referring teacher, the name of the student, the time of day the incident occurred, the nature of the incident, and the location of the incident. SWIS is capable of running different types of reports that range from exploring the rates of different types of problem behaviors that result in ODRs to examining schoolwide ODR patterns to looking at individual students’ ODR rates (Irvin et al., 2006). In their study, Irvin et al. (2006) collected data from school personnel at 32 elementary and middle schools that use the SWIS program to log ODR data. They found that not only did the schools surveyed enter ODR data into SWIS regularly, but they also actively used this data to inform decision-making. Specifically, schools reported using ODR data/SWIS reports to help with early identification of student problem behavior, identification of specific problem behaviors schoolwide, and development and/or problem-solving of behavioral program interventions.

SWIS is just one type of database for ODR data. Other studies have asked schools to develop their own databases to house ODR data and have found that these also provide disaggregated data regarding the average number of ODRs per student, average number of ODRs per day, and the proportion of students with both one or more and ten or more ODRs (Sprague, Sugai, Horner, & Walker, 1999). It is important to note, however, that ODR data is not the only important type of behavioral data to collect from students. Studies have shown that ODR data consistently fail to capture students experiencing internalizing problems (McIntosh, Campbell, Carter, & Zumbo, 2009; Sandomierski et al., 2007). However, few screening or identification measures of internalizing symptoms have been investigated or explored (Sandomierski et al., 2007).
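
The disaggregated metrics described above reduce to simple counts and ratios. The Python sketch below assumes the simplest possible record format, one student ID per referral; real systems such as SWIS define richer schemas and their own reports.

```python
# A sketch of schoolwide ODR summary metrics computed from a bare list of
# per-referral student IDs. The record format is an assumption for illustration.

from collections import Counter

def odr_summary(referrals, enrollment, school_days):
    """Compute schoolwide ODR metrics from a list of per-referral student IDs."""
    per_student = Counter(referrals)
    total = len(referrals)
    return {
        "avg_odrs_per_student": total / enrollment,
        "avg_odrs_per_day": total / school_days,
        "prop_students_1_or_more": len(per_student) / enrollment,
        "prop_students_10_or_more": sum(1 for n in per_student.values() if n >= 10) / enrollment,
    }

# Hypothetical example: 16 referrals across 3 students in a 175-student school.
referrals = ["s01"] * 12 + ["s02"] * 3 + ["s03"]
stats = odr_summary(referrals, enrollment=175, school_days=170)
```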

Many states have developed systems that help schools collect data. For instance, the state of Ohio provides another way to collect ODR data with their Education Management Information System (EMIS). All incidents that require disciplinary action are entered into EMIS. School districts can then use this data to identify which students are in need of further behavioral services. Ohio also requires all community mental health providers who deliver Medicaid-funded school-based mental healthcare to collect student data before and after the intervention using the Ohio Problem, Functioning, and Satisfaction Scales (Ohio Scales). These scales incorporate self-, parent-, and teacher-reports of student behavior, and when they are used as pre- and posttest measures, can provide useful information about the effectiveness of mental health interventions. Using EMIS and Ohio Scales data is a good example of how schools can use preexisting data to help students in need.

EMIS is only one example of how states help schools collect data. Researchers from the Mid-Atlantic Regional Education Laboratory investigated how Arkansas, Texas, Florida, and Virginia support local data use (Gottfried, Ikemoto, Orr, & Lemke, 2011). Overall, researchers found that all four states implemented three types of policies/practices to enhance local data use. First, they found that each of these states created an electronic data warehouse to house various kinds of data from schools statewide. These warehouses have the capacity to house both school-level and state-level data (e.g., attendance rates per school and performance on statewide tests). Second, researchers found that each state provides teachers, administrators, and principals with access to their individual schools’ or districts’ data. In Arkansas, a rural state, school staff have access to school-level data about demographics (e.g., language spoken at home, dropouts and withdrawals listed by race/ethnicity, enrollment by race/ethnicity, graduation rates by gender and race), school personnel (e.g., teacher experience, teacher certifications, district years of experience for staff), and school finances (e.g., building losses, poverty index, property values). Finally, all four states are also committed to building local capacity to analyze and understand data. States accomplish this goal through offering professional development trainings in how to understand and analyze data to teachers and other school staff statewide. Both Arkansas and Virginia offer web-based professional development to increase accessibility to training statewide. Overall, this study exemplifies how states can play a critical role in helping individual schools use data.

Many high schools nationwide use the Youth Risk Behavior Surveillance System (YRBSS) to collect behavioral health data on their students. The YRBSS was developed by the Centers for Disease Control and Prevention (CDC) in order to assess the following six categories of health risk behavior in adolescents: behaviors that contribute to unintentional injuries and violence, risky sexual behaviors, alcohol and drug use, tobacco use, unhealthy dietary behaviors, and inadequate physical activity (CDC, 2015). The YRBSS is administered to high school students nationwide every 2 years. In addition to surveying students, the CDC also surveys school staff to better understand the types of school health policies in place at their schools. The CDC uses this data to create School Health Profiles (Foti, Balaji, & Shanklin, 2011). The data is not only used by the CDC, however. It is also used by state, territorial, and local agencies (like schools) to create change at the local level (CDC, 2015).

After conducting interviews with representatives from state and local agencies around the country, Foti et al. (2011) found that states disseminate YRBSS data to localities in a variety of ways ranging from posting data on their websites to delivering the data directly to principals and teachers statewide. At a more local level, states also use YRBSS to inform professional development programs for teachers. States not only used the YRBSS survey data to identify locations with high-risk populations for certain issues, but they also used the School Health Profiles data to determine which schools were in need of more policy and curriculum development. This enabled states to target the populations in greatest need of services. Researchers also found that this data was useful in seeking funding for different programs. Taken together, these results indicate that both academic and behavioral data are collected and used by schools to inform decisions about student care.

Other Data Drivers in Schools

While tiered frameworks drive data collection and use in many schools, there are also other data drivers in schools. For instance, the federal government requires data collection and use as part of the special education referral and evaluation process for all schools (IDEA, 2004). Schools that have student assistance programs likely collect data as part of the process. Additionally, school personnel may collect data related to their own programs. For example, school counselors may collect data about student response to a mental health intervention.

Special Education

The special education referral and evaluation process requires data collection and use in schools. Prior to special education referral, school staff members have likely collected data related to the student’s academic performance which demonstrates educational difficulty (Center for Parent Information and Resources, 2016). This data may include but is not limited to grades, performance on standardized measures (e.g., CBM assessments, content-specific developmental assessments, state assessments), and work samples. After referral, the student may be formally evaluated for special education services. Depending on the suspected disability, the following data collection procedures may be used: norm-referenced standardized assessments (e.g., intelligence assessments, academic achievement assessments, and behavior scales); parent, teacher, and child interviews; assessments of hearing, vision, and fine and gross motor skills; and observations.

Student Assistance Programs

Student Assistance Programs (SAP) are defined as school-based programs designed to address the social, mental health, emotional, and academic needs of students (Torres-Rodriguez, Beyard, & Goldstein, 2010). SAPs initially developed in response to student substance use but have expanded to address a broader scope of student concerns that impede academic performance. Although SAPs can vary greatly, most depend on student referral and analysis of student referral data. The data is then used to develop an action plan for the student. As with the tiered models, once the plan has been implemented, additional data should be collected to determine plan effectiveness and student growth (Torres-Rodriguez et al., 2010).

School Personnel Specific Data

School personnel also sometimes collect discipline-specific data. For instance, in the American School Counselor Association (ASCA) National Model (2012), data use is considered one of the eight core skills of a professional school counselor. Research shows that school counselors often document the number of guidance lessons taught, teacher and parent consultations, and student contacts (Studer, Oberman, & Womack, 2006). More recently, school counselors have begun collecting program evaluation data in order to determine if their services are impacting student performance (Studer et al., 2006; Young & Kaffenberger, 2011). Additionally, school nurses collect and use a host of data including health screening data, immunization records, and student visit reports (Bergren et al., 2016).

Tiered Response Models

A variety of systems have been developed to help school personnel work together to organize and provide a continuum of student supports within a planning and ongoing improvement cycle. These tiered response models share common characteristics including alignment with fundamental public health concepts (Vaughn, Wanzek, & Fletcher, 2007). As summarized by Miles and colleagues (2010) in their presentation of a public health approach to children’s mental health, these concepts include: a population focus, an emphasis on promotion and prevention within a continuum of supports, addressing determinants of health, and engaging in a systematic implementation process (see Table 3.1, p. 40). These concepts emerge differently through unique tiered response models within schools, but the foundational link to public health is evident regardless of domain or context. These systems focus on all students (population focus); seek to optimize health, prevent problems from developing, and address problems that do develop (continuum of supports); attend to context and systems (determinants of health); and adhere to a systematic planning or improvement process (process). Regarding this last component, all tiered response models hinge on interdisciplinary teams using data within ongoing improvement cycles. Universal screening provides data that is used to identify individuals in need of services, and additional data is collected to track the progress of those receiving targeted services (Vaughn et al., 2007).

Tiered response models are often visualized as the basic public health triangle (Vaughn et al., 2007) that includes three tiers of support: Tier I (primary prevention and promotion supports for all students), Tier II (secondary supports for students needing more help), and Tier III (tertiary or targeted supports for students needing the most support). These tiers encompass the continuum of supports provided by school mental health programs (Adelman & Taylor, 2010). While models are conceptually and functionally similar, these approaches have been criticized for being developed and delivered within silos resulting in fragmented supports for students across academic, behavioral, and mental health domains (Eagle, Dowd-Eagle, Snyder, & Holtzman, 2015; Maras et al., 2014). To address this problem, generic language like “tiered response models” or the more popular “Multi-tiered Systems of Support” or MTSS (Gamm et al., 2012) is often used in an attempt to be more inclusive. We use the term “Tiered Response Model” throughout this chapter unless referencing a particular approach. Below we describe three tiered response models commonly used in schools: Response to Intervention (RtI), Positive Behavior Interventions and Supports (PBIS), and the Interconnected Systems Framework (ISF).

Response to Intervention

RtI is often associated with academic intervention, although recent literature shows that behavioral interventions can be integrated into the RtI framework (Bohanon, Goodman, McIntosh, & Talk, 2009). For the purposes of this chapter, only the academic applications of the RtI framework will be discussed. RtI was popularized by the 2004 Individuals with Disabilities Education Improvement Act as a method for both ensuring that schools are meeting students’ needs and identifying students with special academic needs (Fuchs & Fuchs, 2006; Sandomierski et al., 2007). In RtI, Tier 1 is concerned with both providing evidence-based academic instruction to all students in the school and screening all students to assess their academic needs in hopes of preventing the development of academic problems and identifying students in need as soon as possible. The framework calls for students to be assessed at baseline and then have their progress monitored throughout the school year. Curriculum-based measurement (CBM) , which is a brief (less than 5 min) assessment of a student’s performance in either basic skills or content knowledge, is the most common academic universal screening and progress monitoring tool (Ball & Christ, 2012).
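
To make this screen-then-monitor logic concrete, the Python sketch below flags students scoring below a screening benchmark and estimates each flagged student's weekly growth as the slope of CBM probe scores. The benchmark and all score values are illustrative assumptions, not published CBM cut points.

```python
# A sketch of CBM-style universal screening and progress monitoring.
# Benchmark and scores are hypothetical values for demonstration.

def slope(scores):
    """Ordinary least-squares slope of scores over weeks 0..n-1."""
    n = len(scores)
    x_mean = (n - 1) / 2
    y_mean = sum(scores) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(scores))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

BENCHMARK = 40  # e.g., words read correctly per minute on a fall probe

fall_screening = {"s01": 52, "s02": 31, "s03": 28}
flagged = [s for s, score in fall_screening.items() if score < BENCHMARK]

weekly_probes = {"s02": [31, 33, 36, 38, 41], "s03": [28, 28, 29, 30, 29]}
for student in flagged:
    rate = slope(weekly_probes[student])
    # A flat slope (e.g., s03) would prompt the team to intensify supports.
    print(f"{student}: gaining {rate:.1f} words/week")
```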

School staff members use the collected assessment data to identify students who are in need of further services. Students who are identified as in need of services move into Tier 2. In Tier 2, groups of struggling, at-risk students receive targeted evidence-based intervention. Students receiving Tier 2 supports are progress monitored to determine if they are benefiting from this intervention. If students are found not to benefit from Tier 2 intervention, they are moved into Tier 3, individualized intervention. Again, this intervention is evidence based, but rather than receive the intervention in a group setting, students are given a personalized treatment plan that is tailored for the student. At this point, students are often receiving services from special education staff (Elliott, 2008; Fuchs & Fuchs, 2006; Sandomierski et al., 2007). Overall, the RtI framework aims to use data-driven decision-making to better identify and treat students with special academic needs.

Positive Behavior Interventions and Supports

PBIS follows the same data-driven, three-tiered framework as RtI, but uses behaviorally focused data to provide behaviorally focused interventions (Sandomierski et al., 2007). Before implementing PBIS, schools must first identify three to five schoolwide behavioral expectations that all students will be taught and expected to comply with (OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports, 2016). Typically, a team of approximately ten school staff members attends a 2–3 day training where they decide upon the school expectations. The team will then create a matrix that details how these expectations look in non-classroom areas like the bathroom, hallways, gym, and cafeteria. Once these expectations are in place, the school can begin implementing the universal Tier 1 strategies. In Tier 1, all students in the school are taught the behavioral expectations. Additionally, school staff implement universal screening and continuous monitoring of student behavior. Many schools choose to use office discipline referral (ODR) data, as it is easily accessible.

Another type of behavioral data that has been suggested in the literature as a universal screener and progress monitoring tool is Direct Behavior Ratings (DBR) (Chafouleas, Kilgus, & Hernandez, 2009). Direct behavior ratings involve observing specified target behaviors and then rating those behaviors following a specified observation period. Students identified as “at-risk” through data analysis move into Tier 2. In Tier 2, similar to RtI, students receive targeted, evidence-based interventions. Often these interventions take the form of group therapy, mentoring, social skills groups, or check-in/check-out (CICO) interventions. Again, data is collected to monitor student progress, and students who do not respond to Tier 2 intervention are referred for more intense interventions at Tier 3. Students at Tier 3 often receive individualized treatment plans that are tailored to the specific student’s behavioral needs (Muscott, Mann, & LeBrun, 2008; Sandomierski et al., 2007). Ultimately, PBIS utilizes continuous progress monitoring and data collection to better meet students’ behavioral needs.
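
Data-based movement between tiers is often governed by simple local decision rules. The sketch below shows one hypothetical rule keyed to year-to-date ODR counts; the cut points are illustrative assumptions, and actual PBIS teams set their own criteria.

```python
# A hypothetical tier-placement decision rule keyed to year-to-date ODR counts.
# Cut points are illustrative only; teams set their own local decision rules.

def suggest_tier(odr_count):
    if odr_count <= 1:
        return "Tier 1: universal supports"
    if odr_count <= 5:
        return "Tier 2: targeted group intervention (e.g., check-in/check-out)"
    return "Tier 3: individualized support plan"

for student, odrs in {"s01": 0, "s02": 3, "s03": 8}.items():
    print(student, suggest_tier(odrs))
```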

Interconnected Systems Framework

Using the same three-tiered framework, the ISF integrates PBIS and School Mental Health (SMH) systems and, like the three-tiered systems discussed previously, the ISF requires the systematic collection and use of data (Barrett et al., 2013). Since the model integrates PBIS, it is not surprising that universal screening and progress monitoring data is needed in the ISF. Additional data collection tools suggested in the ISF include rating scales, surveys, and/or interviews completed by students, caregivers, school staff, and mental health providers. Additionally, direct observations of students’ behavior are encouraged (Maggin & Mills, 2013). Many of the tools encouraged as part of the ISF are also found in other three-tiered approaches, such as PBIS. However, the data collected in the ISF has both a behavioral and mental health focus. Additionally, the ISF encourages collaboration and data collection with mental health providers outside of the schools, which is not typically part of other behaviorally oriented tiered models, such as PBIS.

Case Example

This case example was informed by the authors’ collective experiences pertaining to working with, working in, working for, and attending rural schools. This example focuses on how a problem-solving team in a rural school can use data as part of an improvement cycle to plan, implement, and evaluate a school mental health program.

Lӓndlich Public Schools (LPS) is a K-12 school district serving about 175 students in a rural Midwest community. LPS’ single facility is centrally located in the most populated town (Lӓndlich) within the large geographic region served by the district. The school is supported by 22 full-time teachers and staff, as well as one school administrator, one school nurse, and one school counselor. Any additional support services the school requires are contracted out. Health and social services are primarily accessed via satellite clinics located approximately 20 min from LPS. There is no public transportation in this area. The Lӓndlich Community Center was recently renovated to accommodate larger events held by community groups in the area.

Last school year a freshman student at a neighboring school district committed suicide, prompting the local community mental health agency to develop and deliver several free workshops on adolescent suicide prevention for community members. LPS’ school counselor attended one of those workshops and then convened the school nurse, school administrator, and a high school teacher to discuss suicide prevention at LPS. Initial conversations identified a number of potential next steps, including additional professional development for teachers on warning signs of suicide or a schoolwide assembly with a guest speaker from the local community mental health agency. Members of this informal group quickly realized they did not fully understand the mental health needs of LPS students or current resources available to support students and their families. LPS’ administrator contacted a colleague from a neighboring district to learn more about what she was doing to address students’ mental health in her district. This colleague shared some information about their efforts to develop a student referral team and provide some training on mental health to teachers via free webinars.

At their next meeting, LPS’ administrator presented what he had learned along with a brief summary of LPS’ most recent state accreditation data on student achievement, attendance, graduation rates, and suspensions/expulsions. He also provided weekly attendance data collected by the school. The school counselor described the mental health supports she provided as part of her comprehensive school counseling program, and the school nurse reviewed some basic health service data. The high school teacher, who also coaches the football team, talked about benefits he observed from partnering older students with younger students as informal peer mentors. He suggested teachers would be open to additional training on mental health as long as it did not require much additional time beyond scheduled PD. They all reviewed a report produced by the state department of health and senior services on youth mental health needs in their county. The administrator shared that one school board member had recently voiced concerns about the school offering any mental health supports. The school nurse reminded the group about the high parent turnout at the family fun night the school organized last year. She recalled how much teachers had enjoyed participating in the event with their own families.

Based on their discussion, the group prioritized three goals for the next school year. First, they decided to formalize a school wellness team that could also serve the advisory functions for the school nurse and school counselors’ programs (e.g., wellness committee, school counseling advisory committee). They identified a parent who might be interested in joining their group and invited the third grade teacher to participate as well. The school administrator agreed to include small stipends for members of this team in his next budget request from the school board. Given the small size of their district, team meetings would be held before or after school outside of contract time.

They decided the group would meet monthly and that a smaller subgroup comprised of the school nurse, school counselor, and high school teacher would meet as needed to discuss individual student needs. This subgroup decided to adapt the forms the LPS administrator had received from his colleague, with plans to evaluate the forms and the team’s success after about 3 months. The school counselor agreed to contact the youth counselor at the local community mental health agency to learn more about how students and families could access services. Noting that teacher professional development was already planned for the next school year, the team agreed to offer specific recommendations for the next year and to brainstorm alternatives throughout the year.

Second, noting a significant drop in student attendance between February and May during the last school year, the team determined they needed to gather additional information from students and parents to identify reasons for the increased absences. They received strong support from the LPS school board including permission to collect data via a survey distributed during a football game. They plan to share their findings and possible solutions with the board. Finally, the group decided LPS’ next family fun night would have a mental health theme that focuses on strengthening families. The high school teacher agreed to learn more about free resources available online they might implement or share during the night. They identified three concrete goals for their family fun night and discussed ways to collect data to measure their success in reaching those goals.

Opportunities and Challenges for Rural School Mental Health Programs

Rural schools often operate with less funding and less experienced staff; as a result, the systematic use of data within an improvement cycle is critical to the success of rural school mental health programs yet perhaps even less likely to occur given the limited capacities often present in rural schools. Many rural schools experience below-average funding due to low income and low wealth of the school district residents (Maiden & Stearns, 2007). Low levels of funding contribute to rural districts’ inability to attract and retain qualified and experienced personnel (Provasnik et al., 2007). As discussed in detail elsewhere in this text, school mental health programs may be an ideal way to address barriers to accessing affordable, acceptable, and appropriate mental health services for children and adolescents in rural communities (Owens et al., 2013); however, we believe the success of these programs hinges on their foundation in continuous improvement cycles.

Despite the availability of data, research shows that schools struggle with data use (Means et al., 2009; Means et al., 2010). Many school personnel are reluctant or unable to effectively use data due to lack of time and resources (Howley, Larson, Andrianaivo, Rhodes, & Howley, 2007; Kerr et al., 2006; Means et al., 2010). Teachers often have a great deal of work to do, and the prospect of having to analyze and understand student data can feel overwhelming, especially when they have to do it on their own time outside of school hours. Due to smaller staff sizes, rural teachers may assume numerous roles with diverse job functions, which further complicates the use of disparate data to improve performance. Time is clearly a significant barrier for all schools but perhaps even more so for rural schools.

Similarly, Owens et al. (2013) describe how school mental health professionals in rural communities are often required to be generalists in their practice out of necessity. The school counselor in the case example above must have broad expertise in providing a continuum of care for young people spanning an age range from 5 years old to adulthood. Further, a counselor working in a rural district may be the only mental health professional in a small community which may further stress her/his capacity. Employing an improvement cycle may streamline the development and delivery of mental health supports, but rural districts will have to make an extraordinary commitment to those processes to realize those benefits.

School personnel also often have limited capacity to interpret data. That is, teachers and other school personnel are not often trained in how to read reports or use data to make changes to their teaching practices (Kerr et al., 2006; Mason, 2002; Means et al., 2009). Research shows that professional development targeted at increasing both data use and capacity for data use has a positive impact (Means et al., 2010). For example, Robinson, Bursuck, and Sinclair (2013) collected and analyzed data related to two rural elementary schools’ implementation of RtI. A theme that emerged in the study was that staff professional development was essential in order to support data-based decision-making. That said, rural schools often have high staff turnover and struggle to attract and retain qualified and experienced personnel (National Center for Education Statistics, 2007). Staff training may be needed on a consistent basis in order to ensure all staff members remain abreast of best practices in data use. Unfortunately, due to limited funds and geographic isolation, rural schools may have difficulty gaining access to high-quality professional development trainings where they would learn these crucial skills (Harmon, Gordanier, Henry, & George, 2007).

Even when schools are classified as high-data users, school staff members rarely use data to alter schoolwide policies and practices, or individual teaching styles (Means et al., 2010; Shepard et al., 2011). Even though school staff members say they are willing to use data to identify and fix problems with student behavior and performance, a salient problem exists: transforming knowledge into practice. In a study about how schools use data from interim and benchmark mathematics assessments, Shepard et al. (2011) found that when students performed poorly on the assessments, teachers did not often question why they performed poorly or how they could improve the way they delivered the material, but instead simply re-taught the material before moving on to the next topic. It may be important for schools to go beyond using data to inform practice to using data to change practice.

Developing and supporting a culture of data use among school staff has emerged as one of the best ways to change teacher behavior related to data use (Howley et al., 2007; Means et al., 2009; Means et al., 2010). To do this, school leadership must take the lead and make data use a school priority. At a basic level, it is important for schools to use trustworthy data that is specifically pertinent to their current needs. Both teachers and principals have cited the data’s believability as a barrier to data use; that is, they report not believing that the data is valid or accurate and that it does not appear to be useful (Kerr et al., 2006; Means et al., 2010). This is especially the case with data collected from statewide tests. In their survey of teachers and principals from three urban school districts, Kerr et al. (2006) found that both teachers and principals voiced concerns about statewide test data. They reported feeling that statewide interim tests do not accurately assess student ability, that students do not perform up to their potential on them because they are not motivated to do so, and that the tests do not provide individual- or classroom-level item analysis. Teachers and principals reported preferring classroom assessments and reviews of student work because they are more meaningful to the school.

In addition to these believability issues, state assessment data is also difficult for schools to use because it only indicates which areas students struggle in, but does not provide insight into why students struggle in these particular areas (Shepard et al., 2011). Using state test data may be particularly difficult for rural schools since they often have small student bodies and their overall scores are vulnerable to outliers (Reeves, 2003). That is, it only takes a couple of very low or very high scores to skew the whole school’s data and paint an incorrect picture of the school’s effectiveness. If teachers and principals perceive data to be irrelevant or invalid, they will be less likely to use it to create change (Howley et al., 2007; Mason, 2002; Parke, 2012). Once reliable and valid data is collected, leadership may have to purchase or create a data system that is easily accessible by all school staff.
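
A quick back-of-the-envelope calculation, using entirely hypothetical scores, illustrates this small-n sensitivity:

```python
# Hypothetical scores: the same two low outliers barely move a large school's
# mean but visibly drag down a small school's mean.

def mean(xs):
    return sum(xs) / len(xs)

small_school = [200] * 23   # 25 test-takers once outliers are added
large_school = [200] * 498  # 500 test-takers once outliers are added
outliers = [120, 110]       # two very low scores

print(mean(small_school + outliers))  # 193.2 -- mean drops nearly 7 points
print(mean(large_school + outliers))  # 199.66 -- mean barely moves
```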

It is important for schools to create organizational structures that increase data use (Howley et al., 2007; Means et al., 2010). Creating taskforces, sometimes referred to as data teams or problem-solving teams, with both teacher and leadership representation is core to this infrastructure. The team works to first determine which problems they want to target and then determine what kind of data they need to assess the problem and create solutions. For example, McIntosh et al. (2013) examined 217 schools across 14 states that had implemented PBIS in order to determine factors that predicted PBIS sustainability. They found that team use of data predicted increased sustainability. School leadership must set aside time for teachers to review and discuss data in structured small groups. Schools can also hire or assign data support staff to be used as a resource for teachers who are having trouble analyzing, understanding, or translating their data (Kerr et al., 2006; Means et al., 2010). However, this may be a barrier for rural schools, as they often have small staff sizes, which may or may not include support staff trained in data use, and may not have the funds available to hire additional staff (Robinson et al., 2013). Promoting effective data use for improvement clearly seems to be a particular challenge for rural schools.

Conclusion: The Promise of Networked Learning

Improvement science (Bryk et al., 2015) has heavily influenced recent scholarship that focuses on how schools can “get better at getting better.” While not focused on school mental health per se, the foundational tenets of this work complement provisions of the Every Student Succeeds Act (ESSA) that allow states greater flexibility to reimagine assessment and accountability frameworks that move away from compliance regulation toward local ownership of performance results and professional responsibility for continuous school improvement (Darling-Hammond et al., 2016). Importantly, ESSA further requires states to include a nonacademic indicator of school quality or student success; school mental health experts are encouraging states and school districts to capitalize on these policy changes to integrate mental health as a key component of broader school improvement efforts (e.g., Adelman & Taylor, 2016). While promising, it is not yet clear how states will translate ESSA into practice, or if and how states will strengthen their existing infrastructure to help rural schools enact any changes, let alone those that might directly or tangentially affect rural school mental health programs. Regardless, there is a need for innovative methods to build capacity for improvement among rural schools to support effective school mental health programs.

Networked learning, achieved via networked improvement communities (NICs) focused on problems of practice specific to rural schools, may optimize districts’ response to shifting state and federal education policies. A NIC is a scientific learning community comprised of committed stakeholders with diverse and valued expertise organized to enact, sustain, and scale a collective impact approach to solving a specific problem of practice (Bryk, Gomez, & Grunow, 2011). Not dissimilar to the PSTs described above, these networks are distinguished by four essential characteristics: (1) shared focus on a specific aim (i.e., problem of practice); (2) guided by a thorough understanding of the problem, the system that produces it, and a theory of improvement relevant to it; (3) disciplined by a cyclical improvement research method used to develop, test, and refine interventions; and (4) organized to accelerate the use of this cyclical research method to produce and effectively integrate interventions into practice across distinct educational contexts.

Revisiting LPS, the hypothetical school district used in the earlier case example, may help illustrate the potential benefits of NIC membership for rural school districts. In that case, a community mental health agency offered workshops on adolescent suicide prevention. The school counselor attended one of these workshops which prompted local action by the LPS administrator and staff. School mental health professionals or administrators from other local districts could have also participated in one of the workshops, and these districts could have formed a NIC focused on suicide prevention. Each district could have pursued local actions with the support of other districts focusing on the same problem of practice, creating opportunities to leverage current knowledge across districts and perhaps pool resources to access additional mental health resources. Alternately, the community mental health agency could have facilitated connections at a regional level to increase the pool of shared knowledge from which each participating district could benefit. The NIC could create and support relationships among school mental health professionals who, as was the case for the counselor at LPS, are often isolated in rural schools. Similarly, district administrators could collaborate on a variety of broader systems issues such as funding, engaging parents and other caretakers in the work, accessing public mental health services, and/or addressing stigma in their communities.

The organization of multiple entities into a NIC creates different kinds of affordances for disseminating best practices and supporting their use, both in terms of creating a single point of access to and for practitioners, as well as establishing a system of support comprised of local experts working in similar contexts. This type of arrangement also facilitates inquiry at a broader level that could advance systems-level innovation in education (Peurach, 2016). With regard to the hypothetical NIC described above, potential investigations might focus on the relative utility of different kinds of school data when planning universal mental health supports; child-level outcomes of school-based vs. community-based mental health services; or the ideal balance between in-person, on-site, and web-based PD on mental health for teachers. Such scholarship could distinguish between rural, suburban, and urban settings to inform policy and funding decisions, thus strengthening the system necessary to facilitate effective rural school mental health programs.