Abstract
Principals’ efforts to support the implementation of interventions involve coordination among multiple actors in their social networks. However, less is known about how distinct features of these social networks are associated with principals’ perceptions of the social validity of interventions. In this paper, we used Neal and Neal’s (Implement Sci 14:16, 2019. https://doi.org/10.1186/s13012-019-0860-z) implementation capital framework to test hypotheses about the associations between two features of principals’ social networks (i.e., bonding and bridging social capital) and three aspects of social validity (i.e., acceptability, understanding, and feasibility). Specifically, we tested these hypotheses in a statewide representative sample of 180 Michigan secondary school principals supporting the implementation of early warning signs (EWS), a systems-level intervention to prevent student dropout. Consistent with our hypotheses, we found that bonding social capital was positively associated with acceptability and bridging social capital was positively associated with understanding. However, contrary to our hypotheses, bonding and bridging social capital were not associated with feasibility. Drawing on these findings, we discuss future directions for research and practice implications to improve principals’ perceptions of the social validity of mental health interventions.
Educators commonly face significant challenges integrating interventions with their daily practice in schools and encounter uncertainty about the adaptability of interventions (Barwick, Barac, Akrong, Johnson, & Chaban, 2014; DiGennaro, Martens, & McIntyre, 2005; McIntyre, Gresham, DiGennaro, & Reed, 2007; Sanetti & Kratochwill, 2009). These challenges often hamper the implementation of interventions, reducing their potential to improve student mental health and behavioral outcomes (Durlak & Dupre, 2008). As school leaders, principals establish the climate for and facilitate the implementation of interventions by their staff. Specifically, principals often set standards for implementation, provide implementation support to their staff, and convey knowledge about interventions (Locke et al., 2019). Given these multiple leadership roles, it is critical to find ways to improve principals’ perceptions of the social validity of interventions.
Chafouleas, Briesch, Riley-Tillman, and McCoach (2009) have discussed several aspects of an intervention’s social validity, including acceptability (i.e., the extent to which interventions are perceived as agreeable or fair), understanding (i.e., the extent to which implementers know what an intervention is and how to carry it out), and feasibility (i.e., the costs of an intervention in terms of resources and time). These aspects of social validity are associated with characteristics of interventions, such as their complexity to implement and their compatibility with the educational system (e.g., Briesch, Briesch, & Chafouleas, 2015; Rogers, 2003). However, less is known about other factors that may make interventions socially valid for principals. Principals’ efforts to support the implementation of interventions involve coordination among multiple actors in their social networks, including school staff, information brokers, program developers, and support team members (Fixsen, Naoom, Blasé, Friedman, & Wallace, 2005; Saldana, Chamberlain, Wang, & Brown, 2012). Thus, the current study aims to address the following research question: Are principals’ social networks associated with interventions’ social validity? We explore this question within the context of Michigan public school principals’ implementation of early warning signs (EWS) to prevent student dropout. This multi-step, systems-level intervention encourages teachers and other school staff to use data to monitor student attendance, behavior, and grades, and to select programs and practices to improve students’ mental health, behavioral, and academic outcomes.
The remainder of the paper is organized into four sections. In the first section, we briefly review the literature on the implementation of EWS and its implications for school mental health, interventions’ social validity in the context of education, how social networks are important for the implementation of school mental health interventions, and Neal and Neal’s (2019) implementation capital framework. We conclude this first section by using this framework to derive hypotheses about the association between two features of principals’ social networks (i.e., bonding and bridging social capital) and three aspects of social validity (i.e., acceptability, understanding, and feasibility). In the second section, we describe how we collected data on social networks and perceptions of EWS’s social validity from a random sample of 180 principals in Michigan. In the third section, we test our hypotheses using a series of regression models, finding support for the hypothesized association between bonding and acceptability, and between bridging and understanding. We conclude in the fourth section with future directions for research and implications for leveraging principals’ social networks to improve the social validity of systems-level school-based interventions.
Background
Preventing School Dropout as a School Mental Health Intervention: The Case of EWS
Public schools are promising sites for the implementation of mental health interventions, which seek to improve students’ social, emotional and behavioral outcomes (Durlak & Wells, 1997). In the USA, public schools enroll approximately 50.7 million children and adolescents reflecting approximately 90% of all children and adolescents enrolled in school (Hussar et al., 2020). Thus, by providing mental health interventions in public schools, it is possible to reach the majority of US children and adolescents. Additionally, meta-analytic studies and reviews have linked school-based mental health interventions for children and adolescents to improved mental health outcomes (e.g., Durlak & Wells, 1997; Durlak, Weissberg, Dymnicki, Taylor, & Schellinger, 2011; Greenberg, Domitrovich, & Bumbarger, 2001).
Systems-level interventions that focus on improving children’s and adolescents’ mental health by changing aspects of the environment may be particularly well suited for dissemination through schools (Durlak & Wells, 1997). In the current study, we focus on understanding Michigan public school principals’ facilitation of the implementation of early warning signs (EWS), a systems-level intervention that aims to reduce adolescent school dropout. EWS encourages school staff to use data to identify students at risk of dropout, select and implement programs and practices for these at-risk students, and monitor the effectiveness of these programs and practices (Corrin, Sepanik, Rosen, & Shane, 2016; Faria et al., 2017; Mac Iver, Stein, Davis, Balfanz, & Fox, 2019; O’Cummings & Therriault, 2015; Rumberger et al., 2017). In this study, we are focused on EWS as an intervention itself, which occurs at the system level among school leaders and staff; we are not focused on the specific student-level programs and practices implemented as part of EWS. As of 2014–2015, 52% of public high schools reported implementing EWS (US DOE, 2016). This study examines the Early Warning Intervention and Monitoring System (EWIMS), a specific form of EWS developed by the National High School Center at American Institutes for Research and implemented in Michigan, which involves seven specific steps (see Table 1).
School dropout and school mental health are strongly linked because emotional and behavioral problems both predict the likelihood of, and may be exacerbated by, school dropout (Esch et al., 2014). First, emotional and behavioral problems predict the likelihood of school dropout. For example, students with disruptive behavior disorders such as conduct disorder (e.g., Breslau, Miller, Chung, & Schweitzer, 2011; Kessler, Foster, Saunders, & Stang, 1995), problems with substance abuse and delinquency (Battin-Pearson et al., 2000), and low self-concept, motivation, and school belonging (McWhirter, McWhirter, McWhirter, & McWhirter, 2012) are at higher risk of dropping out of school. Therefore, schools often select programs and practices that target the mental health of students flagged by EWS as exhibiting early warning signs of dropout. In a recent study of the use of EWS in Midwestern schools, 24.4% of schools provided physical and mental health services to students identified at risk for dropout, 16% provided behavioral programs or practices, and 14% provided social emotional learning programs or practices (Faria et al., 2017). For example, a common student-level practice that targets mental health implemented as part of the EWS intervention is Check and Connect, in which school commitment, problem-solving skills, and persistence are fostered through a positive relationship with a mentor (Sinclair, Christenson, Evelo, & Hurley, 1998; Sinclair, Christenson, & Thurlow, 2005). Second, students who do drop out of school are at increased risk of experiencing short- and long-term mental health issues. For instance, school dropout predicts subsequent internalizing behaviors, suicidality, and substance use among adolescents or young adults (e.g., Lansford, Dodge, Petit, & Bates, 2016; Liem, Lustig, & Dillon, 2010; Szlyk, 2020) as well as substance use among older adults (e.g., Lansford et al., 2016).
Therefore, systems-level interventions like EWS that reduce rates of school dropout are a key strategy for ameliorating youth mental health issues.
Finally, as a systems-level intervention, EWS is likely generalizable to other systems-level interventions, such as multi-tiered systems of support (MTSS), response to intervention (RtI), positive behavioral interventions and supports (PBIS), and other systems-level frameworks targeting mental health outcomes (e.g., Doll & Cummings, 2008; Wells, Barlow, & Stewart-Brown, 2003). In such systems, data are universally collected and analyzed, students that are identified as at risk are provided with programs and practices, progress is monitored to assess students’ response to intervention, and students with limited response are provided with increasingly individualized and intensive programs and practices (Batsche et al., 2005). Each system requires coordination among multiple stakeholders to implement their corresponding roles and responsibilities, facilitated by adequate resource allocation by a principal or other administrator with decision-making power (Mac Iver et al., 2019). As such, principals’ social networks may be consistently important for their perceptions of social validity across systems-level interventions.
Social Validity and Implementation
There is growing acknowledgment that achieving distal intervention outcomes like enhanced student mental health requires attention to proximal outcomes related to the implementation effort itself. For example, a review of over 500 youth prevention and wellness promotion interventions concluded that better implementation was associated with positive youth outcomes (Durlak & Dupre, 2008). The implementation of interventions can be improved by ensuring that the interventions have social validity. The construct of social validity has foundations in applied behavior analysis, which aimed to generate contributions that are socially important. The importance of those contributions would require validation by society with respect to the significance of the goals, appropriateness of the process, and importance of the effects (Baer, Wolf, & Risley, 1968; Wolf, 1978). Much of the subsequent work on interventions’ social validity focused on perceptions of an intervention’s acceptability, that is, perceptions of “whether treatment is fair, reasonable, and intrusive, and whether treatment meets with conventional notions about what treatments should be” (Kazdin, 1980, p. 259; see Eckert and Hintze (2000) and Finn & Sladeczak (2001) for reviews). More recently, Chafouleas et al. (2009) have expanded the scope of social validity beyond considerations of acceptability to also include aspects such as an understanding of the intervention and its feasibility in terms of time and resources.
These dimensions of social validity have most often been used descriptively to justify the use of specific Tier 1 or Tier 2 interventions. At Tier 1, these measures have been used to provide evidence of the social validity of universal classroom management interventions (e.g., Chafouleas, Sanetti, Jaffery, & Fallon, 2012; Collier-Meek, Fallon, & DeFouw, 2017) and nutrition education curricula (e.g., Izumi et al., 2015). At Tier 2, these measures have been used to provide evidence of the social validity of instructional practices (e.g., Neugebauer, Chafouleas, Coyne, McCoach, & Briesch, 2016) and more targeted behavior management practices like the Daily Report Card (e.g., Fabiano, Pyle, Kelty, & Parham, 2017; Sims, Riley-Tillman, & Cohen, 2017) and Check-In, Check-Out (e.g., Fallon & Feinberg, 2017; Kilgus, Fallon, & Feinberg, 2016).
Less often, studies examine social validity as an outcome, focusing on the predictive role of intervention characteristics (Briesch et al., 2015) or participation in formal intervention training (e.g., consultation calls; Jackson, Herschell, Schaffner, Turiano, & McNeil, 2017). For example, Briesch et al. (2015) found that elementary school teachers’ ratings of social validity differed across five different classroom management strategies. In particular, certain classroom management strategies like group contingency systems were more likely to be viewed as incompatible with teachers’ values and were rated lower in acceptability. Jackson et al. (2017) examined whether community-based clinicians were more likely to view parent–child interaction therapy as socially valid if they participated in consultation calls that provided additional opportunities for training. Interestingly, increased participation in these consultation calls did not improve the social validity of parent–child interaction therapy among clinicians.
Although prior work has examined predictors of interventions’ social validity among implementers (e.g., teachers, clinicians), examining predictors of social validity among organizational leaders is also important. Within the implementation science literature, organizational leadership is recognized as critical for setting a positive climate for implementation, promoting positive attitudes toward interventions, strategic decision-making around implementation, and the support of on-the-ground staff involved in implementation efforts (Aarons, Ehrhart, Farahnak, & Sklar, 2014). More specifically, within schools, principals play a central role as organizational leaders in their schools and spend a significant portion of their time engaged in leadership activities with staff such as communication, modeling, professional development, and facilitating collaboration that can support implementation efforts (Blase & Blase, 1999; Goldring, Huff, May, & Camburn, 2008; Locke et al., 2019; Lyon et al., 2018). In EWS, specifically, principals facilitate implementation by fostering a culture that supports the initiative, ensuring that staff meetings are scheduled and attended, and allocating human resources to data analysis and intervention implementation (Mac Iver et al., 2019). However, principals may encounter challenges related to data interpretation, time and monetary resources needed to support implementation, and communication with staff responsible for EWS implementation. The challenges associated with facilitating implementation may shape principals’ perceptions of the social validity of EWS.
Social Networks and Implementation
Beyond intervention characteristics and training, it is important to consider other factors that may be associated with interventions’ social validity. Specifically, because intervention implementation does not occur within a vacuum (Neal & Neal, 2019) and often involves interactions with individuals inside and outside one’s immediate social context (e.g., Aarons, Hurlburt, & Horwitz, 2011), it is important to consider social factors, especially social networks, as predictors of intervention social validity. Social networks are patterns of relationships (e.g., communication, advice, friendship) between a set of actors (e.g., individuals involved in the implementation of an intervention; Kornbluh & Neal, 2015; Marin & Wellman, 2011). In this study, we adopt a personal or ego network approach to assess principals’ own networks of advice seeking, as opposed to a whole network approach that might be used to assess a principal’s position within a broader (e.g., the whole school) network of others (e.g., Marsden, 2011). We focus on principals’ personal networks because the implementation capital framework described below, which we seek to test, contends that social validity is associated with the composition and structure of one’s own network.
Figure 1A provides a hypothetical example of an advice network that a principal might use to support the implementation of a systems-level intervention like EWS. In line with the Exploration, Adoption/Preparation, Implementation, Sustainment framework (e.g., Aarons et al., 2011), this principal seeks implementation advice from individuals in both the inner context of the school district and the outer context of individuals outside the school district. First, the principal seeks advice from a cluster of colleagues within the school (indicated by solid lines), including a teacher, curriculum director, and counselor. As staff working in the same school, these individuals are also likely to know and seek advice from each other and will all have detailed knowledge of the same local context. As a result, they are likely to provide similar and potentially redundant advice to the principal. Second, the principal also seeks advice from individuals outside the school (indicated by dashed lines), including a principal at another school, a researcher, and a member of a county technical assistance team. Because these individuals are not co-workers, they are unlikely to know or talk to each other, and each one has the potential to provide the principal with unique non-redundant information during multiple phases of the intervention. For example, the researcher who has studied a program or practice that was selected for students as a result of the EWS intervention (e.g., Check and Connect; Sinclair et al., 2005) may offer assistance on specific questions about implementation, the principal in another school may offer practical on-the-ground advice about how to develop consensus across stakeholders, and the county technical assistance personnel may offer help using online student data tools. As Aarons et al. 
(2011) note, these relationships in the principals’ inner and outer context play different roles in different phases of the implementation process including exploration, adoption, implementation, and sustainment. In this study, because we are focused on the EWS intervention, which had already been identified and adopted in Michigan schools, we focus specifically on the role that network relationships play during the implementation phase, during which principals are actively facilitating implementation in their schools.
The Implementation Capital Framework
Although social networks and social relationships are widely recognized as critical for implementation and intervention efforts in schools, the implementation capital framework suggests that specific features of social networks are associated with specific implementation outcomes, including specific dimensions of social validity (Neal & Neal, 2019). Figure 1B provides a logic model to illustrate how the implementation capital framework suggests that different types of network ties build different types of social capital, which in turn can enhance the acceptability, understanding, and feasibility dimensions of an intervention’s social validity.
First, when individuals have a circle of people that they can turn to for support, and who can also turn to each other for support, as the hypothetical principal in Fig. 1A does within their district, they have bonding social capital. A network with bonding social capital includes others who are also involved in implementation and can foster a sense that “we’re all in this together.” These others can provide reinforcement and strengthen positive norms around intervention implementation that increase perceptions of an intervention as “agreeable, palatable, or satisfactory” (Proctor et al., 2011, p. 67; Neal & Neal, 2019). Chafouleas et al. (2009) call this aspect of social validity, acceptability, which they define as “[perceiving] a treatment to be appropriate, fair, and reasonable” (p. 36). For example, a principal who can turn to enthusiastic staff in their own school (e.g., teachers, counselors, social workers) to build support for implementing an intervention may find the intervention more acceptable than a principal who can turn to only one or two school staff members. Therefore, we hypothesize that principals whose social networks provide bonding social capital through within-district advice contacts will exhibit more acceptability toward implementing EWS (H1).
Second, when individuals have a diverse and unrelated set of people that they can turn to for support, as the hypothetical principal in Fig. 1A does outside their district, they have bridging social capital. A network with bridging social capital includes others who, because they are outside the immediate setting, can offer new perspectives on the intervention. Specifically, these others can provide access to unique information, including information that can be helpful for solving implementation challenges, and thus can increase perceptions that an intervention can be “successfully used or carried out” (Proctor et al., 2011, p. 69; Neal & Neal, 2019). Chafouleas et al. (2009) call this aspect of social validity, understanding, which they define as “an individual’s knowledge of what the intervention is, how to carry it out, and why it is being implemented” (p. 38).Footnote 1 For example, a principal who can turn to varied sources outside the school building for information about an intervention (e.g., program developers, researchers, colleagues in a different district) may obtain unique information from each of these sources that increases intervention understanding. Therefore, we hypothesize that principals whose social networks provide bridging social capital through outside-district advice contacts will exhibit more understanding of EWS (H2).
Third, sometimes an individual’s social network includes both dense within-setting contacts and diverse outside-setting contacts, as is the case for the hypothetical principal in Fig. 1A. In such cases, the individual enjoys a combination of bonding and bridging social capital, which, in addition to their respective advantages noted above, can also have synergistic advantages for implementation when combined. First, the trust inherent in bonding social capital can make a principal feel more comfortable delegating intervention activities to others, which can reduce the amount of time and effort required from a single person tasked with implementing an intervention. For example, a principal who has a large network of school staff who are enthusiastic about implementing EWS will be more able to form an EWS team (step 1, see Table 1), thereby reducing the time requirements from any single person responsible for implementation. Second, the access inherent in bridging social capital can facilitate acquisition of material resources from outside sources, which can reduce the “cost impact of an intervention effort” (Proctor et al., 2011, p. 69; Neal & Neal, 2019). For example, a principal who also has a network with bridging ties may identify donors who can assist with the purchase of books or other materials required for the programs and practices selected through EWS, thereby reducing the resource requirements. Together, the principal’s effort-sharing within-school ties and resource-finding outside-school ties can improve the feasibility of an intervention, which Chafouleas et al. (2009) define as the “time, resource, and effort requirements” of an intervention (p. 38).Footnote 2 Therefore, we hypothesize that principals whose social networks provide both bonding and bridging social capital through a combination of within- and outside-district advice contacts will view the implementation of EWS as having more feasibility (H3).
Method
Participants
The participants included a random representative sample of 180 Michigan public school principals who were employed in schools serving any combination of grades 7–12 and who reported being involved in the implementation of EWS. Table 2 summarizes several characteristics of these principals and their schools and compares these characteristics to statewide averages. A majority of participating principals were white (88.9%) and male (69.4%). They were 45 years old on average (SD = 6.69) and had served in their current role as principal for an average of 5.25 years (SD = 4.39). Approximately one-quarter of these principals (26.11%) held an advanced post-Master’s degree, such as an Education Specialist (Ed.S.) degree or doctoral degree (Ed.D. or Ph.D.). Comparing the secondary school principals in our sample to all principals in the state of Michigan, we found that they were more likely to be male and to have slightly fewer years of experience, but otherwise were representative of the population in terms of race, age, and advanced degree attainment.
These principals served a diverse range of schools with an average enrollment of 525 students, among whom an average of 74.9% were white and 51.9% were low income. The students in these schools had an average student growth percentile on the state assessment of 47.2 in English Language Arts and 47.1 in Mathematics. A minority of staff in these schools were male (33.3%) and held a post-baccalaureate degree (36.6%), but a majority of staff were white (92.5%). Comparing the schools served by principals in our sample to all secondary schools in the state of Michigan, we found that they were slightly larger and served a higher percentage of white students, but otherwise were representative of the population in terms of student performance and staff composition.
Procedures
We invited a random 50% of all Michigan public school principals of schools serving grades 7–12 (N = 652) to participate in a Web-based survey of their experiences implementing EWS. Participants were offered a gift card valued at $10–20 as a participation incentive. Each week until data collection ended, non-respondents received a reminder email, and some also received a follow-up phone call (Neal, Neal, & Piteo, 2020). A total of 253 principals started the survey (38.8% click-through rate), of which 191 completed the survey and reported they were involved with implementing EWS (29.3% participation rate). Of these eligible participants, 11 skipped one or more of the outcome scale items described below and were excluded from the analysis, yielding an analytic sample of 180 principals (27.6% effective response rate).
Measures
Social Validity
The three aspects of social validity—acceptability, understanding, and feasibility—were measured using items from the respective subscales of the Usage Rating Profile-Intervention (URP-I; Chafouleas et al., 2009) and the Usage Rating Profile-Intervention Revised (URP-IR; Briesch, Chafouleas, Neugebauer, & Riley-Tillman, 2013). The original item wordings were adapted to the study context, for example, by replacing the general phrase “child’s problem” with the more specific phrase “student dropout” (see Table 3). The acceptability subscale included 17 items, such as “The EWS steps are a good way to prevent student dropout.” The understanding subscale included 8 items, such as “I understand how to use the EWS steps.” Finally, the feasibility subscale included 10 items, such as “The amount of time required to use the EWS steps is reasonable.” All items were measured on a six-point Likert scale ranging from “Strongly Disagree” (1) to “Strongly Agree” (6); the measurement properties of these scales in our sample are reported below in the Results.
Bonding and Bridging Social Capital
To measure social capital, each principal was asked “In the last year, when you faced a challenge implementing the steps of EWS, who did you go to for advice?” Space was provided to identify up to 10 people, by first name and last initial to preserve their anonymity; only 13 respondents (7.2%) used all 10 spaces. For each named source of advice, the principal was also asked whether the person worked in the same district, another district, a county or state-level education agency, or somewhere else. We operationalized bonding social capital as the number of individuals named in the principal’s own district, which captures the size of the principal’s local implementation support network. We operationalized bridging social capital as the number of individuals named by the principal who do not work in the principal’s own district, which captures the extent to which the principal’s implementation support network bridges into other settings.Footnote 3
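The operationalization above reduces to counting a principal's named advice sources by workplace. A minimal sketch with a hypothetical roster (the names, field labels, and location categories below are illustrative, not the study's survey data structure):

```python
# Hypothetical roster for one principal: each named advice source is tagged
# with where that person works, mirroring the survey's location categories.
roster = [
    {"name": "A. B.", "location": "same district"},
    {"name": "C. D.", "location": "same district"},
    {"name": "E. F.", "location": "another district"},
    {"name": "G. H.", "location": "county or state agency"},
]

# Bonding social capital: number of advisors in the principal's own district.
bonding = sum(1 for p in roster if p["location"] == "same district")

# Bridging social capital: number of advisors outside the district.
bridging = sum(1 for p in roster if p["location"] != "same district")

print(bonding, bridging)  # prints: 2 2
```

Under this coding, a principal's two counts sum to the total number of people named (at most 10 in the survey).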
Control Variables
Because “people are not passive recipients of [interventions],” we also included in our analysis several personal characteristics as control variables (Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004, p. 598). Personal characteristics that may be associated with principals’ perceptions of interventions’ social validity included gender, level and recency of education (Rogers, 2003; Williford, Wolcott, Whittaker, & Locasale-Crouch, 2015), work experience (e.g., Frambach & Schillewaert, 2002; Hargreaves, 2005), and experience with the intervention (e.g., Damschroder et al., 2009). We measured self-reported gender (male) and educational attainment (post-MA degree) using binary indicator variables. We measured experience as a principal (role experience) and with their current degree (degree experience) as the number of years since becoming principal or obtaining the degree. Finally, we measured principals’ EWS experience as the number of EWS’s seven steps they were involved in implementing.
Analysis Plan
To examine the measurement properties of the URP-I and URP-IR in our sample, we conducted an exploratory factor analysis with an orthogonal varimax rotation and computed Cronbach’s alpha for the acceptability, understanding, and feasibility subscales. To test our hypotheses, we mean-centered all continuous variables and then estimated four linear regressions. Model 1 tested H1 by predicting acceptability as a function of bonding social capital, controlling for bridging social capital, the interaction between bonding and bridging social capital, and personal characteristics. Model 2 tested H2 by predicting understanding as a function of bridging social capital, controlling for bonding social capital, the interaction between bonding and bridging social capital, and personal characteristics. Model 3 tested H3 by predicting feasibility as a function of bonding social capital, bridging social capital, and their interaction, controlling for personal characteristics. Finally, Model 4 was exploratory and predicted intent to implement as a function of bonding social capital, controlling for bridging social capital, the interaction between bonding and bridging social capital, and personal characteristics. Replication data and code are available at https://www.osf.io/s4nmx.
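The mean-centering and interaction setup described above can be sketched as an ordinary least squares fit. This is a rough illustration on synthetic data, not the study's replication code; all values, the data-generating process, and the single post-MA control are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 180  # matches the study's analytic sample size

# Synthetic stand-in data (illustrative only; real data are on OSF).
bonding = rng.poisson(3, n).astype(float)    # within-district advice contacts
bridging = rng.poisson(1, n).astype(float)   # outside-district advice contacts
post_ma = rng.integers(0, 2, n).astype(float)
acceptability = 4.85 + 0.06 * bonding + rng.normal(0, 0.5, n)

# Mean-center continuous predictors so the interaction term is interpretable
# and main effects refer to average levels of the other predictor.
bonding_c = bonding - bonding.mean()
bridging_c = bridging - bridging.mean()

# Model 1 design matrix: intercept, bonding, bridging, interaction, control.
X = np.column_stack([
    np.ones(n), bonding_c, bridging_c, bonding_c * bridging_c, post_ma,
])
beta, *_ = np.linalg.lstsq(X, acceptability, rcond=None)
print(dict(zip(["intercept", "bonding", "bridging", "interaction", "post_ma"],
               beta.round(3))))
```

Models 2 and 3 follow the same template with understanding or feasibility as the outcome.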
Results
Table 3 reports the results of the exploratory factor analysis of the URP-I and URP-IR items in our sample. We retained four factors, which explained 83% of the total variance. The first factor, which explained 27.34% of the variance, included 7 of the 8 items associated with the URP-I or URP-IR understanding subscale, and no other items. We used these 7 items to construct our understanding subscale, which exhibited a Cronbach’s α of 0.945 in our sample. The second factor, which explained 21.82% of the variance, included 9 of the 10 items associated with the URP-I or URP-IR feasibility subscale, and no other items. We used these 9 items to construct our feasibility subscale, which exhibited a Cronbach’s α of 0.919 in our sample. The third and fourth factors, which explained 17.38% and 16.83% of the variance, respectively, included 10 of the 17 items associated with the URP-I or URP-IR acceptability subscale, and no other items. The third factor included items that capture intent to implement the intervention (e.g., I am committed to carrying out the EWS steps), while the fourth factor included items that capture acceptability of the intervention for addressing the specific issue of student dropout (e.g., The EWS steps are a good way to prevent student dropout). Therefore, we used the 6 items that load on the third factor to construct a scale of intent to implement EWS (α = 0.897), and the 4 items that load on the fourth factor to construct a scale of acceptability (α = 0.882). Because we did not have any original hypotheses about principals’ intent to implement EWS, we treated analyses of this factor as exploratory. Importantly, although the analyses presented below used subscales based on the factor structure of these data in our sample, we obtained similar results and reached identical conclusions when our analyses used the subscales originally defined by Chafouleas et al. (2009) and Briesch et al.
(2013) instead.
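The paper does not include analysis code; as a minimal illustration of the internal-consistency statistic reported above, the following sketch computes Cronbach's α from raw item scores using its standard formula, α = (k/(k−1))(1 − Σσ²ᵢ/σ²ₜ). The item data below are hypothetical, not the study's data.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of k item columns; each column is a list of n respondents'
    scores on that item. Uses the sample (n-1) variance throughout.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across the k items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(col) for col in items) / var(totals))


# Hypothetical two-item example (4 respondents rating each item)
example_items = [[2, 4, 6, 8], [1, 2, 3, 4]]
alpha = cronbach_alpha(example_items)  # ≈ 0.889
```

With perfectly parallel items the statistic reaches 1.0; the subscale alphas reported above (0.882–0.945) would be produced by the same computation over the 180 principals' item responses.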
Table 4 reports the means, standard deviations, and Pearson correlations for each variable included in the models. Principals sought advice about implementing EWS from an average of three people within their own district (bonding social capital) and from an average of zero or one person outside their own district (bridging social capital). Principals reported viewing EWS as having high levels of all aspects of social validity (acceptability M = 4.85, understanding M = 4.40, feasibility M = 4.06), as well as high intent to implement EWS (M = 4.90). Correlations among the aspects of social validity and intent to implement EWS were moderate (r = 0.36–0.65), indicating that they are related but distinct constructs.
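For readers unfamiliar with the correlation statistic in Table 4, a Pearson r can be computed directly from two score vectors as the covariance divided by the product of the standard deviations. This is a generic illustration, not the study's analysis script; the vectors below are hypothetical.

```python
def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)


# Hypothetical subscale scores for five principals
acceptability = [4.5, 4.8, 5.0, 4.2, 4.9]
understanding = [4.1, 4.6, 4.7, 3.9, 4.4]
r = pearson_r(acceptability, understanding)
```

Values in the 0.36–0.65 range, as reported above, indicate subscales that covary substantially without being redundant.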
Table 5 reports the results of four linear regressions. In model 1, which tests H1, we found that bonding social capital is significantly associated with acceptability (b = 0.06, p = 0.001). In model 2, which tests H2, we found that bridging social capital is significantly associated with understanding (b = 0.15, p = 0.018). In model 3, which tests H3, we found that the interaction of bonding and bridging social capital is not significantly associated with feasibility (b = 0.003, p = 0.896). Finally, in model 4, which was exploratory in nature, we found that bonding social capital is significantly associated with intent to implement EWS (b = 0.05, p = 0.003). Personal characteristics generally had no statistically significant relationship with social validity; however, principals holding a post-MA degree displayed significantly higher levels of acceptability (b = 0.34, p = 0.002), understanding (b = 0.39, p = 0.025), and feasibility (b = 0.33, p = 0.015). Additionally, principals with more EWS experience had significantly higher levels of understanding (b = 0.08, p = 0.03).
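Model 3's test of H3 hinges on how the interaction term enters the design matrix: the product of the bonding and bridging counts is added as its own predictor alongside the main effects. As a hedged sketch (not the authors' code, and with hypothetical toy data), the following solves an ordinary least squares model of that form via the normal equations:

```python
def ols(X, y):
    """OLS coefficients via the normal equations (X'X)b = X'y, solved by
    Gaussian elimination with partial pivoting."""
    p = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    for i in range(p):
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * p
    for i in reversed(range(p)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, p))) / A[i][i]
    return coef


# Hypothetical data: feasibility ~ intercept + bonding + bridging + bonding*bridging
bonding = [3, 1, 4, 2, 5, 0]
bridging = [1, 0, 2, 1, 3, 0]
feasibility = [4.1, 3.8, 4.3, 4.0, 4.5, 3.7]
X = [[1.0, bo, br, bo * br] for bo, br in zip(bonding, bridging)]
coefs = ols(X, feasibility)  # [intercept, b_bonding, b_bridging, b_interaction]
```

The null result reported above corresponds to the final coefficient (the interaction) being statistically indistinguishable from zero in the authors' sample.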
Discussion
Principals often struggle to support their staff in the implementation of systems-level interventions in their schools, compromising the potential of these interventions (Durlak & Dupre, 2008). Uncovering what predicts interventions’ social validity among principals can provide insight into efforts to improve intervention implementation. Past work has examined whether characteristics of interventions predict features of social validity, including acceptability, understanding, and feasibility (Briesch et al., 2015). However, less is known about the role of social networks in predicting interventions’ social validity. In this study, we applied Neal and Neal’s (2019) implementation capital framework to explore the extent to which features of principals’ social networks predict the acceptability, understanding, and feasibility of early warning signs (EWS) to prevent student dropout.
Supporting H1, the results of our study demonstrate that bonding social capital is positively associated with acceptability. Specifically, principals with larger within-district support networks viewed the EWS steps as more acceptable. As Neal and Neal (2019) argued, the within-district networks that comprise bonding social capital can substantiate positive norms, bolster trust, and increase teamwork around the implementation of an intervention, all of which likely lead to higher levels of acceptability. Supporting H2, findings also show that bridging social capital is positively associated with intervention understanding. Specifically, principals with larger out-of-district support networks reported higher levels of understanding of the EWS steps. In line with Neal and Neal (2019), the ties to other settings that comprise bridging social capital can provide access to novel information and resources that can help boost principals' understanding of multi-step, systems-level interventions like EWS.
The results of our study do not provide support for H3. Contrary to expectations, principals' combination of bonding and bridging social capital was not associated with the feasibility of EWS. Given our modest sample size of 180 principals, the study may be underpowered to detect this interaction effect; however, the lack of support for this hypothesis may also be related to the multiple dimensions of feasibility. As defined by Chafouleas et al. (2009), feasibility captures the cost of interventions in terms of time, resources, and effort. A principal's social network may help reduce some of the time and material costs associated with an intervention, for example, by facilitating delegation or locating external financial support. However, other time and material costs are often linked to the complexity of the intervention itself (Proctor et al., 2011)—that is, some interventions simply require significant time and resources—and therefore may not be directly associated with principals' social networks.
Beyond bonding and bridging social capital, some personal covariates were also associated with the social validity of EWS. First, principals with a post-MA degree (e.g., Ed.S., Ed.D., or Ph.D.) exhibited more acceptability of EWS, more understanding of EWS, and viewed EWS as more feasible than their counterparts with less educational attainment. This finding is consistent with the prior literature that suggests that individuals with more years of education are more likely to be early adopters of interventions than their peers (Rogers, 2003). Second, principals who had experience being involved in the implementation of a larger number of the EWS steps exhibited more understanding of EWS. Firsthand experience with multiple parts of EWS appears to enhance principals’ knowledge about how to implement the intervention.
In addition to acceptability, understanding, and feasibility, our exploratory factor analysis of URP-I items revealed a separate factor focused on principals’ intent to implement EWS. Because we did not have any a priori hypotheses about this factor, we conducted an exploratory analysis. Similar to intervention acceptability, we found that principals with larger within-district support networks (i.e., more bonding social capital) indicated more intent to implement EWS. Intent to implement an intervention has been conceptualized as a key implementation outcome, which Proctor et al. (2011) call “adoption.” The implementation capital framework notes that like acceptability, adoption reflects individuals’ perceptions about an intervention and should be associated with bonding social capital, which provides reinforcement and strengthens positive norms about the intervention (Neal & Neal, 2019). This initial exploratory finding lends some support to this premise of the implementation capital framework and should be explored further in future work.
Limitations and Future Directions
This study has some limitations that lend themselves to suggestions for future research. First, the sample was limited to one state in the Midwest (Michigan), one type of educator (principals), and one systems-level intervention (EWS). Future research may wish to replicate the study in other geographic regions, with other types of educators (e.g., teachers or specialists, such as school psychologists), and with other interventions (e.g., PBIS). Second, the cross-sectional nature of the data prevents claims of causality. Specifically, it is possible that principals’ prior acceptability and understanding of EWS inspired them to more proactively seek out support through their social networks, in addition to the social network improving their perceptions of social validity of EWS. Researchers may wish to conduct longitudinal studies to further understand the direction of the relationship between social networks and social validity of interventions. Third, the current study adopted a variable-centered individual-level analysis by focusing on the characteristics of principals within schools. Future studies may benefit from adopting a person-centered approach to identify classes of principals with distinct perspectives of interventions, and from adopting a setting-level approach to identify effects of bridging and bonding social capital that occur at the setting level (i.e., the school; Neal & Neal, 2019). Fourth, the URP-IR (Briesch et al., 2013) includes social validity subscales for system climate (i.e., educators’ perception of intervention fit with the climate of their school) and system support (i.e., educators’ perceptions of practical support needed to carry out the intervention). Due to time considerations, we did not collect these subscales in the current study. However, in future research, it would be interesting to see whether aspects of bonding and bridging capital are associated with these forms of social validity. 
Fifth, although the response rate was comparable to other Web-based surveys in school psychology (Castillo, Curtis, Brundage, March, & Stockslager, 2014), there may have been response bias between principals who did and did not complete the survey, such that principals who responded may have been more invested in EWS implementation and thus more likely to respond. Finally, our measures of bonding and bridging social capital focus on the location of principals’ ties (i.e., within and outside the school, respectively), but future studies may benefit from also examining other features of the network when operationalizing these constructs, including tie strength and network structure (e.g., density; see Carolan, 2013; Lee, 2010; Penuel, Sussex, Korbak, & Hoadley, 2006).
Practice Implications
This study has several implications regarding high-quality implementation of systems-level interventions like EWS. Findings support the relationship between social networks and the social validity of interventions, which highlights the importance for principals to develop consensus, collegiality, and buy-in among implementers. Bonding social capital was associated with increased acceptability of the intervention, suggesting a need for principals to build connections among their staff. Clear communication and a transparent decision-making process about the roles and responsibilities of implementers may foster bonding social capital and the sense that each team member makes an important contribution to the process. Findings also highlight the importance for principals to foster networks outside of their local districts. Bridging social capital was associated with increased understanding of the intervention, underscoring the need for principals to spend some time and resources outside of their district to develop relationships with others who may then provide advice on implementation. In short, resources spent developing social networks both inside and outside principals' home districts may improve the implementation of systems-level interventions and are likely well spent.
Finally, increasing bonding and bridging capital may require more conscious effort when implementing systems-level mental health and behavioral interventions. In particular, bonding and bridging social capital may be relatively easier to foster for instructional interventions compared to systems-level mental health and behavioral interventions like EWS for two reasons. First, teachers may be less inclined to view implementation of systems-level mental health and behavioral interventions as part of their roles and responsibilities (Reinke, Stormont, Herman, Puri, & Goel, 2011), potentially decreasing the number of teachers willing to provide support in the principal’s social network. Second, because principals’ networks are likely to primarily include other educators and not mental health professionals, they may find it more challenging to identify relevant sources of support for systems-level mental health and behavioral interventions.
Conclusion
This study contributes to the literature by testing hypotheses of the implementation capital framework (Neal & Neal, 2019) through the lens of the implementation of EWS, a systems-level intervention to prevent school dropout. Results supported relationships between bonding and bridging social capital and the social validity of the intervention, highlighting the importance of allocating time and resources toward fostering principals' social networks both inside and outside of their home districts. As schools continue to struggle with high-quality implementation of complex, systems-level interventions, this study offers unique insight into how principals can use their social networks to realize the full potential of the interventions they choose to adopt.
Notes
In the implementation capital framework, Neal and Neal (2019) use terminology from Proctor et al.'s (2011) framework of implementation outcomes. Proctor et al. (2011) use the term feasibility to describe what Chafouleas et al. (2009) refer to as understanding. In this paper, we use Chafouleas et al.'s (2009) term, understanding, to describe the ease of use of an intervention.
We explored more complex operationalizations of bonding and bridging social capital that also incorporate information about the principal's frequency of seeking advice from each person: (1) defining bonding social capital as the number of high-frequency (i.e., daily or weekly) within-district advice contacts, and (2) defining bridging social capital as the number of low-frequency (i.e., monthly or annually) outside-district advice contacts. These operationalizations mirror Granovetter's (1973) conception of bonding ties as strong (i.e., close, frequent) and bridging ties as weak (i.e., distant, infrequent). Because these more complex operationalizations yield similar model estimates and identical conclusions, we use the simpler operationalizations for clarity.
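The two operationalizations in this note reduce to simple counting rules over a principal's advice contacts. The sketch below is illustrative only (the field names and data are hypothetical, not the survey's actual variables): the default mirrors the simpler location-based counts used in the main analyses, and the frequency-aware variant mirrors the strong/weak-tie alternative described above.

```python
HIGH_FREQ = {"daily", "weekly"}  # frequent advice-seeking = "strong" ties


def implementation_capital(contacts, home_district, by_frequency=False):
    """Count bonding and bridging advice ties for one principal.

    contacts: list of dicts like {"district": "A", "frequency": "weekly"}.
    Default: bonding = within-district ties, bridging = outside-district ties.
    by_frequency=True: bonding counts only high-frequency within-district ties;
    bridging counts only low-frequency outside-district ties.
    """
    if by_frequency:
        bonding = sum(1 for c in contacts
                      if c["district"] == home_district
                      and c["frequency"] in HIGH_FREQ)
        bridging = sum(1 for c in contacts
                       if c["district"] != home_district
                       and c["frequency"] not in HIGH_FREQ)
    else:
        bonding = sum(1 for c in contacts if c["district"] == home_district)
        bridging = sum(1 for c in contacts if c["district"] != home_district)
    return bonding, bridging


# Hypothetical principal in district "A" with three advice contacts
contacts = [
    {"district": "A", "frequency": "daily"},
    {"district": "A", "frequency": "annually"},
    {"district": "B", "frequency": "monthly"},
]
simple = implementation_capital(contacts, "A")            # (2, 1)
by_freq = implementation_capital(contacts, "A", True)     # (1, 1)
```

Because both rules produced similar model estimates, the location-only counts were retained for interpretability.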
References
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Sklar, M. (2014). Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annual Review of Public Health, 35, 255–274. https://doi.org/10.1146/annurev-publhealth-032013-182447.
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. https://doi.org/10.1007/s10488-010-0327-7.
Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavioral analysis. Journal of Applied Behavioral Analysis, 1, 91–97. https://doi.org/10.1901/jaba.1968.1-91.
Barwick, M. A., Barac, R., Akrong, L. M., Johnson, S., & Chaban, P. (2014). Bringing evidence to the classroom: Exploring educator notions of evidence and preferences for practice change. International Education Research, 2(4), 1–15. https://doi.org/10.12735/ier.v2i4p01.
Batsche, G., Elliott, J., Graden, J., Grimes, J., Kovaleski, J. F., Prasse, D., et al. (2005). IDEA 2004 and response to intervention: Policy considerations and implementation. Alexandria, VA: National Association of State Directors of Special Education.
Battin-Pearson, S., Newcomb, M. D., Abbott, R. D., Hill, K. G., Catalano, R. F., & Hawkins, J. D. (2000). Predictors of early high school dropout: A test of five theories. Journal of Educational Psychology, 92, 568–582. https://doi.org/10.1037/0022-0663.92.3.568.
Blase, J., & Blase, J. (1999). Principals’ instructional leadership and teacher development: Teachers’ perspectives. Educational Administration Quarterly, 35(3), 349–378. https://doi.org/10.1177/0013161x99353003.
Breslau, J., Miller, E., Chung, W. J. J., & Schweitzer, J. B. (2011). Childhood and adolescent onset psychiatric disorders, substance use, and failure to graduate high school on time. Journal of Psychiatric Research, 45(3), 295–301. https://doi.org/10.1016/j.jpsychires.2010.06.014.
Briesch, A. M., Briesch, J. M., & Chafouleas, S. M. (2015). Investigating the usability of classroom management strategies among elementary schoolteachers. Journal of Positive Behavior Interventions, 17, 5–14. https://doi.org/10.1177/1098300714531827.
Briesch, A. M., Chafouleas, S. M., Neugebauer, S. R., & Riley-Tillman, T. C. (2013). Assessing influences on intervention implementation: Revision of the usage rating profile-intervention. Journal of School Psychology, 51(1), 81–96. https://doi.org/10.1016/j.jsp.2012.08.006.
Carolan, B. (2013). Social network analysis and education: Theory, methods & applications. London: Sage.
Castillo, J. M., Curtis, M. J., Brundage, A., March, A. L., & Stockslager, K. M. (2014). Comparisons of response rates, respondent demographics, and item responses for web-based and mail survey modes in a national survey of school psychologists. Trainers’ Forum, 32(2), 32–50.
Chafouleas, S. M., Briesch, A. M., Riley-Tillman, T. C., & McCoach, D. B. (2009). Moving beyond assessment of treatment acceptability: An examination of the factor structure of the Usage Rating Profile-Intervention (URP-I). School Psychology Quarterly, 24, 36–47. https://doi.org/10.1037/a0015146.
Chafouleas, S. M., Sanetti, L. M. H., Jaffery, R., & Fallon, L. M. (2012). An evaluation of a classwide intervention package involving self-management and a group contingency on classroom behavior of middle school students. Journal of Behavioral Education, 21(1), 34–57. https://doi.org/10.1007/s10864-011-9135-8.
Collier-Meek, M. A., Fallon, L. M., & DeFouw, E. R. (2017). Toward feasible implementation support: E-mailed prompts to promote teachers’ treatment integrity. School Psychology Review, 46(4), 379–394.
Corrin, W., Sepanik, S., Rosen, R., & Shane, A. (2016). Addressing early warning indicators: Interim impact findings from the Investing in Innovation (i3) evaluation of Diplomas Now. New York, NY: MDRC.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50. https://doi.org/10.1186/1748-5908-4-50.
DiGennaro, F. D., Martens, B. K., & McIntyre, L. L. (2005). Increasing treatment integrity through negative reinforcement: Effects on teacher and student behavior. School Psychology Review, 34, 220–231. https://doi.org/10.1080/02796015.2005.12086284.
Doll, B., & Cummings, J. (Eds.). (2008). Transforming school mental health services: Population-based approaches to promoting the competency and wellness of children. Bethesda, MD: National Association of School Psychologists.
Durlak, J. A., & Dupre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350. https://doi.org/10.1007/s10464-008-9165-0.
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011). The impact of enhancing students’ social and emotional learning: A meta-analysis of school-based universal interventions. Child Development, 82, 405–432. https://doi.org/10.1111/j.1467-8624.2010.01564.x.
Durlak, J. A., & Wells, A. M. (1997). Primary prevention mental health programs for children and adolescents: A meta-analytic review. American Journal of Community Psychology, 25, 115–152. https://doi.org/10.1023/a:1024654026646.
Eckert, T. L., & Hintze, J. M. (2000). Behavioral conceptions and applications of acceptability: Issues related to service delivery and research methodology. School Psychology Quarterly, 15, 123–148. https://doi.org/10.1037/h0088782.
Esch, P., Bocquet, V., Pull, C., Couffignal, S., Lehnert, T., Graas, M., et al. (2014). The downward spiral of mental disorders and educational attainment: a systematic review on early school leaving. BMC Psychiatry, 14(1), 237. https://doi.org/10.1186/s12888-014-0237-4.
Fabiano, G. A., Pyle, K., Kelty, M. B., & Parham, B. R. (2017). Progress monitoring using direct behavior rating single item scales in a multiple-baseline design study of the daily report card intervention. Assessment for Effective Intervention, 43(1), 21–33. https://doi.org/10.1177/1534508417703024.
Fallon, L. M., & Feinberg, A. B. (2017). Implementing a Tier 2 behavioral intervention in a therapeutic alternative high school program. Preventing School Failure: Alternative Education for Children and Youth, 61(3), 189–197. https://doi.org/10.1080/1045988x.2016.1254083.
Faria, A.-M., Sorensen, N., Heppen, J., Bowdon, J., Taylor, S., & Eisner, R. et al. (2017). Getting students on track for graduation: Impacts of the early warning intervention and monitoring system after one year (REL 2017–272). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Midwest. Retrieved from https://ies.ed.gov/ncee/edlabs/regions/midwest/pdf/REL_2017272.pdf
Finn, C. A., & Sladeczek, I. E. (2001). Assessing the social validity of behavioral interventions: A review of treatment acceptability measures. School Psychology Quarterly, 16(2), 176–206. https://doi.org/10.1521/scpq.16.2.176.18703.
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).
Frambach, R. T., & Schillewaert, N. (2002). Organizational innovation adoption: A multi-level framework of determinants and opportunities for future research. Journal of Business Research, 55(2), 163–176. https://doi.org/10.1016/s0148-2963(00)00152-1.
Goldring, E., Huff, J., May, H., & Camburn, E. (2008). School context and individual characteristics: What influences principal practice? Journal of Educational Administration, 46, 332–352. https://doi.org/10.1108/09578230810869275.
Greenberg, M. T., Domitrovich, C., & Bumbarger, B. (2001). The prevention of mental disorders in school-aged children: Current state of the field. Prevention and Treatment, 4, 1–62. https://doi.org/10.1037/1522-3736.4.1.41a.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581–629. https://doi.org/10.1111/j.0887-378x.2004.00325.x.
Hargreaves, A. (2005). Educational change takes ages: Life, career and generational factors in teachers’ emotional responses to educational change. Teaching and Teacher Education, 21(8), 967–983. https://doi.org/10.1016/j.tate.2005.06.007.
Hussar W., Zhang, J., Hein, S., Wang, K., Roberts, A., & Cui, J. et al. (2020). The condition of education 2020 (NCES 2020–144). U.S. Department of Education, National Center for Education Statistics. Washington, DC. Retrieved July 9, 2020 from http://nces.ed.gov/pubsearch.
Izumi, B., Hoffman, J., Eckhardt, C., Johnson, A., Hallman, J., & Barberis, D. (2015). Harvest for Healthy Kids: A nutrition education curriculum aligned with the Head Start Child Development and Early Learning Framework. NHSA Dialog, 18(2), 43–56.
Jackson, C. B., Herschell, A. D., Schaffner, K. F., Turiano, N. A., & McNeil, C. B. (2017). Training community-based clinicians in parent-child interaction therapy: The interaction between expert consultation and caseload. Professional Psychology: Research and Practice, 48(6), 481–489. https://doi.org/10.1037/pro0000149.
Kazdin, A. E. (1980). Acceptability of alternative treatments for deviant child behavior. Journal of Applied Behavior Analysis, 13, 259–273. https://doi.org/10.1901/jaba.1980.13-259.
Kessler, R. C., Foster, C. L., Saunders, W. B., & Stang, P. E. (1995). Social consequences of psychiatric disorders, I: Educational attainment. American Journal of Psychiatry, 152(7), 1026–1032. https://doi.org/10.1176/ajp.152.7.1026.
Kilgus, S. P., Fallon, L. M., & Feinberg, A. B. (2016). Function-based modification of check-in/check-out to influence escape-maintained behavior. Journal of Applied School Psychology, 32(1), 24–45. https://doi.org/10.1080/15377903.2015.1084965.
Kornbluh, M., & Neal, J. W. (2015). Social network analysis. In L. A. Jason & D. S. Glenwick (Eds.), Handbook of methodological approaches to community-based research (pp. 207–218). New York: Oxford University Press.
Lansford, J. E., Dodge, K. A., Petit, G. S., & Bates, J. E. (2016). A public health perspective on school dropout and adult outcomes: A prospective study of risk and protective factors from age 5 to 27 years. Journal of Adolescent Health, 58, 652–658. https://doi.org/10.1016/j.jadohealth.2016.01.014.
Lee, M. (2010). Researching social capital in education: some conceptual considerations relating to the contribution of network analysis. British Journal of Sociology of Education, 31, 779–792. https://doi.org/10.1080/01425692.2010.515111.
Liem, J. H., Lustig, K., & Dillon, C. (2010). Depressive symptoms and life satisfaction among emerging adults: A comparison of high school dropouts and graduates. Journal of Adult Development, 17(1), 33–43. https://doi.org/10.1007/s10804-009-9076-9.
Locke, J., Lee, K., Cook, C. R., Frederick, L., Vázquez-Colón, C., Ehrhart, M. G., et al. (2019). Understanding the organizational implementation context of schools: A qualitative study of school district administrators, principals, and teachers. School Mental Health, 11(3), 379–399. https://doi.org/10.1007/s12310-018-9292-1.
Lyon, A. R., Cook, C. R., Brown, E. C., Locke, J., Davis, C., Ehrhart, M., et al. (2018). Assessing organizational implementation context in the education sector: confirmatory factor analysis of measures of implementation leadership, climate, and citizenship. Implementation Science, 13(1), 5. https://doi.org/10.1186/s13012-017-0705-6.
Mac Iver, M. A., Stein, M. L., Davis, M. H., Balfanz, R. W., & Fox, J. H. (2019). An efficacy study of a ninth-grade early warning indicator intervention. Journal of Research on Educational Effectiveness, 12, 363–390. https://doi.org/10.1080/19345747.2019.1615156.
Marin, A., & Wellman, B. (2011). Social network analysis: An introduction. In P. Carrington & J. Scott (Eds.), The SAGE handbook of social network analysis (pp. 11–25). London: Sage.
Marsden, P. V. (2011). Survey methods for network data. In J. Scott & P. Carrington (Eds.), The SAGE handbook of social network analysis (pp. 370–388). London: Sage.
McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school-based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40, 659–672. https://doi.org/10.1901/jaba.2007.659-672.
McWhirter, J., McWhirter, B., McWhirter, E., & McWhirter, R. (2012). At risk youth (5th ed.). London: Cengage Learning.
Neal, J. W., & Neal, Z. P. (2019). Implementation capital: Merging frameworks of implementation outcomes and social capital to support the use of evidence-based practices. Implementation Science, 14, 16. https://doi.org/10.1186/s13012-019-0860-z.
Neal, Z., Neal, J. W., & Piteo, A. (2020). Call me maybe: Using incentives and follow-ups to increase principals’ survey response rates. Journal of Research on Educational Effectiveness, 13(2). https://doi.org/10.1080/19345747.2020.1772423.
Neugebauer, S. R., Chafouleas, S. M., Coyne, M. D., McCoach, D. B., & Briesch, A. M. (2016). Exploring an ecological model of perceived usability within a multi-tiered vocabulary intervention. Assessment for Effective Intervention, 41(3), 155–171. https://doi.org/10.1177/1534508415619732.
O’Cummings, M., & Therriault, S. B. (2015). From accountability to prevention: Early warning systems put data to work for struggling students. https://eric.ed.gov/?id=ED576665
Penuel, W. R., Sussex, W., Korbak, C., & Hoadley, C. (2006). Investigating the potential of using social network analysis in educational evaluation. American Journal of Evaluation, 27, 437–451. https://doi.org/10.1177/1098214006294307.
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., et al. (2011). Outcomes for implementation research: Conceptual definitions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38, 65–76. https://doi.org/10.1007/s10488-010-0319-7.
Reinke, W. M., Stormont, M., Herman, K. C., Puri, R., & Goel, N. (2011). Supporting children’s mental health in schools: Teacher perceptions of needs, roles, and barriers. School Psychology Quarterly, 26(1), 1. https://doi.org/10.1037/a0022714.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: The Free Press.
Rumberger, R., Addis, H., Allensworth, E., Balfanz, R., Bruch, J., & Dillon, E. et al. (2017). Preventing drop-out in secondary schools (NCEE 2017-4028). Washington, DC: National Center for Education Evaluation and Regional Assistance (NCEE), Institute of Education Sciences, U.S. Department of Education. https://whatworks.ed.gov
Saldana, L., Chamberlain, P., Wang, W., & Brown, C. H. (2012). Predicting program start-up using the stages of implementation measure. Administration and Policy in Mental Health and Mental Health Services Research, 39, 419–425. https://doi.org/10.1007/s10488-011-0363-y.
Sanetti, L. M. H., & Kratochwill, T. R. (2009). Toward developing a science of treatment integrity: Introduction to the special series. School Psychology Review, 38, 445–459. https://doi.org/10.1037/a0015431.
Sims, W. A., Riley-Tillman, C., & Cohen, D. R. (2017). Formative assessment using Direct Behavior Ratings: Evaluating intervention effects of daily behavior report cards. Assessment for Effective Intervention, 43(1), 6–20. https://doi.org/10.1177/1534508417708183.
Sinclair, M. F., Christenson, S. L., Evelo, D. L., & Hurley, C. M. (1998). Dropout prevention for youth with disabilities: Efficacy of a sustained school engagement procedure. Exceptional Children, 65(1), 7–21. https://doi.org/10.1177/001440299806500101.
Sinclair, M. F., Christenson, S. L., & Thurlow, M. L. (2005). Promoting school completion of urban secondary youth with emotional or behavioral disabilities. Exceptional Children, 71(4), 465–482. https://doi.org/10.1177/001440290507100405.
Szlyk, H. S. (2020). Resilience among students at risk of dropout: Expanding perspectives on youth suicidality in a non-clinical setting. School Mental Health. https://doi.org/10.1007/s12310-020-09366-x.
Therriault, S. B., O’Cummings, M., Heppen, J., Yerhot, L., & Scala, J. (2017). Early warning intervention and monitoring system implementation guide. Michigan Department of Education.
U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Policy and Program Studies Service. (2016). Issue brief: Early warning systems. https://www2.ed.gov/rschstat/eval/high-school/early-warning-systems-brief.pdf
Wells, J., Barlow, J., & Stewart-Brown, S. (2003). A systematic review of universal approaches to mental health promotion in schools. Health Education, 103, 197–220. https://doi.org/10.1108/09654280310485546.
Williford, A. P., Wolcott, C. S., Whittaker, J. V., & Locasale-Crouch, J. (2015). Program and teacher characteristics predicting the implementation of Banking Time with preschoolers who display disruptive behaviors. Prevention Science, 16(8), 1054–1063. https://doi.org/10.1007/s11121-015-0544-0.
Wolf, M. M. (1978). Social validity: The case for subjective measurement or how applied behavior analysis is finding its heart. Journal of Applied Behavior Analysis, 11(2), 203–214. https://doi.org/10.1901/jaba.1978.11-203.
Acknowledgements
This study was funded by a Small Research Grant from the Spencer Foundation (#201900052).
Ethics declarations
Ethical approval
This study was reviewed and determined exempt by Michigan State University's Institutional Review Board (#STUDY00001336).
Conflict of interest
All authors of this manuscript declare that they have no conflicts of interest.
Cite this article
Neal, J.W., Neal, Z.P., Barrett, C.A. et al. Are Principals’ Social Networks Associated with Interventions’ Social Validity?. School Mental Health 12, 812–825 (2020). https://doi.org/10.1007/s12310-020-09388-5