Abstract
Children in rural settings are less likely to receive mental health services than their urban and suburban counterparts and even less likely to receive evidence-based care. Rural schools could address the need for mental health interventions by using evidence-based practices within a tiered system of supports such as positive behavioral interventions and supports. However, very few school professionals, with or without mental health training, have received training on evidence-based practices. Rural schools need implementation strategies focused on training to prepare school personnel for the implementation of interventions with fidelity. Little is known about training strategies that are feasible and appropriate for the rural school context. User-centered design is an appropriate framework for the development of training strategies for professionals in rural schools because of its participatory approach and its emphasis on developing products that fit the context in which they will be used. The purpose of the study was to develop and assess components of an online training platform and implementation strategy based on user-centered design. Quantitative and qualitative data from 25 participants from an equal number of schools in rural areas of Pennsylvania were used in the study. A mixed-methods design utilizing complementary descriptive statistics and theme analyses indicated that the training platform and implementation strategy were perceived as highly acceptable, appropriate, feasible, and usable by school professionals. The resulting training platform and implementation strategy will fill a void in the training literature in rural schools.
Introduction
Children in rural settings are less likely to receive mental health services than their urban and suburban counterparts and even less likely to receive evidence-based care (Wagenfeld, 2003). Student mental health concerns can be addressed in rural schools by using multi-tiered systems of support (MTSS; Center on MTSS, 2022) to organize and adopt feasible and cost-effective mental health evidence-based practices (EBPs; Anderson et al., 2013; Herschell et al., 2021; Kelleher & Gardner, 2017; Wagenfeld, 2003). MTSS, such as positive behavioral interventions and supports (PBIS; Sugai & Horner, 2009), can be extended to integrate and implement mental health EBPs according to level of risk or problem severity (Olson et al., 2021). With adequate training and support, school-based mental health staff such as counselors, school psychologists, and social workers can deliver services to students and assist other school staff with implementing EBPs at the targeted and indicated levels. Such school professionals in rural schools are ideally positioned to address mental health problems because of their knowledge of mental health issues and their experience working with children (Berryhill et al., 2021; Foster, 2005). Unfortunately, most school professionals in rural settings are not trained on specific mental health EBPs (Siceloff et al., 2017) because of difficulty accessing training (Harmon et al., 2007).
The purpose of the present study is to (a) describe the development of an online training platform and implementation strategy for school professionals with some mental health background in rural schools based on the user-centered design approach (Lyon & Koerner, 2016), and (b) examine the platform’s and implementation strategy’s perceived feasibility, appropriateness, acceptability, and usability.
Remote training has been used for training school personnel in rural schools. For example, remote coaching has been used for the implementation of PBIS (McDaniel & Bloomfield, 2020; McDaniel et al., 2020) and to enhance effective classroom behavior management by teachers (Bice-Urbach & Kratochwill, 2016; Fischer et al., 2016). School professionals can be trained remotely using synchronous (live/interactive) or asynchronous (previously recorded, non-interactive, accessed on-demand) approaches. Synchronous consultation has been used to support teachers who were experiencing difficulties addressing disruptive behavior in the classroom (Bice-Urbach & Kratochwill, 2016). Results showed that student disruptive behaviors eased after individualized behavior support plans were implemented. Further, teachers found the remote consultation experience feasible and acceptable.
Rural School Context
Children in rural areas present with similar levels of mental health concerns as children in urban areas but experience more barriers in terms of accessing support than their urban counterparts (Bureau of Health Workforce, 2017; Kelleher & Gardner, 2017; Robinson et al., 2017). Mental health services in rural communities are marred by accessibility, availability, affordability, and acceptability challenges (Ezekiel et al., 2021; Wilson et al., 2015). For example, rural communities have fewer mental health professionals who have been trained on EBPs compared to urban and suburban communities (Larson et al., 2016). Children in rural communities are less likely to have adequate health insurance coverage than children in other locations, and many parents do not have the means to pay out of pocket for services (Newkirk & Damico, 2014). Transportation barriers affect families in rural communities significantly more than families in other settings (Arcury et al., 2005). Further, the stigma of mental health services is still a potent barrier among parents in rural locations (Owens et al., 2007; Polaha et al., 2015). Fortunately, providing services in the school setting can address these systemic and cultural barriers because those services would be widely available, offered in a normalized setting that minimizes stigma, and provided free or at subsidized cost (Kern et al., 2017; Owens et al., 2002; Stephan et al., 2007). Although rural schools are increasingly playing an important role in tending to the behavioral health of students (Hoover & Mayworm, 2017; Owens et al., 2013), they also face significant challenges. For example, rural schools have difficulty attracting trained mental health professionals (American Psychological Association, 2016), have large staff turnover (Lee et al., 2009), receive inadequate funding for mental health services (Slade, 2003), and have difficulty gaining access to quality professional development training (Harmon et al., 2007). 
Providing school professionals with appropriate implementation strategies (i.e., training) that are effective, available on demand, and built for the specific rural context might position rural schools to better serve student mental health needs while simultaneously contributing to narrowing services disparities (Moon et al., 2017; Paulson et al., 2015; Wilger, 2015).
The use of a participatory approach for the development of an implementation strategy for remote training might contribute to greater participation rates among school professionals and help to sustain the use of EBPs with students over time. User-centered design (UCD; Lyon & Koerner, 2016) is a useful framework for the development of a remote training strategy.
User-Centered Design
User-centered design (UCD) has been used extensively in the design of digital products (Abras et al., 2004), mental health interventions (Lyon & Koerner, 2016) and digital mental health interventions (Mohr et al., 2017). UCD is a process that bases the design of an innovation on information provided by constituents, or people who will use the innovation (Goodman et al., 2012; Hanington & Martin, 2012). The general development approach in UCD includes evaluating stakeholder needs in the context in which the product is going to be used, discussing design ideas with stakeholders, developing prototypes of those ideas at varying levels of “fidelity,” conducting initial evaluations with stakeholders, refining the prototypes, evaluating the prototypes to determine if they achieve their purpose, and implementing and evaluating the results (Lyon et al., 2020).
In the present study, the target users for the training platform are school professionals with some mental health training who work in rural schools. The target problems addressed via UCD are the professionals’ training needs based on an examination of prior experiences with mental health training and context-specific implementation barriers (Dopp et al., 2018). The development of the “product” (i.e., platform components and implementation strategy) can be guided by the product’s perceived feasibility, appropriateness, acceptability, and usability of the various prototypes (Lyon & Koerner, 2016). In implementation research, perceived appropriateness refers to the perceived fit, relevance, or compatibility of the innovation for a specific setting; feasibility refers to the extent to which an innovation can be successfully used in a particular setting; and acceptability refers to the perception among stakeholders as to whether the innovation is agreeable, palatable, or satisfactory (Lewis et al., 2015; Proctor et al., 2011). Finally, usability has been defined as the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction (International Organization for Standardization, 1998).
Training Platform and Implementation Strategy
Initial training workshops and ongoing supervision are key strategies for implementing EBPs in schools. Multicomponent training strategies for mental health therapists, comprising an initial workshop followed by ongoing consultation, have been found to be more effective than a single workshop for enhancing therapist clinical skills and knowledge, treatment adherence, and clinical outcomes (Beidas & Kendall, 2010; Herschell et al., 2021).
Existing models of remote training are delivered synchronously via webinar lectures or interactive coaching, and to a lesser extent, asynchronously through previously recorded modules (e.g., Walker & Baird, 2019). Synchronous training has been found to be effective and acceptable to school professionals (Bice-Urbach & Kratochwill, 2016; Fischer et al., 2016). However, relying solely on synchronous training might not be feasible and the training might not be sufficiently potent for school professionals. Finding sufficient time for training has been an important barrier to training busy school professionals (Moore et al., 2022). Given that learning how to implement a new intervention can be difficult and time intensive, offering additional asynchronous training to professionals in rural schools seems both appropriate and likely necessary (King et al., 2021). Asynchronous training would allow busy school professionals to learn EBP implementation at their own pace, review specific session activities right before meeting students for group or individual sessions, and reinforce previously learned material by reviewing modules or consulting manuals online.
The original implementation strategy for this study was developed based on the existing research training literature (Beidas & Kendall, 2010; Herschell et al., 2010; Sholomskas et al., 2005) and our previous work in under-served schools (Eiraldi et al., 2016; Eiraldi et al., 2018; Eiraldi et al., 2019; Eiraldi et al., 2020). The implementation strategy consisted of two distinct levels of online training support: (a) initial synchronous workshop, plus asynchronous modules; and (b) initial synchronous workshop, plus asynchronous modules, plus scheduled synchronous consultation. However, given the specific rural school context, the implementation strategy was adapted in this study based on an examination of stakeholder training needs, barriers to and facilitators of remote training, as well as perceived appropriateness, feasibility, and acceptability of the implementation strategy.
Purpose of the Present Study
The purpose of this pre-implementation mixed-methods study in rural schools was to describe the development of a key component of the training platform (asynchronous modules) and the adaptation of the implementation strategy, both guided by UCD. We employed a mixed-methods design because the collection of quantitative rankings alone would be insufficient for the purpose of the study (Creswell & Clark, 2011). Rich qualitative data were necessary, alongside quantitative rankings, to understand training needs, barriers, and facilitators and to inform prototype revisions. Qualitative data led the adaptation of the implementation strategy, while quantitative data provided a general assessment of feasibility, appropriateness, acceptability, and usability through prototype iterations. We expected that an analysis of reported training needs and barriers to and facilitators of remote training would inform the adaptation of an implementation strategy that would be acceptable and feasible to implement in the rural school context, and that participants would express willingness to participate in training. The study was conducted prior to a randomized controlled trial to examine school professional and student outcomes. The protocol for the larger study has been described in detail elsewhere (Eiraldi et al., 2022).
The contributions of the study to school mental health include the use of UCD for the development of a remote training platform and implementation strategy for school-based mental health professionals in rural schools, and the platform’s perceived feasibility, appropriateness, acceptability, and usability.
Method
The study employed a mixed-methods design with a QUAL + quan structure in which qualitative data served a primary role and quantitative Likert-scale rankings provided a supportive, secondary role (Creswell & Clark, 2011). The function of the mixed-methods design was that of complementarity. We used semi-structured interviews and open-ended responses (QUAL) to elaborate upon quantitative findings (quan) and to better understand the process of implementation of remote consultation as experienced by stakeholders. The qualitative data provided insight into training needs, barriers, and facilitators that informed prototype revisions and were collected alongside quantitative rankings, which provided a high-level assessment of feasibility, appropriateness, acceptability, and usability across participants.
Participants
The study was conducted with school professionals from 25 K-8 rural schools in Pennsylvania that were implementing PBIS at Tier 1. Most participants were female, white, and non-Hispanic. More than two thirds (68%) of participants were school counselors. Social workers made up 8% of the sample. The other 16% included the following roles: reading specialist and PBIS coach, teacher, emotional and autistic support teacher, and behavior interventionist. Roughly equal numbers of school professionals had fewer than 10 years and more than 10 years of experience on the job (see Table 1).
Inclusion Criteria
Any school, designated by the US Census Bureau as “rural,” with a PBIS program that was implementing Tier 1 with fidelity, with or without a functioning Tier 2, was considered for inclusion in this study. Fidelity of implementation at Tier 1 was important because Tier 1 is foundational for the development of mental health interventions at the advanced tiers of support (Hawken et al., 2009). Due to the COVID-19 pandemic, fidelity assessment in most Pennsylvania schools shifted from requiring a score above a set percentage to simply issuing a score. As such, participant schools entered the study with only a prior indication of implementing PBIS with fidelity, based on the results of the 2019–2020 school year measurement according to criteria set by the Pennsylvania PBS Network (i.e., a score ≥ 70% for Tier 1 on the Tiered Fidelity Inventory; Algozzine et al., 2014). Any school-based professional (e.g., school counselor), with or without experience implementing Tier 2 interventions, based at a school implementing PBIS, was eligible for inclusion in the study.
Measures and Study Procedures
Study staff emailed or called principals of rural schools who were implementing PBIS to explain the study and to ask if they would allow a school mental health professional or other professional from their school to participate. If the principal agreed, study staff described the study to the school professional and asked if they were interested in participating. If they were interested, they were read a consent form and asked to verbally consent. The verbal consent form indicated that agreeing to participate would entail participating in qualitative interviews, completing surveys, watching and rating modules on perceived feasibility, appropriateness, acceptability, and usability, and potentially participating in a training randomized controlled trial.
We conducted evaluative and iterative strategies (Kirchner et al., 2018) to ensure that the remote training strategy would be a good fit with the rural school context. Due to unanticipated logistical barriers, we made modifications to the UCD approach. Because recruitment of schools took more time than anticipated, we collected data from schools as they entered the study rather than concurrently from all schools. This prevented us from concurrently evaluating the fidelity of several prototypes, as we had originally planned (Eiraldi et al., 2022). Instead, we evaluated one paper prototype of the interventions and implementation strategy (Prototype 1), followed by one asynchronous video prototype of the interventions and a revised paper prototype of the implementation strategy (Prototype 2). At the conclusion of the evaluation of Prototype 2, we used the member checking approach (Harvey, 2015) to give participants a chance to offer feedback on the perspectives provided by other respondents.
We used a questionnaire to collect information about the demographic characteristics of study participants.
We conducted two semi-structured, qualitative interviews with each of the 25 participants. Theme saturation was achieved by conducting more than 12 interviews in a largely homogenous population (Guest et al., 2006). The interviews were conducted by one of the co-authors (RC), a qualitative data specialist. Interviews were conducted over the phone and scheduled at a time that was convenient to participants.
The first qualitative interview guide elicited views about past experience with professional training and perceived barriers and facilitators to participation in consultation sessions and conducting groups with students (e.g., What would make it difficult for you to participate in consultation sessions and conduct groups with students?). This interview was conducted immediately after the participant consented to participate in the study.
After analyzing the results of the first interview, and informed by prior training work with school professionals in underserved schools (Eiraldi et al., 2020; Eiraldi et al., 2015), we developed and administered a second interview. For the second interview, we provided participants with a written description of Prototype 1. We provided a description of EBPs that would be offered, training and consultation components, a rationale for the need for each component, a description of training modules and the approximate time required for them, as well as a description of the implementation strategy (i.e., remote training and consultation; Kern et al., 2011). Participants were asked to rate the EBPs and the different components of the training and consultation for feasibility and acceptability using a 5-point Likert-type scale. They were also asked why components did or did not appear feasible or acceptable (Kern et al., 2011) and their perspectives about willingness to participate in remote training (See Fig. 1).
Asynchronous Modules
We developed video modules showing a step-by-step walkthrough of group session preparation for Coping Power Program (CPP; Lochman et al., 2008) for children at risk for externalizing behavior problems, CBT Anxiety Treatment in Schools (CATS; Khanna et al., 2016) for children at risk of anxiety problems, and Check-in/Check-out (CICO; Hawken et al., 2014) for children at risk for externalizing behavior problems. Additional video modules included strategies for managing child therapy groups, screening procedures for children in Tier 2 and “red flags” for identifying children at risk for externalizing and internalizing disorders, and a review of barriers and ways to overcome barriers to implementing mental health EBPs.
Implementation Strategy
We developed an implementation strategy with three main components: (a) initial training workshop; (b) e-learning training modules on demand; and (c) consultation. The consultation component consisted of didactics and coaching. Didactics included: (a) discussing student referrals; (b) conducting a step-by-step walkthrough of the session objectives; (c) reviewing the theoretical principles behind the intervention components for that session; (d) encouraging adherence to the intervention manual; (e) problem-solving barriers to implementation and helping school professionals reflect on past challenges in order to successfully implement the upcoming sessions; and (f) enhancing school professionals’ use of empathy and positive reinforcement. Coaching included (a) setting goals for content delivered from the manual; (b) self-reflection; and (c) receiving performance feedback. We emphasized the importance of implementing the program as intended and the expectation that school professionals would reach a high level of fidelity when implementing the interventions. Then, school professionals would be asked to reflect on the previous intervention session, and the consultant would make some observations about the previous group session or CICO case. The consultant would be expected to discuss how the school professional handled student behavior in session, including overall level of participation and enthusiasm, and disruptive behavior. Participants would also be told the approximate amount of time it would take them and the consultants to complete each component. Finally, participants would be told that the final version of all asynchronous videos and copies of implementation and intervention manuals would be available for download from a project website during the clinical trial phase of the study.
Prototype 2 of Video Modules: Ratings and Qualitative Questions
Participants were emailed instructions and two uniform resource locators (URLs) that they used to watch and rate modules for perceived feasibility, appropriateness, acceptability, and usability of the second prototype. Given that some of the e-learning modules were quite lengthy, we divided the sample of participants and randomly assigned them to three smaller groups of about eight participants so that each group would watch and review different modules. Participants were instructed to watch the modules, rate them, and provide their opinions about them using free text format. They were asked to complete the Intervention Appropriateness Measure (IAM), Acceptability of Intervention Measure (AIM), and Feasibility of Intervention Measure (FIM; Weiner et al., 2017) via Research Electronic Data Capture (REDCap). Each of the three measures comprises four items rated on a 5-point Likert-type scale (1 = completely disagree, to 5 = completely agree). The Cronbach alphas for the measures range from 0.85 to 0.91. A three-factor CFA exhibited acceptable fit (CFI = 0.96, RMSEA = 0.08) and high factor loadings (0.75 to 0.89), indicating structural validity. Seven-week test–retest reliability coefficients ranged from 0.73 to 0.88. Regression analysis indicated each measure was sensitive to change in both directions (Weiner et al., 2017). Participants were also asked to provide comments to expand on their ratings (e.g., “Please comment on the module about CICO”). Participants were also instructed to complete the Usability Subscale (US) of the Telehealth Usability Questionnaire (TUQ; Parmanto et al., 2016) to measure usability. The US is a 7-point Likert-type instrument (1 = Strongly disagree, to 7 = Strongly agree). The Cronbach alpha of the usability measure is 0.93 (Parmanto et al., 2016). We made slight adaptations to the US (e.g., changing the word “systems” to “training modules”) for the evaluation of all instructional modules.
After completing the questionnaire, respondents were also asked to comment on their answers (e.g., “What was simple to use about the training modules?”).
Prototype 2 of Implementation Strategy: Ratings and Qualitative Questions
Participants were also emailed descriptions of the revised implementation strategy and asked to complete a survey regarding perceived acceptability, feasibility, and willingness to participate in remote training and a qualitative interview about their views of the proposed implementation strategy. The survey had 18 questions (e.g., How willing would you be to participate in the coaching portion of the consultation?) rated on a 1 (not willing at all) to 5 (extremely willing) Likert-type scale.
In the second one-on-one interview, the same 25 participants responded to the paper prototype of the remote training strategy. They were asked why components did or did not appear feasible or acceptable (e.g., What makes the coaching part of the consultation feasible or not feasible?) and their perspectives about willingness to participate in remote training.
Member Checking
After reviewing results of the quantitative surveys, we summarized the information from the free text responses. We utilized a modified synthesized member checking process (Birt et al., 2016; Harvey, 2015) to give participants a chance to offer feedback on the perspectives provided by other respondents. We synthesized and summarized emerging themes from the previous survey and asked participants to rank on a Likert scale the extent to which these themes matched their experience or perspective. We also provided an opportunity for participants to answer open-ended questions to explain their perspectives.
Data Analyses
For the examination of prior training experiences and barriers and facilitators of remote training, we imported transcripts of semi-structured Interview # 1 into NVivo (QSR International, 2020), a qualitative data management and analysis software. Analyses were guided by an integrated approach (Bradley et al., 2007) that included identification of a priori attributes of interest (i.e., constructs important to consider in the development of the remote training strategy) and modified grounded theory, which provides a rigorous, systematic approach to identifying emergent codes and themes.
For the examination of perceived feasibility, appropriateness, acceptability, and usability of the prototypes, we imported transcripts of semi-structured Interview # 2 into NVivo for data management and analyses. We also coded open-ended responses from surveys. Over two iterations, we gathered data from 5-point rating scales (AIM, IAM, FIM, usability) and qualitative data (i.e., semi-structured interviews, written answers) simultaneously (Palinkas et al., 2011a, 2011b). We utilized descriptive statistics (mean, standard deviation) of acceptability, feasibility, and willingness to participate for each of the components of the training and use of remote technology. Interview and open-ended response data were analyzed to elaborate upon quantitative findings and to better understand the process of implementation of remote consultation as experienced by stakeholders (Palinkas et al., 2011b).
Results
The pool of potential participants was comprised of school professionals from 153 schools in Pennsylvania, classified as rural (fringe, distant, remote) according to the US Census Bureau, implementing PBIS at Tier 1. We emailed or called the principal of each of these schools to explain the study and ask whether school staff could participate. We were able to explain the study to 77 principals (50% response rate). The principals from 25 schools agreed to let the school professionals participate (16% participation rate); all participants from the 25 schools consented to participate in the study. One participant had incomplete data but was kept in the study. The final sample was composed of 25 school professionals (see Table 1).
User-Centered Design Prototype Evaluation and Modification
This section includes a description of stakeholder feedback and modifications of the training platform and implementation strategy. Data from the 25 participants are organized into three sub-sections: Barriers and Facilitators to Participating in Remote Training, based on Interview # 1 data; Asynchronous Modules, based on Survey # 2 and member check data; and Implementation Strategy, based on Interview # 2, Survey # 2, and member check data. Data are presented pertaining to qualitative theme analysis for barriers and facilitators to participating in remote training, quantitative descriptive statistics and qualitative theme analysis for asynchronous video modules, and descriptive statistics and theme analysis for implementation strategy. These data were summarized and discussed by the research team in order to make modifications to the original versions of the training platform and implementation strategy. When quotes are provided, we identify them using P (participant) followed by the participant identification number. We end this section by describing the changes that were made to the modules and implementation strategy.
Barriers and Facilitators to Participating in Remote Training
In the first qualitative interview, participants reported their previous experience with training as well as potential barriers and facilitators to the remote training and consultation process described to them. These barriers and facilitators fell into the following themes: prior experience, perceptions about engagement in remote format, the availability of necessary resources, time as a barrier, and school buy-in and support as a facilitative condition.
Prior Experience
All participants reported previously participating in remote trainings, and all but one participant reported at least some prior training in mental health. Although many participants mentioned graduate courses as a source of this training, participants also mentioned trainings to meet continuing education requirements, PBIS forums, trainings as a part of professional association membership, trainings from their regional technical assistance centers, and other trainings and courses. Most participants stated that they had received training on or related to PBIS, CICO, and safe crisis management and intervention. A few participants reported receiving training on trauma-informed approaches, and one participant reported receiving training on cognitive behavioral therapy.
Some participants raised that participating in remote training and consultation would be feasible and/or acceptable due to their prior experiences. As one participant (P14) stated, “Not that COVID has given us a whole lot of positives, but this is one of them, because I think we've all gotten very used to using computers to communicate with each other and just kind of thinking outside of the box. So that doesn't make me nervous.”
Engagement in Remote Format
Several participants plainly stated that participating in remote training would not be a problem, but a few participants noted that some components of in-person training are missed in remote training. As one participant (P17) explained, “I don't like [remote] quite as well as in person because I feel like I get distracted more easily. So, I feel like I have to work a little harder to really focus and pay attention when it's done on teleconference… I just would prefer in-person, if I had my choice.” A different participant (P04) was concerned about the interaction piece of remote trainings, but explained that it would be possible to promote collaboration virtually:
…in a small group where you're able to, you know, just like collaborate together, that would be much different…We hold our staff meetings virtually now and it's a small group and we talk about some different things with kids. And those I do enjoy because—or I get a lot out of—because we can all kind of bounce ideas off of each other.
Another participant (P15) shared that in-person trainings were preferable, but that they have had no problems with remote trainings: “I think it's a little bit more difficult to do remote than it is in person and live, just for the personal interaction piece of it. But I haven't really had any major issues with doing remote trainings.”
Resources
A few participants stated that they had access to computers and internet, which would support feasibility of the remote training and consultation. One respondent (P15) noted that they often have technology issues at their district, which might cause a barrier, explaining, “Our technology at our school district is not the best. So, there are moments where our internet goes down, we lose power…So I would say my only barrier is our lack of like solid technology.”
Time
Time emerged as the most commonly reported barrier to participating in remote training and consultation as well as implementing interventions. As one participant (P09) generally stated, “I mean, time is always, like, of the essence…” Most respondents specifically pointed to their own complicated job descriptions, which require them to prioritize emergencies in the school building. A couple of respondents also noted that it can be more difficult to indicate that you are busy when participating in remote training as compared to in-person training; therefore, they may be more likely to be interrupted. As one respondent (P20) explained, “I would still probably be in the school building and I would get interrupted…They would expect me to still be working even though I was in a training.”
School Buy-in and Support
Most participants named school and administrator buy-in and support as a primary facilitator for participating in remote training and consultation. One participant (P07) described how they believed support from administration might help mitigate barriers around time:
I don’t really think that there would be difficulty because it lends itself to, like, my actual job and the school district is supporting the participation…It wouldn’t be difficult for me, but I guess the only logistic that would have to be worked out would be in terms of scheduling just to make sure that I have access to the students at the time that we would need them or whatever. But yeah, I can’t really right now think of any barriers, so to speak, mainly because the school district is on board. So, they’re aware; it’s something we’ve committed to working through.
Many echoed this sentiment, stating that their administration would provide support for time and scheduling barriers.
Asynchronous Modules
Participants were randomly assigned to one of three groups, each of which watched and rated a set of modules. The first group (n = 8) watched 12 CPP modules, the second group (n = 9) watched 8 CATS modules, and the third group (n = 7) watched 8 CICO modules as well as a few additional modules not associated with a specific intervention. The additional modules included: (1) a module about effective strategies for running groups with children, (2) a module about how to conduct a brief in-service training with school faculty about recognizing signs of externalizing and internalizing problems in students and an overview of the screening process for identifying students for Tier 2 interventions, and (3) a module about barriers to implementing mental health EBPs, including those identified by participants, and ways to overcome them. Participants rated each group of modules for perceived acceptability, appropriateness, feasibility, and usability. The scores were uniformly high. Acceptability scores ranged from 3.89 (CATS; SD = 0.66) to 4.36 (CICO; SD = 0.67), appropriateness scores ranged from 4.18 (Identifying students in need of services; SD = 0.77) to 4.57 (CICO; SD = 0.53), and feasibility scores ranged from 4.17 (CATS; SD = 0.59) to 4.57 (Group management; SD = 0.53). The lowest score for usability was 6.11 (CATS; SD = 0.91), and the highest score was 6.67 (CPP; SD = 0.44); see Table 2.
In general, the qualitative data mirrored the quantitative ratings. Most participants felt positively about the training modules; they reported that the modules were useful and easy to use. In open-ended responses, participants reported that they specifically liked that the content was broken down into modules corresponding to parts of the intervention manuals and that the modules were convenient and easy to use. However, participants also provided critical feedback, particularly regarding engagement, sound quality and background noise, formatting, clarity, and the explicit connection of information. Below we summarize the feedback participants provided in these areas.
Engagement
Participants assigned to most groups reported that the modules were a bit boring and/or repetitive. Several participants across most groups reported resources or strategies that could strengthen learning, including providing a “training packet” (P03) or physical document to use to take notes; breaking information down into more, shorter slides; organizing the bullet points so that they appear in the order that the narrator is speaking about them; integrating more visuals and examples; and asking questions during the module to keep the participant engaged.
In the member check, almost everyone agreed that training modules should have supporting documents, such as a “training packet,” PowerPoint slides, or a workbook. Although no one strongly disagreed, participants had mixed perspectives about whether the training modules should have interactive elements, such as questions to answer while watching. Only one person (P10) added an additional remark about this in open-ended responses, explaining, “I complain when training modules have interactive elements, but it does help me focus and motivate me to more completely learn the information.”
Sound Quality/Background Noise
Participants in most of the assigned video groups noted background noise as a quality issue. However, when asked directly about the issue in the member check survey, participants across all groups reported mixed levels of satisfaction with background noise. For example, one participant (P18) wrote, “It was not a major problem.”
Formatting
Participants also noted some quality issues such as the size of the text on the slides, inconsistent formatting, and, in one specific video group, verbal information not always aligning with text information. One participant (P12) stated that “overall presentation” made the modules difficult to use and questioned whether slides were “ADA compliant” because they were difficult to read due to color or size. Participants suggested changing the size of the text and cleaning up the audio. In the member check, we found that there was mixed satisfaction with consistent formatting across the CATS and Coping Power groups but moderate to high satisfaction with text size overall.
Content Clarity and Explicit Connection of Information
Although there were some mixed perspectives, most participants thought that the modules were clear and provided information that would help them with implementation, or, as one participant (P01) said, “to the point.” Participants specifically valued the examples that were provided in the training modules. However, they provided some feedback to improve this component of the modules. A few participants stated that specific modules (the modules about managing therapy groups) needed more examples and others (in the CATS group) thought that examples were unrealistic, not clear and concise enough, and that the trainee should be better oriented to the part of the group that they are about to watch in the video example.
Implementation Strategy
The survey scores from the 25 participants about perceived acceptability, feasibility, and willingness to participate in remote training were uniformly high. Acceptability scores ranged from 4.77 (SD = 0.43) for use of remote technology to 4.92 (SD = 0.27) for the CATS group intervention. Feasibility scores ranged from 4.40 (SD = 0.87) for the CPP group intervention to 4.81 (SD = 0.49) for coaching. Willingness to participate scores ranged from 4.58 (SD = 0.90) for the CICO individualized intervention to 4.81 (SD = 0.57) for CATS (see Table 3).
In the second one-on-one interview in which the same 25 participants responded to the paper prototype of the remote training strategy, participants reported themes that built upon those they reported in the first qualitative interview. These themes were related to prior experience, engagement, resources, time, and school buy-in and support.
Prior Experience
As in the first interview, participants responding to the proposed implementation strategy reported that their prior experiences with training would be a facilitator for participating in remote training and consultation. Specifically, several participants noted that they had experience with telehealth technology and a few shared that they had experiences similar to the didactics protocol described in the paper prototype. In contrast, one participant (P08) noted that the didactics protocol would be new to them and would require “getting comfortable with it and everything.”
Engagement in Remote Format
A few participants responding to the proposed implementation strategy in the second interview maintained a preference for in-person training opportunities, as raised in the first interview. However, they similarly did not report that doing trainings remotely would be wholly negative: “I think that it can have its benefits” (P19).
Resources
Participants also echoed sentiments reported in the first interview when reporting that they have resources, specifically computers and internet, that would support feasibility of the training and consultation. While one spoke about occasional internet glitches and another spoke about issues with telephone service, others voiced that they had no concerns about the resources needed to participate.
Time
As in the first interview, the biggest barrier that emerged in the interview responding to the proposed implementation strategy was the challenge of time to participate in online trainings and consultation. Participants spoke about the same issues (having to prioritize emergency situations and juggle multiple duties).
Despite this, participants noted several strategies that would mitigate barriers related to time: having a consistent schedule or planning ahead of time, allowing for flexibility when necessary, and making use of hours outside of the regular school day.
A few participants noted that it is easier to attend online trainings or remote consultation due to convenience. One of these participants (P14) similarly noted that they preferred in-person trainings but considered the convenience of online trainings: “It does create that flexibility and, again, like reduces transportation and stuff like that.” Another participant (P13) also explained that remote trainings are easier to attend as they require fewer logistics to figure out, such as finding coverage: “I have found that meetings in general that have been taking place via Zoom or some type of [virtual] meeting…we’ve had more success in general. Whether that’s people being able to attend, not having to get as much coverage.”
School Buy-in and Support
Echoing responses from the first interview, many participants named school-wide support as a facilitator to participation in the proposed implementation strategy. Several voiced that they had support from their administration, but a few noted that feasibility of the proposed implementation strategy would depend on administration and staff support.
Summary of Changes to Training Platform and Implementation Strategy
Quantitative ratings and qualitative feedback from participants guided revision of the asynchronous video modules and implementation strategy. Quantitative and qualitative data specific to the asynchronous video modules informed the addition of several engagement strategies and refinement of modules to increase their quality and clarity. Data related to the training and consultation implementation strategy informed strategies to mitigate barriers related to time, engagement, and school buy-in and support. We outline the changes made to the modules and implementation strategy in Table 4.
Discussion
We used UCD to develop an online training platform and accompanying implementation strategy for school professionals serving children at-risk for mental health problems in rural schools and examined stakeholders’ responses to the platform and implementation strategy. The training platform addresses an acute need for specialized training in EBPs by professionals in rural schools (Harmon et al., 2007). An important first step was the assessment of participants’ previous experiences with training and perceived barriers and facilitators of remote training. This information was used to develop the paper modules and implementation strategy to fit the context of rural schools. Most participants were mental health professionals (i.e., school counselors or social workers; 76% of the sample). A few participants (e.g., a reading specialist and PBIS coach, a regular education teacher) had received little to no prior mental health training in pursuit of their professional degree. Most participants reported having prior experience with receiving in-person and remote training to meet continuing education requirements, such as training on mental health interventions. However, very few reported receiving training on interventions for internalizing problems, and only one participant reported receiving training on an EBP. This is consistent with findings from previous studies indicating that most professionals in rural schools have not been trained on interventions for internalizing disorders, or on EBPs for any disorder (e.g., Siceloff et al., 2017).
The most important barriers reported by participants included having difficulty finding the time to participate in training, receive consultation from members of the research team, or deliver interventions to students. Participants also reported that it could be difficult to obtain buy-in from teachers and caregivers for the completion of measures, and from administrators for the implementation of EBPs. Obtaining parent/guardian consent to allow children to receive mental health interventions was also identified as a potential barrier. The presence of these barriers is consistent with findings from previous studies in rural schools (Moore et al., 2022). Participants identified solutions for dealing with the time barrier, including having flexible times for participation in remote training, receiving advance notice so they can fit training into their schedule, and obtaining buy-in from administrators so they can have some flexibility in their schedule.
Participants identified several concerns about the video modules, including inadequate engagement in the delivery of material, sound quality, formatting of slides, and clarity of content.
We addressed the reported barriers in the revised videos and implementation strategy. For example, we included an EBP for anxiety problems, offered specific training on how to identify and screen children at risk for mental health problems, and provided training on how to run groups effectively. The revised implementation strategy includes a flexible schedule for conducting the initial training and subsequent consultation to better fit the schedules of busy school professionals, and more interactive communication during the initial training. To increase buy-in, we added sections to the asynchronous training to highlight the evidence supporting intervention effectiveness and intervention impact on students. The implementation strategy now includes sharing information with administrators and reminding them about the need for their support of school professionals delivering the interventions.
We edited the modules to address audio and visual formatting issues, clarified information on several slides that were reported to be confusing, and improved the flow of how information was presented in the modules. We noted that parent/guardian consent could be difficult to obtain in some cases and offered different options for describing the study to caregivers and obtaining consent/assent.
Participants reported that participation in remote training would be feasible and acceptable given prior experiences, and that they have access to computers and internet. However, some participants noted that they had encountered some problems with internet connectivity, and they raised concern that they would likely be interrupted during supervision and implementation of the different interventions.
Many participants in the qualitative interviews reported being excited and motivated to participate in remote training. This is not surprising given that school professionals in rural schools have few opportunities to gain access to quality professional development training (Harmon et al., 2007).
UCD was a very helpful framework for guiding the development and refinement of the training components and implementation strategy. The framework, which uses a participatory approach, helps with the development of products that are responsive to context (Goodman et al., 2012) and that are acceptable, appropriate, feasible, and easy to use (Lyon et al., 2020).
Limitations
The study has some limitations. First, no school psychologists participated in the study. School principals nominated participants who were members of the PBIS leadership team. Although no data were collected about individual members of the larger leadership team, it is possible that school psychologists were not nominated because of their busy schedules with testing or because they were not taking part in this team. Second, it is not clear to what extent participant schools are representative of other rural schools. Although the response rate was adequate, the participation rate was low. Other studies conducted in schools have also reported low participation rates (e.g., Heinrichs et al., 2005). Third, we were able to complete only two iterations of the prototypes. This might have limited the refinement process of the training platform and implementation strategy, which might result in unforeseen implementation barriers. Fourth, the platform and implementation strategy were largely developed based on data provided by school counselors and social workers and, as such, data on feasibility, acceptability, and appropriateness might not generalize to other school professionals (e.g., school psychologists), school faculty, or paraprofessionals. Fifth, the quantitative data analysis was only descriptive. A more robust statistical analysis delineating differences across groups based on group participant demographics, EBP-specific components, and module length would have strengthened the results. Sixth, only participant-level data were collected. Future researchers could strengthen the study by including school-level demographic information.
Implications and Future Directions
The participation of school professionals from rural schools in a training development project based on implementation science approaches has implications for training. Implementation science has been described as “essential to the process of translating evidence-based interventions into the unique context of schools” (Forman et al., 2013, p. 77). Training programs in school psychology, counseling and social work have steadily increased instruction on mental health EBPs and implementation of EBPs (Regehr et al., 2007; Shernoff, 2017; Zyromsky et al., 2018). However, a large number of school practitioners still do not use EBPs either because they have never received appropriate instruction or because they face significant barriers to implementing them (Hicks et al., 2014). A solution to this training gap is training school professionals in the places where they work. Training professionals in rural schools (a specific school context) requires strategies focused on implementing with fidelity, given the close connection between fidelity and student outcomes (Durlak & Dupre, 2008) and a delivery approach that accounts for specific barriers and facilitators for the rural school context (Paulson et al., 2015).
Given existing barriers such as limited time for training activities, finding the right combination of remote training components (e.g., use of asynchronous modules, synchronous coaching) vis-à-vis fidelity and student outcomes seems like an important next step. This will be addressed in the upcoming clinical trial.
Conclusions
The study makes contributions to the research literature by providing a step-by-step description of the development of a remote training platform and implementation strategy based on UCD. The use of a participatory approach for the development of the training strategy should increase training buy-in and minimize common implementation barriers for the use of EBPs in a group of under-served schools. Providing school professionals with appropriate implementation strategies (i.e., training) that are effective, available on demand, and built for the specific rural context might enable rural schools to better serve student mental health needs and contribute to narrowing service disparities (Moon et al., 2017; Paulson et al., 2015; Wilger, 2015).
Availability of Data and Material
Data are available upon reasonable request.
References
Abras, C., Maloney-krichmar, D., & Preece, J. (2004). User-centered design. In W. Bainbridge (Ed.), Encyclopedia of human-computer interaction. Sage Publications.
Algozzine, B., Barrett, S., Eber, I., George, H., Horner, R., Lewis, T., Putnam, B., Swain-Bradway, J., McIntosh, K., & Sugai, G. (2014). School-Wide PBIS Tiered Fidelity Inventory. OSEP Technical Assistance Center on Positive Behavioral Interventions and Supports.
American Psychological Association. (2016). County-level analysis of U.S. licensed psychologists and health indicators. Washington, DC: Author.
Anderson, N. J., Neuwirth, S. J., Lenardson, J. D., & Hartley, D. (2013). Patterns of care for rural and urban children with mental health problems. Portland, ME. Retrieved from https://chronicleofsocialchange.org/report/patterns-of-care-for-rural-and-urban-children-with-mental-health-problems
Arcury, T. A., Preisser, J. S., Gesler, W. M., & Powers, J. M. (2005). Access to transportation and health care utilization in a rural region. The Journal of Rural Health, 21(1), 31–38. https://doi.org/10.1111/j.1748-0361.2005.tb00059.x
Beidas, R. S., & Kendall, P. C. (2010). Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology Science and Practice, 17(1), 1–30. https://doi.org/10.1111/j.1468-2850.2009.01187.x
Berryhill, B., Carlson, C., Hopson, L., Culmer, N., & Williams, N. (2021). Adolescent depression and anxiety treatment in rural schools: A systematic review. Journal of Rural Mental Health, 46(1), 13–27. https://doi.org/10.1037/rmh0000183
Bice-Urbach, B. J., & Kratochwill, T. R. (2016). Teleconsultation: The use of technology to improve evidence-based practices in rural communities. Journal of School Psychology, 56, 27–43.
Birt, L., Scott, S., Cavers, D., Campbell, C., & Walter, F. (2016). Member checking: A tool to enhance trustworthiness or merely a nod to validation? Qualitative Health Research, 26(13), 1802–1811. https://doi.org/10.1177/1049732316654870
Bradley, E. H., Curry, L. A., & Devers, K. J. (2007). Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Services Research, 42(4), 1758–1772. https://doi.org/10.1111/j.1475-6773.2006.00684.x
Bureau of Health Workforce. (2017, January). Designated health professional shortage areas statistics. Washington, DC: U.S. Department of Health & Human Services.
Center on MTSS (2022). Essential components of MTSS. American Institutes of Research. https://mtss4success.org/essential-components
Creswell, J. W., & Plano Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Sage Publications.
Dopp, A. R., Parisi, K. E., Munson, S. A., & Lyon, A. R. (2018). A glossary of user-centered design strategies for implementation experts. Translational Behavioral Medicine. https://doi.org/10.1093/tbm/iby119
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350. https://doi.org/10.1007/s10464-008-9165-0
Eiraldi, R., Benjamin Wolk, C., Locke, J., & Beidas, R. (2015). Clearing hurdles: The challenges of implementation of mental health evidence-based practices in under-resourced schools. Advances in School Mental Health Promotion, 8(3), 124–145. https://doi.org/10.1080/1754730X.2015.1037848
Eiraldi, R., Power, T., Schwartz, B., Keiffer, J., McCurdy, B., Mathen, M., & Jawad, A. F. (2016). Examining effectiveness of group cognitive behavioral therapy for externalizing and internalizing disorders in urban schools. Behavior Modification, 40, 611–639. https://doi.org/10.1177/0145445516631093
Eiraldi, R., Mautone, J. A., Khanna, M. S., Power, T. J., Orapallo, A., Cacia, J., Schwartz, B. S., McCurdy, B., Keiffer, J., Paidipati, C., Kanine, R., Abraham, M., Tulio, S., Swift, L., Bressler, S. N., Cabello, B., & Jawad, A. F. (2018). Group CBT for externalizing disorders in urban schools: Effect of training strategy on treatment fidelity and patient outcomes. Behavior Therapy, 49(4), 538–550. https://doi.org/10.1016/j.beth.2018.01.001
Eiraldi, R., McCurdy, B., Schwartz, B., Benjamin Wolk, C., Abraham, M., Jawad, A. F., Nastasi, B., & Mautone, J. (2019). Pilot study for the fidelity, acceptability and effectiveness of a PBIS program plus mental health supports in under-resourced urban schools. Psychology in the Schools, 56, 1230–1245. https://doi.org/10.1002/pits.22272
Eiraldi, R., Khanna, M. S., Jawad, A. F., Power, T. J., Cacia, J., Cabello, B., Schwartz, B. S., Swift, L., McCurdy, B., Keiffer, J., Kanine, R., Orapallo, A., McCurdy, B., & Mautone, J. A. (2020). Implementation of targeted mental health interventions in urban schools: Preliminary findings on the impact of training strategy on program fidelity. Evidence-Based Practice in Child and Adolescent Mental Health, 5(4), 437–451. https://doi.org/10.1080/23794925.2020.1784056
Eiraldi, R., McCurdy, B. L., Khanna, M. S., Goldstein, J., Comly, R., Francisco, J., Rutherford, L. E., Wilson, T., Henson, K., Farmer, T., & Jawad, A. F. (2022). Development and evaluation of a remote training strategy for the implementation of mental health evidence-based practices in rural schools: Pilot study protocol. Pilot and Feasibility Studies, 8, 128. https://doi.org/10.1186/s40814-022-01082-4
Ezekiel, N., Malik, C., Neylon, K., Gordon, S., Lutterman, T., & Sims, B. (2021). Improving behavioral health services for individuals with SMI in rural and remote communities. Washington, DC: American Psychiatric Association for the Substance Abuse and Mental Health Services Administration.
Fischer, A. J., Dart, E. H., Hartman, K. L., Steeves, R. O., & Gresham, F. M. (2016). An investigation of the acceptability of videoconferencing within a school-based behavioral consultation framework. Psychology in the Schools, 53(3), 240–252. https://doi.org/10.1002/pits.21900
Forman, S. G., Shapiro, E. S., Codding, R. S., Gonzalez, J. E., Reddy, L. A., Rosenfield, S. A., Sanetti, L. M., & Stoiber, K. C. (2013). Implementation science and school psychology. School Psychology Quarterly, 28, 77–100. https://doi.org/10.1037/spq0000019
Foster, S. (2005). School mental health services in the United States, 2002–2003. Rockville: Center for Mental Health Services, Substance Abuse and Mental Health Administration.
Goodman, E., Kuniavsky, M., & Moed, A. (2012). Observing the user experience: A practitioner’s guide to user research (2nd ed.). Morgan Kaufmann.
Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough?: An experiment with data saturation and variability. Field Methods, 18(1), 59–82. https://doi.org/10.1177/1525822X05279903
Hanington, B., & Martin, B. (2012). Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Rockport Publishers.
Harmon, H. L., Gordanier, J., Henry, L., & George, A. (2007). Changing teaching practices in rural schools. The Rural Educator, 28(2), 8–12. https://doi.org/10.35608/ruraled.v28i2.480
Harvey, L. (2015). Beyond member-checking: A dialogic approach to the research interview. International Journal of Research and Method in Education, 38(1), 23–38. https://doi.org/10.1080/1743727X.2014.914487
Hawken, L. S., Adolphson, S. L., McLeod, K. S., & Schumann, J. (2009). Secondary-tier interventions and supports. In W. Saylor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 395–420). Springer.
Hawken, L. S., Bundock, K., Kladis, K., O’Keeffe, B., & Barrett, C. A. (2014). Systematic review of the check-in, check-out intervention for students at risk for emotional and behavioral disorders. Education and Treatment of Children, 37(4), 635–658. https://doi.org/10.1353/etc.2014.0030
Heinrichs, N., Bertram, H., Kuschel, A., & Hahlweg, K. (2005). Parent recruitment and retention in a universal prevention program for child behavior and emotional problems: barriers to research and program participation. Prevention Science, 6(4), 275–286. https://doi.org/10.1007/s11121-005-0006-1
Herschell, A. D., Kolko, D. J., Baumann, B. L., & Davis, A. C. (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30(4), 448–466. https://doi.org/10.1016/j.cpr.2010.02.005
Herschell, A. D., Schake, P. L., Hutchison, S. L., Karpov, I. O., Gavin, J. G., Crisan, T. B., & Wasilchak, D. S. (2021). Evaluating the effectiveness of a statewide school-based behavioral health program for rural and urban elementary-aged students. School Mental Health: A Multidisciplinary Research and Practice Journal, 13(4), 743–755. https://doi.org/10.1007/s12310-021-09441-x
Hicks, T. B., Shahidullah, J. D., Carlson, J. S., & Palejwala, M. H. (2014). Nationally certified school psychologists’ use and reported barriers to using evidence-based interventions in schools: The influence of graduate program training and education. School Psychology Quarterly, 29(4), 469–487. https://doi.org/10.1037/spq0000059
Hoover, S. A., & Mayworm, A. M. (2017). The benefits of school mental health. In K. D. Michael & J. P. Jameson (Eds.), Handbook of rural school mental health (pp. 3–16). Springer.
International Organization for Standardization. (1998). ISO 9241-11: Ergonomic requirements for office work with visual display terminals (VDTs), Part 11: Guidance on usability.
Kelleher, K. J., & Gardner, W. (2017). Out of sight, out of mind - behavioral and developmental care for rural children. New England Journal of Medicine, 376(14), 1301–1303. https://doi.org/10.1056/NEJMp1700713
Kern, L., Evans, S. W., & Lewis, T. J. (2011). Description of an iterative process for intervention development. Education and Treatment of Children, 34(4), 593–617. https://doi.org/10.1353/etc.2011.0037
Kern, L., Mathur, S. R., Albrecht, S. F., Poland, P., Rozalski, M., & Skiba, R. J. (2017). The need for school-based mental health services and recommendations for implementation. School Mental Health, 9, 205–217. https://doi.org/10.1007/s12310-017-9216-5
Khanna, M. S., Eiraldi, R., Schwartz, B., & Kendall, P. C. (2016). CBT for anxiety treatment in schools. Unpublished manuscript.
King, H. C., Bloomfield, B. S., Wu, S., & Fischer, A. J. (2021). A systematic review of school teleconsultation: Implications for research and practice. School Psychology Review. https://doi.org/10.1080/2372966X.2021.1894478
Kirchner, J. E., Waltz, T. J., Powell, B. J., Smith, J. L., & Proctor, E. K. (2018). Implementation strategies. In R. C. Browson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (2nd ed.). Oxford University Press.
Larson, E. H., Patterson, D. G., Garberson, L. A., & Andrilla, C. H. A. (2016). Supply and distribution of the behavioral health workforce in rural America (Data Brief #160). Seattle, WA: WWAMI Rural Health Research Center, University of Washington.
Lee, S. W., Lohmeier, J. H., Niileksela, C., & Oeth, J. (2009). Rural schools’ mental health needs: Educators’ perceptions of mental health needs and services in rural schools. Journal of Rural Mental Health, 33(1), 26–31. https://doi.org/10.1037/h0095970
Lewis, C. C., Fischer, S., Weiner, B. J., Stanick, C., Kim, M., & Martinez, R. G. (2015). Outcomes for implementation science: An enhanced systematic review of instruments using evidence-based rating criteria. Implementation Science, 10, 155. https://doi.org/10.1186/s13012-015-0342-x
Lochman, J. E., Wells, K. C., & Lenhart, L. (2008). Coping power child group program: Facilitator guide. Oxford University Press.
Lyon, A. R., Dopp, A. R., Brewer, S. K., Kientz, J. A., & Munson, S. A. (2020). Designing the future of children’s mental health services. Administration and Policy in Mental Health, 47(5), 735–751. https://doi.org/10.1007/s10488-020-01038-x
Lyon, A. R., & Koerner, K. (2016). User-centered design for psychosocial intervention development and implementation. Clinical Psychology (new York), 23(2), 180–200. https://doi.org/10.1111/cpsp.12154
McDaniel, S. C., & Bloomfield, B. S. (2020). School-wide positive behavior support telecoaching in a rural district. Journal of Educational Technology Systems, 48, 335–355. https://doi.org/10.1177/0047239519886283
McDaniel, S. C., Bloomfield, B. S., Guyotte, K. W., Shannon, T. M., & Byrd, D. H. (2020). Telecoaching to support schoolwide positive behavior interventions and supports in rural schools. Journal of Education for Students Placed at Risk (JESPAR), 26(3), 236–252. https://doi.org/10.1080/10824669.2020.1834395
Mohr, D. C., Lyon, A. R., Lattie, E. G., Reddy, M., & Schueller, S. M. (2017). Accelerating digital mental health research from early design and creation to successful implementation and sustainment. Journal of Medical Internet Research, 19(5), e153. https://doi.org/10.2196/jmir.7725
Moon, J., Williford, A., & Mendenhall, A. (2017). Educators’ perceptions of youth mental health: Implications for training and the promotion of mental health services in schools. Children and Youth Services Review, 73, 384–391. https://doi.org/10.1016/j.childyouth.2017.01.006
Moore, A., Stapley, E., Hayes, D., Town, R., & Deighton, J. (2022). Barriers and facilitators to sustaining school-based mental health and wellbeing interventions: A systematic review. International Journal of Environmental Research in Public Health. https://doi.org/10.3390/ijerph19063587
Newkirk, V., & Damico, A. (2014). The Affordable Care Act and insurance coverage in rural areas. Kaiser Family Foundation.
Olson, J. R., Lucy, M., Kellogg, M. A., Schmitz, K., Berntson, T., Stuber, J., & Bruns, E. J. (2021). What happens when training goes virtual? Adapting training and technical assistance for the school mental health workforce in response to COVID-19. School Mental Health, 13, 160–173. https://doi.org/10.1007/s12310-020-09401-x
Owens, J. S., Richerson, L., Murphy, C. E., Jageleweski, A., & Rossi, L. (2007). The parent perspective: Informing the cultural sensitivity of parenting programs in rural communities. Child and Youth Care Forum, 36(5–6), 179–194. https://doi.org/10.1007/s10566-007-9041-3
Owens, J. S., Watabe, Y., & Michael, K. D. (2013). Culturally responsive school mental health in rural communities. In C. S. Clauss-Ehlers, Z. Serpell, & M. D. Weist (Eds.), Handbook of culturally responsive school mental health: Advancing research, training, practice, and policy (pp. 31–42). Springer.
Owens, P. L., Hoagwood, K., Horwitz, S. M., Leaf, P. J., Poduska, J. M., Kellam, S. G., & Ialongo, N. S. (2002). Barriers to children’s mental health services. Journal of the American Academy of Child and Adolescent Psychiatry, 41(6), 731–738. https://doi.org/10.1097/00004583-200206000-00013
Palinkas, L. A., Aarons, G. A., Horwitz, S., Chamberlain, P., Hurlburt, M., & Landsverk, J. (2011a). Mixed method designs in implementation research. Administration and Policy in Mental Health, 38(1), 44–53. https://doi.org/10.1007/s10488-010-0314-z
Palinkas, L. A., Horwitz, S. M., Chamberlain, P., Hurlburt, M. S., & Landsverk, J. (2011b). Mixed-methods designs in mental health services research: A review. Psychiatric Services, 62(3), 255–263. https://doi.org/10.1176/appi.ps.62.3.255
Parmanto, B., Lewis, A. N., Graham, K. M., & Bertolet, M. H. (2016). Development of the telehealth usability questionnaire (TUQ). International Journal of Telerehabilitation, 8(1), 3–10. https://doi.org/10.5195/ijt.2016.6196
Paulson, L. R., Casile, W. J., & Jones, D. (2015). Tech it out: Implementing an online peer consultation network for rural mental health professionals. Journal of Rural Mental Health, 39(3–4), 125–136. https://doi.org/10.1037/rmh0000034
Polaha, J., Williams, S. L., Heflinger, C. A., & Studts, C. R. (2015). The perceived stigma of mental health services among rural parents of children with psychosocial concerns. Journal of Pediatric Psychology, 40(10), 1095–1104. https://doi.org/10.1093/jpepsy/jsv054
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., Griffey, R., & Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health. https://doi.org/10.1007/s10488-010-0319-7
QSR International Pty Ltd. (2020). NVivo (released March 2020) [Computer software]. https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/home
Regehr, C., Stern, S., & Shlonsky, A. (2007). Operationalizing evidence-based practice: The development of an institute for evidence-based social work. Research on Social Work Practice, 17(3), 408–416. https://doi.org/10.1177/1049731506293561
Robinson, L. R., Holbrook, J. R., Bitsko, R. H., Hartwig, S. A., Kaminski, J. W., Ghandour, R. M., Peacock, G., Heggs, A., & Boyle, C. A. (2017). Differences in health care, family, and community factors associated with mental behavioral and developmental disorders among children aged 2–8 years in rural and urban areas—United States, 2011–2012. MMWR Surveillance Summary, 66, 1–11. https://doi.org/10.15585/mmwr.ss6608a1
Shernoff, E. S., Bearman, S. K., Kratochwill, T. R., & Eckert, T. (2017). Training the next generation of school psychologists to deliver evidence-based mental health practices: Current challenges and future directions. School Psychology Review, 46(2), 219–232. https://doi.org/10.17105/SPR-2015-0118.V46-2
Sholomskas, D. E., Syracuse-Siewert, G., Rounsaville, B. J., Ball, S. A., Nuro, K. F., & Carroll, K. M. (2005). We don’t train in vain: A dissemination trial of three strategies of training clinicians in cognitive-behavioral therapy. Journal of Consulting Clinical Psychology, 73(1), 106–115. https://doi.org/10.1037/0022-006X.73.1.106
Siceloff, E. R., Barnes-Young, C., Massey, C., Yell, M., & Weist, M. D. (2017). Building policy support for school mental health in rural areas. In K. D. Michael & J. P. Jameson (Eds.), Handbook of rural school mental health (pp. 17–33). Springer.
Slade, E. P. (2003). The relationship between school characteristics and the availability of mental health and related health services in middle and high schools in the United States. Journal of Behavioral Health Services Research, 30(4), 382–392. https://doi.org/10.1007/BF02287426
Stephan, S. H., Weist, M., Kataoka, S., Adelsheim, S., & Mills, C. (2007). Transformation of children’s mental health services: The role of school mental health. Psychiatric Services, 58(10), 1330–1338. https://doi.org/10.1176/ps.2007.58.10.1330
Sugai, G., & Horner, R. (2009). Defining and describing school wide positive behavior support. In W. Sailor, G. Dunlap, G. Sugai, & R. Horner (Eds.), Handbook of positive behavior support (pp. 307–326). Springer.
Wagenfeld, M. O. (2003). A snapshot of rural and frontier America. In B. H. Stamm (Ed.), Rural behavioral health care: An interdisciplinary guide (pp. 33–40). American Psychological Association.
Walker, J. S., & Baird, C. (2019). Using “remote” training and coaching to increase providers’ skills for working effectively with older youth and young adults with serious mental health conditions. Children and Youth Services Review, 100, 119–128. https://doi.org/10.1016/j.childyouth.2019.02.040
Weiner, B. J., Lewis, C. C., Stanick, C., Powell, B. J., Dorsey, C. N., Clary, A. S., Boynton, M. H., & Halko, H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12, 108. https://doi.org/10.1186/s13012-017-0635-3
Wilger, S. (2015). Special considerations for mental health services in rural schools. Washington, DC: Substance Abuse and Mental Health Services Administration (SAMHSA).
Wilson, W., Bangs, A., & Hatting, T. (2015). The future of rural behavioral health (National Rural Health Association Policy Brief). Washington, DC: NRHA.
Zyromski, B., Dimmitt, C., Mariani, M., & Griffith, C. (2018). Evidence-based school counseling: Models for integrated practice and school counselor education. Professional School Counseling, 21(1), 1–12. https://doi.org/10.1177/2156759X18801847
Acknowledgements
We acknowledge the effort and commitment of the busy behavioral health staff who made this study possible.
Funding
Funding for the study was provided by the Department of Pediatrics, University of Pennsylvania.
Ethics declarations
Conflict of interest
The authors declare no financial or non-financial conflicts of interest related to this study.
Cite this article
Eiraldi, R., Comly, R., Goldstein, J. et al. Development of an Online Training Platform and Implementation Strategy for School-Based Mental Health Professionals in Rural Elementary Schools: A Mixed-Methods Study. School Mental Health 15, 692–709 (2023). https://doi.org/10.1007/s12310-023-09582-1