Abstract
Reading screenings are an essential element of a preventative model of education. Early language and literacy screenings can identify students at risk of later reading difficulties. This pilot study investigated the feasibility and impact of a community-based organization providing free language and literacy screenings using an application-based screener with largely automated scoring. The community organization paired screening results with parent education on language and literacy acquisition and evidence-based instructional practices tailored to the students’ identified risks. The mixed methods utilized survey data from parents/caregivers (n = 19) and volunteer screeners (n = 8) and interviews of community partners (n = 2), volunteers (n = 2), and parents (n = 2). Results of the pilot met the feasibility and impact goals. Community partners felt it was important to provide access to screening. Volunteer screeners reported that the screening application was easy to administer and that children were engaged throughout the screening. Parents reported that the screening results and parent education significantly impacted their decision-making for their child(ren).
While reading has always been a critical skill, the demands of today’s information society exponentially increase the need for all adults to have language and literacy skills. Students who experience reading difficulties are at higher risk for several negative outcomes, including academic failure and various adverse mental health outcomes (Arnold et al., 2005). Despite the powerful outcome data, United States reading proficiency rates as measured by the National Assessment of Educational Progress (NAEP) indicate that about 65% of fourth- and eighth-grade readers fall “at Basic” or “Below Basic” (National Center for Education Statistics [NCES], 2019, 2022). Current approaches to reform have produced little change since data collection began (NCES, 1992).
Reading disabilities account for 75% of all students classified under the Individuals with Disabilities Education Act’s (IDEA) Specific Learning Disability category, which made up 32% of all students in special education in 2021–2022 (NCES, 2023). Wide variance in the definition of dyslexia has long persisted (Shaywitz & Shaywitz, 2020), and this variability has impacted operationalization and identification efforts (Miciak & Fletcher, 2020). Dyslexia initially appeared in federal legislation in the Education for All Handicapped Children Act of 1975 under the umbrella term of “Learning Disabilities.” Other definitions of dyslexia, such as those in the Diagnostic and Statistical Manual of Mental Disorders (5th ed.; DSM–5; American Psychiatric Association, 2013) and the Individuals with Disabilities Education Act of 2004 (IDEA, 2004), have been criticized for several reasons, including bias in the identification of minority students (Shaywitz & Shaywitz, 2020) and conflating dyslexia with difficulties in comprehension despite strong evidence to the contrary (Nation & Snowling, 1998). More recently, dyslexia was codified into United States federal legislation in the First Step Act of 2018 (P.L. 115–391, 2018). This definition captures the unexpected nature of dyslexia: despite normal aptitude for reading, it impairs reading and spelling abilities and stems from difficulty in phonological processing (Lyon et al., 2003; P.L. 115–391, 2018).
Estimated to affect 5% to 17% of the population (Grigorenko et al., 2020), dyslexia is believed to have a multifactorial causal nature based on a confluence of neurological, behavioral, and environmental factors (van Bergen et al., 2014). The interaction and combination of these factors and the resulting heterogeneity of presentation have led some to argue that the term dyslexia should be thought of not as an underlying condition of a specific category of reading disabilities but rather as a synonym for reading disability (Elliott, 2020; Lopes et al., 2020; Protopapas, 2019).
Identification of Reading Difficulties
Based on operational guidance following the inclusion of dyslexia as a learning disability in 1975, identification of reading disabilities, including dyslexia, has relied on a discrepancy model. This qualification method suggested that students qualify based in part on a “severe discrepancy between the achievement and intellectual ability” (Assistance to States for Education for Handicapped Children: Procedures for Evaluating Specific Learning Disabilities, 1977, p. 65082). This approach, though widely used (Lopes et al., 2020; Mercer et al., 1996), left specification of the eligibility criteria and procedures for identification to the states, resulting in wide variability in identification prevalence (Frankenberger & Harper, 1985), with discrepancy criteria often set between 1 and 2 standard deviations (Ozernov-Palchik et al., 2017). The approach has received growing criticism over its validity (see Francis et al., 2005, for a review) as well as its reliability (Shaywitz et al., 1992). Critically, this approach, known as the “wait-to-fail” method (Flowers et al., 2001), does not reliably identify younger students, specifically those in kindergarten and first grade. Identification frequently occurs in second grade and beyond (Ozernov-Palchik & Gaab, 2016), which is problematic for a number of reasons. Interventions for reading have been documented to be maximally effective in kindergarten and first grade, likely due to the heightened plasticity of the brain (Stanley et al., 2018; Wanzek et al., 2018). Delays in the implementation of instruction, both explicit core instruction and remedial instruction, have led to additional time required to close the gap relative to age-appropriate reading skills (Connor et al., 2013; Lovett et al., 2017). Students who are weak readers at the age of nine are unlikely to catch up to their same-age peers (Shaywitz et al., 1999) and experience compounding effects of diminished vocabulary and knowledge acquisition (Quinn et al., 2020).
Ozernov-Palchik and Gaab (2016) termed this phenomenon, in which dyslexia identification typically occurs after the window for most effective intervention has passed, the “Dyslexia Paradox.” In addition to the academic impact, these students endure negative emotional strain as a result of delayed identification and effective instructional remediation (Gibson & Kendall, 2010). In response to extant research and calls from the field, the 2004 reauthorization of IDEA mandated that schools may not require the use of a “severe discrepancy” between intellectual ability and achievement as part of identification and must permit the use of a response-to-intervention process (IDEA, 2004).
The Response to Intervention and Instruction (RTII) model is a preventative model also known as a Multi-Tiered System of Supports (MTSS). RTII and MTSS are frameworks that incorporate evidence-based class-wide literacy instructional programs, universal screening to identify risk of later reading difficulties, tiers of instruction and intervention with increasing intensity, and frequent progress monitoring (Otaiba & Kim, 2022). Universal screening is advised to occur at minimum twice a year for all students (Gersten et al., 2008). The scope of data collection needed within universal screening necessitates measurement tools that are efficient, reliable, repeatable, and cost-effective. While screening measures alone are not adequate for a diagnosis of dyslexia, the use of rapid, reliable, and valid screening measures has wide support among literacy experts, as it potentially reduces the time students wait for intervention (Petscher et al., 2019). The early identification of reading difficulty has been demonstrated to reduce the incidence of reading disability from 12%–18% to 1.4%–5.4% (Foorman et al., 1998).
Reading Screening
Screening is a brief evaluation to identify the risk of performing below a specific threshold at a specific time (Petscher et al., 2019). Screening for dyslexia has a long history dating back over 50 years (Jansky & De Hirsch, 1972) and was encouraged by Senate Resolution 680 in 2018 (S Res 680, 115 Cong, 2018). Screening is both time-efficient and inexpensive and is meant to assess all students (Petscher et al., 2019). Screening measures often focus on identified risk factors, including phonological awareness, letter knowledge, rapid naming, oral language, and family history. Given the reduced effectiveness of reading intervention at later ages and the extensive negative outcomes associated with reading disabilities, early screening is widely supported (Petscher et al., 2019; Torgesen et al., 1999). Experts recommend that screening occurs at a minimum in kindergarten through third grade. Ideally, screening would begin in preschool and continue into middle and high school (Petscher et al., 2019). Early language and literacy screenings have an explicit connection to the science of behavior by examining human behavior through data collection on targeted tasks such as curriculum-based measures (Deno & Mirkin, 1977). This is echoed in a Science of Reading-aligned approach to literacy where it has been shown that targeted, explicit, and systematic instruction of specific foundational skills has a critical role in improving literacy achievement (Castles et al., 2018).
Challenges to Screening
Any screening before kindergarten presents the challenge of both reaching students who have not yet entered the K–12 public school system and locating and training adults to conduct the screenings (as students may or may not be enrolled in schools). Some children ages 3–5 years attend preschools in a variety of settings (e.g., public, private, Head Start) that are funded by a variety of entities (e.g., state and federal departments of education and private pay). Furthermore, these programs follow the guidelines of diverse accrediting bodies, including state and federal agencies and professional organizations such as the National Association for the Education of Young Children (NAEYC). Educators working in these settings vary in their training and expertise (NAEYC, 2020).
A secondary issue is the variation in how stakeholders interpret and enact statewide policies, resulting in variable outcomes. As a result, programs adopt and implement (or do not implement) different strategies and tools to effectively screen and properly identify students with detectable concerns (Odegard et al., 2020). Without adherence to an evidence-based framework implemented with fidelity, educators are at risk of promoting practices that have negative outcomes for children (Van Norman et al., 2020). Furthermore, educators require professional development to understand assessment tools and data-based decision making. For screening to be effective, districts need to ensure that the tools have acceptable reliability, validity, and classification accuracy (Petscher et al., 2019). Finally, not all states require universal screening for language and literacy delays (Heubeck, 2023; National Center on Improving Literacy, 2023).
Parent Involvement: Advocacy and Engagement
Recently, the persistence of low proficiency rates (NCES, 2019, 2022), along with a growing movement to address instructional inadequacies for early reading instruction and parent advocacy (Hanford, 2017; The Reading League [TRL], 2023), has led to a flurry of state and federal legislation focused on improving literacy outcomes and dyslexia identification. As a result of parent advocacy, a host of dyslexia-specific legislation has been passed in 48 states (National Center on Improving Literacy [NCIL], 2023). At the time this paper was written, nine states, including Pennsylvania, where the current study is situated, did not require dyslexia screening in their educational codes (Heubeck, 2023; NCIL, 2023). The rapid increase of dyslexia legislation is attributed to grassroots parent advocacy efforts resulting from the difficulties parents experienced while advocating for services for their own child(ren) (Ward-Lonergan & Duthie, 2018; Youman & Mather, 2015).
Active Parent Engagement
Traditional parent involvement typically includes activities parents and families can engage in that support the schools’ agenda and priorities or, as Ferlazzo (2011) describes, schools “telling parents how they can contribute” (p. 12). In contrast to parent involvement, parental engagement is characterized by an equitable relationship that “enables parents to actively contribute their experiences, insights, and knowledge in ways that benefit children’s learning” (Baxter & Kilderry, 2022, p. 2). Parental engagement, mandated in the 2015 reauthorization P.L. 114-95 (Every Student Succeeds Act), extends parental participation beyond the limited role of school supporter, home tutor, and audience (Williams & Chavkin, 1984). The dimension of community organizing in parent engagement focuses on developing community capacity alongside and interconnected with school improvement through public accountability (Gold et al., 2004). Engaged parents and caregivers, leveraging social and cultural capital, can be effective not only in advocating for legislation but also in collaborating with educators on designing and implementing interventions.
Our Context: Community-Based Literacy Reform
Members of a community-based organization sought to meet the need for early language and literacy screeners in Pennsylvania. The organization was formed in 2017 as a response to the experience of parents whose efforts to advocate for their child’s language and literacy needs were unmet by their school district. The organization initially focused on a single suburban district with the mission of “all students reading to the best of their potential with as little emotional impact as possible” (Everyone Reads PA, 2023, https://www.pareads.org/).
Early initiatives focused on identifying evidence-based professional development for the local school district and family-specific advocacy for identifying reading deficits and evidence-based instruction. The need quickly spread beyond the confines of the school district, and demand rose sharply during the instructional disruptions of COVID-19. As a result, the group expanded its reach across the state and established 501(c)(3) status. The priorities of the group were (a) increasing parent knowledge of evidence-based resources on reading acquisition, development, and instruction, (b) effective advocacy strategies for the identification of reading deficits, and (c) implementation of evidence-based instruction and intervention. During the summer of 2021, the community-based organization provided literacy screenings to families using an adaptive, gamified, mobile app-based screener. The screening application (Gaab & Petscher, 2021) is based on consensus reports of predictors of reading success, including phonemic awareness (National Institute of Child Health and Human Development [NICHD], 2000; Rayner et al., 2001). The screening application is designed to be self-guided with supervision. The student engages with an interactive character who leads them through a series of eight tasks related to early literacy skills that identify students at risk for reading challenges or dyslexia. Administration takes between 20 and 50 minutes per child. The scoring of the screening application is largely automated, alleviating the need for knowledgeable administrators. The screening application assessed established risk domains for later reading difficulties, including phonological and phonemic awareness, oral language, sound–symbol knowledge, and rapid automatized naming. The published technical manual for the screening application reports high marginal reliability ranging from .85 (Letter Naming) to .99 (Phonological Awareness Blending) (Gaab & Petscher, 2021).
Predictive validity scores, reported in the technical manual, were acceptable ranging from .61 (Dyslexia Risk) to .67 (Word Reading Success) (Gaab & Petscher, 2021). Additionally, the tool reports high classification accuracy ranging from .85 (Fall to Winter) to .88 (Spring) (Gaab & Petscher, 2021).
Over the course of the 2021–22 school year, the organization administered 65 kindergarten screenings. After administering the literacy screenings using the application, community volunteers met with 46 families to discuss results and to educate parents and families on evidence-based reading acquisition and development, their child’s literacy profile based on the screening application’s results, and resources for home-based instruction, as well as contacts and next steps for families whose child was identified as “at risk” for either moderate word-level reading difficulties or dyslexia. A total of 26 students were identified as at risk for reading failure, and 11 children were identified as “at risk” for dyslexia.
The Current Study
The current pilot study investigated the community-based organization’s (a 501(c)(3)) efforts to provide free language and literacy screenings through the Early Language and Literacy Screening (ELLS) initiative to communities and to use the results to foster active parent participation and advocacy for students at risk for language and literacy disabilities. To support this effort, the community-based organization purchased and donated licenses for the literacy screening to the community partners. The research questions guiding the study were: (1) What is the feasibility of a community-based organization’s initiative to provide language and literacy screenings at no cost to the community? and (2) Did the screenings and the initiative’s education efforts impact participants?
Method
Evaluation Design
After securing IRB approval, we utilized a mixed-methods design to answer the two research questions (Greene et al., 1989). “In a complementarity mixed-method study, qualitative and quantitative methods are used to measure overlapping of a phenomenon, yielding an enriched, elaborated understanding of that phenomenon” (Greene et al., 1989, p. 257). We collected three forms of data: survey data (quantitative), interview data (qualitative), and programmatic data focused on the implementation timeline and program expansion, which we analyzed for context (see Fig. 1).
Participants
Working with a community-based organization that sponsored the Early Language and Literacy Screening (ELLS) initiative, we targeted participants from three key stakeholder groups receiving or participating in ELLS: Parents and Caregivers, Volunteer Screeners, and Community Partners. Parents/Caregivers (P/C) were individuals whose children had been screened using the screening application. Volunteer Screeners (VS) were individuals who had administered the screenings. Community Partners (CP) were defined as individuals who represented larger organizations, specifically early childhood programs, who committed to adopting the program and serving as a community-based site for screenings and parent education.
All P/Cs, VSs, and CPs were known to authors 1 and 3. We used purposive sampling and reached out to participants representing each of these stakeholder groups who had direct, and in many cases extensive, experience with the community-based organization’s ELLS initiative (Palinkas et al., 2015).
Survey Participants
Author 3 emailed the specific survey link to each P/C who had registered for a screening for their child (N = 77) and to each VS (N = 8) who had participated in the program. Thirty-one P/Cs did not open the original email, so a follow-up email was sent to that group the next week. In total, 19 P/C surveys were completed, a 25% response rate (Hoyle, 1999). Of the VS group, all eight recipients opened and completed the surveys.
Of the P/C participants (n = 19), all had children who had been screened at least once (six children had more than one screening). All P/Cs had completed at least a bachelor’s degree, eight (42.1%) had completed a master’s degree, and one (5.3%) had completed a doctoral degree (see Table 2). Most (84%) of the participants had children enrolled in public schools, while 16% had children enrolled in private schools. All survey participants were from or served children in suburban school districts.
Interview Participants
All survey participants answered a final survey question indicating whether they would be willing to participate in an interview and, if so, supplied their contact information. Author 1 emailed and/or called survey respondents from the parent/caregiver and community volunteer groups who indicated that they would be willing to do a follow-up interview. As the program had two community partners at the time, Author 1 contacted those organizations to request interview participants. She conducted interviews with two members of each participant group (n = 6) (Table 1). It should be noted that one of the VSs also qualified as a P/C because their child had been screened through the program. This person completed the survey and discussed the screening application and education efforts from both perspectives.
Tools
Survey
We designed two brief surveys, one for P/Cs and one for VSs, using the Qualtrics platform. The P/C survey collected demographic data and asked five closed and three open-ended questions to determine reasons for seeking out the screening, the impact of the screening results, and how P/Cs had or planned to use the results. An example of an open-ended question was: How would you describe your child’s school or district’s use/reaction to your concerns/information? An example of a question with a Likert response option was: To what degree did the information you learned about reading acquisition and your child’s reading profile impact your family?
The survey for VSs also assessed participant demographics as well as participants’ perceptions of administering the screener. Nine of the questions were closed (either Likert response or multiple choice) and three were open-ended. The open-ended questions focused on the administration of the screening and the likelihood of using the screening tool in the future. An example of a question with a Likert response option was: How likely are you to recommend that other community organizations provide EarlyBird Screeners? An example of an open-ended question was: What would you recommend to improve this initiative?
Interviews
Similar to the survey, we developed separate interview protocols for each of the different stakeholder groups. Each protocol was intended to collect data on the same topics as the survey, but our goal was to gather more detailed and comprehensive data. Examples of questions asked included:
For P/C interviews: Did the information and screening results impact how you approached your child’s school or district regarding their risk for reading or language concerns?
For VS interviews: What do you see as the potential for using this screening application in the children that you’re using it with and in the community that you’re working in?
For CP interviews: Were there any obstacles that you would identify for other communities to utilize the screening application to help their communities identify students at risk for language and learning difficulties?
Data Collection
Author 1 conducted all of the interviews. The interviews were semi-structured and guided by the respective protocols. Each interview was conducted via phone, lasted approximately 30 minutes, and was audio recorded in full. Author 1 also took notes during each interview and asked participants to extend and elaborate on their responses in order to add depth to our understanding of the screening process and of ways to improve and/or expand its use.
Data Analysis
We used a sequential mixed-methods data analysis and examined the quantitative data prior to analyzing the qualitative data (Palinkas et al., 2015).
Survey Data
Survey data were compiled and analyzed both separately and in the context of the qualitative analysis. Survey data were primarily nominal, so we generated means and examined trends (see Table 2). Survey open-question responses were used to identify potential follow-up questions for the interviews.
Interview Data
Each interview was transcribed verbatim by the first author. These transcripts were sent to the participants to check for accuracy, then shared (with any identifying information removed) with the other authors. We utilized thematic analysis to determine themes and codes for the qualitative data. In the first round of deductive coding, we used the research questions as a guide to identify the frequency of trends in the interview transcripts (Roberts et al., 2019). We worked together to refine the preliminary codes and develop a primary and secondary code book. No transcripts were identified as outliers. Using these refined codes, we each analyzed the transcripts a second time. During this second round of coding, we independently used an inductive approach to determine themes (both expected and unexpected) and to refine the code book (Roberts et al., 2019). We discussed our findings and reached consensus when discrepancies or differences arose (Campbell et al., 2013).
Trustworthiness
All of the participants were known to Authors 1 and 3, and all interviews were conducted by Author 1. The participants’ familiarity with the authors may have affected trustworthiness in two ways. On one hand, the level of trust between Author 1 and the participants could have elicited more in-depth responses and allowed greater probing of responses during the interviews. On the other hand, participant responses may have leaned more positive in an effort to please Authors 1 and 3, who have a vested interest in the success of the program.
Several measures were taken to maintain trustworthiness. First, all of the authors were involved in developing the survey and interview protocols to help limit bias. Two of the authors did not participate in the interviews, which likely supported greater fidelity to the transcripts as written and a more objective orientation to the data. In addition, each transcript was emailed to the interview participants to check for accuracy. Finally, the codes were examined in the context of the survey data and programmatic information to determine commonalities and identify any outliers.
Researcher Positionality
The first and third authors are both experts in literacy interventions for young children. They worked with the community to develop and launch the Early Language and Literacy Screening Initiative (ELLSI) as a way of addressing what they saw as a need for the early detection of reading difficulties. Author 3 recruited and trained the initial volunteer screeners (after the initial group, a snowball process was used for recruitment and training). Author 2’s expertise is in the areas of early childhood special education, parent outreach, and inter-professional collaboration.
Findings
Findings are presented for the multi-year program development and for the impact of the EarlyBird screening application as a community-based intervention.
The Development of a Community-Based Screening Model
As shown in Table 3, planning for a community-based screening initiative began prior to 2021. Over the next two years, community-based intervention activities expanded to reach more children and families as the initiative grew in size and scope. As part of program evaluation activities, survey and/or interview data were collected from all three stakeholder groups (P/Cs, VSs, CPs) to determine the feasibility of the model and to gather information about the screening tool and its usefulness.
Community Partners
Two CPs were interviewed. Results from the thematic analysis show that CPs focused on the importance of early screening and identification, the value of having a centralized place-based screening model, the ease of use of the screening application, and the model’s challenges related to cost, educator resistance to change and new resources, and access to technology. Both CPs stressed the importance of early detection and early intervention and highlighted the benefits of being able to centralize these efforts. One CP stated,
I just believe that getting early detection, just having all the information you can about kids and reading is really, really important. I really believe that all people should be able to read. When we started the kindergarten, it was me and [volunteer screener] who helped with it, but it was me and then the [program] that backed what we were doing. I think we could find kids that had any markers or any red flags for any learning or any reading issue.
CPs also expressed an appreciation for the ease of use of the screening application. Though participants were neither experts in literacy nor the screening application, both shared that children were engaged while using the screening application and that the results were relatively easy to interpret and share with families. Finally, while cost was a challenge in that it required financial support, both CPs stated that being able to screen such a large number of children was ultimately cost-effective. According to a CP,
It is extremely easy to administer, and the results are really clear and easy to understand. I’m not an expert at all in this kind of stuff and I was able to help kids take the screener, and then also, you know, after looking at it several times… even I could understand the results. It also wasn’t terrifically expensive, …we were able to screen a lot of kids and get a lot of information for those parents for a relatively low cost. So, I think it was a very efficient way to… get parents [the] information they need to help their kids.
Both CPs also noted educator resistance to trying and trusting the new screening application, as well as equitable access to technology (specifically iPads and stable internet), as additional challenges; nevertheless, they indicated that the ELLS model undoubtedly benefited children and families.
Volunteer Screeners
The ELLS model relied almost entirely on VSs, a group that has expanded via a train-the-trainer model. Both survey (n = 8) and interview (n = 2) data were collected from this group of participants. Survey data show that all VS participants would (a) recommend use of the screening application to other community-based organizations, (b) recommend that schools and districts adopt the screening application, and (c) be somewhat (n = 1) or very likely (n = 7) to recommend it to other VSs. In terms of administering the screener, all participants (n = 8) indicated that administration was easy to learn. Finally, all participants (n = 8) stated that they would volunteer their time again to administer the screening application.
During follow-up interviews, the two VSs echoed the same themes as the CPs, with two additional themes emerging. First, both VSs spoke about their eagerness to share the screening application with those who worked in the same or similar spaces to generate awareness about the initiative through a grassroots-type movement. According to one of the VSs,
I shared it with our reading specialists and actually, I even shared it with … we have a team of …eight to 10 reading specialists in the district that I’m in and I shared it with them, showing them the reports that it populates. Another reading specialist had even tried it with her own child to be able to kind of speak to it. We definitely talked about it.
VSs were not explicitly instructed to share information about the ELLS model with colleagues or peers outside of the initiative; however, it was interesting to note that both participants highlighted their excitement about sharing the resource with others.
VSs also identified an additional challenge: while the screening application itself was easy to administer and its results were easy to interpret, conducting the screening with individual children and then engaging families in one-on-one conversations about results, next steps, and advocacy was time-consuming. A VS shared,
One of the things that I questioned, [though] I did not get to this point because I didn’t give it to a whole group, would be finding the time… [you] need to be able to actually record each kiddo [for one of the screening application measures]. [It] would be [hard] finding the time and being able to facilitate doing a whole class, unless you have the manpower for someone to pull the kids out or do it in a quiet spot…I kind of was thinking that that would be an obstacle.
Due to the age of the children being screened, VSs needed to spend time overseeing the screening process. VSs also expressed concern over the time-consuming nature of sharing results and offering parent education to individual families. This feedback led to a programmatic change during Stage 2 of the initiative, at which time VSs began holding group meetings to conduct parent education on literacy acquisition and interpreting screening results (see Table 3).
The Impact of the Screening Application on Children and Families
To determine the impact of the screening application on children and families and to better understand how families moved forward with the information provided in the parent education piece of the initiative, survey (n = 19) and interview (n = 5) data were collected from this stakeholder group. A thematic analysis shows that the initiative validated the concerns of parents, led to an increase in knowledge around screening results and early literacy strategies, and helped parents better understand how to navigate early intervention systems and advocate for their children with local districts.
Concerns Are Validated
Survey data show that most families sought out language and literacy screenings due to concerns related to “child(ren’s) language or reading acquisition or present levels” (n = 13). Children having “a family history of language or reading difficulties” was the second most cited reason (n = 11), while the screening application being free and easy was third (n = 10). Follow-up interviews supported these findings. One parent, for example, stated:
With my daughter… I knew at a really early age that she had delayed speech. I want to say even when she was three, [her speech] wasn’t very clear. I took her to a speech pathologist because I could see how hard she was struggling with her word finding. When COVID hit, I was trying to work with her on our letters so that I knew she wouldn’t fall behind… We worked every day and she could not remember her letters and then I was noticing she was constantly confusing [them]… But with her, I saw the signs so early just because I was like, ‘Oh my god, this is what my [older] son was like!’ The same exact symptoms...that’s why I was able to catch it early.
Other interviewees shared that though they believed something was “off” or “different” about their children’s early language and literacy development, they were unable to get confirmation of their concerns through other avenues they previously sought out. It was not until children were screened via the EarlyBird screening application that parents had concrete evidence of and specific information about gaps in language and literacy acquisition.
Families Experience an Increase in Knowledge
In terms of the impact of the model, specifically the component in which VSs worked with families either individually or in a group setting to explain screening results and offer parent education, survey data showed that the majority of P/Cs (n = 15) rated these measures as having a “significant” impact. In response to an open-ended question on how participating in the initiative helped them, parents offered responses like, “It gave us direction on how to help our child at home,” and “We have learned a lot about how children learn to read. We are using this knowledge to ensure our youngest is learning to read in the right way.” All P/Cs surveyed gave the experience a five-star rating and shared that they either had, or would, recommend participating in future ELLS events to other families.
Families Are Supported Through Advocacy Work
P/Cs represented 11 different school districts within a large metropolitan region. Of the P/Cs surveyed, 16 children attended local public schools while three attended private school. All P/Cs indicated that they had either used, or planned to use, their child(ren)’s screening results to advocate on behalf of their child(ren). In follow-up interviews, participants explained how the screening results and parent education had helped them approach or interact with their local schools and districts. For example, one P/C shared, “It has helped us bridge the gap in communicating better with our school. This year, I was able to present a series of screener results to our new kindergarten teacher to jumpstart a conversation on ways to approach reading with my daughter.” Both survey and interview data support that parents shared their children’s screening results with classroom teachers and school-based personnel in order to educate teachers on each child’s specific needs (Table 4).
At the district-level, interviewees expressed frustration that their children were not receiving what the family perceived to be the proper services and support. The screening application’s results bolstered the family’s case for intervention, while the parent education provided by the community volunteers supported families as they navigated these key conversations with their districts. According to a parent:
She did not qualify for early intervention. However, the screening application showed significant discrepancies in my daughter’s learning profile and the need for intervention. It was explained to me [by community volunteers] why she needed more intensive services, which, when discussing with the school, they agreed upon…[The community-based organization provided me with] resources….to learn more about the science of reading, and also other services such as OT, which my daughter also qualified for, [and] gave me a better understanding of her needs. It is fair to say she would not be receiving the level of support [she is now] if not for the help from [the initiative].
Both survey and interview data show that the parent education offered by CVs included extensive support, giving families information and tips on how best to advocate for their children with districts in order to secure additional services and/or specialized interventions.
Discussion
It is well documented that early language and literacy screenings are a critical component of addressing the inadequate reading proficiency scores (Gersten et al., 2008; January & Klingbeil, 2020; Lonigan & Shanahan, 2009) that have persisted for nearly 30 years (NAEP, 2022). Despite broad consensus on this and widespread attention, additional work needs to occur to increase equitable access to early identification of reading difficulties. As a result of the variability of implementation among and within the 41 states that have enacted screening initiatives (Heubeck, 2023; NCIL, 2023), early evaluations of these efforts have not resulted in identification rates commensurate with researchers’ expectations (Dellinger, 2021; NCIL, 2023; Phillips & Odegard, 2017). Though early language and literacy screenings are only the first step of a complex and varied process for identification, the value of identifying risk for later reading difficulties (see Catts & Petscher, 2022) and of earlier provision of educational support underscores their importance.
The lower-than-expected identification rates are attributable to definitional and operational variability in screening targets, tools, time frames, and parent notification, among other factors (Gearin et al., 2022). Furthermore, state mandates are often left to district leadership to interpret, creating further variability. Studies have reported that school officials, including leadership, school psychologists, and teachers, would benefit from additional professional development on screening and identification and on evidence-based responses to screening outcomes (Otaiba et al., 2019; Sanfilippo et al., 2020; Schelbe et al., 2022).
As a result of these obstacles in school-based screening models, students’ reading deficits are often not identified until after the window of maximally impactful instruction has closed, meaning that these students are likely to remain poor readers at graduation (Wanzek & Vaughn, 2007). Reading difficulties remain misdiagnosed, underdiagnosed, or left untreated, with minority students and students from lower socioeconomic backgrounds disproportionately impacted (Gaab & Petscher, 2022).
Implementing a preventative approach effectively is highly complex and requires systemic efforts that include collaboration, awareness of contextual barriers, and targeted strategies to overcome these barriers (Goldstein & Olszewski, 2015). The research questions for this study investigated (1) the feasibility of a community-based organization’s initiative to provide language and literacy screenings at no cost to the community and (2) the impact of the screenings and the initiative’s education efforts on participants. The results of this pilot study supported the feasibility and impact of leveraging community-based organizations to provide reliable, valid, and accurate language and literacy screenings to kindergarten-age students in their communities.
Feasibility of Community-Based Screening
The CPs who participated in the study indicated that the provision of the screenings was both valuable and important to the communities they served. CPs highlighted the early identification of language and literacy difficulties as central to enabling community members to equitably access proficient reading. The study met the feasibility goals of the initiative: a total of 65 screenings were provided by eight VSs who did not possess specialized knowledge of language, literacy, or education. While the cost of the screening was a concern for one CP who works within a school district, another CP felt the $16.50 per screening was a “relatively low cost” and well worth the value of the results and the impact on children. Schools and school districts may seek to ameliorate the costs of screening by partnering with community-based organizations and exploring funding opportunities through state grants and private foundations targeting literacy.
The VSs reported that learning to administer the screenings, and the administration itself, was “easy,” and they indicated they would volunteer to administer them again. Additionally, the VSs found the screener to be engaging and “kid friendly.” However, as one VS stated, the length of the administration, particularly in the classroom context, may be an obstacle. Parent education and individual discussions of student results and potential next steps were also found to be time-consuming.
Impact of Screening and Parent Education
While time-consuming, the screening results and the education on language and literacy acquisition had a “significant” impact, and P/Cs reported that the education and information they received from the community organization helped guide next steps. As a result of the screening, P/Cs reported that their concerns regarding their child’s language and literacy development had been validated, which is consistent with other studies showing that parents become aware of their child’s difficulties as early as age four (Denton et al., 2022). Parents have also reported negative mental health outcomes, including self-doubt, anxiety, and stress, while waiting for identification through traditional pathways (Leitão et al., 2017).
When a child was found to be at risk, P/Cs reported taking next steps such as finding tutoring, providing evidence-based activities to foster language and literacy at home, or seeking additional evaluations. Research has indicated that parents can have a positive impact when utilizing direct and explicit teaching strategies with their children (Mitchell & Begeny, 2014). P/Cs also reported either using or planning to use the screening results in their advocacy for their child’s needs. This aligns with previous research supporting parent education and empowerment strategies to mitigate feelings of despair and the pressure to acquire specialized knowledge (Trainor, 2010). Overall, the early identification of risk for later reading difficulties enabled P/Cs to address the problem and mitigate later negative outcomes. The results of this study serve as a useful basis for continuing and expanding the initiative to a variety of communities, including urban districts and minoritized communities.
Implications
The results of this pilot study are important because they offer a viable pathway to the equitable provision of reliable, valid, and accurate language and literacy screenings at an early age, one that overcomes barriers hampering present school-based screening efforts. The results suggest that community-based screenings are a viable pathway in a multipronged approach to a preventative education model. While the sample was small and highly educated, the community organization’s education and screening results made P/Cs feel better equipped to navigate their child’s language and literacy needs and to advocate for them in complex school processes.
Meeting Families Where They Are
Additionally, delivering parent education through a community-based organization leverages sources of capital (Bourdieu, 1986) that have been valuable in school improvement initiatives that engage parents. Community-based organizations may be a viable resource for accessing intimate knowledge of school contexts (cultural capital; Bourdieu, 1986). Using local contexts to provide the screening may also offer opportunities to leverage social capital within communities. Screenings in familiar community settings, such as libraries or local community centers, may also foster local community awareness and closer connection with community-based literacy initiatives. Situating screenings in community settings provides additional benefits for parents of young children: parents may distrust schools based on their own educational experiences (which may be likely given the high degree of heritability of dyslexia and other reading disabilities) and may feel more comfortable seeking a screening outside of the school setting.
Increasing Social Support Through Capital
Additionally, with the adoption of a parent education model, community organizations can offer both social support and the critical navigational capital families need to advocate for their child (Bourdieu, 1986). Parents in this study were provided with education on language and literacy development, along with evidence-based practices and resources that could be utilized at home. The community organization also provided specific advocacy recommendations when a child was identified as at risk for later reading difficulties.
While it is emphasized that screening alone is not enough to diagnose any disability, it is ideally meant to prevent the manifestation of reading disabilities or to enable improved outcomes through evidence-based instruction delivered within the timeframe of the brain’s heightened plasticity (Gaab & Petscher, 2022). Early identification, the identification of appropriate evidence-based instruction, and the expansion of supports outside the school setting act as protective and promotive factors in risk-resiliency models (Barnes & Peltier, 2022; Catts & Petscher, 2022).
Limitations
As this was a pilot study, several important limitations should be noted. First, the sample size was small, and limited participant demographic data were collected. Increasing the sample size and collecting varied demographic data would offer more insight into representativeness for the wider population. Furthermore, the communities served by the initiative were limited to primarily suburban districts, which are known to be less diverse both racially and ethnically. Expanding future studies to include urban districts, which are more diverse, as well as rural districts, would offer additional insight. Additionally, the study population included CVs who reported access to social capital, including knowledge and other resources, that may not be accessible in other populations.
Next Steps
Following the successful pilot study, the initiative plans to expand community partnerships in minoritized communities. In future implementations of this initiative, the data collection tools will be refined to include additional demographic variables, among other elements. Given that identification of reading disabilities (Odegard et al., 2020) and parent involvement (Berthelsen & Walker, 2008; Daniel, 2015) are both lower in minoritized communities, the outcomes of this initiative in these contexts will be critical.
Data Availability
The data sets generated during the current study are not publicly available due to confidentiality agreements signed by the study participants but are available from the corresponding author on reasonable request.
References
American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). https://doi.org/10.1176/appi.books.9780890425596
Arnold, E. M., Goldstone, D. B., Walsh, A. K., Reboussin, B. A., Daniel, S. S., Hickman, E., & Wood, F. B. (2005). Severity of emotional and behavioral problems among poor and typical readers. Journal of Abnormal Child Psychology, 33(2), 205–217. https://doi.org/10.1007/s10802-005-1828-9
Assistance to states for education for handicapped children: Procedures for evaluating specific learning disabilities. 42 Fed. Reg. 65082-65085. (1977). (C.F.R. 89 Stat. 773 G1082-G1085). https://www.govinfo.gov/content/pkg/FR-1977-12-29/pdf/FR-1977-12-29.pdf
Barnes, Z. T., & Peltier, T. K. (2022). Translating the science of reading screening into practice: policies and their implications. Perspectives on Language and Literacy, 48(1), 38–44. https://doi.org/10.35542/osf.io/gcx46
Baxter, G., & Kilderry, A. (2022). Family school partnership discourse: Inconsistencies, misrepresentation, and counter narratives. Teaching and Teacher Education, 109, 103561. https://doi.org/10.1016/j.tate.2021.103561
Berthelsen, D., & Walker, S. (2008). Parents’ involvement in their children’s education. Family Matters, 79, 34–41.
Bourdieu, P. (1986). The forms of capital. In I. Szeman & T. Kaposy (Eds.), Cultural theory: An anthology (pp. 81–93). John Wiley & Sons.
Campbell, J. L., Quincy, C., Osserman, J., & Pedersen, O. K. (2013). Coding in-depth semistructured interviews: problems of unitization and intercoder reliability and agreement. Sociological Methods & Research, 42(3), 294–320. https://doi.org/10.1177/0049124113500475
Castles, A., Rastle, K., & Nation, K. (2018). Ending the reading wars: reading acquisition from novice to expert. Psychological Science in the Public Interest, 19(1), 5–51. https://doi.org/10.1177/1529100618772271
Catts, H. W., & Petscher, Y. (2022). A cumulative risk and resilience model of dyslexia. Journal of Learning Disabilities, 55(3), 171–184. https://doi.org/10.1177/00222194211037062
Connor, C. M., Morrison, F. J., Fishman, B., Crowe, E. C., Otaiba, S. A., & Schatschneider, C. (2013). A longitudinal cluster-randomized controlled study on the accumulating effects of individualized literacy instruction on students’ reading from first through third grade. Psychological Science, 24(8), 1408–1419. https://doi.org/10.1177/0956797612472204
Daniel, G. (2015). Patterns of parent involvement: A longitudinal analysis of family–school partnerships in the early years of school in Australia. Australasian Journal of Early Childhood, 40(1), 119–128. https://doi.org/10.1177/183693911504000115
Dellinger, H. (2021). Feds say TEA has failed to fix its special education problems as it had promised. Houston Chronicle. https://www.houstonchronicle.com/news/houston-texas/houston/article/Feds-say-TEA-has-failed-to-correct-deficiencies-16425992.php#:~:text=In%20January%202018%2C%20the%20federal,disabilities%2C%20as%20required%20by%20law
Deno, S. L., & Mirkin, P. K. (1977). Data-based program modification: A manual. Council for Exceptional Children.
Denton, C. A., Hall, C., Cho, E., Cannon, G., Scammacca, N., & Wanzek, J. (2022). A meta-analysis of the effects of foundational skills and multicomponent reading interventions on reading comprehension for primary-grade students. Learning and Individual Differences, 93, 102062. https://doi.org/10.1016/j.lindif.2021.102062
Education For All Handicapped Children Act, Pub. L. No. 94-142, 89 Stat 773 U.S.C. (1975). https://www.govinfo.gov/content/pkg/STATUTE-89/pdf/STATUTE-89-Pg773.pdf
Elliott, J. G. (2020). It’s time to be scientific about dyslexia. Reading Research Quarterly, 55(S1), S61–S75. https://doi.org/10.1002/rrq.333
EveryoneReads PA. (2023). https://www.pareads.org/
Ferlazzo, L. (2011). Involvement or engagement? Educational Leadership, 68(8), 10–14.
First Step Act of 2018. 18 U.S.C. § 3632(h) (2018). https://www.congress.gov/115/plaws/publ391/PLAW-115publ391.pdf
Flowers, L., Meyer, M., Lovato, J., Wood, F., & Felton, R. (2001). Does third grade discrepancy status predict the course of reading development? Annals of Dyslexia, 51, 49–71.
Foorman, B. R., Francis, D. J., Fletcher, J. M., Schatschneider, C., & Mehta, P. (1998). The role of instruction in learning to read: Preventing reading failure in at-risk children. Journal of Educational Psychology, 90(1), 37–55.
Frankenberger, W., & Harper, J. (1985). Variations in multidisciplinary team composition for identifying children with mental retardation. Mental Retardation (Washington), 24(4), 203–207.
Francis, D. J., Fletcher, J. M., Stuebing, K. K., Lyon, G. R., Shaywitz, B. A., & Shaywitz, S. E. (2005). Psychometric approaches to the identification of LD: IQ and achievement scores are not sufficient. Journal of Learning Disabilities, 38(2), 98–108. https://doi.org/10.1177/0022219405038002010
Gaab, N., & Petscher, Y. (2021). Earlybird technical manual. https://www.gaablab.com/screening-for-reading-impairments
Gaab, N., & Petscher, Y. (2022). Screening for early literacy milestones and reading disabilities: The why, when, whom, how, and where. Perspectives on Language and Literacy, 48(1), 11–18. https://www.gaablab.com/s/Winter-2022-Gaab-and-Petscher-Final-p11-18.pdf
Gearin, B., Petscher, Y., Stanley, C., Nelson, N. J., & Fien, H. (2022). Document analysis of state dyslexia legislation suggests likely heterogeneous effects on student and school outcomes. Learning Disability Quarterly, 45(4), 267–279. https://doi.org/10.1177/0731948721991549
Gersten, R., Chard, D. J., Jayanthi, M., Baker, S. K., Morphy, P., & Flojo, J. (2008). Mathematics instruction for students with learning disabilities or difficulty learning mathematics: A synthesis of the intervention research. Center on Instruction. http://files.eric.ed.gov/fulltext/ED521890.pdf
Gibson, S., & Kendall, L. (2010). Stories from school: Dyslexia and learners' voices on factors impacting on achievement. Support for Learning, 25(4), 187–193. https://doi.org/10.1111/j.1467-9604.2010.01465.x
Gold, S. E., Mundell, L., & Brown, C. (2004). Bringing community organizing into the school reform picture. Nonprofit and Voluntary Sector Quarterly, 33(3), 54S-76S. https://doi.org/10.1177/0899764004265439
Goldstein, H., & Olszewski, A. (2015). Developing a phonological awareness curriculum: Reflections on an implementation science framework. Journal of Speech, Language, and Hearing Research, 58(6), S1837–S1850. https://doi.org/10.1044/2015_JSLHR-L-14-0351
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255–274. https://doi.org/10.2307/1163620
Grigorenko, E. L., Compton, D. L., Fuchs, L. S., Wagner, R. K., Willcutt, E. G., & Fletcher, J. M. (2020). Understanding, educating, and supporting children with specific learning disabilities: 50 years of science and practice. The American Psychologist, 75(1), 37–51. https://doi.org/10.1037/amp0000452
Hanford, E. (2017) Hard to read: how American schools fail kids with dyslexia. [Audio Podcast episode]. Educate Podcast. American Public Media. https://www.apmreports.org/episode/2017/09/11/hard-to-read
Heubeck, E. (2023, July 12). California joins 40 states in mandating dyslexia screening. EducationWeek. https://www.edweek.org/teaching-learning/california-joins-40-states-in-mandating-dyslexia-screening/2023/07
Hoyle, R. H. (Ed.). (1999). Statistical strategies for small sample research. Sage.
Individuals With Disabilities Education Act of 2004, Pub. L. 108-446, 118 Stat. 2658. (2004). https://www.govinfo.gov/content/pkg/PLAW-108publ446/pdf/PLAW-108publ446.pdf
Jansky, J., & de Hirsch, K. (1972). Preventing reading failure: Prediction, diagnosis, intervention. Harper & Row.
January, S. A. A., & Klingbeil, D. A. (2020). Universal screening in grades K-2: A systematic review and meta-analysis of early reading curriculum-based measures. Journal of School Psychology, 82, 103–122. https://doi.org/10.1016/j.jsp.2020.08.007
Leitão, S., Dzidic, P., Claessen, M., Gordon, J., Howard, K., Nayton, M., & Boyes, M. E. (2017). Exploring the impact of living with dyslexia: The perspectives of children and their parents. International Journal of Speech-Language Pathology, 19(3), 322–334. https://doi.org/10.1080/17549507.2017.1309068
Lonigan, C. J., & Shanahan, T. (2009). Developing early literacy: Report of the National Early Literacy Panel. Executive summary. A scientific synthesis of early literacy development and implications for intervention. National Institute for Literacy. https://files.eric.ed.gov/fulltext/ED508381.pdf
Lopes, J. A., Gomes, C., Oliveira, C. R., & Elliott, J. G. (2020). Research studies on dyslexia: Participant inclusion and exclusion criteria. European Journal of Special Needs Education, 35(5), 587–602. https://doi.org/10.1080/08856257.2020.1732108
Lovett, M. W., Frijters, J. C., Wolf, M., Steinbach, K. A., Sevcik, R. A., & Morris, R. D. (2017). Early intervention for children at risk for reading disabilities: The impact of grade at intervention and individual differences on intervention outcomes. Journal of Educational Psychology, 109(7), 889–914. https://doi.org/10.1037/edu0000181
Lyon, G. R., Shaywitz, S. E., & Shaywitz, B. A. (2003). A definition of dyslexia. Annals of Dyslexia, 53, 1–14. https://doi.org/10.1007/s11881-003-0001-9
Mercer, C. D., Jordan, L., Allsopp, D. H., & Mercer, A. R. (1996). Learning disabilities definitions and criteria used by state education departments. Learning Disability Quarterly, 19(4), 217–232. https://doi.org/10.2307/1511208
Miciak, J., & Fletcher, J. M. (2020). The critical role of instructional response for identifying dyslexia and other learning disabilities. Journal of Learning Disabilities, 53(5), 343–353. https://doi.org/10.1177/0022219420906801
Mitchell, C., & Begeny, J. C. (2014). Improving student reading through parents’ implementation of a structured reading program. School Psychology Review, 43(1), 41–58. https://doi.org/10.1080/02796015.2014.12087453
Nation, K., & Snowling, M. J. (1998). Semantic processing and the development of word-recognition skills: Evidence from children with reading comprehension difficulties. Journal of Memory and Language, 39(1), 85–101. https://doi.org/10.1006/jmla.1998.2564
National Association for the Education of Young Children. (2020). Professional standards and competencies for early childhood educators. https://www.naeyc.org/resources/position-statements/professional-standards-competencies
National Center for Education Statistics. (1992). National Assessment of Educational Progress: The nation’s report card: Reading. United States Department of Education. https://www.nationsreportcard.gov/highlights/reading/1992/
National Center for Education Statistics. (2023). Students with disabilities. Condition of Education. United States Department of Education, Institute of Education Sciences. Retrieved September 19, 2023, from https://nces.ed.gov/programs/coe/indicator/cgg/students-with-disabilities
National Center for Education Statistics. (2019). National Assessment of Educational Progress: The nation’s report card: Reading. United States Department of Education. https://www.nationsreportcard.gov/highlights/reading/2019/
National Center for Education Statistics. (2022). National Assessment of Educational Progress: The nation’s report card: Reading. United States Department of Education. https://www.nationsreportcard.gov/highlights/reading/2022/
National Center for Improving Literacy. (2023). The state of dyslexia. https://improvingliteracy.org/state-of-dyslexia
National Institute of Child Health and Human Development [NICHD]. (2000). Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction (NIH Publication No. 00-4769). Washington, DC: U.S. Government Printing Office. https://www.nichd.nih.gov/sites/default/files/publications/pubs/nrp/Documents/report.pdf
Odegard, T. N., Farris, E. A., Middleton, A. E., Oslund, E., & Rimrodt-Frierson, S. (2020). Characteristics of students identified with dyslexia within the context of state legislation. Journal of Learning Disabilities, 53(5), 366–379. https://doi.org/10.1177/0022219420914551
Otaiba, S. A., & Kim, Y.-S. G. (2022). Monitoring student responsiveness: Early reading instruction using a response to instruction framework. Literacy Today, 39(3), 18–26.
Otaiba, S., Baker, K., Lan, P., Allor, J., Rivas, B., Yovanoff, P., & Kamata, A. (2019). Elementary teacher’s knowledge of response to intervention implementation: A preliminary factor analysis. Annals of Dyslexia, 69, 34–53. https://doi.org/10.1007/s11881-018-00171-5
Ozernov-Palchik, O., & Gaab, N. (2016). Tackling the ‘dyslexia paradox’: Reading brain and behavior for early markers of developmental dyslexia. Wiley Interdisciplinary Reviews: Cognitive Science, 7(2), 156–176. https://doi.org/10.1002/wcs.1383
Ozernov-Palchik, O., Norton, E. S., Sideridis, G., Beach, S. D., Wolf, M., Gabrieli, J. D., & Gaab, N. (2017). Longitudinal stability of pre-reading skill profiles of kindergarten children: implications for early screening and theories of reading. Developmental Science, 20(5)
Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N., & Hoagwood, K. (2015). Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health and Mental Health Services Research, 42, 533–544. https://doi.org/10.1007/s10488-013-0528-y
Petscher, Y., Fien, H., Stanley, C., Gearin, B., Gaab, N., Fletcher, J. M., & Johnson, E. (2019). Screening for Dyslexia. Retrieved from www.improvingliteracy.org
Phillips, B. A. B., & Odegard, T. N. (2017). Evaluating the impact of dyslexia laws on the identification of specific learning disability and dyslexia. Annals of Dyslexia, 67, 356–368. https://doi.org/10.1007/s11881-017-0148-4
Protopapas, A. (2019). Evolving concepts of dyslexia and their implications for research and remediation. Frontiers in Psychology, 10, 2873–2873. https://doi.org/10.3389/fpsyg.2019.02873
Quinn, J. M., Wagner, R. K., Petscher, Y., Roberts, G., Menzel, A. J., & Schatschneider, C. (2020). Differential codevelopment of vocabulary knowledge and reading comprehension for students with and without learning disabilities. Journal of Educational Psychology, 112(3), 608–627. https://doi.org/10.1037/edu0000382
Rayner, K., Foorman, B. R., Perfetti, C. A., Pesetsky, D., & Seidenberg, M. S. (2001). How psychological science informs the teaching of reading. Psychological Science in the Public Interest, 2(2), 31–74. https://doi.org/10.1111/1529-1006.00004
Roberts, K., Dowell, A., & Nie, J. B. (2019). Attempting rigour and replicability in thematic analysis of qualitative research data: A case study of codebook development. BMC Medical Research Methodology, 19(1), 1–8. https://doi.org/10.1186/s12874-019-0707-y
S. Res. 680, 115 Cong., 164 Cong. Rec. 6817 (2018) (enacted). https://www.govinfo.gov/content/pkg/CREC-2018-10-11/pdf/CREC-2018-10-11-senate.pdf
Sanfilippo, J., Ness, M., Petscher, Y., Rappaport, L., Zuckerman, B., & Gaab, N. (2020). Reintroducing dyslexia: Early identification and implications for pediatric practice. Pediatrics, 146(1), 1–9. https://doi.org/10.1542/peds.2019-3046
Schelbe, L., Pryce, J., Petscher, Y., Fien, H., Stanley, C., Gearin, B., & Gaab, N. (2022). Dyslexia in the context of social work: Screening and early intervention. Families in Society, 103(3), 269–280.
Shaywitz, B. A., & Shaywitz, S. E. (2020). The American experience: Towards a 21st century definition of dyslexia. Oxford Review of Education, 46(4), 454–471. https://doi.org/10.1080/03054985.2020.1793545
Shaywitz, B. A., Fletcher, J. M., Holahan, J. M., & Shaywitz, S. E. (1992). Discrepancy compared to low achievement definitions of reading disability: Results from the Connecticut longitudinal study. Journal of Learning Disabilities, 25(10), 639–648.
Shaywitz, S. E., Fletcher, J. M., Holahan, J. M., Shneider, A. E., Marchione, K. E., Stuebing, K. K., Francis, D. J., Pugh, K. R., & Shaywitz, B. A. (1999). Persistence of dyslexia: The Connecticut longitudinal study at adolescence. Pediatrics, 104(6), 1351–1359.
Stanley, C. T., Petscher, Y., & Catts, H. (2018). A longitudinal investigation of direct and indirect links between reading skills in kindergarten and reading comprehension in tenth grade. Reading and Writing, 31, 133–153. https://doi.org/10.1007/s11145-017-9777-6
Stanley, C., Petscher, Y., & Pentimonti, J. (2019). Classification accuracy. United States Department of Education, Office of Elementary and Secondary Education, Office of Special Education Programs, National Center on Improving Literacy. Retrieved from improvingliteracy.org.
The Reading League. (2023). Science of reading: Defining guide. https://www.thereadingleague.org/what-is-the-science-of-reading/
Torgesen, J. K., Wagner, R. K., Rashotte, C. A., Rose, E., Lindamood, P., Conway, T., & Garvan, C. (1999). Preventing reading failure in young children with phonological processing disabilities: Group and individual responses to instruction. Journal of Educational Psychology, 91(4), 579–593. https://doi.org/10.1037/0022-0663.91.4.579
Trainor, A. A. (2010). Diverse approaches to parent advocacy during special education home-school interactions: Identification and use of cultural and social capital. Remedial and Special Education, 31(1), 34–47. https://doi.org/10.1177/0741932508324401
van Bergen, E., van der Leij, A., & de Jong, P. F. (2014). The intergenerational multiple deficit model and the case of dyslexia. Frontiers in Human Neuroscience, 8, 346–346. https://doi.org/10.3389/fnhum.2014.00346
Van Norman, E. R., Nelson, P. M., & Klingbeil, D. A. (2020). Profiles of reading performance after exiting Tier 2 intervention. Psychology in the Schools, 57(5), 757–767. https://doi.org/10.1002/pits.22354
Wanzek, J., & Vaughn, S. (2007). Research-based implications from extensive early reading interventions. School Psychology Review, 36(4), 541–561. https://doi.org/10.1080/02796015.2007.12087917
Wanzek, J., Stevens, E. A., Williams, K. J., Scammacca, N., Vaughn, S., & Sargent, K. (2018). Current evidence on the effects of intensive early reading interventions. Journal of Learning Disabilities, 51(6), 612–624. https://doi.org/10.1177/0022219418775110
Ward-Lonergan, J. M., & Duthie, J. K. (2018). The state of dyslexia: recent legislation and guidelines for serving school-age children and adolescents with dyslexia. Language, Speech, and Hearing Services in Schools, 49(4), 810–816. https://doi.org/10.1044/2018_LSHSS-DYSLC-18-0002
Williams, D. L., Jr., & Chavkin, N. F. (1984). Research-based guidelines and strategies to train teachers for parent involvement [Paper presentation]. American Educational Research Association 69th Annual Meeting, Chicago, IL, United States.
Youman, M., & Mather, N. (2015). Dyslexia laws in the USA: An update. Perspectives on Language and Literacy, 41(4), 10–18.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
Contributions
J.G., M.E.S., and K.M. jointly conceptualized the study, conducted data analysis, and provided final approval for the version to be published. J.G. led the drafting of the article. M.E.S. led methodological considerations. K.M. led the implementation of the screenings and provided parent education.
Ethics declarations
Research Involving Human Participants
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed Consent
Informed consent was obtained from all individual participants involved in the study.
Competing Interests
Authors 1 and 3 were affiliated with the 501c3 community-based organization sponsoring the screening initiative. No author received compensation for their work.
Supplementary Information
ESM 1
(DOCX 32.9 kb)
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Galbally, J., Sheppard, M.E. & Mayer, K. Community-Based Early Language and Literacy Screenings. Behav. Soc. Iss. 33, 411–434 (2024). https://doi.org/10.1007/s42822-023-00153-2