Introduction

The student demographic profiles at public two-year colleges in Texas reflect the racial and ethnic communities they serve. As one of four majority-minority states in the country—where people of color outnumber the White population—Texas ranks second and third in the USA in the number of Latina/o and African-American residents, respectively (Murdock et al., 2014). Among students pursuing postsecondary education in Texas, Latina/o students represent the largest minority student population attending public two-year institutions, commensurate with national trends (Fry, 2011; Nuñez et al., 2015).Footnote 1 In addition to serving a student demographic that is representative of their local communities, two-year colleges in Texas and across the nation are valuable components of our nation’s democracy and economy. As community college researchers have noted, the financial disparities that plague our country would be much more severe without the existence of two-year colleges to maintain a competitive workforce and sustain America’s middle class (Mellow & Heelan, 2008).

Illustrative of the high proportion of minority students attending public two-year institutions, half of the 50 community college districts in Texas are MSIs, defined in the Higher Education Act as institutions that have received federal funds to serve certain racial/ethnic minorities and low-income students. Specifically, of the 25 community college districts in Texas with the MSI designation, all are HSIs. One district, the Houston Community College System, is also an AANAPISI.Footnote 2 These institutions play critical roles in providing postsecondary access and promoting degree attainment for the most disadvantaged students in the state, in addition to engaging and empowering students of color (Center for MSIs, 2015). In fact, community colleges in Texas enroll over half of the state’s college students (compared to 45% nationwide) and award 37% of all college degrees in the state (THECB, 2016).

Nationally, MSIs have been underfunded (Center for MSIs, 2015), and Texas is one state that has a history of providing inequitable state support to some of these institutions. Indeed, a history of inadequate per-student funding for institutions along the US-Mexico border culminated in a 1987 lawsuit against Texas by the Mexican American Legal Defense and Educational Fund (MALDEF). MALDEF successfully argued that the state had significantly limited postsecondary opportunities for students living on and near the border by allocating only 10% of state funding for higher education to institutions located in the border area, when 20% of the state’s population lived in that region (Ortegon, 2014).

Since the lawsuit, higher education funding in the borderlands has increased considerably (Kauffman, 2016). The adoption of a new, outcomes-based funding model for community colleges in 2013, however, has revitalized questions relating to equitable funding, particularly for public two-year community and junior colleges.Footnote 3 Higher education scholars (McKinney & Hagedorn, 2015) and observers (e.g., Heilig, 2013) have begun to anticipate and monitor the effects of a funding model that ties state dollars to student achievement metrics and educational milestones (e.g., course completion, graduation, and transfer) at institutions that serve some of the most vulnerable students in the state. The concerns associated with the new funding model in Texas (e.g., relating to potential unintended consequences, such as grade inflation or “creaming”) mirror those pertaining to POBF in other states. Of particular concern are the impacts of the new funding model on MSIs, especially since these institutions serve large proportions of students of color, who have been historically underserved in higher education (Jones, 2014).

The purpose of this chapter is to examine trends in funding for minority-serving community college districts and how the new POBF model has affected state funding for community college districts that serve high levels of racial and ethnic minorities. This chapter begins by presenting a brief review of the literature on two-year MSIs and the impact of POBF models on community colleges in other states. We then briefly discuss our data sources and methods that led to our descriptive findings. The subsequent section presents an overview of funding for public two-year (i.e., community and junior) colleges in Texas and describes the newly implemented POBF model. Following this background, we delineate funding trends for community colleges in Texas and how these changed under POBF, focusing on MSIs. In addition to differentiating by MSI and non-MSI designation, we examine changes in POBF for institutions with varying levels of minority students, disaggregating by race and ethnicity. We conclude by discussing how the metrics that Texas uses to determine POBF allocations might impact higher education equity.

Theoretical Framework

The theoretical framework guiding this study is Critical Policy Analysis (CPA); our application of CPA centers on the equitability of the distribution of POBF across two-year MSIs and two-year non-MSIs in Texas. According to Henry et al. (2013), CPA aims to “investigate the ways in which key terms are used, and the extent to which particular policies and practices are consistent with our moral vision for education” (p. 19). Specifically, CPA frames this chapter by contributing to the understanding of POBF policy in Texas and by addressing the ways that POBF is impacting higher education equity in the state.

Review of Literature

Two-Year MSIs and POBF

Over one-fifth of community colleges nationwide qualify as MSIs (Center for MSIs, 2015), and 227 two-year MSIs are located in the 32 states that have implemented POBF policies (Jones, 2014; National Conference of State Legislatures, 2016). MSIs include HBCUs, HSIs, TCUs, Alaska Native and Native Hawaiian-Serving Institutions (ANNHs), Native American-Serving Nontribal Institutions (NASNTIs), Predominantly Black Institutions (PBIs), and AANAPISIs. To qualify as an MSI and receive federal funding, institutions must meet federal requirements specific to each MSI institutional type (e.g., 10–25% full-time enrollment of the target minority group and specified levels of low-income student enrollment).

Two-year MSIs are traditionally expected to accomplish more with less: they serve more students with fewer resources and lower per-student expenditures on academic support and institutional resources for underserved students (Cunningham et al., 2014). POBF policies with a stronger emphasis on student outcomes could place these institutions at a greater disadvantage. Under POBF models, policymakers utilize performance metrics to determine a portion of each institution’s (or system’s) appropriation from state funds. Commonly used metrics include retention rates, course completions, graduation rates, and degrees awarded (National Conference of State Legislatures, 2016). In the two-year sector, metrics often include transfer rates to four-year institutions, certificates awarded, and associate degrees awarded. Because they rarely address minority students specifically (e.g., these students’ enrollment or the services provided to support them), commonly used metrics do not adequately capture the performance of MSIs. Indeed, POBF experts stress that designers of POBF models should ensure that output metrics are responsive to input factors, like students’ levels of academic preparation, and develop measures that are aligned with the unique missions of MSIs (Jones, 2014).

In recent years, state policymakers have increasingly incorporated metrics specific to improving access. For instance, some POBF models provide additional funding for colleges to enroll and graduate students from underrepresented backgrounds, including students of color, Pell Grant recipients, first-generation students, and adult students. The metrics for underrepresented students also aim to prevent POBF from disadvantaging colleges such as MSIs that serve a larger proportion of students who require more support and resources to graduate. The development of such metrics has resulted in part from concerns over unintended consequences of POBF. Examples of unintended consequences include colleges decreasing the academic rigor of programs; reducing the number of requirements to graduate; and increasing selectivity, thus enrolling students with higher probabilities of graduating. Unintended consequences may disproportionately impact students attending MSIs and can be especially detrimental to colleges with an open-access mission (Lahr et al., 2014). As noted in a subsequent section, Texas’s model does not incorporate metrics that directly reward institutions for serving underserved populations.

POBF Research on Community Colleges

As of 2013, at least 19 states employed POBF for community colleges (D’Amico et al., 2014). Despite POBF’s lauded potential for improving completion rates, POBF models can also produce unintended consequences, as previous experiences with variants of this funding method have shown. To contextualize our analysis, in this section we summarize studies that examine the impact of POBF on community colleges. As described below, POBF has resulted in unintended consequences, such as potentially encouraging certificate completion in lieu of degrees, imposing more accountability standards on faculty and staff, and “creaming” specific students in order to increase POBF revenues.

Recent studies examining how POBF affects student outcomes at community colleges have found that POBF has a significant impact on the number of short-term certificates—but not long-term certificates or associate degrees—awarded at community colleges (Hillman et al., 2015; Tandberg et al., 2014). In light of this evidence, POBF scholars have questioned whether community colleges are encouraging students to complete short-term certificate programs in order to secure more performance funds. This practice could result in inequitable outcomes, since short-term certificates tend to have a lower return on investment than long-term certificates and associate degrees (Dadgar & Trimble, 2015). These practices are especially harmful for many first-generation college students, who are unfamiliar with the relative returns on investment of college certificates versus college degrees.

This study is also informed by previous research that examined POBF in Washington State from the perspective of community college administrators, faculty, and staff, who discussed their campus experiences, viewpoints, and knowledge of Washington’s POBF policy, the Student Achievement Initiative (SAI) (Li, 2016). The study revealed that college officials were well aware of the POBF policy and its effects on their own departments. However, several participants felt excluded from the policy design process and were uninterested in learning about the policy when it was first introduced. Some participants criticized the policy for being poorly designed and for having a problematic data tracking system. Among community college officials, faculty members were identified as the group expressing the least support for the SAI and demonstrating the most resistance to accountability policies in general. This finding is critical, since faculty directly influence the academic experiences, and ultimately the success, of students.

The study most relevant to this analysis applied metrics from a POBF model to examine academic progress and educational outcomes among students enrolled at one of the largest community college districts in Texas (McKinney & Hagedorn, 2015). The authors also identified the students who generated the most and least POBF for the community college district. The results revealed that the students who generated the least POBF under the adopted model tended to be male, African-American, age 20 or older, General Educational Development (GED) credential holders, and placed in the lowest levels of developmental math. The students identified as generating the most POBF tended to be Asian, enrolled full-time, and Pell Grant recipients. The authors warned that POBF policy could pressure underfunded institutions such as community colleges to consider recruiting and targeting specific students in order to increase POBF allocations. These actions could hinder the college access and success of our nation’s most disadvantaged students, whose only postsecondary opportunity is the community college.

Data and Methods

This descriptive analysis of funding allocations for MSIs under the state’s new POBF model relies primarily on institution-level data from the Texas Higher Education Coordinating Board (THECB), the state’s higher education agency. In particular, we downloaded data from the agency’s Higher Education Accountability System (txhighereddata.org). From this system, we obtained annual data for the following categories of variables: (1) total fall headcount enrollment; (2) total enrollment by student subgroups, including racial/ethnic minorities; (3) total semester credit hours (SCHs); (4) tuition and fees; and (5) finances per full-time equivalent (FTE) student from various sources, including state appropriations. In addition, this repository contained data on individual institutions’ performance on the various performance (i.e., success points) metrics included in the state’s new POBF model for public community and junior colleges. We complemented these data on success points accumulated with a variable measuring the success points funding earned by each institution. The data on success points funding come from the coordinating board’s “Formula Funding” website. With the exception of the success points data (which are only available starting in 2014), all variables span 2000 through 2015.

In addition to these data, we consulted the list of accredited postsecondary minority institutions from the United States Department of Education (USDOE). We used this list to code two-year institutions in Texas as MSIs or non-MSIs.Footnote 4 To verify that our interpretations of the model and the data were accurate, we consulted with one official at the coordinating board.

Our analysis involved generating descriptive statistics of funding for MSIs and non-MSIs before and after POBF implementation. In addition to trends in state funding for MSIs and non-MSIs, we examined variability in funding across institutions before and after POBF. Turning to POBF specifically, we examined the amount of performance funds generated by MSIs as compared to non-MSIs. We also illuminated the effect of specific performance metrics by comparing, for MSIs and non-MSIs, the funding tied to each metric. Finally, we examined the relationship between institutions’ performance on each metric and various student demographic characteristics (e.g., proportions of minority students). Before presenting the findings from this descriptive analysis, the following section summarizes the context and recent history of funding for two-year institutions in Texas.
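To make these comparisons concrete, the following is a minimal sketch, in Python, of how medians and variability in per-FTE state funding might be tabulated by MSI designation and POBF period. It is not the authors’ actual code; the file name and column names (district, year, msi as a Boolean flag, state_approp_per_fte) are hypothetical placeholders for the THECB variables described above.

```python
import pandas as pd

# Hypothetical institution-by-year extract of the THECB accountability data.
df = pd.read_csv("thecb_accountability_extract.csv")

# Flag pre- and post-POBF years (POBF first applied to the 2014-2015 biennium).
df["period"] = df["year"].apply(lambda y: "POBF" if y >= 2014 else "pre-POBF")

# Median state appropriations per FTE student, by period and MSI designation.
medians = (
    df.groupby(["period", "msi"])["state_approp_per_fte"]
      .median()
      .unstack("msi")
)
print(medians)

# Variability in per-FTE funding across institutions, before and after POBF.
spread = df.groupby("period")["state_approp_per_fte"].agg(["min", "max", "std"])
print(spread)
```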

Public Community/Junior College Funding in Texas

Public two-year institutions—or community/junior colleges—are one of five types of public institutions in Texas, each subject to a different funding allocation model. The Texas Legislature appropriates funding to each of the 50 community college districts (rather than to individual institutions). The number of campuses in each district ranges from 1 (in most districts) to 7 (in the Dallas County Community College District). Because the legislature meets every other year, each legislative session appropriates funding for the following two years (i.e., a biennium).

Before the implementation of POBF in the 2014–2015 biennium, state allocations for community colleges were based on contact hours, weighted differentially by discipline. As illustrated in Fig. 4.1, in addition to state funds, public two-year institutions receive local tax and tuition and fee revenues.

Fig. 4.1 Major Sources of Operating Revenue for Community Colleges in Texas, FY 2011. Source: Legislative Budget Board (2013)

Over the decade before POBF (2003–2013), annual state funding per FTE student was comparable across MSIs and non-MSIs. The median annual state appropriation per FTE student across this period was $2,790 at MSIs, compared to $2,857 at non-MSIs. A two-sample t-test of the difference in per-FTE state funding between MSIs and non-MSIs indicates that funding for the two groups is not significantly different. With this analysis of funding trends as background, we now turn to the POBF model and its effect on funding distributions for MSIs.
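As an illustration only, such a two-sample t-test could be run as follows, reusing the hypothetical dataframe from the earlier sketch. We use Welch’s variant, which does not assume equal variances; the chapter does not specify which variant was used, so this is an assumption.

```python
from scipy import stats

# Restrict to the pre-POBF decade examined in the text (2003-2013).
pre = df[(df["year"] >= 2003) & (df["year"] <= 2013)]

msi_funding = pre.loc[pre["msi"], "state_approp_per_fte"]
non_msi_funding = pre.loc[~pre["msi"], "state_approp_per_fte"]

# Welch's two-sample t-test (unequal variances) -- an assumption, since the
# chapter does not name the specific variant used.
t_stat, p_value = stats.ttest_ind(msi_funding, non_msi_funding, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```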

Student Success Points Program

In 2011, the Texas Legislature adopted House Bill 9, which directed the THECB to develop POBF models for public higher education institutions in Texas in consultation with institutional representatives. The THECB and the Texas Association of Community Colleges (TACC), a group that represents and advocates for community colleges in Texas, formally proposed the recommended POBF model—titled the Student Success Points Program—to the legislature. The legislature adopted the new model in 2013 and first used it to determine allocations for public community/junior colleges during the 2014–2015 biennium.

Model Design

According to the TACC (2013), the Student Success Points program is modeled after Ewell’s 2006 Milestone Events Model (Leinbach & Jenkins, 2008). This type of funding model accounts for students’ distinct levels of college readiness upon enrolling in college. Rather than focusing exclusively on outcomes, it rewards student progress—including credit hour accumulation and developmental education course completion—toward a degree or certificate. Notably, the Student Success Points Model is a distribution model, which determines what share of the pie each institution receives (and not how large the pie should be).

Texas’s POBF model applies to all public community/junior colleges and has three components: core operations, weighted contact hours, and success points. For core operations, each community college district receives $1 million per biennium ($500,000 per year). Thus, core operations funding does not vary across institutions and does not depend on any input- or output-related factors. Aside from core operations, 90% of formula funding is distributed based on contact hours weighted by discipline (for a total of $1.54 billion in 2014–2015). The remaining 10% ($172 million in 2014–2015) is allocated based on institutions’ success points. Specifically, the Student Success Points model is based on the following metrics:

  • developmental education in math, reading, and writing (with a premium for math);

  • first college-level course passed in math, reading, and writing (with a premium for math);

  • completion of 15 and 30 SCHs;

  • degrees or certificates awarded (with a premium for critical fields); and

  • transfers to a university after completion of 15 SCHs.
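As a worked illustration of the allocation structure described above, the short sketch below combines the three components for a single district. The statewide pool sizes come from the 2014–2015 figures cited in the text; the district-level shares in the example are hypothetical.

```python
# Statewide pools from the 2014-2015 biennium figures cited in the text.
CORE_PER_DISTRICT_BIENNIUM = 1_000_000     # $1 million per district per biennium
CONTACT_HOUR_POOL = 1_540_000_000          # ~90% of formula funding
SUCCESS_POINT_POOL = 172_000_000           # ~10% of formula funding

def district_allocation(contact_hour_share: float, success_point_share: float) -> float:
    """Biennial formula allocation for a district, given its share of statewide
    weighted contact hours and its share of statewide success points."""
    return (
        CORE_PER_DISTRICT_BIENNIUM
        + contact_hour_share * CONTACT_HOUR_POOL
        + success_point_share * SUCCESS_POINT_POOL
    )

# Hypothetical district earning 2% of weighted contact hours and 2.5% of success points.
print(f"${district_allocation(0.02, 0.025):,.0f}")  # -> $36,100,000
```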

In 2013, the legislature determined funding for public community and junior colleges for the 2014–2015 biennium using the Student Success Points model (based on core operations, contact hours, and success points). Per-student funding for all public two-year institutions in Texas declined in the years leading up to the implementation of POBF in 2014, but increased under the new funding model. As illustrated in Table 4.1, MSIs received slightly lower median per-FTE student funding from the state prior to the implementation of POBF (both in the decade before POBF and in the two years leading up to the new policy). However, under the new model, they received slightly higher median funding per FTE student ($3,061) than their non-MSI counterparts ($2,919). Thus, in the aggregate, MSIs fared slightly better than non-MSIs under POBF.

Table 4.1 Median Per-Student Funding for Public Community/Junior Colleges from State Appropriations, by MSI Designation and POBF Operation

Figure 4.2 illustrates the variability in per-student funding for community colleges in Texas. Before POBF, there was a greater range in appropriations to individual institutions, and most of this variability is attributable to funding outliers in 2003 and 2006. Between 2003 and 2013, funding per FTE ranged from $4 (to Weatherford College in 2003) to over $300,000 (to South Texas College in 2003). In contrast, under POBF, there were fewer outliers, and funding ranged from $2,055 (at Blinn College) to $4,399 (at Howard County Junior College District). Although this model results in a more equal distribution, lower-enrollment institutions, which do not benefit from economies of scale, may warrant additional funding. For example, institutions with declining enrollments still incur fixed costs, such as building maintenance costs that do not fluctuate with enrollment. Because these institutions could receive significantly reduced funding (depending on the scope of their enrollment declines), they might be disadvantaged under a more egalitarian distribution model. Future analyses should explore the effects of a more equal funding distribution on low-enrollment institutions.

Fig. 4.2 Per-Student Funding for Community Colleges in Texas, Before Performance Funding (2003–2013) and During Performance Funding (2014–2015)

Note: South Texas College is excluded from this figure, since it is a significant outlier. Specifically, South Texas College received $304,161 in state funding per FTE in 2003. The next highest value, which is represented on the graph, is $14,026, awarded to Amarillo College in 2003.

Turning to the Student Success Points allocation, which accounts for 10% of the formula (with 90% based on contact hours), we examine the amount of funding from success points earned by MSIs and non-MSIs. To adjust for enrollment volume, we divide each institution’s success points funding by its fall student headcount. As Fig. 4.3 depicts, MSIs earn more funds per student from performance (student success) points than their non-MSI counterparts.

Fig. 4.3 Total Success Points Funding by Fall Headcount Enrollment, by MSI Designation, 2014

For a deeper analysis of how the performance-based portion of the model allocates funds to MSIs and non-MSIs, we explore the distribution of student success point funding specifically. Table 4.2 disaggregates student success point funding accumulation by each performance metric included in the Student Success Points model. In addition to the weight associated with each metric, this table presents the accumulation of points for each category by MSI designation.

Table 4.2 Weighted Student Success Points by Total Fall Headcount Enrollment for Each Metric, by Weight and MSI Designation, 2014–2015 

As illustrated in the table, MSIs score lower on “outcomes” metrics (i.e., degrees and certificates awarded and degrees awarded in critical fields). On the other hand, they yield more funding from “progression” metrics (completion of 15 and 30 SCHs) than non-MSIs. Regarding developmental education and gateway courses, the analysis reveals differences by subject area. MSIs outperform non-MSIs in student success point funding earned for math metrics, whereas non-MSIs garner higher levels of student success point funding tied to reading and writing performance. This discrepancy by subject area may be attributed to higher proportions of English-language learners (ELLs) at HSIs, since ELLs may be less successful in reading and writing developmental education and gateway courses. Future research should explore the factors that explain this disparity. Notably, math metrics are weighted more heavily in the funding model than metrics in other subject areas, granting MSIs a slight advantage in funding for gateway and developmental education metrics.

Finally, we were interested in examining how institutions with various student demographic characteristics fared in success point accumulation in 2014–2015, irrespective of their MSI designation. This analysis afforded us a finer level of detail, since the MSI classification does not capture, for example, the percentage of part-time students enrolled or the specific proportions of minority student enrollments. Specifically, we grouped institutions into quartiles based on the proportion of enrollments by race/ethnicity, gender, and part-time status in 2014. We then plotted each institution’s success points accumulated per student against these quartiles. Figure 4.4 illustrates three clear trends that emerged from this analysis. First, institutions with higher proportions of Hispanic students accumulated lower levels of total success points; this trend is consistent and significant. Second, also unambiguously, institutions with higher proportions of White students earned more total success points per student. Third, institutions with higher proportions of part-time students earned significantly fewer points per student in both 2014 and 2015.
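A minimal sketch of this quartile grouping, again in Python and again with hypothetical file and column names (total_success_points, fall_headcount, pct_hispanic), might look like the following.

```python
import pandas as pd

# Hypothetical institution-level extract of 2014-2015 success points and enrollment.
inst = pd.read_csv("success_points_2014_2015.csv")
inst["points_per_student"] = inst["total_success_points"] / inst["fall_headcount"]

# Group institutions into quartiles of the share of Hispanic enrollment.
inst["hispanic_quartile"] = pd.qcut(inst["pct_hispanic"], q=4, labels=[1, 2, 3, 4])

# Summarize per-student success points within each quartile; the same pattern
# applies to other subgroups (e.g., White, part-time, or female enrollment shares).
print(
    inst.groupby("hispanic_quartile", observed=True)["points_per_student"]
        .agg(["median", "mean", "count"])
)
```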

Fig. 4.4 Total Success Points by Enrollment, by Quartiles of Student Subgroup Enrollment Proportions, FY 2014–2015

In addition to those trends, two findings from this descriptive analysis were unexpected. First, institutions with percentages of Asian students above the median (quartiles 3 and 4) earned lower levels of success points per student than those with lower proportions of Asian students. This finding was surprising given McKinney and Hagedorn’s (2015) finding that Asian students yielded higher levels of revenue for institutions under POBF. Another notable finding was the positive relationship between percentages of female students and success points per student up to the third quartile: excluding the 25% of institutions with the highest female student enrollments, higher female enrollment appears to be positively associated with success point accumulation. The noteworthy patterns relating to Asian and female students warrant further analysis.

Discussion and Policy Implications

With a CPA lens, we investigated how the new POBF model for two-year institutions in Texas distributed funds to two-year MSIs and two-year non-MSIs during the first biennium of implementation (2014–2015). The analysis revealed that, under POBF, two-year MSIs receive higher levels of per-student funding, in the aggregate, than two-year non-MSIs. This contrast is especially notable since, before the implementation of POBF, MSIs received slightly lower per-student funding than non-MSIs. In addition, funding levels for all institutions increased under the POBF model. Taken together, these summary findings suggest that MSIs generally fared positively under POBF. However, further examination of each component of the POBF model and of success points distributions for institutions with various student demographics reveals a more nuanced picture.

Turning to specific performance metrics, the analysis revealed that MSIs earn more state funding from math metrics, both in developmental education courses and gateway courses. In contrast, students at two-year non-MSIs are earning more certificates and degrees, including in critical fields. This finding highlights the importance of including in funding formulas metrics that assess student progress, including developmental education completion and credit-bearing course completion, in addition to those that measure outcomes.

Furthermore, the findings from our analysis of success point distributions reveal that institutions with higher proportions of Hispanic students earned lower levels of total success points, whereas institutions with higher proportions of White students earned more success points. These results are consistent with previous research on HSIs suggesting that some do not reference their HSI designation in their mission statements and fall behind in producing equitable results for Latinos/as, relative to White students, in degree completion and in STEM fields (Contreras et al., 2008). As such, the performance metrics in their current form disadvantage institutions that serve the highest proportions of Hispanic students, regardless of their HSI status. In turn, by attending institutions with lower resources, these students are also at a disadvantage as they pursue higher education. This finding, coupled with the relative advantage to institutions with high proportions of White students, is notable. Policymakers in Texas have discussed increasing the proportion of funding that is based on performance, so that 75% of funding would be based on contact hours and 25% on student success points (compared to 10% in 2014–2015). As the success points portion of the model becomes more heavily weighted, institutions that serve the most vulnerable students may become disadvantaged under POBF if the metrics remain unchanged. In light of our findings, POBF model designers should consider including a premium for serving minority students to encourage institutions to continue serving all students. Likewise, because part-time students yield lower revenues for institutions, policy designers should consider a funding premium for serving part-time students.

Finally, under the POBF model, the distribution of funding across all institutions is more consistent and equal than it was before POBF was adopted. Prior to POBF, per-student funding ranged from $4 for one institution to over $300,000 for another. Under the new model, all institutions received between $2,000 and $4,400 per student, thus eliminating outliers. While this equality can be viewed positively, these more formulaic distributions under POBF may preclude special funding that is necessary for some institutions (e.g., ones that have experienced dramatic enrollment declines) under certain circumstances. In Texas, policymakers may have discretion to provide these special funds in a separate pool outside the funding formula, which would not be captured in our analysis. The flexibility to provide special funds, particularly for institutions that have been historically underresourced, is important. Future studies of state financing under POBF should examine the extent to which these new models balance equitability in funding with the flexibility to allocate additional funds in special circumstances.

Conclusion

Hispanics represent the largest minority student population attending community colleges in Texas, and 25 of the 50 community/junior college districts in the state are designated as HSIs. Thus, the finding that institutions with higher proportions of Hispanic students earned lower levels of total success points signals a cause for concern. The inequitable results for Hispanic students raise questions about the extent to which HSIs in Texas are equipped to serve their Latino/a populations, especially given their larger enrollments, greater numbers of students with financial need, and higher student-faculty ratios compared to non-HSIs (Rodriguez & Calderon Galdeano, 2015).

In its current form, the POBF model that Texas utilizes appears equitable in the aggregate and does not disadvantage MSIs relative to non-MSIs. Although Texas adopted a POBF model that yields equitable funding outcomes for both two-year MSIs and two-year non-MSIs, its disparate impact on institutions serving high proportions of Hispanic and part-time students, and the generally lower performance of MSIs on some of the chosen success points (especially outcomes metrics), warrant further consideration. Nevertheless, some parts of the model, especially its inclusion of developmental education and course completion metrics, represent a sound design and should be considered by other states.

In an effort to maintain its national and global competitiveness, Texas launched 60x30TX, its new higher education strategic plan, which aims for 60% of the 25- to 34-year-old Texas population to hold a certificate or degree by 2030 (THECB, 2015). As of 2013, only 38% of adults in that age range held a postsecondary credential. To reach the state’s goal, Texas community colleges may depend on additional resources to serve more students and ensure their success. If adequately designed to account for community colleges’ unique missions, the state’s POBF model may become critical for improving the persistence and completion rates of the state’s most disadvantaged students, bringing the state closer to reaching the 60x30TX goal. To achieve that goal, however, state policymakers must ensure that MSIs, which serve the largest minority student population among community college students in Texas, are not disadvantaged under the new funding model.