As the cost of drugs (if not yet hospital care) began to rise during the late eighteenth century and into the nineteenth, a new institution arose in cities across the infant United States. This was the dispensary, an English import. These local clinics offered a range of basic health services to poorer residents, in addition to serving as an important early source of medical education for aspiring physicians (Markle 2000). Philadelphia hosted one of the first American dispensaries (1786), with other East Coast cities following suit in the years immediately thereafter. These facilities started small but quickly grew to provide care to a sizable portion of the population. Indeed, by 1900 it is estimated that around half of New York City’s population received care through dispensaries [1].

The growth of dispensaries was partly driven by concerns associated with Alger under the umbrella of ‘pauperism’: namely, that admission to hospitals and the (largely) free care available there could breed dependence, and that moral upkeep required one to take great pains to avoid becoming an inpatient. The stigma attached to early hospitals and almshouses likewise created great demand for alternate, less stigmatizing sources of care in the community. Despite serving large numbers of the urban populace by the turn of the twentieth century, however, the days of the dispensaries were numbered.

In the final years of the nineteenth century, under pressure from certain members of the medical profession, states began passing laws that subjected dispensaries to tighter regulation, and in some cases sought to limit the opening of new such facilities. This campaign was ostensibly driven by concerns over the potential for dispensary abuse. Critics argued that at least some of those frequenting dispensaries and receiving their (largely) free care could actually afford to pay. The spectre of pauperism was raised anew, with some medical professionals arguing that charitable efforts in health care generally tended to increase the numbers of those falling into, or remaining in, poverty [2]. Such patients, the argument ran, were cheating struggling urban physicians out of a living while exhibiting basic moral failings. Indeed, some physicians asserted that the ready availability of free care in dispensary settings bred its own form of dependence.

At the same time, hospitals were a growing presence in urban communities, with many opening their own outpatient clinics. The 1910 Flexner Report, moreover, standardized medical education nationally and placed much of it firmly within hospital walls. Growing specialization within the medical profession, meanwhile, meant there were correspondingly fewer generalists willing to practice in the dispensary setting. While some physicians remained strong supporters of free medical care in the community, other leading figures within the profession took up arms against it. In the end, dispensaries in cities across the country folded. They were replaced by hospital facilities and, within a few decades, a new source of community care: the community health center.

Neighborhood Health Centers: The First Wave

Community health centers, also known as neighborhood health centers, first arose around the time dispensaries were falling into abeyance. Wilbur Phillips, a social advocate with ties to the Socialist Party, developed some of the first neighborhood health centers in Milwaukee and Cincinnati in the years immediately before World War I. These ventures, in turn, grew out of his work establishing urban milk depots in New York, which began simply as sources of clean milk for new mothers but soon came to offer medical examinations for young children, as well as health advice for mothers. Phillips, who today would undoubtedly be known as a community organizer, elaborated on these prototype institutions in the two aforementioned Midwestern cities, establishing full-service health facilities that brought together a wealth of medical and social welfare functions under one roof. Each center covered a distinct, limited territory, and some included political organizing within their remit [3].

The fledgling, largely voluntary neighborhood health center ‘movement’ received a shot in the arm after World War I. Fresh off its experiences on the battlefield, the American Red Cross took on a leadership role in establishing and staffing health centers in cities across the country. Unlike dispensaries, these health centers were conceived as full-service institutions bringing together a wide range of social and health functions and agencies under a single roof. By 1920, a study counted 72 health centers spread across 49 communities, including seven cities that hosted more than one such center. Of these 72, around half were operated under public auspices, with voluntary organizations taking the lead in the other half. The Red Cross claimed involvement in 19 of these facilities. The rise and spread of neighborhood health centers was encouraged by the federal government, which began issuing grants toward their development under the Sheppard–Towner Act of 1921, a measure focused on infant and maternal health [4].

Over the next decade, health centers continued to spread across the American urban landscape. The New York City Department of Health opened two centers as demonstrations, one in East Harlem and the other in the Bellevue-Yorkville section of Manhattan. The East Harlem center was singled out by Charles Wilinsky, sometime health commissioner of the City of Boston and a leading proponent of neighborhood health centers (he extolled them as the ‘department store’ of health care), as a prime example of the form [5]. Hermann Biggs, who had been appointed New York state Commissioner of Health in 1914, was favorably impressed with this model of health delivery, proposing its extension to counties and jurisdictions across the Empire State in the 1920s [3]. His proposal met with stiff resistance in a state legislature swayed by arguments from the New York state medical society, which claimed it amounted to ‘state medicine’ [6].

By 1926, there were eleven neighborhood health centers in the City of St. Louis alone. A 1930 federal report on health centers, taking an exceedingly broad definition of the term, counted over 1500 such facilities across the country, of which 80% had a post-1910 vintage. In some ways, this represented the high-water mark of this first community health center movement. As previously insular immigrant communities dispersed, their members becoming better educated and assimilated into American society, the glaring need for coordination of services at the local level (arguably) became less urgent, particularly as these new(er) Americans turned increasingly to private medical care. The previously dizzying array of bodies, public and private, responsible for social welfare also became more centralized over time, further obviating the need for intensive local coordination [3]. The growing focus on individual medical care at the expense of a community focus, and the same professional opposition that had doomed the dispensaries, likely also played a role in the decline of these first community health centers. Their situation was hardly helped by the end of Sheppard–Towner grants in 1929 [4].

Community Health Centers: The Second Wave

Community-minded medical leaders met with resistance from other members of the medical profession as they set about founding first dispensaries and then neighborhood health centers. Nonetheless, the idea of providing a range of health and social welfare services under one roof, educating members of the community, and offering meaningful local representation and (in some cases) political organizing did not die. A confluence of political circumstances and actors came together to engineer the second coming of the local health center during the mid-1960s.

By the 1960s, restrictive immigration laws had largely shut off the flow of southern and eastern European immigrants who had previously flocked to urban neighborhoods. In their place, the Great Migration of the (roughly) 1910s–1940s brought millions of African-Americans from impoverished areas of the South to cities across the country, especially those in the North and Midwest. Fleeing, in many cases, conditions of abject poverty, these migrants sought to improve their economic situation while escaping the worst of the Jim Crow regime that had institutionalized racism in the decades after the abolition of slavery. Though a robust African-American middle class would arise with time, deeply ingrained racism, so-called White Flight from changing urban neighborhoods and the resultant cratering of the local tax base, and a succession of discriminatory policies against the African-American urban poor often led to entrenched poverty, the decline of central city services, and poor health outcomes [7]. Conditions became ripe for the coordination of basic health and social welfare measures, paired with education and political empowerment, that a neighborhood (or community) health center could offer.

The first wave of programs under the larger banners of the War on Poverty and the Great Society worked to empower a generation of local activists, who needed an ‘acceptable’ outlet for their growing ardor. At the same time, the 1965 passage of Medicare and Medicaid somewhat neutralized opposition to government action to ameliorate the health circumstances of needy groups [8]. The ground was thus set for the establishment of a new wave of community health centers, initially under the auspices of the Great Society’s Office of Economic Opportunity (OEO). With the support of Edward Kennedy, the powerful senator from Massachusetts, the Economic Opportunity Act was amended to allow for the founding of community health centers, which were given an initial appropriation of $51 million in the spring of 1967 [9].

Community health centers are located in cities across the country, as well as in many rural areas. They are charged with providing health services to anyone who enters, regardless of ability to pay, and, significantly, may not inquire as to a patient’s immigration status. Over the last 50 years, they have become a crucial part of the urban health safety net, among the only facilities offering consistent, often high-quality care to the most vulnerable populations, including the homeless and undocumented immigrants. This second wave of community health centers has proved to have wider reach and greater longevity than the first: while in 1971 there were around 150 such centers, the number had risen to one thousand by 2002 [10]. Under the Affordable Care Act, a Community Health Center trust fund of $11 billion (on top of a $2 billion boost under the so-called Stimulus bill in 2009) was established, in recognition of the role such facilities were to play in providing care to the poor and to the many newly insured under the Act [11].

As of 2016, around 24 million Americans received care through community health centers and those facilities considered functional ‘clones.’ Their patient profile differs considerably from those of other provider types (such as hospital outpatient departments and private physicians). Whereas about 60% of patients seeking care from private physicians carried commercial insurance, the corresponding figure for community health centers was 16%. And while five to seven percent of patients receiving care from hospital outpatient departments and private physicians were uninsured, a full 28% of community health center patients lacked insurance, with another 47% on Medicaid [12].

Community health centers remain an important part of the health safety net for many urban (and rural) residents. Along with the other elements of the urban health infrastructure, they have witnessed the continuing shift away from communicable diseases with high rates of mortality toward chronic diseases, which by their nature yield fewer immediate deaths but can in certain cases significantly curtail quality of life among those affected.