Cultivating Sensitivity

Imagine that you are trying to build a good society, one characterized by social harmony, stability, civility, mutual understanding, and fairness. Would individual freedom and autonomy be part of that picture? Or might they create the very conflicts and tensions the good society was designed to resolve? What would you be willing to give up for the utopian vision? And what are your chances of attaining it once you have made those concessions?

Today, in the academic world and outside of it, a vocabulary has arisen that, in the most positive reading, reflects the age-old desire for a better world. Universities, in many ways self-enclosed entities despite their dependence on federal, state, and/or private largesse, routinely proclaim their commitments to terms that are largely interchangeable, each defined by reference to ideologically related terms: multiculturalism, cultural competence, diversity, equity, and inclusion. These terms all rapidly shade off into moralistic claims for, most broadly and vaguely of all, “social justice.”

The result is that the modern university, ostensibly devoted to free intellectual inquiry in service to the pursuit of knowledge, has become the location most vulnerable to the pressures and consequences of sweeping utopian aspirations. This is particularly pernicious in the context of higher education, where time is short and each new shift in focus necessarily occurs at the expense of something else. In fields such as applied psychology, teacher training, social work, and similar programs, social justice ideology has, in recent decades, come to play a powerful role, increasingly embracing a narrow and highly problematic perspective that in fundamental ways works against the very notion of creating autonomous individuals capable of determining their own political and moral commitments.

The institutional politicization of education might be somewhat acceptable in an academic setting presenting a spectrum of views on important political and social issues. But this is rarely the case. Instead, left-liberal ideas, often derived from a threadbare Marxism, are routinely promoted as the only legitimate perspectives. This has created the phenomenon referred to as “political correctness,” a term whose use was originally inflected with irony but that has become a straitjacket, inhibiting dissent and even overtly attacking the very notion of free speech. Such a subjugation of educational aims to ideological ones has brought American universities perilously close to destroying the very values that made them famous all over the world.

In conjunction with the reality of shorter semesters and lowered expectations for student performance, the new politics of education carries significant consequences: precious time and resources are diverted from education proper, which is instead explicitly subordinated to the political passions of the moment. Entire programs these days proudly proclaim their social justice agenda and require conformity on the part of students and faculty. The constantly repeated retort to critics of such an agenda is that “politics is already everywhere” (a perspective popularized by Foucault) and all that matters is who has the power to impose their own interests by making them pass for truth (Gitlin, 2005, p. 403), a view that effectively characterizes totalitarian regimes. It is something new, however, to see professors in liberal western democracies assert that they are merely doing openly what (so they claim) has always been done surreptitiously – politicizing the classroom – and being proud of it. Such statements efface the crucial distinction between the schooling mandated by societies in which debate is stifled and that promoted by liberal arts universities in western democracies, where education rather than indoctrination is or should be the goal.

The role of the university – it cannot be repeated often enough – ought to be to teach students how to think, not what to think. Universities fail to fulfill their function if, while opening their doors to more and more people and groups in search of higher education, they simultaneously close them to the exploration of competing ideas and opt instead for the inculcation of currently favored dogma. This has taken the form of prohibiting certain speech (through speech codes and so-called harassment policies) but also, and increasingly, compelling approved speech. In other words, avoiding the supposed harm of hurtful statements and dissenting views is but one pole of the new bureaucracies; the other is forcing expressions of obeisance to particular ideas. Both are regularly found in the entire “social justice” agenda.

As Jonah Goldberg observes in his book The Tyranny of Clichés (2013), “One simply cannot be in the do-goodery business without making reference to the fact that you’re fighting for social justice.” The result, he states, is that the term “social justice” is now used as merely “a placeholder for goodness” (pp. 132–34). But how, one may well ask, can anyone be opposed to social justice? Goldberg explains:

The fundamental problem with social justice is that there are no limiting principles to it. It is an open-ended license for the forces of goodness to do what they think is right forever. It is an empowering principle for the high moral ground in all political debates. There are no boundaries, no internal checks, no definitional roadblocks. It’s social justice for as far as the eye can see. (p. 144, his emphasis)

For their part, universities have been actively engaged in promoting such an agenda for years, under a variety of names. In 2002, the National Council for Accreditation of Teacher Education (NCATE) made the term “dispositions” part of the accreditation process of every teacher. Nor was this a vague suggestion. KC Johnson, an accomplished scholar and teacher, was initially denied tenure at Brooklyn College on the pretext of being “uncollegial” because of his criticisms of current campus orthodoxies, in particular the weeding out of teaching candidates for not having the right “disposition,” meaning commitment to “social justice” (Johnson, 2005; Editor, 2016, March 22).

Despite criticisms of this trend from people of varying political persuasions, the demand that students and faculty manifest specific political commitments has only increased, matched by a corresponding decrease in the actual intellectual and ideological diversity of higher education. Programs in “social justice education,” for example, are not shy about declaring their convictions and insisting applicants share them. At the University of Massachusetts Amherst (2009), students wishing to enter the Master’s and Doctoral programs in social justice education must demonstrate their “applied social justice experience.” The all-encompassing goal of these programs is plainly laid out: “Students in social justice education study the inequities that people experience on the basis of their social group memberships, through systems of constraint and advantage reproduced through the social processes of exploitation, marginalization, powerlessness, cultural imperialism, and violence.”

Thus does a contentious interpretation of what causes social inequality get recast as truth, to be communicated to students as unquestionable fact. From there, social justice activism spreads throughout the country as graduates take their training into teaching, school counseling, special education, research, and school administration – positions from which they will train future generations in the same narrow perspective.

New faculty appointments are being defined in similar terms: A recent listing for a tenure-track position in “Diversity and Community Studies” at Western Kentucky University (2016) states: “While disciplinary training and area(s) of research are open, candidates should have experience with community-based participatory action research and social justice scholarship.” Candidates must, furthermore, demonstrate “commitment to transformative pedagogies and social justice approaches.” Contributions to the goal of “diversity” are also crucial. At Texas Woman’s University (2016), the Department of Multicultural Women’s and Gender Studies, which offers a PhD, MA, Graduate Certificate, and undergraduate minor, is seeking a chair to lead its “exciting multicultural curriculum that integrates diverse perspectives and critically applies feminist/womanist scholarship on behalf of social justice.”

Nor are such commitments demanded only of faculty entering a few particular fields. At the University of California, San Diego (2016), a “knowledge of diversity, equity, and inclusion” (DEI) has been required of all candidates for a Bachelor’s degree as of 2011. Furthermore, elaborate institutionalization of DEI throughout the university in all its aspects is spelled out, including through evaluation of faculty members’ “contributions to diversity.” The University of Cincinnati, inspired by UC San Diego, announced that starting on July 1, 2016, a “diversity and inclusion” statement would be required of applicants for faculty and staff positions (Reilly, 2016). As the university’s senior associate vice president and chief human resources officer sanctimoniously and repetitiously explained: “This application request recognizes that the university is a diverse environment and signals that diversity and inclusion are important enough that we’re asking applicants about contributions or potential contributions up front. We’re all better off with diversity in our lives, [as] part of demonstrating our commitment to diversity and inclusion and setting expectations and priorities.” Using more up-to-date language, Cincinnati’s Office of Diversity and Inclusion at the College of Medicine stresses that the College has “enhanced our curriculum to incorporate integrated cultural competence activities” and that “cultural competence is infused into the curriculum.” As if this were not sufficient, it also requires a “class oath” to diversity (University of Cincinnati College of Medicine, 2015).

For those who remember the loyalty oaths of the 1950s imposed by some states on public school teachers, who in many cases were also required to affirm that they were not members of the communist party, the current demands for ideological conformity should be disquieting, not to mention a clear violation of First Amendment rights. In a democratic society, citizens participate in civic affairs without being forced to declare their politics any more than their religious beliefs. Open debate – and dispute – about such issues is crucial, as is the freedom to express unpopular opinions without fear of punishment. Although education, even if intended primarily to transmit skills and information, is not entirely value-neutral, that is a far cry from setting education up as a system for the indoctrination of young people into the unquestioned and unquestionable beliefs of the moment. Free inquiry evaporates in a climate in which the “right” ideas cannot be challenged and, instead, must be adhered to as articles of faith transformed into a condition of entering public schools and civic life.

The spread of declarations of commitment to “diversity” throughout educational institutions in the United States has been an extraordinary phenomenon. “In the marketplace of political culture,” as Randall Kennedy notes, very few terms have acquired influence as quickly as “diversity” – and that influence continues to grow (cited in Baehr & Gordon, 2017).

Interestingly, all these terms tend to be defined by reference to the other like-minded terms. Thus, after decades of usage of “multiculturalism” in many contexts, the term was recently defined by Kenan Malik (2015) as “the embrace of an inclusive, diverse society.” He goes on to note: “Multiculturalism has become a proxy for other social and political issues: immigration, identity, political disenchantment, working-class decline.” In other words, like Goldberg’s definition of “social justice,” multiculturalism stands in for “goodness,” that is, manifesting the correct attitudes about a whole range of issues.

Within the “social justice” agenda, the vocabulary of “cultural competence” is a fairly recent addition, not yet as widespread as the far more familiar “multiculturalism” and “diversity.” Nonetheless, a Google search in February 2018 of the terms “cultural competence” plus “UMass Amherst” – the university from which I recently retired – turned up nearly 20,000 links. Like the other current touchstones, however, “cultural competence” is an accordion phrase, capable of expansion as needed.

But there are important distinctions to be drawn between it and the earlier terms. Though they all reflect a similar ideology, now overtly proclaimed in academe, “cultural competence” has a certain cachet lacking in mere “diversity,” “inclusion,” and “multiculturalism,” for it implies an effort to acquire and act on professional expertise and knowledge, ostensibly to meet the legitimate needs of certain clients. Ironically, in the age of Foucault (1977), for whom knowledge is inseparable from power (hence his coinage “knowledge/power”) – diffuse, ubiquitous, and conceived above all as the negative power to coerce, oppress, and control – the very concept of competence (akin to that of “expertise”) can be and often has been used to delegitimize those whose access to knowledge necessarily makes them guilty of having “privilege” of one sort or another.

If “cultural competence” today nonetheless carries a positive connotation (a good to be acquired), “privilege” – whether material or not – implies the reverse. It is one of the nefarious sins that must constantly be sought out and identified, taking its place alongside “microaggressions” and other dangers purportedly besetting academe. These dangers seem to have multiplied in recent decades, so that students now require “trigger warnings” and “safe spaces,” in and out of the classroom, to guarantee the comfort of those unable to live without perpetual vigilance over their daily lives. Only in this manner can they spot the interminable affronts that need to be denounced and rooted out, so as to usher in the better society hovering just out of sight.

Cultural competence, then, in addition to the uncomfortable whiff of privilege the term ironically exudes, cannot escape the shadow of its opposite: cultural incompetence, not something anyone would wish to display. In much the same way, “multiculturalism” evokes its nemesis, the specter of uniculturalism, as if, without endless propagandizing, this is what American society would indulge in. In the academic world and beyond it, most people know which side of these rhetorical divides they must be on. So far-reaching is this nomenclature that merely to question the concepts or point out their slipperiness is already to have committed grievous social and ideological sins that invite name-calling, ostracism, and possibly job loss.

Even university presidents and other high-level administrators have discovered this in recent years (e.g., Lawrence Summers, former president of Harvard; Tim Wolfe, former president of the University of Missouri; or Mary Spellman, former dean of students at Claremont McKenna College). Forced resignations or summary dismissals usually follow abject apologies worthy of Soviet reeducation camps or Maoist “self-criticism” sessions, designed as rituals of public humiliation and object lessons to others who might consider straying from the official path of righteousness. In business, the media, and virtually all other arenas, the same consequences may occur when a comment deemed offensive to one or another protected group or cherished belief is uttered.

Leaving aside the thorny question of just how “culture” is to be defined, however, there is nothing to prevent the recognition that it is socially useful for people to understand basic rules of conduct that may prevail not only within their own but also in other groups. That is why entrepreneurs and politicians, for example, seek instruction from the scores of cultural competency trainers and experts easily found online, rather than risking gaffes that will scuttle the very things they are trying to accomplish in their national and international endeavors. Indeed, an Internet search (February 2018) readily produced over half a million links and resources dealing with cultural competence training, experts, curricula, and consultants and hundreds of thousands of links to articles, textbooks, manuals, and guides on the subject of acquiring or building cultural competence.

If all of us obviously benefit from having some degree of “competence” in dealing with people of other cultures, this has always been so, and humanity has nonetheless managed to survive without special training by professionals in this arena. As Denis Dutton (1995) observes, “Multiculturalism is, after all, something that simply happens whenever cultures live with each other, a fact continuous through recorded history.” The novelty, he goes on to say, is that “in European and American society today, the term has come to denote not a social given, but a political imperative: multiculturalism is ideology.” And in its name, protests, grievances, demands, and accusations currently flourish.

Through a series of pointed contrasts, Craig Frisby demonstrates why “multiculturalism research” is an oxymoron, “a sociopolitical ideology, not a science” (2013, his emphasis). He also provides a useful chart to help clarify the features distinguishing sociopolitical ideologies from objective empiricism (2013, Table 10.3). The goal of empiricism, he explains, is to discover objective truth, whereas the goal of multiculturalism is simply to advance multiculturalism, and to do this it relies on the “claim that cultural groups determine their own versions of reality” (p. 518). Such a postmodern view is then used to bolster current orthodoxies that allow no challenges whenever a minority group member claims to have been the victim of, say, racism or sexism – the mere assertion of which is taken as truth. Similarly, research, however sound, that dares to question multiculturalism’s core beliefs is regularly condemned, along with its authors. The ensuing problem is that when truth and falsity are adjudicated by who speaks and who feels aggrieved, not by the relationship such charges bear to empirical evidence, we’re in free fall and can expect that those who yell the loudest and claim the greatest oppression will rule. “Gotcha” becomes the predominant game; free speech withers and intergroup relations deteriorate – not a pretty picture and certainly not the way a democracy is supposed to function.

In practice, however, only some identities are credited, and these, unsurprisingly, turn out to be associated with oppression, as if today’s activists believe in what Bertrand Russell (1950) ironically referred to as the “superior virtue of the oppressed.” Russell followed through on the logic of this delusion, arguing that if the left really believed it, they should want to promote the conditions (such as poverty) that produce this oppression. But logic is not the point. Rather, what carries the day is a moral ranking of identities, a mere reversal of the traditional ranking that placed the wealthy and powerful at the top of the heap. Today, such identities, while secretly envied and emulated, are taken as a negative, as is the mere fact of being white and male, hence the wholesale disparagement of “white male culture” and the now commonplace emphasis on “toxic masculinity.” This is reflected as well in the very different curriculum found in Men’s Studies programs with their emphasis on guilt and shame (Raphael, 2016), as opposed to Women’s Studies, which are devoted to celebrating women and denouncing “heteronormativity.” If to Foucault knowledge is power, to the politically correct, an identity involving oppression is advertised in an effort to acquire power and position.

Thus, in jarring contradiction to the project of overcoming ethnocentrism by embracing multiculturalism and cultural competence, we are instead fostering a society fragmented into identity groups loudly proclaiming their oppressions, real or imaginary, and all insisting on redress.

The notion of identity politics is thought to have developed as Marxism, with its focus on the working class and call for economic justice, fell out of favor. Though at times used by social scientists in a neutral way to describe how social movements “alter the self-conceptions and societal conceptions of their participants” (Anspach, 1979, p. 765), the term quickly came to characterize race and gender struggles that were rooted in individual identity but could be leveraged into a political weapon. As its use spread, it proved of great political utility for any group attempting to press its case and browbeat others into submission, allegedly as compensation for claims of past or present oppression. While it is still used most commonly in relation to race, gender, class, and sexuality, it morphs and expands as groups break down into more and more specific delineations of identity, always insisting on recognition that the more oppressed identities an individual can claim, the better.

The use of identitarian labels, it turns out, does not lead to an appreciation of cultural difference. Instead, the potential celebration of other cultures veers into a solipsistic preoccupation with one’s own suffering, taken as a point of pride and always attributed to a devalued group identity, while actual diversity of perspectives and political positions is discouraged if not disallowed. As Peter Wood noted in his book Diversity: The Invention of a Concept, the most familiar form of diversity ideology today:

asserts that American society is a hierarchy in which whites oppress other groups, and that individuals participate in the perpetuation of this hierarchy by harboring hurtful stereotypes about the members of the oppressed groups. The word “diversity” in this context refers to the set of beliefs that liberates the individual from his attachment to these stereotypes by allowing him to see the worthiness of the oppressed groups. (2003, p. 93)

With such persistent emphasis on the suffering of minority groups at the hands of the majority, it’s no wonder, as Kenneth Minogue (2010, p. 89) wryly notes, that the number of oppressed people in the west today by some counts “greatly exceeds the entire population of Western states.” Far from being oppressed by oppression, then, the very claim to being oppressed turns out to be an effective bludgeon. Thus the “oppression sweepstakes,” as Noretta Koertge and I labeled the unseemly competition for most oppressed status, seem destined to continue unabated (Patai & Koertge, 2003).

The impressive growth in claims of oppression and discrimination may be further explained by the fashion today of insisting that many people suffer from multiple oppressions, a view that has been a cornerstone of Women’s Studies programs for some time. Here too, however, the terminology has changed. What feminists used to call an “integrated analysis” (of race, class, sexuality, ethnicity, ability, etc.) has been replaced by an “intersectional” analysis, lately spreading well beyond the academy. This verbal magic has the advantage of being able to incorporate the ever-expanding categories of oppression, which have been transformed into a viable political currency, while suggesting that a sophisticated analysis is at hand, instead of the rather simplistic view of identities that is in fact being promoted. It is also likely to keep a lid on competition among the oppressed or at least to reshape it.

The belief in “the superior virtue of the oppressed,” of which Bertrand Russell wrote in 1950, developed so rapidly thereafter that by 1985 Joseph Epstein could excoriate the rising tide of “virtucrats” – those “empowered by the unfaltering sense of their own virtue,” who identify with the oppressed and thereby gain a sense of moral superiority (the only kind of hierarchy apparently still allowed). As an example, Epstein mentions a young reporter who grilled him about a supposedly antigay comment he had made 15 years earlier. She was, he says, “truly flying on virtumatic” and “ablaze now with her own goodness.” But even if “you can’t make a convincing case for being virtuous on your own, perhaps you can climb aboard one or another wagon of group virtue.” As further inducement, many virtucrats “do a brisk business in awards, honors and other riches of this earth.”

Nor is it difficult to spot a virtucrat, notes Epstein (1985):

Whatever he may ostensibly be saying, what he is really saying is, “I’m fundamentally a damn fine person.” Understanding this is a great aid in the contemporary world. It helps one to understand why so many people espouse opinions that they don’t finally believe in; merely enunciating those opinions – opinions held to be congruous with goodness – makes them feel good.

Fast-forward another 30-some years and we find both individuals and organizations heavily invested in what critics of these trends now call “virtue signaling.” Such signaling, it seems, cannot be left to individual initiative; institutions must demonstrate it as well. The University of Michigan, for example, actually monitors students’ “cultural sensitivity levels” as part of an $85 million diversity initiative. Using an “Intercultural Development Inventory,” the university will henceforth assess students’ “ability to shift cultural perspective and appropriately adapt behavior to cultural differences and commonalities” (University of Michigan, 2016). After exposure to individualized learning plans and training opportunities, students will be retested to judge whether they have “improved.” Needless to say, these misguided efforts must rest on gross stereotypes accompanied by ideological fervor. They will, predictably enough, produce verbal conformity as students strive to avoid being judged racist and hence inadequate in their “cultural sensitivity levels” (Soave, 2016, October 10).

Such diligence is nothing new. Alan C. Kors wrote about the problem of imposed diversity training – already well established by 2000 – in an essay entitled “Thought Reform 101: The Orwellian Implications of Today’s College Orientation” (Kors, 2000). What is perhaps new is the complete normalization of ideological policing at all levels and the explosion of administrators tasked with imposing an unquestioned orthodoxy on all aspects of the university. Kors’ warning still rings true today – if it is not already too late:

Thought reform is making its way inexorably to a college near you. If we let it occur at our universities and accept it passively in our own domains, then a people who defeated totalitarians abroad will surrender their dignity, privacy, and conscience to the totalitarians within.

Overcoming Incompetence

In today’s complex world, in which basic human skills appear to be in short supply, it is hardly surprising that even educated adults are thought to need special training in dealing with Others. At the beginning of their book Building Cultural Competence: Innovative Activities and Models, editors K. Berardo and D. K. Deardorff (2012) identify themselves as “seasoned intercultural trainers.” Arguing that “facilitators and consultants” are required so that “key intercultural skills” become widespread, they laud their book, one of the many in this burgeoning area, as replete with “brand-new exercises, updates on classic models… – and dare we say improvements – of more conventional activities and exercises that can help build the intercultural competence of the people we work with.”

There is considerable irony in this situation, for while each individual, separately and presumably through effort, may aspire to acquire cultural competence, the object of that competence by definition is regarded not as an individual but always as a member of a group. Despite warnings from scholars such as W. O’Donohue and L. Benuto, who write of “the ultimately harmful effects of treating some clients as individuals and some as members of groups” (2010, p. 37), it is now sanctioned practice to subsume the encounter with a unique person into generalized beliefs about a group, all in the name of cultural competence. Unlike terms such as “diversity” or “multiculturalism,” however, which purportedly describe states of affairs or attitudes, cultural competence presupposes a subject: someone who has it or wishes to acquire it, not merely someone who endorses it. And it is to be unleashed upon an Other, for all these terms express a profound conviction of the Otherness of others. Craig Frisby has incisively labeled this practice “Group Identity Doctrine” (2013, p. 555).

Identity, however, is no simple matter. Even at a time when biology is dismissed as a social construct and choosing one’s identity is seen by many as a right, not all such choices are considered legitimate and given support by social justice warriors. At the moment, for example, it is something of an orthodoxy (supported by the US government) that individuals, regardless of genetics and genitalia, can choose to “identify” as male or female and gain pertinent new rights corresponding to their proclaimed gender. The same generosity is not available to those (such as Rachel Dolezal and Ward Churchill) who take on a racial or ethnic identity that does not correspond to “facts” – those intractable things that nonetheless persist – and who eventually find themselves denounced for perpetrating a deception on the public. Self-definition suddenly ceases to count as accusations of fraud abound.

Recently, the term “cultural competence” has glided seamlessly from professional training manuals into the mouths even of protesting undergraduates. In late 2015, students at Amherst College provided a living example of its current usage. To express their solidarity with oppressed Blacks worldwide, they presented 11 demands to the college. Demand number five (cited by C. Friedersdorf, 2015) was particularly interesting for its adoption of the requisite terminology and its vision of appropriate remedies:

[Amherst College] President Martin must issue a statement to the Amherst College community at large that states we do not tolerate the actions of student(s) who posted the “All Lives Matter” posters, and the “Free Speech” posters “in memoriam of the true victim of the Missouri Protests: Free Speech.” Also let the student body know that it was racially insensitive to the students of color on our college campus and beyond who are victim to racial harassment and death threats; alert them that Student Affairs may require them to go through the Disciplinary Process if a formal complaint is filed, and that they will be required to attend extensive training for racial and cultural competency.

In a lengthy article for The Atlantic on this episode, Friedersdorf writes: “Protestors were trying to punish counter-protests with an extensive, compulsory racial-reeducation program.” He then quips: “Perhaps the curriculum could be issued in a little red book.” He also takes note of the childlike tone of the demands: Protestors insisted on statements of apology from various officials but also that students be excused from classes. Even if “vestiges of institutional racism” do exist on college campuses, he concludes, “pursuing remedies should no more be left to the whims of 18-year-old social-justice ideologues than Wall Street reforms should have been left to Occupy Wall Street.”

Not every use of the original slogan “Black Lives Matter” is considered acceptable, nor is everyone entitled to employ it. At Gettysburg College, for example, students protested an antiabortion poster put up by a conservative group using that very slogan to call attention to the alarming statistics on abortion among Black women – and were censured for making an ideologically incorrect use of the phrase. The college’s Chief Diversity Officer explicitly condemned the antiabortion students’ tactics. “Posters that were hung last week in numerous locations on campus singled out African-American women in an effort to promote pro-life positions,” she wrote. “These posters also made misleading use of ‘Black Lives Matter’.” She therefore unleashed a long-planned “bias-response team,” which will supposedly organize educational opportunities rather than restrict free speech, we are assured (Wexler, 2016). Nor, of course, is the slogan heard in reference to the disproportionately high rates of murders and rapes of Blacks by other Blacks in the United States.

Popular slogans can thus land people in trouble if used for promoting non-approved views. At DePaul University, a private Catholic school, President Rev. Dennis H. Holtschneider in October 2016 ordered a campus Republican group to redesign its posters containing the apparently unacceptable slogan “Unborn Lives Matter,” which he viewed as a provocation (Jaschik, 2016; Volokh, 2016). This came too late, however, to save his position. Student demands for abject capitulation from administrators had already prevailed after an incident in May 2016, when DePaul students disrupted a talk (sponsored by the College Republicans) by the notorious Milo Yiannopoulos. Despite President Holtschneider’s expressions of sympathy with the protesters, calls for his resignation quickly multiplied, and, a few weeks later, he did indeed announce his resignation as of the end of the 2016–2017 academic year (Woelfel, 2016). Today he is Executive Vice President and CEO of Ascension, the world’s largest Catholic healthcare system.

George Orwell, whose portrait of a totally repressive society in Nineteen Eighty-Four was based largely on Soviet Communism, saw that the manipulations of language – whether through the suppression and rewriting of history, the existence of unrelenting surveillance and censorship, or the imposition of Newspeak, a language with a perpetually shrinking vocabulary – were all ultimately intended to control not merely opinion and behavior but the very ability to think for oneself. Though not that long ago it might have seemed implausible even to mention Orwell in the context of American universities, Alan C. Kors and Harvey A. Silverglate extensively documented, in their groundbreaking book The Shadow University: The Betrayal of Liberty on America’s Campuses (1998), how far we had already traveled down the road of censoring speech and restricting the free exchange of ideas. After The Shadow University appeared, the numerous additional cases Kors and Silverglate heard about regarding campus attempts to control speech led them to establish the nonpartisan not-for-profit Foundation for Individual Rights in Education (FIRE) in 1999.

Initially, Kors and Silverglate not only hoped but also believed that FIRE would become obsolete rather quickly. This turned out to be far too optimistic an expectation, as the categories of alleged infractions, the procedures and punishments imposed in shocking absence of due process, and the university offices created to deal with the new campus objectives have all steadily grown in the intervening years. Despite FIRE’s substantial success in defending free speech, the organization must proceed on a case-by-case basis, preferring suasion to legal action. But many universities have demonstrated remarkable recalcitrance in sticking to their supposedly high-minded goals.

In this changed climate, the practice of restricting speech has taken on additional routinized forms. Students, and faculty members as well, successfully force disinvitations or, less frequently, “voluntary” withdrawals of lecturers and even commencement speakers whose ideas are unacceptable on today’s diversity-obsessed campuses (see FIRE’s Disinvitation Report 2014: A Disturbing 15-Year Trend), as has happened to Ayaan Hirsi Ali, Robert J. Birgeneau, Arthur Jensen, Condoleezza Rice, George Will, and many others. In the name of needing safety and protection from disturbing ideas and those who endorse them, students also regularly shout down speakers with whom they disagree, disrupting and even completely impeding events they don’t approve of.

“Inclusion” – in other words – never extends to views unpopular in the modern university. Although it is another ubiquitous demand and goal these days, inclusion is, in fact, rather at odds with diversity and multiculturalism. The former tends to be centripetal and the latter two centrifugal. Consider the fate of Yugoslavia, merely one dramatic case among many, where for decades under Tito’s totalitarian rule, inclusion in the nation meant suppression of ethnic and religious differences. Once that regime began to crumble, the social fissures underlying the enforced unity emerged, leading to years of brutal war and the eventual dissolution of the nation into its diverse ethnic and religious groups.

The tribalism of many societies, with their endless strife and violence, evidently offers no lesson, and instead of defending hard-won liberal democratic values such as individual freedom and a secular state not subservient to religion or political ideology, members of the most egalitarian nations in the world seem eager to relinquish those values, oblivious to the predictable results. In this way, the commitment to multiculturalism and cultural competence, purported to contribute to “social justice,” instead threatens to drive out critical thinking and common sense as current orthodoxies are enforced.

Kenan Malik (2015) notes a paradox of multiculturalism that also applies to the terms “cultural competence” and “diversity”:

Multicultural policies [and attitudes, he might have added] accept as a given that societies are diverse, yet they implicitly assume that such diversity ends at the edges of minority communities. They seek to institutionalize diversity by putting people into ethnic and cultural boxes…. Such policies, in other words, have helped create the very divisions they were meant to manage.

At a time when most universities have established speech codes, it is worth stressing that one cannot even begin to defend liberal or any other principles without invoking free speech, the key ingredient in being able to argue for anything not already consecrated by prevailing norms. In an environment that curtails speech, whether for the sake of promoting “diversity,” “comfort,” “safety,” “antiracism,” or, say, fascism and one-party rule, it is dangerous, often impossible, to express views that challenge official ideas and beliefs, for the consequences of nonconformity can be swift and unrelenting. This is why the First Amendment, which prevents the government from both proscribing and prescribing speech and ideas, is of such crucial significance in the United States. Yet critics of the First Amendment overlook the reality that approved speech needs no defense, since it routinely meets with assent and endorsement. It is precisely the speech judged to be repugnant and offensive that requires protection, and it is naïve to pretend that one can prohibit certain offensive terms while not affecting the expression of unpopular viewpoints and ideas.

Are most students aware of the dangers of curtailing free speech? Apparently not. Vann Newkirk (2016) recently reported on a survey of 3000 students at 240 colleges. He argued that the “hand-wringing” over the decline of free speech is misguided: “According to this survey, the vast majority of college students, even women and black students, believe campuses should not restrict political views as a matter of policy, even if those views are objectionable to some.” He then glossed over the more disturbing, but not surprising, survey result: “Students tend to draw the line at slurs and ethnically stereotypical costumes, however, with 69 and 63 percent, respectively, believing campuses should have the ability to restrict those kinds of expressions.”

The sight of American students shouting (often using obscenities) for restrictions on others’ speech, enabled and protected by administrators no doubt eager to keep their jobs, is not only disturbing in itself but also disturbingly shortsighted. Reigning orthodoxies undergo rapid change, and these censors could suddenly find themselves with no grounds on which to defend their own right to free speech and the dissenting views that cannot be expressed without it.

Perhaps because of the obstacles faced by direct restrictions on speech/expression, thanks to First Amendment protections (which courts in the United States have had a way of upholding, even if it means allowing the American flag to be burned or trampled), the major vehicle for controlling unpopular speech has for the past few decades been Title IX of the 1972 Education Amendments, designed to prohibit sexual discrimination in federally funded schools and colleges. Through a series of expansions and directives over subsequent years, Title IX gave rise to speech codes that were often subsumed under harassment policies. In clear violation of the First Amendment, these policies routinely target words as well as actions, using the broad concept of “hostile environment harassment,” a convenient catch-all category that serves to allow legally protected speech to be restricted.

Claiming that others’ unkind words and unwelcome jokes constitute discrimination has turned out to be a highly effective strategy by which grievances, conflicts, and resentments play out with maximum damage to the accused. The endless rules and regulations suppressing speech and managing interactions in the name of the sensitivities of “protected groups” have created a campus climate inimical to a society comprising people capable of thinking for themselves and daring to express their views. To encourage civility is not the same thing as to impose censorship, for the list of proscribed terms and ideas (as well as gestures, glances, overheard jokes, and innumerable other potential offenses) is ever-shifting, and in the ideologically charged atmosphere of the academy, everyone is potentially at risk of facing a false or trivial accusation.

An already troubling situation was made considerably worse in April 2011, when, as Robert Shibley, Executive Director of the Foundation for Individual Rights in Education, explains in Twisting Title IX, the US Department of Education’s Office for Civil Rights “issued a letter unlawfully mandating that the standard of proof in campus sexual misconduct cases be set at the lowest possible level: a ‘preponderance of the evidence,’ or a mere 50.01 percent likelihood of guilt” (Shibley, 2016, p. 3). This letter, readily accepted as gospel by almost all its recipients, further eroded the due process rights of the accused and the independence of universities (see Admin., 2011). In September 2017, President Trump’s Secretary of Education Betsy DeVos rescinded the Obama-era directive and attempted to restore due process. Predictably, this move was attacked by many as an effort to tip the scale in favor of rapists (Berenson, 2017).

What has been the administrative response to infractions of the basic rules of intellectual engagement and academic freedom? Too often, rather than defending free speech and assembly, administrations have capitulated to, and even endorsed, these assaults. Driven partly by fear of lawsuits and partly by administrators’ own ideological zeal, they have embraced speech codes while at times creating tiny “free speech zones” in some part of the campus – implying that the entire rest of the university is a restricted-speech area. Universities also impose ever more sensitivity training and those requisite orientation sessions that often amount to little more than thought reform aimed at incoming and existing students. Faculty and administrators who do not automatically toe the line may find themselves denied tenure, facing job loss, or, as noted above, forced into resigning. And this is happening not just with the acquiescence but, more frightening still, with the active support and insistence of many academics – students, staff, faculty, and administrators themselves – in the name of “social justice.”

But it is not only speech and assembly that are under attack. Those who are today called social justice warriors consider statistical disparities of whatever sort between groups to be evidence of injustice and discrimination of one or another type, in need of remediation. In other words, identical outcomes become the only proper measure of a just society. Like all other such charges, the concern with disparate outcomes is applied opportunistically, in accordance with its potential political utility for aggrieved groups.

In his book The Servile Mind, Kenneth Minogue (2010) argues that an erosion of individual moral and ethical awareness has occurred as the intelligentsia panders to fantasies of social perfection and expects government and its agents to provide it. This results in the creation of a servile mind, immersed in constant rhetorical posturing about improving the world, while sacrificing individual moral commitments to “politico-moral” grandstanding. Outward manifestations of one’s good intentions and proper awareness, including shame over one’s own supposed privilege and denunciation of that of others, are routine. Fifteen years earlier, Thomas Sowell (1995) had made similar observations, in his book The Vision of the Anointed: Self-Congratulation as a Basis for Social Policy: “People are never more sincere than when they assume their own moral superiority,” which lends them “a special state of grace.” This allows them to consider those who disagree with them as being not only in error but in sin, a vision that, Sowell argues, has not fundamentally altered in the past 200 years.

Throughout the country, this familiar stance is adopted in the name of promoting social justice. In late 2015, to take just one example from the numerous recent cases, after a professor was (absurdly) accused of racism at the University of Kansas, the provost sent a letter to the campus community previewing an “action plan” to eradicate racism on campus (Vitter, 2015). The plan included the usual “mandatory education, through facilitated sessions, on inclusion and belonging for all students, faculty, staff, and administrators and a plan for accountability.” He then reminded the campus of an annual program called the Tunnel of Oppression, which had been set up by the Office of Multicultural Affairs in 2001. Professors were urged to offer students extra credit for visiting the Tunnel of Oppression (2015), described on a special website as “a tour that will engage students in an immersive experience of scenes where participants will experience, firsthand, different forms of oppression through interactive acting, viewing monologues, and multimedia. Participants directly experience the following scenes of oppression: ability, class, body image, immigration, homophobia, genocide, relationship violence, and race.” But even this “immersive experience” cannot stand on its own, and the website offers further help: “At the completion of the Tunnel experience participants will go through an active processing session where they will discuss the experience and learn how they can rethink their role in creating positive social change.”

Within the classroom itself, another popular response on the part of educators nationwide has for some time involved implementing “anti-oppression pedagogy” – no longer confined merely to a few identity programs or schools of education. Several decades of social justice curricula, diversity, and inclusion have apparently been no more effective than 15 years of Tunnel of Oppression experiences. Equality of opportunity for formerly disadvantaged groups, contrary to what the naïve might imagine, has proven sadly insufficient, which is why “anti-oppression pedagogy,” like the Tunnel of Oppression, aims at our very souls: “Anti-Oppression Pedagogy teaches how to structurally analyze systems of oppression, while contemplative practices cultivate an embodied self-awareness. Mindful anti-oppression pedagogy merges the two to cultivate an embodied social justice. This website offers a collection of contemplative practices that have been used in social justice classrooms” (Berila, n.d.).

At the University of Massachusetts Amherst, new “General Education Diversity Requirements” were proposed to the Faculty Senate in 2016 that would impose on students courses with “learning outcomes” such as “diminish the perpetuation of discrimination and oppression,” and that suggested training sessions for teachers might be necessary (Patai & Silverglate, 2016). In the end a somewhat more moderate form of these requirements was quietly adopted, requiring all entering undergraduates to take, in their first year, one newly designed “diversity” course, with another to be taken later. Existing courses that fulfill this new diversity requirement must be revised according to new guidelines and submitted for approval. Among the six “learning outcomes,” one focuses on understanding of “diverse social, cultural, and political perspectives,” another on “critical awareness of how individual perspectives and biases influence ways of seeing the world,” and a third on “knowledge of structural and cultural forces that shape or have shaped discrimination based on factors such as race, ethnicity, language, religion, class, ability, nationality, sexuality, or gender” (University of Massachusetts, 2017).

In keeping with the university’s anxiety about these issues, over the past few years, the Chancellor’s office has constantly sent out memos about new initiatives and administrative offices devoted to diversity, equity, and inclusion, which, as elsewhere in the nation, consume ever more resources within higher education. In The Shadow University, Kors and Silverglate (1998) traced the proliferation of bureaucracy in higher education to the mid-1980s, with the arrival of racial, religious, and sexual minority groups in significant numbers. “Fearful that such diverse students would all but kill one another without the benign supervision of student-life administrators, colleges began to ramp up their staffs. Soon every dean of student life had several deputy deans, and each deputy had assistant deans.” This problem was compounded as increasing government regulations were imposed in exchange for infusions of cash into the academy, Silverglate points out (2018). And those infusions were ever more necessary as universities came to be places employing more bureaucrats and their underlings than faculty members. Add the shifting ideological climate, and we arrive at the contemporary university, on the one hand, subject to corporatization and bureaucratic bloat, on the other, in thrall to a predominantly leftist politics that treats the university as a locus for political indoctrination.

No aspect of university life is, these days, immune to this agenda. Getting down to the nitty-gritty, universities may encourage interviewers to interrogate candidates for faculty and staff positions about their attitudes toward the social justice agenda. Drawing on old feminist documents intended to promote women in the academy, the Equal Opportunity and Diversity Office at the University of Massachusetts Amherst, for example, provides search committees with lists of suggested questions designed to ferret out the depth of job applicants’ commitment to “diversity.” Allowing space in parentheses for filling in the name of the protected group of one’s choice, typical questions include: “How have you demonstrated your commitment to (   ) issues in your current position?” “In your current position, have you ever seen a (    ) treated unfairly? How would/did you handle it?” (Patai, 2016b, May 30).

All these efforts to ensure ideological conformity include nary a hint of concern with intellectual development and education unfettered by political dogma. Instead, what dominates academic rhetoric appears to be the unrelenting desire to set the world aright according to the current lights of campus luminaries obsessed with a single focus. Since administrators use the term social justice (without bothering to define it), they must believe they know not merely what it is and how it can be achieved but also who are the primary victims of our presumptively unjust society and how they can be redeemed. And underneath all these concerns lurks the conviction – whether opportunistic, heartfelt, or both – that group identity is either a privilege or a scourge, a cudgel or a shield, never just a feature of a complex human reality. The currently popular language of “intersectionality” offers no help, for it merely multiplies categories of oppression and incites competition among the aggrieved.

But not all identities have equal value, nor is “competence” required to deal with all cultures. While resting on cultural relativism, as does multiculturalism, cultural competence is far from an equal opportunity perspective. It is rarely if ever used to understand (or even imply tolerance of), say, Christian fundamentalists or political conservatives. Only certain identities, apparently, are to be respected and understood (rather than vilified and dismissed): those of “protected groups,” those who can claim they suffer or have suffered from discrimination, or those thought to be unsullied by the supposedly sordid history of the west generally and America in particular. We strive to enhance multiculturalism and diversity, and we do so also by acquiring competence in dealing with particular cultures – but not unless they can successfully assert a claim to past or present oppression and marginalization while displaying the requisite politics to undergird their grievances. This is why high-achieving minority groups (such as Asians) cease to count as “minorities” and in fact may find themselves the objects of quota systems intended to counterbalance their accomplishments vis-à-vis other groups.

Thus does the demand for “social justice” undermine merit-based rewards and achievement. This, however, is nothing new. Michael Walzer (1998), in his essay “Multiculturalism and the Politics of Interest,” made the same point 20 years ago: “In multicultural politics it is an advantage to be injured. Every injury, every act of discrimination or disrespect, every heedless, invidious, or malicious word is a kind of political entitlement, if not to reparation then at least to recognition. So one has to cultivate, as it were, a thin skin” (p. 89).

While arguing that it is at times “worth taking offense,” because a thin skin can act as a kind of early warning system for vulnerable groups, Walzer also observes: “[A] permanent state of suspicion that demeaning and malicious things are about to be said or done is self-defeating. And it is probably also self-defeating to imagine that the long-term goal of recognition and respect is best reached directly, by aiming at and insisting on respect itself” (p. 90).

The demand for multiculturalism and diversity, then, rests on constant claims of wrongs committed against vulnerable groups, and this in turn both justifies and necessitates policing of others’ language and attitudes. More concessions, resources, and attention are required, but not in the context of vigorous and free debate as one might expect at a university. The thin skin useful in detecting potentially dangerous circumstances has been transformed into something quite different, as noted by Claire Fox, founder of the Institute of Ideas, a London think tank. She faults the present generation of thin-skinned young people for displaying a “belligerent sense of entitlement. They assume their emotional suffering takes precedence. Express a view they disagree with and you must immediately recant and apologise” (2016).

The usual administrative capitulation to such demands based on identity politics must be gratifying to those some have called “crybullies” (Kimball, 2015), who use their alleged injuries to rail against and control those around them. Kimball’s term is apt also because the supposed spokespeople have rarely been selected by the group they claim to represent. Often, yelling loudly is sufficient to carry the day.

But matters do not end there, for after gaining the necessary legal standing, institutional presence, and resources, minority groups must in the real world coexist with other groups: “The others are necessary, obviously, since they must do the recognizing and respecting, and then they will want to be recognized and respected in turn,” Walzer writes (1998). This, however, is not the perspective that has prevailed in recent years. Instead, we see identity politics played by insisting on abasement and mea culpas from the supposedly powerful (or merely more successful) groups, which are held responsible for real or imagined inequalities, failures, and hurt feelings proudly claimed by allegedly injured identity groups.

One of the most searing indictments of how identity politics functions was offered over two decades ago by Todd Gitlin. In a section of his 1995 book The Twilight of Common Dreams entitled “The Cant of Identity,” Gitlin writes:

The more vociferously a term is trumpeted in public, the more contemptible it is under scrutiny. The automatic recourse to a slogan, as if it were tantamount to a value or an argument, is frequently a measure of the need to suppress a difficulty or vagueness underneath. Cant is the hardening of the aura around a concept. Cant automates thought, substitutes for deeper assessment, creates the illusion of firmness where there are only intricacies, freezes a fluid reality. (2005, p. 400)

Gitlin explains how the cant of identity “underlies identity politics, which proposes to deduce a position, a tradition, a deep truth, or a way of life from a fact of birth, physiognomy, national origin, sex, or physical disability” and laments that “Americans are obsessed today with their racial, ethnic, religious, and sexual identities.” Protesting the effect of such fragmentation on left politics, Gitlin objects that “What is supposed to be universal is, above all, difference.”

By now such views are normalized throughout academe, mouthed by administrators at all levels, whose jobs are often organized around identity politics. This has led to the paradoxical results Gitlin already noticed: “For identity-based movements, the margin is the place to be.” But identity politics never stops, for, “within each margin, there are always more margins to carve out.” Furthermore, from the slogan “the personal is political,” it was an easy glide to “‘only the personal is really political’ – that is, only what I and people like me experience ought to be the object of my interest.” The result is that universalism is abandoned, even disdained, and we get “cultural separatism, emphasizing difference and distinct needs.”

Gitlin suggests that this is one result of the confusion of contemporary life, which has made “firmness of identity” hard to come by. Nor is it only psychological or moral claims that are at stake. “Partly because the state legitimizes labels and allocates resources accordingly, people affirm them.” In such a situation, “shape-shifting” becomes normal, and intolerance of “one’s own confusion generates a frantic search for hard-and-fast identity labels.” But since the market offers a dizzying array of choices, what we get, concludes Gitlin, is “identities lite.”

The obsession with identity has only grown worse in the years since Gitlin made these criticisms. It thrives by ignoring the commonsensical observation of Charles Murray (2005) that “a few minutes of conversation with individuals you meet will tell you much more about them than their group membership does.”Footnote 12 But Murray took note of an interesting implication of identity discourse: “Talking about group differences obligates all of us to renew our commitment to the ideal of equality that Thomas Jefferson had in mind when he wrote as a self-evident truth that all men are created equal.” Murray goes on to quote Steven Pinker’s useful formulation of that ideal in The Blank Slate: “Equality is not the empirical claim that all groups of humans are interchangeable; it is the moral principle that individuals should not be judged or constrained by the average properties of their group.”

Human rights, in other words, reside in all individuals qua individuals, not in groups. That is the important thing that an insistence on multiculturalism and diversity, as these terms are now deployed, threatens. And the threat is intensified when dealing with the “helping” professions, such as psychology, to which clients necessarily appeal as individuals, not as tokens of a group.

But this does not undermine the understanding that, as philosopher Thomas Nagel (1997) argues, reason is universal. Indeed, he points out, subjectivism – the notion that there is nothing beyond whatever is true “for me” – is both self-contradictory (because it, too, rests on a general claim) and an erroneous view.

By contrast, education in cultural competence depends on a radical relativism – the notion that we need special training because human cultures, though all equally valid (a very slippery term), are incommensurate and that differences are far more important than similarities. This is a vastly exaggerated claim, embraced primarily for political advantage. More than 25 years ago, the anthropologist Donald E. Brown developed a list of what he termed “human universals.” In more recent writing, he reiterates his basic understanding of the term: “those features of culture, society, language, behavior, and mind that, so far as the record has been examined, are found among all peoples known to ethnography and history” – although, he points out, this does not entail that every individual in a given society manifests all those traits. Anthropologists, however, have been more interested in “differences between societies” than in their commonalities, which have suffered a neglect Brown describes as:

overt and principled, seeming to follow logically from the view of culture that anthropologists held throughout much of the twentieth century, a view that seemed to be supported by exaggerated (and in some cases false) reports of the extraordinary extent to which cultures both differ from one another and yet decisively shape human behavior, a view that was construed to indicate that there must be few, if any, universal features of the human mind.

Psychologists, Brown argues, have generally been “much more open to the discovery of presumably universal features of the human mind.” He goes on to specify: “Examples of universals of psyche or mind that have been identified through broad cross-cultural studies are dichotomization or binary discriminations, emotions, classification, elementary logical concepts, psychological defense mechanisms, ethnocentrism or in-group bias, and reciprocity as a mechanism for bonding individuals to one another” (2004, pp. 50–51). Brown ends with the assertion that studying human universals is a crucial component of any effort to understand “human nature.”

No one with any experience of the world would deny that there is much to learn about other cultures and their distinctive habits and mores, practices, and preferences. But to recognize this variation is not to reject the existence of universals that allow people to understand each other across cultures. If cultures were indeed incommensurate, there could be no promotion of human rights internationally, nor any appreciation of, say, great works of art produced by cultures other than one’s own.

As for relativism, once again it is worth noting that if cultures and groups within them were never to be judged by standards beyond their local and national borders, much social activism would disappear and there would be no grounds for objecting to, say, slavery outside of one’s own group. The very notion of fundamental human rights depends on recognition of a commonality shared by all people everywhere.

When the West Is the Least

In her famous and controversial essay “Is Multiculturalism Bad for Women?”, the political philosopher Susan Moller Okin (1997) answered the question posed by her title with an unapologetic Yes. Her argument is that multiculturalism promotes tolerance and respect for cultures that are hostile to women, cultures that, for example, encourage forced marriage, female genital mutilation, polygamy, and stunted legal and political rights. This assertion earned her the ire of those (including other feminists) who wanted to celebrate a “politics of difference” and criticized the application of Enlightenment values as ethnocentric.Footnote 13 In the nearly two decades since Okin published her essay, the demands for diversity and multiculturalism have multiplied, ironically at the very time that Islamist challenges to basic liberal values of tolerance and equal rights have increased and taken ever more violent turns in western countries as well as in the Muslim world.

While Okin’s definition of feminism would be rejected by those feminists who seek not equality “with” the existing order but rather a total transformation “of” it, the problem she saw in multiculturalism is far more urgent now than it was when she published her essay. Though multiculturalism is hard to pin down, she wrote:

the particular aspect that concerns me here is the claim, made in the context of basically liberal democracies, that minority cultures or ways of life are not sufficiently protected by ensuring the individual rights of their members and as a consequence should also be protected with special group rights or privileges. In the French case, for example, the right to contract polygamous marriages clearly constituted a group right,Footnote 14 not available to the rest of the population. In other cases, groups claim rights to govern themselves, have guaranteed political representation, or be exempt from generally applicable law.

Okin is asserting that the defense of “difference” is undesirable if it leads to the acceptance of the oppression of women in the name of a particular culture’s values, and she gives as examples the accommodations of European countries to Muslim communities. It is worth adding that it is not only women as a category who suffer in societies in which basic civil liberties do not exist for all individuals. Islam, for example, also targets Christians, Jews, apostates, homosexuals, and other Muslims. The evidence has only intensified over the past 20 years, and yet in the face of Islamist terrorism internationally, large swathes of educated westerners are proving themselves unwilling to criticize radical Islam despite its evident destructiveness to those (individuals and entire societies) who do not endorse its beliefs. Home-grown Christian fundamentalists, doing far less damage and engaging in far less violence, are given no such pass, while, for example, misogynist “gangsta rap” is.

Those writers and intellectuals who do not accede to Islamist demands for conformity face threats and violent deaths: Salman Rushdie, his translators, and publishers; filmmaker Theo Van Gogh; the Charlie Hebdo staff; Jews and Christians and human rights activists such as Phyllis Chesler and Ayaan Hirsi Ali. These latter find themselves ostracized, disinvited, and dismissed even by many who call themselves feminists and leftists, who have learned the unbalanced lessons of multiculturalism and diversity all too well.

The argument over cultural particularities versus universals, pursued today in an incoherent and opportunistic manner, has a long and interesting history. In the twentieth century, it gained special prominence through the French writer Julien Benda, whose 1927 book The Treason of the Intellectuals (2006) was concerned above all with the destructive potential of right-wing nationalisms devoted to cultivating the Volksgeist. This development, promoted by the epigones of the late eighteenth-century German philosopher Johann Gottfried von Herder, distorted and blew out of proportion Herder’s argument that each national group possesses a unique spirit, rooted in its particular culture.

The epigones severed Herder’s philosophical ideas about nationalism from his cosmopolitan and universalist framework, linking them instead with anti-democratic, anti-liberal, and anti-Semitic ideologies (Stern, 1961, p. 278). Today, the selective promotion of ethnic, religious, and sexual identities often sounds like a compulsive repetition of these nationalistic nineteenth-century distortions, resulting in an inversion of any genuine cosmopolitanism for the sake of privileging particular groups in the name of a narrow identity politics.

Benda’s critique of French intellectuals for abandoning the Enlightenment commitment to universal values in favor of local (national) identities was driven by an awareness of the dangers of unleashed national pride. Concerned about the potential violence of such sentiments, as evinced in the Dreyfus case of the 1890s, when much of the French population believed that Captain Alfred Dreyfus, a Jew and hence an “outsider,” must be guilty of the charges of treason falsely leveled against him, Benda saw the relationship between this insistence on the Volksgeist and the potential rise of fascism in Europe.

When comparing past with present identity politics, however, a crucial difference emerges: Herder and other romantic nationalists, in criticizing the universalistic approach of French Enlightenment figures such as Voltaire, were also defending their own culture and its values. They endorsed respect for particularity within general and universal principles, as opposed to sacrificing the local for the sake of the universal. By contrast, the current vogue for embracing cultural particularisms has led to quite a different trend among many intellectuals. Today, what accompanies terms such as multiculturalism and diversity in the west is antagonism toward “Eurocentric” culture in particular (especially if it is one’s own), typically accompanied by an unwillingness to make criticisms of other (non-western) cultures, however severe their abuses of human rights.

Instead, universal principles are replaced by a faux relativism, as is evident in the views prevalent in academic discourse: Yes, all cultures are equal, but western is worse. Yes, women are equal to men, but men are worse. Yes, race is an artificial construct, but whites alone are racist. The same sort of non-thought is everywhere, piling contradiction upon contradiction without embarrassment, always enveloped in the “good” politics presumed to justify any and every inconsistency and paradox.

In a powerful essay introducing a new edition of Benda’s work, Roger Kimball (2006) writes:

The humanizing “reason” that Enlightenment champions is a universal reason, sharable, in principle, by all. Such ideals have not fared well in recent decades: Herder’s progeny have labored hard to discredit them. Granted, the belief that there is “Jewish thinking” or “Soviet science” or “Aryan art” is no longer as widespread as it once was. But the dispersal of these particular chimeras has provided no inoculation against kindred fabrications: “African knowledge,” “female language,” “Eurocentric science,” “Islamic truth”: these are among today’s talismanic fetishes.

Benda observed French intellectuals’ anxiety that they would descend into “national partiality” if they considered their own country to be in the right. These “strange friends of justice,” he wrote, “are not unwilling to say: ‘I always maintain my country is in the wrong, even if it is right.’” He concluded that “the frenzy of impartiality, like any other frenzy, leads to injustice.” But this is not merely an abstract problem. In writing about the varieties of pacifists, Benda noted the type who asserts, contrary to all evidence, “that the nation is not in the least threatened and that the malevolence of neighbouring nations is a pure invention of people who want war” (2006, pp. 187–89).Footnote 15

Many of today’s academics and intellectuals, in thrall to the claims of “social justice” rooted in identity politics, in like fashion are unwilling to recognize threats to their own culture while eagerly defending other cultures even if these are demonstrably worse in terms of those very human rights that the activists proclaim. They are unwilling to criticize nondemocratic, non-western cultures, especially if these are inhabited by nonwhite people. The embrace of cultural relativism is thus revealed as a fraud, lacking the redeeming feature of at least resting on a consistent ethics.

If the much-proclaimed embrace of cultural relativism were authentic, after all, western culture would be valued as much as other cultures, instead of being constantly (mis)represented as inherently and uniquely imperialistic, racist, and violent. True, it makes sense to be more heated about the failings of one’s own culture (especially if measured against the bar of perfect justice), but it is quite a different thing to view western culture, American in particular, as far more deficient and always of lesser value than the ones the multiculturalist left extols. Yet that is precisely what famous intellectuals such as Susan Sontag, Michel Foucault, and Noam Chomsky have done, exculpating Communism and Islamism along the way.

Paul Hollander’s extensive work on anti-Americanism (1995/2003, 2004) documents this trend, as does Jean-François Revel’s later study of French anti-Americanism (2003), which includes a chapter called “The Worst Society That Ever Was.” After 150 pages of examples and analyses of his theme, Revel identifies the two “most glaring traits of obsessive anti-Americanism: selectivity with respect to evidence and indictments replete with contradiction” (p. 149). Today, under different names, we observe the same obsessions. The multiculturalist agenda, which leads to a dangerous social fragmentation, presupposes the belief that those who embrace it have indeed chosen the high road and are embarked on the challenging task of creating a better future. In this fantasy, “social justice,” freed from the baneful western tradition that in reality gave rise to ideas of universal human rights, will finally flourish unhindered.

None of these concepts, however, stands still; each and all constantly undergo revision and domain expansion. This is why cultural politics as played today is a game in which heads I win, tails you lose (depending on who has, or claims, which identity). To all this is added the further irony that if ignoring or devaluing other cultures is bad, embracing them may be still worse, as is apparent in the current vogue of railing against “cultural appropriation.”

Law professor Susan Scafidi (2005), in her book Who Owns Culture?: Appropriation and Authenticity in American Law, defines cultural appropriation as “taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission. This can include unauthorized use of another culture’s dance, dress, music, language, folklore, cuisine, traditional medicine, religious symbols, etc.” (as cited in Shriver, 2016). Perhaps this helps us understand why even Halloween costumes are now regulated by some schools.

Alain Finkielkraut, the scourge of French intellectual life, has devoted a number of books to protesting the abandonment of Enlightenment values in contemporary France. His book The Defeat of the Mind, in dialogue with Benda, criticizes the resurgence of Volksgeist particularisms visible in the rise of multiculturalism, pitting ethnic, racial, religious, and national groups against one another (1995). Most recently he has written persistently about French unwillingness to recognize Muslim immigration as the threat it in fact poses to France’s secular culture. In an interview with Der Spiegel (Von Rohr & Leick, 2013, December 6), shortly after the publication of his book L’Identité Malheureuse (2013), Finkielkraut reiterated his argument that “France is in the process of transforming into a post-national and multicultural society. It seems to me that this enormous transformation does not bring anything good.”Footnote 16 Though presented as a model for the future, multiculturalism, in his view, “does not mean that cultures blend. Mistrust prevails, communitarianism is rampant – parallel societies are forming that continuously distance themselves from each other.” More recently still, Finkielkraut has insisted: “Secularism has got to prevail. And we can’t compromise on the status of women” (Nossiter, 2016).Footnote 17

A persistent critic of the far right, Finkielkraut (2013, L’Identité, p. 86) stresses that the Enlightenment saw all men as equal in the sense that all have the same right to freedom, which means the ability to think, judge, and act on their own as conscious human beings. And by making use of that ability, they form nations. But nations are fragile things, and the identities each depends on, according to Finkielkraut and other contemporary critics, must rely on certain accepted or adopted commonalities. However broad-reaching these may be, once they are gone, a society fractures into particularisms – which is what we are seeing when immigrants arrive who explicitly reject the society they are physically moving into.

This is the situation that leads Finkielkraut to speak of France’s “unhappy identity,” the phrase that gives his 2013 book its title. He points out, and the growing problems in European countries confirm, that Muslim immigrants who resist integration do not need to constitute anywhere near a majority in order to have a profound impact on the larger society. This is especially so when that society’s very values make it vulnerable to the demands of newcomers who reject its fundamentals, e.g., the equality of women, the predominance of secular values in education and public life, the prevailing legal norms, the acceptance of non-Muslim ways of living and believing, and so on.

Another leading French intellectual, Bernard-Henri Lévy, in his book Left in Dark Times: A Stand Against the New Barbarism (2008), makes an important distinction about the significance of “tolerance” as opposed to “secularism.” The former leads to an uncritical respect for religion, even if it means abandonment of universal human rights. By contrast, secularism keeps all religious beliefs

at an equal distance from political power, [and] also has to keep political power equally removed from those beliefs. Tolerance tolerates [his emphasis] that one group demands such and such a special right. The secular state does not tolerate or understand that. And that is why, when the political authorities are wrapped up in the wrong done to one community by the representation of its prophet with a bomb on his head instead of a turban, the secular regime answers: “We see that you’re upset; your faith is doubtless seriously wounded by such a representation, but that wound has no place in public debate; the law-maker therefore has nothing to do with it; that’s how democracy works.” (Lévy, 2008, pp. 179–80)

Writing well before the Islamist terrorist attacks in Paris on Charlie Hebdo (January 2015), the Bataclan (November 2015), and in Nice on Bastille Day (July 14, 2016), to mention only the recent episodes of large-scale mass murder in France, Lévy saw clearly the dangers of refusing to recognize Muslim challenges to the French legal establishment in its core principles. When liberal democracies capitulate in this way – whether out of the majority’s fear, guilt, or desire to be accommodating to people it sees as disadvantaged – a corrosive instability arises. In such a climate, some do indeed turn to the hard right (as is happening in European countries where Muslim influence is a growing problem), which Lévy, like Finkielkraut, deplores.

The same problems, of course (though thus far on a smaller scale), have arisen in the United States, where in the name of multicultural sensitivities, many have been willing to abdicate First Amendment and other rights. The classical historian Victor Davis Hanson (2016b) does not hesitate to describe the current scene as devolving into “multicultural separatism and ethnic and religious chauvinism.” It is by now a routine event to see “diversity” enthusiasts eagerly exculpate Islam. Even after the mass murder at a gay nightclub in Orlando in mid-June, 2016, the same double standards emerged, as Hanson points out: “From Iran to Saudi Arabia, the treatment of gays is reprehensible – but largely exempt from [mainstream] Western censure, on the tired theory that in the confused pantheon of -isms and -ologies, multiculturalism trumps human rights.”

Within the academy, antagonism toward the west is both a cause and a consequence of the decline in college courses in western civilization – the very term is unacceptable to some – which has been matched by the ever-expanding college offerings of courses on (certain) “other” cultures. Nor is this a recent development (see Patai, 2016c, September 12). Writing about the failed effort at Stanford to reintroduce courses in western culture decades after they had ceased to be required, Anthony Esolen (2016) bemoans the ignorance of many students, whom he calls “ninnies,” lacking in curiosity about their own culture. “Multiculturalists,” he says, are not making serious efforts to learn about other cultures. Rather, they are people “who peddle the tandoori chicken rather than Sanskrit.” He adds that “only someone who actually has a culture is prepared to learn about another; as a master in the grammar of his native tongue is prepared to learn another. But these days we prefer our education to be like our politics: superficial and silly”Footnote 18 (his emphasis). Proving his point, students at Providence College, where Esolen teaches, marched to the president’s office in late 2016 demanding that Esolen be fired for criticizing diversity (Dreher, 2016).

Sobbing and outraged students succeed in getting courses and speakers canceled, and free speech and, of course, reasoned debate suppressed (see Davidson, 2018), as only the opinions and preferred rhetoric du jour are acceptable. Literary classics such as Huckleberry Finn and To Kill a Mockingbird, considered offensive or defective by current standards, are bowdlerized and/or dropped from curricula. This goes hand in hand with identity-driven curricula, which promote the notion that protected groups should study primarily themselves. It is damaging, Yale University students claimed not long ago in demanding curricular change (Flood, 2016), for people of color to study white male poets. Why waste time learning about western culture, unless the courses concentrate on the familiar litany of western deficiencies? And yet the supposedly dominant groups are expected, indeed required, to learn about protected minorities. In both cases, it is mainstream western culture that is to be downgraded, misrepresented, and ignored if not reviled, while identity politics is to be respected by all, and training sessions in diversity and inclusion are promoted and even imposed by colleges nationwide.

The well-known scholar and former president of the Modern Language Association, Elaine Marks, an instrumental figure in publicizing feminist literary theory, shortly before her death in 2001 turned against the practice of allowing identity politics and the tireless insistence on “differences” to dominate the study of literature. Disillusioned by students who trolled literature and culture for signs of the ubiquitous -isms, Marks published an essay entitled “Feminism’s Perverse Effects” (2005), in which she argued that replacing knowledge of western culture with a ceaseless pursuit of signs of its racist and sexist villainy leads students to an incapacity to respond to literature imaginatively. Merely voicing concern over the habit of reading literature for ideological bottom lines stigmatizes a scholar (even one with her track record) as a closet conservative and traitor, she wrote. Since then, higher education has openly pledged itself to an orthodoxy in which courses and programs grounded in identity politics (renamed “diversity”) proliferate, preferably taught by professors of the requisite identity. That is, you have to be one to teach it – all in the name of those elusive absolutes: multiculturalism, inclusion, cultural competence, and, ultimately, social justice.Footnote 19

In relation to other arts, not only literature, the same agenda of identity politics prevails, as detailed in Sohrab Ahmari’s recent book The New Philistines: How Identity Politics Disfigure the Arts (2017). Ahmari was born and raised in Iran, where, after Islamists seized power in 1979, “thousands of ideologically unfit faculty members and students were purged” and holding the wrong opinion or creating the wrong kind of art could mean loss of liberty and life. He has little patience with the art world today, which he finds entirely indifferent to the old standards of truth and beauty, instead embracing identity politics as its alpha and omega. “Identitarians celebrate individual difference, so long as you are different in the same way,” he writes. Obsessed with “a set of all-purpose formulas about race, gender, class, and sexuality on the one hand and power and privilege on the other,” they deny individuality and agency, treating everyone as “a political type or a stand-in for an ideological cause.” Contrary to its exalted claims to transgressiveness, he concludes, identity art becomes drearily conformist.

In the academy, the notion of excellence still puts in a feeble appearance now and then, at least rhetorically. Sleight-of-hand attempts to associate the diversity agenda with excellence now appear at many universities. The president of the University of Michigan, for example, proclaimed in 2016: “We aspire to achieve the highest levels of excellence at the university, and our dedication to academic excellence for the public good is inseparable from our commitment to diversity, equity and inclusion” (Schlissel, 2016, p. 3). The same document ominously warns that “measures of accountability” are a crucial part of the new plan. Similarly, at the University of Massachusetts Amherst, one of the many new administrative positions created to enact the current dispensation is that of Faculty Advisor for Diversity and Excellence. In the documents detailing this position, “inclusive excellence” is mentioned a few times, as if the very word excellence might disguise the real focus, ceaselessly reiterated, which is on the familiar terms inclusion and diversity (University of Massachusetts, 2016a, August 15), mentioned also in the frequent memos celebrating revisions to the university’s diversity strategic plan.

To defend perfectly reasonable positions that rest on universal liberal values rather than on identity politics, however, is to be vulnerable to the facile charge of “racism,” a charge that makes most people quake in their boots. Equally troubling in the context of the academy, apart from the apparently prevalent belief that First Amendment rights evaporate at the campus gates, is the fear of speaking out that afflicts many of those who do realize something is awry with the relentless accusations of systemic racism, sexism, and the rest of the gang.

With or without apologies, reputations can be ruined and jobs threatened rapidly. Not only is it imperative to refrain from saying anything that might be construed as hurtful, one must also appear never to entertain any thoughts that even skirt around the forbidden terms and ideas. Contagion is a constant danger, requiring permanent vigilance. The stunning silence of many professors and administrators when their colleagues are accused of racism and insensitivity is ample testimony to this reality.Footnote 20 An effacement of the distinction between words and deeds is encouraged by harassment policies, formulated to sustain grievances and try to force restrictions on free speech.Footnote 21 False and trivial accusations are neither discouraged nor punished. Implanting correct attitudes seems to have become the central mission of higher education these days.

One of the most obvious examples that bears witness to this new reality is the spread of terms such as “microaggression,” originally coined by Chester M. Pierce in 1970, perhaps necessary in recent years because macroaggressions are scarce in academe. The inevitable result is a rush to proclaim a fragile and marginalized identity, followed by deployment of those effective old-fashioned weapons of shame, blame, and blackmail to extract concessions.

In a fascinating reversal, then, claiming powerlessness these days bestows power. And the claim can be justified only by discounting or simply denying the actual enormous progress toward equal opportunity that has occurred over generations. Women are the majority of university students, graduating at higher rates and entering various postgraduate programs in higher numbers than men. Yet this does not keep feminists from still charging that the academy is a male-dominated institution inimical to the interests of women and, like American society as a whole, they allege, a terrain of constant sexual violence. Universities avidly compete for minority students and faculty, yet charges of racism increase rather than decrease. In order to keep identity battles at a fever pitch, real distinctions and reliable data are ignored or, more ingeniously still, are redefined as manifestations of “systemic” racism and ethnocentricity. But it is “racist” to comment on, say, honor killings among Muslims, even on a feminist academic listserv (see Patai, 2008a).

Of course, apostasy does occur from time to time, both in and out of the academy. Christopher Hitchens, who revised his earlier leftist views after 9/11, having recognized that Islamic terrorism was a real danger, argues in a tone reminiscent of Julien Benda that the first step we must take “is the acquisition of enough self-respect and self-confidence to say that we have met an enemy and that he is not us, but someone else. Someone with whom coexistence is, fortunately I think, not possible. (I say ‘fortunately’ because I am also convinced that such coexistence is not desirable)” (2004, p. 418). Hitchens has no patience with “moral equivalence” arguments nor with the conventional view that we must respect all religions. Perhaps, given his renowned atheism, he meant that we should respect none, but an equally pertinent observation is that not all religions have the same drive to suppress others, or, in the modern world, try to achieve their aims via brutality and terrorism, as radical Islam does quite openly.

Why, indeed, respect all religion any more than all politics? Is fascism to be respected, just as anti-fascism is? It seems unlikely that most people extolling diversity and multiculturalism, and readily applying labels such as “fascist” to those who contest their views, would assent. But to acknowledge this reality requires something other than adherence to mythemes of inclusion and cultural competence.

Even leaving aside the pertinent questions Okin and others have raised, of whether all cultures (and subcultures) deserve equal respect and on what grounds it might be advisable to distinguish between them, the particular dismissal and rejection of western culture in today’s academy lead to a paradoxical situation: many campus social justice warriors display profound ignorance of their own culture’s traditions. It is, after all, easier for students to protest against hurtful speech and stay on permanent alert for microaggressions than to actually learn something about the world; it is more fun, more dramatic, more gratifying to one’s sense of moral superiority, and, not least, it takes far less time. Genuine cultural competence, by contrast, might require both recognition of what is real and laborious study.

Yet, as Kenneth Minogue (2010, p. 206) ironically observes, academic critics “are in no way more dramatically Western than in their hatred of their own heritage. It is an entrenched European tradition, though previously found largely in a religious idiom.” Advocates of multiculturalism also constantly display their dependence on that heritage in other ways. Their very demands cannot but rely for their fulfillment on the existence of the western tradition of universal rights and principles of justice. How else could minority groups expect their hegemonic oppressors not only to endlessly apologize but actually to set aside their own group interests and work instead to promote those of an identity group they purportedly oppress?

This state of affairs rests upon the paradox Minogue highlights. If there were no universal principles, no rights inhering in human beings as individuals (rather than as members of one or another group), if people were indeed irretrievably separated, mired in the interests of their own group and unable to see beyond it, minorities could never succeed in the shame-and-blame game now routinely played. It works because many members of the groups being attacked, despite being presumptively guilty of oppressive and unjust behavior (even if only many generations ago), can be and are actually expected to act on behalf of other groups. Without such a conviction, no minority group – say, those claiming “Black Lives Matter” and insisting that to say “All Lives Matter” is somehow to be racist and insensitive – could expect a modicum of success, let alone actually manage to extract resources, apologies, curricular reform, and even resignations from the programs and administrators they attack.

Is there a relationship between the pursuit of multiculturalism and the rest of the social justice agenda in higher education and the distressingly low level of actual academic achievements of American students vis-à-vis those of many other nations?Footnote 22 International surveys (see DeSilver, 2015; OECD, 2012), indicating the unimpressive performance of American students in many areas (especially troubling in view of the much higher per capita spending on education in the United States), seem to suggest that cultural competence should, but evidently does not, begin at home.

The Utopian Dilemma

There is a curse besetting utopianism, J. L. Talmon wrote in an essay on utopianism and politics (1959). “While it has its birth in the noblest impulses of man, [utopianism] is doomed to be perverted into an instrument of tyranny and hypocrisy. For those two deep-seated urges of man, the love of freedom and the yearning for salvation, cannot be fulfilled both at the same time.” Nonetheless, he warns against a sneering and dismissive disdain for human beings and their struggles: “Such an attitude of pessimism is unwarranted, and lacks generosity and foresight. We must try to do good – but with a full and mature knowledge of the limitations of politics.”

The political scientist George Kateb expresses a similar view in his classic work Utopia and Its Enemies. He argues that the imposition of a utopian society would indeed require suppressing various sorts of “excellence” – that is the word he uses (1963). Like Talmon, Kateb neither renounces utopianism nor embraces this loss but rather suggests that the appropriate response is sad recognition of the limits and costs of manipulating human beings. As we shall see further on, many writers of speculative fiction attempt to delineate what those costs are, and satirical dystopias are one of their preferred means for doing so.

Today, however, judging by the demands of campus protestors, there exists a widespread belief in the west that, as Minogue put it, “inequality is itself the same as oppression,” a belief that in turn rests on the view that “perfection would be an order in which everyone equally shared in the goods of this world” (2010, pp. 202–03). This perhaps helps explain the difficulty many on the left have had in recognizing, or admitting, the actual conditions of life under communist regimes. The sheer destructiveness of these societies has tended to be excused as at least having been motivated by high ideals. By contrast, the new multiculturalists’ disinclination to criticize radical Islam, to take one of the most troubling examples of our time, is surely not motivated by genuine belief that Sharia law will usher in a better world.

Such silence on the part of those who do enjoy free speech, a free press, and freedom of association is (apart from legitimate fear in some cases) in large part rooted in the desire to avoid suggesting or even implying that, by contrast, western liberal values are actually better. As Andrew Anthony wrote about Ayaan Hirsi Ali, “she is loathed not just by Islamic fundamentalists but by many western liberals, who find her rejection of Islam almost as objectionable as her embrace of western liberalism” (2015).

Still, it is easy to understand the dilemma faced by true believers in abdicating their former convictions. These are hard to give up, as Paul Hollander has carefully documented in his work on communism worldwide, for apostasy requires abandonment of deeply held beliefs and past affirmations and affiliations. An extreme but hardly unique example is provided by the famed British Marxist historian Eric Hobsbawm (1917–2012), who, in his late 70s, still refused to renege on his life-long commitment to communism, instead affirming that, even had he known in 1934 of the deaths of millions of people in the “Soviet experiment,” he would nonetheless have supported it because “the chance of a new world being born in real suffering would still have been worth backing” (cited in Hollander, 2006, p. 289). Hobsbawm continued to defend Marxism until the end of his life (Kettle & Wedderburn, 2012). Perhaps that is the sign of a true ideologue: new information need not unsettle one’s core beliefs.

There is considerable congruence between the reality of communist societies as observed throughout the twentieth century and the utopian fiction that for centuries has envisioned regimentation of one sort or another as necessary if the “good society” is to be achieved. Thomas More’s Utopia (1516), which gave its name to the literary genre, already fully develops this theme. Choosing the name Utopia because it can be understood as eutopia (good place) or outopia (no place), More’s little book contrasts the imaginary island of Utopia, situated somewhere in the new world, with the corruption and degradation of old Europe.

Leaving aside the numerous dystopian institutions and characteristics in More’s work (such as enslavement of conquered peoples, subordination of women to men, and so on), whose meaning can be debated endlessly, Utopia’s key structural feature is that it is a radically egalitarian and uniform communistic society, with no private property whatsoever nor attachment to material goods, inhabited by peaceful citizens living according to strict and clearly laid out rules under the ever watchful eyes of their neighbors. Though controversy persists about More’s intentions, no contemporaneous evidence suggests that he was offering this vision as satire, and, indeed, it has remained the locus classicus of utopian literature for centuries.

The most famous nineteenth-century American utopian novel, Edward Bellamy’s best-selling Looking Backward (1888), also was based on complete economic equality attaching to each individual from birth. Bellamy’s model of social organization is the military, with the workforce forming what he calls “the industrial army.” Nonetheless, Bellamy recognized that people crave “distinctions,” and, while not sacrificing economic equality, he envisions symbolic rewards in the form of medals and other visible tokens, awarded to exceptional men, who are thus better able to attract women, he explains.

In the English-language tradition, many reactions to such notions of equality have appeared, especially after the mid-nineteenth-century rise of Marxist ideology. Thus, even before Bellamy’s book was published (and led to the formation of “Bellamy Clubs” worldwide, in support of his brand of socialism), the American writer Bertha Thomas in 1873 published a satirical short story, “A Vision of Communism: A Grotesque,” which takes equality and personal comfort to a new high (or low) by imposing correctives to the “Iniquitous Original Division of Personal Stock,” i.e., those talents and characteristics that, despite the existence of complete economic equality, provide unfair advantages to some and, if left uncorrected, cause envy, resentment, and inequality.

In a similar vein, the British humorist Jerome K. Jerome, in his 1891 story The New Utopia, envisions complete harmony and equality resulting not only from prohibitions of “wrong” and “silly” behavior but also through surgery to reduce brains to average capacity, to lop off limbs of the physically exceptional, and so on. A better-known modern version of the theme appears in Kurt Vonnegut’s famous story Harrison Bergeron (1970), in which a “Handicapper General” and her team impose disabilities and impediments on gifted individuals (without, however, mutilating them), so as to promote something resembling equality in all spheres.

Perhaps the most detailed, if lesser known, such dystopia is the novel Facial Justice, by the British writer L. P. Hartley (1960), which takes as its epigraph “The spirit that dwelleth in us lusteth to envy” (St. James). Following a nuclear war and years during which the surviving world population had to live underground, a new society is formed in Britain, once again above ground. Wracked by a sense of collective guilt over the war, this society, run by a never-seen dictator, imposes absolute equality: everyone wears sackcloth, all houses are alike (and the property of the state), and any kind of pairing off is discouraged. To avoid awareness of distinctions of any type, people are inculcated in “the Horizontal View of Life” – they may look neither up nor down (noticing height differences, whether in people or ruined buildings), but only straight ahead. Not surprisingly, the ruins of an old tower are considered a “phallic emblem from the bad old days.”Footnote 23 The word “mine” is hardly used, and “yours” means “everyone’s.” A popular new phrase is “voluntary-compulsory.”

The novel’s protagonist, named Jael 97, is “facially over privileged” and hence must report to the Ministry of Facial Justice to be fitted with a synthetic face, guaranteeing conformity to the acceptable norm. Because diversity of ideas is dangerous, leading to murder and war, in Hartley’s future England it is better to have only one idea. Making it still harder to rebel is a new edict declaring that all are now living in the Fun Age. Merit is discouraged, for it requires effort, “and we aren’t supposed to make an effort. Let the worst man win.” The Constitution is “based on equality of the most deep-seated and all-embracing order.” The idea of perfection itself is seen as “antisocial,” the worst possible crime. However, since grievances are common, safety valves are provided: A “discontentometer” exists, and each person is allowed seven complaints a week.

Hartley takes his fictional society’s directives to their logical conclusion: The Daily Leveller newspaper publishes an article suggesting that correcting grammar and spelling errors should be banned since it can lead to envy and bitterness. The article also protests against the tyranny of the objective case – e.g., who vs. whom – because “it wasn’t fair for a word to be governed by a verb, or even a preposition. Words can only be free if they’re equal, and how can they be equal if they’re governed by other words?”Footnote 24 Anthony Burgess, author of A Clockwork Orange (1971), one of the most famous of the twentieth-century dystopias, praised Facial Justice as “A brilliant projection of tendencies apparent in the post-war British welfare state” (1984).

While Hartley’s and similar works are obviously reductiones ad absurdum, current campus demands for comfort, safe spaces, and protection from microaggressions and hurt feelings suggest these dystopian speculations are no longer as far-fetched as perhaps they once seemed to be. After all, not merely economic disparities but any competitive advantages are likely to cause discomfort to those who lack them – for who can deny that beauty, intelligence, artistic and other talents, health, strength, industriousness, wit, and energy, not to mention great hair, along with much else, are, alas, not equally distributed in life? But this should not diminish our perception that, as is often observed these days, reality is making satire ever more difficult.

Fixing all such problems requires ever greater state involvement. A scathing and humorous attack on the actual scene in England today is Josie Appleton’s recent book Officious: Rise of the Busybody State (2016), a state she sees as predicated on the sole belief “in the inherent virtues of regulation.” While pretending that “procedures” are in themselves capable of “warding off evil,” she writes, the real function of the new officiousness is to transform “unregulated life into regulated life” (pp. 9–10), demanding public submission.

In the United States, with the growing number of “Bias Response Teams” (BRTs) – which thoughtful universities encourage anyone and everyone to contact if they ever experience, witness, or merely overhear (in the real world or online) something they don’t like – the busybody state is hard at work. These BRTs, intended as support systems for fragile students, are “rapidly becoming part of the institutional machinery of higher education” (Snyder & Khalid, 2016). Examples of complaints around the country involve intentional or unintentional words or acts:

Definitions of bias incidents vary by campus but have the following key features: They encompass “any behavior or action directed towards an individual or group based upon actual or perceived identity characteristics.” These characteristics include “race, color, ethnicity, social class, national origin, religion, sexual orientation, gender identity and/or expression, age, marital status, veteran status, and physical and mental health” – sometimes even “height” and “weight.”Footnote 25

A relatively recent development, BRTs have been around for about a decade and typically function behind the scenes, lacking transparency but possessing the ability to embroil alleged culprits in administrative punishments and retraining, all in the name of making students feel entirely comfortable and safe (Snyder & Khalid, 2016). It is also noteworthy that even a school such as the University of Chicago, which in August 2016 vigorously defended First Amendment rights against “safe spaces” and the like (Creeley, 2016), nonetheless has Bias Response Teams, on call 24/7, as at other universities.

Far from celebrating multiculturalism and diversity, such new policies highlight the prescience of dystopian visions in their understanding that a serious demand for social harmony must involve the enforced suppression of individual identity and free expression – for the sake of (a usually illusory) social cohesion. Dystopian satires, furthermore, come to mind when one considers current claims that “disparate outcomes,” as noted above, necessarily indicate prejudice and injustice and must therefore be corrected, all in the name of “social justice.”

It is certainly true that the existence of excellence, just plain talent, or any other advantage however acquired can cause those who lack it to feel resentful and uncomfortable. But more unacceptable to many people today is any discussion of genetically driven disparities among human populations, which suggests that the dystopian satires mentioned above were onto something fundamental. Nicholas Wade’s controversial book A Troublesome Inheritance: Genes, Race and Human History (2014) argues – delicately and with constant disclaimers, given the problematic history of “eugenics” – that natural selection has produced distinct social behaviors that play a significant role in racial and cultural variations worldwide. Earlier research about racial disparities in intelligence, such as R. J. Herrnstein and C. Murray’s The Bell Curve: Intelligence and Class Structure in American Life (1994), which Wade conspicuously does not mention, also aroused heated denunciation. Evidently arguments that challenge the view that race is (merely) a social construct or rhetorical trope are frowned upon these days, regardless of the evidence – unless, that is, embracing them is politically useful.

Beyond sheer denial of race as a biological reality, however, various ways exist for dealing with the problems of disparity between groups (as between individuals). It can be addressed by siphoning off the talented into a separate class; by medical, genetic, and/or behavioral engineering designed to produce conformity and remove grounds for tension; or by sheer authority and terror imposing uniformity on all people – except, of course, the leaders. All of these and other methods have been explored both in existing societies and in speculative fiction, with troubling if not downright nightmarish results. The predictable consequences, apart from a stultifying sameness, are the disappearance of creativity and initiative, which can only lead to stagnation and decline.

A further characteristic of many dystopias is the suppression and distortion of history. Such a practice is necessary – as we see today all around us – if people are to absorb an ideology that knowledge of the past could challenge. Orwell’s Nineteen Eighty-Four offers probably the best-known treatment of this theme, but many details and elements of his novel suggest he was familiar with another dystopia published more than 10 years earlier in England. The British feminist writer Katharine Burdekin (1896–1963), using the pseudonym Murray Constantine, published a powerful and prescient anti-fascist novel, Swastika Night, in 1937. Written in 1936, it is set in the seventh century of the Hitlerian millennium, when militarist German and Japanese empires divide the world between them. A cult of masculinity is enforced; Jews have ceased to exist, and Christians are the lowest of the low – except for women, who are reduced to the level of ugly animals used merely for procreation. All books and actual knowledge of the past are suppressed, as are certain words and concepts. The notion of women as desirable and independent individuals has long since vanished.

In another of her speculative fictions, composed in 1935 but published only decades after her death, Burdekin (1989) had explored the same themes of censorship, rewriting of the past, and the oppression of one sex by the other. Entitled The End of This Day’s Business, it is an intriguing example of the fictional sex-role reversal, of which there are many in the dystopian and eutopian tradition. The novel is set 4000 years in the future, at a time in which women rule the world. Though in many ways depicting a eutopia, the novel also has a fundamental dystopian dimension: for the sake of maintaining their power and a peaceful society, women have reduced men to ignorant playthings – allowed competitive sports, games, and sexual access, but confined to a separate sphere, ignorant of who their children and fathers are. The men are treated with kindness and condescension but denied literacy and all accurate knowledge of history. Their past achievements are hidden from them, and their material culture has long since been destroyed, so that no sources of pride in their sex remain. Only women, in this future, have actual knowledge of the past, ritually provided in order to instill in them the conviction of the need to maintain their own power.

The novel’s protagonist, in keeping with Burdekin’s deep suspicion of mere “reversals of privilege,” finally rebels against this order of things and informs her son of the truth. Like Socrates, she is put to death for this transgression (see Patai, 2002).

We live in an age where the goals of many social justice warriors seem to resemble these dystopian scenarios. In the name of presumptively desirable if elusive social goals (justice, equality, inclusion), restrictions on personal interactions and free expression have greatly increased. Attacks on and defenses of free speech on college campuses are anything but “content neutral,” despite well-known Supreme Court decisions regarding the First Amendment. While religious fundamentalists and creationists in some states have tried to impede the teaching of evolution, they and other right-wing groups are a small minority compared to the uniformity of thought that has dominated education for the past few decades.

Equally alarming, current orthodoxies have been allowed to define not only institutional and public policy but also scholarly research, which means that certain questions are not even to be raised, let alone seriously explored. The ensuing policing actions are often undertaken by other scholars, whose political commitments override professional obligations and sound investigation, with the result that certain topics are boycotted and prohibited, and their proponents vilified.

One of the most interesting recent books on the negative effects of such politics on scholarship is Alice Dreger’s Galileo’s Middle Finger: Heretics, Activists, and the Search for Justice in Science (2015), which chronicles hair-raising stories of group bullying of scholars whose research did not serve the political agendas of one or another popular cause. Such episodes have multiplied in recent years, as the blatant politicization of education and research has been deemed by many academics as not only acceptable but desirable. At the macrolevel, entire fields come to be dominated by ideology, not the pursuit of knowledge. At the microlevel, surveillance of speech and attitudes has grown so intense that even the most august reputation can be readily sullied by a politically injudicious joke or passing comment, as happened to Sir Tim Hunt, the Nobel Prize-winning British biologist (see Young, 2015).

Today, as Dreger demonstrates, important and complex research areas are discouraged if not prohibited outright, particularly in relation to sex and race differences, the formation of sexual identity, law enforcement, family life, and so on. Entire programs in academe are built around ideological assertions that are not to be questioned, as is readily apparent in, for example, Women’s Studies programs – in recent years renamed Women, Gender, and Sexuality Studies, perhaps to clarify that their focus now is primarily on dismantling “heteronormativity.”

As noted above, it is common in dystopian scenarios for individual freedom to be set in opposition to the common good. Benefitting from and cultivating one’s own qualities and abilities are seen as invariably leading to resentment and conflict, even if economic equality (at the low level that suppression of talent and initiative will allow) is guaranteed. The intractable problems of difference and sameness – the very issues that have entangled the many feminists who readily switch from one to the other as needed in their arguments about gender equality and women’s roles – have a long and contentious history. The underlying issue, however, is how one understands human nature and the relationship between nature and nurture, for these will determine beliefs about the extent to which individuals can be shaped, molded, or constrained for the sake of the group. While all societies depend upon some such shaping, and individual consciousness is always accompanied by group ties as well, the disagreement over ends and means resides in the details.Footnote 26

Notions of civil liberties and rights attaching to all individuals qua individuals, rather than as members of groups, have taken a long time to develop and be implemented, and it is therefore worth remarking that it requires far less time for them to be destroyed – or, for that matter, voluntarily abandoned. When group pride must be maintained at all cost, individual moral awareness and autonomy disappear, replaced by hypervigilance of social life and suppression of individual initiative. The result is what Minogue (2010) refers to as “sentimental moralism,” bound to produce a servile mind.

A pertinent example is offered in Ayn Rand’s short dystopian novel Anthem (1937), interesting because, apart from her attack on collectivism as such, she envisioned, well before Orwell, its effects even on language. Though ostensibly a first-person narrative, the novel contains no singular pronouns, only plural forms: “We” but never “I” and “our” but never “my.”Footnote 27 Any invention or contribution not coming from the collective is suppressed, and the predictable consequences are a primitive way of life in which even the knowledge of how to produce electricity has been lost. Extreme regulation is a necessity if people are truly to be protected from invidious comparisons and competitiveness – in other words, from human emotions.

In the pronoun policing currently demanded by and on behalf of transgender activists, we see an extension of these older patterns, once again dressed up as efforts to prevent discrimination. Perhaps the linguistic totalitarianism masterfully satirized in Lou Tafler’s inventive novel Fair New World (1994) will soon become a model. Tafler takes the battle of the sexes to what once seemed absurd new heights. He imagines three separate societies. Two are dystopias: Bruteland, inhabited only by men, and Feminania, only by women. The third is a eutopian alternative called Melior, governed by a philosopher king who did everything he could to avoid being elected.

In Feminania, the Femininnies have created a language called Fairspeak, in which the letter combinations man and men have been replaced by womb, womban, and womben, producing words such as wombdate, wombanacle, and dewomband. Naturally words such as woman and female are unacceptable and have become womban and fembale. Etymology counts for nothing, as in the real-world preference for herstory, promoted by feminists decades ago, which Tafler perhaps took as his inspiration, and even gender-neutral endings are revised. Thus er and or are replaced by her, and son by daughther, producing words such as acther, disasther, daughtherg, pherdaughther, pridaughther, and readaughther. Gent has become lady (as in intellilady, dililady, and ladylewombanly), and scores of terms such as wombanufacture, docuwombentary, wombagewombent, and comwombencewombent abound. In all sections of the book dealing with Feminania, then, the reader must slog through a soup of zany, often multisyllabic words. The author, however, has thoughtfully provided a glossary.Footnote 28

In dystopian (and often eutopian as well) scenarios, the great social conundrum of how to reconcile individual and group needs emerges most intensely in the arena of love and personal relations. These, if they mean anything at all, involve selection and the “sin of preference,” as Rand names it in Anthem. Many famous dystopias try to deal with the messiness of personal attachments by suppressing them altogether, whether by prohibiting sex, regimenting it, or facilitating unfettered sexual access but outlawing love and dismantling family attachments. Lobotomies, drugs, constant surveillance and indoctrination, imposed uniformity, ceaseless group manifestations of fealty, orgies of hatred and/or sexual license, and the abolition of art not devoted to celebrating the state are all staples of these works.

Once again today, life imitates art, and social activists seek to introduce repressive practices in schools, where they can more easily be imposed on a captive audience. A recent much-discussed article by psychologist Barbara Greenberg (2018), who describes herself as “a huge fan of social inclusion,” argues that schools should ban the practice of forming “best friends.” Why? Because such friendships lead to emotional distress among children and teenagers. Greenberg writes: “The word ‘best’ encourages judgment and promotes exclusion.” She advises parents to prod their children, instead, to have a small group of close friends. Or, one might add, they could just send their kids to convents, which have a long history of attempting to suppress what they called “particular friendships.”

But it is not only personal relations that a well-regulated society attempts to control. Art, too, with its appeal to imagination, is a constant irritant and offender. More than 60 years ago, Ray Bradbury’s Fahrenheit 451 (1991) devoted considerable attention to the problem of the individual’s role in a society aiming at social harmony. Writing in the early days of television, Bradbury envisioned a future America in which the preeminent value is general contentment, fed by drugs, physical comfort, surveillance, and a constant stream of anodyne pseudo-interactive television programs broadcast on multiple huge screens affixed to living room walls. Firemen no longer put out fires; instead they burn books, as the fire captain, Beatty, explains to the rebellious fireman Montag (who has begun to wonder about the content of the books he’s burning). Books convey the “texture,” the “pores” of life – and for that reason awaken dangerous feelings best left undisturbed. This practice was not imposed by the government but chosen by the people themselves, including professors who did not defend their turf. Captain Beatty lays out the rationale:

“You must understand that our civilization is so vast that we can’t have our minorities upset and stirred. Ask yourself, What do we want in this country, above all? People want to be happy, isn’t that right? Haven’t you heard it all your life? I want to be happy, people say. Well, aren’t they? Don’t we keep them moving, don’t we give them fun? That’s all we live for, isn’t it?….”

Beatty provides examples of what this means in practice:

“Coloured people don’t like Little Black Sambo. Burn it. White people don’t feel good about Uncle Tom’s Cabin. Burn it. Someone’s written a book on tobacco and cancer of the lungs? The cigarette people are weeping? Burn the book. Serenity, Montag. Peace, Montag.” (Bradbury, p. 59)

This is not, however, merely a modern theme. Kierkegaard, in his 1844 work The Concept of Anxiety, famously suggested that “anxiety is the dizziness of freedom.” Some decades later, in The Brothers Karamazov (1879–1880), Dostoyevsky presents a similar argument through the monologue of the Grand Inquisitor, who explains that human beings consider their birthright of freedom insupportable and therefore perpetually seek some authority or leader to relieve them of it – in exchange for security. Only in this way can they achieve happiness.

Such ideas have profoundly influenced dystopian fiction, which routinely includes a scene involving an authoritative figure like Captain Beatty who explains to the (usually rebellious) protagonist why the masses not only must but actually wish to be spared the burden of freedom. But whereas Dostoyevsky’s parable sets forth a criticism of the power and authority of the Roman Catholic Church, in more recent renditions (in fiction, film, and reality), the same rationale is offered for secular state regulation of daily life.

Demands for thought reform, too, are inevitably a staple of the dystopian literary tradition, in which individual autonomy is understood to be a potential threat to state stability. It is inconvenient to have people thinking for themselves, since this entails questioning rules and regulations, contesting others’ views, and stirring up conflict. It makes sense for those in authority to claim all this must be suppressed for the greater good.Footnote 29

Those who dismiss these speculative works as mere fantasy need only recall the numerous totalitarian regimes that, throughout the twentieth century (and into the twenty-first), have tried to obliterate the Enlightenment values of individual autonomy, responsibility, and civil liberties through combinations of propaganda, censorship, imprisonment, torture, and death. This assault continues into the present day in regimes such as those of North Korea and Cuba.Footnote 30 It is also found in nations and communities governed by rigid Sharia law, which subordinates individuals to the dictates of a religio-political ideology.

In his 1932 novel Brave New World, Aldous Huxley has one of the World Controllers say “There isn’t any need for a civilized man to bear anything that’s seriously unpleasant” (1969, p. 243). Reality has finally caught up with fantasy, rendering satire obsolete. At the University of Portland, Oregon, in early 2016, a webpage called “Speak Up” was launched, urging students to contact the Public Safety Department to report any “instances of discomfort” that they may have experienced or witnessed (Aguilar, 2016). The precipitating factor was the claim that students of color felt isolated owing to the prevalence of “microaggressions” on campus. Students can also now receive training in how to spot the subtleties of “microaggressions.” And more recently, some schools are insisting that facial expressions, too, may be construed as microaggressions, along with conventional terms such as “you guys,” said to exclude women.

The kind of hypersensitivity that used to be promoted primarily in a few identity-based programs such as Women’s Studies or African-American Studies – with their emphasis on the ceaseless victimization of their specific identity groups and the invariable assignment of guilt to other groups – has now morphed into an endless litany of potential offenses, in need of correction by “cultural competence” and its fellow terms. The policing of such offenses has become so routine that an ever-expanding cadre of administrators and staff is now required to carry it out (Patai, 2016a, February 7).

And why not? When something has achieved the status of a major social problem, around which moralizing and posturing can coalesce, most everyone wants to be in on the action. Each new suspected affront then becomes, as Joel Best explained in his book Threatened Children (1990), “just another instance of X,” the intractable problem, useful for mobilizing grievances and demands. Key aspects of this process, according to Best, include expanding definitions of the problem, along with escalating statistics, rhetoric, and, of course, media attention.

A current instance of this process has been unfolding before our very eyes in the #MeToo movement that began in 2017 and is still gaining steam nationally and internationally (see note 7). It has intensified (but by no means originated) the atmosphere of moral panic that has been festering for the past few decades. Nearly 25 years ago, the cultural theorist John Fekete analyzed this phenomenon in his powerful and controversial book Moral Panic: Biopolitics Rising (1994), more relevant today than ever. “Biopolitics,” writes Fekete, is a “new primitivism which promotes self-identification through groups defined by categories like race and sex.” Such an atmosphere, affecting virtually all aspects of life in North America (and spreading from there), as we see on a daily basis today, characteristically refuses to distinguish trivial from serious charges, so that rape and a vulgar pass are treated as equally grave; is uninterested in the accuracy of allegations; and cultivates a crude identity politics that promotes vilification of some groups and celebration of others. Furthermore, it encourages dangerous “panic remedies,” as Fekete points out. By capitulating to such a climate, democratic and formerly liberal societies start to unravel, adopting practices that closely resemble those found in authoritarian dystopias, whether satirical or actual.

It is illuminating that while “moral panic” spreads, alarm over actual political threats diminishes (unless those threats come from conservative politicians in the west). In the past 15 years, however, as the problems associated with massive Muslim immigration to Europe have intensified, a new kind of dystopian fiction has emerged, warning of Islamist takeovers of western countries and often drawing on the history of Muslim imperialism, adapted to modern times.

The most famous of these is no doubt Michel Houellebecq’s 2015 novel Submission (the literal meaning of the word Islam), a best seller in his native France, which envisions the peaceful rise to power, in the near future, of an Islamic political party through democratic means, thanks to an alliance between socialist and Islamic politicians. Taken together with home-grown attacks on western values, such scenarios make clear that violence is not necessary to overthrow a society: a clever use of liberal values, propaganda, media support, and current multicultural dogma could bring about the same result.

As it happens, Houellebecq’s novel was published on January 7, 2015 – the very day Islamist terrorists murdered 12 staff members at the offices of the Parisian satirical weekly Charlie Hebdo, in the name of avenging affronts to their prophet Mohammed. The ensuing discussion in many western democracies about when free speech goes “too far” is instructive as a sign of the rapidly changing climate in the west and, in particular, of the habit of capitulation to Islamist extremism. Houellebecq, a caricature of whom appeared on Charlie Hebdo’s cover that week, has predictably and repeatedly been accused of “Islamophobia.” Yet his fame, or notoriety, persists in France, whereas no comparable work has attained much popularity in the United States.

By contrast, Margaret Atwood’s 1986 dystopia The Handmaid’s Tale, envisioning a fundamentalist Christian takeover of the United States that results in a nightmarish and misogynistic totalitarian society, was and continues to be an immensely popular and widely used text. No attacks on the author for supposedly abusing free speech to vilify a particular religious group were launched. Most intriguing is that Atwood’s novel was an extrapolation, an extremist fantasy, whereas Houellebecq’s, like other anti-Islamist speculative fiction of recent years, draws on widely known and verifiable proclamations and practices based in Sharia law, which numerous surveys indicate a significant number of Muslims in western countries (to restrict the discussion only to them) would indeed welcome.

Conclusion

The paradox of defending the ideological cluster of multiculturalism, cultural competence, and diversity – all resting on identity politics – is, as noted above, that such defenses invariably rely on ideas of basic human rights and freedoms that were first defined, embraced, and widely implemented in the west. It is considered unforgivably ethnocentric, however, to say this today. Nonetheless, the real costs of abjuring such values are readily apparent: one has only to look at past and present examples from the USSR to China, North Korea, Rwanda, Libya, Syria, and numerous other places.

In his essay On Liberty (1859), John Stuart Mill writes against paternalism and for individual autonomy. His liberalism, endorsing restrictions on government power, is the opposite of what today goes by the name of liberalism, with its ceaseless demands for codes and regulations imposed by the state and its institutions. Mill ends his work with this warning:

…a State which dwarfs its men, in order that they may be more docile instruments in its hands even for beneficial purposes, will find that with small men no great thing can really be accomplished; and that the perfection of machinery to which it has sacrificed everything, will in the end avail it nothing, for want of the vital power which, in order that the machine might work more smoothly, it has preferred to banish.

If taken seriously, the supposedly laudable goals of equity, inclusion, diversity, and the general pursuit of “social justice” are ultimately achievable only by rigorous micromanagement of speech and everyday interactions. Many people (mostly those living in relatively free societies) seem inclined to consider this a small price to pay for averting hurt feelings and discomfort and for implementing their naïve ideas of how to create a better world. And this is precisely what we are seeing at those highly privileged places, universities – from which the same goals and practices spread to lower levels of education and to society at large.

Extremes have a way of meeting. The celebration of ever cruder speech and gestures that followed the Free Speech Movement of the mid-1960s has by now morphed into a very different demand. Though vulgarity and sexuality pervade the mass media, in the academic world and beyond, the desire for individual freedom of speech and association has turned into cries for protection from others’ potentially offensive speech, gestures, glances, jokes, touches, invitations, innuendos, questions, facial expressions, and much else. Nothing is new here, perhaps: all things can turn into their opposite if one waits long enough.

Hubris, or political passions, should not lead us to think that if we can just regulate the content of education thoroughly, we will bring about social justice. In fact, we hardly know what “social justice” is, let alone how it may best be attained. Therefore, while we still enjoy the freedom to learn, explore, and debate, and while we still have full access to information and the means of analyzing it, it behooves us to notice how easily the new insistence on cultural competence and all its kindred terms can slide into ideological policing and cultural incompetence, rooted in generalities that may not be far removed from the stereotypes they originally aspired to replace.