There is no shortage of opinion about how to characterize the modern world and distinguish it from preceding epochs. Conjecture about this matter is invariably influenced by the training, theoretical orientations, preconceptions, and biases of scholars themselves and, relatedly, by what gets established as an object of analytic interest. Of course, beyond academia there are the bewildered and alienated ordinary souls who must make their way on the postindustrial landscape. Incredible as it may seem to the wealthy and powerful, these people – of no particular importance and increasingly estranged from the forces that shape their lives – do think about things. Although they may not know about managing hedge funds or how to trade derivatives, they are acutely aware that, for at least several decades, it has been getting harder to get ahead. Indeed, within our collective consciousness there is the idea that although our forebears may have had it tougher than us in many ways, unlike us, they had something to look forward to. Hence, a distinguishing feature of modernity is that, for the majority, conditions are declining. The content of public policy – as well as the process by which it gets done – impacts this trend. It is axiomatic that the way Western lawmakers and regulators approach their craft has altered over several decades. There are conventional means of assessing such change as well as off-the-wall approaches. In the spirit of iconoclasm – and as something of a protest against the smart ones who have in various ways failed us – I prefer the latter type. I start, therefore, by discussing late-night television. As strange as it might seem, the tonight-show guys are saying something profound about what has happened to public policy in the West.

From 1962 until 1992, TV legend Johnny Carson was an American institution. It was hard to find anyone who had not heard of him. Indeed, with accolades including six Emmys, the Television Academy's Governors Award (1980), a Peabody Award (1985), induction into the Television Academy's Hall of Fame (1987), the Presidential Medal of Freedom (1992), a Kennedy Center Honor (1993), and a posthumous star on Hollywood Boulevard's Walk of Fame, it was probably hard to find any baby boomer who had not seen his Tonight Show, which NBC broadcast across the United States on weeknights at 11:30 p.m.

Much like his modern-era equivalents, Carson relied heavily on political satire. He would typically open with a monologue about the protagonists of his day, an era spanning seven Presidential administrations: Kennedy, Johnson, Nixon, Ford, Carter, Reagan, and Bush 41. Carson was as talented and engaging as any of his successors. Indeed, the likes of Jay Leno and David Letterman have magnanimously acknowledged his influence on them personally as well as on their shows' formats (Luippold 2012). However, a careful comparative analysis reveals a key difference between the substance of Carson's humor and that of those who followed him. Carson's one-liners were typically about matters of personal eccentricity, perhaps with a focus on a target's misdeeds, appearance, or displays of physical incompetence. Here, in chronological order, are a few of his trinkets still doing the rounds on the Internet: "Did you know Richard Nixon is the only President whose formal portrait was painted by a police sketch artist?" "That would have been a great ticket, Reagan and Ford. An actor and a stuntman" (a reference to Gerald Ford's clumsiness and an incident in which he fell down the steps while alighting from Air Force One). "Ronald Reagan just signed the new tax law. But I think he was in Hollywood too long. He signed it, 'Best wishes, Ronald Reagan.'" "Nancy Reagan fell down on the White House lawn and broke her hair."

Carson rarely drew on disembodied consideration of public policy for inspiration. Public policy is dry, bookish, and based – at least partly – on a depersonalized understanding of abstract notions about how and why variables are related. When done in a sophisticated way, it applies a methodology. For example, typically from a well-defended construct about the way the world is, it progresses to a protocol resembling Descartes's scientific method and associated conjecture (constrained by principles of logic) about how to interpret research output and which output to interpret. Looking beyond the needs-analysis and research-and-development stages, regulation is drafted, moves through iterations, becomes authoritative, and is evaluated. This process is the antithesis of entertaining and contains little scope to be funny. If it were turned into a vaudeville routine, it would require a certain spectator sophistication. After all, even the funniest joke about the relative merits of Monetarism versus Keynesianism will fall flat in the absence of a threshold understanding of what these terms mean. Even if this hurdle can be surmounted, public policy is not quirky. Rather, it is cerebral and impersonal. As comedic subject matter, it is stillborn, to use a Hollywood insider term.

Fast-forward to 2018. Modern-era tonight-show hosts do not overlook haircuts, sexual escapades, or general stupidity in their never-ending quest to get a laugh. The personal weirdness factor is alive and well as a source of inspiration. However, in the post-Brexit era – an era in which the world waits in anticipation for a wall to emerge between the United States and Mexico (a wall that Mexico is going to pay for and to the other side of which illegal immigrants will soon be deported) – a new genre of humor has emerged. For example, here is part of a skit entitled "WTF is Brexit," performed by the Irish comedy trio Foil Arms and Hog (broadcast 30 June 2016), in which two guys are talking with each other at a urinal.

Guy 1: Okay, so the UK is in Europe – right?

Guy 2: Yeah.

Guy 1: So they have the Euro?

Guy 2: No.

Guy 1: Oh, so they are not part of the EU?

Guy 2: No, they are part of the EU.

Guy 1: What?

Guy 2: Yeah, for now.

Guy 1: What do you mean for now?

Guy 2: Brexit!

Guy 1: What is Brexit?

Guy 2: It's the British exit from the EU.

Guy 1: So Britain wants to leave the EU? When?

Guy 2: According to the EU, ASAP! Well actually, just Wales and England want to leave the EU.

Guy 1: What about the other two? That's not fair.

Guy 2: Well, when the UK leaves the EU, then Scotland's going to leave the UK and then join the EU!

Guy 1: And what about England?

Guy 2: Oh well, they are going to reconnect with the Commonwealth.

Guy 1: What's the Commonwealth?

Guy 2: The former territories of the British Empire.

Guy 1: So now they have an empire again?

Guy 2: Look, all you need to remember is that the UK is leaving Europe.

Guy 1: Well where the hell are they going to go, Asia?

The above exchange has nothing to do with politicians or public figures. Rather, it is a reference to the befuddlement surrounding Brexit. If it is funny, it is so because the whole plan seems ill-conceived, a paranoid reaction to unspecified problems that may not even exist.

Another contemporary-era window on emerging comedic content concerns modern America's disquieting rise in mass murder on school and university campuses. For example, on 20 February 2018, following the Valentine's Day shooting at Marjory Stoneman Douglas High School in Florida in which an alienated young man with a tragic recent past killed 17 people, US President Trump floated the idea of issuing firearms to teachers as a public policy initiative. He said, "up to 20 percent of teachers should be armed to stop the maniacs from attacking students." The following day he described "a gun-free school as a magnet for criminals." On 23 February, late-night comedian Stephen Colbert had this to say about the President's suggestion.

"Yes – just arm the teachers! I'm sure it's in the school budget. Sorry your school can't afford enough copies of 'To Kill a Mockingbird,' but good news! We're giving you something that can kill any bird. … Now Trump's idea of arming teachers did not go over well with law enforcement officials or teachers or people who are children … or others." In commenting on Trump's tweet that "if a potential sicko shooter has a large number of 'very weapons talented' teachers and others who will be instantly shooting, the sicko will never attack that school," Colbert said, "Yeah, that's what sickos are known for – logical reasoning." Colbert then said, "What does he mean by 'weapons-talented'? That's not a phrase I want to associate with teachers … wow Earl, you sure are handy with guns. Have you thought about working with children?"

In the above monologue, Colbert obviously mentions the President, and knowing something about the kind of man Donald Trump is perhaps enhances the comedic potential of the skit. However, the joke here is really about ideas. As such, it is the proposal, not the person, that steals the comedic limelight.

So what underlies the change in tonight-show fodder? One hypothesis is that what people find funny has altered. This does not seem likely in any profound way. A competing hypothesis is more compelling: the proposition that contemporary ideas about how society should be organized are approaching the absurd, or at least have crossed a line such that they are now in a zone that renders them literally laughable. Perhaps part of the problem is that the policy wonks have been marginalized. They are no longer setting the course. It could be that there are justifiable reasons for banishing the experts, or at least admonishing them. Maybe they have failed, or just not done well enough. Alternatively, maybe – somewhere along the line – they were co-opted to do the bidding of the wealthy and those with special interests.

Aside from the reference to Carson vis-à-vis his modern equivalents, there is a more conventional way to examine what has happened to Western civic society since the 1960s. Since the end of the Second World War, there have been three eras of thought about public sector governance. The first has been variously characterized as the epoch of the New Deal consensus or regulated state capitalism, umbrella terms for Fordism, broadly conceived (Williams et al. 1992). The second is often referred to as the era of neoliberalism. The third is still taking shape and hard to characterize. However, it is distinctive in two related respects. First, it emerges from the ruins of public policy that has been chronically failing for decades, as well as from economic and governance crises that know no modern precedent. Second, it represents a mostly visceral reaction on the part of ordinary people against theory and experts. In this latter respect, contemporary-era decision makers often present themselves as skeptical of evidence-informed reasoning or even openly disdainful of it. The emerging epoch is henceforth referred to as the "age of crisis."

The contemporary era's nascent social and political landscape is especially intriguing. To the extent that it is possible to make historical judgments about its causes and correlates, it was born from the long-term failure of policy to deliver results for ordinary people and, more recently, from global calamities for which no one in authority seems to have answers, the 2008 financial crisis being the most notable case in point. It seems that in the aftermath of all that has gone wrong – the creeping malaise as well as the abrupt moments of unexpected catastrophe – thinkers have let down the populace. As noted, it is not entirely clear whether their disappointing performance pertains more to failure of imagination or to being, in various ways, "bought." Whatever the case, if theory is now a thing of the past, what is to be its replacement? A starting point here is to attempt to conceive of its opposite. One solution – possibly the new orthodoxy – is to posit a mishmash of notions that feel right, or that make large numbers of people feel good. The key word here is "feel." Concisely, feeling is antithetical to thinking. Unburdened by a need for rationale, feelings have a life of their own. Hence, reasoning and argumentation do little to change the way they are subjectively experienced. However, context influences them. For example, consider anger. For the mentally adjusted, rage and resentment are manifest following a real or imagined injustice. It is possible that having unfairness put right (or misconceptions corrected) assuages passion; however, even in such relatively rare circumstances, the pre-injustice state is not recaptured, and emotion, not cognition, invariably continues to inform actions. Furthermore, unlike theorizing, feelings are subjectively negative or positive and mostly untameable through voluntary acts of will. This is largely why to feel bad is, in principle, harmful; it is difficult to come back from. In this sense, it is noncomparable to its "thinking" equivalent, which would be something merely akin to a theory requiring adjustment or jettisoning, or a null hypothesis that cannot be rejected. For current purposes, the query of the moment is how it arises that a society devises its governing principles based on feelings rather than theorizing. This is a thorny but nonetheless inescapable question in the modern world. At least part of the answer emerges from understanding how theory has failed, and failed in such a way as to disenfranchise – indeed harm – large numbers of people. It is from such a historical examination that this book begins. The date is around 1980 for most OECD countries; but for the people of Chile, it came 7 years earlier in the aftermath of Augusto Pinochet's military coup d'état, which overthrew the democratically elected socialist government of Salvador Allende.

The Lead-Up to the Age of Crisis: From the New Deal Consensus to Neoliberalism to the Calamity-Ridden Contemporary World

Sometime around the late 1970s, a different orientation towards public policy entered the mainstream in Western countries. In certain jurisdictions, including the United States and the UK, the change was relatively abrupt. In others, such as Australia, New Zealand, and Canada, it was incremental (Katz and Darbishire 2000; Gould 2010). The details about why there was a departure from the postwar prescription for societal betterment create much of this book's context and merit attention. For example, it appears that the Fordist paradigm and Fordist-based Kaldorian economic growth, as the West's dominant mode of production, were by the mid-1970s not as effective as they had been, or could at least be portrayed as such (Mead 2004; Koch 2006). Several decades earlier, theorists such as the Italian socialist philosopher Antonio Gramsci (1934, reprinted in 1971), in reinterpreting Marx's notion of economic determinism, argued that Fordism, by virtue of the moment in history when it took root and its subsequent ubiquity, is archetypal of Western-style capitalism (Antonio and Bonanno 2000). In his short essay "Americanism and Fordism," Gramsci argued that Fordism embodies a sophisticated mix of bourgeois strategy and manipulation, classical Marxist-style exploitative elements, and a State-based response that stabilizes conditions that would otherwise culminate in a proletariat revolution. Whatever the case, the approach became, at least in relative terms, a practical and ultimately broadly beneficial theory about how to manage private capital while accommodating the disparate interests of various actors in the process.

Fordism supplied business people with key theory that allowed the promise of industrialism to be realized. For 150 years before the beginning of the twentieth century, conjecture about workforce management was piecemeal and no orthodoxy was hegemonic. In a sense, the creation of a universal blueprint for employee superintendence lagged behind philosophizing about the role and relevance of macrolevel elements of the new capitalist system as espoused by, on the one hand, its proponents such as Adam Smith and, on the other, its critics such as Marx and Engels. With some exceptions, such as the Welsh-born owner of Scottish textile mills Robert Owen (1771–1858), who provided counsel on how to motivate employees, thus establishing his reputation as the first management theorist (Joullié and Spillane 2015), prior to the twentieth century there was a paucity of practical advice. Indeed, until the twentieth century, what passed for ideas about administrative science is often better interpreted as sympathetic philosophy concerning the importance of the new management class. An example of such proselytizing is seen in the work of Alexander Hamilton (1757–1804), who put flesh on the bones of Adam Smith's Wealth of Nations treatise (1776). Hamilton produced a report on the subject of manufactures that extolled the transformative potential of management. However, the nearest he came to providing down-to-earth advice concerning governance was to reiterate Smith's counsel that factory efficiency is mostly achieved through division of labor and job specialization (Joullié and Spillane 2015; Gould et al. 2017).

It was not until the early 1900s that an integrated set of tenets concerning how to run large enterprises for profit and provide stewardship of private capital arose and abruptly became the zeitgeist (Wren and Greenwood 1998; Scheiber 2012). Hence, Fordism was born as a panacea for the problem that had bedeviled business people since the advent of steam-powered technology. The dilemma was this: how could capital, in the form of increasingly differentiated and sophisticated machines, be integrated with labor to optimize investor return in the new industrial world? The solution entailed using semi-skilled workers deployed on an automated assembly line to do specialized tasks with dedicated apparatuses. In the early twentieth century – an economic setting where key industrial and consumer markets were unsaturated and demand therefore reliably absorbed supply – the perennially elusive ideal of optimal investor return was mostly a secondary consequence of something less abstract: the output of standardized products.

A conspicuous feature of Fordism is that it emerges from theory. Scholarly opinion differs about whether it was a derivative of, or strongly influenced by, Taylorism (e.g., Hounshell 1984; Sorensen 1956; Doray and Macy 1988). However, it unambiguously belongs to the family of approaches that today are frequently derided as being in the scientific management tradition. Imperfect as these strategies now seem, a-theoretical they were not. They can be defended using logic that, in many cases, is informed by data obtained from the shop floor. Indeed, data kept scientific management theory alive and ultimately brought it – at least partially – to heel with the Hawthorne studies and the ensuing human relations revolution of the 1930s (Wren and Greenwood 1998; Taneja et al. 2011; Joullié and Spillane 2015). In this sense, Taylor's magnum opus, The Principles of Scientific Management (1911), is technically impressive due to the author's commitment to using mathematical formulations to present a view of how variables are related and his concern for empirical confirmation.

Another key characteristic of Fordism is that, as Gramsci made clear in his prison notebooks, written whilst he was imprisoned under Mussolini's Fascist regime, the term itself came to denote more than an approach to factory management. As the twentieth century progressed, the construct expanded and took on at least five divergent meanings that, taken together, define not just a management blueprint but a public policy era. First, Fordism is a label applied to a particular mode of economic growth that entails a virtuous cycle of mass production and consumption (Koch 2006). In a sense, it instantiates the conception of the trajectory of modernity outlined in landmark works such as Max Weber's The Protestant Ethic and the Spirit of Capitalism (1905). Second, Fordism symbolizes something of a capitalist concession. Ironically, in spite of Henry Ford's unitarist inclinations and legendary hatred of unions, the term now embraces begrudging recognition by employers that they should tolerate organized labor, a broadly beneficial forfeiture under the twentieth century's new social contract. Relatedly, it signifies that workers should accept management prerogative in return for rising wages (Watts 2005; Baca 2004). Third, Fordism is widely allied with the distinctively industrial-age notion that Western industries mature into inefficient but broadly accommodative structures of monopolistic competition where cost-plus rather than market consensus determines pricing (Watts 2005). Fourth, Fordism is associated with the legitimization of deficit financing and credit-fuelled consumption (McDermott 1992; Tylecote 1995). In the New Deal era, its imperfections formed much of the rationale for State intervention in economic matters including, in particular, the establishment of a welfare state and state-regulated capitalism. Fifth, Fordism has also become a research paradigm, used for example to analyze the proliferation of mass media, transportation, and politics (Wren and Greenwood 1998; Roobeek 1987). To gain perspective on how Fordism and its inextricable historical link with regulated entrepreneurship was woven into the fabric of American – and Western – industrial life, one need only reflect on the recently established, hard-won legitimacy of organized labor in the middle decades of the twentieth century. For example, in 1952, General Dwight Eisenhower, Republican Presidential nominee and archetypal tough guy, said this about labor unions:

I have no use for those, regardless of their political party, who hold some foolish dream of spinning the clock back to days when unorganized labor was a huddled, almost helpless, mass. Today in America, unions have a secure place in our industrial life. Only a handful of unreconstructed reactionaries harbor the ugly thought of breaking unions. Only a fool would want to deprive working men and women of the right to join the union of their choice. (Speech to the American Federation of Labor, New York City, 17 September 1952; cited in Newton 2011)

And so it was. During the 1950s, under a Republican administration, the New Deal consensus was hegemonic. Whatever the inadequacies of Fordism, the theory as it applies to production – in combination with its institutional employer concessions and attendant state-based softening strategies to halt what Marxists saw as the natural course of dialectical materialism – was working, and doing so for most people. For example, the decades of the 1930s to the 1960s saw inequality diminish to an unprecedented level (Gould et al. 2017; Moody 2007). During the era, the wealthiest 20% of people in the USA never held more than 65% of private capital. By contrast, since 1990 this figure has always been above 70% (Wisman 2013; Gould et al. 2017; Appleby 2011). From 1950 until 1972, real average weekly wages in the United States rose by 48%, from $212 to their highest point on record, $315. Over the same period, real income growth for families was even more dramatic. For example, the poorest 20% of American households more than doubled (116%) their disposable income in the period 1947–1973 (Moody 2007; Mishel et al. 2005; Piketty 2014).
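As a quick check on the wage figures just cited (a simple verification of the arithmetic, not additional data from the sources):

\[
\frac{315 - 212}{212} \approx 0.486 \approx 48\%,
\]

which is consistent with the 48% rise reported above.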

In the 1970s, postwar growth associated with Fordism began to decline and commentators speculated that the model itself needed re-examination. In the worst economic slump since the Great Depression, US gross domestic product dropped by 1% from 1973 to 1975 and industrial output fell by 10% (Moody 2007). Over the same period, the world's 25 richest countries saw their economic growth rate fall from 5% to zero (Harrison and Bluestone 1988; Lichtenstein 2002). In the lead-up to the 1975 world recession, inflation was rising steeply in Western countries. However, it was the external shock of the 1973 OPEC crisis, in which Arabian Gulf States acting as a cartel unprecedentedly quadrupled the price they were charging the West for oil, that seemed to herald the beginning of a new era. The OPEC crisis, as an extraordinary – as opposed to a cyclical – phenomenon, was a convenient whipping post for contemporary-era economists trying to make sense of what was happening (Moody 2007; Issawi 1979; Kolm 1977). Indeed, although much contemporaneous literature attached importance to it as the principal cause of the epoch's malaise, more systemic elements were undermining prosperity from as early as the 1960s. Moody (2007), for example, provides a nuanced thesis about why the long boom was abating from the mid-1960s. He argues that return on capital deployed in Western industries was falling from around this time. A reason for this was that from approximately the early 1960s, international competition was encroaching on domestic market share in circumstances of near market saturation. In economic parlance, Western industries were moving towards structures of perfect competition, in which output settles at a point where, for any particular firm, marginal cost equals marginal revenue and, for practical purposes, return on capital diminishes to an international norm (the textbook condition is sketched after this paragraph). To the extent that "harder working" capital was responsible for reduced surpluses, conditions for workers started to deteriorate. Specifically, from the early 1970s onwards, their wage growth stagnated. In fact, over the next 20 years (from 1972 until the beginning of the Clinton administration), average real weekly wages in the United States fell (almost) every year (Moody 2007). Adding to the problem was the scourge of inflation, which was out of control in the late-1970s period of stagflation (Brenner 2002; Mandel 1978). However, this era was relatively short-lived. What was really happening was that real wage growth was slowing because of shrinking growth in surpluses and insidious employer recalcitrance concerning the perennial wage/profit share. Whatever the case, by the 1990s American workers had 1960s purchasing power. In a vain attempt to address the problem, they toiled more. In the 1970s and 1980s, the nation's employees put in an average of 3 hours per week in overtime. In the 1990s, the figure was 4.2 hours. By the early 2000s (2000–2004), during a period of slower economic growth and consequent rising unemployment, it was 5.4 hours (Moody 2007; Harrison and Bluestone 1988). Life in the West had been getting difficult for ordinary people. It seemed that, with the rise of Japan and the Asian Tiger economies, more of the world was sharing capitalism's benefits; Fordist-era industry structures of monopolistic competition were outmoded, or at least appeared so.
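The perfect-competition logic invoked above can be stated compactly. What follows is a standard textbook formulation, offered for orientation rather than as part of Moody's (2007) analysis. A price-taking firm facing price \(p\) with cost function \(C(q)\) chooses output \(q\) to maximize profit:

\[
\pi(q) = pq - C(q), \qquad \frac{d\pi}{dq} = p - MC(q) = 0 \;\Rightarrow\; p = MC(q).
\]

With free entry, price is then driven toward minimum average cost, so economic profit – and with it any above-norm return on capital – tends toward zero, which is the sense in which returns "diminish to an international norm."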

If, in the 1970s, conditions were turning against Western workers, it is equally true that a hitherto marginal strain of thought was entering the mainstream. It is likely that worsening economic circumstances made it easier to argue that the world needed a fresh approach. The new prescription was well-articulated and ready to be implemented. It came largely courtesy of Austrian-school theorists such as Friedrich Hayek and, later, his acolytes: the anti-Keynesian Monetarist Milton Friedman and James Buchanan with his conception of public choice theory (Buchanan 2003). The alternative way of thinking was, at least according to certain commentators, a return to the common sense of the Enlightenment and the ideas of eighteenth- and nineteenth-century thinkers such as Jean-Baptiste Say (1767–1832) and Alexis de Tocqueville (1805–1859), each interested in understanding America's uniqueness in the new industrial world and, in various ways, influenced by Adam Smith's (1723–1790) belief in natural law, utilitarianism, and the inevitability of progress (Ebenstein 2002, 2003; Combe 1996). Another inspiration for the new liberals came from even earlier thinkers. For example, there was John Locke (1632–1704), who had argued persuasively that people are inherently selfish and that, from this, it follows that they have a natural right to defend their life, health, liberty, and possessions. This doctrine ultimately worked its way into the American Declaration of Independence with the expression "life, liberty, and the pursuit of happiness." However, in spite of such an apparently impressive intellectual pedigree, the prescription that was to come from Friedman and his ilk in the 1970s is arguably a radical variant of earlier liberalism. At its core was the notion that government and governance are largely pernicious influences, or at least more intrusive than they should be, and its corollary thesis: that laissez-faire capitalism is inclined to remedy both economic and social problems. The world was soon to enter the era of neoliberalism, a shorthand term for what is often thought of in North America as free-market conservatism.

In summary, therefore, it was under the twin influences of the increasingly lackluster results besetting Western countries in the 1970s and the persuasive skill of scholars and some earlier politicians, such as 1964 Republican Presidential nominee Barry Goldwater, that much of the world haltingly entered the neoliberal era. Somewhat like the Keynesian pump-priming intellectual foundations of New Dealism, neoliberalism is not a-theoretical. Indeed, as a cursory reading of Friedrich Hayek's The Road to Serfdom (1944) or even Ayn Rand's Atlas Shrugged (1992, originally published in 1957) – which, through its exposition of the so-called philosophy of objectivism, proposes unchecked ego and the ruthless pursuit of self-interest as virtuous – reveals, part of its problem has turned out to be that it privileges dogma over data. In other words, its unyielding and dispassionate adherence to ideology has guided its implementation.

Neoliberalism's proponents assert that application of the market principle should be expanded in at least five related ways. First, policy should allow private sector entities to compete to provide services that hitherto have been the preserve of the public sector. In most OECD countries, this project is applied in arenas such as the administration of prisons, hospitals, and schools. It manifests as the privatization of national airlines, telecommunications, and welfare systems (Combe 1996; Connell 2010). There is a partially disguised agenda here: if someone wants something, indeed needs it, they should pay for it. The fact that they may not be able to is a short-run impediment and one that reflects negatively on the needy person's character, the logic being that it is they, through their choices, who placed themselves in the dependent situation (Dean 2009; Cooper 2012). Second, the market solution should be the default option to remedy social – as opposed to economic – problems (Brady 2008; Harvey 2005). Hence, it should not be viewed narrowly as merely a means of enhancing commercial performance, as classically occurs when firm managers make choices concerning leasing or buying, vertical or horizontal integration, or outsourcing. Third, policy makers should be creative in finding ways to apply the market solution; in other words, they must proactively conceive of new functions and speculate about how an unfettered private sector will fulfill them for profit. In this regard, it is only in the last 35 years that there have emerged in the Western world conceptions of supply and demand for drinking water, body parts, and outer space (Connell 2010). Fourth, the Schumpeterian phenomenon of industry creative destruction is not to be disparaged but rather is ultimately beneficial for myriad actors, including those who lose their jobs when an economic sector becomes defunct (Harvey 2006). Apparently, the idea here is that laid-off workers, irrespective of their age or other circumstances, are presented with retraining opportunities that – if they are responsible citizens – they will have budgeted for during the period when they were employed. Fifth, labor markets, because they are markets, are not exempt from deregulation. This aspect of the agenda is multifaceted. Perhaps its most contentious element concerns antiunionism. Neoliberals propose that unions create competitive distortions (Braedley and Luxton 2010). They bid up the cost of labor beyond its market value. They are incompatible with the principle of merit-based promotion and appointment. They are a cause of indolence and inefficiency. For these reasons, they are typically a key initial target in implementing the neoliberal agenda. An example of this phenomenon is Augusto Pinochet's assault on Chilean unions in 1973, which came almost immediately after he seized power. Similarly, there was Reagan's defeat of the air traffic controllers in 1981, Thatcher's dismantling of the miners' union in 1984 and, in Australia, the Howard government's entanglement with dock workers in its first term (1997) and later, after it gained control of the nation's upper house of parliament, its WorkChoices agenda (2005).

A key upside of neoliberalism for the public is, in principle, the promise of reduced taxes and an ensuing downward pressure on prices. For example, in the United States, the passing of Proposition 13 in California in 1978, a referendum to cap property tax, has historical significance. It is heralded as the commencement of a new policy direction for Western governments (Connell 2010; Chapman 1998). However, in reflecting on changing taxation regimes over the last 40 years, authors such as Connell (2010) argue that although in the USA and other OECD countries tax cuts have been an animating theme of both conservative and liberal electoral campaigns, government receipts have fallen very little. Rather, there has been a conspicuous shift from direct to indirect taxation arrangements, a circumstance that unambiguously discriminates against low-income earners in favor of wealthy people. In short, neoliberalism in practice promotes regressive tax regimes.

As noted, in the Western world, an emphasis on supply-side economics – the more technical description of much of the tax-related ideology that informs neoliberalism – is mostly associated in an applied sense with the governments of Reagan and Thatcher in the 1980s (Gould and Robert 2013; Gwartney 2008). The theory had not broken into the mainstream prior to this time, although it perhaps came close to doing so during the failed 1964 US Presidential campaign of Barry Goldwater, a man whose time had not yet come. However, history has a way of being Western-centric. In particular, the first real test-run of resurrected liberalism came in 1973, following Augusto Pinochet's military coup in Chile. The historical importance of Chile in this area cannot be overstated and is eloquently summed up by Robert Packenham and William Ratliff (2007, p. 9) of the Hoover Institution, who note:

The first country in the world to make that momentous break with the past – away from socialism and extreme state capitalism toward more market-orientated structures and policies – was not Deng Xiaoping's China or Margaret Thatcher's Britain in the late 1970s, Ronald Reagan's United States in 1981, or any other country in Latin America or elsewhere. It was Pinochet's Chile in 1975.

In decisively rebuking his socialist predecessor Allende, Pinochet brought in the "Chicago Boys," mostly Chilean economic consultants who had spent time at the University of Chicago under the tutelage of ideologues such as Milton Friedman. The job of the advisers was to recast the old order. Henceforth, the market solution was to be the guiding maxim for addressing the nation's economic and social problems. To support the approach, the regime removed trade barriers, privatized key state-owned industries, created a central bank with authority to set interest and exchange rates independently, cut wages, and privatized social security. In foreshadowing what was soon to come in the United States, US Treasury Secretary William Simon described the new South American dictator as having brought economic freedom to his country (Asen 2009). In fact, despite the USA's official position that it merely approved of – or at most tacitly supported – the Pinochet regime, several scholars have argued that America proactively paved the way for him by undermining the Allende government, using the CIA to stage destabilization initiatives and covert trade-related interventions such as an "invisible blockade." Authors such as Peter Kornbluh in The Pinochet File (2003), Tim Weiner in Legacy of Ashes (2007), and Christopher Hitchens in The Trial of Henry Kissinger (2001) have proposed slightly differing theses concerning US manipulation of South American politics during the 1970s.

The Chilean dalliance with neoliberalism did not go as planned. In 1986, the Indian economist Amartya Sen provided a postmortem of the experiment. His conclusions were gloomy. In what ended up being known as the "Chicago road to socialism," he noted that, by the early 1980s, the new approach had been disastrous, prompting a wholesale buy-back of public assets in an effort to restore the status quo. Ironically, the State ended up owning and running more of the economy than it had during the Allende administration, including industries focused on manufacturing and exporting as well as the banking sector.

The rest of the world's enthusiasm for neoliberalism was not dampened in the wake of the Chilean experiment. However, even before factoring in the impacts of, for example, the 2008 global financial crisis, the new way was not delivering its promised benefits, or at least not doing so for the majority of people living in OECD countries. For example, in the early 2000s, the "great moderation" in the business cycle was unmistakable on graphs (Summers 2005). The postindustrial, more tempered boom-bust sequence seemed to be attributable to certain of neoliberalism's peripheral elements. These included increased central bank independence; application of the "Taylor Rule" in monetary policy (which specifies that, in the long run, an independently operated central bank will raise its policy interest rate by more than one percentage point for each 1% rise in inflation; its standard formulation is reproduced after this paragraph); and greater within-sector flexibility, including elements such as "just-in-time" inventory management but, more particularly, increasingly flexible labor markets (to use the vernacular). At that time, proponents of rational expectations and efficient markets theory held the intellectual high ground. The market solution was working, but for whom? Analysts such as Mark Weisbrot and Rebecca Ray (2011) had the temerity to compare the period 1960–1980 with the period 1980–2005. They focused on indicators including economic growth and social/psychological measures within low- and middle-income countries which, according to at least one variant of the theory (e.g., Moody 2007), should have been first in line to receive neoliberalism's advantages. They concluded that, far from such nations being the beneficiaries of a so-called rising bottom, they had in fact declined on key financial indices. Within the United States – a country that might have seemed at little risk of setback when measured in narrow economic terms – it is conspicuous that social justice indicators ranked it in 2015 at number 25 out of 31 OECD countries, just above Turkey, Greece, and Chile (Kauder and Potrafke 2015). When further scrutinized, it seems that some sectors of the US population were hit particularly hard by the new social and economic order. For example, researchers such as Case and Coates (2017) and Case and Deaton (2017) have been tracking the aggregate fate of middle-aged white Americans (particularly white American males) without college-level degrees since 1999. They note that the mean mortality rate for this cohort was declining steadily throughout the twentieth century but in the twenty-first century, in contrast to other wealthy countries, began to rise precipitously. They further conclude that the trend reversal is mostly attributable to elements such as suicide and drug overdose that, in turn, stem from lost status arising principally from job insecurity and career disruption in the age of neoliberalism. Such research is a bombshell. It indexes a larger phenomenon that turns out to be a colossal blind spot for those espousing neoliberalism in theory. For example, in their book The Spirit Level: Why Inequality Matters, Wilkinson and Pickett (2010) provide evidence that any mode of production – even if it produces across-the-board rising levels of prosperity – will have adverse aggregate psychological impacts if it also exacerbates wealth inequality. A thought experiment (not taken from the book) serves to underscore this point. Imagine a boss arrives at work and randomly summons one of their employees to tell them that they are to receive a salary increase of, say, 20%. The employee feels elated. Imagine the same boss subsequently announces to all other equivalent-level employees that they will receive a pay rise of 25%. Now, the first employee feels worse than they did before the boss arrived.
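For reference, the "Taylor Rule" mentioned above has a standard formulation; what follows is Taylor's well-known 1993 specification, reproduced for orientation rather than drawn from the sources cited in this paragraph:

\[
i_t = r^{*} + \pi_t + 0.5\,(\pi_t - \pi^{*}) + 0.5\,(y_t - y^{*}),
\]

where \(i_t\) is the nominal policy rate, \(r^{*}\) the equilibrium real interest rate, \(\pi_t\) current inflation, \(\pi^{*}\) the inflation target, and \(y_t - y^{*}\) the output gap. Because the overall coefficient on inflation is 1.5, the rule prescribes raising the nominal rate by more than one percentage point for each 1% rise in inflation – the more-than-one-for-one response described above.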

In 2000, once again before the 2008 crisis, Crotty undertook a postmortem on the global economic effects of neoliberalism. His conclusions were dismal. He wrote (p. 10):

The evidence to date supports neoliberalism’s critics. The promised benefits of neoliberalism have yet to materialize. Global income growth has slowed as has the rate of global capital accumulation, at least for the majority of the world’s people. Productivity growth has deteriorated, real wage growth has declined, inequality has risen in most Western countries, real interest rates are higher, financial crises erupt with increasing regularity, the less developed nations outside East-Asia have fallen even further behind the more advanced and average unemployment has risen.

Hence, over-reliance on the market solution was not delivering its assured dividends even before 2008. In the wake of the new approach, the US middle class was being decimated and the twentieth century's promise of social mobility was now a pipedream. Following the global financial crisis – an event that, it seems likely, no party with skin in the game foresaw – government, the private sector, and an increasingly beaten-down public were unmistakably in reaction mode. Initially, within the United States and elsewhere, there was a fleeting return to Keynesian-style pump priming, even by the hard-line Neocons (Stewart 2009). When the postcrisis mess was handed from the Bush to the Obama administration, pump-priming and prop-up measures continued, for a brief time, and mostly only for corporate America. For example, following the collapse of investment bank Lehman Brothers and the takeover of Merrill Lynch by Bank of America, the Federal government, through the Federal Reserve, committed 85 billion dollars to ensure that insurance giant AIG remained solvent (Boyer 2009). Overall, the US Federal Reserve sponsored massive – although less than recommended – stimulatory initiatives (Institut économique de Montréal 2009; Boyer 2009). In Europe in 2008, the EU Commission recommended a 200 billion euro stimulus package (about 1.5% of the aggregate GDP of the countries being targeted).

However, in 2010, the G20 summit held in Toronto marked the beginning of a fresh global financial policy orientation for governments struggling to manage the aftermath of the crisis (Le Queux and Peetz 2013; Lo Duca and Stracca 2014). Because the bailouts were now threatening the solvency and credit ratings of countries around the world, stringency was the new ordre du jour. The plan devised in Toronto was to halve fiscal deficits within G20 nations by 2013 (Le Queux and Peetz 2013). Long-term austerity had begun. Once again, the State was the standard-bearer for free-market orthodoxy and it was ordinary people who were to take another body blow.

In his book Failed (2015), Mark Weisbrot presents a subversive thesis about the strategic orientation of governments following the 2008 crisis. Weisbrot examined the minutes of regular IMF consultations with member governments (covering 27 countries) for the years 2008–2011. He concludes that participants viewed spending cuts to the public sector and reduced public services as indispensable. This finding is perhaps unremarkable. However, Weisbrot observed something more insidious. Specifically, austerity measures are typically implemented in parallel with an agenda to limit democratic participation. For example, in Europe there has been an emphasis on transferring decision making to unelected bodies and, insofar as possible, on lessening the influence of national governments. In a similar vein, in the United States the lower 70% of those eligible to vote never do, and elected representatives therefore disregard their preferences (Kiesa and Levine 2016). This precipitous decline in electoral participation has provided impetus for theorists such as Ferguson to extend his investment theory of party competition to the arena of congressional elections. Ferguson notes that, since approximately 1980, campaign spending has been a near-perfect predictor of electoral outcomes (Ferguson 1995).

There is a fuzzy boundary between the end of the period of neoliberalism and the emergence of the next – current – era, described here as the "age of crisis." As Ferguson (1995) notes, there is no special reason to believe that much of the neoliberal agenda is being jettisoned. On the contrary, it has made people who were always very rich, and who make electoral donations, even more so. However, two elements distinguish the new era. First, the most conspicuous of the contemporary Western-style public policy makers are, at least overtly, contemptuous of experts and disdainful of approaches that seem based on theory and philosophy. As noted, in the modern era, the way a strategy makes one feel mostly eclipses consideration of whether theory predicts that it will work. Second, despite a now more disguised but still hard-core continued preoccupation with the substance of neoliberalism, the contemporary epoch's approach to governance resurrects some elements of New Dealism, or at least pays lip service to their value. Such elements include isolationism, protectionism, and xenophobia. In the very recent past, President Trump's preoccupation with building a wall between Mexico and the USA, Brexit, the rise of the alt-right, and resurrected plans to reinstate bygone-era tariffs and trade embargoes are evidence of a widespread popular yearning for the good old days. Despite some aggressive rhetoric, it is unclear whether these agendas will be implemented or even whether they could be. It is also not clear whether contemporary-era leaders are committed to their implementation. Indeed, it seems likely that they are not. After all, as argued, compared to their predecessors, the leaders and decision makers of 2018 are mostly not theory-driven and, as such, are unmoored from ideological allegiance.

To prosecute the case that contempt for theory and a superficial preoccupation with good-old-days socialist-style protectionist elements (at least for those in the in-group) are distinctive hallmarks of the new era, it is instructive to examine two recent Western-world phenomena. First, there is the celebration of the nonexpert as a panacea for complex problems. The key word here is "celebration." Modern history is replete with examples of politicians who were not technically competent. Indeed, in the case of the USA at least, this was part of the original vision of the founding fathers with their conception of the citizen-statesman (Gould et al. 2017). What delineates the new epoch, however, is lauding ignorance as a virtue. An early sign of the change came just before the 2008 US Presidential election when Republican nominee Senator John McCain chose as his running mate then Alaskan governor Sarah Palin. According to McCain, Ms Palin was chosen because, although not an expert in anything vaguely relevant to the most pressing problems faced by the country, she does not mince words and has common sense (Dunn 2011). For example, during the Vice-Presidential debate held on 2 October 2008, when she was asked which of her side's policy plans would have to be scaled down due to the global financial crisis, Palin did not miss a beat. She said, "How long have I been at this – like five weeks?" In an effort to show that she related to ordinary people, the same Ms Palin inquired confidently, "What is it that a V-P actually does?" when asked about the role by Larry Kudlow of CNBC on 25 June 2008. Second, there has been the rise of populist fringe movements that manifest anger and resentment but have few creative, well-defended public policy suggestions. This is not a partisan matter; it occurs on the left and right of the political spectrum, as in the case, on the one hand, of Occupy Wall Street and, on the other, of the Obama-era Tea Party movement.

Management in an Age of Crisis: Trump, Brexit and Those Incompetent and Co-opted Experts

Donald Trump was never going to win the 2016 American Presidential election. He was not even going to throw his hat in the ring. Prior to his official announcement, made in Trump Tower on 16 June 2015, he certainly gave indications that he would run. However, he was really just an overly indulged carnival barker with a Twitter account. His US–Mexico wall idea was comparable to mid-twentieth-century eccentric millionaire Howard Hughes's H-4 Hercules concept. The showy Spruce Goose, as it was better known, was an enormous but useless seaplane that was originally trumpeted as the ultimate solution to Axis-power submarines sinking US supply ships in the Atlantic after the country entered the Second World War. Members of Roosevelt's War Cabinet were unimpressed with the idea, but ultimately Hughes Aircraft Corporation secured a development contract with the Department of the Navy to build three prototypes (McDonald 1981). The Wall was to be Trump's Spruce Goose: expensive and inefficient, flamboyant and headline grabbing, but in the end, more about style than substance.

It was not that Trump did not have marketing skill and the sectional appeal that inevitably comes from hard-hitting rhetoric, an ostensible real-estate empire, and a list of authored books with menacing titles like Think Like a Champion (2009), Time to Get Tough (2011), Think Big and Kick Ass (2007), and The Art of the Deal (1987). Undeniably, his capacity to manipulate the media was impressive. It was reminiscent of actor Orson Welles who, in 1938, created mass panic in the tristate area when he read aloud on a wireless broadcast an adaptation of H. G. Wells's The War of the Worlds, announcing that Martians had landed in a carpark at Grover's Mill, New Jersey, and that 7000 US servicemen had been deployed to face them down. Welles's career took off following this incident. Similarly, Trump's posturing about being President aimed merely to secure a fifteenth series of The Apprentice with NBC, to promote brand Trump, and to fire a warning shot across the bow of the real contenders who, according to him, were losers and needed to lift their game. At 69, Trump was too old, lacked discipline, knew nothing about bureaucracy, championed ridiculous and impractical ideas, and had no experience of elected office. Besides, Trump could not run because he could not allow his tax returns to be released or certain of his disreputable business dealings to be scrutinized. In declaring Chapter 11 bankruptcy, he had walked away from the Taj Mahal Casino in Atlantic City with debt owing to ordinary people – dishwashers, painters, carpenters, plumbers, glaziers, and drapery installers – and this was not the first time. He had a class action against him by former Trump University students. Depending on one's political sympathies, he was either a branding genius or a shameless narcissistic self-promoter. Either way, the idea of him being a candidate was Barnumesque: an off-the-wall publicity stunt, patently absurd as a matter of practice. This was what the experts said. They were wrong.

When Trump did announce his candidacy, it was clear he could not win the primaries. He was competing in a field of 17, which included experienced State governors, Senators, and credible business people. There had not been such a large showing on the Republican side since the Lincoln–Douglas election of 1860. Moreover, Trump had no background in the GOP, no debate experience, no super PAC, and continued to say things that embodied simultaneously incoherence and viciousness. He made crude and offensive remarks about women and seemed to have an adolescent's preoccupation with their weight and appearance. He ridiculed and insulted people who were unable to respond. He showed limited engagement with complex issues and an unyielding preference for the glib and superficial over the scholarly and analytic. Perhaps most offputtingly, he crossed a sacred line by casting aspersions on the bravery and patriotism of Senator John McCain. During the Vietnam War, McCain, then a Navy pilot who had served aboard the USS Forrestal, was shot down, captured, and horrendously tortured for five and a half years by the North Vietnamese, in circumstances where he ultimately could have secured his freedom but refused to do so unless every prisoner being held alongside him was also released. Trump said about the Senator on 18 July 2015, at a campaign event at the Family Leadership Summit in Ames, Iowa:

He’s not a hero. He’s a hero because he was captured. I like people who weren’t captured.

Once again, the experts said Trump would not – could not – become the Republican standard-bearer. They were wrong – for the second time.

After Trump became the presumptive nominee on May 3, 2016, he was not going to beat Hillary Clinton in the general election. This could not happen because the whole of America participates, or at least has the right to participate, in deciding who will be the nation's President. The influence of the lunatic fringe was now diluted. The polls were showing that there would be regression towards the mean, attenuation. The Electoral College system, while not perfect, had been conceived by the wise founding fathers and thoughtfully refined after the Civil War. It invariably produces an optimal solution. Although the Democratic nominee was not especially inspiring and came with baggage, survey data indicated that she would naturally fall into the role. It was not even necessary for her to campaign much, hold press conferences, or go to red states. All she needed to do was what she was already doing and was comfortable with: stay mostly in New York and California and appear on shows that appealed mainly to liberal-minded women, such as The View and Ellen. In the first Presidential debate, held on 26 September, during a clash about the causes of poor-quality public schools and ageing infrastructure, Trump seemed to wear it as a badge of honor that he does not pay his share of tax. "That makes me smart," he said. Expert commentators on CNN denounced this utterance as unprecedentedly injudicious (e.g., Diaz, 27 September 2016). It defied logic and reason. Trump had forgotten that the bulk of people who would soon vote were taxpayers. In asking why they – the battlers – should be subsidizing a millionaire at tax time, they would join the dots about what sort of man Trump was. However, this was all prologue. Later, his candidacy would fall off a cliff. On October 7, when the tawdry Access Hollywood tape came out, revealing an exchange in which Trump shamelessly boasted that his fame gave him the green light to sexually assault women as and when he felt like it, he was undoubtedly finished. Now it was a matter of the iron law of arithmetic. His base of voters was, from that moment forward, to be only some percentage of males, 50% of those who would be casting a ballot. Simple arithmetic predicted he would soon be rendered a historical footnote. He was about to become the twenty-first century's Gary Hart, the 1988 Democratic party frontrunner who was forced to drop out in favor of Michael Dukakis when it was revealed that he had a girlfriend, Donna Rice, while he was married. Inductive reasoning was relevant here. Hart was a former Senator. He was an intelligent and attractive man. However, what he did, or was alleged to have done, despite being comparatively trifling, finished him. Ipso facto, 28 years later, the star of The Apprentice was soon to get an ignominious dose of his own medicine; he would be fired. This was, once again, the consensus of experts offering narrowly focused analyses. They were wrong – for a third time. Zero out of three for the intelligentsia concerning their analysis of the rise of Trump.

The story of the 45th US President’s political ascendancy embodies the paradox of the last 50 years. Experts have let down most of the public with their prescriptions for societal betterment. Whether well intentioned or disingenuously attempting to create a pretext that allows the wealthy to further enrich themselves at the expense of everyone else, they have often been wrong. Their remedies and proselytizing have seemingly arisen from disciplined analysis. Wallowing in the intellectual debris of postindustrialism, ever more experts used ever more theory and logic to misread who was to be the President of the United States in 2016. Despite the risk of throwing the baby out with the bathwater, the decimated middle class, and those worse off who would likely never get to be part of it, were fed up with the experts, and not without justification. A new and dystopic era had emerged. It was post-neoliberalism – postindustrialism. For the first time since Henry Ford worked out how to combine capital with labor and the state thoughtfully responded to make sure that no one was left behind, the world was atheoretical. Philosophy was no longer to undergird public policy. The airplane was being built as it was being flown, and the way people felt was more important than ideas defended using the evidence-informed application of reasoning. Action without theory may have haltingly started with the emergence of Sarah Palin but had now reached its peak with the election of Trump. The rise of the brazenly brash and overtly anti-intellectual American politician was a manifestation of a larger phenomenon. Theory and ideological commitment were being cast aside everywhere. Consider the comment about Brexit made by the UK’s Justice Secretary, Michael Gove, on June 3, 2016, before the Presidential election. In refusing to name any European economist who thought Britain’s exit from the Union was a good idea, Gove said in an interview with Faisal Islam

I think that the people of this country have had enough of experts from organizations with acronyms – saying that they know what is best and getting it consistently wrong, because these people – these people – are the same ones who got consistently wrong (quote reproduced verbatim).

The population sided with him. Throughout the Western world, the public was sending experts the salutary message that they are not as good as they think they are, a missive for which even the most enlightened person of reason should have some sympathy. Indisputably, there is abundant evidence that the clever ones have not done very well. In their mediocrity, they have contributed to the dawning of the new era, both through their bad advice and through the elitist and condescending way of conveying it. It may even be worse than just a case of incompetence, a case well laid out in economist Thomas Sowell’s book Intellectuals and Society (2010), wherein the author argues that much social commentary and public-policy development work is substandard because, unlike when running a business, for example, those who produce theory or instantiate it through regulation mostly have nothing at stake if they get it wrong. Indeed, a more pernicious thesis also exists: the possibility that experts are being manipulative, clandestinely working, for example, for those with money and power. In Kurt Andersen’s recent book Fantasyland: How America Went Haywire (2017), this theme is developed, and its relationship with conspiracy theories, the blurring of the boundary between news and entertainment, and the rise of populist fringe movements is well explored. Whatever the case, the new era of public policy is skeptical, indeed disdainful, of ideas. Hence, the age of crisis is also the age of eschewing the pointy-heads. Insofar as its actual manifestation in policy is concerned, it is a disjointed and decontextualized patchwork of nineteenth-century notions about the value of laissez-faire capitalism, industrial-age forms of protectionism with their attendant xenophobia, and ad hoc economic winner-picking. How long this will last, or what its results will be, could not be less clear.

From the beginning of the twentieth century, there have been attempts to periodize approaches to management philosophy, public administration, and the shifting economic milieu. Each such effort has typically focused on a slightly different object of analysis. Elements established as analytically consequential have included approaches to people management (e.g., Barley and Kunda 1992), the beginning and end points of economic long waves (e.g., Kondratieff and Stolper 1935; Schumpeter and Nichol 1934; Rostow 1978; Sterman 1986), and, recently, the moral and ethical legitimacy of the public versus the private sector (e.g., Park and Gould 2017; Gould et al. 2017). Some proposed frameworks have been deterministic. These embed the notion that human endeavor advances the development of theory until, at key tipping points, the prevailing framework becomes unrecognizable; when such thresholds are reached, a new era commences. For example, there is Barley and Kunda’s (1992) classification of management ideologies: industrial betterment (1870–1900), scientific management (1900–1923), welfare capitalism/human relations (1923–1955), systems rationalism (1955–1980), and organizational culture (1980–1995). In a similar vein, there is Gantman’s (2005) typology of the nineteenth century’s liberal capitalism, the early and mid-twentieth century’s organized capitalism, and the disorganized capitalism of the late twentieth century. In this chapter, several of the key frameworks for understanding the recent history of management and public policy have been collapsed together and summarized as the late industrial-age era of Fordism, broadly conceived, and the epoch of neoliberalism. Each of these eras, although an approach to public policy, has had a symbiotic relationship with philosophy concerning the management of private capital. In other words, the way the State handles matters of public administration has invariably created something of a template for people management generally; or perhaps it is the other way around. Indeed, the causal direction of this relationship remains unclear and will not be addressed here. Rather, for present purposes, what is noteworthy is that historically there has been synergy between public policy and approaches to private-sector governance, and each such endeavor has typically been theory-driven.

An understanding that the rupture between industrialism and postindustrialism has much to do with insidious long-term public policy failure and unexpected moments of crisis is a solid starting point for coming to grips with the nature of modernity. But it is only a starting point. Indeed, the road ahead for the sense-makers is tough, and the scholarly agenda is plagued by unique challenges. Some of these pertain to interpreting what is happening. Others concern drawing conclusions about how apparently shambolic elements will stabilize, if they stabilize at all, and whether they are best viewed as transitional. Another research agenda concerns the indirect impacts of public policy: for example, how do its substance and its process of creation affect management more broadly and the management of private capital? These are arguably pressing issues. They are dealt with head-on in the pages that follow.

In this section, outstanding business and management scholars explore aspects of the contemporary world of work and governance in circumstances where the context is shaped by long-term public policy failure, moments of economic and social catastrophe, and community skepticism. Authors delve into facets of the lead-up to the present epoch as well as its more consequential current manifestations. The section has six more chapters (Chaps. 41, “Labor and Employment Practices: The Rise and Fall of the New Managerialism,” 42, “A Return to the Good Old Days: Populism, Fake News, Yellow Journalism, and the Unparalleled Virtue of Business People,” 43, “Why did the Great Recession Fail to Produce a New New Deal in the USA?,” 44, “Trade Union Decline and Transformation: Where to for Employment Relations?,” 45, “The New Executive: Interconnected Yet Isolated and Uninformed – Leadership Challenges in the Digital Pandemic Epoch,” and 46, “Conclusion: Management Theory in Crisis”). Chapter 41, “Labor and Employment Practices: The Rise and Fall of the New Managerialism,” by Professor John Godard from the Asper School of Business at the University of Manitoba, is an exposé of the rise and fall of the new managerialism; it examines supervisory practices in Western – mostly US – workplaces since the 1950s Golden Age and details their consequences, too many of which have been largely unforeseen. Chapter 42, “A Return to the Good Old Days: Populism, Fake News, Yellow Journalism, and the Unparalleled Virtue of Business People,” authored by Professor Mark Belnaves from Kuwait’s Gulf University for Science and Technology, addresses one aspect of technology’s role: the rise of the digital persona. Belnaves argues that this rise is a somewhat overlooked but undoubtedly sinister cause of (so-called) trusted sources no longer being as advertised. Chapter 43, “Why did the Great Recession Fail to Produce a New New Deal in the USA?,” by Professor Jon D. Wisman from American University (Washington, D.C.), explores how and why a wealthy elite’s command over ideology was significantly delegitimized during the Great Depression but remained essentially unchallenged during the Great Recession, with the consequence that whereas a levelling of income, wealth, and privilege occurred in the wake of the earlier crisis, a widening of these disparities has characterized the later one. Professor Bradley Bowden, from Griffith University, in Chap. 44, “Trade Union Decline and Transformation: Where to for Employment Relations?,” writes a focused piece about the marginalization of unions and organized labor over the last 50 years. In this piece, Bowden shines a light on a key myth about the phenomenon. Specifically, he argues that it is not so much union decline that should be our analytic focus but rather how unionization has become largely the preserve of high-paid professionals, now well serving the interests of this cohort while neglecting organized labor’s traditional constituency, the low-skilled and low-paid. Chapter 45, “The New Executive: Interconnected Yet Isolated and Uninformed – Leadership Challenges in the Digital Pandemic Epoch,” by Professor Kathleen M. Park (Boston University and Research Fellow at MIT’s Sloan School of Management), describes and interprets recent shifts in executive thinking. Her thesis explores the paradox of a decline in ethical action in combination with a rise in ethics education.
It also highlights the problem of an increasingly narrow focus in management thinking and priorities in the lead-up to the age of crisis. The conclusion of this section (Chap. 46, “Conclusion: Management Theory in Crisis”), written by Professor Jean-Etienne Joullié, from the Gulf University for Science and Technology and Laval University, argues that in tandem with public policy failure, there has been failure on the part of management theorists to position their discipline in an appropriate epistemological framework. Joullié argues that their pretensions to establish management as a science, on the same footing, for example, as physics, have fallen short. To the extent that the practice of management influences – and is influenced by – public policy, Joullié’s chapter represents an appropriate conclusion to this section. Indeed, it reveals something about the systemic and, to an extent, exponential nature of decline. Insofar as the world of work and employment is concerned, it also sheds light on another aspect of how experts have failed to deliver.

Cross-References