This chapter explores how we make reliable and systematic errors in our thinking, and how these systematic biases can affect the decisions that we make.

‘The general root of superstition is that men observe when things hit, and not when they miss; and commit to memory the one, and forget and pass over the other’. (Francis Bacon, 1561–1626, English philosopher of science)

Decision-making by humans is biased in a variety of ways. Humans are often very poor decision-makers, failing to take account of information that can be critically important for decision-making and action. Francis Bacon, in the quote above, gives one very good example—humans are good at taking account of hits, but are poor at remembering misses. Hence the huge number of superstitious beliefs about ‘signs’ that predict the weather, or the stock market, or whatever. Departures from what might be expected on the basis of a purely rational calculation following the rules of logic and what economists refer to as ‘utility maximisation’ are pervasive features of human thinking.

It is also starting to become clear that these biases serve outcomes other than a straightforward utility or reward-maximising function for individuals. Biases in decision-making may serve to reinforce affiliation or the strength of the bonds within a tightly defined group, for example. Biases may also serve to punish behaviour that is seen to be transgressive—taking too much reward for too little effort (free-loading), for instance. In this case, there may even be a willingness to engage in costly punishment—to punish a member of one’s own group, or to punish an opposing group, even if there is an economic cost to oneself and to one’s group (Henrich and colleagues 2006). The question posed by the group is simple: ‘what sort of a hit are we willing to take in order to teach the other lot a lesson?’ Both history and experiments show that humans are willing to take punishments in order to teach the other side a lesson—costly punishments that would not be prescribed by a purely rational calculus. The other side in the dispute may assume that there will be no costly punishment, as in believing that ‘they won’t impose trade sanctions on us. After all, they sell us a lot of cheese (or cars, or wine, or shoes—or whatever)’. But they can, and they will. A refusal to recognise in a negotiation that one side will act contrary to their own narrowly defined economic self-interest to ensure a broader political, legal and social lesson is learned is a very common mistake.

Cognitive biases are a pervasive and universal aspect of human thinking. In essence, they are systematic biases in gathering information and in thinking that lead to a deviation from rational calculation, or even simply from what is demonstrably ‘good and fair judgement’. The case study in Chapter 1 of this book provides examples of many cognitive biases, and we will discuss a few of them below. There are huge numbers of biases—the Wikipedia entry for ‘Cognitive Bias’ lists more than 175 of them. The famous cognitive scientist, Daniel Kahneman, was awarded the 2002 Nobel Memorial Prize in Economic Sciences, his citation reading that his award was ‘for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty’. His book ‘Thinking, Fast and Slow’ (Kahneman 2011) explores many cognitive biases, many of which he discovered and explored experimentally; it is important reading to supplement this chapter. We are aware of many of these cognitive predilections already. Phrases such as ‘wishful thinking’ or ‘cherry-picking’ reflect the idea that when we think about a topic, or argue in favour of a particular point of view, our thinking is in some way biased. We are engaging in a form of argument where our thinking doesn’t fairly, reliably and accurately reflect reality or the truth. Instead, we are choosing evidence to suit our wishes and beliefs rather than reflecting reality as best we can.

There are many examples of this kind of thinking in the real world. Politics in particular is bedevilled by this. People who identify with a political candidate will deny data or evidence that is contrary to their point of view—in other words, the evidence is retrofitted to one’s point of view rather than the point of view being adjusted to the data. A wonderful and egregious example arises with respect to political polling for elections. In the 2012 campaign for the Presidency of the United States, the political poll aggregators consistently showed Barack Obama leading the Republican nominee, Mitt Romney, to the chagrin and disbelief of (many of) Romney’s supporters. They alleged, without meaningful statistical sampling evidence, that the polls were skewed. Thus, the ‘unskewing’ industry was born, led by Dean Chambers, a Romney supporter convinced that the polls were biased in some way (search for ‘unskew polls 2012 dean chambers’). The reality of the polling was that the balance of support in the election was against Chambers’ preferred candidate, and an honest analysis of the polls would have told you that ahead of time. An honest and veridical analysis, though, comes with a price: acknowledging that your preferred candidate is doing badly at the polls. It’s much easier to spend time ‘unskewing’ polling data than getting out the vote for your candidate. You might actually have to go out in the cold and rain to try and convince uninterested voters on their doorsteps!

Of course, polling is a complex technical affair, involving issues to do with sample size, confidence intervals, sampling demographics and a whole host of other variables (a minimal sketch of the basic arithmetic follows below). In the US example, sites like the Princeton Election Consortium (run by the neuroscientist and data analyst, Sam Wang) or FiveThirtyEight give a sense of the technicalities involved, and therefore of how easily individual judgement can go wrong. Furthermore, the memory that people have for what the polls showed, as opposed to what they actually showed, can be very faulty indeed. And, occasionally, pollsters do get things wrong when they make modelling assumptions about the relationship between their polling and the underlying trend in reality; some will do a better job of it than others.

Where biases become maladaptive is precisely in these cases: information gathering can be conducted over an extended period of time, and the deliberative processes associated with the information gathered can also be conducted over a prolonged period of time. The key point, of course, is having time—more good data are better than less bad data, all things being equal, assuming that decision-making is being driven empirically, i.e., that it reflects the reality of the world as it is rather than as one might like it to be. While the singular Steve Jobs may have been able to deploy a ‘reality distortion field’ to get his way in business, and to impose his very particular vision on it, he also had many failures (the Apple III, the Lisa, the hockey puck mouse, NeXT, Apple TV…). Even he could not distort reality so much that it would conform to his vision of what it should or could be.
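To make the polling point concrete, here is a minimal sketch in Python, using entirely made-up numbers, of the most basic of those technicalities: the margin of error of a single poll, and why an aggregate of several polls is harder to ‘unskew’ than any one of them. The figures and the function name are illustrative assumptions, not real polling data or the method of any real aggregator.

    # A minimal sketch, with made-up numbers, of a single poll's margin of error
    # and of why pooling several similar polls narrows that margin.
    import math

    def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
        """Approximate 95% margin of error for a sample proportion p from n respondents."""
        return z * math.sqrt(p * (1 - p) / n)

    # One hypothetical poll: 51% support among 800 respondents.
    single = margin_of_error(0.51, 800)
    print(f"single poll of 800 respondents: 51% +/- {100 * single:.1f} points")

    # Pooling five similar polls (about 4,000 respondents) roughly halves the margin.
    pooled = margin_of_error(0.51, 4000)
    print(f"five pooled polls:              51% +/- {100 * pooled:.1f} points")

Real aggregators are far more sophisticated than this (weighting polls by recency and sampling quality, for instance), but the basic point stands: more good data give a tighter estimate, whether or not you like the answer it points to.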

Thinking about Cognitive Biases

The sheer number of cognitive biases that have been identified (175 or more; Kahneman 2011; Nisbett 2015), and the overlap between at least some of them, is itself a major problem. The following consideration alone should convince you of their reality: humans have a limited capacity to process information arising from the world beyond the brain, to combine this information with the prior knowledge already stored in their brains, and to do so quickly and efficiently in order to serve the adaptive needs of the present. This is where one type of bias may arise—from our limited capacity to process the information necessary to make a decision. The result is that our brains do a ‘good enough’ job most of the time, rather than a perfect job all the time. After all, if I mistake that moving shape at the edge of my vision for a tiger—even though it is actually a stippled bush, moving in the wind—little harm is done if I run away or adopt a defensive posture on the basis of a rapid decision made with imperfect information. On the other hand, if it is a tiger, and it is hungry, then I am done for if I take too long to decide just what that ambiguous moving shape is, because I need more information to identify the tiger with certainty!

The huge number of identified cognitive biases has led to a variety of classification schemes to help make them easier to understand, and thence to identify and, where possible, to work around. One especially useful scheme has recently been elaborated by Buster Benson (2017), who classifies cognitive biases according to four straightforward problems that they help us to solve. Biases, as Benson reminds us, can be adaptive because they solve quickly and efficiently problems arising from ‘information overload, lack of meaning, the need to act fast, and how to know what needs to be remembered for later’. A ‘quick and dirty’ solution that solves the problem of the ‘now’ is the one that will be used. The problems that biases can shortcut can be grouped as follows:

  1. Too much information: there is too much information in the world. You have to select, and select quickly, in order to decide and to act. What do you select? Why? You are a CEO and your major country units have seen a sudden collapse in sales. You have lots of fine-grained information from point-of-sale terminals and from your warehouses. More information than you know what to do with: sales per employee, sales per region, sales by market sector, sales by sales-force team; you have information on the cost of sales, regulations, non-tariff barriers, local tariffs. This is what happened to traditional camera makers; it is what happened to the sales of compact discs. It happens quickly and predictably when a previously patented drug goes off-patent and generic manufacturers move in. It happened to taxis when companies like Uber moved in.

  2. Not enough meaning: often, the meaning of the information we have to hand is not obvious, because we don’t have enough information, or the information is of poor quality, or it has been collected in a biased fashion. What do you do? Why? You are a comic book producer, and your sales are collapsing, and fast. People are moving away from newsprint. What does this mean? Does it mean you become a digital comic book producer? Will that save you? If you were Marvel Comics, it meant that you were no longer a comic book producer. It meant that you quickly became a company in the entertainment business, with interests in movies and associated products, and you owned a whole universe of characters. This is your new business.

  3. Need to act fast: sometimes, decisions must be taken quickly, and either problem 1 or 2 is present. You are a doctor, and the patient is bleeding out. You have lots of one type of information (there is blood, and lots of it), but little of any other information (Where is the blood coming from? Why isn’t it clotting?). You are a CEO, and one of your major country units has seen a sudden collapse in sales. You haven’t visited for a while. What’s going on? Now, what should you do? Who’s to blame? Someone? Maybe no one? Did technological changes render you redundant? Did you persist with attempting to sell horses for personal transport despite the advent of the motor car? Do you transform your business quickly from something you know intimately (grass-eating engines, i.e., horses) to something you know nothing about (internal combustion engines)?

  4. What should we remember? Again, there is too much information, and much of what we need to know we can offload to external cognitive devices like books or the internet, or indeed the brains of specialists in the area. We therefore need to select information, and we select based on what we already know and can readily hang new knowledge onto. Maybe the sales collapse has resulted from non-tariff barriers that have stalled our product in some obscure customs warehouse in the middle of nowhere. We remember that happening once before when selling into a new territory. Maybe that’s what’s happened. You chase the problem you remember a previous solution for, rather than trying to find out objectively what has actually happened. In the case of CD sales, a new digital infrastructure, combined with file sharing, initially sandbagged your business. An inability to adapt to music streaming services exacerbated the problem. And trying to shut them down will have as much likelihood of success as stagecoaches opposing the widespread adoption of the motor car.

Remember, there are many cognitive biases (we will name and explore some of them below)—and they may indeed be adaptive, depending on the problem you are trying to solve. Where the problem must be solved rapidly, with incomplete information and a high degree of ambiguity, and decisions must be made and actions taken quickly, biases will most likely be adaptive. Where the problem is ambiguous, where information gathering takes time, where there are extensive deliberative processes involved, and where the outcomes are uncertain or arguable, cognitive biases are likely to be present, and they are likely to cause a deviation from an optimal course of action. This happens most often where the end has already been decided (usually for ideological reasons), and the evidence collected and the strategy adopted are retrofitted to ensure that the chosen end occurs. Of course, reality doesn’t bend to our wishes or biases so easily, and disaster may await. Hence the failure of ‘unskewing’: the polling data truly reflected the underlying reality.

Tom Spengler’s Cognitive Biases in Action

The scenario offered in Chapter 1 is fictional, but it has elements recognisable from the history of corporate mergers and acquisitions, where companies do attempt friendly and occasionally hostile mergers, acquisitions and takeovers. Sometimes these mergers or acquisitions are very successful and create value; sometimes they are not and are disasters that destroy value. The scenario presented will be used here as a case study to illustrate some of the most important cognitive biases and how they distort the judgement of a supposedly hard-headed, numbers-oriented business person.

Tom saw what he thought was an opportunity and then pushed hard for it, creating a team that did his bidding. He had seemingly won over his board, but had not, it seems, consulted his major shareholders, who had a great deal of say in the outcome too. He also gave little thought to his opposite numbers in the companies he was hoping to merge with; they obviously had thoughts of their own and a willingness to see them through. It seems Tom gave very little time to thinking about the downsides of what he was proposing. He was also quite obviously neglecting his own brain health, possibly to the detriment of his optimal cognitive functioning (by getting little sleep and aerobic exercise), which led him to use some mild stimulants (caffeine and nicotine). Heart health seems to have been a problem for him too—overweight, flushed face, smoking and eventually a heart attack.

Pre-Meeting Biases

Tom seems to have engaged in a combination of confirmation bias and the focusing illusion. Confirmation bias (sometimes called myside bias) occurs when you collect evidence or information that supports or confirms your own particular point of view and you discount information that is contrary to it. As the novelist George Orwell put it in his famous political novel, 1984, ‘the best books…are those that tell you what you know already’. The focusing illusion is a cognitive bias that emphasises only upside arguments (local benefits) while ignoring costs. Tom certainly seems to have considered only the positive aspects of the merger and to have ignored the negative ones. There are costs to the merger in time and money that should have given pause. More than this, Tom has assembled a team that seems deliberately to have been constructed so that they will do his bidding, and not oppose his wishes or tell him unwelcome news. Tom has also constructed a financial case, using a biddable financial analyst, that entirely suits his own wishes, and which misprices the risks associated with the merger.

Moreover, he is also trying to make good the costs associated with his previous failed attempt at a merger—the ‘sunk cost’ fallacy. Economists emphasise that money spent and lost is just that: decisions should be made at the margin rather than by trying to recover previously lost costs. These are gone, so forget about them; instead, concentrate on what is the correct thing to do now, in the light of what you now know. An underlying problem with the sunk cost fallacy is the more general problem of ‘loss aversion’. Generally, we dislike losses much more than we like gains, and we therefore prefer to try to make good our losses. We won’t sell shares that have lost value, preferring to hold on to them in the hope of their coming good. Crystallising the loss is itself unpleasant. The underlying brain changes include, among other things, decreases of activity in regions that represent reward when losses are in prospect (see Tom and colleagues 2007).
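To make the ‘decisions at the margin’ point concrete, here is a minimal sketch in Python, with entirely hypothetical figures, of why a sunk cost can never change which remaining option is best, even though it is precisely what tempts Tom to try to make good his earlier losses.

    # A minimal sketch, with hypothetical figures, of deciding at the margin.
    # The sunk cost shifts every option by the same amount, so it cannot
    # change the ranking of the options that are still open.
    sunk_cost = 40_000_000  # already spent on the earlier, failed attempt; unrecoverable

    options = {
        "pursue the merger again": {"further_cost": 60_000_000, "expected_payoff": 55_000_000},
        "walk away, invest elsewhere": {"further_cost": 20_000_000, "expected_payoff": 30_000_000},
    }

    for name, o in options.items():
        marginal = o["expected_payoff"] - o["further_cost"]  # the only number that should matter
        print(f"{name}: marginal value {marginal:+,} "
              f"(adding back the sunk {sunk_cost:,} subtracts the same amount from every option)")

On these made-up numbers, walking away dominates at the margin; including the sunk cost in the calculation changes nothing except the temptation to chase it.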

Another problem that may occur is groupthink—a well-known and sometimes misunderstood idea. Groupthink is not the idea that a group of unconnected but demographically similar individuals happen to think the same thing. Groupthink occurs under very specific conditions: when a group makes poor decisions because of high levels of within-group cohesion and a desire to minimise within-group conflict. This might happen in an exhausted, embattled and worn-out Government Cabinet, but it can and will just as easily arise in the sort of group that Tom has assembled, supposedly to drive the merger, but which really exists to do his bidding. Working under some degree of duress (the threat of being replaced or fired) for a boss who appears to be something of a bully, and who is very certain of his own judgement, will be stressful and difficult, especially for those who wish to do a good job. Under these conditions, the necessary critical analysis simply does not occur. We have seen a different type of groupthink seize individual behaviour many times in financial and other markets. One example is the contagion within financial institutions associated with the dotcom boom (and subsequent dotcom bust). A more recent example is the sudden acceleration and deceleration of property values throughout the western world, caused in part by competition between financial institutions acting on the shared misperception that property prices could only ever move upwards. Groupthink can be reduced by the group having an extensive network of weak ties to other individuals and groups. Weak ties can be informational ties: ties to others can provide us with novel ideas and knowledge and provide a route to ‘reality-test’ planned courses of action. An extensive national and international weak-tie network might provide information that would otherwise not be available. Another way to avoid groupthink is to deliberately design group deliberations to question, test and probe the conclusions of the group—this issue is discussed in detail in Chapter 6.

Another pervasive cognitive bias is known as ‘identity protective cognition’ (Kahan 2007). We humans are members of social groups and social tribes. Our social groups and social tribes can be very rewarding and comforting and can come to represent very important aspects of our personal identities. Social scientists have often noted the exceptional strength of intra-group ties for social groups that are at best fringe in their relationship to what we know of the world scientifically. The moving target presented by the anti-vaccination community provides a fine example of a community impervious to reason and logic and, more particularly, to data. If you are a member of such a group and you are outraged by what I have just said, take a deep breath. Don’t critique it. Think: are you engaging in identity-protective cognition? Are you refusing to entertain evidence in a particular way? Are you discounting the views of scientists merely because the group you are part of consists of lots of people who are opposed to vaccines because they are ‘unnatural’? (Actually, nothing could be more natural than a vaccine, and nothing could be more unnatural than death in a car crash.) A truism of human behaviour is that people are incentive-driven (even by perverse incentives). Tom was paying his people well: it was in their short-term financial interests to go along with his plans. The American writer, Upton Sinclair, probably explains it best: ‘It is difficult to get a man to understand something when his salary depends upon his not understanding it’. This will have affected Tom’s team’s thinking: the certainty of getting fired, and losing a salary, versus going along with his plans (which might have turned out well, after all).

Fundamental Attribution Error

Solely focusing on individuals and their behaviour while ignoring the situations within which that behaviour occurs is known as the fundamental attribution error. It is a cognitive bias caused by the salience of the person and the relative invisibility of the system (group norms, laws, rules, etc.). Those convicted of crimes will typically blame the situation (‘I was provoked’), while observers will blame the person (‘you didn’t exercise self-control, despite provocation’). The lesson for organisations is simple: changing personnel is not enough to solve problems, because the dysfunctional system itself persists. We need substantial systemic changes too. In the case of political decisions, for example, these are often taken within a group context (think Cabinet collective responsibility), even though the policy itself might be strongly identified with a single individual, such as the responsible government minister. Behind the minister sits the invisible ministerial department, special advisors and others—but our bandwidth is limited, and it takes a special effort not to focus solely on the person, but also to try and see the system around them.

Perspective Taking

Marcus Aurelius, the Roman Emperor, famously said in his ‘Meditations’ that ‘everything we see is a perspective, not the truth’. He was hinting at the idea that all we can do is sample some fraction of the information available to us in the environment and that we inevitably adopt a particular perspective when trying to understand things. This perspective may blind us to what others might be thinking. An important task in business and management is to try and understand and identify the motivations of others—to engage in ‘perspective taking’. Perspective taking may occur during meetings or negotiations or even at a distance, where evidence to interpret the intentions of others may be scant or non-existent. In these cases, reasoning from one’s own dispositions and biases and assuming that the other person or team will think and behave likewise is a common phenomenon. It is the assumption that others will see the world as we do. Tom made this mistake: he was blind to the motivations of those in the companies he wanted to merge with. He took the view that they must see the world as he saw it, and, of course, they did not. Atticus Finch, the protagonist lawyer of Harper Lee’s novel, ‘To Kill a Mockingbird’, said pointedly that ‘You never really understand a person until you consider things from his point of view’. The effort to try and see the world as others see it—especially those with whom you might be negotiating, or with whom you might be in conflict—is difficult, but very valuable (see Wang et al. 2014).

Leadership

There is an expensive, modern obsession in organisations with ‘leadership’ (Alvesson and Spicer 2016), which, when attained, supposedly offers a kind of magic acid to dissolve away the many problems that may arise in business. Leadership is often regarded as central and vitally important, a cloak or mantle that can easily be adopted when a person is put into a position of leadership. No such obsession exists with ‘followership’, curiously! In our case study, Tom Spengler is certainly very concerned to show that he is ‘a decisive and strong leader’. There are two profound errors that arise from this focus on leadership. The first is one that has already been mentioned, the fundamental attribution error: the mistaking of the person at the ‘head of the organisation’ or system for the system or the organisation itself. The second is the assumption that leadership, when accompanied by some form of charisma, authenticity and capacity to touch individuals at their core, will somehow solve the problems arising within organisations. Many businesses will use language around what it is that a leader is supposed to do. They are supposed to ‘make the business world-class’ or ‘gold standard’; they should be able to entrain or activate the business’s ‘noble purpose’; they must have a ‘vision’; they must be ‘thought leaders’; they must be ‘horizon scanners’. Time spent on this kind of high-flown activity is, of course, time spent away from engaging in real work.

It is a remarkably underappreciated fact that, in knowledge-based organisations such as professional services firms, universities, research institutes and medical practices, positions of leadership are rotated. The managing partner, the head of department, or whoever, is seen as a ‘first among equals’ who is required to behave in a collegiate fashion and to bring colleagues along as a whole. Typically, their positions are term-limited, and it is no surprise that these kinds of organisations, composed of very motivated knowledge workers, tend to be among the most stable forms of organisational life that exist today. Universities, for example, have lives that extend back over many centuries. The point here is straightforward: veneration of leadership as the solution to what ails a business or an organisation is a strategy that is bound to fail. Leaders have power only to the extent that others grant it to them (Keltner 2016)—leadership is social, at its core. In the modern world, coercive models of leadership will fail, because people can, and do, walk away, leaving the leader bereft of followers. And then where is the leader without followers?

Other Important Cognitive Biases

There are many other cognitive biases; we will explore a few of them here. Our memories fail us in all sorts of ways. When judging the frequency of an event, we call the most recent exemplar to mind and use that recall to judge frequency: the availability in mind of an item is used to judge how often it occurs. Straight after a plane crash, people asked how frequent plane crashes are will judge them to be more prevalent than they actually are. Memory is vulnerable to other factors too: learning and recollection are both badly affected by stress and lack of sleep.

The language we use is also very important: language has the important property of ‘framing’ arguments and discussions. How we speak about something determines in large part how we come to feel about it. Are immigrants ‘welfare tourists’, or are they ‘hard-working individuals’ attracted by economic opportunities available in a particular country? The language used ‘frames’ how we should think of immigration. The crime debate in the UK was dominated in the 1970s by the phrase ‘short, sharp, shock’, which relied on the folk theory that quick and severe punishment would shock teenagers out of criminal tendencies. (The pleasing alliteration of the successive sibilants was an important, but useless, selling point too.) Short, sharp shocks, of course, predictably have no such effect, but why let data from the psychology of punishment and from criminology influence debate? Treatments designed to scare adolescents into being good just didn’t and don’t work; the data we have suggest that adolescents are actually more sensitive to rewards than to punishments. Similarly, the phrase ‘cut and run’ was used to forestall debate about the palpably failing US military strategy in Iraq; a change in course couldn’t be undertaken until empirical reality forced the change of direction.

Verificationism (also known as confirmation bias or myside bias) is a pervasive and potentially dangerous cognitive error, where evidence favouring a particular point of view is collected and weighted heavily, and contrary evidence is discounted or ignored. House prices have been rising for years; therefore, they will continue to rise, so property is a safe-bet investment. Evidence contrary to your point of view is systematically discounted or underweighted, while anecdotes in your favour are cherry-picked; this is not the way to proceed honestly. Confirmation bias comes with a major problem: it feels good, because it activates the brain’s reward system (Aue and colleagues 2012). This feeling can easily lead to overconfidence, because you are chasing the feeling of intrinsic reward, not systematically pursuing the truth—and the evidence is apparently on your side! The Nobel Prize-winning British scientist, Peter Medawar, warned bluntly against being ‘deeply in love’ with your ideas and not being willing to expose them to a ‘cruelly critical test’ in order to discard them as wrong as quickly as possible (Medawar 1979). Medawar also warned that ‘the intensity of the conviction that a hypothesis is true has no bearing on whether it is true or not’ (ibid, 39). The unwillingness to test our ideas in this way is itself an interesting cognitive bias: deliberately not exploring counterfactuals or evidence contrary to what we believe or assert, in case these other points of view or contrary evidence falsify our claims. Confirmation bias can, with effort, be conquered by its opposite, falsificationism, which is a difficult habit of mind to acquire. It is a must for any working scientist.
Falsificationism requires considering what empirical evidence would invalidate (falsify) the position you are adopting. One way of counteracting confirmation bias is to state clearly what empirical evidence would falsify your opinion; another is to build an evidence-based brake into policy formation. In science, this is done by international, anonymous, expert ‘peer review’. Peer review and similar systems can be built into the process of deliberation that underlies policy formation or strategic decision-making.

The word ‘delusional’ has sometimes been used to describe the behaviour of certain members of our political parties and businesses, but delusions imply the psychiatric diagnosis of pathological beliefs maintained contrary to all evidence. Anosognosia (a more useful description, taken from neuropsychology) is the condition of literally being ‘without knowledge’ of (being unaware of) a cognitive or other impairment and behaving as if there is no problem. Additionally, the knowledge and expertise required to solve the problems confronting leadership teams and others may be greater than they can acknowledge, understand and act upon. This leads to anosognosia within the cultures of these organisations. All sorts of organisations, from party political systems to businesses to governments and civil service systems, can undervalue expertise and suppress cognitive diversity. In what should have been a publicly shameful moment, but wasn’t, a former UK Government minister declared during the Brexit debate that ‘people in this country have had enough of experts’. Whatever the merits or demerits of the case for Brexit, the deliberate disparaging of expert knowledge is deeply troublesome. One presumes that the former minister doesn’t truly believe what he says, or he would be happy to turn his dental care over to the local chap with a hammer and tongs and take his plane flights with non-qualified pilots (qualified pilots being experts). The list of areas in which we do have to defer to experts could go on. As we will see in Chapter 7, substantial data show that complex and difficult problems (for example, how to rescue a collapsing economy) are best solved by groups with substantial intellectual strength and capacity (obvious) and substantial diversity of experience (not obvious).

One final and very important cognitive bias to consider is the Dunning-Kruger effect (Dunning and Kruger 1999), named for the psychologists who first described and quantified it. This is a bias whereby the truly incompetent do not realise how incompetent they are: they overestimate their own degree of skill and capacity to perform a task requiring expert knowledge. Moreover, their incompetence comes with an important deficiency—they are unable to reflect on their own cognitive processes in such a way as to recognise and appreciate their own mistakes. Technically expressed, they have poor metacognitive abilities. This gives rise to the bias of illusory superiority: a somewhat ironic bias because, in addition to performing poorly on a task, they will rate their ability as above average. This might be why we all regard ourselves as better-than-average drivers! The obverse is that those who are highly skilled or expert tend to underestimate their own abilities, giving rise to the phenomenon of illusory inferiority. The Irish poet, William Butler Yeats, in a famous line from his poem, The Second Coming, expresses a version of this pervasive cognitive bias thus: ‘The best lack all conviction, while the worst/Are full of passionate intensity.’

Conclusions

Individual rationality and cognition are limited and error-prone. A system driven by the unrestrained and unhampered cognition of individuals will fail; imposing ideological or personal control where objective, evidence-based and evidence-tested policy formation is called for is a recipe for disaster. We know this, and yet it is the mistake we repeatedly make. Our institutions and organisations must therefore be re-engineered to be adaptive, plastic and capable of learning (and especially capable of learning, and acting upon, upsetting and unpleasant truths). Robust institutional governance processes are required to recognise error and failure quickly and to change course rapidly. Organisational design needs to recognise our biases; these biases need to be tested robustly, and our ideologies discarded when they fail empirical tests.

Exercises

  1. List some common examples of cognitive biases that are apparent during the debate around a salient political or business problem. Politicians provide a target-rich environment, so pick an important debate (such as Brexit) and list some of the more common biases exhibited by politicians.

  2. How would you address the persistence of these biases? Are there changes to deliberative processes that could be introduced that would make a difference?

  3. If you have taken one side in the debate (say that you are pro-Brexit, for example), challenge your thinking, first, by laying out your pro-Brexit case in 500 words or so. Now, and this is hard, set out a 500-word anti-Brexit case. Then, set out the empirical evidence that would cause you to change your mind.

  4. Given a course of action that is failing in your business, set out a plan that allows you to change course.

  5. Are there other biases that Tom Spengler was prone to? What do the CEOs of the companies that he hoped to merge with need to watch out for in their own cognition—do they have obvious cognitive biases also?