Abstract
Despite more than 40 years of research into the field and the increasing use of the term in the media, there is still some uncertainty and even mystery over cognitive biases. This chapter provides a background to the topic with the aim of clarifying what is meant by cognitive biases. After introducing some uses and misuses of the term, examples of common biases are presented. This is followed by a brief history of the research in the area over the years, which illustrates the continued debate on cognitive biases and decision-making. Work in the emerging field of cognitive biases in visualization, prior to this publication, is outlined, concerning both the interpretation of visualizations and visualization tools, such as visual analytic systems. Finally, we discuss the challenging issue of debiasing - how to mitigate the undesirable impact of cognitive biases on judgments.
1.1 Introduction
Decisions, decisions, decisions, we make them all the time, probably thousands each day. Most are part of daily living, such as moving about our environment; others need more thought, but are not particularly critical, such as what coffee to buy. However, some decisions are important, even with life implications, from deciding if it’s safe to cross the road, to a doctor deciding what cancer treatment to suggest for a patient. We might imagine that all these decisions, whether trivial or not, are based on sound reasoning using our senses and our experience stored in our memory. However, it is generally agreed that the majority of decisions are made unconsciously using heuristics - strategies that use only a fraction of the available information. This makes sense in evolutionary terms [32], as to survive approaching danger, for instance, decisions had to be made rapidly. Humans do not have the time or brain processing power to do much else than use heuristics, and are, in fact, inherently lazy in order to conserve precious energy resources [22]. Fortunately, most of the time the results of using these very fast and automatic heuristic strategies are “good enough”; however, in certain situations they are not good enough, leading to poor judgments. It is these “errors in judgment” or “irrational behavior” that are commonly referred to as cognitive biases.
During this decade, interest in cognitive biases has increased markedly, with several large research projects [38, 57] starting in 2012, as well as a few mentions in popular on-line publications [8] and even in the press. In addition to an increase in scholarly articles,Footnote 1 the biggest change has been in media interest, especially in the business world. A recent Google search for “cognitive bias” presents many business-oriented items which are either aimed at selling (e.g. Cognitive Biases: How to Use Them to Sell More) or as a warning (e.g. Hidden Cognitive Biases That Cost You Big Money). Other search results are generally pessimistic regarding cognitive biases, such as The 17 Worst Cognitive Biases Ruining Your Life!
More recently, implicit or unconscious bias has been in the media, in the context of equality and anti-discrimination. This is often the result of stereotyping which is influenced by our background, culture and experience. In this sense “unconscious” means that humans make this judgment without realizing it, as with heuristic processing. And, if we think that cognitive biases only affect humans, then there are studies on rats [6], sheep [69], bees [62], chicken [72] and many other animals which use cognitive bias as an indicator of animal emotion [59]. However, these uses of the term “cognitive bias” differ from the more traditional one which we are discussing in this book.
Before considering cognitive biases (in humans) in the context of visualization and visual analytics tools, the next sections provide some examples of common cognitive biases and a brief history of their ‘discovery’ and subsequent research.
1.1.1 Examples
A recent classification of cognitive biases, the Cognitive Bias Codex by Benson [47] lists 187 biases.Footnote 2 These have been added to since the 1970s and the trend seems to be continuing, although sometimes just a bias by another name. There are, of course, similarities which various classification schemes over the years have attempted to tease out [3, 4, 9, 34, 37, 58, 66, 70] although most of the work has been in the area of decision support. In Chap. 2, Calero Valdez et al. propose a framework, specifically for the study of cognitive biases in visualization, and contrast this with the aforementioned Cognitive Bias Codex.
For those readers not familiar with cognitive biases, here are four examples of common biases:
Familiarity/availability bias is where people tend to estimate the likelihood of an event by how easily similar events can be recalled. For instance, people will generally think that travel by airplane is significantly more dangerous in the aftermath of a plane crash being reported in the media (see Chap. 6).
Confirmation bias is where people tend to search for confirming rather than for dis-confirming evidence with regard to their own previous assumptions. For example, if you think that eating chocolate makes you lose weight, then a Google search for “lose weight by eating chocolate” will confirm this if you ignore articles to the contrary (see Chap. 5).
Representational bias in visualization involves constraints and salience. For example, a matrix representation is not good at showing network data (a constraint) but can highlight missing relationships in its table view (salience) (see Chap. 10).
Overconfidence bias is where people tend to assess the accuracy of their answers or performance as greater than it actually is. There are many related cognitive biases such as illusion of control and planning fallacy (see Chap. 9).
1.2 A Brief History of Cognitive Biases
Early research on decision-making was founded on the theory of rational choice, where a person carefully assesses all the alternatives and, if they make errors, these would not be systematic. However, in the 1950s and 60s, experiments found that people are generally poor at applying even basic probability rules and often make sub-optimal judgments when measured against an ‘ideal’ standard derived from Bayesian analysis [19]. Even experts, such as physicians, were found to make biased judgments [48]. Simon proposed bounded rationality [63], suggesting that humans are too limited in their data processing abilities to make truly rational decisions but employ simplifying heuristics or rules to cope with the limitations.
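The ‘ideal’ Bayesian standard can be made concrete with the classic base-rate problem, in which people typically judge a positive medical test to be far more diagnostic than it actually is. The sketch below, with purely illustrative numbers, computes the normative answer via Bayes’ rule:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(condition | positive test)."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Illustrative numbers: a condition with 1% prevalence and a test that is
# 90% sensitive with a 9% false-positive rate.
p = posterior(0.01, 0.90, 0.09)
print(round(p, 3))  # 0.092
```

Despite the seemingly accurate test, the normative probability of actually having the condition is under 10%, whereas intuitive judgments tend to be closer to 90%, illustrating the kind of systematic deviation the early experiments measured.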
In the early 70s, Tversky and Kahneman developed this idea with their heuristics–biases program, with particular attention on judgments involving uncertainty. Systematic deviations from ‘normative’ behavior were referred to as cognitive biases, and this was backed up by a series of experiments which illustrated 15 biases [66]. Heuer [34] also promoted the idea of cognitive bias errors being due to irrationality in human judgment with his work amongst intelligence analysts. Over the years many more cognitive biases were proposed, based mostly on laboratory experiments. However, in the 80s, researchers began to question the notion that people are error prone, and a lively debate has ensued over the years, typified by the exchange between Gigerenzer [26], and Kahneman and Tversky [41]. One of the opposing arguments poses the question “Are humans really so bad at making decisions, especially where it involves uncertainty?”. Gigerenzer [28] suggests that the use of heuristics can in fact produce accurate judgments rather than cognitive biases, and describes such heuristics as “fast and frugal” (see Chap. 13).
It is suggested that the success of the heuristics–biases program is partly due to the persuasive nature of the experimental scenarios, often in the laboratory, which can easily be imagined by the reader [42]. However, many of the studies have clearly involved domain experts in the workplace. Another criticism of the heuristics–and–biases approach is the resultant long list of biases and heuristics, with no unifying concepts other than the methods used to discover them [4]. So the focus of later work has been to propose decision-making mechanisms rather than just looking for departures from normative (ideal) models [40]. To this end, dual process models have been put forward, for example the two system theories of reasoning which feature System 1: involuntary/rapid/rule-based and System 2: conscious/slower/reasoning decision making [22, 65]. Kahneman’s book “Thinking, Fast and Slow” [39] also adopts this dual process model and gives a very readable account of heuristics and biases.
Other developments include the Swiss Army Knife approach [29], which proposes that there are discrete modules in the brain performing specific functions, and deviations occur when an inappropriate module is chosen or where no such module exists, so the next best one is used. Formalizing heuristics [27] and modeling cognitive biases [36] are other approaches to understanding what is going on in our heads when we make decisions. A useful discussion of the impact of Tversky and Kahneman’s work can be found in [24]. But as research continues in this area, Norman provides a word of warning, especially in medical diagnosis, that there is bias in researching cognitive bias [51].
1.3 Impact of Biases
Notwithstanding the debate amongst researchers as to the underlying cognitive processes, there is little doubt that in particular circumstances, systematic behavior patterns can lead to worse decisions. Making a poor decision when buying a car by overrating the opinion of a person you have recently met (vividness bias) is often not a major problem, but in other realms such as medical judgments and intelligence analysis, the implications can be damaging. For instance, a number of reports and studies have implicated cognitive biases as having played a significant role in a number of high-profile intelligence failures (see Chap. 9). Although uncertainty is a factor, a person’s lack of knowledge or expertise is not the overriding consideration. Cognitive biases such as overconfidence and confirmation are often associated with poor judgments among people in senior roles, as in a realistic study where all twelve experienced intelligence analysts were led astray by confirmation bias, leaving only the inexperienced analyst with the correct answer [5].
In addition to Chap. 9, which focuses on intelligence analysis, many of the chapters in this book describe the impact of various cognitive biases, especially in relation to interpreting visualizations or when using visualization tools. For instance, Chap. 6 details the impact of familiarity related biases, especially with experts from the physical sciences and Chap. 10 discusses potential problems with representational biases when viewing visualizations. The case study described in Chap. 12 reveals the likelihood of numerous cognitive biases which can seriously affect decision making in a college admissions process. Chapters 3 and 4 discuss the notion that various aspects of computer systems, as well as humans, can also exhibit biases.
1.4 Cognitive Biases in Visualization
Interest in cognitive bias research has grown considerably at both the cognitive science level and also in relation to the visual analytics and decision-making tools that we build. The DECISIVe workshopsFootnote 3 have focused on two main issues related to visualization: (i) is the interpretation of visualizations subject to cognitive biases and (ii) can we adapt visualization tools to reduce the impact of cognitive biases?
1.4.1 Interpretation of Visualizations
There is evidence from people’s susceptibility to optical illusions that systematic errors can occur due to simplifying heuristics, such as grouping graphic items together, as set out in the Gestalt principles [1, 53, 55, 67]. It has also been demonstrated that different visual representations of common abstract forms, or the appearance of the visualization itself, can affect the interpretation of the data [12, 16, 54, 74, 75, 77]. In relation to the comprehension of images, Fendley [23] discusses cognitive biases in detail and proposes a decision support system to mitigate a selection of biases. Ellis and Dix [21] proposed that cognitive biases can occur in the process of viewing visualizations and present examples of situations where particular cognitive biases could affect the user’s decision making. Recent studies into priming and anchoring [68], the curse of knowledge [73] and the attraction effect [17] demonstrate these cognitive bias effects when interpreting visualizations, but as their authors point out, much more work needs to be done in this area.
1.4.2 Visualization Tools
In visual analytics, user interaction plays a significant role in providing insightful visual representations of data. As such, people interact with the systems to steer and modify parameters of the visualization and the underlying analytical model. While such human-in-the-loop systems have proven advantages over automated approaches, there exists the potential that the innate biases of people could propagate through the analytic tools [61]. However, if the system is able to monitor the actions of the user and their use of the data resources, then it may be possible to guide them and reduce the impact of particular cognitive biases. This requires ways to effectively detect and measure the occurrence of a range of cognitive biases in users [10, 45, 46, 71]. Work towards this is the subject of Chaps. 5, 7 and 9 in particular. Researchers point out that novel corrective actions, ideally tailored to the user, are then required.
1.5 Debiasing
Reducing the negative impact of cognitive biases is a challenge due to the inherent nature of biases and the indirect ways in which they must be observed. Early work generally focused on developing user training, typically scenario-based, in an attempt to mitigate the effect of a small number of cognitive biases. However, this approach has met with little convincing, generalizable or lasting success. Research shows that even if users are made aware of a particular cognitive bias, they are often unwilling to accept that their decisions could be affected by it, which is itself a bias, the bias blind spot [56]. Structured analytical techniques (SATs) (as discussed in [35]), such as ‘argument mapping’ and Analysis of Competing Hypotheses (ACH), have been used in intelligence analysis to reduce the impact of cognitive biases. Few of these techniques have been evaluated in empirical studies, apart from ACH, which, for realistic complex problems, has proved unsatisfactory, often due to time pressures (see Chap. 9).
There has been appreciable effort in the medical field to identify cognitive bias effects and reduce prevalent diagnostic errors [14, 15, 30] with interventions (checklists) to increase clinicians’ knowledge, improve clinical reasoning and decision-making skills [11] or assist clinicians with selected tools. According to Croskerry [13], progress is being made, but this is hindered by the general lack of education in critical thinking amongst clinicians.
Bias-Reducing Analytic Techniques (BRATs) are another way of investigating bias mitigation. They benefit from minimally intrusive cognitive interventions [44] based on prior work on cognitive dis-fluency [33]. While results were mixed, opportunities for further research show promise. Another method involves the application of serious games to improve critical thinking, as in the MACBETH [18] and HEURISTICA [2] games developed as part of IARPA’s Sirius program [7].
A common challenge across all these methods is the difficulty of shaping an individual’s cognitive behavior. Therefore, research is shifting toward modifying and improving the decision environment (i.e. tools, etc.). Recent work investigates how visualizations can reduce base-rate bias in probabilistic reasoning [43, 49]. Other visualization research focuses on the cognitive biases that affect judgments under uncertainty [78]: for example in finance, helping investors to overcome uncertainty aversion and diversification bias [60] or loss aversion and conservatism [76]; assisting fantasy baseball experts to mitigate the regression bias in their predictions [50]; or countering the anchoring and adjustment bias in decision support systems [25].
Researchers further propose frameworks, integrated into visual analytic systems, that provide support for mitigating some cognitive biases through measures such as the use of appropriate visualization types, uncertainty awareness, the use of statistical information and feedback from evidence-based reasoning [52, 61]. Other approaches attempt to “externalize the thinking” of the decision-maker [45] or improve hypothesis generation [31], in this case to avoid confirmation bias.
1.6 Conclusion
Cognitive biases are still somewhat intriguing. How humans actually make decisions is still largely a mystery, but we do know that most of this goes on at an unconscious level. Indeed, neuroscience experiments suggest that human decisions for physical movement are made well before the person is consciously aware of them [64]. From a survival of the species point of view, the evolutionary argument is compelling for very quick decisions, and we often cannot say how we arrived at a particular judgment other than to say it was a ‘gut feeling’. The popular classification of cognitive biases as errors brought about by heuristics - the unconscious decision-making processes in the brain - is more a matter of academic than practical interest. The important point is that better decisions can be made if we are more aware of the circumstances in which cognitive biases can occur and devise ways of countering this unhelpful behaviour. Both of these factors, bias detection and mitigation, pose serious challenges to the research community, as apparent from the limited progress so far on both counts. However, the DECISIVe workshops have stimulated research into dealing with cognitive biases in visualization, and I hope that readers of this book will find help and inspiration in its chapters.
Notes
1. A Google Scholar search for “cognitive bias” reports 3000 results in 2012 and 5480 in 2017.
2. The author’s own survey collected 288 distinct biases.
3. Full papers for DECISIVe 2014 are available [20].
References
Ali N, Peebles D (2013) The effect of Gestalt laws of perceptual organization on the comprehension of three-variable bar and line graphs. Hum Factors 55(1):183–203
Argenta C, Hale CR (2015) Analyzing variation of adaptive game-based training with event sequence alignment and clustering. In: Proceedings of the third annual conference on advances in cognitive systems poster collection, p 26
Arnott D (1998) A taxonomy of decision biases. Monash University, School of Information Management and Systems, Caulfield
Baron J (2008) Thinking and deciding, 4th ed
BBC (2014) Horizon: how we really make decisions. http://www.imdb.com/title/tt3577924/
Brydges NM, Hall L (2017) A shortened protocol for assessing cognitive bias in rats. J Neurosci Methods 286:1–5
Bush RM (2017) Serious play: an introduction to the Sirius research program. SAGE Publications, Los Angeles, CA
Business-Insider (2013) 57 cognitive biases that screw up how we think. http://www.businessinsider.com/cognitive-biases-2013-8
Carter CR, Kaufmann L, Michel A (2007) Behavioral supply management: a taxonomy of judgment and decision-making biases. Int J Phys Distrib Logistics Manage 37(8):631–669
Cho I, Wesslen R, Karduni A, Santhanam S, Shaikh S, Dou W (2017) The anchoring effect in decision-making with visual analytics. In: Visual analytics science and technology (VAST)
Cooper N, Da Silva A, Powell S (2016) Teaching clinical reasoning. ABC of clinical reasoning. Wiley Blackwell, Chichester, pp 44–50
Correll M, Gleicher M (2014) Error bars considered harmful: exploring alternate encodings for mean and error. IEEE Trans Visual Comput Graphics 20(12):2142–2151
Croskerry P (2016) Our better angels and black boxes. BMJ Publishing Group Ltd and the British Association for Accident & Emergency Medicine
Croskerry P (2017) Cognitive and affective biases, and logical failures. Diagnosis: interpreting the shadows
Croskerry P, Singhal G, Mamede S (2013) Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2012
Daron JD, Lorenz S, Wolski P, Blamey RC, Jack C (2015) Interpreting climate data visualisations to inform adaptation decisions. Clim Risk Manage 10:17–26
Dimara E, Bezerianos A, Dragicevic P (2017) The attraction effect in information visualization. IEEE Trans Visual Comput Graphics 23(1):471–480
Dunbar NE, Miller CH, Adame BJ, Elizondo J, Wilson SN, Schartel SG, Lane B, Kauffman AA, Straub S, Burgon K, et al (2013) Mitigation of cognitive bias through the use of a serious game. In: Proceedings of the games learning society annual conference
Edwards W, Lindman H, Savage LJ (1963) Bayesian statistical inference for psychological research. Psychol Rev 70(3):193
Ellis G (ed) (2014) DECISIVe 2014 : 1st workshop on dealing with cognitive biases in visualisations. IEEE VIS 2014, Paris, France. http://goo.gl/522HKh
Ellis G, Dix A (2015) Decision making under uncertainty in visualisation? In: IEEE VIS2015. http://nbn-resolving.de/urn:nbn:de:bsz:352-0-305305
Evans JSB (2008) Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol 59:255–278
Fendley ME (2009) Human cognitive biases and heuristics in image analysis. PhD thesis, Wright State University
Fiedler K, von Sydow M (2015) Heuristics and biases: beyond Tversky and Kahnemans (1974) judgment under uncertainty. In: Cognitive psychology: Revisiting the classical studies, pp 146–161
George JF, Duffy K, Ahuja M (2000) Countering the anchoring and adjustment bias with decision support systems. Decis Support Syst 29(2):195–206
Gigerenzer G (1996) On narrow norms and vague heuristics: A reply to Kahneman and Tversky
Gigerenzer G, Gaissmaier W (2011) Heuristic decision making. Annu Rev Psychol 62:451–482
Gigerenzer G, Todd PM, ABC Research Group et al (1999) Simple heuristics that make us smart. Oxford University Press, Oxford
Gilovich T, Griffin D (2002) Introduction-heuristics and biases: then and now. Heuristics and biases: the psychology of intuitive judgment pp 1–18
Graber ML, Kissam S, Payne VL, Meyer AN, Sorensen A, Lenfestey N, Tant E, Henriksen K, LaBresh K, Singh H (2012) Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf
Green TM, Ribarsky W, Fisher B (2008) Visual analytics for complex concepts using a human cognition model. In: IEEE symposium on visual analytics science and technology, VAST’08, 2008. IEEE, New York, pp 91–98
Haselton MG, Bryant GA, Wilke A, Frederick DA, Galperin A, Frankenhuis WE, Moore T (2009) Adaptive rationality: an evolutionary perspective on cognitive bias. Soc Cogn 27(5):733–763
Hernandez I, Preston JL (2013) Disfluency disrupts the confirmation bias. J Exp Soc Psychol 49(1):178–182
Heuer RJ (1999) Psychology of intelligence analysis. United States Govt Printing Office.
Heuer RJ, Pherson RH (2010) Structured analytic techniques for intelligence analysis. Cq Press, Washington, D.C
Hilbert M (2012) Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making. Psychol Bull 138(2):211
Hogarth R (1987) Judgment and choice: the psychology of decision. Wiley, Chichester
IARPA (2013) Sirius program. https://www.iarpa.gov/index.php/research-programs/sirius
Kahneman D (2011) Thinking, fast and slow. Macmillan, New York
Kahneman D, Frederick S (2002) Representativeness revisited: attribute substitution in intuitive judgment. Heuristics Biases Psychol Intuitive Judgment 49:81
Kahneman D, Tversky A (1996) On the reality of cognitive illusions. American Psychological Association
Keren G, Teigen KH (2004) Yet another look at the heuristics and biases approach. Blackwell handbook of judgment and decision making pp 89–109
Khan A, Breslav S, Glueck M, Hornbæk K (2015) Benefits of visualization in the mammography problem. Int J Hum-Comput Stud 83:94–113
Kretz DR (2015) Strategies to reduce cognitive bias in intelligence analysis: can mild interventions improve analytic judgment? The University of Texas at Dallas
Kretz DR, Granderson CW (2013) An interdisciplinary approach to studying and improving terrorism analysis. In: 2013 IEEE international conference on intelligence and security informatics (ISI). IEEE, New York, pp 157–159
Kretz DR, Simpson B, Graham CJ (2012) A game-based experimental protocol for identifying and overcoming judgment biases in forensic decision analysis. In: 2012 IEEE conference on technologies for homeland Security (HST). IEEE, New York, pp 439–444
Manoogian J, Benson B (2017) Cognitive bias codex. https://betterhumans.coach.me/cognitive-bias-cheat-sheet-55a472476b18
Meehl PE (1954) Clinical versus statistical prediction: a theoretical analysis and a review of the evidence
Micallef L, Dragicevic P, Fekete JD (2012) Assessing the effect of visualizations on Bayesian reasoning through crowdsourcing. IEEE Trans Visual Comput Graphics 18(12):2536–2545
Miller S, Kirlik A, Kosorukoff A, Tsai J (2008) Supporting joint human-computer judgment under uncertainty. In: Proceedings of the human factors and ergonomics society annual meeting, vol 52. Sage, Los Angeles, pp 408–412
Norman G (2014) The bias in researching cognitive bias. Adv Health Sci Educ 19(3):291–295
Nussbaumer A, Verbert K, Hillemann EC, Bedek MA, Albert D (2016) A framework for cognitive bias detection and feedback in a visual analytics environment. In: 2016 European intelligence and security informatics conference (EISIC). IEEE, New York, pp 148–151
Peebles D (2008) The effect of emergent features on judgments of quantity in configural and separable displays. J Exp Psychol: Appl 14(2):85
Peebles D, Cheng PCH (2003) Modeling the effect of task and graphical representation on response latency in a graph reading task. Hum Factors 45(1):28–46
Pinker S (1990) A theory of graph comprehension. Artificial intelligence and the future of testing pp 73–126
Pronin E, Lin DY, Ross L (2002) The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull 28(3):369–381
RECOBIA (2012) European Union RECOBIA project. http://www.recobia.eu
Remus WE, Kottemann JE (1986) Toward intelligent decision support systems: an artificially intelligent statistician. MIS Q pp 403–418
Roelofs S, Boleij H, Nordquist RE, van der Staay FJ (2016) Making decisions under ambiguity: judgment bias tasks for assessing emotional state in animals. Frontiers Behav Neurosci 10:119
Rudolph S, Savikhin A, Ebert DS (2009) Finvis: applied visual analytics for personal financial planning. In: IEEE symposium on visual analytics science and technology, 2009. VAST 2009. IEEE, New York, pp 195–202
Sacha D, Senaratne H, Kwon BC, Ellis G, Keim DA (2016) The role of uncertainty, awareness, and trust in visual analytics. IEEE Trans Visual Comput Graphics 22(1):240–249
Schlüns H, Welling H, Federici JR, Lewejohann L (2017) The glass is not yet half empty: agitation but not Varroa treatment causes cognitive bias in honey bees. Anim Cogn 20(2):233–241
Simon HA (1957) Models of man; social and rational. Wiley, New York
Soon CS, Brass M, Heinze HJ, Haynes JD (2008) Unconscious determinants of free decisions in the human brain. Nat Neurosci 11(5):543
Stanovich KE, West RF (2000) Individual differences in reasoning: implications for the rationality debate? Behav Brain Sci 23(5):645–665
Tversky A, Kahneman D (1974) Judgment under uncertainty: heuristics and biases. Science 185(4157):1124–1131
Tversky B (1991) Distortions in memory for visual displays. Spatial instruments and spatial displays pp 61–75
Valdez AC, Ziefle M, Sedlmair M (2018) Priming and anchoring effects in visualization. IEEE Trans Visual Comput Graphics 24(1):584–594
Verbeek E, Ferguson D, Lee C (2014) Are hungry sheep more pessimistic? the effects of food restriction on cognitive bias and the involvement of ghrelin in its regulation. Physiol Behav 123:67–75
Virine L, Trumper M (2007) Project decisions: the art and science. Berrett-Koehler Publishers, Oakland
Wall E, Blaha LM, Franklin L, Endert A (2017) Warning, bias may occur: a proposed approach to detecting cognitive bias in interactive visual analytics. In: IEEE conference on visual analytics science and technology (VAST)
Wichman A, Keeling LJ, Forkman B (2012) Cognitive bias and anticipatory behaviour of laying hens housed in basic and enriched pens. Appl Anim Behav Sci 140(1):62–69
Xiong C, van Weelden L, Franconeri S (2017) The curse of knowledge in visual data communication. In: Talk given at the information visualization research satellite event at vision sciences society annual meeting, St. Pete Beach, FL
Zacks J, Tversky B (1999) Bars and lines: a study of graphic communication. Memory Cogn 27(6):1073–1079
Zhang J, Norman DA (1994) Representations in distributed cognitive tasks. Cogn Sci 18(1):87–122
Zhang Y, Bellamy RK, Kellogg WA (2015) Designing information for remediating cognitive biases in decision-making. In: Proceedings of the 33rd annual ACM conference on human factors in computing systems. ACM, New York, pp 2211–2220
Ziemkiewicz C, Kosara R (2010) Implied dynamics in information visualization. In: Proceedings of the international conference on advanced visual interfaces. ACM, New York, pp 215–222
Zuk T, Carpendale S (2007) Visualization of uncertainty and reasoning. In: International symposium on smart graphics. Springer, Berlin, pp 164–177
© 2018 Springer Nature Switzerland AG

Ellis, G. (2018). So, What Are Cognitive Biases?. In: Ellis, G. (ed) Cognitive Biases in Visualizations. Springer, Cham. https://doi.org/10.1007/978-3-319-95831-6_1

Print ISBN: 978-3-319-95830-9. Online ISBN: 978-3-319-95831-6