1 Introduction

A rising fraction of the US public recognizes that climate change is happening now, caused mainly by human activities (Hamilton 2017; Saad 2017). The rise coincides with growing public awareness that most scientists agree on this point (Cook et al. 2016; Hamilton 2016a), which research identifies as a “gateway cognition” for acceptance of anthropogenic climate change (Ding et al. 2011; Lewandowsky et al. 2013; van der Linden et al. 2015, 2017a). Ideally, recognition of anthropogenic climate change (ACC) should also reflect improved public knowledge of the science itself, but such knowledge appears thin (Leiserowitz et al. 2010). Moreover, how people answer factual questions on climate is often shaped by their underlying beliefs, rather than vice versa, undermining causal interpretation (Hamilton 2012; Kahan 2015). Without definitive measurements, climate knowledge has not been well tracked over time. Cross-sectional research provides the main avenue for its study.

The simplest hypothesis about knowledge and opinions, termed the information deficit model, posits that people fail to accept scientific conclusions about ACC because they lack good information (Suldovsky 2017). This hypothesis finds support in some experimental studies, including recent work on “inoculating” people against misinformation (Cook et al. 2017; van der Linden et al. 2017b). Non-experimental support includes survey research where education or general knowledge exhibits overall positive effects on ACC acceptance (Ehret et al. 2017; Hamilton et al. 2015a). Empirically, however, the information deficit model proves incomplete. It cannot explain the politically motivated rejection that is often directed against ACC and climate science (Dunlap et al. 2016; Dunlap and McCright 2015). Well-educated conservatives are among the most vehement opponents (Hamilton 2008, 2011, 2012; McCright and Dunlap 2011). Rejection of ACC by conservative information elites requires different hypotheses positing politically selective acquisition of information, as in the overlapping concepts of biased assimilation (Corner et al. 2012; Ehret et al. 2017; McCright and Dunlap 2011), elite cues (Brulle et al. 2012; Carmichael and Brulle 2017; Darmofal 2005), motivated reasoning (Kraft et al. 2015; Kunda 1990; Taber and Lodge 2006), compensatory control (Kay et al. 2009), or cultural cognition (Kahan et al. 2011).

Research on public opinion about climate change often finds support for both information deficit and biased-assimilation type processes, indicating that both are important, but to different degrees and among different people (Ehret et al. 2017). Evidence for information deficit processes includes the positive main effects on ACC acceptance from respondent education, and from objective knowledge measures including general science literacy as well as more focused climate knowledge (Hamilton et al. 2012). Evidence for biased-assimilation type processes includes the main (additive) effects from indicators of politics, ideology, or worldview seen in almost every study. Evidence that these processes vary across individuals includes widely replicated interaction effects with the general form information × politics, wherein the influence of information indicators, such as education, science literacy, or numeracy, on ACC acceptance proves to be positive among liberals and moderates but near zero or negative among conservatives (Bolin and Hamilton 2018; Drummond and Fischhoff 2017; Hamilton 2008, 2011; Hamilton et al. 2012, 2015a; Kahan et al. 2012; McCright and Dunlap 2011). If statistical models do not allow for such interactions, their estimates for the main effects of education or knowledge measures may essentially be averaging divergent trends. Where interactions are included, main effects describe the influence of education or knowledge given zero values or base categories of the politics indicators, so interpretation depends on details of variable coding.

Definitive evidence requires objective tests of knowledge—but these raise issues of content. Less definitively, researchers might just ask people how well they understand climate change, or how well informed they feel. Self-assessed understanding correlates with specific opinions in some studies, but like education, it can exhibit divergent effects depending on political identity. To the extent that self-assessment reflects physical-world or scientific knowledge, it should legitimately inform climate beliefs; but self-assessments could also reflect confidence in politically based views. Some studies nevertheless analyze self-assessed understanding as an independent variable, possibly predictive of climate change opinions (Malka et al. 2009; Hamilton 2011; McCright and Dunlap 2011).

This paper turns that perspective around to examine “understanding” as dependent variable. Survey research has established how respondent characteristics such as age, sex, education, and political identity predict beliefs about climate change. Do these background characteristics also predict self-assessed understanding? Controlling for background factors, how well does tested knowledge predict understanding? Do the effects of knowledge on understanding change with political identity, similar to the information × politics-type interaction effects widely reported for climate beliefs themselves? A nationwide US survey conducted in 2016 provides data to explore these questions.

2 Climate change views and self-assessed understanding

Two standard questions about climate change, exhibiting high reliability and criterion validity, have been asked in more than 40,000 interviews across many surveys since 2010 (e.g., Hamilton et al. 2015a). Understanding inquires how well respondents feel they understand the issue of climate change or global warming; most people say they understand “a moderate amount” or “a great deal.” Climate asks which of three statements they think is most accurate; one choice corresponds to the scientific consensus that climate change is happening now, caused mainly by human activities. Table 1 gives the wording of both questions, along with response percentages from the Polar, Environment, and Science (POLES) survey, a nationwide US 2016 survey comprising the principal data for this paper.

Table 1 Variable definitions with regression codes and weighted summary statistics from POLES surveys, August and November/December 2016, n = 1411. Climate and knowledge question response order rotated in interviews

Researchers at the University of New Hampshire and Columbia University designed the POLES survey to assess general public knowledge and perceptions of science, with a particular interest in the Earth’s polar regions (Hamilton 2016b). Random sample cell and landline telephone interviewing for POLES took place in two stages, August and November/December 2016 (combined n = 1411). Sampling weights used for all analyses in this paper make adjustments to better represent the US population.

Sixty-four percent of POLES respondents agree with scientists that climate change is happening now, caused mainly by human activities (Fig. 1a). Only 29% think climate change is happening but mainly for natural reasons; few think it is not changing, or admit they do not know. These results resemble those from many other surveys, and fit with an upward trend observed from 2010 to 2017 (Hamilton 2017).

Fig. 1 Views about climate change and self-assessed understanding of this issue

Party-line or ideological divisions dominate survey responses to climate change questions. Figure 1b illustrates this using the four-party scheme of Hamilton and Saito (2015), which distinguishes Tea Party supporters from other groups. Although the US Tea Party is an informal, decentralized organization, survey respondents who self-identify as Tea Party supporters often express significantly more rejectionist views regarding climate change and other science or environmental topics, compared with non-Tea Party Republicans (Hamilton and Saito 2015; Leiserowitz et al. 2011; Shao 2016). In the POLES data, for example, most Democrats and Independents agree that climate change is happening now, caused mainly by human activities. A minority of Republicans and even fewer Tea Party supporters accept this consensus. A 59-point gap separates Democrats from Tea Party supporters.

Partisan gradients like that in Fig. 1b occur on many science or environment-related issues, although they tend to be steepest on climate change (Hamilton and Saito 2015). Self-assessed understanding follows a different pattern, however. Figure 1c shows that a majority of POLES respondents feel they understand a moderate amount about climate change, and almost one-fourth respond “a great deal.” The partisan pattern (Fig. 1d) is not monotonic but U-shaped: Democrats and Tea Party supporters most often say they understand a great deal, although substantively their opinions are opposite. Similar U shapes (not shown) occur when understanding is graphed against respondent ideology from extreme liberal to extreme conservative, or against frequency of religious service attendance from never to at least once per week. “Understanding a great deal” is also most common among the small fraction who believe that climate change is not happening now. Clearly, asserting that one understands a great deal about climate change relates to political outlook; but how does it relate to knowledge?

3 Testing climate-relevant knowledge

Survey researchers operationalize science literacy using sets of basic questions with clear answers, such as whether the Earth orbits the Sun or vice versa (National Science Board 2010; Hamilton et al. 2012). The questions focus on facts with universal acceptance among scientists, but some of these (such as whether the Earth is billions of years old, or whether humans evolved from earlier forms of life) can evoke religion-based responses among non-scientists, leading to recommendations that they not be included in measures of science literacy (Roos 2014). Climate change knowledge questions face a similar problem. Basic facts about climate change, such as the human-caused buildup of greenhouse gases, the rise of global temperatures, or the melting of Arctic ice, are rejected as a matter of belief by many people. In their place, alternative (false) facts have been promoted by partisan sources: for example, that volcanoes in recent years have emitted more CO2 than humans, that global temperatures are not really rising, or that Arctic sea ice declined but then recovered (Hamilton 2012). A climate-literacy score constructed from questions on the physical reality of climate trends makes sense from a scientific standpoint, and would certainly correlate with acceptance of ACC. Interpreting this correlation in causal terms risks circularity, however. Politically linked beliefs and sources inform responses to factual questions, so factual responses are already confounded with social identity (Kahan et al. 2012).

These identity-linked questions have answers that could be guessed (rightly or wrongly) from ideology or general opinions about ACC, but some other questions do not have this property. For example, the area of late-summer ice on the Arctic Ocean objectively has decreased over the past 30 years, but survey responses on this point behave partly as if one had asked people for their personal opinion about ACC. On the other hand, when asked whether the North Pole is located on land ice or sea ice, opinions about climate change whisper no clues; responses behave more like neutral knowledge (Hamilton 2015a). Similarly, asking whether sea level is rising will bring answers that reflect ACC beliefs; but those beliefs do not suggest whether sea ice or land ice could raise sea level more.

The five knowledge questions listed in Table 1 represent this second, belief-neutral type. Figure 2a–e chart the POLES survey responses. The questions range from easy to difficult: the meaning of the greenhouse effect is answered correctly by 64% of respondents, but only 18% are aware that the USA has territory and people in the Arctic. Together, these five items provide a rough index of knowledge (Fig. 2f).
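To make the index construction concrete, a minimal sketch follows; it simply counts correct answers across the five items. The column names here are illustrative placeholders (npole, spole, and sealevel are named later in the paper; the other two labels are hypothetical), and coding “do not know” as incorrect is an assumption.

```python
import pandas as pd

# Five binary knowledge items, assumed coded 1 = correct, 0 = incorrect or "do not know".
# npole, spole, and sealevel are named later in the paper; greenhouse and usarctic
# are placeholder names for the remaining two questions.
ITEMS = ["greenhouse", "npole", "spole", "sealevel", "usarctic"]

def knowledge_index(responses: pd.DataFrame) -> pd.Series:
    """Return each respondent's number of correct answers (0 to 5)."""
    return responses[ITEMS].sum(axis=1)

# Usage (assuming a respondent-level DataFrame df with these columns):
# df["knowledge"] = knowledge_index(df)
```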

Fig. 2 Basic knowledge questions and number of answers correct

By design, these questions do not address the reality of climate change itself, but all have substantial relevance to that topic. Knowing what scientists mean by greenhouse effect is fundamental to the whole discourse, even for those who dispute its reality. Although some people give answers about whether Arctic sea ice has declined based on their political beliefs, such beliefs offer no guidance in recalling that the North Pole is located in the Arctic Ocean or the South Pole on a continent. Nor do political beliefs help in understanding the different impacts on sea level from melting of Greenland and Antarctic land ice (potentially more than 60 m) compared with Arctic sea ice (a few centimeters at most, from freshening of the ocean, because sea ice is already floating). Changing conditions in these icy realms have been major themes in scientific and popular discussions about climate change, however, including contrarian arguments about why change is not happening. The most difficult question in this set asks whether the USA has territory and people in the Arctic. A large area of northern Alaska lies above the Arctic Circle, including the Prudhoe Bay oilfields and the Arctic National Wildlife Refuge, along with the predominantly Iñupiat towns of Barrow (pop. 4500), Kotzebue (3200), and many smaller communities. More than a dozen Arctic or subarctic Alaska communities face threats from climate-linked erosion that in some cases may force their abandonment (Bronen 2009; Hamilton et al. 2016b; Marino 2015; USACE 2009).

These five questions do not directly test knowledge of climate change, but that becomes a virtue in terms of separating them from climate change beliefs. The basic facts nevertheless are central to understanding this issue—why sea level often gets mentioned in connection with Antarctica, for example, or how the North and South Poles are different. Put another way, someone who scores poorly on this five-item quiz, unaware of polar locations or what “greenhouse effect” means, does not understand a great deal about climate change.

4 Predictors of self-assessed understanding

The knowledge index exhibits a mild though statistically significant correlation with ACC acceptance. Relationships between knowledge and self-assessed understanding, the focus of this paper, are more interesting and complex. Table 2 explores these relationships through logit regression models that test knowledge alongside other characteristics as predictors of “understanding a great deal.” Models 1 and 2 employ the 2016 US POLES survey data described earlier, with variable definitions and coding as given in Table 1. Respondent political identity is represented either by four discrete parties (model 1) or by an ordinal 5-point scale from liberal to conservative (model 2). An indicator for the pre- and post-election POLES survey stages (August and November/December 2016) is included as a control, but has no significant effect.
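As a rough sketch of what model 1 looks like in code, the function below fits a frequency-weighted logit regression with a party × knowledge interaction. All column names (understand, party, knowledge, male, age, education, stage, weight) are assumptions standing in for the Table 1 definitions, and a simple weighted GLM only approximates the design-based weighted logit regression with adjusted Wald tests used for the published estimates.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_model1(df: pd.DataFrame):
    """Approximate Table 2 model 1: logit of "understands a great deal" on party,
    knowledge, their interaction, and background controls. Column names are
    hypothetical; a survey-design estimator would treat weights and standard
    errors more rigorously than this frequency-weighted GLM."""
    formula = (
        "understand ~ C(party, Treatment(reference='Democrat')) * knowledge"
        " + male + age + education + stage"
    )
    return smf.glm(
        formula,
        data=df,
        family=sm.families.Binomial(),
        freq_weights=df["weight"],  # sampling weights (approximate treatment)
    ).fit()

# Usage: print(fit_model1(poles_df).summary())
```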

Table 2 Respondent characteristics as predictors of “understanding a great deal” about climate change in US or New Hampshire (NH) surveys. Coefficients and standard errors from weighted logit regression; stars denote probabilities from adjusted Wald tests

For an independent replication, model 3 employs 2014–2015 survey data from New Hampshire, where the climate and understanding questions had been asked alongside a subset of knowledge items (npole, spole, and sealevel). The knowledge scores for model 3 thus range from 0 to 3, unlike the 0 to 5 scores used in models 1 and 2; otherwise, variable definitions and codes are the same. Three New Hampshire surveys are pooled to provide an adequate sample for this analysis; the individual-survey indicators detect no significant differences. See Hamilton (2016a, 2016b) for more about the New Hampshire series, which closely tracks national data.

Other things being equal, men are more likely than women to say that they understand a great deal about climate change (models 1–3). Respondents who are better educated (models 1 and 3) or older (models 1 and 2) tend to express greater confidence too. Among those who scored zero on the knowledge quiz (answered no questions correctly), Tea Party supporters (model 1) and conservatives (model 3) are more inclined than members of other political groups to say they understand a great deal.

Tested knowledge exhibits a positive main effect on self-assessed understanding in all three models. Because knowledge also appears in the knowledge × party or knowledge × ideology interaction terms, we can interpret these main effects as the effect of knowledge among Democrats (base category of party) in model 1, or among liberals (zero value of ideology) in models 2 and 3.

All three models in Table 2 find significant interactions between knowledge and political identity, each with the same general meaning: objective knowledge is positively related to self-assessed understanding among Democrats and Independents, or among liberals and moderates—but not so among conservatives. Among the most conservative respondents, knowledge appears unrelated or even negatively related to self-assessed understanding. In models 1 and 3, conservatives with low knowledge scores appear more likely than other conservatives to say they understand a great deal about climate change.

Figure 3 visualizes this interaction as an adjusted margins plot based on model 1. The logit regression coefficients relating knowledge to understanding, adjusted for other variables in this model, are positive among Democrats (b = 0.402; odds ratio e^b = 1.495), Independents (b = 0.402 − 0.052 = 0.350, odds ratio e^b = 1.419) and, to a weaker degree, Republicans (b = 0.402 − 0.226 = 0.176, odds ratio e^b = 1.192). The coefficient turns negative (b = 0.402 − 0.474 = −0.072, odds ratio e^b = 0.931) among Tea Party supporters, indicating that higher confidence in that subgroup coincides with less knowledge.
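The party-specific slopes and odds ratios quoted above follow directly from the model 1 main effect and interaction coefficients; this short calculation simply reproduces that arithmetic.

```python
import math

# Knowledge slope for each party = main effect (Democrats are the base category)
# plus that party's knowledge x party interaction coefficient from model 1.
B_KNOWLEDGE = 0.402
INTERACTION = {
    "Democrat": 0.000,
    "Independent": -0.052,
    "Republican": -0.226,
    "Tea Party": -0.474,
}

for party, shift in INTERACTION.items():
    b = B_KNOWLEDGE + shift  # party-specific logit coefficient for knowledge
    print(f"{party:12s} b = {b:+.3f}   odds ratio exp(b) = {math.exp(b):.3f}")
```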

Fig. 3 Probability of “understand a great deal,” by knowledge score and political party identification (adjusted margins plot based on model 1 of Table 2)

Such interactions could partly reflect divergent interpretations of “understanding” climate change, if for conservatives that question disproportionately evokes confidence in their political beliefs rather than physical-world knowledge. Evidence consistent with this explanation appears in the pattern of “do not know” responses. Among respondents professing a great deal of understanding about climate change, the percentage of “do not know” responses to our knowledge questions is significantly related to party and ideology, across both US and New Hampshire survey datasets. For example, 49% of Tea Party supporters claiming a great deal of understanding said they did not know the answer to one or more knowledge questions on the POLES surveys, 22% admitted not knowing two or more, and 11% did not know three or more. The comparable figures for Democrats with a great deal of understanding are 22% (one or more), 4% (two or more), and 1% (three or more). Other parties fall in between, also giving fewer “do not know” responses than Tea Partiers.
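A tabulation of this kind can be sketched as follows, again under assumed column names (understand = 1 for “a great deal,” dk_count = number of “do not know” answers, party); survey weights are omitted here for brevity, although the percentages reported above are weighted.

```python
import pandas as pd

def dont_know_shares(df: pd.DataFrame) -> pd.DataFrame:
    """Percent of respondents professing "a great deal" of understanding who gave
    at least k "do not know" answers on the knowledge questions, by party.
    Column names are hypothetical; sampling weights are ignored in this sketch."""
    great = df[df["understand"] == 1]
    shares = {
        f"{k}+ don't know": (great["dk_count"] >= k).groupby(great["party"]).mean() * 100
        for k in (1, 2, 3)
    }
    return pd.DataFrame(shares).round(1)

# Usage: print(dont_know_shares(poles_df))
```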

Interaction effects can be notoriously sample-specific, due to statistical problems with multicollinearity, influential observations, and multiple comparisons. The three models in Table 2, however, including the independent New Hampshire replication, provide some evidence of stability.

5 Discussion

Public acceptance that ACC is real exhibits a monotonic partisan gradient, whereas self-assessed understanding follows a U-shaped pattern (Fig. 1d). Because these political patterns work in opposition, a simple correlation between “understanding” and ACC acceptance would be uninformative. Among the most conservative, self-assessed understanding is unrelated or even negatively related to physical-world knowledge. Analysis of “do not know” responses suggests that conservatives disproportionately interpret the understanding question as referring to their political confidence rather than science knowledge. Self-assessments thus plausibly reflect both divergent interpretations of “understanding” (political vs. scientific) and politically guided filtering of nominally scientific information.

Politically guided information filtering can be described in symmetrical terms as something “both sides do.” Some experiments find support for symmetry (Frimer et al. 2017; Washburn and Skitka 2017), whereas others report that “asymmetries abound” (Jost 2017). The education × politics interactions frequently observed with climate change opinions on surveys might be read symmetrically: more educated (or scientifically literate) partisans are more efficacious at filtering, so they acquire information that intensifies identity-appropriate views in either direction. This overlooks a basic difference in content, however. More educated Democrats and Independents, or liberals and moderates, incline toward the climate change views expressed by most scientists, so they are filtering for scientifically informed sources (Carmichael et al. 2017). Conversely, conservatives who reject this scientific consensus rely more on political sources to filter science information, or for their sense of understanding.

Other studies have noted lower conservative trust of scientists in general (Gauchat 2012; Nadelson et al. 2014) and on specific topics from climate change and evolution to vaccines and nuclear power (Hamilton 2015b; Hamilton et al. 2015b). Regarding non-science topics as well, conservatives appear to have more politically homogeneous online networks (Boutyline and Willer 2017). They tend to be more attuned to partisan cues (Bullock 2011; Carmichael et al. 2017), more politically selective in their exposure to information (Rodriguez et al. 2017), and more responsive to ideologically compatible fake news (Guess et al. 2018).

The asymmetrical interaction in Fig. 3 adds a new piece to this puzzle. Liberals and conservatives both overstate their understanding of climate change. Science-aware individuals might accept the scientific consensus, however, without necessarily understanding the evidence behind it. This point has been emphasized in research on awareness of the scientific consensus as a “gateway cognition” for recognizing the reality of anthropogenic climate change (Lewandowsky et al. 2013; Maibach et al. 2014; van der Linden et al. 2015, 2017a). Truly understanding even a moderate amount about this topic, however, does require knowledge of science. In contrast, the minimal quiz employed here covers elementary facts that, if not already known, could be picked up through casual attention. Although self-assessments are biased high and our knowledge quiz sets the bar low, among liberals and moderates these two measures at least correlate, such that higher self-assessments tend to occur with more knowledge. Among the most conservative respondents, that does not hold; higher self-assessments appear unrelated or even negatively related to physical-world knowledge.

These results have implications for science communication. Efforts at conveying scientific information may not reach people who confidently reject that science without knowing basic facts. We see many others, however, who should be more open to outreach and science communication. These include political moderates holding less committed views on climate, and expressing lower confidence in their own understanding. Self-assessed understanding reaches its lowest levels among non-Tea Party Republicans or moderate conservatives, in both surveys analyzed here. Moderates and moderate conservatives could be a focus of future research evaluating differential effects of outreach or communication strategies.