Debates and discourse around vaccines—like many debates about science and technology at the public interface—extend well beyond technical consideration of the objects themselves to encompass a host of extra-scientific concerns. For vaccines, the discussion blends with political ideologies, geopolitical conflicts, and social identities. In short, discussions about vaccines are always about much more than vaccines. Stuck: How Vaccine Rumors Start—and Why They Don’t Go Away, by Heidi Larson, exemplifies this far-reaching view and thereby challenges singular analyses of vaccine hesitancy that lay blame on poor science literacy, social media, or the biggest bugbears: misinformation and disinformation.

Stuck offers an examination of vaccine rumors—the narratives, the social vectors that transmit them, and how they manifest in different contexts. The book was highly anticipated, as it draws from Larson’s decades of well-regarded work as an anthropologist rumor hunter studying vaccine confidence. The many reviews of the book reveal competing interpretations of Larson’s work. One characterizes Stuck as a helpful addition to misinformation studies (e.g., Donovan 2020); the other draws from the text an ecological view of rumors, whereby rumors are informative even if untrue (e.g., Galchen 2022). I accept the latter interpretation and see Stuck as, indeed, a corrective to the predominant thrust of misinformation studies and the whack-a-mole approach to solving misinformation problems.

A review in Nature by a misinformation studies scholar introduces Larson as a “researcher of rumors” yet likens Stuck’s findings to the reviewer’s own scholarship on misinformation (Donovan 2020, 681). Donovan (2020, 681) writes: “It is apparent from Larson’s book and my own research that to counter vaccine hesitancy, a broad coalition of medical professionals, journalists, civil society organizations and technologists must develop a plan for challenging misinformation.”

Noticeably missing from this list is public representation, which serves democratic goals, confers pragmatic benefits such as encouraging public cooperation, and affords the epistemic benefits of theorizing from the margins (e.g., standpoint theory). This omission either neglects the value of public participation or casts the public as a problem to be managed by experts. I think Larson would agree that neither is acceptable. Donovan also recommends technological solutions to what she sees as a technological problem. Specifically, she wants “research into how bad information rises to the top of search engines and circulates online, and … strategy to halt that contagion” (Donovan 2020, 681).

This is a misrepresentation of Stuck and of Larson’s research program. Larson takes a holistic view of rumors, pointing out that they do not serve only negative purposes. They can, for instance, be a lifeline for citizens denied vital information by authoritarian governments. Even democracies, I would add, cultivate public messaging by means of propaganda when deemed necessary. Thus, equating a rumor with misinformation adopts the vantage of the political elites, whereby official communications are treated as the only reliable source of content.1

In Stuck, I read a challenge to the common practice among media-trained health scientists and health care practitioners to “myth bust,” “debunk,” and “separate vaccine fact from fiction” (e.g., Cassata 2021). Larson persuasively argues that vaccine rumors are not problems to be snuffed out; rather, they are signals that call for a deeper understanding of why they came about and why they stick. This is how we get unstuck. Larson’s insight does not call for an end to fact-checking, but it does suggest that this mode of science communication is less helpful than the myth busters think. Larson’s message is for all of us—friends and family of vaccine-hesitant people, health care providers, public health researchers and practitioners, and science communicators. Getting the facts right is not enough. Damning for misinformation studies is Larson’s framing of the core problem: “We don’t have a misinformation problem, we have a relationship problem” (quoted in Gellin 2020, 304; also quoted in Paun, Deutsch, and Tamma 2020).

While Larson does contribute to misinformation studies (see Larson 2018), and the Vaccine Confidence Project that she leads is a sought-after resource for organizations combating health misinformation, Larson insists that to focus on the inaccuracy of any given rumor is to miss the point (Larson 2020a). She told the New York Times, “If you shut down Facebook tomorrow, it’s not going to make [misinformation] go away. It’ll just move” (Anderson 2020). Misinformation can be deleted, but the underlying distrust that caused it remains.

The issues surrounding vaccine hesitancy are, more than anything, about people feeling left out of the conversation. “This is a public cry to say, ‘Is anyone listening?’” she writes (Larson 2020b, xxv). Vaccination campaigns, Larson explains, depend on “a social contract whose fabric is eroding in a broader context of anti-globalization, nationalism, and populism” (126). The foundation that underpins vaccine confidence, then, is trust. This is a converging view among the scholars participating in this vaccine book review forum (Charles, Goldenberg, Hausman, Larson, Lawrence, and Navin). Larson writes that “risk perceptions are closely entwined with levels of trust. The higher the trust, the more willingness to take a risk; the lower the trust, the higher the risk aversion” (xxxvi–xxxvii). This arguably explains why some members of the public are not satisfied with official sources of health information, like the European Medicines Agency and the Centers for Disease Control and Prevention, and resolve to do their own research. Whereas misinformation studies can offer copious anecdata about rabbit holes (Bernstein 2021) and convey urgency through martial language like “information warfare” (for criticism of this conflict narrative, see Hwang [2020]), the field pays little attention to the climate in which misinformation takes hold. According to Larson (as articulated by Jenny Anderson of the New York Times), “Rumors take root in the soil of doubt, and it’s the soil that wants attention” (Anderson 2020).