Introduction: (Mis) Understanding Science

Recent discussions regarding the ethics of science typically concentrate on a number of topics: the misuse of technology, the regulation of the results of research, and the ethical conduct of research (see e.g., [10, 12, 13]). In addition to these issues, there is another part of the chain of events from the genesis of an idea in science to its realisation in modern technologies, and that is the communication of research to the public. It is this communication I wish to address.

The communication of scientific results may seem an innocuous concern: surely there are more pressing issues when we are confronted with the unethical treatment of research subjects, the creation of weapons from novel scientific research, and so on. However, badly communicated, misunderstood or wilfully misinterpreted science can have substantial negative consequences, whether in the form of bad policy decisions, the perpetuation of harmful false beliefs, or general panic.

Before I continue, a point of clarification regarding “misrepresentation”. Hereafter, for the sake of clarity, I will refer to the multitude of ways in which science is falsely represented—intentionally or accidentally—as ‘misrepresentation’. Readers may judge those who intentionally misrepresent scientific research differently from those who do so unintentionally. However, much of my argument stems from the status of the scientist in relation to non-scientists, and from the self-interested motivations scientists might have for ensuring correct public understanding of science. Under my account it matters little, from the perspective of the scientist, whether the misrepresentation is intentional or accidental: their motivation to correct these false representations should exist in either case.

Perhaps the simplest example of a misunderstood or miscommunicated scientific concept is the term “weight.” If Jon were to ask Jane what her weight was, Jane might answer in kilograms or pounds, depending on what country she grew up in. However, strictly speaking, Jane would be correct only if she were to respond in newtons (or pound-force, in the Imperial system), or if she reminded Jon that he was most likely asking after her mass, not her weight. Weight describes the force of gravity on a body, while mass is, roughly, a measure of the amount of matter that makes up a body.
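To make the distinction concrete, consider a minimal worked example (the mass figure and the rounded gravitational accelerations are illustrative, not drawn from the discussion above): weight is the product of mass and the local gravitational acceleration, so the same mass yields different weights in different gravitational fields.

$$
W = m g, \qquad \text{e.g. } 60\,\mathrm{kg} \times 9.8\,\mathrm{m/s^2} \approx 588\,\mathrm{N} \ \text{on Earth}, \qquad 60\,\mathrm{kg} \times 1.6\,\mathrm{m/s^2} \approx 96\,\mathrm{N} \ \text{on the Moon}.
$$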

The above example is an innocuous one, and is certainly not one we would normally expect scientists to correct. It is also a confusion that is unlikely to have any practical consequences, apart from causing the odd physicist to grind their teeth. Unless we colonise outer space and we have to standardise scales to the gravitational fields of different planets, this is unlikely to be a public misunderstanding that needs to be resolved.

However, some misrepresentations of science may have problematic outcomes. The first we might think of is when science promises, or warns of, too much. This is a particularly salient point for nanotechnology, about which there are multiple and diverging opinions concerning its implications (e.g., [11]). If we react too harshly to nanotechnology out of ill-informed fears, we might needlessly over-regulate it. On the other hand, if we become intoxicated by the promises made by advocates of technologies that converge on the nanometre scale, we might allow highly dangerous research to be conducted and unsafe technologies to be implemented. How these considerations are represented is therefore extremely important.

The next is the misrepresentation of climate science in the debates surrounding government policy designed to mitigate the effects of anthropogenic climate change. Not only the research pertaining to this issue, but also the conduct of the scientists and the process by which science is done, have been called into question by those who identify themselves as “climate-change sceptics” ([8], pp. 127–146). It has been remarked for some time that these sceptics frequently misrepresent the available scientific evidence to push certain political agendas [14, 16].

Finally, the use of fMRI scans to determine gender-based differences in neurological function has been criticised for perpetuating existing gender biases. Cordelia Fine has argued that gender stereotypes are being propagated through the misrepresentation of this science: suspect interpretations of data, which (among other things) assign excessive significance to research findings or draw overly broad inferences from them, are published in popular science media [5].

What should we do about the harms these misrepresentations can cause? Moreover, who should be responsible for correcting the public understanding of science? I will argue that it is the scientist who ought to be responsible for correcting the misrepresentation of science. By ‘responsibility’ I do not mean that it is exclusively the scientist’s fault when science is misrepresented. What I mean by ‘responsibility’ is that the scientist has a duty to prevent or mitigate the misrepresentation of scientific knowledge to the best of their ability. Whether or not scientists ought to be blamed, and how they ought to figure in the distribution of blame, is another discussion.

With this in mind, I will proceed in the following way. I will give three accounts of why a scientist ought to act to foster the correct understanding of science—one based on the capacity to prevent harm, another based on the distribution of skill, and one based on self-interested reasons. These accounts are logically independent: one can accept any one of them without accepting the others. It is my belief that if one accepts at least one of these accounts, one should accept my conclusion. Following these accounts I will briefly discuss some options scientists might adopt in addressing the problem of the misrepresentation of science.

Argument One: The Scientist and Harm’s Way

My first argument is one that concerns vulnerability. That is, A is vulnerable to B if and only if B’s actions and choices have a great impact on A’s interests ([6, 7], 779). It has been argued that in cases of vulnerability, B has an obligation to help A by virtue of the dependency that this vulnerability creates. Moreover, those who are task responsible—those whose duty it is to prevent or mitigate harm—may not be the same as those who are causally responsible for the harm ([6, 7], 779–780).

Regarding responsibility for addressing the misrepresentation of science, an argument from vulnerability is simple. The misrepresentation of science can be harmful, as I have shown above. A journalist, a politician, or even a concerned citizen may be directly causally responsible for the misrepresentation of science; that is, they may be responsible for the act of conveying scientific material in an inaccurate or misleading manner. But, I wish to claim, it is the scientist who should be responsible for rectifying these misrepresentations, or for engaging with the public in such a way as to prevent them from occurring.

Why would we hold the scientist responsible? Scientists, as individuals with specialised knowledge and training, are in the best position to prevent this harm from occurring ([6, 7], 62–70). And insofar as we are vulnerable to scientists as a result of how their research is conveyed, whether by scientific or lay sources, scientists carry a general obligation to prevent such harm from occurring. Whether it should be scientists as individuals or scientists as a collective who meet this responsibility is something I will deal with at the end of this paper.

Argument Two: Scientists as Professionals

To say that a scientist has a general obligation to prevent disvalue from occurring through the misrepresentation of their research is not enough. Scientists have a special obligation, I would venture, to prevent the misrepresentation of their work. This comes from their status as professionals.

Now, one might argue that there is a blurry line between professional scientists and those who merely know science. That is, while there are people whom we call “scientists” as a matter of convention, there does not seem to be a firm line separating these individuals from those who merely possess scientific knowledge. We might think this stands in opposition to lawyers or doctors, whose accreditation signifies, if not a different kind of status, then at least a set of knowledge and skills so much more developed than the layperson’s that the professional–lay distinction is far sharper than that between scientists and mere possessors of scientific knowledge.

I concede there may be no fact of the matter as to what makes a scientist, only individuals who possess more or less knowledge of science. But the same goes for medical practitioners. Say I attain my M.D., but do not become a professional physician. I possess all the skills to become a practising doctor; I just have not become a doctor as such. One day as I am walking down the street, a man nearby stops, as one eyelid droops and his face goes rigid on one side. I immediately identify that the man is in the early stages of a massive stroke. If I continue to walk past, am I accountable when, 15 seconds later, he falls to the ground with no competent medical practitioner present to at least identify his symptoms and call emergency services? Perhaps I have not strictly failed in a professional obligation. But my knowledge and my ability to provide care and to alert other practitioners (time to diagnosis and response being important to the outcome of a stroke) surely bring with them responsibility. Or, in a less extreme case, if I have taken a CPR course but have not become a paramedic, and while at the beach a man is pulled ashore by a friend after going under and stopping breathing, most would say I am responsible for helping him. Regardless of my not being a professional in the sense of holding a particular job, my status as one uniquely qualified to prevent disvalue brings some responsibility.

Likewise with the scientist. In certain circumstances, scientists, or those who know science, are the only ones who can accurately understand and predict the outcomes of science. They carry with them the responsibility to prevent disvalue where they are in a position to do so. This includes the prevention of harms that result from the misrepresentation of science.

Further, not only does the scientist have a duty to prevent the disvalue arising from the misrepresentation of science, as I argued previously, but, owing to the specific set of skills invested in them, we have a claim against them that they do so. Insofar as liberal democratic societies invest large amounts of public resources in the training of scientists and the advancement of science—United States spending on basic science for the last financial year was $60.5 billion [1]—and universities train students to become the expert population in science, we as a society have a claim to the good that comes from this unequal distribution of social capital, much as we do in any professional field.

Argument Three: Self-Interest

Finally, scientists have a self-interested motive to maintain the adequate representation of their research. Science, like any field of endeavour that relies on heavy public support to function, requires credibility. Regardless of whether science is causally primary in particular malevolent uses of science and technology, the institution of science may suffer when science is misrepresented. This loss of standing can seriously disrupt the progress of science, through lack of engagement with emerging scientific issues, distrust of new technologies, diminishing funds for research, and lack of involvement in education in the sciences [3].

This process is not always warranted: recent discoveries of potential malpractice by climatologists at a British research centre have led to a protracted debate regarding trust in science, despite assertions that this was an isolated incident that does not debunk the existing evidence regarding anthropogenic climate change [14, 16]. But this drives at the heart of my point—scientists have strong prudential reasons to engage in debates regarding how science is used, as they have a stake in how it reflects on them. If science is used to perpetuate falsehoods, or science is falsely represented, then the credibility of the institution of science may suffer. If science is used to distort political debates, science suffers. If a general panic occurs because of misrepresented science, then again, science suffers. For all these reasons, scientists have a self-interested motive to ensure science is not misrepresented.

Now, while the climate change case has certainly led to questioning of the legitimacy of climate science, it may be argued that this is the exception rather than the rule. Certainly, many scientific disciplines actually have their reputations enhanced through misrepresentation, whether by promising more than the field can be expected to deliver at this time (as with nanotechnology) or by overstating the significance of current results (as with neuroscience). Thus, it may be in the self-interest of scientists, with the exception of those on the receiving end of political opposition, to misrepresent themselves.

Moreover, public panic arising from misrepresentation can negatively impact science, as concerned groups seek to heavily regulate fields due to potential but as yet far-off threats posed by certain types of technologies. Philip Ball highlights this by noting that, despite nanoparticles potentially posing a health hazard, the lack of coherence about what qualifies as a “nanoparticle” or a “nanohazard” means that the creation of a nanohazard symbol, analogous to the biohazard symbol, would be inappropriate [2]. Scientists may rather have an incentive to misrepresent their work in overly positive or triumphalist tones, to prevent public opinion from constraining them.

However, there are competing self-interested reasons for scientists not to want to misrepresent themselves. The first is that while misrepresentation may bring short-term gains, its long-term benefits are anything but certain. We might think that popular reactions to modern genetic selection and selective abortion practices as a revival of “eugenics” (a term that is and has been used pejoratively, though this pejorative sense is not the only way in which it can be used; see e.g. [15], pp. 8–10) have harmed the perception of modern genetics to some extent. This example may serve to show that while short-term gains in prestige through misrepresentation may provide a certain kind of incentive, reflection will show that the risks to a field, or to the fields which emerge after it, make such actions problematic.

Second, we might want to take scientists up on their claims that science is the practice of understanding facts about the world. Certainly, the central figures of the “new atheism,” such as Richard Dawkins and Sam Harris, profess that science has greater power than other fields of human endeavour, such as religion, to give true statements about our world (see e.g. [4, 9]).

I choose not to engage with the validity of these claims here. However, if science is engaged in the practice of misrepresenting itself, or of allowing itself to be misrepresented, this, it seems, would create a tension between the image of science that scientists and others seek to present and the things carried out in its name. We might think that the institutions of religion may have to be held accountable for the atrocities committed in their name, sanctioned or not. By the same token, the bad outcomes arising from the misrepresentation of science reflect just as much on scientists. If scientists are serious about their claims that science is a source of truth in the world, they should ensure that these claims are genuine, and not a form of charlatanry.

Discussion: Other Stakeholders and Policy Considerations

All this, of course, does not excuse others. As I stated earlier, this paper is primarily about giving moral and prudential reasons that scientists may have to maintain and monitor the perception of their research. But those who wilfully misuse or misinterpret science are still at fault. What we may say, however, is that scientists who do nothing to prevent their research from being maligned are partly at fault for the harms that may result from it.

A further question arises when we ask what actions this duty might require. It is one thing to say that scientists have a duty to prevent the misrepresentation of science. It is quite another to say how this duty would best be satisfied. There are two broad sets of options: the individual and the collective discharging of the duty.

On an individual level, we could say that scientists ought to carry out their duty to correct misrepresentation of science wherever they find it. Scientists should be active in monitoring the way that science is used, and vocal in correcting any errors found in public life. Moreover, they should seek to educate individuals about the correct interpretation of science wherever they can. When a journalist misuses a scientific claim, by drawing the wrong inference or making claims the science does not conclusively show, the scientist should respond.

However, this could be a very demanding solution. Scientists must expend considerable energy keeping up with the progress of their own field, let alone taking the time to investigate the possible places their science is misused. Multidisciplinary, transdisciplinary and critical uses of science intensify through the creation of convergent technologies, the increasingly scientific basis of explanations of behaviour, and the ever larger policy impacts that science makes. To keep up with all of this would exhaust the resources of most scientists. It may be that we are required to limit this claim somewhat, perhaps by saying that if a scientist finds an instance of misrepresented science that she is equipped to correct, but does not correct it, she is culpable for that misrepresentation being perpetuated.

On a collective level, scientific associations can provide a voice in debates that use or misuse scientific evidence, and so offer a good way of collectively discharging the duties of scientists. However, these associations are often highly specific to certain fields, or are so broad that they run into as much trouble covering ground as individual scientists do.

At an intermediate level, perhaps scientific communicators will fill a role somewhere between the individual and the large collective. They serve as professionals who, while participating in the scientific enterprise, also interface between science and the public. A few such communicators—if not many relative to the number of active scientists—are already active in public life.

The reality will most likely be a mix of these solutions. Individual scientists will police the use of their own work in a responsible manner, by writing letters and replying to critics in public. The proliferation of online media makes this process easier in some respects and harder in others: access to information increases, but the number of voices increases with it. Scientific communicators will act as spokespeople for sections of the field.

For larger, more insidious problems, professional societies are much better equipped than individuals to tackle organised groups abusing scientific understanding. Some misrepresentations of science, such as those in the global warming debate, are conducted by large, well-funded organisations and lobby groups [8]. In these cases, it is not a single rebuttal but a systematic campaign that is needed to educate the population about the presentation of faulty views. In addition, such societies can help educate the populace about systematic errors in understanding scientific research, such as what statistical inference does and does not show, the causation–correlation distinction, and so on.

There are also things the media can do to facilitate scientists communicating science properly. One is an increase in the standards of science journalism, along with greater space within popular publications for scientific input, review and critical reply. In addition, the recognition and incentivisation of more scientific communicators would serve to bring about greater engagement between science and the public. Finally, education in the sciences to increase the base knowledge of science the public possesses, particularly in grounding subjects such as statistics, mathematics, and basic science, would serve to make the scientist’s job easier by preventing problems rather than solving them after the fact.

Conclusions

Scientists, as members of society, professionals in an influential community, and members of a field they wish to retain good standing, have reasons to attend to how their research is understood. The misrepresentation of science can hurt not only others, but science itself. I have provided three accounts of the responsibility of the scientist to ensure the correct representation of scientific knowledge. The specifics of how this is carried out in practice are much more difficult, but needless to say there are solutions, such as the ones I have sketched above, that are applicable at many levels of governance.