Foreword

When the Île-de-France Nanosciences Competence Centre (C’Nano IdF) was launched by the French Ministry of Research, the National Centre for Scientific Research (CNRS) and the French Atomic Energy Commission (CEA) in 2005, its director immediately planned to include, alongside its thematic axes, a transverse axis entitled “Nanoscience & Society”. This axis was created in 2006. It consists of a team of researchers in human and social sciences, toxicology and eco-toxicology. Its Office welcomes representatives from these disciplines, as well as researchers in nanophysics, nanochemistry and nanobiology, in order to foster research with a genuinely interdisciplinary dimension.

The reason our Office supports multidisciplinary research on the development of nanosciences and nanotechnology (NST) is to explore the different controversies emerging in this arena.[Footnote 1] The objectives include: a better understanding of the health and environmental risks, a deeper inquiry into the philosophy and sociology of nanosciences and their applications, into their place in our educational system, their regulatory frameworks and markets, etc. Numerous research programs have shown the advantages of an epistemological openness to views from diverse disciplines, and perhaps new interdisciplinary methodologies or concepts, for our work on nanosciences and nanotechnologies.

However, in the interactions with our Île-de-France colleagues, there have been occasional misunderstandings about our objectives, methods and research practices. Surprisingly, the very existence of explicit techno-scientific controversies in the “nano” arena is often denied on behalf of a conception of science, risks, public engagement and responsibility which borders on a disembodied idealism and merits at least serious discussion.

The recurrence of this view prompted us to clarify our position regarding our common field of research, in order to avoid being trapped in the seemingly clear divide between the universal and neutral pursuit of pure science, on the one hand, and on the other the infinite variety of values and opinions that lead into the throes of pure relativism (Vinck 2007).

We therefore launched an internal trans-disciplinary project within the Nanosciences & Society Office, in order to overhaul the premises underpinning both this idealistic standpoint and our own work, and to find a better definition of our approach to the exploration of the real policy implications of NST research initiatives. Indeed, the debates surrounding NST clearly have wider implications for the examination of issues of Science, Technology and Society as a whole.

A first step in this clarification process was taken in Paris, at the conference which has now been published in this issue of Foundations of Chemistry. We further develop it in this paper, in the hope that it will contribute to the joint construction of a better nano-future.

Introduction

Unlike previous Research and Development (R&D) programs and technological waves, nano-R&D is distinguished by the explicit willingness of NST proponents, practitioners and regulators to actively anticipate and regulate the consequences of the technology before it spreads into the wider world, instead of dealing with its ‘side-effects’ afterwards. Nanosciences and nanotechnologies are both a global scientific program and an ongoing political agenda, a societal project shaped by research—including Social and Human Sciences (SHS)—as well as a scientific project shaped by society (Laurent 2010). Because definitions of NST are irreducibly multiple (Theis et al. 2006) and not limited to a well-defined set of scientific disciplines or specific fields of application (Loeve 2010), the meaning of NST can be agreed as a matter of fact only to the extent that it is a matter of concern (with shared problems and controversies).

Nanosciences and nanotechnology (NST) can hardly be characterized otherwise than as an assemblage of:

  • Heterogeneous fields of pre-existing scientific subdisciplines, e.g. surface science, spintronics or supramolecular chemistry;

  • Instrumental practices, e.g. probe microscopy, DNA-microarrays or optical tweezers;

  • Technoscientific objects, e.g. carbon nanotubes, molecular machines, magnetic nanoparticles;

  • Visions and narratives of the future, e.g. technological greening of old industries, the prospect of transhumanity, an interconnected world innervated by an invisible network of ubiquitous devices;

  • Regulatory definitions, e.g. at the International Organization for Standardization (ISO), at the Organisation for Economic Co-operation and Development (OECD), or in the European Commission’s definition of nanomaterials;

  • And changing research policies and innovation systems: from ‘mode 1’ to ‘mode 2’ (Gibbons et al. 1994), from ‘big science’ and ‘pure science’ to ‘science in the context of application’ (Carrier and Nordmann 2011) or to ‘post-academic science’ (Ziman 2000; Johnson 2004).

Nanosciences and nanotechnology (NST) is the ongoing collective experiment assembling all these heterogeneous elements. SHS do play an important role in this collective experiment. Contrary to a well-established view of the relations between the ‘hard’ sciences and the humanities, SHS research on NST cannot be considered as merely a soft outer layer covering a hard scientific core. Just consider the assemblage outlined above: it is clear then that there is no nano-center to contrast with a societal periphery (Jones 2006), but rather a multiplicity of interconnected spheres, including science considered as a social activity in its own right as well as other activities which partake in our common world: law, trade, education, art, work, etc.[Footnote 2] Concerns about the implications of nanotechnology do not come simply from a handful of SHS research outsiders worried about the hazards of nanotechnology or, more pragmatically, looking for new opportunities. From the outset, early nanotech visionaries such as Drexler, who popularized the concept of nanotechnology, clearly articulated the prospect of a revolutionary general-purpose technology at the molecular scale with a reflection on its possible social and cultural impacts (Drexler 1986).[Footnote 3] Even after the marginalization of the nanotech guru and his utopian/dystopian visions, the first large-scale R&D funding scheme—the US National Nanotechnology Initiative (NNI), launched in 2000—included funding for research into the ‘societal and ethical implications of nanotechnology’. With initiatives of this kind, which have since been followed across many countries in the developed and developing world, nanosciences and technologies have mobilized SHS research as an integral component of their development in order to anticipate and assess the so-called ‘future impacts’ of the emerging technology[Footnote 4]—a framing, as we will argue, which is not neutral.

After a period of intense polarization around ‘hype’ issues (Schummer 2005)—such as runaway self-replicating nanomachines, brain-machine interfaces, or invisible spy chips—which had the merit of bringing nanotechnology into the public arena, the debates then refocused on more ‘reasonable’, narrow and—apparently—less ‘politicized’ issues relating to the toxicological risks of nanoparticles and materials (Wood et al. 2008). We are now entering a third phase predominantly framed and phrased around the issue of responsibility in a seemingly more inclusive and consensual dynamic. With watchwords such as ‘Responsible Research and Innovation’ (the prevalent catchphrase in the European context) and ‘innovative and responsible governance’ (in a more US context, see Roco et al. 2011), responsible innovation and governance is now the dominant paradigm in nanotechnology policy, embracing a voluntary approach to managing nanotech risks and societal issues.

To be sure, ‘responsible innovation’ is a fine-sounding, vague and catchall term. However, it contains a tension that can be brought out by asking whether responsibility should serve innovation or innovation serve responsibility (Rip 2011a).

  • In the former case, responsible innovation expresses a commitment to develop the technology while pursuing every reasonable effort to anticipate and mitigate adverse implications or unintended consequences. It is thus on a par with ‘responsible development’ and essentially means avoiding public backlash and fostering acceptance of the technology. When equated with ‘responsible development’, ‘responsible innovation’ does not fundamentally question the linear model of innovation where “science finds, industry applies, and public adapts”. In this case, there is indeed a tenuous line between adapting public opinion to the development of the technology and modifying innovation to reflect public opinion.

  • In the latter case, responsible innovation means looking for criteria that determine which part of ‘innovation’ is worth developing and which is not, in keeping with our conception of responsibility.

However, the term ‘responsibility’ itself contains a variety of meanings originating in very different spheres: the nineteenth-century paternalist ideal of ‘responsible science’ (Rip 2011b); the world of management and business, with the notion of ‘Corporate Social Responsibility’ (CSR) (Carroll 1999); and the philosophical field of technology ethics (Jonas 1984). The discourse on responsible innovation also implies a peculiar climate of urgency, in the sense that it urges innovation to respond.

But to what exactly should innovation respond? To ‘societal needs’, by giving the people what they (allegedly) want? To ‘competitiveness’, by exploiting all possible opportunities? To the avoidance of harm or the minimization of risk—‘negative’ responsible innovation—or to the active goal of doing good—‘positive’ responsible innovation (Randles et al. 2012)? Moreover, what qualifies innovation as responsible? Is it providing the public with transparent information, or engaging the public in real debates? Is it attending to Health, Environment and Safety (HES) and Ethical, Legal, and Societal Aspects (ELSA)? Is it bearing the costs of potential hazards, or favoring particular values or conceptions of a good life? Is it considering “all aspects” of one’s research, including everything that “happens with it outside of the lab” (SEI 2004), and if so, how far does this consideration extend and how can it be made ‘do-able’ (Kelty 2009)?

For sure, ‘responsible innovation’ means many different things to many different people. It is an umbrella term covering a variety of systems of governance and practices at different macro/meso/micro levels (Rip 2011a): political and societal discourses; institutional codes of conduct; ELSA consortia; ongoing practices of scientific, industrial, and civil society actors. Like all buzzwords, the term is intended to be consensual: motherhood and apple pie. It is devoid of controversial content and open to constant reinterpretation. Like ‘sustainable development’, ‘co-opetition’ and other oxymorons (Méheust 2009), it smooths the angles and prevents the disclosure of potential conflicts of values. However, though ‘responsible innovation’ is open to criticism in this respect, this does not mean that it should be abandoned without further examination.

In this collective contribution by the C’Nano IdF Nanoscience & Society Office members, we want to take ‘responsible innovation’ as an opportunity to disentangle various ways of construing responsibility, in terms of the different instruments and contexts of public action that shape the field of nanotechnologies. By asking how responsibility is construed and engaged, what we are responsible for, to whom and to what extent, and, last but not least, who ‘we’ is—who is the subject of responsibility, individual or collective?—our aim is to clarify our own conception of and approach to responsibility in order to make sense of our action with and within NST.

In the following lines, we systematically examine two ways of exploring the relations between nanosciences and society (which are also two ways of construing responsibility) in three interrelated arenas: the production of knowledge, expert assessment and the public management of risk, and public debate.

Deliberating knowledge production: science AND society versus science IN society

Irresponsibly, we have placed science and technology in the realm of public debate where it does not belong. Scientists debate among peers in specialized conferences and publications. Scientific—and even technological—research is a process the ultimate aim of which is unpredictable.[Footnote 5]

It seems that part of the hiatus in the relations between Science and Society comes from a widely promulgated vision of science, often heard during nano-debates or within the labs, which perceives science as an activity entirely separate from the social sphere. In this scheme, science is thought of as an activity beyond the reach of ‘ordinary people’, reserved for an elite, the ‘knowledgeable’, who have their own rules and autonomy, and whose productions are controlled internally through compliance with scientific standards and externally through a system of peer review, which ensures both the scientific rigor of the reasoning and, more generally, its legitimacy.

In other words, approached from a perspective developed by sociologists of science, what is propagated here is a Mertonian vision of science (Merton 1973). In this account, science is an activity carried out by disinterested actors working only for the pursuit of truth through the practice of organized skepticism, in a communalist approach where knowledge is accessible to all, universal and objective. Science is thus seen as an autonomous field of intrinsically high value, separate from the social sphere.

This perspective is not interested in science as it is practiced, but in an idealized science (Barnes and Dolby 1970; Mulkay 1969) that reveals the unmediated and uncontroversial reality of nature. It would seem that this vision of science as a world apart, detached from reality and from social, cultural and historical context, is not only dated,[Footnote 6] but mostly false. In particular, when it comes to NST, such an idealistic view completely ignores the real nature of research. Consider, for instance, scientists seeking to connect molecules to nanoscale electrodes in order to make them function as electronic components. What they are doing is not concerned with testing fundamental hypotheses about the reality of nature, independently of human purpose and action. Even though such activities require a great deal of scientific knowledge to model, calculate and measure the operations of the system, it is knowledge about a technical object (or a molecule conceptualized and instrumentalized as such). Even though it may generate new knowledge, that knowledge concerns technological possibilities rather than fundamental properties of nature independent of humans. Nanoscale research does not ask of its objects “what is it?”, nor always “what can it be useful for?”; it rather asks “what can it afford?” It is a mode of research primarily interested in exploring and exploiting the ‘affordances’ of the nanoscale—the possibilities of action offered by nanoscale environments, objects and processes (Gibson 1979; Harré and Llored 2011; Bensaude-Vincent and Loeve 2013): how an object, process or phenomenon can be made controllable, manipulable, tunable, trackable, addressable, transferable, triggerable, etc. While nanoresearch does not always directly seek to produce useful applications, it is nevertheless always concerned with acquiring new capabilities of technological design and phenomenon control.

The social, cultural and political history of science and the sociology of scientific knowledge have drawn certain conclusions regarding science in Society:

  • Knowledge is contingent on the socio-historical contexts of its production;

  • Knowledge and society simultaneously (re)define and (re)construct each other;

  • Science is not an ideal city of egalitarian and democratic spirit;

  • Knowledge has always been linked with power.

The sciences have long been applied in the design of techniques, material objects and weapons, contributing to the practical mastery of the world and to the production and implementation of political and military control; they have been decisive in the reproduction of elites and in social selection, and have provided ideals, standards, and new ways of being-in-the-world (Pestre 1995, 2003). In addition, like other social activities, scientific research is financed in (large) part from public funds, which is a good reason why the choices that guide the allocation of those funds should be discussed by all.

Today, it is acknowledged that the myth of pure science performed an essential function. By successfully obscuring the role of scholars within society, this discourse has promulgated the intrinsic superiority of western civilization’s scientific modes of thinking and being (Bonneuil and Petitjean 1997). It has enabled the West to establish and sustain its intellectual ideals and moral values and act to control the world, things and people. Yet today’s NST are international; they incorporate a multiplicity of visions of the world from heterogeneous cultural backgrounds. Consequently, they invite us to rethink the dominance of purely western views and needs in a multifaceted world.

Without taking sides—which is not our purpose here—on the qualities and deficiencies of western civilization, the pure science discourse helps to sustain and reactivate its big metaphysical dichotomies: between facts and values (Putnam 2002), between object and subject, between primary and secondary qualities (Whitehead 1920/2004), between the world as it is and us, human beings, who are in a position to state what is knowable, but also between everyday, ‘ordinary’ laboratory practices and the sanitized and sanitizing reports that arise from them (Latour and Woolgar 1979). While the meaning produced always embodies a specific historical and social order, scientific statements are presented as socially neutral, describing nature as it really is, simply telling the truth. They become undeniable, especially for lay people (Latour 1996). Moreover, this kind of dichotomous perspective seems all the more absurd in that the research field of NST, as observed, is the outcome of technological advances, proclaims a close association with societal and economic goals, and renders obsolete established dichotomies such as ‘basic’ versus ‘applied’ science, ‘science’ versus ‘technology’, ‘scientist’ versus ‘engineer’, ‘knowing’ versus ‘doing’, etc.

Research on the history and sociology of innovation has shown, for instance, that basic and applied research are linked in the case of electromagnetism (Wise 1988), nuclear physics (Latour 1989), and materials science (Callon et al. 1991; Bensaude-Vincent et al. 2011). NST, however, does more than intertwine basic and applied modes of research. While the category of ‘basic science’ always presupposes the need for a category of ‘applied science’—and vice versa—the emergent and exploratory research conducted under the NST umbrella seems to belong to a regime of knowledge production that proves indifferent to these dichotomies. For that reason, some prefer to call this kind of knowledge regime ‘technoscience’, where theoretical representation and technical intervention cannot—even in thought—be distinguished (Bensaude-Vincent et al. 2011). Indeed, a large proportion of nano-objects functioning in the lab (such as molecular machines or solid-state quantum computers) have no short- or medium-term applications: they could have applications—and of course some of them will—but they are not primarily designed or valued for utilitarian purposes (Loeve 2010). Does this mean that they constitute fundamental or basic research? Not at all! Despite the efforts of many scientists who claim that they are doing nanoscience rather than nanotechnology, since they are less concerned with applications and are often dismissive of technology—viewing it as a purely utilitarian and market-oriented activity—the lack of direct application is not enough to posit a new fundamental domain of science. For technoscientific research, it makes no sense to separate theory and reality, or mind and world, and only then to see how they relate to one another (Nordmann 2013). In this respect, technoscientific research may appear quite fundamental, as Gilbert Hottois remarked about physics (Hottois 1984, 68–69):

Mathematical and experimental (hence doubly operational) physics is content to formulate, in mathematical form, what happens on the occasion of a technological operation. It refers exclusively to technological procedures, to the technical measurement and recording of the result of interactions. The question of quiddity (what and what essence) is totally alien to it.

Nanosciences and nanotechnology (NST) of course includes some very practical research domains, such as targeted drug delivery or nano-electro-mechanical systems (NEMS), and modeling, measuring and calculating the behavior of NEMS is precisely an example of the sort of seemingly very theoretical practice referred to by Hottois. Current industrial uses and nano-products are also widespread in materials, cosmetics, food, etc. But there is no necessary unilinear sequence in NST from laboratory to industry, or from theory to application. In fact, many industrial uses of nano-objects predated the academic research carried out on the same objects. To take an iconic example, when the field of carbon nanotubes began its scientific take-off with Sumio Iijima’s famous 1991 Nature report, carbon nanotubes had already been on the market since the 1980s as intercalation compounds for lithium-ion batteries (Endo 2002). Many NST research objects (such as titanium dioxide nanoparticles or semiconductor crystals) were first produced in industry before entering the lab, where they are then studied like ‘natural’ objects—one more example of the “ontological indifference” of nanoresearch to dichotomies such as the natural versus the artificial (Galison 2010).

The authors of this paper consider scientific and technological research to be a fully fledged social and cultural activity like education, law, art, etc. and therefore believe that it should be the object of political reflection. Of course, studying the practical developments of NST does not mean that we claim to be able to assess the quality of scientific work. We rather seek to explore the goals of research and bring them into the public sphere. While the results of scientific and technological research are sometimes unexpected, the goals are easy to define: generating relevant knowledge in a field, exploring new possibilities, producing new devices, knowing the risks, but also creating jobs, changing society, challenging nature, contributing to scientific and technological policy choices…

It is impossible to separate knowledge from its modes of production and of social existence. In our view, they need to be analyzed together. Nevertheless, the view that science and society are separate worlds still holds and underpins numerous discourses related to the debates on NST, which we will now explore further.

Deliberating expert assessment and politics: from risks to collective concerns

The production of technical knowledge in NST and the construction of the field are not confined to the laboratory. There are several dimensions of uncertainty in NST: uncertainties of knowledge (about the risks, such as the toxicological effects of nanomaterials and nanoparticles); uncertainties of development (about the developmental trajectories of nano-products); public uncertainties (about the social impacts, public attitudes to NST, and the way deliberations will be implemented) (Kearnes and Rip 2009). As public concerns about the safety of nanotechnology objects and consumer products were voiced early in the definition of nanotechnology programs, public expertise is an important component in the evolution of nanotechnology as a public issue. It is therefore crucial to examine the instruments and modalities of “responsibility” in the areas related to the development, use, and circulation of public assessment.

If conceived as an autonomous field of activity, expertise is supposed to be left to professionals. While situations of uncertainty—such as nanotechnology—might cast doubt on the production of assessment and prompt calls for public participation, the setting of boundaries to determine who can intervene in assessment and who cannot is a crucial issue. At this stage, two positions can be identified. The first consists in determining in advance the criteria whereby individuals can be selected to participate in the expertise process. The second approaches the construction of boundaries between ‘expert’ and ‘lay’, between ‘science’ and ‘politics’, as open questions for analysis, possibly entailing political dimensions.

Science and technology studies (STS) scholars have been at the forefront of this second approach. Using case studies related to public management of risk and uncertainty, they have explored the construction of boundaries between ‘science’ and ‘politics’ in areas where the distinction between the two is not straightforward, the process of ontological construction in scientific and social scientific assessment (e.g. in the life sciences), and the democratic challenges posed by the organization of assessment.

Numerous empirical studies have demonstrated the political significance of the construction, diffusion and use of public assessment in technical domains. The ‘boundary-work’ present in settings such as the American federal agencies (Jasanoff 1990) or the French ‘agences sanitaires’ (Granjou and Barbier 2010) is a process that is both epistemological (that is, expected to represent the facts of science) and political (in that it shapes the forms of public legitimacy in contemporary democracies). In the modern system of expertise—and particularly in matters related to technological risks—this results in a technocratic vision of expertise, both descriptive and prescriptive, in which a phase of ‘scientific’ evaluation conducted by experts is followed by a ‘political’ phase, characterized by decisions about the public management of the risks identified by the experts during the previous phase. Collective actions are then ‘responsible’ when experts and scientists follow the standards of scientific research, while decision-makers act in accordance with the scientific advice they receive, under the mandate they receive from citizens.

STS research has shown that modernity works by ‘purifying’ public issues so that an assessment can be made within such a linear system (Latour 1993). The processes that are able to draw gradual boundaries between ‘scientific problems’ and ‘political issues’ within intertwined entities are those that ensure the stability of the modern political system. Yet in situations of controversy and uncertainty, the purification process appears increasingly complex, a complexity that has been analyzed by STS scholars as arising from a tension between the acceptance of the need for boundaries on the one hand, and the practical realities of readjustment in the actual construction of expert knowledge on the other (Jasanoff 1992).

Nanosciences and nanotechnology (NST) are an interesting example of this mechanism. Assessing the potential risks of industrially produced nanomaterials is still an uncertain process, as even identifying the physico-chemical characteristics of these substances is problematic. This arises from the dynamics of innovation in nanotechnology. As in any of the materials sciences, the construction of nano-objects is based on the properties that are required for specific uses. ‘Nano’ substances are ‘materials by design’: they are developed to exhibit particular physico-chemical properties. What makes them ‘nano’ is then no more than the fact that they have new size-dependent properties. Consequently, two producers of carbon nanotubes will produce different nanotubes reflecting the needs of their different customers. They can develop objects that differ in their length, diameter, number of walls, conductivity, rigidity… and in other physico-chemical characteristics. In consequence, there are potentially as many nanomaterials as there are variations in an indefinite number of parameters. This makes any evaluation of the risks of ‘nanomaterials’ difficult, because a limited number of characteristics has to be chosen in order to assess the risks. An additional problem is that of measuring exposure to nanomaterials. As these substances move around, they can change, become attached to or separated from other materials. Exposure to nano substances thus depends on a large number of parameters, arising from the characteristics of the substances themselves and from the environments in which they operate. Hence, the separation of the ‘risk assessment’ phase from the ‘risk management’ phase is rendered difficult by the uncertainty characterizing the evaluation of the risks of nano substances: considerations related to the definition of ‘nano-ness’ have to be introduced upfront. They are discussed in regulatory and standardization arenas, through processes in which the connection between ‘science’ and ‘politics’ is not straightforward.

Consider for instance the definition of ‘nanomaterials’. ‘Nanomaterials’ are not a known category of chemicals, and may be defined in potentially controversial ways (Maynard 2011; Stamm 2011; Lacour et al. 2011; Laurent 2013a). They can be defined according to a 1–100 nm size limit, as has been the case in international standardization bodies (ISO 2008), for reasons relating as much to technical uncertainties as to the constraints of international diplomacy (Laurent 2013b). But criteria linked with the potential properties of the substances may also be applied. Thus, the European institutions have used criteria connecting ‘nano-ness’ with the size distribution (rather than with the size itself) of the components of a given material (EC 2011). In this latter case, the relationship between ‘nano-ness’ and potential hazards is more explicit, albeit not always acknowledged, and remains controversial in European policy and assessment circles. Defining nanomaterials (a prerequisite for any risk evaluation in the traditional linear model of assessment) is thus an inherently political process, where what matters is the way uncertainty is approached (Lacour 2012), and which goes hand in hand with a way of defining what “responsibility” means in (regulatory) practice (Laurent 2012).

There have been interesting experiments in other technological domains on the overlap between assessment and politics. For instance, the Intergovernmental Panel on Climate Change (IPCC) constitutes a new kind of structure that allocates a role to both collective assessment and international negotiation, making the Panel both a technical and a political entity (Miller 2001). In the case of nanotechnology, ‘responsible development’ has been identified with the call for a ‘precautionary’ approach (EC 2004). This is reflected in a number of initiatives, including the development of scientific research projects on the toxicology of nanomaterials, which have identified a number of potential hazards in specific cases. Yet the question remains what to do in a situation where risks are suspected, but not demonstrated in ways that might lead to consensual regulation. The European regulatory authorities have adopted measures such as amendments specifically targeting nanomaterials in regulatory texts, compulsory labeling,[Footnote 7] or voluntary codes of conduct defining general principles for ‘responsible research’ in nanoscience and nanotechnology (EC 2008). ‘Precautionary’ action is therefore a mix of science and politics. The process is scientific in that it is based on ongoing research into the potential hazards of nano-substances. Yet it is political in that it requires collective decisions about a still controversial situation. ‘Responsibility’ is then a component of a collective process, and is neither an exclusively ‘scientific’ nor an exclusively ‘political’ matter.

The experiments underway to find a scientifically trustworthy and politically acceptable path for the responsible development of nanotechnology ultimately shift the notion of ‘risk’ itself. Defining a public issue in terms of ‘risks’ supposes that there is a category of identified objects with measurable properties (including toxicological properties), and that public bodies are then responsible for appropriate initiatives. Yet in the case of nanotechnology, the situation is more complex. First, the identity of the objects in question is uncertain, as discussed above. Second, their toxicological properties are often relational rather than intrinsic, and once outside the lab, the set of possible relations profiling toxicity proliferates. Third, nanomaterials are a topic of discussion insofar as nanotechnology as a science policy program has become an important focus of policy-making and public debate. The need to respond to the “ethical, legal and social implications” of nanotechnology was raised early in the development of nanotechnology programs in France and the United States. Within these programs, the framing of public concerns in terms of health and safety risks was central, and structured the public treatment of nanotechnology alongside the ‘ethical issues’, for which specific forms of assessment were required (Renn and Roco 2006). This framing is not neutral. It separates ‘risks’ from ‘ethics’ (whereas questions related to the identification of risks are matters of value choices, as discussed above) and assumes that in nanotechnology public concerns are issues to be dealt with by specific forms of expert assessment, whether scientific (e.g. toxicology) or social scientific (e.g. ethics). The issues of risk in relation to nanomaterials are thus defined as such insofar as these substances form part of a policy program whose ‘impacts’ are expected to be taken into account.
Yet the uncertainty about the identification of nanomaterials, their potential developments and the way they should be treated in terms of public regulation, introduces doubt into this framing process, which then becomes a matter of public concern and, possibly, of democratic deliberation.

Deliberating public debates: from the deficit model to the value of dissent

If considered as an autonomous sphere distinct from society, science is not open to debate, or at best only at the margins. In particular, debates about scientific theories, the practices of researchers or the foundations of their work, are precluded. True, there are public discussions when, for example, science has immediate repercussions on the health of citizens. But this view implies that all forms of debate should remain restricted to a well-defined sphere (either ‘scientific’ or ‘social’), and that any crossover will have to be based on the provision of scientific, rational information to a public that is, in essence, presented as being primarily governed by emotion or feelings.

This discourse is based on the assumption that the public is incompetent on scientific and technological issues, and tends to lose interest in them, if it does not adopt a ‘technophobic’ attitude, based on partisan interests. This view of science-society debates has been described as a ‘deficit model’ (Wynne 1991) (contending that people reject science because they don’t understand it) or a ‘model of public instruction’ (Callon 1998) (in which the role of science is to educate an ignorant public). It is based on the assumption that scientists produce universal and objective knowledge, in contrast with common sense, which is purportedly grounded in subjectivity, irrational beliefs and superstition. Only by disseminating knowledge established by the former is it possible to dispel the fog in which the latter gropes. In the deficit model, the scientist is the bringer of light!

This view is very prevalent, of course, in the field of nanoscience and nanotechnology, where the use of the image of the infinitely small, related to the necessarily ultrasophisticated tools that provide access to knowledge, as well as the ubiquitous allusion to the mysteries of quantum physics, accentuates the idea that knowledge is completely inaccessible to ordinary mortals, and must therefore be left to the ‘knowledgeable’, who can at best try to explain it to the public through popular science. The following excerpt from the French national debate on nanotechnologies held by the national commission for public debate in winter 2009–2010 can be read as a typical illustration of this cognitive deficit model:

M. X (scientist CNRS): (…) To deny knowledge is to open the doors to ignorance, to all possible abuses and manipulations beyond. When you say we’re going to chuck them all over the place (author’s note: nanoparticles), where do you get this information? Who told you that?

Mme Y (member of the civil society organization Friends of the Earth): tires with carbon tubes, when they get worn… You have nanoparticles in walls and cements. I’m just reading what is written…

M. X: matter in nanoparticulate form has been around for centuries.

Mme Y: the problem is that we will do it on such a scale…

M. X: we are explaining that we will acquire a better understanding of what this material is made of and of its effects. You reject this understanding. It’s a strange position.Footnote 8

The ‘knowledge deficit’ argument not only always leads to a caricature of the interlocutor—the ‘layman’—but also fails to account for the breadth of public debate related to science and technology. As countless empirical studies have shown, scientific research is fueled by controversies (see e.g. Collins 1983), either internal to the scientific arena (e.g. Does the ether exist? Does quantum physics violate the principle of causality?) or pertaining to ‘non-scientific’ actors (e.g. What should we do with nuclear waste? Should we authorize genetically modified crops?). The proponents of NST programs recognize this explicitly, as they call for ‘upstream public engagement’ and ‘public dialogue’ with all ‘stakeholders’ in nanotechnology development.Footnote 9 Of course, the real meaning of such appeals is uncertain, and they may reflect an attempt to communicate a ‘positive image’ of responsible development while still maintaining the deficit model, as much as a desire for authentic public debate as a basis for regulatory decision-making. But this very ambivalence is a sign that nanotechnology has been associated from the start with concerns about public perception, and about ways of involving the public in the construction of the field.

These concerns have led to the organization of numerous public debates and discussions. Evaluating these initiatives is complex. On the one hand, they can be interpreted as ‘democratic experiments’, and potentially assessed on the basis of criteria such as the impact on decision-making or the number and diversity of participants. Yet on the other hand, they are part and parcel of the development of NST, in which the ‘public’ is perceived as potential consumers and users, whose acceptance and trust must be won (Wickson et al. 2010). As such, the normative definitions of ‘democracy’ that these initiatives embody cannot be taken for granted. Consider, for instance, the outcomes of a national public debate about nanotechnology held in France in 2009. This debate was widely described as a failure, since many of the public meetings were interrupted by so-called ‘radical’ opponents of nanotechnology, who refused to take part in the public discussion at all, arguing that it was merely a way to construct the acceptability of an otherwise unquestioned development program.

There are two possible readings of this episode. The first re-affirms the need for good communication and denounces the ignorance of the vocal opposition (a reading which fails to account for the fact that the opponents were in fact very well informed, much more so than the average non-vocal participant). The second interprets this episode as an experiment seeking, like many others, to make NST a public matter of concern. As with any experiment—including scientific ones—there are uncertainties about the process and the outcomes, which need to be interpreted in order to make sense of the experiment. For that matter, actions that do not follow the rules of the participatory process might be a sign that new experiments are needed to empower democracy in these contested territories. They might be a sign that interested groups are less interested in ‘risks’ and ‘impacts’ than in deeper issues related to the construction of research programs, or to the worldviews and values associated with the design of NST objects.

While the first perspective is content with stable and familiar categories (‘science’ and the ‘public’), the second questions both the production of knowledge and the construction of various public groupings, in ways that might not be consensual. Being ‘responsible’, in the first perspective, means ensuring the efficiency of information channels to a ready-made public. Being ‘responsible’, in the second, is much more uncertain; it pertains to the possibility of making controversies explicit and of testing ways to tackle them.

Proponents of the deficit model may lament the trend in science policy and be nostalgic for the somewhat mythical era when scientists were left in peace in their labs, and when the public was eager to learn from them. The ideal ‘public’ they wish for is constructed as a ‘pure’ and ‘disinterested’ citizen devoid of any political and moral attachments, in a mirror-like image of ‘pure’ and ‘disinterested’ science. However, this model will not help them to interpret the current state of the relations between science and society, nor will it restore the cultural supremacy of science. For far from denigrating the value of knowledge, accepting the idea that controversy is inherent to research is doubly productive. First, it contributes to the dynamics of scientific development and to the value of knowledge when it holds firm against every kind of test. Second, it is politically productive in that it permits the collective exploration of a variety of public decision-making options. It may even extend the notion of ‘responsibility’ to the procedures of democracy itself. In this sense, it emerges clearly that the main objective of a debate, when it occurs, is certainly not consensus.

Interpreting the 10 years of debates on nanoscience and nanotechnology is complex. To conclude that they have been a series of repeated failures would imply that their objectives were clearly defined. In the short term, they certainly did not produce any decisions, but in our view this was not their goal. However, they have probably provided an opportunity for many participants to form and express their positions. Their effects need to be evaluated in the long term. A debate always contains a dimension of unpredictability, and a successful debate is one that elicits new positions, lines of force, and even new oppositions. When a real situation arises, such as a new research orientation or an emergency, these new positions and oppositions will further help scientists and non-scientists to reach an informed decision and to feel involved in issues that broaden their original scope. In fact, it is not a deficit of knowledge that leads to decisions not founded on relevant information, but rather a deficit of debate. Decisions taken without such debate may turn out to be too hasty, if not simply wrong.

Moreover, involving groups concerned with science and technology (including research groups themselves) goes far beyond questions of health and environmental risk alone, or of individual safety. This issue of public engagement, with the resources and media it requires, is the subject of much research, especially in communication, education and mediation studies, but also in sociology. For instance, attendance at science museums, sales of science books, and the success of initiatives such as ‘Science Day’ illustrate the appetite of citizens for scientific and technological issues and their social context, and even for the philosophical questions that inevitably arise from scientific and technological research. Both the appropriation of knowledge and the changes in social practices induced by scientific and technological policies have already proved to be topics that can mobilize people strongly in areas such as GMOs, nuclear power, AIDS, or rare diseases (Epstein 1998; Callon and Rabeharisoa 2008).

However, there has to be some agreement on the purpose of these debates. This must be made explicit from the outset—even though it will inevitably be redefined during the deliberative process. And even if their purpose is limited to issues of risk, as is regrettably often the case, understanding the nature of risk and discussing the way in which risks are identified, acknowledged and addressed is, again, a valuable source of controversy that needs further exploration. Above all, it is an important ‘test-case’ for our democracies.

It also seems clear that there is no single ‘public’ constituting a global and stable entity. Multiple categories of people, with different competences, take part in public debates, and their legitimacy cannot be prejudged. In addition, one person may wear several ‘hats’, and may change hats in the course of the debate. For example, a scientist is also a citizen and a consumer. An activist is also a person who possesses curiosity, and a critic is also someone who possesses knowledge. A scientist (including an SHS researcher) is also partly the public of a scientist from another discipline (including SHS), as much as an active partner in interdisciplinary research. Every discipline has its own interface with a ‘public’ that includes researchers from other disciplines too. In short, the ‘public’ is not the general ‘otherness’ of research.

Finally, if one accepts the idea that research is both a social activity and a means of producing knowledge and action in our shared world, it must be recognized that an understanding of scientific matters in no way guarantees an understanding of the social, ethical, and economic issues associated with controversies about science and technology. There are forms of ignorance in science as in other areas, and they are widespread once we start digging a little… no one is omniscient! The aim of interdisciplinary research is not very different from that of public engagement: it is to produce as much overlap between zones of knowledge and (most importantly) zones of non-knowledge as is necessary for making sense of a shared problem (Stengers 2013).

To summarize, we could say, to paraphrase Paul Ricoeur, that “the idea of debate” is the essence of democracy, based on the public organization of dissent (Ricoeur 1991, pp. 166–167):

The first victory of democracies is the constitution of a public sphere of discussion (…) where multiple streams of opinions are confronted. (…) Democracy is not a regime without conflicts, but a regime where conflicts are open and negotiable. (…) Under this regime, conflict is no accident, disease or misfortune; it is the expression of the non-scientifically or dogmatically decidable character of the public good (…). Political discussion is without conclusion, although it is not without decision.Footnote 10

The ‘debate’ is not necessarily a fixed device. The possibility of expressing different positions is hugely precious and contributes to the pool of ideas. This kind of reflection and collective concern depends on the establishment of democratic instruments that structure discussions and exchanges, and on a reflexive evaluation of the forms of democracy thus produced. This is how democratic institutions and their philosophy work, and this kind of democratic policy is essential to ensure that decisions that may have to be taken in situations of uncertainty are informed, reversible, and amendable.

Conclusion

What have we discussed so far? We have shown that the production of knowledge in NST requires us to re-examine the relations between science and society. We have explored these questions with respect to issues of expert assessment and public debate. This enables us to return to the concept of ‘responsibility’ addressed in our introduction: how should responsibility be construed and assumed, what are we responsible for, to whom and to what extent, and, who is the responsible subject?

We have distinguished two approaches. In the first, ‘science’ forms an autonomous sphere independent of society at large; the responsibility of researchers is distinct from that of policy makers and industrialists. Policy makers are responsible for technological choices, while industrialists will be prosecuted if consumers, workers or the environment suffer from the negative consequences of new products on the market. In this dualistic vision of the relationship between science and society, everything seems quite simple. Researchers, in developing basic knowledge, are assessed by their peers and, like any human being, have their own individual ethical codes. The chain of responsibility here, like the chain of activities, is linear. Researchers are not responsible for the longer term consequences of their work and have no connection whatsoever with consumers.

This ideal chain of responsibility does not reflect the complex fabric of our shared world, which is made of relationships and interdependencies. Such a view is only a pale copy of a naive description of legal responsibility. Moreover, the normative framework of responsibility is far broader than its positive legal form. Even from a legal point of view, responsibility is a multifaceted object, covering civil, criminal or administrative liability and extending, under certain circumstances, to obligations of a more moral character. These various forms are not attached to successive actors in a linear chain, but are distributed in accordance with specific acts or practices in laboratories as well as in factories, by government officials and by consumers or users of the objects in question. To separate the research field from the normative framework of common responsibility is to assume that each person’s role in innovation processes is stable. But responsibility is no more linear than innovation. It in fact takes quite different forms if we consider liability or damage caused simultaneously to the environment, public health, the health of individual consumers, but also, perhaps, to individual human rights.

Moreover, reducing a scientist’s responsibility to individual moral awareness does not reflect the practical reality of life in a laboratory: today, laboratories are assessed on the basis of their ability to generate relevant knowledge through collective practice, but also on their ability to create objects that can enter the market, which is measured by the number of patent applications, contracts, partnerships, start-ups, etc. All these indicators reveal a close connection between research, economics, and society at large.

Responsibility is broader than individual morality (McCarthy and Kelty 2010); it is neither reducible to judicial liability (even if the normative judicial approach to the notion goes far wider than liability), nor to administrative accountability (even though accountability matters greatly for any organization and might be a significant condition of responsibility). Responsibility, in the etymological sense, is the power to respond to whoever is affected or concerned by one’s actions in relation to a third party. Culpability and responsibility are also two different things: “Responsable mais pas coupable!”, i.e. “Responsible but not guilty!” in English. Determining who is responsible is not necessarily to look for scapegoats. Responsibility is not a negative term: it refers to active engagement. It is the prior commitment to assume the possible incidence of adverse effects or damage—a commitment that expresses itself not only in words but in actions: to act responsibly is to promise. Confusing responsibility and culpability reduces the former to moral subjectivism, whereas responsibility pertains to an ethics of action, not to an ethics of intention or conviction (i.e. based on the intentions of the moral agents under consideration). Good intentions do not guarantee responsible behavior. “Responsible scientists” might be motivated by the best intentions while involuntarily contributing to shape unsustainable futures. Under this view, it is not the cause determining the action (i.e. the subject’s intention) that should be treated as good or bad, but rather its effects and conditions. As is often observed in considering the effects of technoscientific developments, these effects and conditions may include processes that far exceed the power of single individuals and which occur through them rather than because of them, which means that there is also a necessary collective dimension to responsibility.

Some would like to reduce the scope of responsibility to the level of the individual human being (conviction, conscience); others would like to limit it to a calculation (‘cost-benefit analysis’). However, the scope of responsibility is not limited to what can be predicted in a risk calculation or a cost-benefit analysis, especially in a field where social and technological uncertainties are so great. Moreover, what counts as ‘benefit’ for a scientist may not be the same as ‘benefit’ for a company, shareholder or consumer. This is why the criteria of any particular assessment should be deliberated in a context where a plurality of views and values is acknowledged and taken as a starting point. This is precisely why responsibility can extend well beyond the private sphere of the subject and the individual moral conscience. The history of science is full of examples of collective responsibility assumed by researchers, not to mention the fact that almost all research organizations now have ethics committees, which reflect the possibility of interpreting ethics and responsibility at a collective level. If we deny any collective dimension of responsibility, we are doomed to perpetuate the same chains of irresponsibility: “I, a researcher, have synthesized this nanoparticle, but I am not responsible for the way it is used by industrialists”…“I, industrialist A, incorporate this nanoparticle into a material and market it to industrialist B, who uses it in this end product. I am responsible for the health of my employees and for the safety requirements of the industrial process in my factory. But I am not responsible for the practices of industrialist B”…“I, industrialist B, sell this nano-enabled product, which had successfully passed all safety tests, but then, if a consumer uses it abnormally, I am not responsible. After all, s/he had freely chosen to buy this product.” The buck is always passed on to the next step in the chain. 
Assigning a collective dimension to responsibility does not mean stigmatizing this or that group: it means stimulating shared concerns and building a common world through care and attention to what goes on between our different fields of expertise. Precisely because NST is characterized by the production of devices that are generic (not necessarily made for this or that specific purpose, but potentially for many) and transverse (not necessarily confined to application within a specific industrial sector), it must be backed up by an attitude of responsibility that acknowledges its collective dimension.

Whatever a researcher’s field of practice (science or technology, basic or applied research, etc.), that practice can be interpreted in terms of a social delegation of rights (the right to be compensated for this work) and therefore as a delegation of power (the power to perform this activity). For that reason, it logically entails a particular responsibility (to be accountable to those who delegate and bestow this power and those rights). It is precisely because society delegates power to the scientist that s/he must make good use of it. In the case of NST, research activity has an undeniable technoscientific character (it explores the technologically possible). Moreover, it takes place in an increasingly technological society in which, paradoxically, people have less and less technical mastery over their day-to-day material environment. This is why it is not too much to say that the power entrusted to NST research by the rest of society is nothing less than power over matter.

Therefore, the way responsibility is framed in ELSI programs (‘Ethical, Legal and Societal Implications’ or ‘Impacts’) can hardly be considered sufficient from an ethical point of view. Although anticipating impacts means accounting for the consequences of actions, ELSI research cannot be considered an instantiation of consequentialism in the ethical sense. This specific ethical theory assumes that the moral quality of actions has to be evaluated on the basis of their consequences rather than on the intentions of the agent or the action itself. However, it requires taking uncertainty into account instead of making decisions on the basis of calculable risks only. According to the philosopher Bernard Williams, ethical responsibility for the consequences has to be extended to the realm of “moral luck”, that is, to circumstances where the course of the action to be evaluated depends on factors that are beyond the control of the moral agent (Williams 1982). Such factors might include not only knowledge uncertainties and development uncertainties, but also values and representations.Footnote 11

A possible objection is that such an extension of responsibility would ultimately lead to a dilution of responsibility. If the researcher becomes responsible for everything, is s/he still responsible for anything? One may well imagine how such a researcher, having no more time to conduct his/her research, might cease to be responsible for anything. However, this is not a realistic view: in principle, yes, a researcher is responsible to society as a whole. In practice, his/her duty is to make this excessive responsibility ‘do-able’ by applying it in domains where it could matter and make a difference. Between the necessity to extend the sphere of responsibility and the actual ability to exercise responsibility, there is certainly a tension, but not a contradiction. If every human being is theoretically responsible to Humanity, to have a responsible attitude means choosing to apply responsibility in one specific area and not another. This could mean discussing laboratory work in a social context, having a say on possible applications, on their value, their relevance, their utility, their economic ownership, etc. The more researchers accept their responsibility for the particular innovation they help to bring into existence, the more far-reaching will be their commitment.

Finally, researchers are also responsible for the cultural representations and values that they convey in their day-to-day activities. What representations of human nature, technology, society does their practice express? In this, more abstract, domain, the researcher’s responsibility may be primarily to cultivate and exercise thought and to express it publicly—an observation which also applies to SHS researchers who take science and technology as their object of study.