In this chapter I take the reader deeper into the conceptual space of RNA interference (RNAi), the novel biotechnology briefly encountered in Chapter 3. Here, I examine its adoption as a research instrument at the Sea Lice Research Centre for screening genes by gene silencing in search of new therapeutic targets.

First, I sketch some developments in molecular biology from research on the diverse class of molecules known as RNA, focusing on the so-called “microRNA” (miRNA). At first glance, this historical context may seem out of place, given my preoccupation with the minutiae of experimental research at the SLRC as instances of distributed cognition. However, as my analysis makes clear, these episodes from the recent history of science cast light on novel modes of iterative knowledge production in biology. Through an anthropology of knowledge about RNAi, I address how this technology was co-opted and translated for research on salmon lice by Nilsen’s group, as it dovetailed with the trajectory of their experimental system. My goal is not to paint the full picture of a technically complex field, but to sample historical and ethnographic cases that illustrate a continuously changing scientific landscape, and the material culture and modes of practical reasoning used to traverse it.

I then turn to historical and philosophical work that has identified a poor fit between the kind of research practices that characterize microRNA and RNAi research, and conventional stories about how experiments contribute to the growth of knowledge. In particular, I draw on the “New Experimentalists” and their descendants, and arguments for the centrality of practice and materiality, rather than theory, in experimental science. As part of a broader “practice turn” in science studies, these orientations illustrate how ‘reverse vaccinology’ through RNAi in the science of salmon lice did not just entail adoption of new methods, but introduced novel cultural practices of cognition, which had epistemic consequences on the level of the experimental system. Using ethnographic examples, I suggest these developments in salmon lice research can be productively analyzed under the rubric of “exploratory experimentation” (Burian, 1997, 2007; Franklin, 2005; O’Malley, 2007; Schickore, 2016; Steinle, 1997, 2016; Waters, 2007). This concept describes a set of open-ended research practices that do not easily map onto the conventional hypothesis-centered account of scientific experimentation. The interplay between domesticated lice strains, incubators, single-tank systems, RNAi, and a suite of associated technologies from biochemistry to bioinformatics, was epistemologically productive because it enabled a range of epistemic pursuits, including “technology-oriented research” and “question-driven inquiry” (O’Malley et al., 2010). As I hope to make clear, not every act of experimentation is for testing hypotheses, making predictions, or settling the highly specific research questions associated with the “Hypothetico-Deductive Model.” Hypothesis-driven research of this kind is usually reserved for tightly delineated and regulated research contexts. It is therefore a poor descriptive model for the kind of open-ended, multidisciplinary approach to molecular parasitology that was carried out at the SLRC.

In examining these developments, I explore relations between scientific concepts and material culture through a distinct variation on the anthropology of knowledge that Roepstorff and Frith have described as “experimental anthropology” (2012). In this case, “going experimental,” as they dub it, does not refer to a method or research aesthetic, but implies that I take as my object of study the cultural practice of scientific experimentation, and approach it as an activity of joint meaning-making. This entails taking seriously the technical minutiae and “emic” accounts of central scientific concepts. I must, invoking Ludwik Fleck’s words (1979), examine the experimental “thought style” (Denkstil) and “thought collectives” (Denkkollektive) of contemporary biologists. This requires close analysis of criteria used by these collectives for assessing the validity of knowledge, their assumptions about why certain pursuits are valuable, necessary, and productive, as well as attention to how knowledge gets transformed through active engagements in the lab.

Screening Salmon Lice

On the third floor of the High-technology Centre, next to the water cooler and a small plaque informing visitors they are entering the SLRC, a Centre for Research-based Innovation funded by The Research Council of Norway, hangs a large poster. On the poster is a diagram that represents the Centre’s complex workflow, or “pipeline.” This will “facilitate development of new methods for lice control and shorten the time from basic research to new products and tools for parasite control in the aquaculture sector to achieve a true integrated pest management in the future.” In the preceding chapters, we have seen how key elements, such as lice strains, hatcheries, and single-tank arrays, were put to work in the search for therapeutic targets with the adoption of a relatively new biotechnology called RNA interference (RNAi). RNAi falls under the purview of “functional genomics.” This approach to the complexity of life aims to understand relations between genotypes and phenotypes by investigating transcription, translation, and regulation of genes to answer where and when these are expressed in the organism. This includes how the expression of genes differs in cell types and cell states, their functional roles in cellular processes, the interaction between genes and gene products, and how gene expression changes according to environmental factors (Fig. 4.1).

Fig. 4.1

Rendition of the Centre’s pipeline for discovery

The diagram depicts a multistage process where knowledge derived from the lice genome is used to identify candidate genes for RNAi screenings. It also marks a series of decision points dependent on the epistemic outcomes of each preceding step, such as “phenotype assessment” and “drug-target evaluation.” While this depiction suggests simplicity and linearity in the process of advancing from experiment via data to therapeutic application, the scientists working in this field are well aware of the intricacies obscured by such salient representations. They know that data production in contemporary biology is “out of sequence,” messy, and contingent (Stevens, 2013: 108). The common sense intuitions described by David Hume as humankind’s “original stock of ideas,” which sustain our potential for knowledge production, evolved for active sensemaking in the medium-sized niche that humans are accustomed to (Atran, 1990). When we enter the world of molecular mechanisms like RNAi, these dispositions do not always serve us well. Our species cannot see biological macromolecules, like genes and proteins, with the naked eye. Nor can we interact with them with our bare hands, meaning that any relationships we have to such entities are necessarily mediated and enacted through material artifacts and representations.

SLRC’s novelty lay in the application of RNAi to conduct “screens” for candidate gene targets. In this context, a screen is an experiment performed to assess the contribution of a particular gene to the organism’s phenotype, which helps determine whether there is potential for pursuing further research on the candidate that could result in effective commercial vaccines or other therapeutic biomolecules. RNAi screens are supported by high-throughput technologies, such as genome sequencing, microarray analysis, and RNA sequencing. RNAi screening is a form of bioengineering practice known as “reverse genetics,” where sequences of DNA or DNA products (such as mRNA molecules) are disrupted or altered so their systemic effect on particular molecular pathways can be observed, either at the cellular level or the level of the “whole” organism. Reverse genetics marks a contrast with the “forward genetics” of classical genetics. Reverse genetics looks at the phenotypes that result from changes to specific sequences of genes. In contrast, forward genetics looks for the genetic origins of traits by irradiation, chemical alteration, or insertional mutagenesis caused by jumping genes (or transposons), sequences that may change position within a genome.

Biologists tell us that RNA interference is an ancient phenomenon, over 1.5 billion years old. Eons before humans elucidated the biological processes that would later be unified as the “RNAi mechanism” in the late 1990s, eukaryotic organisms evolved a tiny molecular machinery. This protected their hereditary material against attacks from harmful genetic elements, such as viruses and transposons. Like many other biotechnologies today, RNAi has a double nature. In one sense, it is an active cellular mechanism that has evolved in a vast number of living things. In another, it is domesticated and applied as a commercial technology, firmly entrenched as a staple ingredient in the material arrangement of numerous laboratories and experimental systems across the planet.

How did RNAi transform from a product of natural selection to one of cultural selection, or to use Rheinberger’s concepts, morph from an epistemic thing in fundamental biology to a productive technical object in the applied science of salmon lice? To appreciate this transformation, we must first examine the role of RNA molecules more broadly, including research into cellular processes that were first considered to be of minor interest, but turned out to be profoundly important.

RNA Basics

Ribonucleic acid (RNA) is a remarkable polymeric molecule that serves many different biological functions inside the cells of all known organisms. Alongside DNA (deoxyribonucleic acid), its more famous relative, RNA is one of the essential macromolecules for life as we know it. The molecule takes many forms, but the most familiar, which is taught in high-school curricula, is its role as the messenger molecule, a substance capable of storing information transcribed from double-stranded DNA by the RNA-polymerase machinery into an intermediate, single-stranded form known as “messenger RNA” (mRNA). This sequence of nucleotides is then translated into a protein, a chain of amino acids, by tiny molecular entities known as ribosomes and an adapter molecule, transfer RNA (tRNA). In eukaryotes, this process takes place in the cytoplasm of the cell. This cascade of molecular events, which results in the formation and modification of proteins, is known as gene expression and it is fundamental for living things. Francis Crick elevated this one-directional traffic of information from DNA via RNA to protein to the status of the “Central Dogma of molecular biology.”1
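For readers who prefer a schematic rendering of this flow, the toy sketch below walks through it in code: a short DNA template is transcribed into mRNA by complementary base pairing, and the mRNA is read codon by codon into a chain of amino acids. The sequence and the truncated codon table are invented for illustration only and gloss over the real biochemistry.

```python
# A minimal, illustrative sketch of the "Central Dogma" flow described above:
# a toy DNA template is transcribed into mRNA and translated into protein.
# The sequence and the truncated codon table are invented for illustration.

CODON_TABLE = {"AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP"}

def transcribe(dna_template: str) -> str:
    """Transcribe a DNA template strand (read 3'->5') into mRNA by base pairing."""
    pairing = {"A": "U", "T": "A", "G": "C", "C": "G"}
    return "".join(pairing[base] for base in dna_template)

def translate(mrna: str) -> list[str]:
    """Translate mRNA codon by codon until a stop codon is reached."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

mrna = transcribe("TACAAACCGATT")    # -> "AUGUUUGGCUAA"
print(mrna, translate(mrna))         # -> AUGUUUGGCUAA ['Met', 'Phe', 'Gly']
```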

The molecule also comes in other flavors, such as transfer RNA (which carries amino acids during protein synthesis), ribosomal RNA (which combines with protein to form ribosomes), and small nuclear RNA (which processes mRNA into its mature form in eukaryotes). RNA molecules are synthesized in cells as single RNA strands but have the biochemical ability to base-pair with themselves and other RNAs, forming secondary and tertiary structures. RNA molecules are also classified by their size (“long” or “short”) and on the basis of their origins and mechanisms of operation. Molecular biologists have demonstrated how RNA molecules are central for regulating gene expression in cells. Proteins are usually not synthesized unless needed for a biological purpose, since constant synthesis would be highly inefficient. Cells are therefore equipped with tiny mechanisms ensuring that not every protein that is potentially encoded in the genome gets synthesized all the time.

Biologists used to believe that gene regulation was achieved by proteins, complex polymers that twist and fold into a bewildering variety of shapes and can act as catalysts (enzymes) for a multitude of chemical reactions (see Myers, 2015 for an ethnography of protein research). Details about key mechanisms of action in genetic regulation of hereditary material were famously elucidated in work by the 1965 Nobel laureates François Jacob, André Michel Lwoff, and Jacques Monod. When Jacob and Monod proposed their famously elegant lac-operon model of gene expression four years earlier, using E. coli as their model system, it was not yet clear whether gene expression was regulated by proteins or RNA, although the two were convinced that RNA was the main regulatory molecule. But as narrated in a popular textbook, the notion that RNA governed gene expression was “largely forgotten as more and more protein regulators were found in both prokaryotes and eukaryotes” (Watson et al., 2014: 701). Still, considerable research on newly discovered regulatory molecules composed of RNA had accumulated by the mid-1990s. The idea that RNA could catalyze its own replication and synthesize other RNA molecules even paved the way for an influential hypothesis about life’s origin, articulated by Nobel laureate Walter Gilbert (1986). In an ancient “RNA world,” the molecule began acting as a self-replicating entity well before DNA evolved to become the central genetic material in organisms, with RNA only later assuming its familiar role as the messenger molecule, mediating between DNA and its protein products.

MicroRNA: Converging on Biology’s Dark Matter

A massive research effort in molecular biology has since been directed at the complexities of a relatively newfound class of nucleic acids with noncoding functions, known as microRNA (miRNA). The first “glimpses into a tiny RNA world” came from the Boston region three decades ago. Victor Ambros and Gary Ruvkun worked together in the 1980s as postdoctoral researchers in H. Robert Horvitz’s molecular genetics lab at MIT (Ruvkun, 2001; Ruvkun et al., 2004). Their organism of choice, the nematode C. elegans, became a favored model system for studying general principles of developmental regulation after Sydney Brenner initiated the Worm Project in 1963 to map and describe the developmental lineages of all the roughly one thousand cells in this transparent, millimeter-long worm, which has a 3.5-day life cycle (Ankeny, 2001). Ambros and Ruvkun were descendants of this highly successful research program (O’Malley et al., 2010), which landed Horvitz, Brenner, and John Sulston a 2002 Nobel Prize for breakthroughs in “genetic regulation of organ development and programmed cell death.”

Ambros and Ruvkun studied gene expression in mutant cell lineages to understand “heterochronicity,” the timing of when cells transition between different life stages. They were focusing on features of a mutation (e912) in a gene known as lin-4, which caused developmental defects that made the animals look deformed by reiterating extra larval stages, as well as the gene lin-14, which produced the LIN-14 protein, keeping cells in their larval state.2 Further work on cell lineages suggested that lin-4 and lin-14 were part of a larger developmental switching system: “the same cell lineages that reiterated early programs at later larval stages in lin-4(e912) animals instead completely deleted their entire early larval programs in animals lacking lin-14” (Lee et al., 2004: 89). When Ambros and Ruvkun left MIT to establish separate laboratories, at the Massachusetts General Hospital and Harvard respectively, they continued to investigate the complex details of this relationship. It was known that production of LIN-14 protein after the first larval stage led to the arrested development of adult cells and yielded sterile specimens that did not reach adulthood.

By 1987, it was clear that lin-4 activity decreased the abundance of LIN-14 protein while the lin-14 mRNA lingered in the cell. This indicated a post-transcriptional mechanism at work, which at the time was assumed to be predominantly caused by proteins controlled by genes, in conformity with the “protein orthodoxy” (O’Malley et al., 2010). In 1989, evidence from Ruvkun’s group showed that activation of lin-4 somehow turned off production of LIN-14 by blocking translation of the mRNA, rather than preventing its formation as would be expected. Probing further into the regulatory relationship between these two genes over the following years, Ruvkun’s lab found conserved sequences in a particular region of the mRNA responsible for downregulating LIN-14, and these sequences were suspected to contain the elements through which lin-4 acted (Lee et al., 2004: 90). The two labs then shared data hoping to learn more, with Ambros’ group exchanging lin-4 sequences for Ruvkun’s data on lin-14. On June 11, 1992, both investigators noticed a remarkable coincidence, and when Ambros called Ruvkun “each of them read the complementary sequences to the other over the phone, practically in unison” (Lee et al., 2004: 91), confirming a partial alignment between lin-4 RNA and noncoding sequences in the lin-14 mRNA.

With new information at hand, the groups unpacked these surprising relationships, building a strong case for a more direct interaction between lin-4 RNA and the lin-14 mRNA. Importantly, Ambros’ lab showed that lin-4 did not produce a regulatory protein as first suspected. Instead, it yielded a very short strand of RNA, roughly 22 nucleotides in length, in addition to a longer RNA of around 70–80 nucleotides. The gene did not code for a protein at all, which was puzzling: what functions could such an oddball molecule serve? Working from a different angle, Ruvkun’s group made the case that seven short stretches, around 20–22 nucleotides long, in the so-called “3-prime untranslated region” (3’-UTR) of lin-14’s mRNA paired with lin-4 RNA, albeit imperfectly. These surprising results were published in 1993, back-to-back in two papers in the prestigious journal Cell. But despite the new vistas opened up by this research, the findings did not “trigger a goldrush,” as the insights were “novelty rather than a harbinger” (Ruvkun et al., 2004: 96). Furthermore, the representational scope of these observations appeared limited to C. elegans or was, at best, generalizable to other Nematoda, thereby pointing to a minor phenomenon.

But the perception that these findings were trivial changed seven years later. In 2000, a second short RNA was detected in genetic analyses of the same heterochronic pathway in C. elegans. This RNA, let-7, also caused cell arrest at the larval stage, despite a diminutive stature of only 21 nucleotides. But the bigger story about a tiny RNA world, one that would radically change the science of gene regulation, came together when evidence from bioinformatic databases showed that let-7 had clear phylogenetic relationships to genes coding for small RNAs in the genomes of Drosophila and even humans, eventually showing up with homologues in sequences from a range of other organisms. This was a major discovery. A radically new type of general, highly conserved, and influential regulatory mechanism for gene expression, spanning biological kingdoms, had been uncovered. The term “microRNA” (miRNA) was popularized by Gary Ruvkun in a 2001 commentary in Science, appearing alongside three groundbreaking papers on these mechanisms: “tiny RNA genes may be the biological equivalent of dark matter - all around us but almost escaping detection” (2001: 799). Today, thousands of miRNAs, which fold back onto themselves to form “hairpin” structures, are known to subtly influence gene expression. While some regulate cell development and homeostasis, others protect against viruses and transposons. It is to this latter category of regulatory elements we now turn.

RNA Interference

As the microRNA puzzle came together, different properties of RNA were also explored by other scientists. In the 1980s, research had uncovered the molecule’s ability to regulate gene expression by binding with complementary target RNA, through molecules known as “antisense RNA.” But RNA held other secrets. Textbook accounts of the process later known as RNA interference often start with some serendipitous results in molecular genetics from Richard Jorgensen and Carolyn Napoli. Working for a now-defunct transgene agribusiness company, the two were designing ornamental petunias. Eager to learn more about the enzymatic pathway that makes the flowers intensely violet, they introduced an exogenous gene into the plant, but their intervention did not deepen flower coloration as predicted. Instead, the exposed plants had scattered pigmentation, and some were entirely white. This suggested that some unknown effect was “cosuppressing” both the endo- and transgene (Napoli et al., 1990). But while their observations were certainly interesting, they could not offer a sensible causal explanation. Nonetheless, publications of similar cases in plant systems soon began piling up. Since these cosuppression events also resulted in degradation of RNA after transcription, the phenomenon was rebranded “post-transcriptional gene silencing.”

Soon, documentation of analogous processes emerged from other species. Studies on the model fungus Neurospora crassa described how exogenous gene sequences impaired expression of endogenous genes, an effect that was called “quelling.” At Cornell University, Kenneth Kemphues and his graduate student Su Guo made similar observations in animals, when they injected antisense RNA into C. elegans while studying a gene called par-1. “Antisense” RNA is complementary to the “sense” strand of the messenger RNA which is translated into a protein. In line with the reigning model of “antisense” interactions, Kemphues and Guo figured that injections would halt gene expression, since hybridization between RNA sequences (complementary binding) should effectively inhibit translation. Surprisingly, they got the same results in both experimental and control conditions, undermining their predictions. Since the RNA injections in the control were not complementary, and thus could not bind to the mRNA transcript, some unknown process had to cause their strange results. “Identification of par-1 gene by injecting in vitro-transcribed anti-sense RNA” was first published in the Worm Breeder’s Gazette (13(3): 24, June 1, 1994), and disseminated in Cell only later. The Gazette was an early precursor to bioinformatic databases, promoting an ethos of cooperation and open data. Its content was based on quick presentations of new results and methods in a digestible format, to be treated as personal communications and not citable without the author’s permission.

Amidst these developments, the molecular biologists Andrew Fire and Craig C. Mello directed two different research groups working on DNA transformation in C. elegans, using “clever” new methods for microinjections as part of their experimental systems (Mello, 2008). Mello had trained on the worm under David Hirsch’s supervision at the University of Colorado in Boulder in 1982. When Hirsch left to join the biotech industry, Mello moved to the laboratory of another alumnus of Hirsch’s lab, Dan Stinchcomb, at Harvard. In Boston, Stinchcomb shared facilities with Victor Ambros (of microRNA fame), and both supervised Mello in their Wormlab. Years later, Mello learned about antisense technology and RNA injections from Kemphues and Guo. He decided to apply the technique in his own research at the University of Massachusetts. Andrew Fire was researching the same phenomenon from his lab at the Carnegie Institution of Washington’s Department of Embryology. Fire became interested after data on the worm’s response to RNA-triggered gene silencing from other labs “came together” in informal discussions at a heavily attended C. elegans meeting, organized by Mello in 1997 (Fire, 2007: 203–204).

Fire’s group had long worked on unc-22, a favored gene he came to know during a fellowship in the mid-1980s at the Medical Research Council Laboratory of Molecular Biology in Cambridge (England). At this time there were discussions in the worm community about whether a fraction of double-stranded RNA (dsRNA) was causing the observed gene silencing. Indications pointed to a relatively stable material whose effects persisted over days. And dsRNA, which is more stable than its single-stranded variety, was a well-known contaminant in RNA synthesis, since the molecule can form double helices by folding and pairing with itself at complementary sites. Since C. elegans was a flexible and accommodating experimental system, “virtually any biochemical sludge could be concocted and injected into a worm, with a very rapid (and in most cases quite specific) assay at the end” (Fire, 2007: 204). It was therefore convenient for one of Fire’s technicians, SiQun Xu, to synthesize double-stranded RNA targeting unc-22, a gene involved in muscle function; injection produced a condition where the worm twitched strongly, even with minuscule amounts of RNA. Using a technique called in situ hybridization, Mary Montgomery from Fire’s group also demonstrated remarkable efficiency of RNA-initiated downregulation of the gene mex-3 in embryos. In Mello’s lab, a graduate student named Sam Driver was rehearsing micro-injections of dsRNA into the nematode under Mello’s tutelage, but he accidentally botched several injections. These ended up in the worm’s body cavity instead of the targeted germ cells. To the team’s surprise, even misplaced injections yielded significantly downregulated phenotypes.

These systemic effects were deeply puzzling in light of the “antisense” model, and within a year, Fire, Mello, and their co-workers executed a series of experiments that probed these issues further, summarizing their results in a five-page letter in Nature (Fire et al., 1998). “Potent and specific genetic interference by double-stranded RNA in Caenorhabditis elegans” drew six conclusions. First, double-stranded RNA was far more effective than single-stranded RNA for reducing gene function. Most likely, previous assays introduced double-stranded RNA unintentionally, an artifact that would unify disparate observations made by other research groups. Secondly, the silencing effects were specific for mRNA sequences homologous to the injected dsRNA, as other mRNAs were unaffected. Thirdly, the mechanism was likely post-transcriptional, meaning that a mature mRNA sequence was required (neither introns nor promoter sequences triggered downregulation). Fourth, the target mRNA was somehow degraded in the cell. Fifth, only a few molecules of RNA were needed to manifest an effect. And finally, the effects could spread systemically to other tissues and silence target genes in progenies.

Mello had already relabeled this phenomenon “RNA interference” (Fire, 2007: 203), since “antisense” was a misnomer. Similar effects were also caused by “sense” strands of RNA, and their work had a potential link to gene silencing reports from other organisms. This pointed to a significant evolutionary story, although the exact pathways were unclear: “Whatever their target, the mechanisms underlying RNA interference probably exist for a biological purpose. Genetic interference by dsRNA could be used by the organism for physiological gene silencing. Likewise, the ability of dsRNA to work at a distance from the site of injection, and particularly to move into both germline and muscle cells, suggests that there is an effective RNA-transport mechanism in C. elegans” (Fire et al., 1998: 810).3

More investigations followed (Fire, 2007; Mello, 2007). Lisa Timmons, from Fire’s lab, modified E. coli to produce double-stranded RNA, which she fed to the nematodes. This unspecific treatment also caused interference. In Mello’s lab, Hiroaki Tabara simply soaked larvae in a double-stranded RNA solution to elicit the interference response. Soon, more evidence that the mechanism was operating at the post-transcriptional level came from Fire’s group, and a mechanistic model was proposed. Likely, a protein complex mediated between the injected RNA and the target mRNA molecule. An evolutionary conjecture proposed that this response was part of a defense mechanism against viruses. Within a year, gene silencing by dsRNA was confirmed in a broad range of organisms, suggesting that the system evolved in a common ancestor over 1.5 billion years ago.

More biochemical features of RNAi were uncovered through work on in vitro cell cultures from Drosophila melanogaster (Hammond et al., 2000; Zamore et al., 2000). RNAs between 21 and 23 nucleotides long were found to accompany the interference effect, with double-stranded molecules being processed into shorter, intermediary types that bonded to homologous mRNA targets and guided their cleavage. These shorter, processed molecules guiding the cleavage of mRNA transcripts were labelled “short-interfering RNAs” or siRNAs (Parrish et al., 2000). How these cellular events were directed was understood in 2001, when the small RNA pathways were shown to be governed by a “common processing machinery that generate guiding RNAs that mediate both RNAi and endogenous gene regulation” (Grishok et al., 2001: 23), offering decisive proof of a relationship between microRNAs and RNA interference.

Later models added a dsRNA endonuclease named DICER, an enzyme that cleaves double-stranded RNA molecules into smaller fragments, one of many actors in a longer molecular cascade involving the RNA-Induced Silencing Complex (RISC). Bioinformatic analyses showed that this protein complex contained an evolutionarily conserved class of endonucleases known as ARGONAUTE, which was identified across phylogenetically distant taxa. Endonucleases are enzymes that cleave the phosphodiester bonds that tie together nucleotides in DNA (deoxyribonucleases) or RNA (ribonucleases). ARGONAUTE binds different small RNAs in binding pockets in its three-dimensional structure, and the small RNA molecules appear to guide ARGONAUTE to target mRNA transcripts matching their sequence for either silencing or destruction. As evolutionarily conserved proteins, these are involved in both the miRNA and RNAi pathways in many species, giving a unified account of a range of phenomena (Winter et al., 2009). A wealth of work has since characterized the biogenesis of these intricate, molecular machines (Fig. 4.2).
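To make the logic of this cascade concrete, the toy sketch below mimics its overall structure in code: a long double-stranded RNA is “diced” into short guide fragments, and the target mRNA is cleaved wherever a guide finds its complementary site. The sequences, fragment length, and function names are invented for illustration and do not model the actual biochemistry of DICER, RISC, or ARGONAUTE.

```python
# Toy illustration of the RNAi cascade described above: "dicing" a dsRNA into
# short guide fragments and cleaving a matching mRNA wherever a guide pairs
# with a complementary site. Sequences and fragment length are illustrative.

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence."""
    pairs = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in reversed(rna))

def dice(strand: str, size: int = 21) -> list[str]:
    """Cut a long RNA strand into short 'siRNA-like' guide fragments."""
    return [strand[i:i + size] for i in range(0, len(strand) - size + 1, size)]

def silence(target_mrna: str, guides: list[str]) -> list[str]:
    """Cleave the target mRNA at every site complementary to a guide."""
    fragments = [target_mrna]
    for guide in guides:
        site = reverse_complement(guide)   # the stretch of mRNA the guide pairs with
        fragments = [piece
                     for fragment in fragments
                     for piece in fragment.split(site) if piece]
    return fragments

gene_mrna = "AUGGCUUACGGAUCCGAUAACGUACGGAUUCCGGAAUCCGUACGAUCGAUAA"
antisense = reverse_complement(gene_mrna)   # the second strand of the dsRNA trigger
print(silence(gene_mrna, dice(antisense)))  # the transcript is chopped into remnants
```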

Fig. 4.2

A simplified diagram of the RNAi pathway

Reception

In their December 2002 issues, both Nature and Science declared RNAi among their Breakthroughs of the Year. The journalist writing for Science framed the story as follows: “Just when scientists thought they had deciphered the roles played by the cell’s leading actors, a familiar performer has turned up in a stunning variety of guises. RNA, long upstaged by its more glamorous sibling, DNA, is turning out to have star qualities of its own” (Couzin, 2002). RNAi’s ability to initiate gene silencing promised to shed light on the complexities of genomic regulation in specific model organisms as a tool for downregulating different candidate genes and assessing their functional consequences.4 But it also promised more, as the silencing mechanism could potentially be harnessed for discovery and rapid validation of drug targets in human medicine. It also arrived with great timing, as massive amounts of genomic sequence data were being produced at an increasing rate, and RNAi offered a simple and reliable method for assessing specific genes. Even more enticing, RNAi could possibly work as a therapeutic in its own right, by silencing a gene required for viral reproduction or a gene that a tumor needs to grow. Since many diseases are caused by problematic gene activity, RNAi could possibly block harmful genetic pathways. And before long, RNAi entered the public imagination as a potential panacea for many diseases.

RNAi was especially promising for diseases where known drug targets were difficult to reach by other molecular pathways. It could also potentially block cascades of gene expression in disease at the level of RNA, instead of the protein level, where most conventional therapeutics work. When the Nobel Prize in Medicine was awarded to Fire and Mello in 2006, belief in RNAi’s translational potential skyrocketed.5 In the words of one analyst, RNAi therapeutics was like “stopping the flood by turning off the faucet instead of mopping up the floor” (Haussecker, 2008: 452). Technically speaking, it offered a chemically homogeneous pathway with many applications, which gave a competitive advantage compared to pharmaceuticals based on chemically diverse target molecules that could be prohibitively expensive and difficult to commercialize. Since RNAi overlapped considerably with the miRNA pathway, there were also hopes of synergies between research on both systems. RNAi therapeutics had many attractive features for small biotech companies and Big Pharma alike. Notably, Merck acquired Sirna Therapeutics in 2006 (a deal valued at 1.1 billion USD), and Roche entered a historically costly licensing deal with the RNAi pioneers at the company Alnylam, a de facto gatekeeper for RNAi therapeutics which possessed disputed patent rights. Despite its dependence on advanced scientific breakthroughs, application of RNAi as a technology offered low technical barriers, since dsRNA synthesis was both easy and affordable. RNAi was also a hot topic among academics, suggesting that high-risk projects could be outsourced to academic laboratories, instead of tying up in-company biomedical researchers (Haussecker, 2008: 452).

Despite these optimistic projections, more sober expectations for RNAi inevitably followed, as hype met the nitty-gritty reality of translational science (Haussecker, 2012; Krieg, 2011). Enthusiasm had been excessive, and after an initial period of sensationalism, the belief in a swift realization of its translational potential faded. As with other biotechnological frontiers like gene therapy, the technology saw great financial volatility. In particular, the delivery challenge, getting RNA fragments into the right cells, proved a bigger obstacle than first assumed. Technology development also faced a backlash during the financial crisis of 2007–2008. In one high-profile case, biotech giant Roche decided in late 2010 to shut down its entire RNAi platform, in which it had invested some 500 million USD. Other pharmaceutical giants like Pfizer, Merck, and Abbott also terminated their RNAi portfolios, despite the enticing technoscientific imaginaries that had fueled investment in these clinical pipelines during the gold rush.

Still, despite a long and bumpy journey, clinical development of RNAi therapeutics continued steadily, with less hype (Bobbin & Rossi, 2016; Haussecker & Kay, 2015). The Scientist, for example, predicted a “Second Coming” of RNAi within a decade, despite an “era of doubt and despair” having replaced the “era of irrational exuberance” (Bender, 2014). This prediction was correct, as better modalities for drug delivery in the liver, for example, paved new paths toward clinical development. Eventually, drug makers reentered the field of RNAi-based therapeutics through new investments (Haussecker, 2018). By 2020, several compounds had moved past Phase III trials and were approaching the market. While its commercial potential remained untested, RNAi pharmaceuticals were among the best-performing stocks in 2019, leading one CEO to confidently assert that “RNAi has got its sexy back” (Dunn, 2020).

RNAi and the Science of Salmon Lice

I now turn to how this novel biotechnology was instrumentalized as a technical thing in the science of salmon lice. Building on work on epistemic practices known as “exploratory experimentation,” I argue that conventional models of experiment, which see knowledge as mainly progressing through “hypothesis-driven” research, do not adequately capture the cognitive ecology of RNAi-based molecular parasitology at the SLRC.

According to a perceptive cognitive-historical analysis by Sung (2008), the elucidation of RNA interference began with an “anomaly” in molecular genetics. The reigning model of gene expression, including antisense-RNA, implied that interventions with double-stranded RNA should have little effect, since these molecules were already hybridized. When these molecules caused gene silencing in C. elegans and other organisms, there was no alternative explanation for the resulting anomalies. Detection and resolution of these anomalous outcomes confronted experimental biologists with a unique problem-space, spawning several conceptual revolutions in the science of gene regulation. Sung’s analysis builds on the assumption that science, like other creative pursuits, operates through embodied meaning construction known as “conceptual integration networks” or “conceptual blending” (Fauconnier & Turner, 1998). These cognitive dynamics elucidate the human capacity to integrate information from different domains and fashion new ideas from the resulting blends. In this view, language does not just represent meaning, but provides prompts for meaning construction in specific contexts, based on a repertoire of cognitive and material resources, cultural models, and conceptual structures originating from sensory-motor experience (a topic we shall revisit in more detail in the next chapters).

Meaningful resolution of the RNAi anomaly and its contradictions was the product of a cascade of conceptual linkages. First, Sung shows how biologists used distinct “reasoning strategies” that set up “interrelations” between bodies of knowledge produced by different techniques, so that aspects of a phenomenon in one field, namely, cosuppression in plants, could be transferred to the interference response in C. elegans, Drosophila, and other organisms. This move generated a plethora of novel ideas. Since existing interpretative frameworks, like antisense RNA, were unable to account for the observed experimental anomalies, this model was elaborated through a strategy of “complication,” where new observations of gene silencing effects were accommodated through additions, deletions, and specialization of existing conceptual elements. This process entailed a series of “abductive” inferences across several experimental contexts to resolve the anomalous contradiction.6 Relations were drawn between inserted double-stranded RNA and selected experimental observations about how exogenous strands of RNA were processed into shorter molecules. Furthermore, Sung notes that the laboratory context introduced embodied structure to anomaly resolution; experiments were performed not simply to test theoretical propositions, but to observe surprising phenomenal regularities, create new concepts, and explore variables in more detail.

Fire and Mello’s 1998 study on C. elegans, for example, linked RNAi to cosuppression and post-transcriptional gene silencing (PTGS) in plants and other organisms, paving the way for interrelations with observations from other research groups, including in vitro systems built around D. melanogaster and plant experiments. These interrelations, in turn, helped formulate new experiments in molecular genetics that disentangled the mechanisms involved, and compressed these into meaningful, coherent cause–effect relationships. Finally, a transition to the RNAi model was achieved by conceptual integrations between previously unlinked elements. New experiments facilitated compression of disparate relations into a coherent account sensible on “the human scale” through a cause-and-effect frame that was “easily apprehended by humans” (2008: 190). The resulting causal model of RNA-based gene silencing could then be transposed from the context of C. elegans into other experimental systems.

RNAi saw tremendous success as a tool for exploring individual gene function, and it was this aspect that made RNAi so appealing for salmon lice experimentation. By the early 2000s, the power to probe gene function could be unleashed with ready-to-use kits and protocols listed in the catalogues of commercial suppliers of reagents. As with other biotechnologies, RNAi was domesticated, cultivated, and commercialized to serve humans in their quest for controlling biology on the molecular scale. In Rheinberger’s terms, RNAi was materially and conceptually transformed from an elusive epistemic thing, something unknown, into a technical thing: a standardized method for inquiring into other, novel epistemic things. In the laboratories of the Sea Lice Research Centre, my ethnographic field site, this long history of translational research was embodied by the MEGAscript™ RNAi Kit from Thermo Fisher Scientific. Delivered in a small cardboard box, it contained all the reagents needed to synthesize double-stranded RNA molecules for knockdown experiments on salmon lice.

From an anthropological perspective, RNAi’s life as a “technical thing” is lodged at the boundary between nature and culture. Since its effects in the laboratory are partly an outcome of unintentional nature (biological evolution), and partly an intentional cultural product, RNAi transcends our commonsense intuitions about functions as the effects of artifacts and things. As noted by Sperber (2007), questions like “what is it for?” or “what is its function?” are properly asked of two kinds of entities: biological traits and processes (e.g., red blood cells, polymerase) and cultural artifacts (e.g., forks, calculators). While biological things have selected effects conferred via natural selection, artifacts are imbued with intended effects by their users.7 A calculator’s intended effect, for example, is to solve mathematical problems—although it may, as a byproduct, also be hurled as a projectile. The difference between intended and selected effects appears to map nicely onto the nature–culture distinction.

Some biological artifacts perform their role as cultural artifacts by doing the same thing as their selected functions, and in RNAi there is an overlap between its selected effects, conferred through evolution, and its intended effects, conferred through human meddling. RNAi performs its artifactual function (preventing translation of messenger RNA) through its biological function, which explains its adoption in countless laboratories. But using these molecular machines for experimental purposes also exploits biological properties which the entity has not been selected for, namely, the evolved ability of RNA to base-pair with complementary sequences of nucleotides. This property is not usually exploited in nucleic-acid metabolism, although it appears in nature in the form of double-stranded RNA viruses, and possibly in other poorly understood cellular processes. But parasitologists at the SLRC exploit the organism’s potential for sequence-specific gene silencing by synthesizing double-stranded RNA molecules with the MEGAscript™ Kit. Thus, the “cultural becoming” of RNAi as a research instrument co-opts multiple properties of RNA (see Sperber, 2007: 136).

As mentioned in the previous chapter, accommodating RNAi into the experimental system of Nilsen’s research group did not happen overnight, although RNAi had been successfully applied to other experimental organisms. While RNAi was available as a commercial kit, it still had to be coaxed into an interlocking fit with other components and practices in the experimental machine that had gradually developed around domesticated strains of L. salmonis. One main challenge faced by those recruiting RNAi as a screening method was to find a reliable delivery route for getting the synthetic double-stranded molecules into the parasite’s interior. While C. elegans responded to a variety of delivery methods, a reliable transmission route had to be specifically adapted to lice at different life stages. The obvious choice for delivery into adult specimens, which are covered by a tough exoskeleton, was microinjection. But making injections work was no trivial matter. It required fine-tuning a complex operation with many potential confounds, within the cognitive ecology of the experimental system. This included the following (a schematic sketch of the kind of parameter bundle involved follows the list):

  • Perfecting the recipe of the double-stranded RNA solution, based on the MEGAscript™ RNAi Kit.

  • Identification of a non-lethal entry-point into the salmon louse in the dorsal region of the cephalothorax, where the plates on the lice exoskeleton are joined.

  • Cultivating embodied skills and procedural schemas for handling the lice, down to the level of finding the correct angle for the micro-needle, avoiding puncturing vital organs, and applying sufficient pressure for fluid injection.

  • Finding appropriate glass needles (as one technician explained, the best results were obtained when the group customized their own needles).

  • Optimizing the amount of dsRNA solution to be injected, and the amount of bromophenol blue colorant that was used as a marker to identify successful delivery after injections.

  • Calibrating post-injection incubation: the time between RNAi exposure and reinfection of hosts.

  • Devising a new “production line” with intelligent ways of using laboratory space for coordinating research materials and staff during experimental events (we shall return to this matter in Chapter 5).
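These parameters were not independent knobs but a bundle that had to be tuned jointly against the behavior of the lice and the rest of the experimental system. Purely as a schematic illustration, and not a reconstruction of the Centre’s actual records or software, such a bundle can be thought of as a single trial record like the hypothetical sketch below; all field names, units, and values are invented.

```python
# Schematic sketch of the bundle of parameters tuned during RNAi microinjection
# trials, as enumerated in the list above. All field names, units, and values
# are hypothetical illustrations, not SLRC data.
from dataclasses import dataclass

@dataclass
class InjectionTrial:
    target_gene: str                  # candidate gene selected for knockdown
    dsrna_conc_ng_per_ul: float       # recipe of the dsRNA solution
    injection_site: str               # e.g., dorsal cephalothorax, between plates
    needle_type: str                  # commercial or custom-pulled glass needle
    injected_volume_ul: float         # amount of dsRNA solution delivered
    colorant_fraction: float          # bromophenol blue used to verify delivery
    incubation_hours: int             # time between injection and reinfection

trial = InjectionTrial(
    target_gene="LsYAP",
    dsrna_conc_ng_per_ul=600.0,
    injection_site="dorsal cephalothorax",
    needle_type="custom-pulled glass",
    injected_volume_ul=0.5,
    colorant_fraction=0.05,
    incubation_hours=24,
)
print(trial)
```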

The first reported use of RNAi in salmon lice by Nilsen’s group was published in 2009, two years before the official opening of SLRC (Dalvin et al., 2009). This study applied RNAi to functionally characterize a protein known as the “maternal yolk-associated protein” (LsYAP), which seemingly played a key role in the embryogenesis of salmon lice. Analyses of microarray data taken during post-molt growth and maturation of adult female lice had revealed a surge in mRNA transcripts just prior to the release of mature eggs. One of the most interesting transcripts identified during this search was an mRNA encoding an unknown protein. This protein had three Fasciclin 1 (FAS1) protein domains, stretches of amino acids which were deeply conserved over evolutionary time. First identified in grasshopper embryos, these domains were later found in a range of organisms and assumed to be functionally important for cell attachment and adhesion.8

Initial studies of lice at different life stages using methods like quantitative PCR and in situ hybridization then showed that LsYAP was a female-specific transcript, and that the protein was associated with the egg yolk. These proteins were most likely incorporated into the female oocyte after transportation from their sites of production in sub-cuticular tissue. In lice, oocytes are produced in the ovary and transported to the genital segment. This inference was based on observations that LsYAP was never observed outside of the genital segment, and supported by the identification of LsYAP protein sources in sub-cuticular cells and the hemolymph, a fluid in invertebrates akin to blood. While there were few signs of any direct phenotypic effects on adult lice during silencing of LsYAP, the interference response manifested as deformations in the offspring. In addition to morphological evidence, the potency of RNAi to produce highly specific knockdown effects was also confirmed independently by quantitative PCR, microarray data, and western blotting. In sum, these formative experiments demonstrated a “proof of concept.” RNAi could indeed work as a screening system for therapeutic targets in the lice genome.

In addition to these issues, a range of other relevant conditions for experimental success, such as the refinement of injections, and analytical techniques for procuring useful results from knockdown experiments, were also explored. For example, the group tested several methods for delivering double-stranded RNA into the animal, including a mechanized microinjector and a manual instrument that was operated by blowing into a long tube. Eventually, the latter was preferred since it afforded operators better tactile control. The group also had to make a series of decisions with epistemic consequences for subsequent analyses, such as the number of egg-strings to preserve for hatching and the number of samples to be preserved, either frozen or stabilized for later processing with a substance known as RNAlater. Next, the RNAi-treated animals were screened using a method known as quantitative real-time polymerase chain reaction (qPCR), to verify the downregulation of targeted genes. This was necessary due to the potential for “off-target effects,” where genes other than the target sequence get accidentally silenced. qPCR measurements were also supported by antibody staining, an immunohistochemical technique where tissue samples are stained for inspection under the microscope to visually confirm the phenotypical effects of gene expression. In the next three chapters I present multimodal analyses of how these resources were orchestrated within the experimental system at the microlevel of specific events and interactions.
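The chapter does not specify which quantification model the group used for its qPCR verification, but the widely used 2^(−ΔΔCt) method gives a sense of how such knockdown checks are commonly computed: the target gene’s cycle-threshold (Ct) values in treated and control animals are each normalized against a reference gene, and the difference is converted into a fold change. The sketch below illustrates this standard calculation with invented numbers.

```python
# Hedged sketch of a standard way knockdown is verified with qPCR: the widely
# used 2^(-ddCt) relative-quantification method. The chapter does not state
# which model the SLRC used, and all Ct values below are invented.

def relative_expression(ct_target_treated: float, ct_ref_treated: float,
                        ct_target_control: float, ct_ref_control: float) -> float:
    """Expression of the target gene in treated animals relative to controls,
    normalized against a reference (housekeeping) gene."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical cycle-threshold (Ct) values; a higher Ct means less transcript.
fold_change = relative_expression(ct_target_treated=28.0, ct_ref_treated=20.0,
                                  ct_target_control=24.5, ct_ref_control=20.0)
print(f"Target expression at {fold_change:.2f}x of control")  # ~0.09x, i.e. ~91% knockdown
```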

Following this feasibility study, the team also worked out additional techniques, including a method to silence genes in the early phase of the life cycle by soaking lice larvae in a solution of double-stranded RNA (a method already well-established in C. elegans). This research was published in 2014. A Scottish research group had reported gene knockdowns on the nauplius and copepodid stages using a similar technique in 2009, but these experiments showed high mortality and could not be replicated by the group in Bergen, who set out to develop more robust means for RNAi delivery. They hypothesized that the parasite at this life stage would be particularly receptive during hatching and molting, since the exoskeleton’s structural integrity was weak, allowing RNA molecules to pass through the cuticular barrier.

Building on these developments, the group also performed a series of experiments to identify life stages where RNAi would be efficacious. These trials described the temporal onset of downregulation, when drops in gene expression could be detected, and its duration, comparing the interference response in eight different genes. While these experiments showed significant silencing when the nauplius I-stage was treated beyond its molting phase, they were unsuccessful in downregulating gene expression in the copepodid life stage. Furthermore, the silencing effects in lice lasted for over a month in adult females.

Together, these efforts to stabilize RNAi applications for lice, and make the technique cohere in a productive manner within the self-vindicating structure of thoughts, actions, materials, and marks of the experimental system, belong to the class of epistemic practices that Hacking called “modelling of the apparatus” (1992).

Exploratory Experimentation: From Basic RNA Research to RNAi in Salmon Lice

Since the publication of The Logic of Scientific Discovery (Popper, 2005 [1935]), the two major “stock positions” on experimental logic and inference have been Baconian inductivism (after Francis Bacon) and Popperian falsificationism (Franklin, 2005: 891). Inductivism holds that data ought to be collected before theorizing, and that the search for patterns in data should take place afterward. The goal is to make inductive inferences from one instance to many and possibly confirm theories by showing how observations and theory agree. Popper’s falsificationism was a critique of this view, pronouncing a set of normative principles for demarcating and justifying scientific beliefs. In this theory-centered view, which consists of an endlessly repeating two-step cycle, real knowledge can only be derived from hypotheses if they can be refuted by observation (Godfrey-Smith, 2009: 60). First comes a theoretical activity whereby a hypothesis or prediction is launched in the form of risky conjectures that should be put to a test (there are no recipes for making conjectures in Popper’s view). Secondly, there are attempted refutations through critical testing and observation. While Popper’s model was not limited to experimental science, observations should ideally be performed under rigorous conditions, where scientists can deduce specific consequences from their theories and models, before subjecting their hypotheses to stringent testing. Predictions should be bold, risky, and so precisely formulated as to “forbid” certain observations. If the conjecture passes testing, i.e., is shown not to be false, the theory is said to be “corroborated.” Popper’s principle is thus fallibilistic, as theories can never be confirmed. At best, scientists may hope to accumulate theories that have been shown not to be false, yet.

An offshoot of this idea circulates as the so-called Hypothetico-Deductive Method (HDM), a highly schematized account of science which is regularly conflated with the Popperian position (Schickore & Steinle, 2006: ix). Here, making observations that conform to predictions is said to support a given theory. However, as Godfrey-Smith points out, “this process has the basic pattern of what Popper describe, but the idea that theories can be supported by observations is not a Popperian idea” (2009: 69–70). Rather, textbook versions of HDM mix some of Popper’s principles with an overly optimistic view about the epistemic role of confirmation that Popper rejected. This model has public appeal, as a deeply internalized cultural model and normative ideal with moral force. Work in science studies, however, demonstrates how experimentation is not simply a “handmaiden to theory,” but is composed from a more complex tapestry of local tasks. A singular focus that limits the epistemic function of experiments to the appraisal and primacy of theory can thus obscure the generative potential of experimental practices in the research process. The empirical inadequacy of this account becomes especially clear when we compare this model to the canvas of experimentation I described above, ranging from early work on microRNA to the implementation of RNAi as an experimental method in the parasitology of salmon lice.

In a series of biographical meditations, Victor Ambros and colleagues write that the intellectual interests that led to the investigation of lin-4 did not come from well-formed hypotheses about noncoding microRNAs or antisense regulation: “We were simply curious about an interesting worm mutant, and everything we found out about it was unexpected” (Lee et al., 2004: 89). Similarly, Gary Ruvkun’s group points to serendipity as a prime mover behind their own findings, as their work involved “jackpot approaches” that were quite unsuccessful at first. As they conclude, elegance in molecular genetics is “aesthetically pleasing, but scientifically overrated” (Ruvkun et al., 2004: 94). Discovery of regulatory microRNAs was the product of a series of fortuitous experimental events, which generated new insight and resolved a series of anomalies in the absence of specific conjectures.

Links between the Ruvkun group’s research on regulatory RNA, and Fire and Mello’s work on RNAi, for instance, were pursued on rather unorthodox grounds. They were not motivated by well-formulated hypotheses derived from a theoretical edifice. The group’s own words reveal unconventional justifications for their epistemic choices: “An even deeper connection to RNAi started with numerological considerations (it cannot be called reasoning). When siRNAs of 22 nt, the same size as lin-4 and let-7, were discovered by the Baulcombe and Tuschl groups in 1999 and 2001 […], Ruvkun noted that the number 22 (the number of letters in the Hebrew alphabet) is stressed in the Kabbalah, a Jewish mystical tradition celebrated in medieval Spain, alternative bookstores, and a number of helpful Web sites […]. We began to explore the action of the RNAi machinery in miRNA maturation and activity” (Ruvkun et al., 2004: 94).

Additionally, anomaly resolution demanded a variety of strategies, encompassing experimental tools from biochemistry and molecular genetics, along with new and powerful computational analyses. These bioinformatic methods, which do not fit well with standard schemas of experimentation, helped identify patterns in larger datasets about networks of interactions and phylogenetic relationships between DNA, RNA, and proteins in the absence of specific hypotheses. As observed by the philosopher Maureen A. O’Malley and colleagues, these breakthroughs in RNA research were made possible by “a reinforcing epistemic transformation that is built on the marriage of wet bench biology to computational biology, as well as the high-throughput data gathering and analysis that such combined approaches enable” (2010: 412).

At the Sea Lice Research Centre, we saw examples of how the marriage between RNAi-based gene silencing and computational methods was critical for progress in studies on salmon lice. In contrast to the received view of experiments as tests of predictions and hypotheses explicitly derived from theory, the drivers of experimental actions at the SLRC were much broader. They included parameter variation, simplification and tweaking of the experimental arrangement, as well as the identification of appropriate concepts to express empirical rules governing the experimental project, mapping of patterns in data, description of regular phenomena, and not least: construction and tuning of new instrumentation.

From the perspective of an anthropology of knowledge, neither the falsificationist nor the Hypothetic-Deductive story offers a satisfactory empirical rendition of experimentation as a situated epistemic activity “in the wild.” There is no uniform standard for what testing hypotheses entails in practice. Furthermore, what is considered an acceptable level of observational specificity for a given theoretical prediction varies across different epistemic situations and only obtains legitimacy through acceptance by a broader scientific community. Even though experimental demonstrations might appear to follow a deductive template in their reported form, they clearly do not have the closed form of formal deductive arguments (Galison, 1987: 2). Instead, I propose that the cognitive ecology of experimentation at the SLRC was maintained through a set of epistemic strategies that is better articulated through the concept of “exploratory experimentation.” In making this argument, I build on scholarship highlighting how experimentation is motivated by epistemic concerns other than mere hypothesis testing.

In the 1980s, science studies made a turn from theory-centered accounts toward greater pluralism in studies of experimentation, in reaction to “the impasse reached in the debate about scientific realism” (Schickore, 2016: 20). Known as the “New Experimentalism” (Mayo, 1994), this body of work encouraged a rethinking of how stocks of robust knowledge accumulate from experimentation in relative independence from high-level theories. It is neatly summarized by Hacking’s recognition that experiment “sometimes pursues a life of its own” (1983: 215; see also Galison, 1987). This rethinking increased awareness of important, but often disregarded, tasks of experimental science. These include the accumulation of a material culture of finely tuned instruments, and the transmission of skills and propositional knowledge needed to obtain accurate readings and to distinguish salient effects from artifacts and other background factors (Rheinberger, 1997). Scientists were not just theory builders, but also builders of tools that embody knowledge. Whatever the outcome in terms of “global” theory, researchers working on a given experimental set-up could at least be seen as gaining the know-how, skills, and abilities necessary to produce the observed experimental effects (see also Schickore, 2016: 23). The New Experimentalists renewed interest in observation as an enskilled practice, by attending to how observation was mediated through instruments (Hacking, 1983: 168).9 The turn also cast light on how diligent cross-checking of empirical results keeps theorizing in check and helps distinguish between substantial and speculative outcomes (Chalmers, 1999: 206).

Asking “how experiments end” in microphysics, Peter Galison found them to be “neither rule-governed nor arbitrary” (1987: 254). Dismissing “interest-theories” that reduced laboratory work to mere confirmation of preconceived theory, Galison instead examined the long-, medium-, and short-term constraints that shape experimental practices and must be overcome through the course of research. Recognizing that experimental outcomes are subject to many theoretical and material constraints, Galison argued that these should not be seen as rigid and determinative, since repeated acts of bootstrapping enable experimentalists to solidify results in the face of shifting conditions. This solidity has two key dimensions: directness of measurement, and stability of experimental outcome (ibid.: 260). While directness refers to how insight enables novel causal understandings, stability refers to how experimentalists gain control over experimental conditions. Later, Galison presented an alternative model that further displaces the primacy of theory by treating theory, experimentation, and instrumentation as partly autonomous strata (1997: 799). Here, these three elements of science were seen as periodically “intercalated,” similar to how brick walls are stacked in a staggered pattern for resilience (see Fig. 4.3). The inertia and conservatism of different subcultures of research ensure that theoretical progress does not immediately translate into shifts in experimental work and instrumentation, and vice versa. For Galison, it is precisely this lack of synchronicity, or “disunity,” that makes experimental science so robust.

Fig. 4.3

The positivist model of scientific progress (A), which Galison dubs “reduction to experience” (1997: 785), aimed to build successive theories upon a solid foundation of observational primitives and logical operations on “protocol statements.” Foundationalism was inverted by the anti-positivists (B), centered on the primacy of theory and the unreliability of observations due to contamination by theory-ladenness (1997: 794). Kuhn postulated that revolutions in concept and theory caused incommensurability between paradigms. Despite epistemological differences, Galison sees Popper and Kuhn as espousing “reduction to theory.” Here, theory and observation get coperiodized so that breaks in theory coincide with breaks in observation. Galison’s model of intercalated periodization (C) gives contingent autonomy and parity to each, without coperiodization and abrupt changes (1997: 799). Centrally, the epistemic role of material culture (e.g., instrumentation) is recognized. Figure redrawn on the basis of Galison (1997: 785, 794, 799)

Appearing independently in two case studies in the same year (Burian, 1997; Steinle, 1997), the concept of “exploratory experimentation” further elucidated the interplay between the material cultures of instrumentation, practice, and theoretical conceptualization, by problematizing ways in which experimentation assumed a life of its own, with quite other epistemic goals than hypothesis testing.

Drawing on historical sources from the scientific origins of electromagnetism, Steinle characterizes exploratory experimentation as a set of epistemic strategies used by Faraday to produce new and crucial insights about phenomenal regularities in the infancy of a new research field. He contrasts these strategies with the Popperian view, here construed as an empirical claim about how the experimental process unfolds in practice. Prototypically, “theory-driven” experiments are performed with a “well-formed theory in mind from the very first idea, via the specific design and the execution, to the evaluation” (Steinle, 1997: 69). Typically, they are based on detailed expectations concerning possible experimental outcomes. In this model, experiments are not for generating theory, but are highly constrained and fixed events with respect to instrumental arrangements and expectations. Exploratory experiments, on the other hand, order complexity by producing novel concepts and classifications based on observation, rather than through falsification of hypotheses derived from theory. Referring to Ludwik Fleck’s work, Steinle suggests that the act of structuring a research field with respect to concepts and categories profoundly shapes future research by propelling it in certain directions, at the cost of closing off alternative avenues of investigation. As such, these practices often form the undisclosed backstage of research.

Complementing Steinle’s account, Burian invoked the notion of exploratory experimentation to highlight a particular triangulation strategy used by Jean Brachet, between 1938 and 1952, to quantify and localize nucleic acids biochemically. Lacking suitable methods, Brachet employed a wide arsenal of instruments and techniques from a variety of research fields to cast light on the nature of protein synthesis (1997: 41). By refining and cross-checking his techniques to avoid artifacts and independently confirm results, Brachet was eventually able to localize distinct nucleic acids. Here, Burian extends Rheinberger’s argument about how the materiality of experimental systems is crucial for attaining novel insights in some contexts. By triangulating between different instruments, researchers can establish connections across experimental systems, opening new productive lines of research.

Additional studies have since applied the concept of exploration to understand a range of other case studies, which together paint a diverse and nuanced picture of experimental life (Burian, 2007; Elliott, 2007; O’Malley, 2007; O’Malley et al., 2010; Steinle, 2002, 2016; Waters, 2007). This record shows that scientists, when confronted with real-world complexity, often work on experimental arrangements with considerably more degrees of freedom and heterogeneity than what Popperian hypothesis testing entails. Sometimes, the objects of scrutiny are insufficiently described, or so anomalous and underspecified that it is impossible to conjure well-formed hypotheses and predictions about the target system’s behavior. On other occasions, the performance of an apparatus must be described under a range of conditions, before it can be productively operationalized in the testing of conjecture. And occasionally, when robust theoretical accounts are lacking, experiments are performed simply to probe unknown relationships to “see what happens.” As such, the notion of exploratory experiment offers a fine-grained view of experimental activity that recognizes the fundamental importance of socially situated activities, including:

  • Surveying various experimental parameters, or combinations of parameters.

  • Separation of dispensable from indispensable conditions for achieving a given result.

  • Identification of empirical rules, and creation of suitable representational modalities for these rules.

  • Mapping empirical regularities within a system or phenomenon (such as “if-then” propositions), to afford new concepts and categories, or revise existing ones.

  • Identification of necessary conditions for producing detectable effects, and representation of regularities in such a way that other effects can be reduced to epiphenomena of those regularities.

  • Movement between material experiments and computer simulations for descriptive purposes (a practice similar to thought experiments, a more “abstract” form of exploration).

  • Development of new instruments, techniques, and protocols.

  • Production of phenomena and effects that do not exist outside the laboratory.

  • Checking whether an instrument or experimental configuration works as intended.

  • Creating arrangements for exploring new phenomena through series of linked experiments.

  • Replicating other results to verify them, or to explore new configurations of instruments.

While these exploratory modalities can entail expectations informed by background theory, they are not theory-derived tests of hypotheses in the strict sense, where instrumentation is designed to address one precisely formulated question from a body of theory in order to falsify a prediction. Neither does this entail “mindless playing around” in the laboratory, free of theory (Steinle, 2006: 186). As the above inventory makes clear, exploratory experimentation involves definite procedures and guidelines aimed at achieving specific epistemic outcomes. But where the standard model tests specific expectations about what is supposed to happen throughout the experiment, exploratory experimentation orders and categorizes regularities and patterns after the experimental activity ends.

Three Modes of Inquiry in the Molecular Parasitology of Salmon Lice

One reason why exploratory experimentation helps make sense of developments in the post-genomic life sciences is that practice in this field mainly pursues descriptions of mechanisms, rather than high-level theory (Tabery et al., 2016). A biological mechanism is a structure that performs a function in virtue of its component parts, operations, and their organization, so that the orchestrated function of the mechanism is responsible for creating one or more phenomena (Bechtel, 2006: 26). The reliance among biologists on diagrammatic accounts of mechanisms and cascades of molecular events, rather than propositional theories based on deduction from laws, reflects this approach to scientific explanations.10

In the pursuit of salmon lice therapeutics at the SLRC, this strategy manifested as actions to first localize critical target mechanisms within relevant biological subsystems. Subsequently, researchers would manipulate a range of variables in attempts to decompose the constituent parts of these mechanisms. To determine how different parameters were situated in relation to the biological phenomenon and interacted to produce it, scientists had to work simultaneously across multiple levels of analysis and methods. As such, exploratory experimentation helps articulate a range of knowledge-making activities based around RNAi at the Sea Lice Research Centre that fall outside the purview of a theory-driven account of the experiment. These varieties of exploration were not only crucial for the historical emergence of the experimental system but could also be observed ethnographically in everyday laboratory work on salmon lice.

Despite its productivity, however, the concept of exploratory experimentation is coarse, and cannot capture the entire spectrum of epistemic dynamics that occur in experimental activity. In a case study on the recent history of miRNAs and the turn from genetic to genomic regulation, O’Malley, Elliott, and Burian therefore augment the exploratory modality with two open-ended categories, which they respectively dub “technology-oriented” and “question-driven” research (O’Malley et al., 2010). Together, these modes of inquiry help us better understand the temporal evolution of SLRC’s experimental system, and by extension, the nuances of RNAi screening of salmon lice biology as an iterative research style.

Following O’Malley and colleagues, the exploratory modality is best reserved for cases of “highly systematic and rigorous variation of relevant parameters in an effort to characterize poorly understood phenomena” (O’Malley et al., 2010: 413). This includes identification of regularities, characterization of the underlying entities responsible for creating them, and the making of conceptual frameworks that can organize observed complexity. In contemporary bioscience, this modality is exemplified by the widespread use of high-throughput technologies in genomics and of bioinformatic resources for problem-solving. In these fields, computational and partly automated data-mining approaches have become critical for analyzing the massive amounts of genomic data that are being produced at a rapid pace. These “neo-Baconian” instruments can be used as “induction machines” to discover patterns in data in the absence of specific hypotheses (Stevens, 2013). Easily accessed via the web browser, online bioinformatic resources like NCBI or Ensembl are central in this research process, as these devices are more “oriented to the future than the past” (ibid.: 138). Since these tools are designed around known molecular interactions in different biological systems, they are not simply repositories of information, but objects of material culture that embody biological concepts, thereby facilitating the making of new biological knowledge. Over time, scientific concepts have co-evolved with these bioinformatic systems, moving from outdated assumptions about “one gene, one protein” interactions to a current vision of an interactional gene web that works in concert within a complex network of regulatory elements. Models of these interactions, in turn, feed back into and materialize in the ongoing redesign of bioinformatic databases and their associated analytical tools, as more is learned by applying them in specific research projects.
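To make this pattern-finding style concrete, consider a minimal sketch of such “induction-machine” reasoning. It is not a reproduction of any actual SLRC analysis: the gene names, conditions, and expression values below are invented, and the clustering simply groups genes with similar profiles without any prior hypothesis about which genes belong together.

```python
# Hypothetical sketch: unsupervised pattern discovery in a small
# gene-expression matrix (rows = genes, columns = invented conditions).
# No hypothesis is tested; the algorithm groups genes by profile similarity.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

genes = ["geneA", "geneB", "geneC", "geneD"]  # invented identifiers
expression = np.array([
    [2.1, 8.4, 8.9],
    [2.3, 8.1, 9.2],
    [7.8, 1.2, 1.0],
    [8.1, 1.4, 0.9],
])

# Hierarchical clustering on correlation distance between gene profiles.
tree = linkage(expression, method="average", metric="correlation")
labels = fcluster(tree, t=2, criterion="maxclust")

for gene, label in zip(genes, labels):
    print(f"{gene} -> cluster {label}")
```

The point of the sketch is simply that the grouping emerges from the data and the chosen similarity metric, not from a prediction derived in advance from theory; that, in compressed form, is the sense in which such tools operate as induction machines.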

Computers and black-boxed algorithms have become indispensable for a research strategy that relies on bioinformatic systems to map interactions in gene expression at the genomic level (Allen, 2001; Kell & Oliver, 2004). But the legitimacy of such neo-Baconian practice in an increasingly “data-centric” field (Leonelli, 2016) has spawned considerable debate among biologists and those who study their practices. The community at the SLRC, along with their peers in countless biology labs around the world, have voted with their feet in favor of these facilitating technologies, gradually embracing new methods without much concern for quarrels between epistemologists.

In this context, it is fruitful to distinguish between “wide” and “narrow” instruments (Franklin, 2005). Wide instruments, like microarrays and high-throughput sequencing, make heavy use of computational algorithms to assemble genomes (in DNA sequencing) or populations of messenger transcripts (in RNA sequencing). Some wide instruments can make millions of measurements simultaneously, or in a very short time through rapid serial processing. Narrow instruments, on the other hand, yield only a few data points, such as tools used to carefully examine stained tissue sections through the light microscope (a topic which gets extensive treatment in Chapter 7). According to Franklin, wide instruments are best understood as heuristic devices providing practical, efficient methods for solving problems. Neither optimal nor perfect, wide instruments are deemed sufficient for the tasks at hand. They accomplish immediate goals and speed up the research process, particularly in conditions with knowledge gaps about the specifics of a phenomenon or system. By measuring a large part of a domain, wide instruments maximize the likelihood of identifying “difference-makers”: decisive causal factors that change the state of measured outcomes in a biological system (Franklin, 2005: 896).

Still, wide instruments also have limitations. In many cases, they cannot be used in isolation from narrower approaches. As the biosciences have progressed in their understanding of gene expression, one remaining challenge has been to precisely map the structures of macromolecules (DNA, RNA) to their functional expression as protein products under different conditions. There is, however, currently no system or body of biochemical theory capable of generating broad hypotheses that can predict detailed genotype–phenotype, or structure–function, relationships for a large assortment of biomolecules (Burian, 2007: 286). One way this challenge manifested in everyday research at the SLRC was the difficulty of inferring function from wide instruments alone, such as high-throughput microarrays or RNA sequencing, given that a particular protein could be involved in many cellular processes. Completing the picture of what a particular enzyme did in salmon lice, for example, would therefore require alternative forms of exploration, using a combination of narrow instruments, such as RNAi, and other methods.

The Nilsen group’s study of the LsYAP gene, which I described above, exemplifies this experimental logic, in which wide and narrow instruments interplayed constructively. In that case, the report of high expression levels that drew interest to the LsYAP gene came from microarray analysis.11 Bioinformatic processing then identified the sequence and helped design primers for synthesizing double-stranded RNA, so that the gene could be silenced by RNA interference. By knocking down the gene, it could then be functionally examined in detail, using a range of narrow instruments. As the authors of the study wrote in their conclusion: “The transcription profile of LsYAP on different life stages combined with in situ hybridisation shows that the LsYAP mRNA is purely transcribed in subcuticular cells lining the adult female louse” (Dalvin et al., 2009: 1414). The use of a wide instrument (microarray technology) interacted with a narrow one (visualization of gene expression through immunohistochemistry) to probe the candidate gene’s potential as a therapeutic target. From here, more sophisticated interventions and models of the molecular cascade could be developed. Among other things, the description involved a bootstrapping procedure in which the microarray, a wide tool, was redeployed to verify the knockdown effect in samples subjected to RNA interference: “Microarray data also demonstrated that the RNAi against LsYAP was specific and the transcription level of remaining genes on the array was unaltered” (ibid.). There was little need for specific predictions in the conventional sense to reach new insights about the pathway.
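The wide-to-narrow movement in this example can also be sketched schematically. The snippet below is not the pipeline used by Dalvin and colleagues; the file name, column names, and thresholds are invented, and it only illustrates how output from a wide instrument might be filtered into a shortlist of candidates for gene-by-gene follow-up with a narrow instrument like RNAi.

```python
# Hypothetical sketch of the "wide-to-narrow" step: filter a microarray
# expression table to shortlist highly expressed genes for RNAi follow-up.
# File name, column names, and thresholds are invented for illustration.
import pandas as pd

table = pd.read_csv("microarray_expression.csv")  # hypothetical input file

# Keep probes with a strong signal in adult females and a clear contrast
# against a reference stage; high expression of this kind is what first
# drew attention to candidates such as LsYAP.
candidates = table[
    (table["adult_female_signal"] > 1000)
    & (table["fold_change_vs_reference"] > 4.0)
]

# Rank the shortlist; each candidate would then go on to primer design,
# dsRNA synthesis, and knockdown by RNA interference.
shortlist = candidates.sort_values("fold_change_vs_reference", ascending=False)
print(shortlist[["probe_id", "fold_change_vs_reference"]].head(10))
```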

An exchange between narrow and wide technologies also fueled RNAi-based explorations of other genes at the SLRC. One postdoctoral project examined the functional characteristics of an iron regulatory protein (IRP). Blood-feeders like L. salmonis need systems for handling excess iron, a micronutrient that is lethal in high doses. From other organisms, it was known that Iron Regulatory Proteins 1 and 2 were involved in this process, and a database search along with comprehensive bioinformatic analyses revealed two IRP homologues in L. salmonis. In situ hybridization was then used to localize where in the body these genes were highly expressed.12 Later, an RNAi experiment on pre-adult female lice, designed to check the functional role of these genes, surprisingly demonstrated up-regulation of another gene, known as Ferritin.
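The database searches behind such homologue hunting can be given a rough shape in code. The sketch below uses Biopython’s interface to NCBI BLAST; the query fragment and the E-value cut-off are placeholders, and this is not the postdoctoral project’s actual workflow, only an illustration of what searching a database for homologues of a known iron regulatory protein might look like.

```python
# Hypothetical sketch of a homology search: BLAST a fragment of a known
# iron regulatory protein against a protein database to find putative
# homologues. The query string below is a placeholder, not a real sequence.
from Bio.Blast import NCBIWWW, NCBIXML

query_fragment = "MSTAPILKDNPLA"  # placeholder protein fragment

# Submit a protein BLAST search to the NCBI web service (slow; sketch only).
handle = NCBIWWW.qblast("blastp", "nr", query_fragment)
record = NCBIXML.read(handle)

# Report hits with low E-values as candidate homologues.
for alignment in record.alignments:
    best_hsp = alignment.hsps[0]
    if best_hsp.expect < 1e-20:
        print(alignment.title, best_hsp.expect)
```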

A third example of exploratory applications of RNAi came from characterizations of three chitinase genes and a more detailed functional analysis of the gene known as LsChi2 (Eichner et al., 2015). Chitin is a polysaccharide and a structurally important component of the louse exoskeleton. It is also the target for chitinases, enzymes that break down the rigid exoskeleton of the arthropod body during molting between life stages. This is the reason why pesticides like di- and teflubenzuron target the chitin pathway, raising concerns about adverse effects on other crustaceans around farming sites. Candidates belonging to a particular family of chitinases were first identified in the lice genome through a database search for homologies to known chitinases in crustaceans and insects. The group then found that the relevant genes contained sequences coding for a signal peptide, suggesting that the proteins were secreted from the cell. Identification of this extracellular role indicated that the chitinases either acted on the molting process or had a possible role in digestive functions. Three relevant sequences were then identified, and an expression profile was run using qPCR to detect their presence, coupled with an in situ hybridization trial to visualize gene expression in the sampled tissue. Although the intervention did not prevent molting in the parasite, RNAi-induced silencing of LsChi2 in nauplii larvae produced animals with “changes in body dimensions, locomotive behavior, and inability to infect fish” (ibid.: 47). Together, the outcomes of this genetic knockdown provided biological data for exploring the chitin pathway and would be “a valuable tool in future efforts to combat this parasite using chemotherapy or vaccine strategies.”
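Behind qPCR expression profiles and knockdown checks of the kind mentioned here lies a standard piece of arithmetic, the 2^(−ΔΔCt) method for relative expression. The sketch below implements that textbook calculation with invented Ct values; I am not claiming it reproduces the exact quantification used in the chitinase study.

```python
# Hypothetical sketch of the standard 2^(-ddCt) calculation for relative
# gene expression from qPCR cycle-threshold (Ct) values.
# All Ct values below are invented for illustration.

def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_control, ct_ref_control):
    """Fold change of a target gene in a sample relative to a control
    condition, normalized against a reference ("housekeeping") gene."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return 2 ** (-dd_ct)

# Example: the target amplifies three cycles later in RNAi-treated animals
# than in controls, i.e. roughly an eight-fold drop in transcript level.
fold = relative_expression(ct_target_sample=27.0, ct_ref_sample=18.0,
                           ct_target_control=24.0, ct_ref_control=18.0)
print(f"relative expression: {fold:.3f}")  # 0.125, i.e. an 87.5% knockdown
```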

“Technology-oriented research,” the second mode of inquiry proposed by O’Malley and co-authors, is based around the design and modification of instruments (2010: 413). An experimental system used in one field of inquiry may be operationalized as a tool for research in another. The transformation of RNAi from an epistemic thing in molecular genetics, to a technical object capable of modifying gene expression in salmon lice research, is an obvious example. We also saw the technology-oriented pattern exemplified in SLRC’s historical trajectory, where a novel material culture for experimentation, composed of enculturated lice strains, incubators, and a new single-tank system, went through multiple iterations. Progress in the molecular parasitology of salmon lice depended on a continuous supply of new instruments, and modification of old ones. New knowledge about phenomena was enabled not just by changes in ideas, but also from novel orchestrations of material components. As such, the experimental system’s potential to deliver new insights changed profoundly over time, as identification of new patterns and performances radically transformed the questions that could be asked. Historical knowledge of how instruments and other artifacts performed in the past, and how these fitted together in larger systems thus became crucial to produce a “machine for making the future” (Rheinberger, 1997: 28). Again, while these practices were undoubtedly epistemically productive, they get obscured when viewed through the Popperian lens on experimentation as merely hypothesis testing.

The last addition, “question-driven investigations,” is present in interdisciplinary contexts where it is hard to generate highly specific hypotheses at certain points, due to a lack of existing knowledge. Here, open-ended questions can be productive, driving later breakthroughs in understanding. The basic research that led to anomaly resolution in the science of microRNA and RNAi was, as we have seen, profoundly question-driven. This was also the case with technical applications of RNAi in experimentation on salmon lice, and the infrastructure developed around domesticated lice strains. Question-driven experimentation often explores general questions, such as “how many X there are” or “what kinds of Y there are,” which may or may not be refined into specific hypotheses later. In Chapter 6, I present an ethnographic description of a case where several genes coding for a protein known as fibronectin first had to be identified, before it was possible to select candidates for RNA interference experiments and characterize these genes at the molecular level.

At various points in time, the foundational experimental system of the SLRC exhibited shifts in the primacy and relative weighting of these three modes of inquiry. Meaningful variations in parameters of the system were introduced over time, as more central and indispensable conditions could be sorted from the more modifiable and dispensable ones. The incremental process of acquiring new meaningful insights about lice biology also included efforts like determining stable empirical rules about the system’s behavior. These efforts included the study of sex ratios and hatching rates among lice strains, as well as research that eventually resulted in a critical revision of the salmon louse life cycle. In this new model, the number of molting stages in the cycle was reduced from ten to eight. The previous model, which reigned for five decades, was long considered to represent a unique copepod life cycle with eight “post-nauplius instars” and four “chalimus” life stages. However, systematic observations of molting in the incubator system, accompanied by morphometric analysis of the larvae and their shed exuviae (exoskeletal remains left in the incubators), demonstrated that L. salmonis only had two chalimus stages, and thus only six post-nauplius instars in total. These insights were tremendously important for future experimental applications, and for devising effective pest management regimes. They implied that existing assessments of therapeutic interventions in the salmon pen had been based on an erroneous model of the parasite’s life cycle.

It was also necessary to work out new representational conventions for capturing invariances articulated by these empirical rules, like the formula for determining the “daily instantaneous loss rate” (see Chapter 3). This entailed efforts to engineer new representational tools, such as spreadsheet templates for keeping track of variables within the experimental system. As the system matured, it was then crucial to understand other aspects of its operational parameters. One example was the problematic interaction effects observed in communal fish tanks, where statistical analysis revealed a high degree of unspecific lice loss. Other question-driven investigations in this cognitive ecology concerned subjects as diverse as fish welfare, water quality, feed uptake, and the complexity of biological variation in lice strains.
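For readers who want the arithmetic behind such an empirical rule spelled out: an instantaneous loss rate of this general kind is usually defined by assuming an exponential decline in lice numbers over an observation interval. Whether the formula referred to in Chapter 3 takes exactly this form is an assumption on my part; the expression below is only the standard textbook version:

$$N_t = N_0\, e^{-Z t} \quad\Longrightarrow\quad Z = \frac{1}{t}\,\ln\!\left(\frac{N_0}{N_t}\right),$$

where $N_0$ is the number of lice at the start of the interval, $N_t$ the number remaining after $t$ days, and $Z$ the daily instantaneous loss rate.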

Exploratory Experimentation as Distributed Cognition

In this chapter, I have shown how developments in RNA research converged with the science of salmon lice in unexpected ways. While the therapeutic promise of RNAi remains to be fulfilled, the method was embraced by Nilsen’s research group as a highly adaptable and applicable instrument for molecular parasitology. Through diligent work over years, the research community was able to standardize RNAi technology as a means for their own epistemic ends, to probe the biology of salmon lice on a mass scale.

RNAi experiments gave researchers an opportunity to narrow the search space for potential vaccination targets in the louse genome. Using RNAi, candidates for antiparasitic interventions could be subject to preliminary testing without the costly and troublesome procedure of conducting live vaccine trials prematurely on many candidate genes. By simulating the effects of actual vaccines through silencing specific mRNA transcripts via injections or bath treatments of lice, RNAi provided an opportunity to observe and chart the downstream effects of certain genes through the parasite’s life span. Potential antigens with negligible effects could thus be ruled out efficiently, and the Centre could focus their efforts on a few clinical vaccine trials for the most potent therapeutic candidates. Genes involved in critical processes like molting and female reproduction were of particular interest, as they had been effective targets in other cases of pest management.

Here, we see how the experimental system operated as a cognitive ecology, a “cultural ratchet” that accumulated adaptive solutions in an encompassing infrastructure for studying salmon lice biology across molecular, morphological, and behavioral levels of analysis. Just like exploratory experimentation played a key role in basic research on regulatory RNA, so did applications of RNAi, as a technology for salmon lice studies, sustain exploratory efforts and discovery in new directions. This style of practical reasoning was a consequence of inheriting technical things from fundamental research on issues that, once upon a time, were epistemic things.

As critics of the logical-empiricist program demonstrated long ago, any experimental test of a hypothesis simultaneously tests a web of interconnected beliefs (Godfrey-Smith, 2009: 33). RNAi trials at the SLRC were occasionally informed by theory, in the sense that evolutionary theory guided the phylogenetic reasoning behind the selection of a particular gene target, or that theories concerning the molecular biology of the cell informed the selection of genetic target pathways. But the goal of RNAi experiments was not the refinement of high-level theories. Rather, their goal was to demonstrate the value of specific genetic cascades and mechanisms as therapeutic targets, through fine-grained analysis of the phenotypic details surrounding the functional action of specific genes and their involvement in mechanisms that mediated host–parasite interactions, reproduction, and so forth.

For the cognitive anthropology of knowledge, insights from the New Experimentalists, supported by conceptual work on exploratory experimentation and on technology-oriented and question-driven inquiry, expose the cultural richness of experimental practice. Following these, I espouse a pluralistic approach to experimental culture that goes beyond the hypothesis-centered view. While the value of the exploratory framework is subject to ongoing debate in science studies, I find the concept ethnographically productive because it highlights a range of epistemic activities in the laboratory that would otherwise go unnoticed. An emphasis on exploratory modalities takes seriously the contribution of material culture to scientific knowledge production, a contribution that both conventional epistemology and ethnographic studies of science tend to disregard. By studying the exploratory conduct of scientists ethnographically, in naturalistic settings, it is possible to push these backstage activities onto the frontstage.

Still, while the notion of exploratory experimentation is appealing and gets us on the right track toward a cognitive anthropology of experimentation, it remains too elusive to capture the variety of cultural productions occurring in the laboratory at the microlevel of interaction. To ameliorate this, I upgrade these analytical tools in the next chapters by adding resources from the toolkit of distributed cognition, shifting the analysis in a more ethnographically satisfying direction that helps refocus how these cultural practices of cognition are configured.

In Chapter 1, I mentioned that cognitive approaches to science have been the target of unwarranted skepticism within science studies. One reason is that the notion of “cognitive” has erroneously been equated with “rationalism.” Earlier cognitive accounts of science could be seen as “merely transferring the positivists’ foundational logic and its purported virtue to lead to the truth within the heads of the scientists” (Heintz, 2004: 394). Cognitive anthropology, and the lens of distributed cognition, allows recasting questions about the iterative nature of knowledge, the transmission and propagation of scientific representations inside and outside the experimental laboratory, as well as the making of scientific intersubjectivity (Ellen, 2004: 433). The method of cognitive ethnography supports this project by offering portraits of how scientific action is productively constrained in the wild. Not by seeing experimental science as acts of reasoning that inevitably result in true beliefs, or by ascribing “Popperian” minds to scientists a priori, but by approaching these phenomena as vivid cultural productions that contribute to the growth of knowledge.

Distributed cognition is highly compatible with a view of experimentation as iterative and exploratory, given that both perspectives argue against a view of human knowledge and reasoning as primarily a theoretical activity bounded by skin and skull. Together, this helps shift the analysis toward the process of experimental knowledge production, and not just its end products. It provides a toolkit for teasing apart scientific meaning-making by casting light on the interplay between the material and conceptual resources that scientists have at their disposal. In what follows, I hope to show how mundane acts of experimental practice, like observations of instrument readings, are not just simple acts of perception, but forms of enactive sensemaking. Positivist, Popperian, and post-Kuhnian accounts of science have all overlooked central aspects of these meaning-making processes, which are situated in the gap between acts of perception and the establishment of scientific fact (Galison, 1987: 8). An interactive and ethnographically informed view of scientific cognition and experimental action allows us to see, on the microlevel, how a rich cognitive ecology bridges this gap. Extending this view of science to specific ethnographic events sampled from the molecular parasitology of salmon lice is the task ahead in the next chapters.

Notes

  1.

    The Dogma was revised after the discovery of enzymes known as “reverse transcriptase,” but remains salient.

  2.

    Following convention, I italicize gene names (lin-4) and capitalize the associated protein (LIN-14).

  3.

    Mello recalls agonizing over this lack of a clear hypothesis in a lecture: “…We were really nervous that paper would not be accepted, that paper that was in Nature with Andy Fire. We were really nervous, because it was purely phenomenological. All we knew in that story was that if you give worms double-stranded RNA they responded to it in this amazing sequence-specific way […]. As cool as that was, we thought they were gonna ask us for the mechanism. You know, reviewer number three always says: “yes, it’s an interesting story but there’s no mechanistic insight, therefore” (author’s transcription, Mello, 2013).

  4.

    Around 2012, CRISPR-Cas9, an evolved defense system in bacteria and archaea, replaced RNAi as the great disruptor of biotechnology. While RNAi regulates genes post-transcriptionally (at the level of mRNA), CRISPR intervenes at the level of the DNA itself. Some predict CRISPR will replace RNAi in loss-of-function studies, due to its specificity, low cost, and ease of use, despite the inertia of experimental systems based on RNAi. CRISPR is already applied in research on salmon and lice.

  5.

    Commentators agreed that Fire and Mello deserved the 2006 Nobel Prize, but some lamented that others also deserved recognition. In 2008, Ruvkun, Ambros, and David Baulcombe received the Lasker Award. Ruvkun and Ambros received the Breakthrough Prize in 2015. All are predicted contenders for a second Nobel on RNAi, if the technology fulfills its promises.

  6.

    A classic case of abductive reasoning is the following syllogistic construction from C. S. Peirce: “if a white ball and a bag full of white balls, then the white ball is from the bag” (Sung, 2008: 128).

  7.

    Sperber uses the term teleofunction: “Let us say that an effect of type F is a teleofunction of items of type A, just in case the fact that A items have produced F effects helps explain the fact that A items propagate, i.e. keep being re-produced” (Sperber, 2007: 128). Teleofunctions of various entities have different mechanisms for propagation. Items with biological teleofunctions, like RNAi, are phenotypical features of organisms. Cultural teleofunctions are either mental, within-agent representations, or exist as public productions (practices, inscriptions etc.). While an artifact’s function is the effect that explains why it was produced, its teleofunction is the effect that explains why it was “re-produced” (the prefix stresses this crucial distinction).

  8.

    Parasitologists at the University of Aberdeen were introduced to RNAi by the Nilsen group, and used RNAi in their own studies on L. salmonis. While the Nilsen group submitted their manuscript to the International Journal for Parasitology on March 4, 2009 (accepted April 16), the other group submitted to Parasitology a month later, on April 13 (accepted May 18). Due to unforeseen circumstances, the Nilsen paper was not published until November, while the Aberdeen group published in Parasitology’s July issue. As a result, both papers claimed to be the first to use RNAi in lice. The Aberdeen group’s paper stated they were “[…] the first report to perform dsRNAi in any copepod” (Campbell et al., 2009: 873). Nilsen’s group wrote: “Finally, we have demonstrated systemic RNAi for the first time, to our knowledge, in a copepod species […]” (Dalvin et al., 2009: 1414). Neither paper referred to work by the other group.

  9.

    It is exemplified by Hacking’s famous anecdote about how invisible entities are grounded in experimental applications. He recalls a dialogue with a physicist-friend about detecting electric charges (‘quarks’) using a superconducting ball of niobium: “How does one alter the charge on the niobium ball? [asks Hacking]. ‘Well, at that stage’, said my friend, ‘we spray it with positrons to increase the charge or with electrons to decrease the charge’. From that day forth I’ve been a scientific realist. So far as I’m concerned, if you can spray them then they are real’” (Hacking, 1983: 23). Known as “entity realism,” this position accepts the realism of manipulable entities but maintains skepticism towards higher-level theories about these entities.

  10.

    In the context of molecular parasitology, evolutionary theory mainly figured as “systematic theory” (Hacking, 1992) for addressing functional questions, such as “what is X for” or “what does Y do”? In experimental work at the SLRC, evolutionary theory was a resource, and not a framework that should (or could) be challenged through laboratory tests on salmon lice.

  11.

    Microarrays (or “gene-chips”) were introduced in the mid-1990s to distinguish active from inactive genes. Chemically, microarrays exploit base-pairing rules between mRNA molecules and their DNA templates.

  12.

    In situ hybridisation (ISH) is a technique for visualizing gene expression by locating distinct DNA or RNA sequences in tissue samples with fluorescent probes.