How are samples of lice tissues, collected from RNAi experiments, endowed with biological meanings through work downstream in the experimental pipeline? This chapter tracks the representational and material cascades initiated in the previous chapter. It examines the making of meaningful measurements of gene expression in lice tissues, focusing on a widely used technology known as real-time quantitative polymerase chain reaction. By ethnographically tracing the work and situatedness of one researcher within the cultural-cognitive ecosystem of the laboratory, I show how everyday operations on the benchtop depend on “ecological assemblies”: small-scale cultural practices that orchestrate arrays of resources in the agent’s immediate environment to house and extend cognitive processes that span beyond the boundaries of the individual. An important property of these functional systems is their role as material anchors for conceptual blends. I show how the cultural artifacts that litter the lab afford scientists a suite of external resources with remarkable computational properties. Together, these representational cascades shift the experimental system’s epistemic states, as part of an extended cognitive process of thinking through things.

Experimental activities in the laboratory rearrange accumulated resources and technical things to reveal and display the character of epistemic things, those elusive features of gene expression in salmon lice. We have seen how molecular parasitologists, as cognitive agents creating new knowledge, do not only think, but also touch, move, and otherwise engage with material objects and their colleagues, through a broad range of material engagements and semiotic activities. The analysis in Chapter 5 ended with the termination of an RNAi trial, where specimens were arranged in small plastic tubes and placed either in fixative for histological examination under the microscope or immersed in a substance known as RNAlater, thereby setting the agenda for future work. When kept in RNAlater, lice were preserved for weeks in the fridge, or for months or longer in the deep freezer. Experimentalists could then, at their later convenience, study the effects of their RNAi interventions using molecular methods to probe the animal’s gene expression, and align these readings with phenotypic data, like observations of gross morphology.

In this chapter, I examine how archived salmon lice tissues are further transformed within the experimental pipeline by sampling epistemic activities from the “DNA lab.” Specifically, I look at how biological macromolecules are handled to reveal hidden features of genes that are immensely interesting for molecular parasitologists. I focus on measurements of gene expression using a method known as “quantitative polymerase chain reaction,” or simply “qPCR.” The structural and functional dimensions of DNA, RNA, and proteins cannot be usefully studied with the naked eye, or even a microscope.1 As scientists cannot see biological macromolecules directly, their properties are mediated through various representational artifacts (Myers, 2015). Here, I describe the material culture of the biology lab as a “historically sedimented structure” (Goodwin, 1995: 268). This structure enables working with invisible substances such as DNA and RNA in epistemically rewarding ways.

To an outsider who only catches a short glimpse of the action, the ebb and flow of activities in the socially organized setting of the lab may look rather mundane, verging on the prosaic (Hine, 2001). The bulk of laboratory life consists of highly repetitive tasks performed at the lab countertop or on the office computer. Endless pipetting at the bench and in fume hoods; shuffling of boxes filled with plastic test tubes, reagents, and bits and pieces of salmon lice; fetching of boxed samples from the fridge; assembling of devices; carefully putting slabs of fragile electrophoresis gels into UV-cabinets; monotonous interactions with paper printouts or digital interfaces; and seemingly interminable rounds of waiting for various devices and biochemical processes to finish, so that new results may, slowly, emerge. Despite such apparent mundanity, the cognitive ecosystem of SLRC presents an evolving and adaptive problem-space (Nersessian, 2006, 2012) for exploring the lice genome. This dynamic space was constrained by the Centre’s research program, which was continuously reconfigured as the biology of salmon lice progressed in new directions. So what may appear pedestrian at first glance is in fact a set of creative, multimodal semiotic encounters with artifacts and devices that couple with various forms of language-use, including literal inscriptions, numerical representations, and manipulations of scientific visuals. These constitute powerful epistemic action loops for generating new insight. Situated within a rich ecology, littered with meaningful representational structure, experimentalists enact critical resources for making knowledge about lice. When we zoom in closely on specific practices within this experimental system and make them our unit of analysis, apparently disparate domains of activity come together, and the boundary between pragmatic and epistemic actions seems to dissolve.

With this in mind, one could ask where we should look to identify scientific cognition. The classical view, which Andy Clark dubs BRAINBOUND (2008: xxvii), suggests that the loci of cognitive activity are circumscribed by the skin and skull of individual scientists. In this view, the non-neural body of a researcher is just a “sensory and effector system” of the thinking brain, and the environment surrounding this organ is nothing more than the arena where adaptive problems arise and are sensed by brain and body. As a replacement for BRAINBOUND, Clark argues for EXTENDED, an alternative, composite picture where: “the actual local operations that realize certain forms of human cognizing include inextricable tangles of feedback, feed-forward and feed-around loops: loops that promiscuously crisscross the boundaries of the brain, body and the world. The local mechanisms of the mind, if this is correct, are not all in the head. Cognition leaks out into body and the world” (ibid.: xxviii).

As part of this lineage of ideas rethinking the boundaries and unit of analysis for cognitive systems, the distributed approach picked out three ways that cognition is trafficked beyond the individual. First, cognitive processes can be distributed across members of a community, creating the division of labor required to complete different tasks and reach epistemic goals. Second, experimental science, as an embodied cognitive process, involves coordination between internal and external structures. To invoke Clark’s evocative phrasing, the mind is “leaky,” “shamelessly” mingling with the body and world as it seeps out from its assumed confines (1998: 53). Third, this promiscuous organ participates in mutual feedback processes with material environments that can distribute cognitive practices through time, so that the products of earlier events transform the character of later events.

Applying this vocabulary, we can understand experimental research as a cumulative cultural process that ratchets up solutions to frequently encountered epistemic problems, solutions which in turn feed back into the dissection of novel phenomena over time. Earlier, we saw how Rheinberger drew attention to this transition with his twin concepts of epistemic and technical things (1997). In a sense, their cumulative nature is summarized in that old maxim, famously expressed in a letter by Isaac Newton: “if I have seen further, it is by standing on the shoulders of giants.”

As we zoom in on instances of laboratory benchwork, it is helpful to consider two additional principles from EXTENDED that attend to the role of material culture and increase the resolution of my analysis of the DNA laboratory’s role in this cultural-cognitive ecosystem. The first is the “Principle of Ecological Assembly” (PEA), which states that agents promiscuously co-opt environmental and bodily resources to scaffold cognitive accomplishments: “according to the PEA, the canny cognizer tends to recruit, on the spot, whatever mix of problem-solving resources will yield an acceptable result with minimum effort” (Clark, 2008: 13). We saw instantiations of this process in the joint semiotic activities described in Chapter 5. The second is a methodological principle known as the “Parity Principle” (PP). It states that if a process works in such a way that we would call it cognitive were it to occur inside the head, then we are justified in calling it “cognitive,” even if its actual location is on the workbench.

In the context of an anthropology of scientific knowledge, these principles encourage us to “ignore old metabolic boundaries” and “attend to the computational and functional organization of the problem-solving whole” (Clark, 2008: 79). Accordingly, distributed cognition extends the computational language previously reserved for what takes place within the old boundary to account for coupled systems between human agents and material culture that can be observed in the wild. As such, the cognitive ethnographer’s task when encountering such hybrid systems is to ask what information goes where, when, and in what form, during specific moments of interaction. In their natural habitats, scientists recruit a wide variety of resources and emergent structures arising from the interplay between morphology and control. This includes active sensing to retrieve information, deictic gestures like pointing, perceptual efforts that stabilize organism–environment relations, bodily and tool-based extensions, as well as material symbols like inscriptions and other “exograms” (Donald, 2010). In these “ecological assemblies” or “functional systems” (see Hutchins, 2011), interactions with external objects may instantiate genuine cognition and reasoning.

As we have seen, cognitive artifacts are critically important for supporting both short-term ecological assemblies, created on the fly for specific tasks, and larger cultural-cognitive ecosystems that outlive individuals. Here, it is worth noting that a cognitive artifact does not delineate a sharply bounded category of objects. Rather, it should be considered “a category of processes that produce cognitive effects by bringing functional skills into coordination with various kinds of structure” (Hutchins, 1999: 127). Without access to the affordances embodied by such epistemic enhancers, ranging from opportunistic use of natural structures to intentionally designed objects, scientists are significantly stripped of their powers.2

In the following, I track the work of Veronica, a Ph.D. student at the Centre, as she engages with an everyday experimental task known as “quantitative polymerase chain reaction” (“qPCR”) to learn more about a class of genes that is the focus of her Ph.D. project. I first situate Veronica’s domain of interest within the overarching research program at the SLRC. Then, through a detailed description of a series of cultural practices that are taken for granted and rarely articulated by those involved, I present an analysis, informed by distributed cognition, that illuminates the complexity of meaning-making in Veronica’s performance of gene expression analysis using qPCR. In my ethnographic account of this multimodal activity system, I examine a series of seemingly simple cultural strategies for connecting conceptual and material structure that support Veronica’s scientific activities. I address how these benchtop strategies, which are embedded within the SLRC’s experimental system, help propagate representations of salmon lice biology, and contribute to meaningful conversions of nucleic acids in test tubes into novel information about gene expression. Following Goodwin, I emphasize how the organization of space through various material engagements creates the necessary structures for accomplishing experimental work (Goodwin, 1995). In the final section, I briefly examine relations between material culture and meaning-making in the pedagogical transfer of laboratory skills, and the advent of commercial “kits” in molecular biology.

While the previous chapter examined the execution of RNA interference as a team effort, my concern in this chapter is tracing how the DNA lab, as part of a larger cultural-cognitive ecosystem, was orchestrated by a single agent to accomplish scientific work. Some of these traces become invisible during front-stage performances of scientific knowledge, such as journal publications, due to discursive practices and epistemic norms in the experimental life sciences that regulate what counts as relevant information.

Again, a disclaimer. I have tried to keep technical details to the minimum necessary for readers to make sense of what I am conveying, which means that my descriptive account will be far from exhaustive of this rich domain of bioscience. The challenge of reducing the complexity of practice to what is sufficient for an adequate analysis is a familiar theme, both from cognitive ethnography (Hutchins, 1995b: 266), and from debates in science studies more generally about the relative weighing of internal and external factors when situating scientific knowledge production historically (Kitcher, 1998; Shapin, 1992).

Practical reasoning must operate on stable representations of relevant constraints in the specific domains being engaged by the cognitive agent (Hutchins, 2005: 1557). We often think that the complexity of a given practice owes to the richness of the internal, mental representations held by those who perform it. Surprisingly, however, the human trick of structuring the external environment informationally can itself provide a critical resource for successful cognitive accomplishments (Kirsh, 1995, 2010). Through operations with rather mundane artifacts on the laboratory bench, scientists can scaffold highly complex chains of reasoning about biological phenomena. Here, I propose that the cultural artifacts involved in qPCR acquire powerful epistemic functions, not due to any intrinsic qualities they possess, but because they can be used as “material anchors for conceptual blends” (Hutchins, 2005). Through cultural practices that mingle concepts with material anchors, scientists can increase the stability of conceptual structures, which enables more complex forms of reasoning than would otherwise be possible. In many domains of experimental science, the conceptual structures under scrutiny are so complex that they cannot be managed and represented in a stable manner by researchers relying on mental resources alone. According to Hutchins, the production and maintenance of stable representations of conceptual elements in cases of real-world computation requires that the involved elements are held, or anchored, in place. This “holding in place” can be accomplished “by mapping the conceptual elements onto a relatively stable material structure,” thereby turning a material medium into a physical anchor for a conceptual blend (Hutchins, 2005: 1562).

The process by which cognitive artifacts merge into larger ecological assemblies in experimental biology is a cultural elaboration of this general phenomenon. As I show, many epistemic events within the spaces where qPCR is accomplished critically depend on blends created through associations between the conceptual and the material. In this process, relationships between material structures, like arrays of nucleic acids in carefully arranged test tubes, can serve as a proxy for relations between conceptual elements, like different experimental treatments. Only when they are orchestrated correctly will such assemblies yield new insights about gene function in salmon lice. The case of executing qPCR, I argue, makes visible some important relations between environmental structure, social organization, and the conceptual fabric of scientific knowledge production. Again, we step into the lab, “Cognito-scope” in hand.

Fibronectin Type II

Veronica is a Ph.D. student on a three-year fellowship at the Sea Lice Research Centre, where she is primarily affiliated with Work Package 4, which tackles the broad subject of “molecular parasitology.” Her research is jointly supervised by the Centre director and Sara, the senior molecular biologist responsible for coordinating all RNA interference trials. For her dissertation research, Veronica’s supervisors have assembled a list of interesting genes, and it is expected that she will screen these candidates using RNAi, observe their biological functions, and describe their molecular characteristics.

Laboratories of contemporary experimental biology continually negotiate the pragmatic and epistemic tradeoffs between individual utility and the communitarian order (see Knorr-Cetina, 1999: Chapter 9). As on other frontiers of research, work at the SLRC can be construed as a race against time and other research groups; funding is finite, mistaken directions can be costly, and Ph.D. deadlines must be met. The scope of doctoral projects like Veronica’s must strike a balance between what a student can reasonably achieve within a limited timeframe, usually three or four years depending on whether the scholarship includes teaching or administrative obligations, and the needs of the larger research program being pursued.

Veronica’s list of genes had been identified via sequencing and annotation of the salmon louse genome, and they were predicted to be associated with an extracellular-matrix protein known as fibronectin. As we saw, a gene prediction is the outcome of a partly automated analysis of the genome (a “genome annotation”), combined with judgments made by human experts like Veronica and her supervisors about which genes are most likely to be worthwhile targets for further research. These judgments can be informed by findings reported in journals by other scientists who pursue work on biological mechanisms in model systems that may be quite different from salmon lice.

A genome annotation attaches biological information to sequence data from all the chromosomes in an organism. Today, much of this process is automated through computational annotation tools that identify patterns in sequence data from the organism in question, and then compare these sequences directly to sequences stored in online databases containing the published genomes of other organisms. Genomic databases are organized to present information both about structural elements (chromosomal locations, genetic structure, coding and non-coding regions) and functional properties (regulatory cascades, interactions with other genes, and known expression profiles). In Chapter 5, we saw how biologists employed the toolbox of phylogenetic inference to map the evolutionarily contingent relationships between genes. Browsing through genomic libraries helps molecular biologists identify genetic sequences that encode distinct proteins involved in various cellular processes.

Veronica explained the logic behind the selection of her own candidate genes as follows. Previous research suggested that fibronectin (FN) interacts with the “extracellular matrix,” the meshwork of molecules that provides structural and biochemical support to the cells it surrounds. Potentially, this plays a role in other cellular processes related to host-parasite interactions. Proteins are molecular structures made up of amino acids, and a “protein domain” is a functionally distinct sequence of amino acids within a larger polypeptide chain. Knowledge about the 64 possible codons of the “genetic code,” the sequential rules governing how triplets of the nucleotides A, T, C, and G get transcribed into RNA and then strung together as proteins through translation in the cell, can be combined with powerful computational tools for reasoning about biological matter. The genetic code describes which nucleotide sequences code for each of the twenty amino acids, as well as how these units configure into larger protein sequences. This makes the translation between genetic (nucleic acid) and polypeptide (amino acid) sequences a trivial task for professionals. Today, even lay individuals can perform such translations, compare sequences from different organisms, and predict a “protein sequence back-translation” through a portfolio of user-friendly web-based tools.3
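
To give a sense of what such a translation involves, here is a minimal sketch in Python. The handful of codons and the toy sequence are my own illustrative placeholders, not part of the Centre’s tools; real work would rely on a complete codon table or a library such as Biopython.

    # A toy illustration of reading codons and translating them into amino acids.
    # Only a handful of the 64 codons are included; real tools use the full table.
    CODON_TABLE = {
        "ATG": "M",                          # methionine, the usual start codon
        "TTT": "F", "TTC": "F",              # phenylalanine
        "GGT": "G", "GGC": "G",              # glycine
        "TAA": "*", "TAG": "*", "TGA": "*",  # stop codons
    }

    def translate(dna):
        """Read the sequence three letters at a time and look up each codon."""
        protein = []
        for i in range(0, len(dna) - 2, 3):
            protein.append(CODON_TABLE.get(dna[i:i + 3], "X"))  # "X" marks an unknown codon
        return "".join(protein)

    print(translate("ATGTTTGGTTGA"))  # prints "MFG*"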

Computer analysis showed Veronica and her peers that FN is part of the much larger Kringle domain family, a conserved protein structure named after the Scandinavian pastry due to its characteristic shape. Veronica focused on so-called “Type II” domains of fibronectin (“FNII”). The FNII class of structures binds to important molecules, such as collagen and gelatin (denatured collagen). She was particularly interested in how these genes influenced the pathway of collagen, a main structural protein in connective tissues. To gain a sufficiently rich understanding of the domain, she estimated that she would need to sequence up to twenty of these genes and carefully observe their expression at different developmental stages, using RNAi to silence them in the louse. In this case, the transcripts (messenger RNAs) coding for FNII-domains were found in exocrine glands. Exocrine glands are cellular structures that secrete biological substances onto the parasite’s outer surface. Transcripts of mRNA were identified by Veronica’s colleague Hanna in the area around the mouth tube of the louse. Veronica’s project will therefore help colleagues understand the functional relationships between FNII-genes and exocrine glands in lice, by characterizing a relatively unknown system.

Researchers used to believe that FNII was specific to vertebrates, but annotations of other genomes found the domain to also be present in invertebrates like the louse. A search in LiceBase, the in-house database for the lice genome, revealed the presence of roughly two hundred FNII-domains. In comparison, there are only twenty-five in Homo sapiens. Was the number of FNII-domains in lice suggestive of these genes’ importance for louse biology and adaptations to a unique parasitic lifestyle? Furthermore, could disrupting the collagen-binding pathway have a cascading effect on louse development, and potentially offer clues toward a vaccine target, or other kinds of therapeutic biomolecules of some practical value for salmon farming? These were some of the questions motivating Veronica’s research.

We saw that attractive candidate genes for any future lice vaccine should target critical biological pathways, such as those regulating the reproductive system, or food uptake and digestion through the gut and intestines. The gut, for example, is exposed to salmon blood extracted by the parasite and may contain potential antigens. A challenge for Veronica and her peers, however, was that thousands of genes are likely involved in any of these biological pathways, with many of these being phenotypically redundant. This meant that secondary “backup” pathways involving alternative genes participating in similar biological processes were probable. Teasing these apart was a formidable challenge.

Using RNA interference, Veronica would systematically silence sequences of interest to functionally characterize a narrow selection of the most promising FNII-domains. She could then observe the effects of her intervention, with a keen eye toward critical processes such as molting and reproduction. Like the other scientists at the Centre, Veronica hoped that her explorative experiments would in the end yield interesting phenotypes: experimentally treated lice that developed differently from the control specimens. In these RNAi experiments, observations of phenotypic change at the level of gross morphology were corroborated by measurements of gene downregulation, compared both against a non-functioning fragment and against readings from a control group in the same experiment. The combination of an unviable phenotype, such as one without offspring, and a statistically significant downregulation, was an indicator that the gene in question was vitally involved in the targeted process. This fragment could then be further scrutinized through other methods, setting off a chain of activities extending far beyond a single RNAi trial.

Figure 6.1 depicts a “heatmap” of fibronectin type II-domains that Veronica used to guide her initial investigations. The “map,” which belongs to a class of artifacts peculiar to computational biology, was handed down to Veronica by her supervisors. The diagram’s X-axis specifies the life stage and sex of the sampled materials, as well as the body part these tissues have been sampled from. The Y-axis, on the right, enumerates a list of fragments that have been automatically generated in the genome database. EMLSAT, the initial abbreviation on each entry, indicates the version of the genome annotation in which that specific fragment number is found. The histogram in the upper left corner displays a legend with color codes for the relative expression levels of genes compared to an internal control fragment. Here, dark colors indicate low relative expression levels, while bright colors mean that the gene is highly expressed.

Fig. 6.1

Author’s rendering of an annotated heatmap used by Veronica. The original diagram was based on RNA-sequencing, showing expression profiles of genes containing the domain

Heatmaps are artifacts that can summarize large amounts of information, thereby facilitating “many-against-many comparisons” (Stevens, 2013: 192–194). This heatmap does not directly represent the phenomenon but is created on the basis of numerical representations from the output of RNA-sequencing experiments (RNA-seq). As a method characteristic of “exploratory experimentation,” RNA-sequencing of salmon lice tissues offered an inductive, “broad” instrument capable of producing thousands of datapoints at once, which in turn facilitated the search for “difference-makers” in the biological data (Franklin, 2005). Without a heatmap, the analyst would, in this case, need to visually scan a matrix of numerical data from over a thousand different measurements to make sensible comparisons. In terms of distributed cognition, the ingenuity of heatmaps as a representational practice lies in substituting a very hard computational problem, comparing a large number of possible combinatorial values to find patterns in multidimensional data, with a much simpler perceptual task: a visual search. Those familiar with the data cultures of contemporary bioscience can simply scan the colored matrix to identify meaningful patterns with little effort.
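
As a rough illustration of this substitution, the sketch below renders an invented expression matrix as a heatmap with matplotlib; the fragment identifiers, sample labels, and values are hypothetical placeholders, not data from LiceBase.

    # Rendering an invented expression matrix as a heatmap.
    # Fragment identifiers, sample labels, and values are placeholders.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(seed=1)
    expression = rng.random((20, 8))                  # 20 fragments x 8 samples
    fragments = [f"FRAG_{i:03d}" for i in range(20)]  # placeholder fragment names
    samples = [f"sample_{j}" for j in range(8)]       # placeholder stages/tissues

    fig, ax = plt.subplots(figsize=(5, 7))
    image = ax.imshow(expression, cmap="viridis", aspect="auto")  # dark = low, bright = high
    ax.set_xticks(range(len(samples)), labels=samples, rotation=90)
    ax.set_yticks(range(len(fragments)), labels=fragments, fontsize=7)
    fig.colorbar(image, ax=ax, label="relative expression")
    fig.tight_layout()
    plt.show()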

Veronica had recently terminated an RNAi experiment on a fragment from the list, which I here refer to simply as G1000. Targeting G1000 yielded some eye-catching phenotypes with obvious developmental irregularities. Veronica’s RNAi treatment produced a condition where the resulting egg strings were largely deformed in most of her specimens, in contrast to the straight, regular form of wild-type egg strings. This offered a visual indicator that the gene may be involved in important pathways. Such visual representations did not, however, provide direct causal evidence that G1000 was a suitable target for therapeutic interventions. She now had to verify that the genes in the relevant salmon lice tissues were actually silenced or “downregulated” vis-à-vis her control samples, thereby ruling out any spurious effects from unknown technical or biological mishaps. Only with an answer to this question at hand could the research community evaluate whether they should throw more resources at studying the fragment in detail.

In the DNA Laboratory

December 14, 2014. I am seated next to Veronica, in front of an Applied Biosystems 7500 unit: a quantitative polymerase chain reaction machine, colloquially known simply as ‘the qPCR’.4 The device looks like a large, bulky, off-white computer cabinet (see Fig. 6.2), and produces a faint humming, which joins the chorus of other fan-cooled equipment running in the background. At the SLRC, the qPCR is regularly used by staff to profile the mRNA content of salmon lice sampled from various experiments. Users primarily interact with this essential piece of technology via a software package running on a Windows PC platform. Veronica’s goal for the day is to examine the expression levels of G1000, which she targeted with RNA interference in an earlier joint experiment. To determine whether G1000 has been significantly downregulated in her samples, relative to experimental controls, Veronica prepares and loads a specially engineered 96-well microplate with nucleic acid samples into the qPCR machine’s opening slot. Setting up the machine for this “run” only takes around ten minutes, with the device completing its analysis in roughly two hours. However, a long chain of cumulative action on these genetic substrates precedes her efforts to initiate meaningful “structure-preserving” operations on her samples with the machine (Goodwin, 2013: 17).

Fig. 6.2

Feeding the qPCR machine and setting up the reaction. Veronica creates an alignment between the array of items laid out on the paper spreadsheet (A), the items on the computer monitor interface (B), and the arrangement of reagents on the 96-well microplate inserted into the machine (C)

After terminating a previous RNAi experiment jointly with her colleagues, Veronica first used a series of standardized procedures to isolate RNA from the tissues that had been preserved in tubes with RNAlater. To isolate RNA, she made homogenates of lice tissue and then, using centrifugation along with chemicals like TRIzol and chloroform, she separated this biological material into three phases: a protein phase, a DNA interphase, and an aqueous phase containing the RNA. She then transferred the RNA phase to a new tube along with isopropanol and incubated the samples. After this step, a new round of centrifugation followed, producing an “RNA pellet” that was washed with ethanol. This new sample was then mixed in a lab vortex and centrifuged again. Discarding the supernatant, Veronica then dried the resulting RNA pellet and eluted it in RNase-free water, before storing the samples at −80 degrees Celsius. Using a NanoDrop spectrophotometer she also tested the samples’ concentration and quality, ensuring their adequacy for further processing.

Veronica also treated her samples with DNase, an enzyme which degrades DNA so that it does not contaminate the RNA sample further downstream, and reverse-transcribed lice RNA into cDNA using the Affinity Script cDNA kit. Following this, Veronica carefully prepared her material substrates for the qPCR experiment by following the Centre’s in-house qPCR protocol. This protocol stipulates that any new qPCR assay must be validated with a standard dilution curve (a process that falls beyond the scope of my description here). Standardized protocols, which are offered for most technical procedures, are crucial infrastructures for any such transformations in the Centre’s state of knowledge.

Other preparations included Veronica ordering reagents known as “primers,” and some assistance from the chief engineer to prepare 10-microliter aliquots that were stored in a box in the clean-room freezer. While she could have done this herself, it was highly recommended that all primers be prepared in the same standardized manner to ensure reliable results. Furthermore, Veronica had to prepare a master mix for the assay, making sure to include a bit of extra reagent to compensate for what would be lost during pipetting. She then moved from the clean room, where the risk of contamination is low, into the less strictly regulated template room. Here, a cDNA template was added to the microplate. For molecular biologists, this action signals that Veronica conducted a “two-step qPCR,” and not the faster, but less flexible and slightly less sensitive “one-step” procedure, where everything is conducted in a single-tube reaction. After Veronica loaded her reactions onto the plate, she placed an optical adhesive film on top and centrifuged the plate, spinning the liquid down to the bottom of each well. She also made sure that the plate’s edge was not contaminated, which could potentially interfere with the machine analysis. Let us now take a detailed look at the sequence of action where Veronica sets up the machine to profile gene expression. Figure 6.2 depicts the scene, and the excerpt gives an overview of this process.

EXCERPT

00:00 Positioned in front of the qPCR-machine, Veronica creates a file for a new experiment on the computer. A “setup wizard” in the software guides her through the steps that must be taken before the analysis can begin. It asks for information about the trial: the kind of experimental design being conducted, the instrument options, the reagents, and the temperatures for the PCR cycle. Having entered these parameters, Veronica names her fragments and chooses the number of biological parallels to be used.

00:10 Carefully inserting the 96-well plate into the machine, Veronica closes the tray. No longer at risk of contaminating the samples, she removes her nitrile gloves.

00:25 Veronica double-checks and confirms selection of reagent, in this case: SYBR Green.

00:45 She labels the different fragments being tested according to the layout of a printed spreadsheet, and defines the targets and names for each of her samples, including her controls, so that each fragment is correctly labelled in the output file that she will later transfer to her office computer.

04:35 Veronica assigns samples to the selected wells on the graphical interface by a “click-and-drag” motion, highlighting in different colors where each sample is located on the microplate.

07:05 She double-checks that she has chosen the SYBR Green, standard curve-method.

07:30 Veronica changes reaction volumes for each well on the software interface so they correspond with the physical samples on her microplate.

07:45 The “run” is initiated through the interface, and it takes roughly two hours before the analysis is complete. Checking the time, Veronica finds that she is behind schedule and edits an entry in the timetable of the logbook that accompanies the machine, so that others in the lab will know the workstation is occupied for a while. The clock indicates that it is lunchtime.

The Polymerase Chain Reaction

On its own, this rather naïve description hardly renders Veronica’s practices with the qPCR-machine meaningful as a scientific event capable of generating new insight. Why must she use this machine to study her samples? How does it work? What dense webs of meaning construction support the device, and what new knowledge is mutually supported by its use? Answering how qPCR contributes to the transformation of representational states within the experimental system, thereby supporting progressive co-adaptation of elements in the self-vindicating structure of experimental practices, first requires an appreciation of the problem that this instrument was designed to solve.

A challenge when working with genetic material at the start of the biotech revolution was that little DNA was easily available to researchers for manipulation. While the biochemical problem of DNA isolation was crudely solved by Friedrich Miescher’s work on “nuclein” as early as 1869, one of the technical challenges faced by molecular biologists in the 1970s was developing assays sensitive enough to detect signals of small variations in the target DNA structures for medical applications. Molecular cloning technology had partly solved the problem of scarce nucleic acids when it entered the scene in 1972. It was now possible to copy a gene and insert it into bacteria to produce the protein coded for by the gene. Still, these cloning techniques relied on living organisms as the reproductive medium.

Polymerase chain reaction made humans less dependent on these cumbersome bacterial systems, and made laboratory life easier and more flexible, as plenty of nucleic acids became available for analysis. PCR solved the sensitivity-of-detection problem by amplifying the source, DNA, rather than the means of detecting its signal (Rabinow, 1996: 84). Like so many other biotechnologies, PCR did so by harnessing a natural mechanism in the cell, in this case the cellular machinery for duplicating and repairing DNA in chromosomes. So, while PCR did not solve a specific scientific problem, its availability as a convenient off-the-shelf technology created many new situations for use across all of biology’s subfields. Suddenly, it was possible to detect whether a gene of interest was present in a sample, and to compare this sample with others. PCR has since been transformed from a conceptual idea into a technique for copying DNA, embodied by many kinds of analytic devices, with multiple applications in a wide range of experimental systems.5

In technical terms, PCR is an in vitro method for copying genetic material exponentially by amplifying DNA segments extracted from organisms, or from cDNA, a DNA molecule “back-translated” from RNA. These substrates are known as the template. As the method’s name implies, the process relies on a polymerase (a macromolecule that catalyzes the formation and repair of DNA) and a chain reaction (a series of events driven by positive feedback). Two short, synthetic nucleotide sequences (primers) are designed to biochemically correspond to the flanks of the segment targeted for amplification and are added to the test tube as starting points (or “anchors”) for the reaction. Small molecules called deoxynucleotide triphosphates (dNTPs) must also be mixed in as building blocks for the new genetic material, along with various buffer reagents that help the chemical reaction run smoothly.6 Additionally, an enzyme that can polymerize nucleotides is required to extend the primers in each direction, forward and reverse, along the segment to be copied.

Enzymes are molecules that catalyze chemical reactions, and the DNA polymerase used for this process is a protein complex used by cells during DNA replication and repair, as in regular cell division. This enzyme was isolated from Thermus aquaticus, a bacterium discovered in the hot springs of Yellowstone, whose heat-resistant polymerase was described in 1976. The advantage of adopting a heat-stable polymerase was that lab workers no longer had to manually add new polymerase after each heating cycle. In the early days of PCR, the polymerase would degrade when exposed to the high temperatures of the process, so fresh polymerase had to be tediously added for each amplification cycle. In contemporary laboratories, Taq-polymerase is co-opted into a biochemical reaction that can be automatically repeated through multiple cycles in a special PCR machine. In this machine, the amount of DNA in the test tube doubles with each cycle, growing exponentially. In a hypothetical case where a scientist starts with a single DNA molecule, cycle one produces two copies, cycle three makes eight, and cycle 29 makes 536,870,912. After 30 cycles, a single molecule of DNA has multiplied into 1,073,741,824 copies.
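
The idealized arithmetic, assuming perfect doubling in every cycle, can be sketched in a few lines of Python:

    # The chain reaction in idealized form: one template molecule doubling each cycle.
    copies = 1
    for cycle in range(1, 31):
        copies *= 2
        if cycle in (1, 3, 29, 30):
            print(f"cycle {cycle:2d}: {copies:,} copies")
    # cycle  1: 2 copies
    # cycle  3: 8 copies
    # cycle 29: 536,870,912 copies
    # cycle 30: 1,073,741,824 copies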

The principles of PCR are common knowledge for biologists working on molecular topics. To duplicate a segment of DNA, the double helix first needs to be separated into single strands. In cells, this process happens with the help of helicase, another class of enzyme. In the laboratory, heating does the trick. When reagents are heated in the PCR machine, the double-stranded DNA molecules are separated by breaking the hydrogen bonds between the annealed nucleotide bases. Primers then bind to the separated strands, and polymerase replicates a new double strand. The two strands are anti-parallel and can only be extended in one direction; the polymerase therefore moves directionally along each strand, building the new strand from the three-prime (3’) end of the primer toward the five-prime (5’) end of the template. An original double helix is thus split into two single strands, each used as a template to create a new double-stranded molecule in accordance with a complementarity principle: the adenine base (A) bonds with thymine (T), while guanine (G) binds with cytosine (C), in the sequence-specific order of the original template. The cycles in the machine are based on three phases: denaturation of the double strand during heating, annealing of the primers by hybridization with the strand at a lower temperature, and finally the strand’s extension by polymerase at a slightly higher temperature. After a couple of hours, the DNA molecules inside the thermo-cycler, the amplicons, are made abundant.
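
The complementarity principle itself can be captured in a short sketch; the example sequence is arbitrary and purely illustrative.

    # Base-pairing rules: A pairs with T, G with C, and the complementary
    # strand runs in the opposite (antiparallel) direction.
    PAIRS = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def reverse_complement(strand):
        """Return the antiparallel complementary strand of a template."""
        return "".join(PAIRS[base] for base in reversed(strand))

    print(reverse_complement("ATGCCT"))  # prints "AGGCAT"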

Quantitative Polymerase Chain Reaction

Since Rabinow’s seminal anthropological account of the emergence of and controversy over PCR technology (1996), a wide range of novel applications of this facilitating technology have emerged. One is quantitative PCR, which builds on conventional PCR but expands its powers by combining three biochemical procedures. In the two-step procedure described here, there is first a reverse transcription of messenger RNA (mRNA) into complementary DNA (cDNA) using the enzyme reverse transcriptase, which some RNA-based viruses use to insert themselves into the DNA of host cells. Secondly, the cDNA is amplified using the polymerase chain reaction principle. The final step is “real-time” detection and quantification of the amplified materials.

In contrast to conventional PCR, which relies only on thermal cycling and biochemical reagents to amplify a stretch of DNA, quantitative PCR uses non-specific fluorescent dyes, or dye-labelled probes, that associate with the strands of nucleic acid as they get amplified in the test tube. Additionally, while conventional PCR provides a result that is analyzed at the endpoint of repeated cycles of heating and cooling, qPCR takes continuous, “real-time” measurements (“real-time qPCR”). When the dye or probe binds to the DNA sequence as the number of molecules gets amplified over consecutive cycles, the chemical reaction emits fluorescence that is registered by a special detector in the machine. The intensity of the fluorescence in qPCR is thus proportional to the increasing concentration of new amplicons. During each cycle, the device collects data for each sample and outputs measurements of test tube activity at the end of each one, rather than giving a single endpoint reading after completing all the cycles. Due to its simplicity and power, qPCR has become the method of choice for quantifying nucleic acids in a sample.

Molecular biologists use different chemical technologies to detect the amplified product in qPCR. The two most popular ones used in the DNA lab at the SLRC were TaqMan (a type of probe) and SYBR Green (an intercalating dye). TaqMan quantification uses a short complementary DNA probe to detect the amplifying target, with a reporter dye at one end and a quencher, a chemical structure that quenches fluorescence, at the other.7 When the polymerase produces new copies of DNA, the reporter dye is cleaved from the probe, emitting fluorescence proportional to the number of molecules at the end of the previous cycle, or the beginning of the current one. A high cost per reaction is a major drawback of the method. We saw in the above vignette that Veronica instead selected SYBR Green-based detection for her own experiment. When this dye is added to the reagent, it binds to all the double-stranded DNA in the sample. During the denaturation phase, it is then released again, and fluorescence decreases. When the strand is extended once more during polymerization, SYBR Green binds to double-stranded DNA anew, and the machine can detect net increases in fluorescence as a measurement of relative gene expression. Lab associates explained that SYBR has lower specificity than TaqMan, which makes it liable to produce false positives by binding to nonspecific DNA, especially in the absence of well-designed primers. But since the method is less costly than TaqMan, which requires specially prepared assays for each gene, it can be used to run more reactions when resources are finite, making it highly suitable for the kind of screenings that Veronica and her colleagues regularly performed.

In Veronica’s relative standard curve experiment, the concentration of the target gene in the sample was normalized against a reference, usually a gene that is expressed at a constant level in both the calibrator and the experimental condition. These normalized values are then compared to a baseline, untreated control sample.8 This way, experimentalists can also control for problems during RNA isolation, such as pipetting mistakes, and undesired chemical reactions that sometimes occur in the test tube. The machine gives a continuous measurement of the population of mRNA molecules in the sample, which reveals which genes are expressed in a cell at a given moment in time. Only when there is a statistically significant downregulation can observed phenotypes be attributed to the causal effects of RNA interference experiments. Measurements of gene expression thus offer decisive moments in the lab. Depending on its outcomes, a qPCR run may provide justification for pursuing new directions of research, and thus feed back into new arrangements of practices and tasks in the experimental system. If the result is negative, the experimenter can move on to other, more promising candidate genes. Again invoking Goodwin’s metaphor (2013: 18), qPCR is key to “the laminated organization of action” that produces knowledge through webs of interlocking experimental resources in the SLRC community.

To better understand how new scientific meanings are construed through qPCR, let us examine the in-house protocol for the procedure. Written by a former postdoctoral candidate at the Centre, the protocol offers a survey of what should be included in the experimental design of a qPCR reaction. As with the RNAi checklist seen in the previous chapter, the qPCR protocol presents a regulatory representation for distributing cognition, and acts as a coordination device for orchestrating joint actions within the experimental system. From the perspective of cognitive anthropology, this recipe exemplifies a “task model” that helps improve the reliability of outcomes (Shore, 1995: 65–66). So even though the in-house qPCR protocol is not a precise guide to how individuals perform qPCR, it has the virtue of making explicit the shared expectations and epistemic norms that regulate its use, and it provides information about the implementational-level details of the practice (Hutchins, 1995a: 28). As Lynch points out, laboratory scientists are deeply attuned to the necessity of interpreting protocols in relation to performative contexts; there can be no discrete boundary between protocol and practice (2002: 205).

First, the qPCR protocol explains that users need at least three biological replicates of the samples. In these, which represent different RNAi targets and can be sampled from selected life stages or body parts, the target quantity of mRNA is unknown. In this case, Veronica is dealing with tissue from salmon lice where the G1000-fragment has been targeted. In Fig. 6.7, these samples are represented by the beige and red cells on her spreadsheet. Such replicates are necessary for statistical analysis, since the numerical output of the procedure is based on averaging values from all the replicates. Each of these biological replicates was also paired with a control fragment. At the time Veronica executed her experiment, RNAi trials at the Centre used a fragment from a codfish gene known as CPY, which did not have any biological effect when injected into lice. In Fig. 6.7B, these fragments are found in cells 10–12/D-F and 4–6/G-H. Also, at least two technical replicates are used to control for variations in the technical execution of the experiment (not visible).

A reference gene serves as an endogenous control by providing a target that is expressed at the same level in all samples. It is used to normalize the fluorescence levels detected by the machine. These references are paired with the biological replicates and control fragments. Genes that are stably expressed throughout the organism’s lifecycle, so-called “housekeeping genes,” are used for this purpose. Eight years prior to the opening of the Centre, its Director and a collaborator had experimentally verified that elongation factor 1 alpha (El1α) was a suitable reference gene for transcription profiling, due to its low variation in transcription. This gene serves as the basis for quantitating the relative expression levels of the target fragment. Reference fragments were shaded blue on the spreadsheet in 6.7B.

qPCR must also include a no amplification control (NAC). The protocol explains that this is a real-time reaction without the enzyme reverse transcriptase, also called the -RT control. This control, which reveals contamination by genomic DNA in the sample, is highlighted in 7G (see 6.7B).9 Additionally, the array contains a no template control, a PCR reaction without a DNA, RNA, or cDNA template, which monitors biochemical contaminations and byproducts that can produce false positives (so-called primer-dimers). These are highlighted in cell 8G on the spreadsheet in 6.7B. Finally, the protocol contains instructions for programming the essential temperatures for the reaction, ranging between 50 and 95 degrees Celsius, and the timing of the different steps in the assay, which last from 15 seconds up to 10 minutes, depending on the reagents. The SYBR Green program for qPCR chosen by Veronica completes 40 cycles in around two hours.
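
The protocol excerpt above gives only ranges, so the thermal profile sketched below uses values typical of SYBR Green programs (steps between 50 and 95 degrees Celsius, lasting from 15 seconds to 10 minutes, over 40 cycles); it illustrates how such a program is structured and is not the Centre’s actual settings.

    # A hypothetical SYBR Green cycling program with typical (not the Centre's) values.
    HOLD_STAGES = [
        {"temp_c": 50, "seconds": 120},  # pre-incubation
        {"temp_c": 95, "seconds": 600},  # polymerase activation / initial denaturation
    ]
    CYCLING_STAGE = {
        "cycles": 40,
        "steps": [
            {"temp_c": 95, "seconds": 15},  # denaturation
            {"temp_c": 60, "seconds": 60},  # annealing/extension, with data collection
        ],
    }

    hold_time = sum(stage["seconds"] for stage in HOLD_STAGES)
    cycling_time = CYCLING_STAGE["cycles"] * sum(step["seconds"] for step in CYCLING_STAGE["steps"])
    print(f"hold + cycling time: {(hold_time + cycling_time) / 3600:.1f} hours")
    # Roughly one hour of programmed steps; ramping between temperatures brings
    # the total run time closer to the two hours observed at the bench.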

Making Data

Laboratory novices acquire their theoretical familiarity with qPCR from textbooks and coursework but accumulate practical know-how about the method by interacting with the machine on specific research projects in the lab. While many of the technical properties of the device are effectively black-boxed in practice, detailed questions about the apparatus can be answered by consulting technicians, or the methods and application guide published by the manufacturer. Page two of the 260-page manual for Relative Standard Curve and Comparative CT experiments that accompanies the Applied Biosystems 7500 device explains the fundamental principles. Regardless of run or read type, the instrument collects data in three phases. First, there is excitation: the instrument illuminates all wells in the reaction plate and excites the fluorophores in each test tube. Then there is emission: instrumental optics collect the residual fluorescence emitted from each well on the reaction plate, generating an image of light that corresponds to the emission wavelengths. Next, the instrument takes this light image and digitally assembles a new representation of fluorescence, collected over fixed time intervals. A raw image is then automatically stored for analysis by the machine. When the run is complete, the machine uses “region of interest (ROI), optical, dye, and background calibrations to determine the location and intensity of the fluorescence in each read, the dye associated with each fluorescent signal, and the significance of the signals.”

Before Veronica’s session is over, she must intermittently monitor her run and deal with notification alerts given by the machine. When the run is finished, she unloads the plate from the instrument and checks her amplification plots to screen for abnormal amplification patterns, making sure that the relevant values (such as the slope/amplification efficiency, the R2-values/correlation coefficient, and the CT-values) check out correctly. The output from a conventional PCR experiment is an abundance of amplified DNA molecules in the test tube. These can be visualized as a band on a gel using electrophoresis, or compared with a known concentration of a marker and measured using a spectrophotometer, like the instrument known as a “NanoDrop.” The output from qPCR, on the other hand, is information about patterns of gene expression in the different samples in terms of relative levels of messenger RNA. In practice, the most important output value for determining this relationship is the “CT-value” (the “threshold cycle,” or “quantification cycle”—Cq).10 This value refers to the intersection between the amplification curve and a set threshold. The manual describes it as “the PCR cycle number at which the fluorescence level equals the threshold,” a central measurement for further calculations downstream.

The qPCR machine automatically represents its output in plots where the level of fluorescence can be read from one axis of a diagram, and the cycle number from the other. The fluorescence plots for all the samples are first set against a common background level of fluorescence, a step known as “baseline correction.” A threshold level of fluorescence is also set, above the background level but within the exponential phase of the amplification curve; this provides the threshold against which cycle numbers are read. A central feature of qPCR is that the threshold cycle (“CT”) is inversely related to the amount of nucleic acid in the starting sample, so that a lower value indicates a higher concentration of nucleic acid (and vice versa). It is only when the nucleotide concentration has reached this threshold that it is possible to infer anything about the concentration from the intensity of fluorescent light. This also means that the more DNA or RNA template is present in the sample at the starting point, the earlier the CT-value is reached for that sample. Being directly proportional to the number of amplicons generated throughout the cycling process, the fluorescent signal provides the means to assess expression levels.
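
A toy simulation, with invented numbers, may make the inverse relationship concrete: the more template a sample starts with, the fewer cycles it needs to cross a fixed fluorescence threshold.

    # Samples with more starting template cross a fixed threshold earlier (lower CT).
    # Threshold, efficiency, and copy numbers are invented for illustration.
    def ct_value(start_copies, threshold=1e9, efficiency=2.0, max_cycles=40):
        """Return the first cycle at which the idealized copy number exceeds the threshold."""
        copies = start_copies
        for cycle in range(1, max_cycles + 1):
            copies *= efficiency
            if copies >= threshold:
                return cycle
        return None  # threshold never reached within the run

    for start in (1_000_000, 10_000, 100):  # high, medium, low initial template
        print(f"starting copies {start:>9,} -> CT {ct_value(start)}")
    # starting copies 1,000,000 -> CT 10
    # starting copies    10,000 -> CT 17
    # starting copies       100 -> CT 24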

At this stage, the qPCR machine’s software can display different plots for inspection, each with its own characteristics. These plots are usually inspected on the computer in the DNA lab before moving on to further analysis elsewhere. Here, the experimenter looks for reaction curves that might reveal whether something has gone amiss during the run.11 If the curves are acceptable, several further epistemic actions are necessary to secure a useful outcome. Although the machine automatically analyzes the wells, users can choose either to view the results by working directly in the machine’s software package, or to export the data to an Excel spreadsheet. Veronica and her colleagues would often bring these spreadsheets to the undisturbed setting of their personal offices, rather than the communal lab space, to perform further calculations and compare expression profiles with data from other experiments.

In the specific procedure used by Veronica, known as “relative quantification,” users of qPCR normalize the target sample (“gene of interest,” or GOI) to the reference gene, a so-called “housekeeping” gene whose expression level remains constant under most conditions. As we saw, housekeeping genes are usually involved in very basic cellular processes and have been experimentally vetted to be constantly expressed throughout the cell’s lifecycle, thereby providing a baseline for making comparisons across samples. The relative value of this normalization is then compared to a “calibrator” or “control sample.” The resulting differences in CT-values can then be expressed as “fold-differences” that are either “up-regulated” or “down-regulated,” depending on the context.

Although there are several ways to normalize and quantitate qPCR results, depending on what they are used for, Veronica and her peers relied on the “Livak method,” which was colloquially referred to as “the Delta-Delta CT” (ΔΔCT).12 This method is founded on the assumption that the amplification efficiencies of both the gene of interest and the control fragment are close to 100%, and within 5% of each other, so that every PCR cycle doubles the amount of nucleic acid in the test tube.13 Handily, template spreadsheets with ready-made algorithms for calculating the “Delta-Delta CT” were handed over to newcomers by senior peers in the community. These historically accumulated resources could then be adapted to different experimental designs. Here, we see how the mutability and “unfolding variations” of inscriptions allow a scientific community to adapt them to its own particular uses (Kaiser, 2009: 7). Adaptability, not immutability, makes these representations efficacious within the cognitive ecology of the experimental system.

The calculation procedure used by Veronica and her peers had four steps. Here, a simplified example of the computation and its parameters must serve as an illustration:

  • First, the difference between the CT-value of the target gene in the untreated sample and the CT-value of the reference gene in the untreated sample is identified.

  • Next, the researcher must find the difference between the CT value of the target gene in the treated sample, and the CT value of the reference gene in the treated sample.

  • She then calculates the difference between these two values.

  • This difference (the ΔΔCT) is then used as a negative exponent of two, yielding 2^(−ΔΔCT), which provides a measurement of down-regulation of genes in terms of relative, or “fold”-differences. (In the work of Veronica and her colleagues, multiple genes were often tested at the same time, yielding a significantly more complex matrix than the simplified example displayed in Fig. 6.3; a minimal computational sketch of the procedure follows the figure.)

    Fig. 6.3
    figure 3

    An algorithmic-level description of how “Delta-Delta CT” is calculated. This idealized table provides hypothetical values for a treated and untreated condition for a target gene. It highlights the arithmetic operations used to complete the computation. In practice, values are calculated based on averages from several biological replicates, which requires more complex spreadsheets. In Veronica’s experiment we saw that the qPCR protocol advised using at least three replicates
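
To make the arithmetic of these four steps concrete, the following minimal sketch mirrors the spreadsheet calculation in Python. The CT-values are hypothetical, in the spirit of the idealized table in Fig. 6.3, and the function assumes the Livak conditions described above (roughly 100% amplification efficiency for both genes).

```python
def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_untreated: float, ct_ref_untreated: float) -> float:
    """Livak (Delta-Delta CT) method: relative expression of the target gene
    in the treated sample versus the untreated (calibrator) sample."""
    delta_ct_untreated = ct_target_untreated - ct_ref_untreated   # step 1
    delta_ct_treated = ct_target_treated - ct_ref_treated         # step 2
    delta_delta_ct = delta_ct_treated - delta_ct_untreated        # step 3
    return 2 ** -delta_delta_ct                                   # step 4

# Hypothetical CT-values: the treated sample reaches threshold ~2 cycles later
# for the target gene, indicating roughly a four-fold down-regulation.
print(fold_change(ct_target_treated=28.0, ct_ref_treated=20.0,
                  ct_target_untreated=26.0, ct_ref_untreated=20.0))  # 0.25
```

In the lab, of course, the same arithmetic was carried out on averages over replicates in the inherited spreadsheet templates rather than in bespoke code.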

At this point, researchers commonly ran statistical tests on CT-values to determine whether the treated samples displayed significant down-regulation compared to a reference sample. As data from qPCR are seldom normally distributed, meaning that the data points do not form a bell-shaped curve when plotted, I was told that null-hypothesis tests were usually of the non-parametric variety. (Occasionally, values were log-transformed and parametric significance tests applied.) Finally, the representational output from this procedure was a bar graph or boxplot. Here, expression levels, error bars displaying data variability (confidence intervals), and results of statistical significance tests (with a significance level, alpha, usually set at 0.05) could be read from the same graphical representation14 (Fig. 6.4).
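
To illustrate what such a non-parametric comparison can look like, here is a small sketch using the Mann–Whitney U test from SciPy on hypothetical CT-values; the actual tests, software, and replicate numbers used at the Centre may well have differed.

```python
from scipy.stats import mannwhitneyu

# Hypothetical CT-values from biological replicates (higher CT = lower expression).
ct_control = [20.1, 20.4, 19.8, 20.2, 20.0]
ct_treated = [22.6, 23.1, 22.3, 22.9, 23.4]

# Two-sided non-parametric test of whether the two groups differ.
stat, p_value = mannwhitneyu(ct_control, ct_treated, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.4f}")  # p < 0.05 would be read as significant
```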

Fig. 6.4
figure 4

Bar graph rendered by the author, based on a working spreadsheet exemplifying relative expression levels as a “fold-difference” in RNAi-treated adult lice. In this time series, measurements were made 3, 14, and 17 days post-injection (“dpi”). The first bar (3 dpi) shows a 0.37-fold expression compared to the experimental control (normalized to a “1-fold” expression). The second (14 dpi) shows a 0.18-fold expression, while the third bar (17 dpi) shows a 0.05-fold expression, compared to the control. Results from tests of significance were occasionally placed on the bar chart to add information. This graph is based on a different experiment than the one performed by Veronica, but the general principle applies
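
As an illustration of how such a bar graph can be rendered programmatically, here is a minimal sketch using the fold-differences listed in the caption above; the labels are placeholders, and error bars and significance marks are omitted for simplicity.

```python
import matplotlib.pyplot as plt

# Fold-differences from the caption (control normalized to a 1-fold expression).
timepoints = ["control", "3 dpi", "14 dpi", "17 dpi"]
fold_change = [1.0, 0.37, 0.18, 0.05]

plt.bar(timepoints, fold_change)
plt.ylabel("Relative expression (fold-difference)")
plt.title("RNAi-treated adult lice, target gene vs. control")
plt.savefig("fold_difference_bars.png")
```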

These representational outputs from qPCR were an important source of evidence when considering claims about the effects of RNAi-induced gene silencing, and for making causal inferences about gene function. Together with morphological, and other sorts of molecular evidence, scientists at the Centre could use these to evaluate which genes were reliably silenced by RNAi, and the potential for investing more research in specific candidate targets. In the case of Veronica’s qPCR experiment, the data turned out to be ambiguous. While she initially thought she had come across an interesting phenotype, later analysis showed that several experimental confounds were in play, such as the presence of a viral pathogen in the samples that caused doubts about previous interpretations of lice morphology. After laboriously cross-checking her results, Veronica concluded that these candidate genes were not worth pursuing further and that resulting phenotypes from the RNAi experiment could not conclusively be attributed to an interference response. In the time ahead, she would continue her research on fibronectin domains by performing new rounds of RNAi experiments and qPCR measurements on other genes from her list.

Making Meaning: Image Schemas, Conceptual Blends and Material Anchors

In the ethnographic descriptions above, we saw how Veronica’s accomplishment of qPCR was afforded by chains of interaction with a number of “substrates” in the laboratory. Through reuse, decomposition, and transformation, these helped her to see patterns of gene expression. By substrate, I follow Goodwin and refer to the use of material and conceptual resources in the laboratory as a point of departure for building subsequent epistemic actions (2013: 11). These substrates were not just a context for Veronica’s actions but constituted a “semiotic landscape” for meaningful experimental work. In this section, I draw on theoretical resources from the distributed framework to scrutinize some ways in which qPCR emerges as a significant cultural achievement made possible by the material and social organization of the laboratory space as a cognitive ecology. What are the cultural practices that enable budding scientists like Veronica to wield artifacts in an epistemically productive way? To answer this, we must first review key developments in the study of meaning construction.

A key component of our capacity for meaning-making and reasoning about complex matters is a collection of basic “image schemas” based on how our bodies are constituted, which Turner describes as “skeletal patterns that recur in our sensory and motor activity under experience” (2003: 147). Evolutionarily speaking, image schemas derive from the fact that our primate bodies are positioned and act in three-dimensional space. They are “condensed re-descriptions of perceptual experience for the purpose of mapping spatial structure onto conceptual structure” (Oakley et al., 2010: 215). Image schemas are not fixed and static “pictures in the head,” but flexible and dynamic activity structures representing different types of content. They are composed from spatial primitives through a process of schematic integration with non-spatial elements. Complex image schemas can be constructed on the basis of simpler ones by combining, superimposing, specifying, and elaborating them. Through these prelinguistic, embodied image schemas, our species can draw on structures in sensory and motor modalities to make sense of abstract domains and infer the properties of very different entities, extending to higher-level mappings such as conceptual metaphors.

As products of embodied interaction, image schemas are exemplified by my own perceptions as they appear while I write this paragraph, sitting by my desk. Looking down on my feet I experience vertical orientation through a plane of reference running through my body’s middle. Turning my head to each side provides a distinct sensation of a front and back, as well as two mirrored, opposing, lateral sides that I conventionally describe as right and left. Fingertips, arms extended, seem more distant from my body than my shoulders. I grasp the pen knowing that my right hand is more dexterous than my left and enact movement through space by rising from the chair, stepping forward. Moving through the room, I experience my body as a trajector in an enclosed container. All of this is enabled by asymmetries in my body plan and the world, together creating spatial contrasts. These contrasts are powerful drivers of human reasoning.

Image schemas based around such embodied interactions inform both concrete and more abstract concepts. Not least, they underpin a variety of creative practices, such as science and mathematics (Lakoff & Núñez, 2000).15 Higher-order concepts become meaningful via metaphoric expansions of familiar image schemas derived from mundane somatic examples, like bodies positioned in space, manipulations of objects, and perceptual engagement with things (Oakley, 2010: 215). Conceptual metaphor theory argues that metaphoric thoughts arise by structuring one domain, a target, with elements from a different domain, the source. A familiar example from the history of biology is the conceptual metaphor a heart is a pump.16 When William Harvey published his treatise on heart action and how blood moved through the body in 1628, he invoked the mechanical pump as his guiding metaphor. Properties of the source domain (pump), in this case a mechanical device with the ability to transport liquids to or from inaccessible places, could be transferred to the heart muscle as the target domain. This, in turn, offered a heuristic scaffold that highlighted similarities and differences between hearts and pumps, making it possible to explore questions about pressure, circulation speeds of fluids, and so on. Understanding these aspects of pumps, however, depended on much simpler image schemas of patterned movement through space, force, displacement, containers, trajectories of motion, and kinesthetics. Here, basic image schemas become templates for the superimposition of perceptions that mediate between experiences and our experiential representations. Interventions against salmon lice, for example, have often been framed through a conceptual metaphor of war: farmers talk about “winning the fight against salmon lice,” and scientists talk about drug resistance as an “evolutionary arms-race.”

Conceptual metaphors can be seen as special cases of a more powerful and ubiquitous process of human imagination that Fauconnier and Turner call “conceptual integration networks,” or simply “conceptual blending” (1998). This idea is based on the insight that the background resources required for meaning construction are underspecified by grammar. Here, the proposed cognitive mechanism is the projection of selected elements from two different input domains into mental spaces, where a cross-space mapping composes a generic, shared mental space that enables a dynamic “blend” of features. In this view, mental spaces are conceptual packets constructed by various frames and cognitive models through thinking and talking in ways that afford local understanding and action, where novel structure and features can arise according to the logic of the input spaces.

While conceptual metaphor theory is well equipped to account for entrenched structures of meaning held stably in long-term memory, blending theory better explains the structure of short-lived, local mappings for information integration generated in working memory, on the fly, in various creative practices. As a basic mental operation that constructs partial matches between two inputs and selectively projects them into a novel, emergent structure, blending produces new insight that can be co-opted by memory, aiding both the construction and manipulation of meanings across domains of human experience (Fauconnier, 2001: 2495). This process of conceptual integration produces a continuum of mechanisms for meaning-construction that unifies apparently disparate cognitive phenomena like categorization, analogy, metaphor, logical frames, and grammatical constructions under one account.17

In its simplest form, as represented in Fig. 6.5, a conceptual blend or integration network is composed of two mental spaces that are cross-space mapped to their counterparts based on similarity judgments, providing partial input to a generic space. This generic space can later become a resource for building new integration networks. The blend itself constitutes a fourth mental space into which the two inputs are selectively projected to preserve certain features and compose new, emergent structures.

Fig. 6.5
figure 5

Left: adapted from original notation by Fauconnier and Turner (1998: 143). Circles represent mental spaces. The generic space is made of structure belonging to both input spaces. Solid lines define cross-space mappings of counterpart connections between the two inputs. Dotted lines indicate connections between the input spaces and the other spaces. In the blend, structures from the input spaces are run together. This creates novel structure through selective projection from the inputs (not all inputs are projected into the blend). Novel structures are represented with a square with additional dots in the BLEND. Right: Hutchins (2005) introduces a new notation for conceptual blends with a material anchor as one of the input/source domains, marked by a square around the mental spaces of INPUT2 and the BLEND. Physical elements in the external world can enter conceptual practices via selective perception and projection

Figure 6.5 (left) illustrates ways that conceptual integration networks come together through mental simulation to create novel meaning. Composition sets up new relations among elements that are absent from the individual input spaces. Completion allows novel structure to be interpreted against a background of cognitive and cultural models, filling in certain missing aspects, patterns, and relations. In elaboration, or “running the blend,” a new structure that is not present in the inputs develops according to the blend’s internal logic. Patterns of activity in one domain can be coupled to another domain through partial cross-space mappings of counterparts in the input spaces, as well as selective projection and creation of emergent structure in the new blended space.18 Resulting from these processes is a compression of entities like time, space, cause-effect, identity, and change into a distinctly species-specific human scale. These make reasoning about complex affairs possible for enculturated and embodied minds. As a cognitive phenomenon, conceptual integration reveals that higher-level conceptual structures, like those accumulated through scientific practices, are composed from intermediate forms, which are in turn supported by more basic lower-level image schemas rooted in embodied experience.

Extending beyond language, conceptual blending also supports the “general and ancient” phenomenon whereby mental and material structure jointly enable and constrain a wide range of cognitive processes (Hutchins, 2005: 1555). By introducing external, material elements into the blended space as an input condition, as seen in Fig. 6.5 (right), new resources can be made available. This affords human cognition stable computational properties and enables new forms of reasoning that are unavailable in more ephemeral, purely conceptual forms. Hutchins calls these phenomena “material anchors for conceptual blends.” By taking seriously the effects of material culture on meaning-making, it is possible to account for many diverse cultural productions, including scientific practices. The notion of a queue, for example, can be produced by combining the image schema for a simple conceptual trajector moving through space, and superimposing it on a row of material elements. As such, the abstract cultural models studied by cognitive anthropologists are not just lodged in individual heads but embodied by the physical structure of material artifacts. In this view, scientific activities form a constellation of cognitive activities on a continuum of practices for meaning construction and knowledge-making (Ellen, 2004; Nersessian, 2010).

Maintaining Conceptual Structure in qPCR with Material Anchors

How do these cultural-cognitive abilities manifest in laboratory benchwork during qPCR? Much of the analytic work in Veronica’s activity system is accomplished with support from machine computation. Some of this advanced instrumentation appears as epistemically opaque black boxes to her peer community. With respect to the Applied Biosystems 7500 machine, the constraints that must be satisfied to execute qPCR and identify expression levels in targeted genes are clearly given by the biochemical properties of reagents in the test tubes, the device’s optical detectors, and assumptions built into the computational transformations that are carried out on digital signals to produce a graphical representation on the monitor. Here, some of the action has been separated from human agency, as “working knowledge” built into the reliable behavior of the artifact (Baird, 2004: 45). However, for the machine to do its designated job of producing useful outputs for the ensuing representational cascade of meaningful measurements of gene expression, Veronica also had to solve a series of spatial problems, drawing on a variety of plastic resources. These related to the ordering of test tubes and their content, as well as to manipulating representations of the tubes in accordance with the internal logic of her experiment. This work was performed in ways that made inputs accessible for the machine, as well as making the outputs meaningful for her own subsequent interpretations of relative gene expression levels in the samples, in light of accepted background knowledge.

Keeping track of representational states and their constraints is a major challenge for any cognitive activity, qPCR included. To reason meaningfully about an object or process, its associated conceptual structures must remain cognitively stable while the object of scrutiny is manipulated and transformed. Many cultural practices solve this problem by using material anchors for conceptual blends (Hutchins, 2005). In the molecular biology lab, the challenge of stabilizing representations by anchoring them in a sea of conceptual and material complexity becomes especially pertinent in the context of handling nucleic acids. The contents of test tubes are invisible to the naked eye and cannot be differentiated visually without using additional resources. Given that the amount of liquid being manipulated on the bench is usually limited to a few microliters, nucleic acids and other biochemical reagents only appear as homogeneous specks of fluid at the bottom of the test tube. No matter one’s level of expertise, the content of these containers looks the same, as there are few clues to tell tubes or well plates apart, except on occasions when dyes are used. Since mixing up samples has disastrous consequences for experimental outcomes, experimentalists like Veronica and her peers are deeply concerned with keeping track of them as they propagate through the pipeline, taking actions that exploit multiple layers of accumulated semiotic and material resources within their cognitive ecology.

To interact with these contents and maintain stable representations of relevant constraints, biologists incorporate meanings and sedimented structures built by coworkers into the organization of their own epistemic activities. One way of tracking items in the world is through the deceptively simple act of labeling something. In Fig. 6.6, we see how Veronica has marked the tube caps with unique inscriptions using waterproof markers. This act of labeling, as a cognitive practice, makes it easier for the agent to later assess and evaluate the state of the world and pick out relevant objects, thereby avoiding contamination or sample mix-ups that would bring the experimental process to a halt. Time, experimental facilities, and reagents are all precious resources in molecular biology.

Fig. 6.6
figure 6

Creating stable representations of phenomena and keeping track of test-tubes in DNAse treatment. Vial racks contain wells for organizing test tubes: rows are marked with numbers, columns with letters. A drawn arrow highlights the superimposed, imagined trajector in space that moves horizontally and vertically across the plate during work. Tubes are organized along the number line with labels. Notice the compartmentalization of reagents into clusters of similar kinds that can be noticed and exploited to accomplish the task. These spatial arrangements simplify perception. Out of view, there is ongoing “cultivated opportunism” on the bench (Kirsh, 1995: 49). Clutter and items are left around to strategically display their affordances in the lab, thereby multiplying chances of “getting something for nothing.”

The photograph in Fig. 6.6 depicts an assembly on the bench from a brief procedure known as DNAse treatment, which I described Veronica engaging in before she synthesized cDNA from her sample of RNA molecules and initiated the qPCR. Here, we see how Veronica has labeled the caps of her test tubes with a sample number, having inserted them in a vial rack chronologically. When looking carefully, however, we see that labeling is not all there is to this process. Additionally, Veronica (like her peers) employed a range of other vehicles to create material and conceptual order in the work. In the picture, a red vial rack contains the original samples, while the other holds samples treated with DNAse. The black box contains special tubes that will be used for the PCR reaction. Here, we see that the experimenter has not merely labeled, but also individuated, the tube containing the DNAse mix and a tube with H2O, to avoid confusing them during pipetting.

When I asked Veronica why she organized her workspace this way, she explained: “it makes pipetting very easy because I can now pipette the same sample many times over.” Reliable qPCR results needed meticulous execution, and Veronica interpreted her actions as aligning with epistemic norms about proper benchwork in the lab, solving a set of practical pipetting problems in the process. Furthermore, this was not just an idiosyncrasy of Veronica’s. Identical strategies for organizing benchwork could be observed among her peers, who accounted for their practices in similar terms. Complementing this insider perspective about how it makes pipetting “easy,” I conjecture that we are not simply dealing with a pragmatic action on the bench, in the sense that it brought Veronica closer to her physical goal. On the contrary, these operations were profoundly epistemic in nature, since they concern the transformation of an informational environment with potentially far-reaching consequences for experimental outcomes. Of particular interest is how Veronica engages in a set of sense-making routines that Kirsh calls the “intelligent use of space” (1995). This was achieved by using the physical space of the bench and her plastic vial racks as material resources to maintain conceptual order for later analytical processes. From a strictly representational perspective, one could misleadingly think that labels would suffice for this task. But not so for researchers who are enculturated into the laboratory. Here, they become capable of projecting conceptual structure onto the world and materializing cognitive processes through physical rearrangements of different media (Kirsh, 2010: 445).

Kirsh observes that we should not see the management of spatial arrangements in our immediate environments as an afterthought, but as an “integral part of the way we think, plan, and behave, a central element in the way we shape the very world that constrains and guides our behavior” (1995: 32). To execute qPCR, Veronica outsourced some of the necessary computational work to her spatial environment, in such a way that the bench, and what it contains, became carefully maintained resources providing a continuous supply of affordances for thinking and action. Here, the Gibsonian notion of affordance is understood as an opportunity: “a dispositional property of a situation defined by a set of objects organized in a set arrangement, relativized to the action repertoire of a given agent” (Kirsh, 1995: 43). Mental representations of test tubes and their contents do not suffice to productively manage qPCR measurements.

In addition to the inscribed labels on the tube caps, the edges of the red vial rack in the picture are also seeded with representational structure in the form of precomputed numbers and letters that encode spatial relations (together forming a coordinate system). During pipetting and downstream processing, these precomputed inscriptions accomplish several things. First of all, they change the task structure and redistribute the workload of pipetting, so that users may read the letters and numbers from the well’s edges instead of counting each one and keeping the count lodged in working memory. Interestingly, Veronica made this artifact somewhat redundant, due to her exploitation of other available ecological structures that she assembled on the spot. Instead of using these fixed values while pipetting her reagents into the tubes, she superimposed a basic image schema, an imagined trajectory moving from left to right, on the physical array of tubes. By imposing this trajector, she effectively projected a queue onto her materials that served as a guide for her future pipetting actions. Thereby, she explicitly encoded information about which tube to operate on next in physical space. When things form a linear pattern, they are predictable, and the agent knows where to look for the next item to complete her material engagements.

Insignificant as they may seem, these accomplishments are crucial for successful experimental results, and made possible by exploiting a broader class of “trajector-based” cultural practices (Hutchins, 2014: 38), a subset of material anchors for conceptual blends. In Veronica’s case, the first input space in the blend contains the imagined trajector, while the second input contains the physical array of tubes. Here, the conceptual order of benchwork necessary to complete qPCR emerges from a composition that effectively creates an action sequence. The blend’s actionable effect is that Veronica can now see a queue of tubes to be serviced in an order that aligns with the experimental design, and not just a line of random objects in space. By completing the blend, Veronica can also reason functionally about which element to service next. This creates more opportunities to reflect and elaborate on her task, such as which tube was used first, which sample goes last, how many she has left before she can take a break, the number of controls, and so on. As Hutchins points out, these simple building blocks have powerful cognitive effects since these questions cannot be answered when lines of objects are simply experienced as lines, and not as trajector-based queues (2005: 1559). Note that the reagents are also clustered in space and bundled together on the array so that they form “equivalence classes” reflecting key properties. This creates an additional memory encoded in local space that helps track the array of samples as they move through the laboratory and get transformed into meaningful measurements of gene expression.

I contend that Veronica’s encoding of samples and their properties in physical space presents us with a fundamental epistemic activity essential for obtaining productive experimental results from the system. While this constellation of resources was locally adapted to the needs of Veronica’s problem-space, material engagements of this kind were ubiquitous in laboratory benchwork at the Centre. These practices are not universal modalities for organizing cognitive work, but situated cultural performances with a history.

Meaning and Measurement on the Benchtop

I mentioned that Veronica, in advance of entering the DNA lab, had created a template design for all her experimental replicates in the RNAi trial on a digital spreadsheet, which is visible in Fig. 6.7B. This template offered an additional solution to the problem of maintaining conceptual and material order in her samples. Its basic structure was inherited from senior predecessors in her community, who had successfully performed qPCR many times before. Veronica then adapted this shared spreadsheet template to her own experimental configuration and printed the modified sheet on a piece of paper, which she brought with her into the workspace of the DNA lab.

Fig. 6.7
figure 7

A spreadsheet acquires epistemic function through ecological assemblies for the intelligent use of space. The artifact functions both as a regulatory representation for distributing experimental conditions and their accompanying inscriptions, and as a “jig” in specific assemblies. a and b show the alignment between the digital interface of the qPCR machine and the spreadsheet prepared by Veronica before entering the lab. c displays how the spreadsheet is used by Veronica to organize an array of reagents as she pipettes her samples into a 96-well microplate before qPCR, according to her experimental design.19 This action was accompanied by “shadow-counting” each step aloud, ensuring further representational stability for the operation. The bottom right picture (d) shows how the representation is physically enacted when setting up the qPCR machine’s interface. Veronica traces each column with her fingers to stabilize the layout while entering the correct values and labels on the interface. In this diagram, cells with sample tubes are highlighted in red, while cells with various experimental controls are green

Initially, this spreadsheet functioned as a regulatory representation that governed the distribution of other representations within Veronica’s ecological assembly, providing long-term structuring of her environment. But as can be seen in Fig. 6.7, the grid that emerged from the spreadsheet also provided a material anchor for subsequent bench interactions with the microplate. Later, this relationship was reproduced on the computer interface. This act preserved and stabilized structural correspondences between the various elements of her experimental design while she was busy labelling the correct input and proper relationships between the samples on the computer. Here, she ensured that the machine’s outputs, like the CT values, would correspond to the correct physical structures and biological material on the microplate. Only then would they become meaningful in relation to the overarching experimental design. This assembly set up a multi-directional informational flow between Veronica and multiple artifacts, whereby each small incremental step in the configuration of elements not only determined the next stage of the task, but also changed the task structure itself (Heersmink, 2015: 585).

By executing these benchtop operations as part of her qPCR experiment, Veronica created an interconnected ecological assembly using artifacts that simplified choices, reduced the complexity of perceptual processes, and removed the strain of internal computation. Together, this helped to maintain conceptual order on multiple levels. Among the simplest constituents of her practice was the individuation of objects, the smallest informational structures possible in this physical space. Next, she used the cultural practice of counting, which can be technically defined as “the coordination of an internally generated sequence of number tags with a partitioning of perceived unitary objects” (Hutchins, 1995a: 138). Maintaining order in the samples as they were handled required Veronica to track a partition as it moved in a trajectory across physical space. Here, it should be noted that the workbench itself limited the array of things that could potentially be noticed and attended to, setting up a physical “frame” for Veronica’s actions.

Again, Veronica mobilized the cognitive strategy of trajector-based conceptual blends in her assembly. By imposing an imagined trajectory on the top of the microplate, as well as the grid constituted by columns and rows on the spreadsheet, new structure emerged on the bench. This compositional technique set up a queue that laid out the order by which fragments should be serviced, handled, and labeled on the computer. Although the 96-well microplates and vial racks were seeded with imprinted numbers and letters along the edges, these inscriptions were again made redundant by Veronica physically encoding the spatial order, as she consecutively partitioned the well plate’s surface by servicing the tubes from left to right, top to bottom. Each tube being serviced thereby marked the position of the next sample in line.

Starting at the top, as seen in Fig. 6.7B, Veronica allocated her first fragment, named R74, in the working order of wells 1A to 1F, and then proceeded to fragment R75, A to F, and so forth. During pipetting, the serviced tubes in the partitioned space were filled with a visible residue of fluids, effectively tagging them as “completed.” But visual inspection did not tell Veronica which fragment was contained in each tube. By aligning the Excel sheet with the microplate and test tubes, she used a graphical representation to place additional constraints on her action space, ensuring that the right substance went into the correct well.
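
To convey the kind of ordering this servicing pattern enacts, here is a minimal sketch that maps hypothetical gene fragments onto plate wells column by column, rows A to F, in the left-to-right “queue” described above. The fragment names and the six-row block are illustrative assumptions, not a reconstruction of Veronica’s actual plate map.

```python
def plate_layout(fragments, rows="ABCDEF", n_columns=12):
    """Map each fragment to one plate column, filling rows top to bottom and
    columns left to right -- the 'queue' imposed on the 96-well plate."""
    layout = {}
    for column, fragment in enumerate(fragments, start=1):
        if column > n_columns:
            raise ValueError("plate is full; start a new plate")
        for row in rows:
            layout[f"{row}{column}"] = fragment
    return layout

# Hypothetical fragments in servicing order (left to right across the plate).
wells = plate_layout(["R74", "R75", "R76"])
print(wells["A1"], wells["F1"], wells["A2"])  # R74 R74 R75
```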

Figure 6.7 shows how the spreadsheet, when orchestrated alongside dexterous hands, micro-pipettes, computers, and other lab equipment, assumes a representational function different from the regulatory one. Veronica effectively uses the spreadsheet as a “jig” (Kirsh, 1995: 37). Jigs are cognitive artifacts that stabilize processes, and they are critically important for expert performance in many domains. In her hands, the sheet stabilizes allocations of reagents and reduces degrees of freedom in the target objects, both during pipetting and when she interacts with the computer setup-wizard for qPCR. Drawing on the vocabulary of Kirsh, we see that her action combines both physical and informational jigging. She plants information in the environment to reduce perceived degrees of freedom, but also litters her surroundings with material impediments that reduce physical degrees of freedom. Her coordination thereby generates representational stability through a series of intermediate, short-term structures so that, finally, each gene fragment can be correctly labelled in the computer interface in advance of running the qPCR analysis.

Successful accomplishment of this will result in the device naming the expression level values for each well in the output file correctly and in accordance with her experimental design, thereby preserving meaningful relations within the experimental constraints for later analysis. Here, we see that spatial structures in the laboratory were not only central for the discovery and commercialization of PCR as a novel biotechnology (Rabinow, 1996: 142), but remain epistemically vital for PCR as an everyday accomplishment, long after it has sedimented into a technical thing in countless laboratories.

Remembering the exact layout of all her eighty-nine fragments on the microplate would be extremely demanding in terms of the necessary internal mnemonic resources. Instead, Veronica opportunistically made that information locally available by continuously consulting the representations on her paper sheet throughout her activity. Orienting this array to her own actions on the spot, she thereby updated the status of her activity system in accordance with the experimental design. At one point, visible in Fig. 6.7C, Veronica even aligned the paper sheet directly with her well plate during pipetting to further reduce the cost of her visual searches, supporting the correct transfer of materials from one location to another. Later, she used her finger to highlight the cell of interest on this grid, facilitating a comparison between sheet, tray, and screen when engaging the software interface on the qPCR machine. Besides using the sheet as a model representation, she also traced its layout with her fingers and verbally counted the units in the array while simultaneously engaging with the computer interface via the mouse to input the correct values and set her experimental settings right. In effect, she did not need to form a complex mental model of the objects of interest (e.g., the experimental design) and store this in memory. Nor did she need to mentally rotate the microplate or perform other demanding computations as she proceeded. Veronica used objects on the benchtop to make the world into its own best model for what she wanted to accomplish, a world that she could easily consult through embodied interactions before engaging in her next course of action.

As representational media, computers have become essential instruments to support reasoning about gene expression. The cultural accomplishment of scientific work like qPCR requires an intercalation of what Michael Lynch identifies as two orders of laboratory activities: the interface between the “opticism” of scrutinizing eyes at work with various epistemically enhancing instruments, and the “digitality” of fingers (digits) manipulating computer interfaces (1991: 61). As Veronica’s actions during qPCR reveal, making sense of nucleic acids, their properties, and complex pathways requires not only skilled manipulation of the computer, but also a precise orchestration of paper representations and other materials, often in parallel. These interactions with material artifacts do not only translate between the world of sight and the world of touch, as Morana Alač reminds us; they afford a permeability between digital realms and the physical task space of concrete actions on the bench (2011).

The case of qPCR also highlights how cognitive artifacts simultaneously take on “representational” and “non-representational” functions in scientific practice (Heersmink, 2013).20 Representational artifacts contain informational structures about the world. They accomplish cognitive effects through C. S. Peirce’s familiar triad of icon, index, and symbol. While icons create isomorphisms between the representation and what is being represented, indexicality relies on causal connections between an index as a representation and the represented object. Many artifacts also take on symbolic functions, based on representations whose meanings derive from conventional arrangements and shared use. Epistemic enhancers like qPCR, whose purpose is to give measurements of gene expression, achieve their cognitive effects by combining these three semiotic properties. The relation between machine-made curves that display relative gene expression levels and the nucleotide content of test tubes is, for Veronica and her peers, not only isomorphic but also indexical, since the detectors pick out causal properties of increased fluorescence. Additionally, a wealth of symbolic conventions annotates these displays and meaningfully brings together isomorphic and indexical information. Non-representational or “ecological” artifacts, on the other hand, do not contain information about the world, but function as the world. The trajector-based conceptual blends built on a choreography of test tubes, microplates, and other paraphernalia exemplify how the world becomes its own best model through the manipulation of physical space.

The Pedagogy of Ecological Assemblies and Cookbook Biology

Scientific concepts like qPCR manifest through embodied, interlocking practices (Hutchins, 2012), situated in the social and material settings of the laboratory where these concepts are enacted through experimental efforts. Ecological assemblies, like those manufactured by Veronica as she meaningfully enacted qPCR, come together on the spot depending on circumstances peculiar to the task at hand. Knowledge about proper workspace organization, and about correct ways of handling specific artifacts within the experimental system, is part of a corpus of habitual practices instilled in newcomers by senior community members via a complex chain of cultural transmission. Many of these benchtop practices become institutionalized through the Centre’s “hidden curriculum” (Mody & Kaiser, 2008: 382). Besides techniques, these include epistemic norms and values that motivate and guide research on the parasitology of lice. Reproducing this institutionalized knowledge within the Centre’s cognitive ecology counteracts disorder in practice by preserving functional continuities in the experimental system over long timespans. This is achieved by entraining novices to acquire the necessary expertise before more experienced predecessors eventually leave the system (Hutchins, 2012).

At the department where the SLRC was hosted, students of biology underwent rudimentary laboratory training at the undergraduate level and were expected to master a range of practical tasks by the end of their graduate studies. When novices like Veronica joined the Centre, usually during their master’s projects or early in their Ph.D. program, they would train with a laboratory technician to educate their attention and acquire the skills necessary to maneuver efficiently in their research. Experimental expertise was partly defined through the intelligent mastery of the material and spatial surroundings of the lab. One of the first tricks-of-the-trade that novices acquired was the skill of informationally restructuring their work environment, as Veronica did, to constrain the scope of future activity within a focused environment for action. It was not uncommon for members of the community to justify their practices with reference to something they had learned from their predecessors, senior lab members who had had epistemic success with a given practice in the past. Some of these resources were communicated explicitly, some unavoidably emerged from the spatial and temporal organization of the lab, and some were copied and adapted implicitly through participation in the craft. By institutionalizing certain cognitive practices, the experimental system could remain robustly organized in the face of individual variability.

Many of the critical skills necessary for benchwork cannot be transferred propositionally but must be acquired through repeated performance. A most critical competence for molecular biologists in the DNA lab was mastery of the micro-pipette, the device Veronica used to transport small amounts of reagent and biological matter while working at the bench. Manual control of the micropipette was rehearsed during early training sessions, often under the supervision of a senior, and we saw that pipetting is always performed in orchestration with other artifacts within the lab’s cognitive ecology. At the microlevel of material engagements, the ability to pipette correctly is cultivated through incremental and gradual coordination between hands, pipette, eyes, and an assortment of supportive tools, through repeated motor routines which over time produce the skilled laboratory worker. While ostensive instruction plays a role in instilling first principles about how pipetting should be executed, the acquisition of expertise depends on a significant portion of reinvention and entrainment that instills in practitioners the capacity to create the kind of ecological assemblies I have described above. With reference to Clifford Geertz’s notion of “local knowledge,” the historian of science Heinz Otto Sibum has introduced the term “gestural knowledge” to account for such complexes of skill and mastery, which are inevitably developed in real-time performances of experimental benchwork (1995: 76). Micro-pipetting, for instance, required intricate Fingerspitzengefühl, fine-tuned gestural knowledge concerning performances such as:

  • Choosing the right pipette for the job (generally, one should always use the smallest pipette possible to handle the volume, since accuracy decreases when smaller volumes are handled with larger pipettes).

  • The ability to correctly hold the pipette in hand and set its adjustable volume.

  • Maintaining the smoothness of “plunger” action, which requires tacit familiarity with the level of resistance exhibited by the “plunger” under different conditions.

  • Correct immersion of disposable sterile plastic tips when drawing in liquids from samples or reagents.

  • Properly coordinating the pipette with the receiving tube.

  • Having a “feeling” for the relative viscosity of different solutions.

  • Making routines for changing pipette tips between new liquids.

Adaptive use of the plunger, the lever sitting atop the pipette, exemplifies the dexterous complexity of the task. Plungers stop at two different positions when pressed. A first point of resistance corresponds to the loading volume; the user presses to this stop and inserts the tip gently into the liquid to be extracted, just deep enough to cover the instrument’s tip. The plunger is then released, and the content is drawn into the tip from the container. Following this, the pipette is transferred to a receiving vessel, where the user presses the plunger all the way to the second point of resistance. This discharges the last drop of liquid. Subsequently, the tip is withdrawn, but without releasing the plunger, and the plastic tip is ejected using a special button over an appropriate waste bin before a new tip is pressed onto the pipette from a neatly arranged box.21 At first, the pipette is opaque and requires strenuous concentration to wield properly. But over a period of habituation, the device may become “transparent equipment,” a seemingly natural extension of the body that effortlessly dovetails with the sensory-motor system of the skilled user (Clark, 1998: 38).

Skilled practitioners must also learn to create downstream corrigible control systems to monitor proper execution of their own pipetting tasks. The sources of variation for a given qPCR experiment are not limited to biological samples alone, since actions like pipetting can potentially introduce major technical sources of variation. Depending on the performer’s technique, tubes may end up with slightly different amounts of reagents, or nucleotide template, which has cumulative effects downstream in the pipeline when the qPCR reaction takes place. As Veronica herself reported, neat organization of the bench through the intelligent use of space presented one way of counteracting such disorder. But we also saw how Veronica set up technical replicates to help with error checking, as the protocol suggested use of three such replicates.
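
One simple way in which technical replicates support this kind of downstream error-checking is to compare the CT-values within each replicate triplet and flag suspicious spreads. The sketch below is my own illustration of the principle, with a hypothetical tolerance, not the Centre’s actual quality-control routine.

```python
def flag_inconsistent_replicates(ct_values, tolerance=0.5):
    """Return True if the spread of CT-values across technical replicates
    exceeds the tolerance (in cycles), suggesting e.g. a pipetting error."""
    return max(ct_values) - min(ct_values) > tolerance

# Hypothetical triplets: the second contains a suspiciously deviant well.
print(flag_inconsistent_replicates([20.1, 20.2, 20.0]))  # False
print(flag_inconsistent_replicates([20.1, 21.4, 20.0]))  # True
```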

Before concluding this chapter, we must attend to a final, conspicuous piece of enabling material culture in Veronica’s workflow, known as a “kit” among biologists. Kits, which figured in the periphery of the ecological assemblies described above, refer to a collection of epistemic and cognitive artifacts peculiar to the craft practices of benchwork in the molecular biosciences. Kits are functional systems based around three constituent parts (Weiner & Slatko, 2008: 701). First, the kit contains one or several reagents with various input materials. Second, it contains instructions that guide researchers in performing biochemical reactions on said materials. Third, the kit transforms the input materials in a way that creates similar outputs, as long as the input materials are identical. Everyday experimental biology, of the kind performed by Veronica and her peers in the DNA lab, is premised on the mastery of a wide range of what Walter Gilbert has laconically described as “cookbook techniques” (1991), which are based on the cultural availability of commercial kits as a pedagogical resource. Gilbert observed that graduate students in the early 1970s had to labor hard to make their own restriction enzymes, proteins that cut DNA at specific sites in the nucleotide sequence. By 1976, these enzymes could be purchased in standardized form from the sales catalogues of biotech companies. Today, very few molecular biologists know how to make restriction enzymes, and knowledge about these reagents, along with many other molecular techniques, is managed by a small number of specialized enterprises providing services to the global research community.

Nowadays, kits range from very simple assortments of reagents bundled together to highly complex setups, with the most advanced kits enabling whole-genome sequencing. But the use of kits, or “systems” as they were originally called, was hotly contested at first. One reason for the controversy over these epistemic artifacts was that their “cookbook” nature effectively black-boxed many scientific practices which, in the past, newcomers to molecular biology would have had to master to be recognized as competent practitioners. One concern was that students would no longer be able to make sense of their own experiments, since kits would render learning about foundational biochemical principles in laboratory work obsolete. Today, it appears that the epistemic benefits of speed, convenience, and experimental control have outweighed the arguments of the critics, as progress in all fields of molecular life science has come to depend on kits (Fig. 6.8).

Fig. 6.8
figure 8

Rapid Amplification of cDNA Ends is a method for obtaining full-length sequences of cDNA. An enzyme, reverse transcriptase, is used to reverse-transcribe mRNA into cDNA before segments are amplified and sequenced. The figure shows the unboxing of a commercial kit from Sigma-Aldrich (Merck) for the 5’ RACE reaction. The kit contains twelve standardized components that suffice for ten reactions

In practice, kits and the recipes that accompany them are put to use in a variety of functional systems in the laboratory, such as RNAi and qPCR. But as Lynch and Jordan have remarked, laboratory protocols seldom provide their users with complete and exhaustive specifications of what is sufficient and necessary for successful performance (1995). Novice experimentalists must therefore rely on non-codified, situated knowledge, derived from other members of the community, to accomplish many central benchtop activities. Since not all these artifacts are informationally and procedurally transparent, there must also be widespread epistemic trust in the justification of “dispositional” beliefs concerning these complex technologies in the extended peer community. If necessary, these can be mobilized to give a precise scientific account of the how and why of a given technology. And while kits are shortcuts that outsource parts of cognitive and physical labor through time and space, they do not substitute for technical competency altogether. At the SLRC, for example, it was primarily senior laboratory engineers who had recognized expertise in the selection of kits, and who advised lab members about augmenting them in appropriate ways. Some reactions, for instance, could yield adequate results using smaller amounts of expensive reagents than suggested by the protocol, thereby extending a costly kit’s longevity.

A key epistemic feature of kits is their standardized nature, which ensures a level of quality without the need for labor-intensive control routines. Kits also embody a principle of modularity that underlies many practices in contemporary molecular biology. Modularity, according to Bradd Shore, “virtually defines the cognitive landscape of modernity” (1995: 117). While the adaptive benefits of modularity can account for the durability of natural forms of modularity, modularization is a pervasive design strategy that breaks complex cultural wholes into elementary constitutive parts that in turn can be recombined in a range of patterns. As a foundational schema for modern manufacturing, the modular strategy embodies values like flexibility, efficiency, and control. These values are highly regarded in the “Fordist” data-production regimes of contemporary biology (Stevens, 2013).

Traces of modular design are abundant in the laboratory practices of biologists. As at many other universities today, the Centre relies on gene sequencing services offered by a “core facility” at the host university, which is operated by specialized, dedicated personnel. Veronica and her colleagues regularly handed over test tubes with nucleic acids to the shared Sequencing Centre on the 5th floor of the high-technology Centre. A few days later, they would receive an email with a file they could open on their computers to visualize the nucleotide sequence belonging to their gene of interest. These practices are effectively kits “writ large,” which outsource cognitive labor and put additional distance between scientists as epistemic agents and the methods they depend on (Weiner & Slatko, 2008: 702). Here, the modular nature of social and technical practice makes it possible to distribute cognitive tasks beyond any one particular workbench and experimenter to originate new ideas and meanings in the laboratory.

Conclusion

This chapter has closely examined the tool-saturated environment of the DNA laboratory at the SLRC. Focusing on Veronica’s execution of qPCR, a quintessential method for learning about gene expression patterns in salmon lice, it has explored how this space is constituted materially and semiotically. I showed how meanings are construed by attending to activities at the microlevel of material engagement that, at first glance, appear epistemically trivial. Closer scrutiny reveals these as central for epistemic success.

Once more we have encountered how epistemic enhancers in the lab extend cognitive abilities far beyond the normal sensory range of human beings. Theories about gene expression and the biological properties of nucleotides are built into objects like qPCR machines and kits. But these devices do not work purely through an “instrumental objectivity” where human judgment has been removed and where the scientific object speaks alone, with human agents only as passive witnesses (Baird, 2004: 191). Rather, such enhancers are softly assembled into new ecological assemblies by canny users to become critical infrastructures for exploratory efforts. For the qPCR machine Veronica used here, there are nine different instructional booklets available. Additionally, there are dozens of available tutorials for specialized experiments on the device, such as genotyping, presence/absence experiments, and standard curve experiments, as well as various reagents and their protocols, each with their own product number. A tech-support hotline and a software help-package address any issues that may appear while engaging with the instrument. Using each of these materials to solve scientific problems requires new constellations of resources to be assembled on the spot.

Intuitively, cognitive artifacts may appear as pre-given, isolated objects in the problem-solving environment. This ethnography, however, shows how material practices throughout the experimental system’s pipeline integrate resources with different properties in powerful ways to scaffold scientific thinking, and create new representational structures in the process. My interactional analysis of how Veronica executed qPCR demonstrates a powerful role for materiality in the “descent of meaning” (Turner, 2003: 139). In the humdrum of mundane laboratory activity, we see how the construction of material anchors for conceptual blends, through the use of image schemas and the intelligent use of space, contributes to the production of novel biological insights about what genes do. “Superpositioning” of material structures on the bench to create order (Hutchins, 2012: 318–319) plays a central role in facilitating “conceptual sex” (2003: 140), the process whereby parent meanings come together, recombine, and beget offspring in the form of new structures of meaning.

Performance of qPCR is an interplay between physical, social, and conceptual elements, but the source of the observed organization in the activity was not simply lodged in Veronica’s head. It emerged from the larger cultural-cognitive ecosystem. Knowing everything there is to know about the brains of young scientists like Veronica would still not be sufficient to explain her epistemic accomplishments. Ethnographic studies of these dimensions of laboratory practice offer clues about the representational structure of her activity, which in turn provide insight into the informational properties of the larger system and its emergent cognitive functions. Parts of this problem-solving environment were pre-made, like the structure of Veronica’s pipette and reagents, test tubes and microplates, the machine and its computer software. These were put to creative use by the canny cognizer on the fly to create tailor-made affordances for actions that exceeded the properties of a handed-down material culture. In the end, the many representational and physical transformations undertaken by Veronica in the above would eventually be integrated to produce an output in the form of a few single values that enabled further meaning-construction about biological entities. This was a baseline for decision-making about functional questions like “what does this gene do?,” “was the RNAi successful?,” or higher-level questions such as “is this a good vaccine candidate?” Since Veronica reported that the particular gene described in the above events did not merit further pursuit, other explorative screening experiments would come to fruition in the future.

The availability of modular equipment and modular practices enables progress in contemporary molecular life science. Here, purely generic systems are few and far between; universally standardized artifacts become accommodated and assembled to fit specific organisms and experimental designs. qPCR offers a telling example, as the method has now expanded into medical diagnostics and become a staple of fish health science and veterinary services. Fish health biologists and veterinarians in aquaculture now routinely use qPCR to detect and diagnose disease in fish. The technology has become so widespread that even fish farmers with little training in molecular biology and biotechnology have been envisioned as potential users of the method. In 2014, for example, the company Europharma advertised a new device known as the Genesig Q16 to salmon producers. Manufactured by Primerdesign Ltd., this small and cheap qPCR machine was heralded as a potentially revolutionary instrument. Originally designed for testing consumables, infectious diseases, and biohazards, and for veterinary applications, the device, which comes with standardized kits for more than 500 applications, has been projected to play a role in the future of fish health diagnostics. With this device, the laboratory could be brought directly to the tissue samples, rather than the other way around.

As Arthur Kornberg, who received a Nobel Prize for his studies of DNA polymerase, once said: “when sophisticated instruments and fine biochemicals become commercially available and affordable, research is extended a thousand fold” (quoted in Rabinow, 1996: 30). This statement can be read as a testament to the power of ecological assemblies for human cognitive flourishing. When qPCR is transported from the lab and into the wild, users will surely find new ways of creating representational and conceptual stability to reason about target domains. How this happens without the infrastructure of the laboratory raises interesting questions from the view of distributed cognition, but it is far beyond the scope of this study.

The material practices examined in this chapter are powerful cultural ratchets. Cognitive ethnography helps us notice phenomena that would be partly invisible to the analytical toolkit of a cognitive anthropology that sees mind and knowledge as contained by skin and skull. While communally shared cultural models provide one source of representational stability, I have used the distributed framework to highlight other sources for creating new knowledge and insight. When this view is adopted, it is clear that we cannot do without the notion of representation in the study of meaning-making and knowledge-production, in contrast to some anthropological proposals (Ingold, 2000; Toren, 2012).22 But in recognizing the centrality of representations in the social production of knowledge, I do not suggest a sole focus on disembodied, symbolic, mental representations lodged “in the head.” Instead, we must refine and re-specify our conception of representation in a way that recognizes the centrality of material engagements and allows us to recast the boundary of minds to consider what happens outside the individual agent. On this matter, I concur with Malafouris’ diagnosis that “the science of mind and science of material culture are two sides of the same coin” (2013: 13).

In the final chapter, I direct the “Cognito-scope” toward the practice of collaborative microscopy. While some specimens from RNAi trials ended up on RNAlater, others were placed on “fix” for further processing through visual inspection. Here, we will pursue the question of how scientists see meaningful biological complexity in lice tissues with the help of a microscope, among other things.

Notes

  1.

    A long-standing debate concerns levels of analysis in the study of “difference-makers” like genes (Godfrey-Smith, 2013: 89). “Classical genetics” and the “modern synthesis” of evolutionary biology treat genes as abstract hereditary units (“factors”), using tools like linkage maps to study their positions on chromosomes and calculate recombination frequencies of inherited traits. This idealization was not based on biochemistry or the information-bearing role of molecular structure. In contrast, molecular biology “de-particlizes” genes, treating them as macromolecular sequences of nucleotides whose transcription and translation are regulated by factors organized on the scale of genomes. In biological practice, these conceptualizations productively co-exist.

  2.

    Nersessian offers a useful ontology of laboratory artefacts (Nersessian, 2006: 131). “Devices” are engineered facsimiles used as in vitro models and sites of simulation; “instruments” generate measured output in quantitative or graphical form; and “equipment” assists with manual or mental labor. In my examples, artefacts functionally cut across this classification.

  3.

    For a general introduction to bioinformatic tools for sequence translation, see http://www.ebi.ac.uk/Tools/st/.

  4.

    Terms like “qPCR” and “real-time PCR” are used inconsistently. Here, I describe the latter, which uses RNA that is reverse transcribed into cDNA as a starting template. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines suggest the abbreviation RT-qPCR for this kind of experiment. To ensure ethnographic fidelity, I refer to this procedure as “qPCR.”

  5.

    Therefore, answering the question of “who invented PCR” is hard, despite Kary Mullis winning the Nobel Prize in 1993 for his contribution. The story of PCR is too complex to elucidate here; as Rabinow’s informant quipped about the messy affair: “Conception, development and application are all scientific issues - invention is a question for patent lawyers” (Rabinow, 1996: 6).

  6.

    dNTPs are molecules made of a deoxyribose (or, in NTPs, ribose) sugar covalently bound to a nitrogenous base; together the sugar and base form a nucleoside, which in a dNTP is bound to three phosphates at its 5-prime end. A nucleoside with one or more phosphates connected to its 5-prime end is called a nucleotide, and such molecules are technically named as nucleosides with a suffix describing the number of attached phosphates (e.g., mono- or triphosphate).

  7.

    The method relies on a principle called “fluorescence resonance energy transfer” (FRET). The Molecular Probes Handbook from ThermoFisher Scientific, a supplier of scientific instruments, describes FRET as: “a distance-dependent interaction between the electronic excited states of two dye molecules in which excitation is transferred from a donor molecule to an acceptor molecule without emission of a photon” (Thermo Fisher Scientific: the molecular probes handbook, 2017).

  8.

    An alternative type of qPCR is an “endpoint semi-quantitative PCR,” where data is collected at the end of the amplification reaction, and where the template content is measured by back-calculation.

  9.

    Commenting on a draft of this section, one researcher remarked: “The DNA polymerase translates RNA to DNA, but we don’t know if there was DNA in the sample before (in case DNAse treatment didn’t work sufficient). In that case we would get a wrong fluorescence signal, […] a mixture of the real signal from RNA and wrong signal from DNA. To avoid a wrong signal, we usually, if possible, also design primers in a way that they span over the exon-intron border.”

  10.

    Absolute and relative quantification are the two main analytical approaches supporting RT-qPCR. Absolute, or “standard curve,” quantification calculates the amount of template in a sample (e.g., for estimating viral load). This description concerns relative quantification against a control sample, as my informants were comparing the results of an experimental condition with a baseline control.

  11.

    Problems are identified by evaluating plots of variables in the experiment for outliers, atypical or irregular amplification, threshold values, and faulty baselines. The plots and their meanings are specified by the qPCR machine’s user manual.

  12.

    Δ is the symbol for delta, meaning “difference.” The “Livak method” is named after the first author of “Analysis of Relative Gene Expression Data Using Real-Time Quantitative PCR and the 2^−ΔΔCT Method,” published in the journal Methods (Livak & Schmittgen, 2001), one of the most highly cited papers in the history of science.
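
    To make the arithmetic concrete, here is a minimal sketch of the Livak calculation in Python; the function name and the Ct values are invented for illustration and are not data from the experiments described above.

```python
# Minimal sketch of the Livak (2^-ddCt) relative quantification, with invented values.

def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """Relative expression of a target gene, normalized to a reference gene
    and compared to a control sample (Livak & Schmittgen, 2001)."""
    d_ct_treated = ct_target_treated - ct_ref_treated   # delta Ct, treated sample
    d_ct_control = ct_target_control - ct_ref_control   # delta Ct, control sample
    dd_ct = d_ct_treated - d_ct_control                 # delta delta Ct
    return 2 ** (-dd_ct)                                # fold change relative to control

# Invented Ct values: target gene in an RNAi-treated animal vs. a control animal,
# both normalized against a stable reference ("housekeeping") gene.
fold = ddct_fold_change(ct_target_treated=26.0, ct_ref_treated=18.0,
                        ct_target_control=23.0, ct_ref_control=18.0)
print(f"Relative expression: {fold:.2f}x of control")   # prints 0.12x
```

    A fold change well below 1, as in this invented case, would be read as a knockdown of the target gene relative to the control.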

  13.

    Lab workers occasionally ran a “standard curve” experiment alongside the variety described here, to account for deviations in the reaction’s efficiency. This is done by serially diluting the template and checking how the actual efficiency compares to an idealized 100%.
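
    As an illustration only: one common way of making this comparison, not described in my fieldnotes, is to fit Ct values against the logarithm of a serial dilution and derive efficiency from the slope. The sketch below uses invented values.

```python
# Sketch of estimating amplification efficiency from a dilution series (invented values).
import numpy as np

dilutions = np.array([1, 0.1, 0.01, 0.001])   # serial 10-fold dilutions of the template
ct = np.array([18.1, 21.5, 24.9, 28.3])       # measured Ct values (invented)

# Fit a line to Ct versus log10(dilution); efficiency follows from the slope,
# where 1.0 corresponds to the idealized 100%.
slope, intercept = np.polyfit(np.log10(dilutions), ct, 1)
efficiency = 10 ** (-1 / slope) - 1

print(f"slope: {slope:.2f}, efficiency: {efficiency:.0%}")   # about -3.40 and 97%
```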

  14.

    A significance level of 0.05 means that a result is treated as statistically significant when the probability of obtaining the observed result, or a more extreme one, is below 5%, given that the null hypothesis is true (usually that there is no difference between treatments).

  15.

    This view contrasts with “the romance of mathematics”; a belief in mathematical Platonism, where the structure of mathematics is conceived as existing independently of minds (Lakoff & Núñez, 2000: xv).

  16.

    Following conventional notation, I write analytical concepts like image schemas, conceptual metaphors, and blends in small caps.

  17.

    Four prototype integration networks have been proposed (Fauconnier, 2001). Simplex networks take one input as a frame (schematic knowledge like “buying groceries”) and use specific elements in the other to fill roles in that frame. In mirror networks, the spaces share a common organizing frame. Single-scope networks take inputs from different organizing frames, but the blend inherits only one of them. Double-scope networks use identity properties and essential frames from both inputs to resolve clashes between fundamentally different inputs.

  18.

    Conceptual blends follow optimality principles. A blend must be integrated as an event that can be operated on as a uniform unit, and the input spaces and their elements must match their respective counterparts in the blend. Manipulating a blend must also maintain its web of connections to the inputs and facilitate unpacking, so that users can meaningfully understand the connections to other elements in the network.

  19.

    This formatting differs from the paper sheet used at the bench, due to the use of different software for reading the original file provided by Veronica. Structural relations between elements are identical.

  20.

    Heersmink distinguishes between “technology,” as intentionally made physical objects, and “technique,” which comprises skills, methods, and procedures for doing (2013: 468). While both are “artificially” developed by humans, only the former class constitutes physical objects. Techniques are internalized through enskillment (although people may rely on external instruction for complex actions). Heersmink suggests that natural objects adopted for cognitive goals constitute a separate class of “naturefacts.”

  21.

    Pipettes are calibrated at regular intervals to maintain their accuracy.

  22.

    Toren, for example, mistakenly writes off distributed cognition as “dualist” and “ahistorical” (2012: 36).