Introduction

On 6 February 1992, the discovery that an expanding trinucleotide repeat was the mutation that lay behind myotonic dystrophy made the front page of the New York Times. The article noted that neurologists had been seeing a pattern of inheritance in myotonic dystrophy where the disease seemed to worsen over succeeding generations, but that these observations had been dismissed by most geneticists, at least until the discovery of this new and unusual form of inheritance (Kolata 1992). The discovery of this new form of unstable mutation, first in fragile X syndrome (1991) and X-linked spinal and bulbar muscular atrophy (1991) and then in myotonic dystrophy (1992), excited the scientific and medical communities and paved the way for the rehabilitation of the concept of anticipation in hereditary disease. An examination of how anticipation was accepted or rejected in various locations and times over the course of the last century reveals some of the complex interactions that occur between science and society.

Precursor notions

Concern about the hereditary nature of disease is nothing new. Ancient, medieval, and early modern medical thinkers all realized that certain diseases tended to run in families. References to the hereditary nature of disease appeared in the Hippocratic writings (Lloyd 1983), and medieval rabbinical writers developed formal opinions regarding the inheritance of disease and whether those afflicted should be allowed to marry if they risked passing a disease on to the next generation (Rosner 1998). In the seventeenth century, medical dictionaries referred to hereditary constitutional illnesses, including gout, tuberculosis, and kidney stones, which ran in families (Lopez-Beltran 1994). By the eighteenth and nineteenth centuries, physicians were writing manuals that advised individuals to consider carefully whether a prospective marriage partner had any hereditary diseases or predispositions in their family, since such hereditary predispositions were seen as almost impossible to remove or treat (Waller 2001, 2002).

As the nineteenth century progressed, social, political, and economic upheavals in Western Europe contributed to an increasing sense that industrial pollution, adulterated foods, physical illnesses like tuberculosis and syphilis, and social ills like drink and promiscuity were having a negative effect on the populace. For French physicians and psychiatrists in particular, the increasing numbers of the mentally ill were seen as a hallmark of decay and degeneration, and several influential individuals took up the study of theories of morbid heredity as their way to contribute to the betterment of the nation (Dowbiggin 1985; Pick 1989).

The French psychiatrist Prosper Lucas seems to have been the first to note that within certain families there was a tendency for various diseases to manifest earlier in succeeding generations, as opposed to the commonly perceived pattern of inheritance in which a disease manifested at the same age generation after generation (Lucas 1847–1850). This idea was taken up and developed by the influential French psychiatrist Bénédict Morel. Morel noted a pattern of progressive physical, moral, and intellectual degeneration within certain families. What began as a fairly minor affliction in one generation became successively worse over succeeding generations, until by the fourth generation the offspring were sterile or so badly affected that they were unable to have children of their own. He thought that a wide range of environmental and social factors, from famine and industrial poisons to drink and promiscuity, could act as physiological insults that set off this pattern of degeneration within a family line (Morel 1857). He argued that some sort of inborn taint allowed what was a nervous temperament in one generation to develop into hysteria or epilepsy in the following generation, and so on until the descendants were no longer able to reproduce (Morel 1860). This notion of progressive or degenerate heredity became very popular in European and North American psychiatry and medicine over the next half-century, as did the related notion of a hereditary predisposition or diathesis towards a variety of diseases, including mental illness. By the end of the nineteenth century, concerns about degeneration were widespread, appearing not just in the medical literature but also in cultural commentaries like the physician-journalist Max Nordau’s popular book Degeneration (originally published in German as Entartung in 1892), which went through several editions and was translated into a variety of languages (Nordau 1968).

Defining “Anticipation”

The rediscovery of Gregor Mendel’s laws of heredity by Carl Correns and Hugo de Vries in 1900 is generally seen as the watershed moment in the history of genetics. However, during the first decade of the twentieth century three main theories of heredity competed for dominance: the Galtonian theory of ancestral inheritance, Weismannian germ-plasm theory, and the Mendelian theory of unit characters. In the scientific world and within broader society, social Darwinian ideas, eugenics, and neo-Lamarckian notions of heredity also remained popular and influential (Bowler 2003). Demographic changes in the late nineteenth and early twentieth centuries—most notably a decreasing birthrate among middle- and upper-class families and a high birthrate among poorer classes, recent immigrants, and some ethnic and religious groups—shifted the focus of concerns about national degeneration to issues of differential fertility and raised the spectre of ‘race suicide’ (Soloway 1995). The new genetic science was used by members of eugenics movements in Britain (Mazumdar 1992; Kevles 1995), America (Carlson 2001; Lombardo 2008), Canada (McLaren 1990), and elsewhere to link together such diverse traits as pauperism, feeblemindedness, epilepsy, tuberculosis, alcoholism, and criminal behaviour in classes or groups of individuals whose high rate of fertility they sought to control, while they encouraged ‘fitter’ families to have more children.

The British ophthalmologist Edward Nettleship was one of a number of physicians interested in the study of hereditary diseases at the beginning of the twentieth century. Nettleship was unique in that he managed to befriend and learn from people on both sides of the acrimonious dispute then taking place in Britain between supporters of Galton’s biometric model of human heredity and those who supported Mendelian theories (Rushton 2000). He felt that, rather than being mutually exclusive, “there seems to me to be no necessary antagonism between Galton’s Law and a particulate inheritance such as is required by Weismann’s germ plasm and Mendel’s ‘unit characters’.” Nettleship also noted that there were certain patterns of inheritance that none of these theories was then able to explain, including the inheritance of colour blindness, which could not be understood before the discovery of the sex chromosomes (Nettleship 1910) (Fig. 1).

Fig. 1
figure 1

Edward Nettleship (1845–1913). By kind permission of the Royal Society of Medicine

It was Nettleship who coined the term anticipation to describe the pattern of heredity where a disease appeared earlier in succeeding generations. The term had previously been used to describe cases of malaria where the fever re-appeared earlier than expected. Nettleship substituted years for hours and successive generations for repeated attacks of fevers in creating a term and concept to describe a pattern of heredity which he and others had noted in several diseases ranging from Leber’s hereditary optic neuropathy and cataract to diabetes and familial jaundice (Nettleship 1905, 1909).

The person who did the most to popularize the concept of anticipation, and whose name is generally associated with it, was not Nettleship but the British neuropathologist Sir Frederick Walker Mott. Like many of his contemporaries, Mott was interested in eugenics and the question of national degeneration. He had long been concerned about a pattern of heredity which had been noted in the inmates of the London county asylums. It seemed that the children of asylum inmates tended to suffer from mental illness at an earlier age, and often in a more severe form, than their parents. As many psychiatrists had suggested since Morel, Mott argued that this was a symptom of familial degeneration which would continue to worsen over succeeding generations until no more children were born to afflicted families, and he likened this to “rotten twigs … continually breaking off the tree of life” (Mott 1910) (Fig. 2).

Fig. 2
figure 2

Frederick W. Mott (1853–1926). Image courtesy of the Wellcome Library, London

There was no way to explain these findings using Mendelian or Galtonian modes of heredity. However, Mott found his explanation in Nettleship’s concept of anticipation which he described first as a rule and then as a law of heredity. Mott believed that anticipation was Nature’s way of ending or mending diseased stocks by concentrating and intensifying the illness in certain members of the family while leaving others free of disease (Mott 1911a, b). In a series of papers published in both British and American journals, Mott popularized this “law of anticipation”. The concept became known as “Mott’s law” and Nettleship’s role in defining the term was forgotten (Harper et al. 1992).

Early struggles

Although it would seem that a concept like anticipation would have been popular during the first decades of the twentieth century, when eugenic thinking was quite common, this was not in fact the case. Anticipation was invoked as a way to explain findings of decreasing age of onset and increasing severity of disease in dementia praecox (Rüdin 1916), Huntington’s disease (Entres 1921), myotonic dystrophy (Adie and Greenfield 1923), Leber’s hereditary optic neuropathy (Kwakami 1926), and diabetes (Sherrill 1921), but it was by no means universally accepted, particularly among those who took a statistical rather than a clinical approach to the study of human heredity.

In Britain, anticipation came under attack by biometricians from the Galton Laboratory of National Eugenics after Mott used the concept to argue against the sterilisation of the so-called ‘unfit’ at the First International Eugenics Congress in 1912. Moreover, Mott asserted that those with a family history of mental illness who had reached the age of 25 without showing signs of disease should be free to marry and have children (Mott 1912). This was in direct opposition to the general eugenic opinion of the day, which would have all individuals from such families abstain from reproduction. Karl Pearson and his student David Heron attacked anticipation on statistical grounds, arguing that findings of anticipation were due to selection and ascertainment bias, and they were quite concerned that Mott’s law of anticipation might be used to undermine eugenic progress (Pearson 1912; Heron 1914). The influential American eugenicist Charles Davenport also believed that anticipation was the result of selection bias (Davenport 1915). Although anticipation was originally adopted by German eugenicists, they largely turned away from the concept during the 1920s, accepting Wilhelm Weinberg’s statistical arguments against such findings (Rüdin 1923; Baur et al. 1923). Nevertheless, physicians continued to find the concept useful in describing patterns of inheritance in a variety of diseases, and anticipation remained in use in the medical literature.

Theoretical discussions of anticipation began once more in the 1930s, in the context of a renewed debate over the possible adoption of sterilisation legislation in the United Kingdom, a debate revived by the difficult economic conditions of the Great Depression. Proponents of the proposed act were concerned that anticipation would once again be used to argue against the need for such legislation, and so they set out once more to discredit the concept on statistical grounds (Pearson 1931; Editor 1931; Paterson 1932).

The young Lionel Penrose, a proponent of the quantitative genetics then being developed by Lancelot Hogben and John Burdon Sanderson Haldane (and later one of the most influential human geneticists of the mid-twentieth century), also argued against anticipation as one of many older and outmoded ideas about human heredity that needed to be discarded in favour of the new mathematically informed Mendelism (Penrose 1933). Over the next few years, Penrose developed the theory that it was the modification of the disease gene by its normal allele that caused the variation in age of onset associated with anticipation (Penrose 1936). Nevertheless, findings of anticipation continued to be reported by clinicians in a variety of diseases.

The only experimental geneticist who ventured an opinion on the nature of anticipation in the first half of the twentieth century was the German émigré Richard Goldschmidt. One of the arguments that had long been made against anticipation was that it was seen only in human pedigrees and lacked an experimental model in either the plant or the animal worlds. Goldschmidt, a developmental geneticist, newly settled at the University of California, Berkeley, believed that he had finally found such a model in a particular strain of fruit flies which showed a range of wing mutations from slight to severe. He theorized that these phenotypic variations were caused by the moderating effect of novel and ill-defined genetic modifiers called dominigenes. This, he argued, could explain the variation in wing phenotype in his fruit flies and in the age of onset and severity of myotonic dystrophy. His theory, however, made little headway in part because it was at odds with the previous understanding of the genetics of myotonic dystrophy (arguing that the disease was caused by a recessive mutation rather than a dominant one as was generally believed) and also because his concept of dominigenes failed to catch on among geneticists. Goldschmidt was also the first to mention a rift then beginning to open between the nascent field of human genetics with its mathematical approach to Mendelian heredity and the clinicians who continued to believe in older ideas such as progressive heredity and anticipation (Goldschmidt 1938). This divide would become much more marked with the professionalization of human genetics after the Second World War.

Post-war rejection

The period of dramatic social, political, and scientific change that followed the end of the Second World War and the beginning of the Cold War created a new milieu in which the concept of anticipation would be firmly rejected for decades to come. The field of eugenics entered a period of slow decline, in part due to the realization of the uses to which the Nazis had put eugenics, but also because the rise of the post-war welfare state helped to eradicate the social and economic conditions which had originally concerned members of Britain’s Eugenics Society (Mazumdar 1992). The fields of human, medical, and clinical genetics also underwent a period of marked expansion and institutionalization in the English-speaking world after 1945. As part of the post-war educational expansion, new departments of human genetics were created in a variety of institutions. These benefited directly from funding from governments and foundations, which had a say in the direction scientific research took, and research became increasingly molecular and quantifiable (Paul 1998; Kevles 1995; Kay 1993). Research that did not fit into this model was marginalized (Sapp 1987).

As part of this institutional self-fashioning, Hermann Joseph Muller, president of the newly established American Society of Human Genetics, urged his colleagues to learn from the errors of the past and to separate themselves and their work from the discredited science of eugenics (Muller 1949). The science of genetics itself was also rapidly evolving. The modern synthesis brought together Darwinian evolutionary theory and Mendelism through the work of quantitative and population genetics, which developed along distinctly mathematical and statistical lines. In the West, genetics became increasingly quantifiable and molecular with the discovery of the nature and structure of genetic material and the development and elaboration of new experimental systems within the field of molecular biology. This led in turn to a further discrediting of non-Mendelian forms of heredity, including cytoplasmic inheritance, as these sorts of neo-Lamarckian forms of heredity ran counter to the dominant Mendelian paradigm promoted in the West (Sapp 1987) and were linked to Soviet Lysenkoism and the destruction of the field of classical genetics by the Communist regime in the USSR (Krementsov 1997; Harper 2008; Pringle 2008). During the Cold War, Western scientists pursuing research into non-Mendelian forms of heredity found their work under political as well as scientific scrutiny, as Lamarckian/Lysenkoist views were equated with radical left-leaning or Communist ideologies, which could result in the loss of grants from funding agencies like the Rockefeller Foundation (Krige 2006).

One of the major centres for the study of human genetics in the post-war period was the Galton Laboratory of National Eugenics at University College London. In 1945, Lionel Penrose returned from Canada, where he had spent the war years, to take up the Galton Chair and the headship of the Galton Laboratory. Penrose was one of the few human geneticists who had never embraced eugenic ideas, and under his leadership, and with the aid of funding from the Rockefeller Foundation, the Galton Laboratory would turn away from the study of eugenics and embrace the study of human genetics (Friedman 2008). In 1954, he changed the name of the Laboratory’s journal from Annals of Eugenics to Annals of Human Genetics, and in 1963, after considerable effort, he finally succeeded in having the name of the Galton Chair changed to the Chair of Human Genetics (Harris 1973) (Fig. 3).

Fig. 3
figure 3

Lionel Penrose (1898–1972). Image courtesy of College Collections Photos, UCL Library Services, Special Collections

As part of his program to see human genetics placed on a more quantitative and mathematically based Mendelian foundation, Penrose gave lectures and published articles in which he attacked several older ideas of heredity, including anticipation (Penrose 1946). Findings of anticipation continued to appear in the clinical literature, particularly in the case of myotonic dystrophy, which showed the most dramatic recorded decrease in age of onset (Franceschetti and Klein 1946). In 1947, the publication of Julia Bell’s study of myotonic dystrophy and related diseases as part of the Galton Laboratory’s Treasury of Human Inheritance offered Penrose the chance to attack anticipation directly. Like Karl Pearson, her first mentor at the Galton Laboratory, Bell was leery of anticipation. But after accounting as best she could for sources of ascertainment bias, she could not deny that anticipation seemed to be occurring in myotonic dystrophy, although she questioned whether it was truly part of the disease (Bell 1947).

Penrose used Bell’s findings and his own work as the basis for his landmark paper on anticipation (Penrose 1948). He argued that what appeared to be anticipation was merely an experimental artefact due to a variety of statistical and experimental errors. The variability in age of onset in myotonic dystrophy was caused, he thought, by allelic modification of the affected gene. In order to prove his point, Penrose carried out a thought-experiment. Since he believed that anticipation was caused by allelic modification, Penrose postulated the existence of a number of individuals who developed the disease at a later age of onset than their affected parent. However, he argued that the existence of these individuals could often be missed by researchers and he included this in his calculated values for correlation coefficients of ages of onset in parent–child pairs and sib–sib pairs. His theoretical calculations were very close to Bell’s observed findings which strongly suggested that his hypothesis of two modifying allelomorphic genes was correct, even though direct evidence of these missing individuals was entirely lacking. His rigorously logical arguments were couched in the language of Mendelian inheritance and backed up by extensive statistical and mathematical analyses, many of which he had made previously.
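The statistical core of Penrose's objection—that pedigrees are ascertained through parent–child pairs who are both already symptomatic, which truncates the child's observable ages of onset far more severely than the parent's—can be illustrated with a small simulation. This is a hypothetical toy model: the onset distribution, generation gap, and study age below are invented for illustration and are not drawn from Penrose's or Bell's data. Parent and child draw their onset from the same distribution, so there is no true anticipation, yet the ascertained pairs show a strong apparent effect.

```python
import random

random.seed(42)

# Hypothetical toy model (all numbers invented for illustration):
# parent and child draw their age of onset from the SAME distribution,
# so there is no true anticipation in this population.
def age_of_onset():
    return random.gauss(40, 12)

GENERATION_GAP = 28       # assumed parental age at the child's birth
PARENT_AGE_AT_STUDY = 60  # assumed age of the parent when the family is examined

ascertained = []
for _ in range(100_000):
    parent, child = age_of_onset(), age_of_onset()
    child_age_at_study = PARENT_AGE_AT_STUDY - GENERATION_GAP
    # A parent-child pair enters the pedigree only if BOTH are already
    # symptomatic at the time of the study -- the child's observable
    # onsets are therefore truncated at a much younger age.
    if parent <= PARENT_AGE_AT_STUDY and child <= child_age_at_study:
        ascertained.append(parent - child)

mean_diff = sum(ascertained) / len(ascertained)
# mean_diff comes out strongly positive: apparent "anticipation" of
# roughly a decade, produced purely by the ascertainment condition.
print(f"apparent anticipation: {mean_diff:.1f} years")
```

The point of the sketch is only that selection effects of this kind can mimic anticipation; it does not, of course, settle whether a given clinical finding is artefactual, which is precisely what the later myotonic dystrophy work had to untangle.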

One of the most curious features of the history of the concept of anticipation is how quickly and completely the hypothesis put forward in Penrose (1948) came to be adopted. Significant parts of the argument had been made by Penrose and others before, and yet anticipation had remained in use by medical specialists and psychiatrists. His 1948 article put forward a rhetorically robust argument against the biological reality of anticipation using the most up-to-date methods of mathematical genetics, and it came from a researcher considered to be one of the best human geneticists in Europe, the head of the most important laboratory of human heredity in England. These factors surely added to the persuasiveness of Penrose’s arguments, but one of the most important features of their acceptance was how quickly his ideas were incorporated into the textbooks which taught the next generation of researchers in this small but rapidly expanding field. Between 1949 and 1970, fully half of the human and medical genetics textbooks published taught Penrose’s hypothesis that anticipation was merely a statistical artefact; the other half ignored the topic entirely (Friedman 2008).

In the years following the publication of Penrose (1948), the concept of anticipation quickly faded from the medical and human genetics literature as a new generation of researchers entered the field. Only in the case of myotonic dystrophy were serious attempts made to put forward arguments in favour of anticipation, and these arguments were made by senior authors who had completed their medical education in the 1930s. In 1954, the Swiss geneticist David Klein and the New Zealand neurologist John Egerton Caughey both published findings establishing anticipation in myotonic dystrophy, with decreasing age of onset and increasing severity of disease, particularly when the disease was passed from mother to child, but their findings were ignored (Klein 1954; Caughey and Barclay 1954). Anticipation had been discredited as a biological phenomenon.

New questions emerge

In the late 1960s and early 1970s, it became apparent that there was something odd about the transmission of Huntington’s disease and myotonic dystrophy. In the case of Huntington’s disease, the juvenile onset form of the disease was transmitted paternally (Merritt et al. 1969), while in myotonic dystrophy congenital cases were born to mothers who were themselves often only mildly affected (Harper and Dyken 1972). Throughout the 1970s and 1980s, a variety of hypotheses were proposed to explain these findings. These included not only the classical Mendelian explanations, but also the newly recognized importance of human mitochondrial DNA and epigenetic mechanisms of imprinting and methylation. Anticipation, however, still remained a largely discredited finding.

In the case of Huntington’s disease, the three major contenders to explain the paternal transmission effect were the maternal factor (mitochondrial) hypothesis (Myers et al. 1983; Boehnke et al. 1983), the modifier gene hypothesis (Boehnke et al. 1983; Laird 1990), and the genomic imprinting hypothesis (Erickson 1985; Ridley et al. 1988; Reik 1988). Each of these hypotheses failed to completely explain the patterns of inheritance seen in Huntington’s disease. The maternal factor hypothesis failed to explain why earlier ages of onset in Huntington’s occurred with paternal transmission. Identifying modifying genes was proving to be a challenging undertaking which made this hypothesis a difficult one to prove or disprove. Finally, the occasional finding of maternally transmitted cases of juvenile-onset Huntington’s suggested that genomic imprinting could not be the whole explanation.

In myotonic dystrophy, proposed explanations for the maternal transmission of the congenital form of the disease included an intrauterine factor (Harper and Dyken 1972; Harper 1975) and genetic heterogeneity (Bundey and Carter 1972). The posited intrauterine factor remained elusive, however, and would have had to act only on foetuses carrying the myotonic dystrophy gene while leaving normal foetuses unaffected. Genetic heterogeneity was ruled out by linkage analysis studies, and the suggestion that myotonic dystrophy might be made up of two or three different mutations in the same gene conflicted with the findings of earlier authors like Bell (1947).

In the 1980s, fragile X syndrome joined this group of diseases with unusual patterns of heredity. This X-linked form of intellectual disability was first described by James Purdon Martin and Julia Bell (Martin and Bell 1943) and later associated with the fragile site on the X chromosome which gave the disorder its name (Lubs 1969; Richards et al. 1981). In other X-linked disorders (like colour blindness), female carriers are free of signs of the disease while all male carriers exhibit signs of the disorder. This is not the case in fragile X, where a proportion of female carriers exhibit symptoms of the disease and some male carriers appear to be perfectly normal but have daughters and grandsons affected by the disorder. Moreover, individuals born into later generations of a family were more likely to be affected than those in earlier generations (Sherman et al. 1984, 1985). This unusual pattern of inheritance came to be called the “Sherman Paradox” (Opitz 1986). During the 1980s, a number of hypotheses were postulated to explain these findings. They included modification or suppression by an autosomal gene or genes (Froster-Iskenius et al. 1984; Steinbach 1986; Israel 1987); a multi-stage mutation process in which a pre-mutation event was converted to a full mutation by recombination (Pembrey et al. 1985), or by recombination and amplification of the genetic sequence at the fragile site, which created a continuum increasing in length from non-affected to affected individuals (Nussbaum et al. 1986; Ledbetter et al. 1986; Warren et al. 1987); epigenetic factors like genomic imprinting and methylation (Holliday 1987); and a combination of genetic mutation and imprinting (Laird 1987). Determining which of these hypotheses was correct proved difficult, and several groups of researchers worked on sequencing the fragile X gene in hopes of finally providing an answer to the perplexing pattern of inheritance seen in this disease.

During the mid-1980s, the work of the Dutch neurologist Chris Höweler began to undermine Penrose’s hypothesis concerning anticipation. In his 1986 MD thesis from Erasmus University Rotterdam, Höweler carefully re-examined 14 families that had previously been studied for myotonic dystrophy. He accounted for all of the possible sources of ascertainment bias and the statistical and experimental errors suggested by Penrose, and came to the controversial conclusion that anticipation was, in fact, taking place in myotonic dystrophy (Höweler 1986). Three years later, these results were disseminated more widely in an article published in the neurology journal Brain (Höweler et al. 1989), but the paper generated little response until after the sequencing of the genes for fragile X (1991) and myotonic dystrophy (1992) (Fig. 4).

Fig. 4
figure 4

Chris J. Höweler. Image courtesy of the Höweler family

Höweler’s work did convince a few geneticists, most notably the British medical geneticist Peter Harper, that anticipation was a real biological phenomenon (Harper 1989, 1990, 1991). Höweler et al. (1989) speculated that the as yet unknown mechanism involved in the unusual pattern of inheritance seen in fragile X might also explain findings of anticipation in myotonic dystrophy, a suggestion further disseminated by Harper (1989, 1990). Nevertheless, with no clear causative explanation for anticipation, most geneticists remained convinced that Penrose’s hypothesis was correct and that anticipation was the result of statistical and sampling error, not a biological reality.

The discovery of a molecular mechanism

Technological developments in the 1980s and 1990s paved the way for the isolation and sequencing of genes associated with various diseases. The sequencing of the fragile X gene revealed not a standard point mutation, deletion, or insertion of DNA, but rather a region of unstable DNA where changes occurred in the copy number of a trinucleotide repeat (CGG) in the 5′ region of the gene (Oberle et al. 1991; Yu et al. 1991; Verkerk et al. 1991; Kremer et al. 1991; Fu et al. 1991). The next published report of an expanding trinucleotide repeat (CAG) came in X-linked spinal and bulbar muscular atrophy (La Spada et al. 1991). The discovery of expanding DNA repeats in fragile X led to speculation that this new form of mutation might provide the answer to some old genetic questions, including those surrounding anticipation (Sutherland et al. 1991). With the discovery of a similar unstable trinucleotide sequence (CTG) in the myotonic dystrophy gene in 1992, it became clear that this was the molecular mechanism that underlay genetic anticipation. Put simply, the longer and more unstable the repeat areas became, the earlier and more severely the disease manifested (Harley et al. 1992; Buxton et al. 1992; Aslanidis et al. 1992; Brook et al. 1992; Mahadevan et al. 1992; Fu et al. 1992). This was corroborated in 1993 when a similar pattern was seen following the sequencing of the gene for Huntington’s disease, which revealed another expanding trinucleotide repeat (CAG) (The Huntington’s Disease Collaborative Research Group 1993).
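The qualitative logic of the mechanism just described—longer repeat tracts are both less stable on transmission and associated with earlier onset—can be sketched in a toy simulation. Every number here is invented purely for illustration; real repeat dynamics and genotype–phenotype correlations differ by disease and by locus, and no fitted model is implied.

```python
import random

random.seed(1)

# Hypothetical toy model, NOT fitted to any real locus: a trinucleotide
# repeat tract tends to expand at each parent-to-child transmission, and
# longer tracts are (a) less stable and (b) associated with earlier onset.

def transmit(repeats):
    # The possible expansion grows with current tract length (illustrative).
    return repeats + random.randint(0, max(1, repeats // 10))

def onset_age(repeats):
    # Longer repeat -> earlier onset, floored at congenital onset (age 0).
    return max(0, 60 - repeats // 2)

repeats = 50  # a modestly expanded allele in generation 0
for generation in range(1, 5):
    repeats = transmit(repeats)
    print(f"generation {generation}: {repeats} repeats, "
          f"onset around age {onset_age(repeats)}")
```

Because the tract never contracts in this sketch, onset age is non-increasing down the pedigree—exactly the pattern of anticipation that the expanding-repeat mechanism finally explained.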

This period after the discovery of a genetic mechanism that could explain anticipation marked a complete turnaround in the reception of the concept. In a few short years, anticipation went from a discredited notion to an acceptable, even popular, concept. In the years since, a number of genetic diseases have been found to be caused by these regions of expanding DNA. It is remarkable how quickly these new findings were incorporated into the textbook as well as the journal literature. From 1992 onwards, human and medical genetics textbooks have included discussions of anticipation in their coverage of repeat expansion disorders, a complete change from their treatment of the concept before the discovery of this genetic mechanism (Friedman 2008). Today, anticipation is viewed as the result of DNA expansion over succeeding generations, most likely through altered DNA replication, recombination, and repair, in a number of repeat expansion diseases (Penagarikano et al. 2007; Mirkin 2007; McMurray 2010).

Conclusion

Anticipation provides a unique lens through which to view developments in our understanding of human heredity over the last century and to understand how social factors, as well as scientific developments, play a role in the development of scientific concepts. Political, economic, and social concerns played important roles in the development and reception of anticipation. During the nineteenth century, interest in hereditary disease was expressed as concern about biological degeneration, reflecting anxieties about the political, economic, and social upheavals then taking place in the industrialized world. The theoretical background of the concept of anticipation lay in these notions of degeneration, which were common in the second half of the nineteenth century. Much of the controversy surrounding “Mott’s law” had to do with anticipation being used to argue against sterilisation legislation in Britain during times of economic upheaval. In the decades following the end of the Second World War, a social backlash against ideas associated with eugenics, and political as well as scientific concerns surrounding notions of non-Mendelian heredity, Lamarckism, and Lysenkoism, helped to move human genetics onto an increasingly mathematical and mechanistic track. This shift was reinforced by the directed funding of governments and philanthropic foundations that wanted to promote the development of the ‘best’ new science.

In large part, the reception of anticipation reflects the scientific norms of the day. The concept made inroads during the first two decades of the twentieth century when the notions of degeneration were still common and when the study of hereditary disease relied mainly on detailed clinical observation. As more statistical and theoretical approaches to the study of heredity gained in popularity, findings of anticipation came to be questioned and tensions increased between those who took a clinical approach to the study of human heredity and those whose notions of heredity were increasingly statistical, mathematical and mechanistic. This led to a rift between clinically inclined researchers who observed anticipation in their patient populations and the more theoretically and statistically inclined researchers who could find no biological explanation for findings of anticipation and therefore denied it or explained it away. As the new field of human and medical genetics professionalized and expanded after the Second World War, this more quantitative and statistical approach rooted in classical genetics and molecular biology rose to dominance. In the 1980s, the view of human heredity became more nuanced, including non-nuclear modes of inheritance and epigenetic changes, but also became increasingly tied to genetic sequencing, especially with the beginning of the Human Genome Project in 1990, with the goal of directly locating and defining disease genes. Anticipation, however, remained a largely discredited concept until a molecular mechanism was found which explained the perplexing patterns of inheritance seen in diseases like Fragile X, myotonic dystrophy, and Huntington’s disease.