
1 Introduction

This chapter explores developments in mathematics, computing, mathematics education and scholarship relevant to understanding tools from 1960 to the time of writing. This exploration is biased in accentuating influences relevant to tools and mathematics education. I am European and there is also a bias towards that with which I am most familiar, Western influences. This opening chapter could be a book in itself. To avoid this I describe a broad landscape and focus on selected technological advances, ideas and people that I consider important. The chapter begins with a section charting developments in mathematics, computing and education followed by a section on intellectual trends relevant to understanding tools and tool use. The final section focuses on the development of ideas in mathematics education regarding tools and tool use.

2 Developments in Mathematics, Computing and Education

The 1960s was an interesting decade with regard to the joint development of mathematics and computing. In mathematics Paul Cohen proved the independence of the continuum hypothesis (Cohen, 1963/1964) and Abraham Robinson introduced non-standard analysis (Robinson, 1966). Both of these advances were due to developments in mathematical logic. Mathematical logic, coupled with advances in physics/electronics, was also behind advances in technology. Mathematicians such as Alan Turing and Johnny von Neumann were instrumental in the development of the computer pre-1960. Jon’s top 10 algorithms (see Sect. 3.5) illustrate the co-development of mathematics and computing, and the quicksort algorithm (#7 in Jon’s list), developed in the early 1960s, remains the most used sorting method in databases. As we will see in Chap. 8, mathematicians also developed the first high level computer languages. The mathematics-computing influence was two way: in 1961 Shanks and Wrench computed π to 100,000 d.p. using an inverse-tangent identity and a computer. By the 1970s the computer, as a tool, was an active agent in mathematics. In 1975 Benoit Mandelbrot introduced the world to fractals and, soon after, was using computers to plot images of Julia sets. In the following year a proof of the Four Colour Theorem was published; this was significant because it was the first major proof in which a computer was essential (for parts of the proof). Soon after, the developments in experimental mathematics with computers which Jon describes in Chap. 2 began. Borwein and Devlin (2008, p. 7) write of this period:

At the same time that the increasing availability of ever cheaper, faster, and more powerful computers proved irresistible for some mathematicians, there was a significant, though gradual, shift in the way mathematicians viewed their discipline. The Platonistic philosophy that abstract mathematical objects have a definite existence in some realm outside of humankind, with the task of the mathematician being to uncover or discover eternal, immutable truths about those objects, gave way to an acceptance that the subject is the product of humankind, the result of a particular kind of human thinking.

1967 heralded the first compact electronic calculator, Texas Instruments’ Cal-Tech, though it used transistors and required mains power. Two years later Sharp’s QT-8D appeared, with integrated circuits replacing discrete transistors. In 1972 Hewlett Packard introduced the first scientific calculator (with trigonometric and algebraic functions), followed in 1974 by the first programmable calculator. Calculators were not designed for education purposes (their design was largely forced by the technology available) and the early sales targets were business and scientific workers. In 1985 the first graphic calculator, the Casio fx-7000G, appeared. The mathematician’s touch is evident in some early scientific calculators, notably Hewlett Packard’s, in that they used Reverse Polish Notation (RPN), i.e. 2 × 3 is input as 2 3 ×.
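A minimal sketch of how RPN evaluation works may help; the stack-based evaluator below is illustrative only (it reproduces the idea, not any calculator’s actual firmware):

```python
# Minimal stack-based evaluator for Reverse Polish Notation (RPN).
# Illustrative sketch; real calculators differ in detail.
def eval_rpn(tokens):
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()           # second operand is on top of the stack
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))  # operand: push onto the stack
    return stack.pop()

print(eval_rpn(["2", "3", "*"]))  # the text's example, 2 3 ×, gives 6.0
```

The appeal of the notation is visible in the code: because operands are stacked before an operator is applied, no parentheses or precedence rules are needed.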

The development of semi-conductor technology also paved the way for small computers, which came to be known as ‘micro computers’ in these early years. These were, initially, often sold as kits to be assembled by amateur and professional ‘computer buffs’ who wanted to learn about microprocessors, but the home and education markets were soon targeted. The ability to link micro computers, equipped with high level languages (rather than machine code), to a TV screen came around 1977 with three computers targeted at the home market: Radio Shack’s TRS-80, the Apple II and the Commodore PET. Soon after this some mathematicians, such as David Tall, started writing BASIC programs for educational graphing with sub-routines to provide such things as an ‘intelligent scale’ for the axes and the ability to recognise asymptotes; Tall’s programs were later marketed as Supergraph (Tall, 1985). Simultaneously with this (though the origins pre-date the advent of the micro computer), other mathematicians were developing educational ideas using the languages Lisp and Logo—this is discussed in some detail in the next chapter.

The 1980s witnessed a burgeoning of computer applications for doing and learning mathematics. The first interactive geometry system, the Geometric Supposer, appeared in the early 1980s. The Geometric Supposer did not allow geometric objects to be dragged but Cabri, which first appeared in 1986, did, and soon after the mathematical implications of ‘dragging’ for learning geometry became (and remain) a focus for mathematics educators. Cinderella, which Jon refers to in Chap. 2, first appeared in the late 1990s. The first computer algebra systems (CAS), Axiom and Reduce, were developed in the 1960s; Maple and Mathematica (referred to in Chap. 3) were developed in the 1980s. The first CAS aimed at student use (muMATH, later to become Derive) appeared in the late 1970s; by the mid-1990s a version of Derive was available on a graphic calculator, the TI-92 (which also included a version of Cabri). Statistics has been radically transformed by advances in computing, not least because of the ability of computers to handle large datasets. Recognition of the importance of computers for statistics is evident in the establishment, in 1972, of a Statistical Computing Section of the American Statistical Association. Two years later Generalized Linear Interactive Modelling (GLIM) software appeared and was, thereafter, a tool for university students. In 1993 the programming language R appeared, which is used by professional statisticians today (2014) and is particularly suited to data mining. Statistical functions were quickly introduced into early scientific and graphic calculators, but work on large data sets is more appropriate for computers than it is for calculators. The first appearance of what would now be recognised as a spreadsheet was, again, in the early 1960s. It was, of course, developed for financial work, not mathematics, but it has been appropriated by mathematics teachers at all levels of education. The appropriation, by mathematics teachers, of a tool designed for finance may be viewed as strange from within the mathematics community but is understandable from the perspective of education as a leading contributor to a nation’s economic future.
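For readers who have not met a CAS, a minimal illustration of the kind of symbolic (rather than numerical) computation these systems automate may help. The sketch uses SymPy, a modern Python library standing in for the historical systems named above:

```python
# A small taste of computer algebra: exact, symbolic results rather than
# floating-point approximations. SymPy stands in for Reduce, Derive, etc.
import sympy as sp

x = sp.symbols("x")
print(sp.expand((x + 1)**3))       # x**3 + 3*x**2 + 3*x + 1
print(sp.factor(x**2 - 1))         # (x - 1)*(x + 1)
print(sp.integrate(sp.sin(x), x))  # -cos(x)
```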

Developments in technology during the 1980s which would impact on education in the 1990s and beyond were the internet, interactive whiteboards and touchscreen technology. These developments affect all subject areas of education, not just mathematics. With regard to mathematics it appears they do not so much affect the subject matter itself (as a CAS might do) but the means through which students and teachers can access and present mathematical explanations and ideas. It is clear that advances in digital technology, starting in the 1960s, ushered in a period of tremendous growth in tools for education and for doing mathematics. The development of technology and mathematical tools is ongoing but there is a real sense in which mathematicians and mathematics educators are struggling to understand the revolution (of sorts) which began in the 1960s. I draw a close to this opening section with a summary of developments in mathematics education since the 1960s.

There was considerable mathematics curriculum innovation in the 1960s, originating in the birth of new mathematics in the previous decade. The 1960s fell within the period of the cold war, which restricted the exchange of ideas between the two power blocs, including the flow of Soviet work on activity theory (which we consider in Chap. 7). In 1957 the Soviets put the first artificial satellite, Sputnik, into space. Bybee (1997, p. 3) comments, ‘Sputnik made clear to the American public that it was in the national interest to change education, in particular the curriculum in mathematics and science’. Further curriculum reform started in the 1980s due to advances in digital technology. In the UK, for example, an influential government report stated, ‘We devote a separate chapter to electronic calculators and computers because we believe that their increasing availability at low cost is of the greatest significance for the teaching of mathematics’ (Cockcroft, 1982, p. 109). Developments in mathematics education, however, were not just at the curriculum/policy level; they were also at the organisational and academic levels.

At the organisational level the International Congress on Mathematical Education, which is now held every 4 years under the auspices of the International Commission on Mathematical Instruction of the International Mathematical Union, held its first conference in 1969. This reflected the rise of national mathematics education organisations. In the UK, for example, the Association of Teachers of Mathematics was founded in 1962 from the Association for Teaching Aids in Mathematics (a group who valued tool use!) and was effectively a break-away group from the more ‘traditional’ Mathematical Association (MA); in 1971 the first issue of the MA’s professional journal, Mathematics in School, appeared.

The 1960s was, effectively, the birth of academic mathematics education as a research field: ‘In the 1960s and 1970s, research studies in mathematics education grew not only in number but in scope as researchers increasingly moved across the boundaries of disciplines and countries’ (Kilpatrick, 1992, p. 29), and the graphs Kilpatrick produces for the number of research studies are approximately exponential (growth rate k > 0) for the 1960s. Further to this, the two academic mathematics education journals currently in the Social Science Index stem from this period: Educational Studies in Mathematics first appeared in 1968 and the Journal for Research in Mathematics Education first appeared in 1971. International academic exchange led to the first conference of the International Group for the Psychology of Mathematics Education in 1976, which continues as the main annual academic mathematics education conference.

I now look at considerations of tools within the wider intellectual climate of the times (1960 to the early twenty-first century).

3 Intellectual Trends Relevant to Understanding Tools and Tool Use

I divide this section into three subsections. In the first I look at what behaviourism had to say about tools and note its decline. I also introduce a construct—affordances—that is now regarded as important for the consideration of tools and which rose to prominence as behaviourism declined as an influence. The second and longest subsection looks at two scholars whose work should not be ignored in any serious consideration of tools, Marx Wartofsky and Lev Vygotsky. The section closes with a summary of theoretical approaches in which tools are regarded as agents in activity.

3.1 The Decline of Behaviourism

I am sure that we can look at any period and claim that various established and emerging ideas/approaches/paradigms competed in intellectual discourse, but the period around 1970 in the West, on which I focus, was a period of intellectual warfare. The twentieth century had, in Western universities, been dominated by ‘positivism’, founded on empiricism, and the psychological form of this was behaviourism. A feature of all forms of positivism was breaking phenomena of study into discrete parts which could be studied in isolation. Developments in mathematics, specifically in meta-mathematics, went hand-in-hand with this penchant for discrete study, Bertrand Russell’s ‘logical atomism’ being a case in point, e.g. the claim that non-logical expressions ‘have meaning if, and only if, either they or the expressions that appear in their analyses (if any) signify existent things’ (Pears, 1972, p. 9). Behaviourism was interested in external actions and how these could be initiated and channelled: stimulus–response. Tools were a means to initiate and channel external responses. The work of Patrick Suppes is interesting in this respect and I consider the legacy of behaviourism through a consideration of one of his papers.

Suppes was a mathematician, a philosopher and a psychologist, who applied his talents to issues in mathematics education. He was clearly interested in tools, as his paper ‘Computer technology and the future of education’ (Suppes, 1968) shows. I select him as an intelligent and well-meaning example of the late behaviourist school of thought in mathematics education; and as someone who spoke of tool use in education and used tools in his research in mathematics education without entering into the niceties of the agent–tool dialectic.

In Suppes (1968) he enthuses that ‘individualised instruction once possible only for a few members of the aristocracy can be made available to all students’ (Suppes, 1968, p. 41). This is possible since, ‘because of its great speed of operation, a computer can handle simultaneously a large number of students’ (Suppes, 1968, p. 41). Suppes does consider student–computer interaction, of which there are three possible levels: ‘individualised drill-and-practice systems … tutorial systems and dialogue systems’ (Suppes, 1968, pp. 42–44). The last level is viewed as speculative because, for speech recognition, ‘technical problems must first be solved’ (Suppes, 1968, p. 44), but Suppes conducted experiments at the second level where the ‘intention is to approximate the interaction of a patient tutor … As soon as the student manifests a clear understanding of a concept on the basis of his handling of a number of exercises, he is moved on to a new concept and new exercises’ (Suppes, 1968, p. 43). I summarise this approach to tool use as: the tool, the computer, imitates a human instructor in techniques, and understanding is judged by ‘correct’ responses to specific stimuli. Similar tutorial level systems are common today (2014) and go under the name of ‘individualised learning systems’. It is interesting to note that there is very little research, apart from reports from parent companies, on student learning through interaction with such systems, and what there is, e.g. Baturo, Cooper, and McRobbie (1999), suggests that students merely learn strategies which generate correct answers.

A consideration of one paper by one author does not prove a claim about an approach but I have faith in the claim that the legacy of behaviourism with regard to tool use in mathematics education could be summed up as: useful as a means to simulate or speed up the work of humans. By 1970, however, behaviourism was on the decline as an intellectual force, even if ‘popular’ forms were thriving in mathematics classrooms. I now consider two non-behaviourist academics who were working in the 1970s on ideas relevant to tool use.

In a series of papers and books over three decades, from the 1950s to the 1980s, the psychologists J.J. and E.J. Gibson developed a theory of visual perception that was not tied to stimulus–response theory. Visual stimuli featured in their approach but perceptual learning concerned ‘differentiating qualities of stimuli in the environment rather than acquiring associated responses that cause greater differentiation by enrichment of stimuli as a result of past experience’ (Greeno, 1994, p. 336). The environment is central to their approach; environments afford some actions and constrain others:

The affordances of the environment are what it offers the animal, what it provides or furnishes … If a terrestrial surface is nearly horizontal … nearly flat … and sufficiently extended (relative to the size of the animal) and if its substance is rigid (relative to the weight of the animal), then the surface affords support. (Gibson, 1979, p. 127).

Depending on the computer, the computer screen, as an environment, affords clicking (anywhere on the screen) with a mouse or touching the screen with a finger (or other object). To the Gibsons, an affordance (or constraint) is a feature of the agent–environment relation; it does not need to be perceived by the agent. An icon on a computer screen is not an affordance, as Norman (1999, p. 40) notes:

The affordance exists independently of what is visible on the screen. Those displays are not affordances; they are visual feedback that advertise the affordances; they are perceived affordances.

A constraint of an environment is related to affordance in as much as it specifies what the environment does not afford: a large dog cannot lie down in a small broom closet; we cannot click with a mouse outside of the computer screen. Norman (1999) distinguishes between physical, logical and cultural constraints. The last example (mouse outside of the screen) is a physical constraint. Logical constraints require reasoning to determine alternatives (clicking on three points in a specific order to obtain the angle bisector in our dynamic geometry system (DGS) example in chapter P1). Cultural constraints are conventions shared by a community of users, e.g. dragging an object in a DGS. The constructs ‘affordances’ and ‘constraints’ have wide application in mathematics education, from investigations into the extent to which tasks and questions afford participation in mathematics classrooms (Watson, 2007) to evaluations of software for doing mathematics (Monaghan & Mason, 2013). We pick up the thread of affordances and constraints in later chapters but now, in this overview, turn to two people who made significant contributions to an understanding of tools, Wartofsky and Vygotsky.

3.2 Two Approaches Which Take Tools Seriously

Western analytic philosophy, a key creator of positivism, was, by 1970, well into a period of questioning its assumptions. A thriving twentieth century branch of philosophy was the philosophy of science. A focal mid-twentieth century publishing outlet for this branch was the series Boston Studies in the Philosophy of Science. In this series every ‘ism’, from anarchism to positivism, was debated. Volume 48 was a collection of papers by the philosopher Marx Wartofsky. Chapter 11 (Wartofsky, 1979) is an essay on perception, representation and forms of action and advances an interesting perspective on artefacts/tools. Wartofsky’s position with regard to perception is anti-empiricist and views perception as ‘an historically evolved faculty, and therefore based on the development of historical human practice’ (Wartofsky, 1979, p. 189), i.e. humans, in different epochs, actually perceive in different ways. Historical human practice or, to use a word favoured by Marxists, ‘praxis’, is firstly ‘the fundamental activity of producing and reproducing the conditions of species existence … human beings do this by means of the creation of artefacts … the ‘tool’ may be any artefact created for the purpose’ (Wartofsky, 1979, p. 200).

Wartofsky extends the artefact–tool idea to ‘the acquisition of skills, in the process of production (… hunting … agriculture …) creates such skills as themselves “artifacts”’ (Wartofsky, 1979, p. 201). Wartofsky created a new ontology of artefacts:

Primary artefacts are those directly used in this production; secondary artifacts are those used in the preservation and transmission of the acquired skills or modes of action or praxis by which this production is carried out. Secondary artefacts are therefore representations of such modes of action (Wartofsky, 1979, p. 202)

Wartofsky goes on to describe a third kind of artefact, ‘artefacts of the imaginative construction of “off-line” worlds’ (Wartofsky, 1979, p. 208), where ‘online’ is ‘in praxis’, but the imagination is constrained by our experiences in our ‘online’ world. With regard to tool-design this is basically a statement that tool-design is bounded by the designer’s past experience.

It was, unfortunately, many years before mathematics educators took note of Wartofsky’s work, but around the same time a book with a similar, though quite independent, emphasis on the development of historical human practice was published that would deeply influence mathematics educational thought on tool use. This was the Soviet psychologist Lev Vygotsky’s Mind in Society. Vygotsky did not write this as a book: he died more than 40 years before it was published, and four American academics compiled it as a collection of his essays. Vygotsky, with colleagues, had a tremendous influence on Soviet psychology but the cold war effectively meant that Western and Soviet psychology largely developed without reference to each other. I explore the details of the influence of Vygotsky and colleagues on mathematics education in Chap. 9 but take the opportunity at this point to sketch the origins of this school of thought in the 1920s and 1930s.

Today (2014) the founder of activity theory is considered to be Vygotsky, but he was not alone. His interests initially centred on literary studies and he was particularly interested in language, signs and ‘mediation’. Physical tools were not, unlike for the authors of this book, of interest to him in themselves; any interest was due to their mediating qualities, ‘the basic analogy between sign and tool rests on their mediating function that characterises each of them’ (Vygotsky, 1978, p. 54). The difference between signs and tools rests on:

The tool’s function is to serve as the conductor of human influence on the object of activity; it is externally oriented; it must lead to a change in objects … The sign, on the other hand, changes nothing in the object of a psychological operation. It is a means of internal activity aimed at mastering oneself; the sign is internally oriented. (Vygotsky, 1978, p. 55)

It is not unusual to see Vygotsky’s position on mediation represented by a triangle (see Fig. 7.1):

Fig. 7.1 A representation of Vygotsky’s position on mediation

There are at least three ways to misinterpret this diagram. The first is to regard the object as a thing; it is not: the object is the raison d’être of the activity. The second is to read the diagram as a form of behaviourism that merely takes account of ‘artefact mediation’. This is not the case: the subject–object pair simply represents unmediated activity, e.g. drawing a round shape in the sand with your finger; the subject–artefact–object triple represents mediated activity, e.g. drawing a circle with a compass. The third is to regard the subject as going through the mediating artefact to the object; this is simplistic to the extent that it is wrong, as Cole (1998, p. 119) points out:

the incorporation of tools into the activity creates a new structural relation in which the cultural (mediated) and the natural (unmediated) routes operate synergistically; through active attempts to appropriate their surroundings to their own goals, people incorporate auxiliary means (including, very significantly, other people) into their actions, giving rise to the distinctive, triadic relationship of subject-medium-object.

Vygotsky’s interest in mediated activity centred on the shift from external processes to internal (mental) processes, which he calls ‘internalisation’. He provides an enlightening example of the genesis of meaning in the act that becomes pointing. A baby initially attempts to grasp something out of his/her reach. This may look like s/he is pointing but s/he is not, s/he is trying to get the object. Then along comes an adult who sees the attempt and brings the object within reach of the baby. What is crucial here is that another human has come into the process of trying to grasp the object. Over time, and with repetition, the act of trying to grasp becomes a gesture of pointing. Vygotsky comments, ‘At this juncture there occurs a change in that movement’s function: from an object-orientated movement it becomes a movement aimed at another person, a means of establishing relations’ (Vygotsky, 1978, p. 56). This leads him to claim:

Every function in the child’s cultural development appears twice: first, on the social level, and later, on the individual level; first between people (interpsychological), and then inside the child (intrapsychological). (Vygotsky, 1978, p. 57)

This is a powerful and persuasive claim but, as Bereiter (1994, p. 21) states, these lines

make an empirical claim, and one that is almost certainly too strong. There is ample evidence … that young children work out a substantial knowledge of the physical world, well before they could have gained much of it from the surrounding culture.

The word ‘culture’ in these quotes represents the phylogenetic accumulation of knowledge and this is important in the Vygotskian distinction between ‘everyday’ and ‘scientific’ concepts. Vygotskians regard everyday concepts as those acquired through our senses—the sun disappears from the sky and it is night—but scientific concepts have a theory behind them, such as the axial rotation of the earth to explain night. A theory does not need to be correct for a concept to be scientific, as Scott, Mortimer, and Ametller (2011, p. 6) point out:

… scientific concepts are taken to be the products of specific scientific communities and constitute part of the disciplinary knowledge of that community. The term ‘scientific’ as used by Vygotsky is not restricted to the natural sciences, but covers all comparable communities such as those of history, philosophy, art and so on. As the agreed upon products of specific communities, scientific concepts are not open to ‘discovery’ by the individual but can only be learned through some form of tuition.

In Chap. 11 we shall see another criticism of Vygotsky’s account of internalisation. My view is that Vygotsky pointed to something important but he was engaged in early work. From the point of view of tool use in mathematics, the opposite of internalisation, ‘externalisation’, the shift from internal processes to external processes, is also important and this is something I mentioned, in so many words, in my Sect. 1.3 definition of tools.

I take a brief pause at this point to make a disclaimer (of sorts) and move into the field of semiotics. The chapter I am writing is about tools and this focus has biased my presentation of Vygotsky’s thought for, as I say above, he was particularly interested in language, signs—semiotic mediation—and his conceptualisation of internalisation was fundamentally as a semiotic process. Given the mediational similarity (there is, indeed, an overlap) between signs and tools, it is appropriate to consider their relationship.

In mathematics we tend to think of signs as ‘our symbols’: ×, +, ∫, etc. In linguistics the focus is on language(s) and the signs of interest are words, speech, text, etc. But a sign in semiotics is just an arbitrary thing (a signifier) that signifies something (the signified). Mathematics lessons are rife with signs as teachers and learners attempt to communicate with each other, not just through mathematical symbols and words but also through pointing, gestures, intensity of speech, etc. The signs used in this communication must have some common meaning for each individual involved or communication would not be possible. This is particularly tricky in mathematics instruction because teachers introduce new signs and tools that initially have no meaning for the learner. Consider, for instance, the long process (Vygotsky’s internalisation) of turning calculus into tools for doing mathematics, e.g. \( \int x\,dx=\frac{x^2}{2}+c \) considered as an algorithm. The individual signs, e.g. dx, in this equation have a meaning that has been passed down by the culture of mathematics, but they are arbitrary and have no reference for the learner as s/he starts learning calculus. Learning about this sign-tool takes place over a considerable period of time during which teacher and learner draw on signs which do have meaning for the learner, including a great deal of pointing and gesturing. Vygotsky was not, as far as I know, interested in algorithms, and mathematics educators may not be interested in the writings of Vygotsky, but semiotic mediation is central to mathematics teaching whatever one’s beliefs.
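To make the algorithmic reading of this sign-tool explicit (my gloss, not Vygotsky’s), the rule that the learner eventually internalises is the power rule:

\[ \int x^n\,dx=\frac{x^{n+1}}{n+1}+c \quad (n\neq -1), \]

of which \( \int x\,dx=\frac{x^2}{2}+c \) is the case \( n=1 \): raise the exponent by one and divide by the new exponent.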

3.3 Tools as Agents

The final intellectual trend of this period I consider arises from scholarship into the history, philosophy and sociology of science. The publication of Thomas Kuhn’s The structure of scientific revolutions in 1962 (alongside work by other scholars), in the words of Pickering (1995, p. 2), ‘opened the way for new waves of scholarship … work on the sociology of scientific knowledge (SSK) has increasingly documented the importance of the human and the social in the production and use of scientific knowledge’. One of these waves of scholarship, originating in the late 1970s, is now called actor network theory (ANT); Latour (2005) is a fairly recent exposition. ANT is a theory about how to study social phenomena—by following the actors, where an actor is ‘anything that does modify a state of affairs by making a difference’ (Latour, 2005, p. 71); ANT theorists would not look at a mathematics lesson as a given social structure but would describe the structure in terms of actors. It views social life as being in a state of flux and looks to the performance of the actors in situations. Objects (artefacts/tools) can make a difference in activity and so can be actors, exerting agency, in the playing out of social situations. Pickering (1995), who is ‘almost ANT’ in my opinion, examines the practices of late twentieth century physicists using machines to trace elementary particles. He accepts ANT’s human and material agencies and adds ‘disciplinary agency’ (in our discipline a + a = 2a regardless of what we might want it to be). He proposes a ‘dance of agency’ where, in the performance of scientific inquiry, human, material and disciplinary agencies ‘emerge in the temporality of practice and are definitional of and sustain one another’ (Pickering, 1995, p. 21). What ANT and Pickering bring to a consideration of tool use in mathematics is that tools have agency. This is a controversial claim but an interesting one that cannot be ignored in a serious study of tool use.

I now move on to consider the development of ideas in mathematics education regarding tools and tool use.

4 The Development of Ideas in Mathematics Education Regarding Tools and Tool Use

I sketch the development of ideas in academic mathematics education regarding tools (including, in some cases, an absence of regard to the role of tools) from the early 1960s. The dominant approach in the West to what would now (2014) be called ‘mathematics education’ or the ‘didactics of mathematics’ was behaviourism. I have discussed this in the previous section and do not discuss it further here. But there were notable others around. Zoltan Dienes conducted learning experiments which centred on young children making and finding patterns in play-like activity with manipulatives and concrete representations. Many schools purchased ‘Dienes blocks’: wooden cubes (units), 1 by 1 by 10 cuboids (tens), 1 by 10 by 10 cuboids (hundreds) and 10 by 10 by 10 cubes (thousands). These wooden artefacts are tools (by my Sect. 1.3.1 definition of tools) when their overt purpose is to instill conceptual understanding (of place value)—they are tools for learning mathematics rather than tools for doing mathematics. Dienes (1963) writes with attention to detail and respect for learners about children’s mathematical activity with manipulatives. His interest is in the learners’ appreciation of mathematical structure (e.g. arranging blocks to appreciate that \( {12}^2=10\times 10+4\times 10+4\times 1 \), since \( {(10+2)}^2={10}^2+2\cdot 2\cdot 10+{2}^2 \)) and their formation of abstraction and generalisation. His model of the abstraction–generalisation process (see Dienes, 1963, p. 67) focuses on the mind and on mathematics—the manipulatives do not enter the model and appear, to me, to be mere props for mental operations (interpret, symbolise) that are part of his model. It was several decades before the usefulness of manipulatives in the long-term learning of mathematics was questioned (see Hart, 1989). Another notable figure in the 1960s is Guy Brousseau, whose classroom experiments included the use of manipulatives. I do not discuss this here as this work is considered in Chap. 10. But the most important figure, in terms of Western mathematics education, in the 1960s was Jean Piaget.

Piaget, who was initially a biologist, was a contemporary of Vygotsky but, unlike Vygotsky, he did not die young; his academic opus spanned many decades, during which time his influence increased. His main interest was children’s conceptions and he explored, amongst other things, their conceptions of geometry, logic, movement, number, space and the world. His interests centred on the development of thought from birth to adolescence. He posited that humans develop through stages and that growth within stages depended on schemes, ‘the structure or organisation of actions as they are transferred or generalised by repetition in similar or analogous circumstances’ (Piaget & Inhelder, 1969, p. 4). At any stage in the development of the child, ‘reality data are treated or modified in such a way as to become incorporated into the structure of the subject’ (Piaget & Inhelder, 1969, p. 5); this is assimilation. By repetition, schemes are modified to fit with the child’s new interpretation of reality; this is accommodation. This is a far cry from behaviourism. If behaviourism can be represented as ‘stimulus → response’, then Piaget’s version can be represented as ‘stimulus ↔ response’. An interesting thing about Piaget’s work is his neglect of the role of tools in cognitive development. This is obviously noteworthy in a book on tools but it also appears a little strange as he did pay specific attention to related things, signs (and semiotics in general) and objects. Respect for Piaget’s work increased to the extent that he had ‘guru status’. A prominent English mathematics educator of the 1960s and 1970s was Kenneth Lovell; take any of his academic papers and you will find Piaget’s theory being expanded on with regard to new mathematical ideas or experiments but with virtually no criticism (to my knowledge) of Piaget’s assumptions or theory. The nearest I have found to questioning Piaget in Lovell’s work is:

I believe that his position regarding the acquisition of certain kinds of new knowledge is of more value to the mathematics teacher than any other position at the moment, although I affirm with equal conviction that his theory does not cover all the facts and that one day it will be replaced or subsumed by a more all-embracing one. (Lovell, 1972, p. 165)

I make this point on ‘guru status’ because it may help to explain the virtual neglect of tools in pre-twenty-first century scholarship in mathematics education that paid homage to Piaget. The most influential (epistemological) theory in mathematics education in the 1980s was constructivism. Constructivists pay regular homage to Piaget’s theory as the inspiration for their ideas, e.g. Von Glasersfeld (1991, p. xiv) in speaking of constructivists: ‘They have taken seriously the revolutionary attitude pioneered in the 1930s by Jean Piaget’. Leslie Steffe was an early constructivist researcher. I consider a paper of his (Steffe, 1983) which treats children’s algorithms as schemes. I select this paper because, in Chap. 1, I argue that algorithms may be regarded as tools. Steffe pays homage to Piaget: ‘children’s methods can be viewed as schemes. This premise has justification in the primordial seriation scheme studies by Inhelder and Piaget (1969)’ (Steffe, 1983, p. 110). He considers operative and figurative schemes in the algorithms developed by two young children. Educational resources (strips with ten squares and blocks) are used by the children but the paper makes no mention of artefacts or tools.

During the 1980s constructivism divided into what are now called ‘radical’ and ‘social’ constructivism. The radical branch, of which Steffe is an example, was concerned with the ontogenic development of the individual child. The social branch was interested in microgenetic, i.e. child–environment, as well as ontogenic development. During the 1980s Paul Cobb moved from the radical to the social branch, and an early paper that could be called ‘social constructivist’ is Cobb (1987). The paper investigates the sense young children make of statements such as 3 + 6 = 9 and 9 = 3 + 6 and notes conflicting models of early number development in the literature. Cobb uses clinical interviews (a research tool developed by Piaget) in worksheet tasks and tasks which employ felt squares to hide numbers. Cobb concludes that the ‘academic context’ of the task is crucial and influences children’s goals; they tend ‘to use primitive finger patterns in worksheet situations … [as] … these methods … constituted viable ways of operating in their classrooms’ (Cobb, 1987, p. 121). I have no criticisms of the conclusions reached by Cobb but, as with Steffe (1983), I find it curious that no mention is made of tools (as finger patterns may be regarded as arithmetic tools). Cobb, with various co-researchers, went on to develop a specific form of social constructivism, sometimes referred to as the ‘emergent perspective’. An important paper from this perspective is Yackel and Cobb (1996). This paper examines discussions and argumentation between a teacher and young children in a classroom context and introduced a new construct to the field of mathematics education, ‘sociomathematical norms’: ‘normative aspects of mathematical discussions that are specific to students’ mathematical activity’ (Yackel & Cobb, 1996, p. 458). The classroom is provided with various resources (centicubes and an overhead projector) but the paper does not mention tools. This neglect has been noticed by others, e.g. Hershkowitz and Schwarz (1999, p. 149) refer directly to Yackel and Cobb (1996) when they write ‘… sociomathematical norms do not arise from verbal actions only, but also from computer manipulations as communicative non-verbal actions’. In fairness to Cobb, however, by 2002 his output did explicitly consider tool use. Cobb (2002) presents an analysis of seventh grade students’ statistical data analysis; students were given data sets in which ‘it would be essential that they actually begin to analyse data in order to address a significant question’ (Cobb, 2002, p. 176). The instructional strategy behind students’ data analysis was supported with two ‘computer minitools’ developed to fit with the instructional sequence. The paper explores how symbolising, modelling and tool use interrelate in students’ data analysis.

During the 1980s and 1990s, however, influences of a non-Piagetian nature were stirring, which Lerman (2000, p. 23) calls ‘the social turn’:

the emergence into the mathematical education community of theories that see meaning, thinking, and reasoning as products of social activity. This goes beyond the idea that social interactions provide a spark that generates or stimulates an individual’s internal meaning making activity. A major challenge for theories from the social turn is to account for individual cognition and difference, and to incorporate the substantial body of research on mathematical cognition, as products of social activity.

This was followed by what has been called ‘the sociopolitical turn’, which Gutiérrez (2013, p. 40) describes as ‘theoretical [and methodological] perspectives that see knowledge, power, and identity as interwoven and arising from (and constituted within) social discourses’. There is, to my mind, overlap in these ‘turns’ which I illustrate via Lerman. The ‘substantial body of research’ which Lerman goes on to cite includes situated cognition (e.g. Lave, 1988), Foucauldian analyses (e.g. Walkerdine, 1988) and cross-cultural studies (e.g. Bishop, 1988); Foucauldian analyses are clearly sociopolitical in essence.

I do not disagree with what Lerman says but I think another turn, ‘the technological turn’, was rotating in the same period, and the impetus for this turn was the technological revolution that I describe in Sect. 7.2. Calculators and micro computer applications became objects of great interest to mathematics educators and, to paraphrase Lerman above, they began to ‘see meaning, thinking, and reasoning as products of tool-based activity’. Further to this, many of these tool-focused researchers were also influenced by the social turn (e.g. the first two authors of this book). A recent (at the time of writing) paper (Morgan & Kynigos, 2014) illustrates social and technological foci. The paper concerns digital artefacts and external representations and I preface a consideration of the paper with some comments on representations.

I use the term ‘external representation’ (ER) to refer to ‘a configuration of symbols, images or concrete objects standing for some other entity’ (Fagnant & Vlassis, 2013, p. 149). Some authors use the term ‘schematic representations’ or just ‘representations’, but I like to make the external aspect explicit. An ER may be presented to the learner (e.g. a Cartesian graph or a number line) or it may be generated by the learner. The ER may or may not correspond to something mathematical (e.g. a smiley face) and when two ERs correspond to something mathematical they may do so in different ways. For example, the two images in Fig. 7.2 could both represent ‘3 × 4’, but the rectangular array may be presented by a teacher to focus on the idea of a Cartesian product whereas the ‘dots in circles’ may be learner generated and focus on repeated addition.

Fig. 7.2 Two external representations for ‘3 × 4’
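In symbols (my gloss, not part of the figure), the two readings of ‘3 × 4’ are:

\[ 3\times 4=\underbrace{4+4+4}_{\text{repeated addition}}=\bigl|\{1,2,3\}\times\{1,2,3,4\}\bigr|\quad\text{(the size of a Cartesian product)}. \]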

Both ERs in Fig. 7.2 are artefacts and, if they are used in doing mathematics, function as mathematical tools (by Monaghan’s Sect. 1.3 definition of a tool). Van Dooren, Vancraenenbroeck, and Verschaffel (2013, p. 322) point out:

an external representation of a problem situation can be a beneficial heuristic … may lead to a reduction of working memory load … to be effective, the external representation needs to be a correct display of the problem situation

Fagnant and Vlassis (2013) claim that ERs are central elements in expert problem solving, and this certainly seems to be the case in Jon’s Chap. 3, which is awash with ERs; his claims about visual theorems are essentially claims about the central role of ERs in expert problem solving. The history of mathematics is, in part, a history of external representations, as Sect. 4.3 and Chap. 5 (on ancient Greek and on Babylonian mathematics) attest. Digital technology has opened the way for new ERs, for example ‘sliders’ in dynamic graphing and geometry packages. I now return to a consideration of Morgan and Kynigos (2014) to illustrate differences in social and technological foci.

Morgan and Kynigos (2014) appears in a special issue of Educational Studies in Mathematics on the ReMath project (Representing Mathematics with digital media: working across theoretical and contextual boundaries). Basically, what ReMath did was get pairs (or groups) of researchers (and, often, mathematical software designers) from different countries and using different theoretical frameworks to try out each other’s software and interpret the other’s data (if possible) from the perspective of a different framework. Morgan and Kynigos are one such pair. The paper focuses on a digital artefact, MoPiX, designed for:

constructing animated models using the principles of Newtonian motion . The objects of the MoPiX microworld were designed to behave in mathematically coherent ways providing an environment … intended to allow students to construct orientations to concepts such as velocity and acceleration consistent with conventional mathematical and physical principles (Morgan & Kynigos, 2014, p. 362)

Morgan represents the social turn:

a multimodal social semiotic perspective on representation … From this perspective, the elements of spoken, written, diagrammatic or other forms of communication are not taken to have a fixed relationship to specific objects or concepts … Rather, the resources offered by language, diagrams, gestures and other modes are considered to provide a potential for meaning making. The term representation cannot therefore be taken to have an internal reference to some individual mental image or structure. Nor can it be taken to refer to a determined relationship between signifier (word, picture, symbol, etc.) and signified (represented object or concept). As the elements of communication acquire meaning in interactions within social practices, the notion of representation must also be understood relative to specific social interactions and practices. (Morgan & Kynigos, 2014, p. 359)

Kynigos represents the technological turn via a constructionist framework (see Chap. 8):

The role of representations is important in the sense that they are perceived as integral components of artefacts-under-change and as a means for expressing, generating and communicating meaning. The nature of representations and the kinds of use to which they are put are at the centre of attention … artefacts can embody a wide range of complexity and have been perceived and analysed as representations themselves … Unlike the social semiotic perspective, digital artefacts are seen as representations designed by pedagogues to embed one or more powerful ideas … representations are not seen simply as objects to which some kind of meaning may be attached but as artefacts for tinkering with. (Morgan & Kynigos, 2014, p. 360)

Morgan and Kynigos (2014) proceeds to analyse student work from the perspective of each framework. I do not detail these analyses and you do not get a prize for guessing that the analyses differ. The two authors, however, agree that the two approaches ‘are not on the whole incompatible and yield interpretations of the data that have some similarities’ (Morgan & Kynigos, 2014, p. 375). There are also instances where the two authors use a construct in a similar way, for instance the term ‘meaning’. But even here the constructionist approach sees meaning as residing in the individual, often linked to ‘tinkering’ with artefacts, whereas in the multimodal social semiotic approach ‘meaning is conceptualised as the establishment of shared orientations through communication in interaction between individuals’ (Morgan & Kynigos, 2014, p. 377).

5 End Note and Anticipation of the Remaining Part II Chapters

Morgan and Kynigos (2014) illustrates social and technological foci in twenty-first century mathematics education research. It also illustrates two of the large number of theoretical frameworks used in twenty-first century mathematics education research. This book on tools and mathematics is not the place for considering all of these frameworks but we consider frameworks that appear, to us, to be particularly important with regard to conceptualising tool use in mathematics in Chaps. 8–10. Chapter 8 focuses on constructionism, the framework that Kynigos, above, uses. Chapter 9 concerns activity theory, which is a continuation of the work of Vygotsky. The flow of ideas in Chap. 7 did not provide a sensible place to outline Radford (2014) but it is a paper that readers may like to follow up if they are interested in representations, because Radford (2014) is an activity theoretic critique of Morgan and Kynigos (2014) and has a lot of interesting things to say on ERs and on artefacts. Chapter 10 outlines several frameworks that emerged in France around the time of the social and technological turns (late 1980s). In the 1990s researchers centred around Michèle Artigue, following the approaches of Chevallard and of Rabardel, were engaged in groundbreaking research on the use of technology in mathematics classrooms that challenged the hegemony of constructivist interpretations of the role of technology in the learning of mathematics.