
The first perspective offered in this book is situated in the historical origins of probability as well as in differences in the philosophical bases of probability theory. The four chapters in this section address origins, epistemology, paradoxes, games, the modelling of probability, and pedagogical approaches grounded in the classical, frequentist, or subjectivist traditions. The aim of this commentary is not to regurgitate or summarize the content of the chapters, but to ground the first perspective offered in this book within a humanistic framework of mathematics.

The humanist tradition in mathematics education attempts to ground the development of mathematical ideas in the people involved in their conception, and in particular to describe the motivation and context within which those ideas developed. To paraphrase Flusser, the more humanistic aspects of mathematics are the more fruitful ones, in the sense that "Euler's imperfection (which turns an otherwise perfect piece of workmanship into a work of art), raises many new questions" (p. 5). The origins of probability can be traced to old myths in different cultures in which divination or gambling games are mentioned, often resulting in a catastrophic loss or gain. For example, in the Indian epic Mahabharata (∼1500 B.C.), a kingdom is lost in a dice game, which results in exile for the losers. This is not the only instance of "gambling" in ancient Indian culture. The Rig Veda, an old religious scripture, contains what is called the Gambler's (Ruin) Hymn, in which the dire consequences of playing dice games are lamented in the words, "Cast on the board, like lumps of magic charcoal, though cold themselves they burn the heart to ashes!" Other ancient cultures also developed board games, such as the Royal Game of Ur from ancient Mesopotamia, a precursor to what is popularly known as backgammon today. Native American cultures in North and South America likewise independently developed a rich tradition of games of chance. In other words, calculating "odds" seems to have been an integral part of human culture regardless of place and time. The gaming industry today is one of the most profitable sectors of the business world.

If one moves from ancient games of chance to the work coming out of post-Renaissance Europe, the academic strains of what we call probability theory today are found in Luca Pacioli's Summa de arithmetica, geometria, proportioni et proportionalità (1494), in the correspondence of Pascal and Fermat, and in Jakob Bernoulli's Ars Conjectandi (1713). Borovcnik and Kapadia's opening chapter examines the well-known work of Pascal and Fermat, with reference to de Méré's problem and the Division of Stakes problem. While the former is covered in detail in their chapter, the latter is also worth paying attention to. Stated in modern terms, the Division of Stakes problem reads as follows: if a team needs 60 points to win, with each "goal" counting as 10 points, how should the prize money be split if the score stands at 50–30 and the game cannot be completed? Pacioli's solution, 5:3, is incorrect but led to the development of a "more accurate" algorithm by Cardano: if the score stands at x:y and z is the number of goals needed to win, then the stakes should be divided in the ratio \([1+2+3+\cdots+(z-y)] : [1+2+3+\cdots+(z-x)]\). Pacioli's solution would then be corrected to read 6:1. However, Cardano was incorrect as well! The reader is invited to devise the correct solution with the tools of modern probability theory at their disposal. Our point here is simply to present the humanistic aspects of developing a solution, which very often consists of trial and error corrected over time. The opening chapter also mentions Abraham De Moivre's work on developing "Stirling's formula", but what goes unmentioned is James Stirling's betterment of De Moivre's constant as \(\frac{2e}{\sqrt{2\pi}}\), and the historical fluke by which De Moivre's work came to be attributed to Stirling. De Moivre was gracious enough to let Stirling receive the credit for the improved constant!
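For readers who would like to check their answer, a minimal computational sketch follows (ours, not the chapter authors'; it assumes each remaining goal is equally likely to go to either side, the standard Pascal–Fermat assumption). It computes the leading side's probability of winning the interrupted game and compares the resulting split with Pacioli's and Cardano's rules.

```python
from fractions import Fraction
from math import comb

def pascal_fermat_share(x, y, z, p=Fraction(1, 2)):
    """Share of the stakes owed to the side on x points (of z goals needed),
    i.e. its probability of winning if each remaining goal goes its way
    with probability p."""
    a = z - x                 # goals the leading side still needs
    b = z - y                 # goals the trailing side still needs
    n = a + b - 1             # at most this many further goals settle the game
    # The leading side wins exactly when it takes at least `a` of the next n goals.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(a, n + 1))

def cardano_share(x, y, z):
    """Cardano's rule: divide in the ratio [1+...+(z-y)] : [1+...+(z-x)]."""
    s = lambda m: m * (m + 1) // 2
    return Fraction(s(z - y), s(z - y) + s(z - x))

# Score 50-30 with 60 points (six 10-point goals) needed: x=5, y=3, z=6.
x, y, z = 5, 3, 6
print("Pacioli       :", Fraction(x, x + y))            # 5/8, i.e. 5:3
print("Cardano       :", cardano_share(x, y, z))        # 6/7, i.e. 6:1
print("Pascal-Fermat :", pascal_fermat_share(x, y, z))  # 7/8, i.e. 7:1
```

Under this equal-likelihood assumption the leading side's fair share works out to 7/8, that is, a 7:1 split rather than Pacioli's 5:3 or Cardano's 6:1.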

The second chapter, also by Borovcnik and Kapadia, examines the role of paradoxes in the development of different mathematical concepts. The authors present numerous paradoxes as well as Pacioli's original Division of Stakes problem together with its historical solutions. De Méré's problem is revisited, along with the problem of the Grand Duke of Tuscany. Numerous other paradoxes are covered in the context of inverse probabilities, and the chapter culminates in the unification of many ideas arising from paradoxes and puzzles that eventually led to Kolmogorov's axiomatization of probability theory and the formulation of its central theorems. The paradoxes are also classified under equal likelihood, expectation, relative frequency, or personal probabilities, with the caveat that adopting a particular philosophical stance restricts the scope of the applications that are possible. The examples presented are well formulated and will serve any interested educator well for pedagogical use.
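Both of these historical puzzles lend themselves to quick verification in the classroom. The sketch below is our own addition (in Python, not anything the authors use): it enumerates the three-dice sums behind the Grand Duke of Tuscany's question and evaluates the two de Méré probabilities exactly.

```python
from itertools import product
from fractions import Fraction

# Grand Duke of Tuscany (Galileo's problem): with three dice, a sum of 10 is
# slightly more likely than a sum of 9, although both can be written as six
# unordered triples.
rolls = list(product(range(1, 7), repeat=3))
p9 = Fraction(sum(sum(r) == 9 for r in rolls), len(rolls))
p10 = Fraction(sum(sum(r) == 10 for r in rolls), len(rolls))
print(p9, p10)                             # 25/216 versus 27/216

# De Méré's problem: at least one six in 4 throws of a die, versus at least
# one double six in 24 throws of two dice.
p_one_die = 1 - Fraction(5, 6) ** 4        # about 0.518, a favourable bet
p_two_dice = 1 - Fraction(35, 36) ** 24    # about 0.491, an unfavourable bet
print(float(p_one_die), float(p_two_dice))
```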

The last two chapters in this section (Eichler and Vogel; Pfannkuch and Ziedins) examine probability concepts from a mathematical modelling and teaching standpoint. Eichler and Vogel present three approaches to modelling situations in probability and ask what teachers need to know about probability in order to present and teach the problems effectively. In our reading, a salient feature of their chapter is the use of technology such as TinkerPlots to simulate modelling situations with data, eventually leading to an "objective" understanding of randomness grounded in empirical data. The chapter by Pfannkuch and Ziedins extols the virtues of a data-driven approach to teaching probability, with three examples to persuade the reader. The classical and frequentist approaches are critiqued as leading to confusion, whereas a data-driven modelling approach could circumvent common student misconceptions. To this end, it may be of interest to the statistics education community that modelling and data-driven approaches have been tried in the U.S. in some of the high school reform-based curricula sponsored by the National Science Foundation. In the case of Montana, the Systemic Initiative for Montana Mathematics and Science (SIMMS) project resulted in well-formulated, pilot-tested modules containing "real world" data. However, this data was perceived as a little too real (and grim) by some members of the public and some politicians, who used this perception to speak against such approaches to teaching mathematics in high schools. This may serve as a caveat emptor for those who advocate a "radical" approach to the teaching of probability in schools. Sanitized mathematics still fits the world view of most policy makers, and too many deviations [pun intended] in the direction of real data may not yield the pedagogical payoff these authors hope for!