Introduction

The term mathscast has been used in the literature to refer to “narrated screen video recordings of explanations of mathematical concepts” (http://mathscasts.org). One of the most important features of mathscasts is the use of the freehand inking capabilities of tablet devices, which allows mathematical working to be written with ease. Whilst mathscasts pertain to mathematical content, screencasts contain a variety of content. Irrespective of this differentiation, there appears to be no clear nomenclature for screencast content. Screencasts are sometimes referred to as video podcasts or vodcasts, whilst podcasts usually refer to audio-only content. In this paper, we will refer to mathscasts when the context is clearly mathematics; otherwise we will use the term screencasts. Sugar et al. (2010) suggest that screencasts were originally developed to provide procedural information to students. Some podcasts and vodcasts (screencasts) today also address pedagogical issues (Heilesen 2010), but many do not appear to follow any framework such as that proposed by Sugar et al. (2010). In addition, the focus of many mathscasts still appears to be on procedural knowledge rather than on other forms of mathematical knowledge. In the absence of a comprehensive framework that addresses both procedural and pedagogical content knowledge, this study developed an evaluative tool specific to the creation and evaluation of mathscasts.

The focus of other research is on how mathscasts help the learning and development of students; that is, mathscasts are usually created for students as recipients rather than creators. However, this study supports the premise that the process of creating and refining mathscasts deepens the creators’ own learning (Soto 2014). It should be noted that the pre-service and in-service teachers (P/ISTs) in this study have dual roles as university students and teachers. In other words, whilst P/ISTs are students of mathematics in a university setting, they are also teaching, or will be teaching, mathematics in a school setting. This context aligns with Hattie’s notion of teachers becoming “learners of their own teaching and students become their own teachers” (Hattie 2008).

Our study is situated in a course designed for both on-campus and online P/ISTs, where student-created mathscasts provide a way for university lecturers to assess students’ quality of teaching and understanding of mathematics. When P/ISTs submit an assignment based on the creation of a mathscast, they present examples to their lecturers of the way they teach, their understanding of how to solve mathematics problems, and how to talk through and write a solution. This practice provides invaluable information because, unlike in face-to-face tutorials, lecturers do not otherwise see online students teach or explain mathematical concepts; it thus addresses concerns relating to the online teaching of education programs (Ingvarson et al. 2014).

Earlier research in 2012 (Galligan and Hobohm 2013) focussed mainly on engaging P/ISTs in creating mathscasts using screencast technologies. We extended this research to focus on the quality of mathscasts. In this paper, we analysed mathscasts as a basis for formulating and testing an evaluative tool that could then guide the creation and critiquing of mathscasts.

Literature review

The aim of this paper is the development of an evaluative tool in a mathematics education context. We needed a tool to assess the quality of mathscasts in relation to production, pedagogy, and mathematical understanding. Past research has shown that effective use of tablet technology, particularly with mathscasts, can increase engagement (Logan et al. 2009; Anderson et al. 2005; Galligan et al. 2015), foster understanding, and enhance multidirectional communication (Galligan et al. 2015), even in online and distance learning (Galligan and Hobohm 2013). Our research and that of others (Loch and McLoughlin 2011) suggest that future studies should investigate how mathscasts, when purposefully linked and scaffolded, can guide deeper understanding of mathematics and positively influence the creation and delivery of mathscasts (Galligan and Hobohm 2013). The following section reviews research on frameworks for quality screencast production and on attaining deeper mathematical understanding. As tablet technology is the enabler for freehand inking, we also review some of the available literature on teacher- and student-created mathscasts captured through tablet technologies.

Evaluative frameworks for the design and delivery of screencasts

In the absence of evaluative frameworks that focus on both pedagogical knowledge and deep mathematical understanding in mathscasts, we first investigated Sugar et al.’s (2010) widely acknowledged “anatomy of a screencast” framework. This framework consisted of structural elements (bumpers, screen movement, and narration) and instructional strategies (overview, describe procedure, present concept, focus attention, and elaborate content). Loch and McLoughlin (2011) acknowledged Sugar’s research and its shortcomings, and suggested a model based on self-regulated learning (planning, monitoring, and reflecting), but as yet this model remains untested. Our research focused on three key elements: learning how to teach mathematics, a taxonomy of understanding mathematics, and a method to evaluate quality in mathscasts. A guide for evaluating online mathematics resources (Handal et al. 2005) had some useful criteria (such as displays, motivation, directions, learning sequences, and language) but was not specific enough for our purposes. Shafer (2010) also sought an evaluative framework in a study investigating school students’ screencasts in geometry. In her analysis, she found active engagement in peer evaluation and valorisation of the assignment. In addition, she reanalysed her findings using Bloom’s digital taxonomy (Churches 2010) of creating, evaluating, analysing, applying, understanding, and remembering, and found that students were effectively operating at two of the highest levels (i.e. evaluating and creating). Mathison and Kosiak (2011) developed a draft rubric to assess PSTs’ mathematical content knowledge through PST-created podcasts. They used Kilpatrick, Swafford, and Findell’s (2001) proficiency strands as a basis to assess procedural fluency and conceptual understanding. We adopted a similar approach that promoted both computational fluency and depth of understanding.

Other taxonomies of mathematical understanding that we used in developing our evaluative tool (Table 3) were Skemp’s (1976) distinction between instrumental and relational understanding, and Mason and Spence’s (1999) categorisation of knowing-that, knowing-how, knowing-why, and knowing-to (Galligan et al. 2015). Added to this “knowing” framework is knowing about usefulness in context (Watson et al. 2003). This latter “knowing” includes understanding between mathematical contexts (for example, ratios or decimals in the context of measurement), as well as understanding simpler concepts (such as ratio) within a more complex mathematical context (such as algebra). Our evaluative tool also includes understanding how concepts can be used outside the school context (i.e. everyday numeracy). These understandings are reflected in the work of Ma (1999) on teachers’ profound understanding of fundamental mathematics as considered within the Pedagogical Content Knowledge (PCK) framework. PCK also emphasises other aspects such as knowing your audience and knowing how to model a concept (Chick et al. 2006), both of which proved integral for pre-service teachers when producing quality mathscasts.

Our aim was to convert the above concepts into a comprehensive yet user-friendly evaluative tool to guide those who create or assess mathscasts.

Deeper understanding of mathematics through tablet technologies

Our theoretical investigation extended Sugar et al.’s framework to include PCK and mathematical understanding. We then turned to research investigating how tablet technologies (combining writing and recording) improve mathematical understanding and incorporate pedagogy at an operational level. Crompton and Burke’s (2015) review of mobile learning in mathematics found only 36 peer-reviewed articles since 2000. They concluded that more research was needed, as the majority of studies were limited to elementary or middle schools. Whilst this review did include tablets as part of the mobile learning definition, its coverage of tablet technology was not comprehensive. A small research study (Kosheleva et al. 2007) gave pre-service elementary teachers Tablet PCs to create lesson plans in mathematics and found that the use of Tablet PCs significantly increased students’ understanding of mathematics concepts. Vasinda and McLeod (2012) reached a similar conclusion in a school setting, finding mathscasts to be a “more powerful opportunity for both self-assessment and teacher assessment” (p. 127). Yet a more recent review of research on the transformation of teaching and learning through digital technology from 2013 to 2016 in Australasia (Geiger et al. 2016) verified that the use of technology in pre-service and in-service teacher education was an under-researched area. In the same book, a chapter on tertiary mathematics education (Copeland et al. 2016) assessed 13 peer-reviewed articles on tablet technology and screencasting and noted an emphasis on the technology as an assistive tool rather than one that can transform current pedagogical approaches. However, the authors did find some reflective work (such as McMullen et al. 2015) on pedagogical transformation in a calculus class, and felt that more research was needed on the use of technology to enhance the nature of mathematical thinking and mathematics learning.

Recently, there has been a shift from teachers to students creating screencasts to improve learning. A study conducted in an intermediate science classroom concluded not only that students who created podcasts, vodcasts, and screencasts achieved higher scores, but also that they were actively engaged in the learning process (Pena 2011). Recent research by Soto (Soto and Ambrose 2016; Soto 2014) investigating elementary students’ creation of screencasts in mathematics affirmed screencasts to be a powerful assessment tool, in that teachers who analysed their students’ screencast explanations gained deeper insights into student understanding, misunderstanding, and errors. Such findings have implications for more targeted interventions. As Soto’s (2014) rubric was used with school students, it focused on the analysis of procedures, verbalisations, solution strategies, and tools used. Our evaluative tool targeted P/ISTs and homed in on mathematical understanding, pedagogical content knowledge, and the structural elements of mathscasts.

Furthermore, we investigated how P/ISTs used the evaluative tool to address the quality of mathscasts in terms of production, mathematical understanding, and pedagogy, thereby answering two specific questions:

  • What does an evaluative tool for mathscasts look like?

  • Does the use of the evaluative tool make a difference to the quality of the production and critiquing of student-produced mathscasts?

The results of this research will be of significance to multiple audiences, including practitioners wanting to use mathscasts as a digital tool for learning and teaching, and researchers wanting to verify students’ depth of understanding of mathematics and pre-service teachers’ approaches to explaining mathematical concepts.

Method

This research project commenced in 2012 using a cooperative research inquiry approach of iterated reflection and action (Reason and Riley 2008), as used by Sugar et al. (2010). The preparatory reflections and actions were carried out by the researchers and colleagues (internal) to produce the first iteration of the evaluative tool. Following the first use of the evaluative tool by students (external), another cycle of reflection and action (internal) commenced to inform improvements to the tool. A further cycle followed, with wider student use and subsequent evaluation and action. Figure 1 summarises our reflection and action cycle from 2012 to the present.

Fig. 1
figure 1

Timeline of cooperative inquiry approach

Throughout the iterative process, data were collected from a Mathematics for Teachers course at a regional university in Australia (with ethics clearance). The total enrolment of about 50 students each semester included over 90% studying in online mode. Males formed about one third of the cohort, and the average age was over 30. Postgraduate students made up 22% of the course, most of whom were practising teachers. Fewer than 50% of the pre-service teaching students indicated a major or minor in mathematics.

A total of six instruments were used to collect data. The sequence of introduction is summarised in Table 1:

  1.

    Initial mathscast creation: Using a digital device, students had to record a mathscast in which they explained a mathematics concept. Typically, students used an iPad, a tablet PC, or a graphics tablet with ScreenChomp® or Jing®. This instrument was used to identify common features of student-created mathscasts, as well as to gauge students’ ability to create a mathscast unguided. The created mathscast was uploaded and submitted as a piece of assessment.

  2.

    Peer Critique 1: Early in the semester, all students uploaded and critiqued their own and others’ first mathscasts via a dedicated online discussion forum. This instrument was used to identify students’ ability to highlight features of a mathscast without much initial guidance. The critiques were largely student led, with the intent of encouraging the student voice rather than the teacher voice. These critiques were submitted as a piece of assessment. In 2014, however, students tested the evaluative tool as part of their critique, with the expectation that this “voice” might be reflected in the critique. In 2015, students were not given the tool at this stage, thereby allowing the researchers to perform a comparative analysis of the critiques.

  3.

    Peer Critique 2: In 2015, a second peer critique exercise, based on better quality mathscasts, was introduced so that students could trial the tool that had not been available to them earlier. These critiques formed part of a second assessment. This piece of assessment was not used in 2014.

  4.

    Final mathscasts creation: Students had to record a series of linked mathscasts on how to teach a troublesome mathematics concept of their choosing, suitable to be given to school students to aid their understanding of the concept. These mathscasts were submitted as part of an assessment. The researchers then compared this set of mathscasts against students’ initial attempts to see whether students had improved with the use of the evaluative tool. Both the 2014 and 2015 cohorts produced this set.

  5.

    Survey 1: Prior to creating their first unassessed and unguided mathscasts, students were surveyed about their experiences with watching and creating screencasts, perceived advantages and disadvantages of screencasts, and the value of using screencasts for mathematics teaching and learning. The first survey (Galligan and Hobohm 2013) was improved and re-applied for the next iteration of the study following the cooperative inquiry approach of iterative refinement and validation.

  6.

    Survey 2: To measure changes in attitudes and overall experiences in creating mathscasts, the final survey repeated similar questions to the initial survey (about value, advantages, and disadvantages). In addition, students were asked about their attitude to screencasting (after having created their own), the tools they used, and their ratings of the importance of colour, legibility, clarity of voice, correct mathematics, completeness, clarity of explanation, comprehensiveness, and contextualising. In 2014 and 2015, we extended the surveys to ask specific questions about the use and value of the evaluative tool.

Table 1 Order of introduction of evaluative tool and peer critique 2014 and 2015/2016

The development of the evaluative tool

As depicted in Fig. 1, the development of the tool was based on multiple iterative processes. During the first iteration in 2012, students were provided only with general assistance in producing mathscasts and received no guidelines on how to evaluate mathscasts or write reflective comments. Within the discussion forum, however, students’ critiques were discussed with other students and the lecturer. Using a qualitative research approach of constant comparison (Glaser 2008), the students’ reflective comments, surveys, and assignments were read by two authors and categorised. The categories were reduced into themes by one author and confirmed by the other (Galligan and Hobohm 2013). The strongest themes related to structuring the mathscasts for visual impact and best instruction, different approaches to solving the problem, metacognition, language, and affective issues. However, there was no evidence of students being able to evaluate the quality of mathscasts in terms of PCK and mathematical understanding.

For the second iteration, the themes identified in the first iteration, together with the framework developed by Sugar et al. (2010), assisted in drafting an evaluative tool for the context of teaching mathematics. Two new approaches, based on mathematical understanding and PCK, were also trialled to assist students in creating quality mathscasts. Our tool consisted of four elements: purpose, structural elements, PCK elements, and cohesion and completeness (for a series of mathscasts). The tool was then tested by the authors and two maths/statistics lecturers prior to being ratified. The process involved evaluating five mathscasts (three of our own and two professionally produced) using a combination of repeated observation and discussion, thus following the cooperative inquiry approach (Reason and Riley 2008). We tested levels of agreement between two authors on the four elements. Initially, the authors critiqued one mathscast and found between 60 and 100% agreement. The greatest discrepancy was on element one (the purpose of a mathscast). After discussion, we amended the language used and added examples for clarity. Four lecturers then examined the second and third mathscasts and repeated the process. The final iteration of the tool was used to examine two screencasts, with 95 to 100% agreement.
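As a minimal sketch of the kind of percent-agreement check used between raters (the elements are those named above, but the individual criteria, rating scale, and exact scoring procedure are our assumptions, not details from the study), the calculation might look like this:

```python
from collections import defaultdict

def percent_agreement(ratings_a, ratings_b):
    """Percentage of criteria on which two raters gave the same rating,
    grouped by element of the evaluative tool."""
    matches, totals = defaultdict(int), defaultdict(int)
    for (element, criterion), rating_a in ratings_a.items():
        totals[element] += 1
        if ratings_b.get((element, criterion)) == rating_a:
            matches[element] += 1
    return {element: 100 * matches[element] / totals[element] for element in totals}

# Hypothetical ratings of one mathscast by two raters (illustrative only).
rater_1 = {("purpose", "goal stated"): 2, ("purpose", "audience identified"): 3,
           ("structural", "legibility"): 3, ("PCK", "knowing-why"): 1}
rater_2 = {("purpose", "goal stated"): 1, ("purpose", "audience identified"): 3,
           ("structural", "legibility"): 3, ("PCK", "knowing-why"): 1}

print(percent_agreement(rater_1, rater_2))
# {'purpose': 50.0, 'structural': 100.0, 'PCK': 100.0}
```

In this sketch, a low agreement on one element (as happened with the purpose element in our first round) flags where the wording of the tool needs discussion and revision.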

For iterations three and four in 2014 and 2015, the data collection was repeated using the same five instruments (initial and final surveys, initial and final mathscasts, and forum discussions on the initial mathscasts). In 2014, all student surveys were collated, mathscasts were marked, and students’ forum discussions were entered into NVIVO for an initial analysis of word counts, length of responses, and number of responses. All data were de-identified. We then randomly chose three students’ series of mathscasts (top, middle, and lower marks), and the three authors analysed one series each using the evaluative tool developed in 2014. After comparing notes and making minor refinements to the wording of the tool, we analysed the remaining series of mathscasts to validate the changes. This final tool (Table 3) was used with the 2015 cohort of students to guide them in creating quality mathscasts, to analyse other students’ mathscasts, and as a rubric for marking students’ mathscasts. Two themes that emerged in the earlier work in 2012 were student affect and metacognition, particularly students’ anxiety about the quality of their own mathscasts and their inexperience in reflecting on their own mathscasts. In 2015, we addressed these two issues by asking students to analyse a variety of previous students’ sets of final mathscasts and by encouraging students to revisit their first mathscasts through the critical perspective of the tool.

Results

We wanted to investigate to what extent the evaluative tool affected the quality of the critique and production of student-produced mathscasts. Questions pertaining to the tool from the survey results across 2012 to 2015 were analysed. In addition, changes in the language and word count of critiques from the discussion forums in 2013–2015 were compared. Finally, sets of student-produced mathscasts were analysed to assess whether their quality had improved.

Survey results

Technology uptake in education can be quite rapid. Thus, it is important to note how students’ screencasting awareness and skills compared between the 2012 cohort (Galligan and Hobohm 2013) and the 2013–2015 cohorts in this study. Table 2 shows some results from the students who completed survey 1, together with the question from survey 2: “Describe what difficulties you experienced relating to the process of obtaining tablet technology, planning, creating, and sharing your screencasts?” This highlighted two main issues: accessibility (column 5) and using a stylus (column 6).

Table 2 Selected data from surveys related to screencast use

The comparison indicates that a significant number of students had not viewed screencasts before, and that the majority were novices when it came to producing screencasts. In survey 2, we asked students, “Describe what difficulties you experienced relating to the process of planning, creating and sharing your screencasts?” A key issue consistently identified by the lecturers in the course (informally, through student queries and phone calls) was difficulty in making mathscasts accessible to others, either by providing a link or by embedding the file in a Word document. Anecdotally, the number of enquiries about difficulties in uploading mathscasts has decreased from 2012 to the present. Stylus issues were more prominent in 2015, which may be due to the larger number of students using iPads to produce their mathscasts (from 20% in 2012 to 63% in 2015).

In 2014, students were able to trial the evaluative tool on an optional basis in their critiques after they had produced their first mathscast, but in 2015/2016 the use of the tool became mandatory for a later assessment.

Figure 2 shows that, of the 21 students who participated in the final survey in 2015, all but one found the tool useful. In 2014, 76% found the evaluative tool useful. Different uses for the tool included the following:

  • Analysing others work (5 students)

  • A check for own assignment or general guide (30 students)

  • A resource for teaching (2 students)

  • “as a guide when planning…not particularly helpful for conceptual type of screencasts but I could see its value for other screencast topics” (This 2014 comment was made by the student who evaluated the evaluative tool as “not useful”)

  • “It helped me to understand better what its purpose is and what I need to bear in mind when communicating in a screencast but also in the classroom in general” (2015 student).

  • “I am more able to identify good and not so good mathscasts on the web now” (2015 student).

Fig. 2
figure 2

Answer to post-survey question on usefulness of evaluative tool in 2014 and 2015

Initial mathscasts and discussion forum comments

Despite being new to creating screencasts, most students in 2014 were able to produce their first mathscast and publish it online in the learning management system. The three screenshots (Figs. 3, 6, 7) in this paper are of one student’s work (Student 13, 2014) and depict the iterative growth in mathscasting skills. Similar trends were evident for the entire 2015/2016 cohort. Figure 3 shows a mathscast produced early in the semester, in which the student explains a solution to a mathematics problem. By the end of the semester, the student’s inking skills had improved, the placement of content was more thoughtful, and colour and attention-directing devices (e.g. circling parts of the screen) were used purposefully.

Fig. 3
figure 3

Final screenshot of mathscast on finding the total distance in a lift

In the student’s detailed reflection on this mathscast, written without the guidance of the evaluative tool, student 13 divided the approach into familiarisation with the equipment (further subdivided into iPad and ScreenChomp), solving the problem, and conclusion. The student was very happy with the attempt and concluded:

Whilst there is significant room for improvement, I know I have learnt some valuable information regarding iPad technology, ps. I have just realised how I can change font, underline, make bold etc. on the iPad, so no longer frustrated with this!! (So much easier via laptop!)

After producing their first mathscast, students in 2014 were given an early version of the evaluative tool. To investigate the influence of the evaluative tool on students’ assessment of mathscasts, we analysed the vocabulary used by students in the discussion forum comments using NVIVO’s word frequency counts. An example of a peer review in 2014 reflected some of the mathscast features referenced in the evaluative tool; this student’s self-review followed a similar structure (beginning, setting out, colour, noise):

After watching your screencast, there were a few things which I thought I could mention as evaluation points.

  • I thought the setting out could be improved initially by starting the lesson with either a blank screen and giving a clearer explanation of the question or by starting with an abbreviated form of the equation already there so you can just go through it with the viewers.

  • I felt that if you gave a clearer explanation, it might make a bit more sense to the viewer—as I have seen the question I knew what you were talking about, but for someone who is just hypothetically looking to this as a tutorial, they may have no idea.

  • It is hard to do with this example but the use of colour to clarify, highlight, or appeal to the viewer could be noted as well.

  • You could also maybe go into a quieter recording voice so that your voice is louder/clearer.

Otherwise, I thought it was all good and to answer your question, that is the way I solved it as well.

Figure 4 compares 20 of the most common words used in the 2014 forum (excluding words like “question”, “thanks”, and “like”) with seven other words from the evaluative tool. For example, “goals” (which was contained in the evaluative tool) was not used by any cohort. The data were normalised to occurrences per 100 students for ease of comparison. The words were further grouped according to whether they primarily came from the Structural Elements (SE) or the PCK (and purpose) sections of the tool.
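The word counts reported here were obtained with NVIVO; the following is a minimal sketch of the equivalent normalisation, in which the forum posts, the word-to-group mapping, and the cohort size are hypothetical placeholders rather than the study’s data:

```python
import re
from collections import Counter

EXCLUDED = {"question", "thanks", "like"}      # words excluded from Fig. 4
TOOL_WORDS = {                                  # grouping assumed for illustration
    "colour": "SE", "voice": "SE", "pointer": "SE", "delivery": "SE",
    "understanding": "PCK", "method": "PCK", "goals": "PCK",
}

def counts_per_100_students(forum_posts, n_students):
    """Count tool-related words in forum posts, normalised per 100 students.
    (Simple exact matching; NVIVO's stemmed counts such as "understand/ing"
    would group word variants together.)"""
    words = re.findall(r"[a-z]+", " ".join(forum_posts).lower())
    raw = Counter(w for w in words if w in TOOL_WORDS and w not in EXCLUDED)
    return {w: 100 * c / n_students for w, c in raw.items()}

# Hypothetical forum posts and cohort size, for illustration only.
posts = ["Great use of colour and a clear voice.",
         "The delivery was good but the pointer was a little distracting."]
print(counts_per_100_students(posts, n_students=60))
# each of 'colour', 'voice', 'delivery', 'pointer' occurs once,
# i.e. 100/60 (about 1.67) occurrences per 100 students
```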

Fig. 4
figure 4

Vocabulary used from discussion forums on mathscasts reflection in 2013–2015

In both 2013 and 2015, when the tool was not available at the time of the first peer critique, we found lower usage of the words shown in Fig. 4 than in 2014, apart from the word “understand/ing” in 2013 and “method/s” in 2015. Following the introduction of the tool in 2014, some words not used in 2013 (for example, “delivery” and “pointer”) appeared in 2014. Some words were used very frequently, suggesting that students had picked them up from the evaluative tool; in particular, the words “colour” and “voice” were very common in 2014.

In addition, the total number of forum responses was compared (see Fig. 5).

Fig. 5
figure 5

Number of responses and number of words per student from discussion forums on mathscasts reflection in 2013–2015

In 2013, there were 157 responses (4.03 responses per student) compared to 301 responses in 2014 (4.93 responses per student) and 320 in 2015 (4.44). In addition, the average length of responses increased from 97.4 words per person in 2013 to 179.9 in 2014, before falling to 151.4 in 2015. We do not claim that longer and more frequent responses in themselves demonstrate improved quality. However, this evidence, combined with the change in language, is indicative of higher engagement with the evaluative process. Because the exact circumstances of the responses varied from year to year, we can only speculate on the reasons for this change, but the presence of the evaluative tool in 2014 may have prompted more and longer responses.
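As a quick check of the per-student figures (a sketch only; the cohort sizes below are approximations implied by the reported rates, not figures stated in the study):

```python
# Reported forum totals; cohort sizes are back-calculated approximations
# (e.g. 157 / 4.03 is roughly 39), included here only to reproduce the quoted rates.
forum_totals = {
    2013: {"responses": 157, "students": 39},
    2014: {"responses": 301, "students": 61},
    2015: {"responses": 320, "students": 72},
}

for year, d in forum_totals.items():
    print(f"{year}: {d['responses'] / d['students']:.2f} responses per student")
# 2013: 4.03, 2014: 4.93, 2015: 4.44 -- matching the rates quoted above
```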

Another area of investigation was the peer critiquing of mathscasts. The final surveys in 2014 and 2015 asked students to “describe how useful you found the peer assessing process”. All students in 2015, and 15 of the 17 students in 2014, commented on its usefulness for their own mathscasts. Examples of feedback include the following:

  • It (reflections) forced us to look at other screencasts which made me learn a lot from them! (2014)

  • I found it really helpful as I was able to see what others had completed, and the comments—it helps build my own toolbelt. (2014)

  • I thought this was a great tool, as other students were able to give you advice or feedback, and it definitely helps in the thinking process when you are producing a new screen cast. (2015)

  • I know I had to ask myself multiple questions: “is this the right way to do it, how would I have done this screencast, and how can I be diplomatic/encouraging on the forum without sounding over critical?” (2014)

Some mathscasts produced by students assisted other students in understanding the mathematics and pedagogy. This was particularly evident in 2016:

  • I wanted to get a genuine indication of whether my explanation and teaching is effective.

  • This showed me that when I teach these kind of questions to students I will need to be precise in my explanations for the students.

  • After watching your screencast [I] have come to realise that I went about this question the long way.

  • This screencast actually helped me to work out and understand the question better as I had difficulty with this one myself

Whilst most of the comments were positive, we found only three comments suggesting a lack of critical feedback from other students. For example:

  • I feel that most people were afraid of giving negative feedback and did not want to offend their peers (2015).

Final mathscasts

After marking over 350 mathscasts, we observed that the 2014 mathscasts appeared technically better than those of 2013, both in visual presentation and verbal delivery, and better at addressing relational and instrumental understanding. Since then, mathscasts have continued to improve technically, with many students taking more creative approaches, employing a wider variety of teaching strategies, and incorporating more elements such as colour, graphics, and layout. In 2014, an analysis of markers’ comments on the mathscasts showed that about one third of students struggled to explain the reasoning behind a concept or the problem-solving process: they simply performed a procedure, despite being prompted to articulate their approach. In 2015, we were able to categorise the 42 students’ 126 mathscasts according to the “knows” as follows: why (44), to (45), that (45), how (73), and context (30), with some mathscasts falling into more than one category. We found that only one student used the procedural approach (know-how) exclusively.
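A minimal sketch of how such an overlapping categorisation can be tallied (the per-mathscast codes below are hypothetical; only the 2015 totals quoted above are from the study):

```python
from collections import Counter

# Each mathscast is coded with one or more "knows" from the evaluative tool.
# These entries are invented examples of the coding format, not real data.
coding = {
    "studentA_mathscast1": {"how", "why"},
    "studentA_mathscast2": {"how", "that", "context"},
    "studentB_mathscast1": {"to"},
}

tally = Counter(know for knows in coding.values() for know in knows)
print(tally)  # a mathscast coded with several "knows" counts once in each category

# Because categories overlap, category totals can exceed the number of mathscasts:
# in 2015, 44 + 45 + 45 + 73 + 30 = 237 category assignments across 126 mathscasts.
```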

An example of an improved mathscast in 2014 can be seen in Figs. 6 and 7 (a series of four). The first mathscast (Fig. 6) focused on simplifying algebraic expressions. Here, student 13 created a mathscast (5.09 min) for an assignment with an iPad and the “ScreenChomp” app.

Fig. 6
figure 6

a, b Screenshots of external student’s mathscast in 2014

Fig. 7
figure 7

a, b Screenshots of external student’s mathscast on how to expand and simplify an expression in 2014

Improvement from the first mathscast (Fig. 3) was noticeable in that the mathscast now:

  • Had clear goals at the beginning

  • Focussed attention by circling and using arrows

  • Broke the screen into parts (e.g. used prewritten words such as division, addition, etc. as an unobtrusive reference mechanism)

  • Used colour appropriately to differentiate parts of the problem

Despite these visual improvements, the student struggled to convey mathematical thinking. For example, the first mathscast in the series concentrated on revising order of operations and began with the learning goals depicted in Fig. 6a. The student’s approach to simplifying algebraic expressions relied on the use of BOMDAS. Mathematically, this raises PCK issues, as it is the concept of order of operations, and not the BOMDAS mnemonic itself, that is under discussion.
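To illustrate the kind of misconception the mnemonic alone can foster (our own example, not one taken from the student’s mathscast): read literally, BOMDAS suggests that addition precedes subtraction, yet operations of equal precedence are evaluated from left to right:

$$8 - 2 + 3 = (8 - 2) + 3 = 9, \qquad \text{not} \qquad 8 - (2 + 3) = 3.$$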

However, her final mathscast (4.09 min) showed excellent procedural knowledge of simplifying an expression and evidence of greater PCK understanding. It started with an introduction revising what an algebraic expression is (Fig. 7a) and provided further practice examples at the end. The screenshot in Fig. 7b shows a neat, clear, measured approach, using colour for specific effects, with ticks and circles added to focus the viewer’s attention.

In 2015, students were guided by the evaluative tool, and hence mathscasts included all five “knows”. For example, Fig. 8a shows a student explaining how, why, when, and where to add fractions with different denominators. In another example, Fig. 8b shows a screenshot of a student explaining, by way of a formula, when bolts in a helicopter have to be replaced (know-in-context).
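As an illustrative working of the content addressed in Fig. 8a (our example, not the student’s): the “why” rests on rewriting both fractions over a common denominator so that like parts are being added:

$$\frac{1}{3} + \frac{1}{4} = \frac{4}{12} + \frac{3}{12} = \frac{7}{12}.$$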

Fig. 8
figure 8

a, b Screenshots of two external students’ mathscasts in 2015

Whilst students did put time and effort into producing quality mathscasts, a comment from a student in 2014 questioned the overall benefit against the effort required: “[there is a] difficulty in achieving the ‘discovery’ of concepts via a screencast…there are more disadvantages for conceptual type screencasts than technical/procedural.” On the other hand, another student thought that conceptual mathscasts were of more benefit than procedural ones: “watching numbers getting written on a page does not interest me that much, so if I had asked about ways to engage with mathscasts in more affective [sic] ways that make sense of maths rather than procedures, maybe that would have prepared me for the way I approached this...”

In 2015, we addressed these concerns by justifying the effort required to acquire PCK as necessary for quality mathematics teaching. This was achieved by discussing examples of mathscasts that demonstrated different types of understanding. Students did struggle with the time taken to create quality mathscasts. For example, in Fig. 9, a student animated the diagram to move the semicircle away from the rest of the compound figure and then discussed the area of the semicircle first.

Fig. 9
figure 9

a, b, c Screenshots of external student’s animation on how to find the area of a compound figure in 2015

Whilst marking the mathscasts in 2013 and 2014, we identified a number of issues. In particular, some students did not actually teach the concept as if to a student, but instead explained their teaching strategies to the marker, so the mathscast became an extension of what they had outlined in the written assignment. Another issue was the length of mathscasts, as students tended to speak very slowly; by slowing their narration, they risked losing their audience through sheer boredom. In addition, length was also determined by the time it took to write digitally, change pen colour, erase content, etc. during the capture process. This could have been avoided had students used the pause button to insert information (as student 13 had done). We addressed these issues in 2015 by giving students access to previous student examples and providing specific advice, and subsequently found over three quarters of the students using the pause option.

Whilst we recommended a limit of 5 min for succinct explanation in the creation of these types of mathscasts at this year level, 24% of students in 2014 and 50% in 2015 responded that mathscasts should be 5 min or longer. In addition, the time taken to develop a mathscast was also a factor. In 2014, only two students (12%) cited the time taken to develop a mathscast as a disadvantage. This increased to 43% in 2015, which may be due to the time spent on visual and structural elements, as seen in Figs. 8 and 9, but may also reflect that students had to think more about what understanding a mathematics concept means. In 2015, one student commented: “I am not a natural presenter in unfamiliar contexts. The more elements I can incorporate naturally, the more time I have to think about the maths.”

On a final note, we were also interested in the impact of creating the mathscasts on the P/ISTs themselves. Whilst most comments in survey 2 related to the advantages of mathscasts for school students, one PST in 2014 commented on the value of the mathscast for self-reflection: “I can rewatch the podcast and see what I thought. This is very different from in a classroom situation”. Throughout the trials (from 2012 to 2015), surveyed students agreed that the process of creating mathscasts assisted their own understanding of mathematics concepts. From 2012 to 2014, between 73 and 82% of students agreed or strongly agreed with this statement, increasing to 90% in 2015. As one student noted, “I found this app very useful and interesting for learning and teaching math at the same time”. Another student reflected in a forum post after completing the first mathscast on a question she got wrong in a quiz:

Honestly, I was pretty mortified to see I answered this question wrong as I consider algebra one of my strong areas however I thought it would be interesting to reflect upon and that it would help me remember to never make the same, silly mistake again!

Discussion and conclusion

In this paper, we investigated the usefulness of an evaluative tool to improve the quality of student-created mathscasts through critical analysis of their own and others’ recordings. By using the tool when reviewing mathscasts, students were guided to address the purpose of mathscasts, structural elements, and pedagogical content knowledge.

To build on more generalist frameworks (Sugar et al. 2010; Soto 2014; Shafer 2010; Handal et al. 2005), we developed a framework specifically designed for mathscasts and linked to PCK. Our iterative evaluation of the framework produced an evaluative tool that P/ISTs found useful. It helped students focus on purpose and context; structural elements such as visual quality, clarity, and fluency of delivery; and PCK. In addition, the production of a series of related mathscasts by the P/ISTs highlighted the need to approach a mathematical concept with both breadth and depth, and successful P/ISTs produced their sets of mathscasts ensuring both coherence and completeness. Similar to the findings of Pena (2011) and Soto (2014) with school students, the process of creating and critiquing mathscasts actively engaged P/ISTs in the learning process. In addition, P/ISTs found that this process helped their own understanding of mathematics and how to teach it.

Our current approach to providing P/ISTs with mathscasting instruction is now more deliberate. First, we introduce an early assignment asking them to create, critique, and self-critique mathscasts, giving them time to play with the technology, make errors, and get support if needed. We use P/ISTs’ lack of expertise in critically evaluating mathscasts as a learning strategy for improvement. We then present and discuss the evaluative tool to guide the language and structure of mathscast critiques. The students then self-reflect before evaluating a series of mathscasts from other students. In the final assignment, students’ responses are expected to be more detailed and critical. Ultimately, these activities are designed to assist P/ISTs to produce more professional and pedagogically sound mathscasts for their final assignment, and we anticipate this strategy will be used in their own classrooms. An additional benefit is that these mathscasts also improve P/ISTs’ own understanding of mathematics.

In comparing data from 2013, 2014, and 2015, we found evidence that the evaluative tool helped students become more critical in their responses to peer-created mathscasts. In addition, incorporating the evaluative tool at the right time in the learning process allowed students in 2015 to produce mathscasts of a better quality than those in 2014. Issues that had been highlighted by students were all addressed. For example, the 5-min limit on these types of mathscasts does increase frustration for students, but it results in better mathscasts, as it forces students to think of better ways to present on the screen. We recognise that a 5-min limit is not suitable for all mathematics concepts and hence guide students to select a suitable concept before embarking on the assignment.

We acknowledge that the time taken to produce mathscasts that focus on understanding concepts can be problematic for students. We argue that the benefits are worthwhile for a number of reasons. First, the creator of the mathscast has to think more deeply about the concept being taught. Second, the lecturer teaching the course gains insight into online students’ understanding through the mathscast. Third, if created well, mathscasts are resources that P/ISTs can use in their own teaching.

We have started to test the robustness of this evaluative tool in teaching university-level mathematics; that is, we are investigating how the production of mathscasts by students deepens their own understanding of mathematics. This approach may be beneficial both at university and at high school, where more complex mathematical concepts are being introduced. We believe that studying mathscasts through the lens of this evaluative tool will assist students and teachers, in a variety of settings, to develop and critique their own mathscasts, or to select from the abundance of available mathscasts on the basis of good structural and pedagogical qualities.