
For more than ten years, research has been undertaken on the use of various mobile technologies to support mathematics learning and teaching at the University of Southern Queensland (USQ). During this time mobile technologies have become cheaper, easier to use, and more accessible. At the same time, various forms of video resources have been developed and utilized to support learning. According to Hartsell and Yuen (2006), online video-based instruction “brings courses alive by allowing online learners to use their visual and auditory senses to learn complex concepts and difficult procedures” (p. 31). In addition, we maintain that the kinesthetic sensory modality of students writing mathematics, as well as the effort required to craft an explanation incorporating gesturing and annotating, also increases students’ learning and understanding of mathematics.

Increasingly, mobile technologies enable users to migrate from traditional pen and paper to digital writing. This digital writing can easily be captured as a form of video instruction called screencasts. Screencasts can also be augmented with text, images, or animations (Student Screencasting with the iPad, 2014). Sometimes screencasts are referred to as video podcasts (vodcasts) or podcasts; however, podcasts usually refer to audio-only content. In our research, we consider screencasts an extension of vodcasts in that they include freehand inking from mobile devices, which in mathematics allows for effortless writing and drawing. To produce these screencasts, authors use mobile tablet devices (i.e. with a touchscreen and stylus) and recording software. The screencasts are then uploaded to video libraries on the web (such as OneDrive, Evernote, or YouTube) to be viewed anywhere at any time. A screenshot of a pre-service teacher explaining similar triangles is shown in Fig. 1.

Fig. 1

Example of an initial screencast by a student in 2015

Here the student, using a tablet, started with the typed question, triangle diagrams, and a ruler. He then recorded his on-screen writing and narration with a mobile app. The resultant screencast (https://vimeo.com/134467682) was then uploaded to Vimeo®.

Creating screencasts nowadays is a fairly easy process; ensuring good content, however, is far more challenging. Sugar, Brown, and Luterbach (2010) suggest that screencasts were originally developed to provide procedural information to students. While some screencasts today are being developed as pedagogical tools (Heilesen, 2010), many merely capture classes held face to face. In addition, the focus of many mathematical recordings still appears to be on procedural knowledge rather than any other form of mathematical knowledge. Yet screencasts have the potential to do much more. In particular, the focus of our latest research is on student-produced mathematics screencasts as a tool for reflective learning and effective teaching.

Here we aimed to address: what “understanding” mathematics means, using taxonomies from Skemp (1976), Mason and Spence (1999), and Watson (2002); Pedagogical Content Knowledge (Chick, Baker, Pham, & Cheng, 2006; Shulman, 1987); and how to critique a screencast (Sugar et al., 2010) in terms of structural elements (such as visual quality and delivery). This has allowed us to develop an evaluative tool, which guides students to critique screencasts and to produce effective screencasts themselves (Galligan et al., 2017 (online)). This evaluative tool has four major components:

  1. Purpose in terms of understanding mathematics

  2. Pedagogical Content Knowledge (PCK)

  3. Structural elements in terms of visual quality and delivery

  4. Cohesion and completeness in relation to a series of screencasts.

This chapter first summarizes our tablet-related research undertaken to date (Galligan & Hobohm, 2013; Galligan, Loch, McDonald, & Hobohm, 2015). It then reports on a case study of a USQ course in which pre-service teachers used mobile technologies and associated software to produce mathematics screencasts with the guidance of the evaluative tool. The chapter concludes with future explorations of combining the versatility of mobile technologies with screencasting to further enhance the understanding and teaching of mathematics.

Tablet-Related Research

In 2010 we highlighted the advantages of tablet technology in teaching one-to-many (the lecture), one-to-few (the tutorial), and one-to-one (individual consultations) (Galligan, Loch, McDonald, & Taylor, 2010). At that stage, most of the tablet-produced recordings were generated by lecturers. We trialled digital writing with on-campus students, noting the potential of the mobility of tablet technology to engage students and improve understanding. Since then, we have continued to use tablet technology to enhance teaching (Galligan & Hobohm, 2013; Galligan, Loch, McDonald, & Hobohm, 2015), similar to other universities (Loch, 2005; Loch & Donovan, 2006; Olivier, 2005; Al-Zoubi, Sammour, & Qasem, 2007; Anderson et al., 2004). Tablet technology was shown to enhance teaching and increase engagement in the classroom (Logan, Bailey, Franke, & Sanson, 2009), and, for pre-service teachers, created a “truly transformational experience” (Kosheleva, Medina-Rusch, & Ioudina, 2007, p. 332). It also encouraged new approaches to teaching (Maclaren, 2014), including the creation of screencasts. There are time costs to the screencast producer (Corcoles, 2012), but if screencasts have a positive impact on a large number of students, then the time is well spent.

As mobile devices become more ubiquitous, our focus has turned to the student, particularly pre-service and in-service teachers enrolled as university students. We have continued to refine our approach to support these students in creating screencasts, thus allowing us to assess their understanding of mathematics and how they teach it. Other research has also focussed on student-produced screencasts. Croft, Duah, and Loch (2013) reported on an internship for undergraduate mathematics students to create screencasts for peers, finding that students who created the screencasts benefitted by gaining a deeper understanding of mathematics concepts. Similarly, Wakefield et al. (2011) asked accounting students to produce screencasts for an assignment and found increased student engagement and performance. It has been documented in the past (e.g. Noss & Hoyles, 1996) that technology can be harnessed by teachers to become a window into student thinking. Now, with mobile technologies, teachers are in a better position to gain insight. In an elementary school setting, researchers have investigated student-generated screencasts using an iPad and Explain Everything® (Soto, 2014; Soto & Ambrose, 2016). Their studies found teachers were able to peer into students’ mathematical thinking with screencasts, providing springboards for rich discussions. Soto (2014) concluded that screencasting “has the potential to transform the learning environment by allowing teachers to gain more insight into their students’ mathematical thinking, encouraging students to reflect on their thinking and potentially influence the thinking of other students” (p. iii). Research by Richards (2012) at the middle school level, again using iPads and Explain Everything, found that student-produced screencasts allowed students to document their own learning.
In a study across three middle school classrooms in Germany, Ifenthaler and Schweinbenz (2016) found that while tablet PCs have the potential to “lead to new ways of designing personalized learning environments for the classroom” (p. 317), schools need teachers with good professional understanding, and knowledgeable technical staff.

Research has suggested that when students teach, they develop a deeper understanding of the material. However, Fiorella and Mayer (2013) argued that it had been unclear which features of teaching contributed to this learning. In their research with undergraduate students, they found that “when students actually teach the content of a lesson, they develop a deeper and more persistent understanding of the material than from solely preparing to teach” (p. 281). They also found that learning gains occurred even with video-recorded lectures of less than five minutes, even if delivered to an imaginary classroom. This suggestion of increased mathematical understanding has been found in other studies with mathematics students (Croft, Duah, & Loch, 2013). However, in these studies the nature of that understanding was not explored in any depth.

In 2012, when we first asked on-campus students to create screencasts, this was achieved relatively easily with university-purchased tablets/iPads and face-to-face support. Providing online students with the same experience was only possible by mailing mobile devices to small numbers of students. However, by 2015, most students had access to iPads and other tablet devices, and recording programs such as Jing and Explain Everything were easy to use. In addition, cloud technologies streamlined the uploading and viewing of screencasts.

From our early research survey results, students indicated an increased understanding of mathematics as a result of creating and reviewing screencasts. However, students’ responses in forum discussions did not focus on increased understanding (Galligan & Hobohm, 2013) or pedagogical content knowledge (PCK) when evaluating screencasts, focussing instead on procedural skills and structural elements. In our subsequent course development, we aimed to shift their focus to a deeper understanding of mathematics. We first used Skemp’s (1976) distinction between instrumental and relational understanding. We further divided understanding mathematics using the Mason and Spence (1999) categorization of knowing-that, knowing-how, knowing-why, and knowing-to. Added to this “knowing” framework is knowing about usefulness in context (Watson, Geest, & Prestage, 2003). This latter “knowing” includes, for example, understanding ratios or decimals in the context of measurement. Pre-service teachers’ PCK was also incorporated into our evaluative tool (Chick, Baker, Pham, & Cheng, 2006; Shulman, 1987). The PCK elements included three categories:

  • Clearly PCK (cognitive demands of the task, ability to represent concepts, and knowing the target audience);

  • PCK (content): content knowledge in a pedagogical context (procedural knowledge, mathematical structure and connections, and methods of solution); and

  • PCK (context): pedagogical knowledge in a content context (related to goals for learning).

Having identified the abovementioned components, we structured the evaluative tool into three categories: understanding mathematics; PCK; and structural elements of a screencast (Sugar, Brown, & Luterbach, 2010; Galligan et al., 2017 (online)). When creating a series of screencasts, we added an extra element of cohesion and completeness, encouraging students to create different screencasts based on the different “knowings”. The aim was for students to use this tool to evaluate screencasts and to produce effective screencasts themselves.

Our research asked two questions:

  • What does an evaluative tool for mathematical screencasts look like?

  • Does the use of the evaluative tool make a difference to the quality of the production and critiquing of student-produced mathematical screencasts?

Case Study

While the trials were conducted over four years, this case study describes two courses offered in 2015: an undergraduate mathematics for teachers course, and a similar post-graduate course for in-service teachers with a 90% online enrolment. The total enrolment amounted to 50 students (35 undergraduate and 15 post-graduate). The courses shared many lectures, and the post-graduate students had access to the undergraduate Learning Management System (LMS). In the LMS site, pre- and in-service teachers (P/ISTs) shared links to their own mathematics screencasts and peer-critiqued others’.

Each course contained a number of elements where mobile technology and screencasts were used:

  1.

    Lecturer-produced screencasts: After the first live lecture was delivered (and recorded) to on-campus students using a tablet device, all remaining lectures were pre-recorded to deliver content. On-campus students attended a two-hour workshop and online students a one-hour live virtual (Zoom®) session (also recorded). In the lectures, digital writing on tablets was actively used, particularly in the six weeks of mathematical content. The lectures also utilized the interactive quiz feature of Camtasia Studio®. In the live and virtual workshop sessions, tablet devices were used by the tutor and, at times, by the students, to explain mathematical concepts. Zoom allowed for screen-sharing and online annotation by both the tutor and the student.

  2.

    Assignment 1, where students created and linked their own screencast: Using a mobile device, students had to record a screencast in which they explained a mathematics concept. Typically, students created screencasts on an iPad, a tablet PC, or a graphics tablet using Explain Everything®, Jing, or ShowMe®. The recordings were predominantly viewed via web-linked cloud storage. This assessment instrument was used to identify common features of student-created screencasts, as well as to gauge students’ ability to create a screencast unguided. Students were encouraged to submit their “warts and all” version, regardless of imperfections such as errors and the informal language often used in a classroom.

  3.

    Peer and Self Critique 1: After students uploaded their screencast, they were asked to critique their own and others’ first ‘unpolished’ screencasts via a dedicated online discussion forum. This instrument was used to identify students’ ability to highlight features of a screencast without much initial guidance. The critiques were submitted as part of assignment 1. After students created and critiqued the first screencast, a discussion on the peer critiques was held in a lecture and subsequent workshop/Zoom sessions. It became evident that students were ill-equipped to critique screencasts; hence, the evaluative tool was introduced to frame the discussion. We invited students to review their first screencasts in the forum, and this produced some discussion, albeit limited.

  4.

    Peer Critique 2: Students were next asked to critique a set of mathematical screencasts from a previous cohort with the help of the evaluative tool, and to submit the critiques as part of assignment 2. The combination of the self-created and critiqued screencasts better prepared students to create more professionally produced and pedagogically aligned screencasts for the second assignment.

  5.

    Assignment 2: Students had to record a series of linked screencasts on how to teach a troublesome mathematics concept of their choosing, which could be given to school students to aid their understanding of the concept. This second set of screencasts was used to see if students could improve on their initial screencasts with the use of the evaluative tool. Students were encouraged to restrict their screencasts to a maximum of five minutes. The screencasts were uploaded by students to their cloud-based video libraries for markers to access via web link.

In summary, the order in which screencasting tasks were introduced was deliberate and followed the development of the course over four years of trialling. We introduced the creation of unpolished screencasts early in the semester to force students to engage with the technology and to promote active learning. At this stage, examiner presence was highly supportive. The self and peer critiques were then introduced to demonstrate different ways of structuring and presenting the screencasts, and to showcase varied approaches for solving mathematics problems. Attention was given to fostering a safe environment, to encourage discussion of errors and engender a spirit of support. The screencast submissions and subsequent reflections were then followed by the introduction of the evaluative tool to prepare for the second assignment submission of sequenced screencasts. The success of this approach, particularly for the online students, relied on easy access to mobile technologies, recording software, and cloud storage.

Method

In this research (with ethics clearance), we used a cooperative inquiry approach of iterated reflection and action (Reason & Riley, 2008) to create the evaluative tool, along with a qualitative research approach of constant comparison (Glaser, 2008) to analyse findings. Participants in both courses were asked to create, self-evaluate, and peer-critique screencasts that explained mathematical concepts, as described above. The case study aimed to ensure that P/ISTs (a) learnt to produce quality screencasts using appropriate mobile technologies, (b) understood the mathematics more deeply, and (c) developed a better understanding of how to teach mathematics concepts.

Apart from the assignments and the forum posts, data were collected from pre- and post-surveys to measure changes in attitudes and overall experiences in creating mathematical screencasts. The post-survey repeated similar questions to the pre-survey (about the value, advantages, and disadvantages of mathematics screencasts). In addition, students were asked about their attitude to screencasting (after having created their own), the mobile technologies they used, and their ratings of the importance of colour, legibility, clarity of voice, correct mathematics, completeness, clarity of explanation, comprehensiveness, and contextualising. We also asked specific questions about the use and value of the evaluative tool.

Results

This chapter focusses on students’ deeper understanding of mathematics and how to teach it more effectively with mobile technologies. Detailed results of this study, particularly around the development and effectiveness of the evaluative tool, can be found in Galligan, Hobohm, and Peake (2017 online).

The following section incorporates results from peer and self-critique 1 and the post-survey, with a focus on three themes from the evaluative tool: understanding mathematics, PCK, and analysis of structural elements. Other results from the post-survey highlight the mobile technologies used and student opinion about the value of screencasting.

Understanding Mathematics

Students’ comments from peer-critique 1 often related to understanding mathematics, and they also related to PCK elements. The word “understand” was one of the top 20 words used in the comments (Galligan et al., 2017 (online)). The example below illustrates “know why”, and the structure of the screencast is typical of the cohort. A student’s (IST15) critique suggested that the screencast could have included relational understanding (know why). Notice this student also commented on clarity. This was a common response from students, as clarity is immediately apparent when viewing a screencast (Fig. 2).

Your screencast was very clear and succinct. If I were using your screencast for revision purposes or explanation purposes, I might have wanted to have asked the question, “Why do we have to reverse the inequality signs when we divide by a negative number ….” (IST 15)

Fig. 2

Screenshot of student solving an inequality (using an iPad, Jing and uploaded to screencast.com)

Students appreciated new ideas on approaches to teaching mathematics (i.e. Clearly PCK) and methods of solving as demonstrated by their peers (i.e. PCK Content). Even though most of the methods used to solve problems were similar, students appreciated seeing their method used by others. At times, the approaches were quite different to those taken by the cohort. For example, one student mentioned the “cross-method” of factorising trinomials (know how). This was new to many students (four of whom explicitly commented on the discussion forum about this concept).

I’ve never heard of factorising trinomials using the cross method, and now that I know how easy it is I might start using it to teach my students. I liked how you set out your page and used the cross in the middle to show which pair of factors is being multiplied by the other pair of factors. I also liked how you used trial and error to show students that working out answers is simply that, working out the right answer. (IST 9)

In the post-survey, P/ISTs were asked to rate whether their screencasts assisted their own understanding, and could assist their future students’ understanding, of mathematics concepts (Fig. 3). Over the four years of using the survey, 82% of P/ISTs strongly agreed/agreed (with an additional 15% remaining neutral) that the process of creating screencasts assists their own understanding. Ninety percent of P/ISTs agreed, and the remainder were neutral, that it could assist student understanding. While data are not directly comparable between cohorts of different years, it is interesting to note that agreement was slightly higher for both students and teachers in almost every year.

Fig. 3

P/ISTs’ opinion of the usefulness of screencasts to aid their students’ and their own understanding of mathematics concepts (2012–2015)

In an open-ended question on the process of peer reviewing, all students commented positively on the process and the resultant increased understanding. The following is a typical response:

Providing reviews was extremely helpful in assisting my understanding of the topics. Receiving reviews was also excellent in pointing issues I may not have considered. (IST12)

Due to the nature of the course, we could not explore the full extent of students’ understanding of particular mathematics topics, as highlighted by Fiorella and Mayer (2013). This is the focus of current screencasting research within an elementary mathematics course undertaken by many education students.

Pedagogical Content Knowledge (PCK)

In the initial peer critiques, there were instances of students’ thinking around PCK elements, but not as strong as the “understanding” theme. One “Clearly PCK” theme that emerged was knowing the target audience (examples IST6 and IST10) and knowing how to represent concepts (example IST6) in different ways:

It’s intentionally short (1 min) … At least it made me think about how succinctly we present information and different ways of saying things! (IST6)

I liked that you explained that the method you were going to use by saying out loud… I did find it a bit confusing to follow only because I didn’t have the original question to look at… it can be a good habit to encourage students to look back and re-read the question. (IST10)

However, as the course began to emphasize PCK, subsequent screencasts reflected PCK elements. For example, “original questions” were a feature of the start and end of screencasts (i.e. bumpers) created by students, along with more carefully crafted screencasts.

Your audio comes across as very calm and well-delivered, with your text well organised. …I have seen some, even professionally-made ones, where the presenter constantly repeats him/herself, over-talking, and jumps around so much it can be confusing. This can leave the viewer … quite exhausted! (IST6)

In the post-survey analysis, open-ended questions were themed. One theme, about the use of screencasts in the classroom, focussed on the PCK elements of repetition and efficiency, but also on knowing their students.

Honestly now I think it’s one of the teaching tools I will definitely use in teaching. And that is like daily basis. …, students that are bit slow to get a concept can play and watch over and over again instead of asking the teacher ten times, which they wouldn’t do anyway! (IST7)

….can link in more resources, can record your best version of teaching the material, can have writing pre-written so that it’s legible, can have resources pre-pasted into it. (PST26)

I like the fact that you can bring the outside world into the class room … that may help students stay attentive and help them think of mathematics as more than just a subject …. I also think there is room to use them for class and home help. Teams of teachers could share their work and have all classes in a school having similar lessons. (PST10)

In 2015, we asked students to rank how difficult it was to rate peer screencasts according to the evaluative tool. The results are shown in Fig. 4, with ratings of extremely difficult/difficult and extremely easy/easy combined. It became evident that structural elements (legibility, colour, and voice) were easier to rate than PCK elements (circled). The difficulty in rating abstract aspects is reflected in students’ comments in the next section of this chapter.

Fig. 4

P/ISTs’ opinion of the difficulty of rating peer screencast components/aspects (2015)

We also asked P/ISTs in 2015 to rate their own screencasts using the evaluative tool (Fig. 5). Students were relatively critical of their attempts compared to the markers’ evaluations. We noted that PCK still appeared difficult to achieve, with half the students rating themselves average or below average. They also rated some aspects of the structural elements, such as narration and mathematical language, low, reflecting the difficulty of articulating their thoughts appropriately and succinctly. This is evident in one student’s comment in the discussion forum: “Did anyone else find it hard to analyze themselves????”.

Fig. 5

P/ISTs’ opinion of their own final screencasts (2015)

Analysing Screencasts Structural Elements

Structural elements were divided into visual quality (setting out, screen movement, directing attention, legibility, colour, and aids) and delivery (bumpers, voice, narration, and general and mathematical language). Students in 2015 were invited to re-evaluate their first screencast with the evaluative tool. Self-critiques were non-compulsory, and hence only seven P/ISTs volunteered to critique their own first screencasts. The students commented predominantly on visual elements (colour, setting out) and delivery (voice, general and mathematical language). A typical example can be seen in IST1 (Fig. 6):

After viewing the reflective analytical tool, I can see that I need much improvement… but I could have used a different colour pen to direct the attention back to parts of the working out and diagram. However, it was a good idea that I used a diagram as an aid in the mathscast. My voice was very monotone. I found it challenging to talk and write at the same time. An idea might be to pause the video while I write and then talk. I believe that my general and mathematical language I used was suitable for teaching the concept. (IST 1)

Fig. 6

Screenshot of final moment in IST1’s screencast (3.12 min)

Another example provided similar evidence of reflection on structural elements (i.e. setting out, legibility, and colour) and also included voice and language, which featured frequently across screencasts.

After having a look at my Screenchomp screencast with the analytical tool, I think there is so much room for improvement. I could have used separate screens for the two different methods of finding the area. This would then have an effect on the legibility of my screencast. I am glad I used different colours, which did make it little bit more appealing and easy to follow. I also think I could improve on my delivery, maybe my tone could have [been] better, maybe I need to speak louder and at a slower pace. As for mathematical language part, I think I did alright, the vocabulary and terminology were appropriate. (IST 7)

Tools such as Explain Everything® have features that assist in the creation of well-structured screencasts. For example, it has a pause button to allow for uploading words or diagrams; options for pointers and highlighting to assist with directing attention; insertion of mathematics equations, audio, and images; and editing capabilities. In addition, these tools are used in combination with the latest touchscreens and pens (such as the Microsoft Surface Pro® and iPad Pro®), which allow for smooth, effortless writing.

Mobile Technologies Used

At the start of the research project, mobile tablet devices were a novelty and not readily available to students. Tablets and their application in an educational setting were largely unexplored, but this changed substantially with the introduction of the iPad and other mobile tablet devices, along with access to cloud storage. This change is reflected in post-survey questions about the technology students used to create their screencasts.

The post-survey was used from 2012 to 2015. In 2012, of the 57 students enrolled, only 26% (15) completed the post-survey, whereas in 2015, of the 50 enrolled students, 21 completed the post-survey (a 42% completion rate). The most prominent change noted (see Fig. 7) was that by 2015, 67% of students were using an iPad to create their screencasts (up from less than 20% in 2012, when we first surveyed students), although a few students used their smartphones, as seen in Fig. 8.

Fig. 7

Devices used by students to create screencasts

Fig. 8

Student using a smartphone to record writing on paper

Survey questions over the four-year period indicated changes in the use of screencasting apps. In 2012, most students were using Jing® or PowerPoint® (Fig. 9), compared to 2015, when Screenchomp® and Explain Everything® became more popular along with easy access to cloud storage.

Fig. 9

Proportion of students using programs to produce screencasts

Value of Screencasting

Due to the difficulty and subjective nature of rating screencasts and associated experiences, we provided various open-ended questions in the post-survey to identify additional aspects of the screencasting experience. Students were asked to comment on their own perception of the advantages/disadvantages of screencasts and to describe any changes in emotions, feelings, and attitudes towards screencasting. In 2015, five themes emerged: improvement in screencast production; emotive attitude; teaching efficiency; changes in opinion; and disadvantages. Apart from statements on how students improved and understood the value of quality, engaging, and accurate screencasts, students also welcomed the skill of producing such personalised educational artefacts through mobile technologies.

Positive emotive language was used, such as “so excited”, “gained confidence”, “mind blowing”, “embraced the opportunity”, “I am really happy”, and “desire to produce screencasts”.

Another theme reflected students’ positive shift in attitude from novice to professional screencaster. A typical example is illustrated by PST12, commenting on the emerging realization of the multimodal and social learning qualities of screencasting:

To be honest, I thought they were a little redundant in a classroom situation and suited to long distance study only. My attitude has definitely changed regarding this, as screencasts allow a certain specificity that can be orchestrated which I imagine would be a lot harder in the ad hoc classroom environment. After watching some great screencasts online, I was much more relaxed this time around, enjoying the relaxed candor of presenters which veered me away from a rigid monologue I employed the first time. (PST12)

A final theme identified the disadvantages of screencasts, particularly regarding procedural vs relational understanding and the potential distance between student and teacher. Because screencasts do not provide a live synchronous experience, some students commented on the inability of students to “interact with the teacher”, and that the recordings added an “air of distance between the student and the teacher”. This disconnect can also be seen in PST31’s comment:

If a teacher wanted to offer a (literally) tangible example of a maths problem, say, mixing up pancake batter in a classroom, a mathscast would not achieve this. Virtual reality is not tangible reality in this instance. (PST31)

The amount of effort required to create a screencast (reflected in Corcoles’ (2012) research) is still real, as this student’s comment shows:

I think that they are very valuable tools, yet it remains that initially there is a heavy impact upon time as I familiarise myself with the technologies, whilst also trying to familiarise myself with the demands of being a teacher. (PST anon)

This effort has lessened because mobile and related cloud technologies are now much easier to use. Similarly, the culture of schools around embracing screencasting enabled through mobile technologies is changing. While one student related:

I actually found myself at a screencast moment in my recent prac placement, but decided against it because they didn’t ‘do that’ there so it seemed like too much of a stretch. (PST22)

Another commented:

I completed my placement at the end of last term and was lucky enough to be placed with maths teacher who loves using technology in the classroom and teaches year 8 - 12

and went on to mention the use of Kahoot®, working with iPads and tablets, and Desmos®.

Conclusion

Teaching and learning mathematics can be greatly enhanced by the appropriate selection of mobile technologies. This chapter has outlined the use of screencasting and associated mobile technologies as an important approach to helping P/ISTs understand and teach mathematics. Our research to date has found that the creation of carefully constructed, quality screencasts evokes positive, even transformational, effects on in-service and pre-service teachers, similar to those found by Kosheleva and colleagues (2007). Similarly, we have noted an increase in students’ perception of improved mathematical understanding, as found in other studies (Croft et al. 2013; Kosheleva et al. 2007). However, there are caveats. Examiners have commented on the excessive time taken to mark these screencasts. Like all technology, screencasts are tools that should assist learning and teaching, but not to the exclusion of teacher intervention. In our research, we have found disadvantages such as reduced interactivity and the time-consuming effort required to create well-crafted screencasts, as mentioned by Corcoles (2012). We wanted to use screencasts to peer into students’ thinking in more depth, as Soto (2014) was able to do with school students, but students were reluctant to expose their errors in thinking. Our evaluation of the effects that student-created screencasts have on pre-service teachers’ approach to broader pedagogy is relatively exploratory to date. It needs to go beyond the confines of one course or one subject and into practice. As one student reflected:

I am amazed at how many ways I can see uses for this every day now! I studied foreign languages for many years and I can see how this teaching area would also benefit by incorporating inking devices with narration. (PST20)

Meanwhile, our teaching has allowed us to identify and trial newer mobile technologies for online learning, and we are keen to explore the effects.

Can we see a future where students and teachers can capture artefacts, such as screencasts, anywhere, anytime, and with any device? We are seeing some of this now in blogs (e.g. Mayer, 2016), YouTube videos, and university-hosted centres (e.g. the Mathematics and Statistics Help (MASH) Centre). It is even more imperative that teachers and students have a framework to critique and produce screencasts in order to ensure good pedagogical quality. Once students know this process, our research can focus on what triggers their understanding, and to what extent, so that they develop a deeper and more persistent understanding of the material (Fiorella & Mayer, 2013). Future research, within a mathematics content course, will explore the extent to which pre-service teachers’ understanding of mathematical concepts improves through their creation and peer critiquing of screencasts. In particular, it will probe students’ own PCK related to what it means to fully understand and teach mathematics concepts. Such research will build on future developments in mobile technologies, their ease of use, and their abilities to engage human presence.