1 Introduction

New developments in the technology available to classrooms, coupled with recent middle-grades mathematics standards, provide an opportunity for mathematics classrooms in the United States to focus more on the teaching of concepts than previously. Over the past ten years many states have adopted the Common Core Mathematics Standards (National Governors Association Center for Best Practices & Council of Chief State School Officers, 2010) or close derivatives, driving the adoption of new instructional practices. These standards stress key mathematics concepts and relationships, going beyond the teaching of procedures in ways not previously addressed widely in US mathematics classrooms. As a result, there is demand for instructional resources that support the teaching of concepts. This makes it an opportune time to introduce dynamic representational technologies into mathematics classrooms, as effective use of dynamic representations can support the investigation of foundational mathematical concepts (Orrill & Polly, 2013). This article provides insights into how 24 teachers of middle school students (age 11–14) in a large state in the United States varied in their pedagogical shifts away from traditional models of mathematics instruction toward models of instruction that provide the opportunity for students to build conceptual knowledge.

The environment in which teachers and students engaged with technology-based, visual, dynamically linked multiple representations is called SunBay Digital Mathematics. SunBay is a curricular activity system supporting middle grades students’ learning of core mathematical concepts (Vahey, Knudsen, Rafanan, & Lara-Meloy, 2013). Core components of SunBay are dynamically linked visual representation technology, a Predict-Check-Explain (PCE) pedagogic routine, and teacher professional development (PD) aimed at developing knowledge of both mathematical content and the PCE pedagogy in technology environments.

We report on a qualitative study that drew from 24 out of a total of 342 teachers from middle schools in two large districts in the United States who participated in the SunBay program. We address the following research question:

How did participating teachers use the technology of dynamically linked visual representations and the curriculum’s recommended pedagogical strategies to support student learning?

2 Background

The key principles for the SunBay curricular activity system were developed from prior research on dynamic visual representations, effective pedagogy using dynamic representations to support student learning, and effective PD strategies for teacher learning and implementation.

2.1 Dynamic visual representations: theoretical foundations

SunBay can be traced to James Kaput’s work on dynamic representational technologies (DRTs). Taking a broad historical view, Kaput (1992) pointed to two important attributes of mathematical representations: they contain and create mathematical meanings rather than passively reflect some form of external mathematical “knowledge”; and both they and their entwined mathematical meanings continually evolve. That such representations are not simply novel presentations of the “same” or “old” mathematics implies that new kinds of content knowledge are potentially constructed by the advent and use of new dynamic representations. For instance, representing motion or other time-based phenomena visually and dynamically on the screen, with which students interact with a mouse or touchpad, may foster different understandings of the related mathematical concepts than ones afforded by static symbols or graphs.

Kaput applied these ideas to the mathematics of change and variation when he created a new DRT known as SimCalc MathWorlds (Kaput & Roschelle, 1998). This environment was capable of engaging youth across the grade levels in mathematical ideas that otherwise remained inaccessible or difficult to learn. This effort evolved to focus not only on representational innovation, but equally on activity design, PD, and the entire curricular support system required to sustain innovation. Perhaps as a result, SimCalc is one of the few technology-based mathematics interventions whose significant effects on learning have been documented on large populations through randomized controlled trials (Roschelle et al., 2010).

Kaput’s theses arose at the time of the initial emergence of dynamic representational technologies, the so-called Dynamic Geometry Systems of the late 1980s (Laborde, 1989; Jackiw, 1991). Though these systems have been widely studied over 30 years, their research literature has surfaced no single unambiguous definition of the term “dynamic,” and instead often deploys it as a loose genre descriptor rather than as a precise or narrowly-delineated functionality (see King & Schattschneider [1997] for an early survey of both term and technology). In our work designing and evaluating various dynamic mathematical representations in the SunBay project, we find it useful to articulate, compare, and name two specific aspects of dynamic experience drawn from the broader literature. Intra-representational dynamism refers to the capacity of a single representation to change in time, and moreover to change in ways pedagogically designed to illustrate specific mathematical structure. We take the draggable triangle of Dynamic Geometry Software as a prototypical example, where temporalized tours through a continuously-related set of visual examples of a single system of constructed relationships produce powerful and embodied conceptions of dependence and functional relationship (Laborde et al., 2006), and through them, new discourses of mathematical generalization (Sinclair & Yurita, 2008). By contrast, inter-representational dynamism occurs when DRTs coordinate two or more representations of a mathematical phenomenon in a synchronized fashion: for example, when dragging the graph of a function modifies the parameters of the equation that describes the function. Of course both types of dynamism—intra-representational and inter-representational—can be at play in a single environment. And though inter-representational dynamism draws on a long history of multiple representations in mathematics education (Goldin, 2014), it takes on a unique character in technology environments in which dynamic linkages between representations cause changes made in any one representation to be reflected instantly across all.

A Kaput-based perspective on both forms of dynamism pays attention to the shifting nature of mathematical claims supported by apparently traditional representations as they are newly empowered by the affordances provided by dynamic technology (by affordances we mean Hollebrands’ notion of a potential for action [2007]). These shifts happen in both intra- and inter-representational dynamic relationships. Thus a visual geometric diagram takes on, under intra-representational dynamism, new abilities to anchor general claims and arguments about invariance that were formerly reserved for non-visual geometric representations such as deductive proofs. Similarly, learners’ ways of attending to, and reasoning about, the inter-representational relationship between potential graphs of a function and the values of its parameters clearly differ between a traditional graphing exercise (of curve-stitching points calculated through manual arithmetic) and a more dynamic situation in which a parameter varies in real-time response to the mouse in one’s hand, while the graph updates accordingly. They differ in time-scale (slow vs. fast), primary action (calculating vs. dragging), and semiotic focus (the development of a single curve through pointwise extrusion, vs. behavior of a family of curves in motion). These representational shifts in mathematical character or meaning offer not only mathematical opportunities for learners, but a combination of mathematical and pedagogical opportunities and challenges for teachers, who must consider the classroom’s emergent mathematical ideas as they are distributed across multiple simultaneous dynamic representations.

In terms of perspectives on dynamic technologies in the broader research literature, where embodied cognition focuses on the timeline of a learner’s hand gesturing with a computer mouse (e.g. Arzarello et al., 2002), and instrumental genesis (such as Trouche, 2014) focuses on learners’ acquisition of new tools over the course of a lesson, a unit, or a semester, Kaput’s perspective on the relation of mathematical meaning to its ever-evolving representational infrastructure broadens the dialectic into developmental, historical, and evolutionary timelines spanning differences in diverse learning materials, between teachers’ formations and their students’, and across forces at play in basal texts, curricula, and policy.

2.2 Dynamic representations in the design of SunBay units

Across multiple SunBay curricular units, students and teachers work with DRTs, engaging in activities that feature both intra- and inter-representational dynamism. These include SimCalc-derived technologies as well as DRTs and associated curricular units designed to support understanding of the concepts of ratio, algebraic expressions, and geometric transformations. We briefly describe two of these DRTs below, along with the narrative storyline of the SunBay units in which they appear.

2.2.1 A DRT for equivalent ratio

Figure 1 (and https://youtu.be/3_6NQmGYpj4) shows the DRT used in the 6th grade True Colors unit in which students explore the mathematics underlying the idea of equivalent ratios, primarily through a storyline about artists wherein students are asked to mix different amounts of the same color of paint. The software provides multiple coordinated representational tools for investigating the equivalency of ratios. Students mix color blends by sprinkling an arbitrary number of drops of paint, chosen from two primary colors, in a large “mixing area” in the upper half of the screen (Fig. 1a). As the number of these drops increases, their net blend begins to emerge. At the same time, a “spectrum bar” (Fig. 1a and b) shows a static continuum of hues ranging between the primary colors with the currently-mixed blend dynamically marked on this continuum and dynamically moving as drops of either color are added to the mix. A “blend bar” describes the current mixture as a fixed-width chart, partitioned into individual drops, with all drops of one color on the left, all drops of the other color on the right. These tools’ intra-representational dynamics ensure that the spectrum’s “mix marker” always appears on the spectrum at the same horizontal location as the border appears between the two sets of colored drops on the blend bar. As shown in Fig. 1b, students can mix, save, and eventually visually compare multiple blends (giving rise to multiple blend bars, although only a single active blend is shown in the mixer). Finally, a “container” enables students to organize drops into equally sized rows and columns forming a rectangular array. This allows students to show the equivalence of different paint blends by switching between them and comparing the structure of the resultant containers. A supplemental display (not shown) graphs specific blends on perpendicular core-colored axes.

Fig. 1

a Ratio DRT showing one blend, randomly mixed. b Ratio DRT showing one (of two) blends, actively “contained”
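To make the representational logic described above concrete, the following sketch models, in simplified form, how a blend can be encoded by the spectrum and container representations. It is an illustration, not the SunBay implementation; the names (Blend, spectrum_position, container_rows, equivalent) are ours. The spectrum position is simply the fraction of the blend made up of one color, and the container test asks whether the drops can be arranged into identical rows.

```python
from dataclasses import dataclass
from fractions import Fraction


@dataclass
class Blend:
    """A paint blend described by counts of drops of two primary colors."""
    color_a: int  # drops of the first primary color
    color_b: int  # drops of the second primary color

    def spectrum_position(self) -> Fraction:
        """Position of the mix marker on the spectrum bar: the fraction
        of the blend that is color A."""
        return Fraction(self.color_a, self.color_a + self.color_b)

    def container_rows(self, row_a: int, row_b: int):
        """Number of identical rows of (row_a, row_b) drops the blend can be
        organized into, or None if it cannot -- the visual container test
        for equivalence with the ratio row_a:row_b."""
        if (self.color_a % row_a == 0
                and self.color_b % row_b == 0
                and self.color_a // row_a == self.color_b // row_b):
            return self.color_a // row_a
        return None


def equivalent(b1: Blend, b2: Blend) -> bool:
    """Two blends are equivalent when their mix markers coincide on the spectrum."""
    return b1.spectrum_position() == b2.spectrum_position()


# Numbers from the classroom episodes later in the article: 16:22 is close to
# 5:7 but not equivalent, while 40:56 is exactly eight rows of 5 black, 7 white.
print(equivalent(Blend(5, 7), Blend(16, 22)))   # False
print(equivalent(Blend(5, 7), Blend(40, 56)))   # True
print(Blend(40, 56).container_rows(5, 7))       # 8
```

Two blends are equivalent exactly when their spectrum positions coincide, which is also when each container can be arranged as whole copies of the same row.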

2.2.2 A DRT for equivalent algebraic expressions

A second DRT, depicted in Fig. 2 (and https://youtu.be/pOCJJecil4M), allows students to create and edit algebraic expressions and compare them for equivalence. The DRT is designed to provide conceptual insight into properties such as the distributive property, and to support students’ understanding of why procedures such as adding like terms "always work." It appears in two units. The storyline in a 6th grade unit, Little x Games, follows a card game in which players combine terms on cards to create equivalent expressions, and set the prices of sets of cards by using algebraic expressions. The storyline in the 7th grade unit, 3D Design Studio, involves creating algebraic expressions to determine how much polymer in a 3D printer will be needed to print different designs, and comparing different algebraic expressions to determine if they are equivalent. Within the DRT, students enter expressions symbolically (e.g., 15z, 12 + 3z) and the DRT presents each in a graphical representation for inspection and comparison. This representation shows individual constants or variables as blocks aligned with a number line, with height proportionate to their value, and multiples of constants or variables—algebraic terms—as stacks of like blocks. Thus a left-to-right horizontal reading of the expression narrates the arithmetic of the expression (with stacking and displacement showing multiplicative and additive orders of evaluation), while a vertical reading represents the changing value of the expression over its arithmetical construction. An emergent property of this representation is that the "net height" achieved at the rightmost end of the expression describes the value of the entire expression. Inter-representational dynamics are such that dragging a z block right or left of its original location within the expression 12 + 3z generates a succession of new symbolic representations (12 + 2z + z, 12 + z + 2z, z + 12 + 2z), while the intra-representational invariance of the height of these differing block expressions (or the inter-representational invariance of the measure of this height on a nearby number line) reveals that all of these symbolic representations are equivalent in value. Within these representations’ equations, variables are always depicted as having specific values (e.g., z = 1.0 in Fig. 2a and z = 2.0 in Fig. 2b). Yet these values can be varied discretely (by entering a new value in an edit-box) or continuously (using animation controls at the bottom of the window). Assigning specific values to variables permits reasoning about how an expression’s value is calculated, while varying these values over time reveals aspects of the structure of that expression invariant across possible variable values. Varying values also helps students find salient details of the expression at critical behavior boundaries (e.g., when a slowly decreasing z value passes below z = −4 and 12 + 3z thus becomes negative).

Fig. 2

a and b Expressions DRT comparing 15z to 12 + 3z at different values of z

Under this representation one can visually compare the structure and height of expressions. Equivalent expressions are those that maintain identical “net heights” for all possible values of their variables. The graphically depicted structure over changing values offers tangible evidence for student argumentation about how and why two expressions are or are not equivalent. For instance, changing variable values supports the notion that two expressions can be equal for specific values of the variable but not for all values, and hence are not equivalent (see Fig. 2a and b).
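The equivalence argument the block representation supports can be sketched as a simple computation: treat each expression as a function of z and compare "net heights" across a range of z values. The sketch below illustrates the underlying mathematics rather than the DRT's code; the helper names are hypothetical, and a sampled check (like the animation) provides evidence of equivalence rather than a proof.

```python
# A minimal sketch (not the SunBay tool) of the "net height" comparison the
# block representation supports: an expression's value at a given z is the
# height reached at the right end of its block diagram.

def net_height(expression, z: float) -> float:
    """Evaluate an expression (given as a function of z) at one variable value."""
    return expression(z)


def appear_equivalent(expr1, expr2, z_values) -> bool:
    """Sampled check: do the two block expressions reach the same net height
    at every tested value of z?  (Evidence for equivalence, not a proof.)"""
    return all(abs(net_height(expr1, z) - net_height(expr2, z)) < 1e-9
               for z in z_values)


samples = [-5, -4, 0, 1, 2, 3.5]

# 12 + 3z and its rearrangements are equal at every sampled z ...
print(appear_equivalent(lambda z: 12 + 3 * z,
                        lambda z: 12 + 2 * z + z, samples))      # True

# ... whereas 15z matches 12 + 3z only at z = 1 (Fig. 2a), so the two
# expressions are equal for a specific value but are not equivalent.
print(appear_equivalent(lambda z: 15 * z,
                        lambda z: 12 + 3 * z, samples))          # False
print(net_height(lambda z: 15 * z, 1) == net_height(lambda z: 12 + 3 * z, 1))  # True
```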

2.3 Predict-check-explain (PCE)

The potential of dynamic technology environments in supporting the development of conceptual understanding through conjecturing and proving has been explored in prior research (e.g., Laborde, 2000). Students can check and reason about their conjectures by observing the effects of their interactions with the representations in terms of what changes and what does not change. Where a conflict arises between a conjecture and evidence from the technology, students must think about how to resolve it. It is through the process of reconciling intuition and observation that students make the connections required for conceptual understanding.

PCE is a foundational instructional routine in SunBay, found implicitly in earlier SimCalc materials (Tatar, Roschelle, & Hegedus, 2014). It is used in the curriculum and in PD as an instructional routine (Kelemanik, Lucenta, & Creighton, 2016) in which students develop conceptual understanding as they engage in the practice of providing explanations for their ideas. In a PCE cycle students are presented with a question relating a simulation to a mathematical representation. The features of the representation in its static form and the students’ intuitions about the simulated phenomenon are the basis for students’ predictions. Using DRTs students test their predictions without having to rely on the teacher. The students are then asked to explain why their prediction matched or did not match what they saw in the technology. The activities and DRTs were designed to provide opportunities for students and teachers to use both intra- and inter-representational dynamism in their explanations.

A typical SunBay PCE task from the Managing the Soccer Team unit is shown in Fig. 3. In this example students are first asked to Predict what a faster runner’s graph will look like, given a graph representing a slower runner’s motion (in earlier activities students investigate how the simulations are simplistic models of actual motion). To Check, students edit and then “run” the DRT; a simulation associated with the graph shows the motion of the runners. Finally, students are asked to Explain why their prediction did or did not match the observation, leading to an understanding of how the steepness of the graph relates to speed—a precursor to understanding slope.

Fig. 3

a A workbook example of Predict-Check-Explain. b The associated DRT screen
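The mathematics behind the Check step can be illustrated with a small sketch of our own, not the MathWorlds simulation itself: two runners moving at constant speeds, with positions sampled each second. The speeds below are hypothetical; the point is that the faster runner gains more distance per second, which is exactly what makes its position-time graph steeper.

```python
def positions(speed_m_per_s: float, seconds: int):
    """Position of a constant-speed runner at t = 0, 1, ..., seconds."""
    return [speed_m_per_s * t for t in range(seconds + 1)]


slow = positions(2.0, 5)   # e.g. 2 m/s
fast = positions(3.0, 5)   # e.g. 3 m/s

for t, (p_slow, p_fast) in enumerate(zip(slow, fast)):
    print(f"t={t}s  slow={p_slow:4.1f} m  fast={p_fast:4.1f} m")

# The per-second change (rise over run) is constant for each runner and larger
# for the faster one -- the precursor to slope that the Explain step targets.
print("slow rate:", slow[1] - slow[0], "m/s   fast rate:", fast[1] - fast[0], "m/s")
```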

The teacher’s role in supporting students’ sustained engagement in all phases of PCE is critical to learning. Teaching moves include asking questions to ensure students understand the initial prompt, encouraging students to run the simulation and to interpret what they see, drawing attention to particular features of mathematical representations, and eliciting explanations of a mathematical nature.

2.4 Teacher knowledge and PD

Effective use of the SunBay DRTs and PCE routine requires that teachers have a high level of technological pedagogical content knowledge (TPACK; Koehler & Mishra, 2009) and engage in the process of instrumentalising the materials in a way that conforms to their own teaching practices (Hoyles, Noss, Vahey, & Roschelle, 2013; Trouche, 2004, 2014). TPACK is knowledge that integrates three areas: technology, pedagogy, and subject-matter content. It is not a simple union of the three areas but an emergent understanding in which all three interact (Mishra & Koehler, 2006). For example, teachers might understand how to represent concepts to support student understanding, which is part of pedagogical content knowledge (PCK) (Shulman, 1986). However, when such representations take the form of DRTs, teachers need to support students in taking advantage of their dynamic affordances. This requires that teachers make a fundamental pedagogical shift away from traditional models of mathematics instruction toward a model that demands a higher level of TPACK.

Successful instrumentalisation of SunBay requires that “students and teachers come to use the potential of the technology (or other digital artifact) for their own purposes, transforming it as they do so” (Hoyles et al., 2013). SunBay PD is designed to support this process as teachers develop understanding of how mathematical concepts are represented with technology and what pedagogical strategies can support the use of technology for student understanding. SunBay PD engages teachers with the curriculum in two ways. First, teachers work on each unit as learners, connecting technology to content, and developing their technological content knowledge. Then they work on the units as teachers, connecting technology to pedagogy, developing their technological pedagogical knowledge. The PD aims to support teachers’ instrumentalising as they are given the opportunity to consider how they will use the technology in their own teaching.

Engaging with PCE as learners enables teachers to use dynamic representations of mathematics themselves. For example, teachers’ traditional approach to equivalent ratios in the United States is primarily numerical and procedural. In contrast, dynamic representations offer new ways to consider ratio equivalence (Fig. 4). Teachers explored the visual dynamic representations as they went through a PCE cycle. Each time a new representation was introduced, the group discussed its intra-representational strengths and limitations for representing aspects of ratio as well as its inter-representational affordances for connecting to other representations.

Fig. 4

Incoming ratio knowledge and new learning about the meaning of equivalent ratios

Working on the units as teachers, they explored the technology through pedagogical lenses. At this stage, PCE was introduced as the overarching pedagogical strategy that structures the teacher’s work. Each phase was modeled with the whole group, and the roles of the teacher and students were discussed. Teachers also considered possible alternate conceptions or difficulties that might arise. Pedagogical advice was offered and discussed, e.g.:

  • Encourage collaboration around the technology.

  • Ask, "How do you know?" even when the right answer is given.

  • Ask students to:

    • make connections between representations;

    • show, draw, and verbalize their ideas.

  • Ensure students talk more than the teacher.

3 Methods

3.1 Context of the study

Our analysis focuses on a subset of teachers who used the SunBay materials, selected from an evaluation study that used a randomized control design involving 342 middle school teachers from two large US school districts. The evaluation study compared treatment schools to control schools to measure the impacts of the SunBay program (Sirinides & Gray, 2018). Teachers in the treatment schools were provided with SunBay-compatible devices (one for every three students), used two SunBay units per year, and used the district-adopted curriculum for the remainder of the year. Treatment teachers participated in a 2-day PD workshop during summer break and then met for another 3 hours shortly before teaching the unit. They were visited twice per year by district facilitators who provided individualized support, and they could request additional support. Control teachers used the district-adopted curriculum for their full year of mathematics instruction; this curriculum did not involve using DRTs.

The independent external evaluation showed that the SunBay program had significant positive effects on two teacher measures: a subset of TPACK focused on teacher comfort and confidence with technology (d = 0.29) and Teacher Allowance for Student Struggle (TASSP) (d = 0.26) (Sirinides & Gray, 2018). The TASSP survey measured teachers’ belief that mathematics teaching and learning should include productive struggle (Clark et al., 2014). Survey items for these measures can be found in “Appendix A”.
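Assuming the conventional standardized mean difference (the evaluation report specifies the exact estimator used), Cohen’s d = (treatment mean − control mean) / pooled standard deviation, so effects of 0.26–0.29 indicate that treatment teachers scored roughly a quarter to a third of a pooled standard deviation higher than control teachers on these measures.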

These findings are encouraging because when a teacher is not comfortable using technology, or does not believe technology has the potential to enhance teaching and learning, it is unlikely they will implement SunBay effectively. Additionally, PCE is consistent with students engaging in productive struggle, and these findings suggest that the PD was effective in building teacher beliefs compatible with PCE. We also found that teachers in this study could act as technical advisors to students on how to use the DRTs; they demonstrated comfort with using technology in teaching and developed classroom norms for transitioning between different activities and resources.

These findings led us to an investigation of how SunBay teachers used technology and PCE together: in particular, how they instrumentalised dynamically linked multiple representations in supporting students’ mathematical learning.

3.2 Data source and methods

To explore how teachers used SunBay’s DRTs in their classroom instruction, we took a qualitative approach to analyzing classroom video data (Creswell & Poth, 2012; Saldaña, 2012). The classroom sessions were video recorded as part of the independent evaluation. The evaluators selected classes to record so as to capture a range of districts, grade levels, and teaching experience, as well as a range of implementation (informed by district facilitators). We reviewed all 26 classroom videos and used 24 for our analysis. We excluded all control teachers as well as two videos that captured lessons not requiring technology use, because this study investigates variation in teachers’ use of DRTs when provided with a particular set of materials and PD experiences. Table 1 summarizes the grades, mathematics topics addressed, and number of lessons for each curriculum unit in the video data (for further information, see Ebby, Sirinides, & Fink, 2018).

Table 1 Lessons used for the analysis
Table 2 Summary of levels of classroom technology use

We analyzed the video data to examine how teacher instrumentalisation resulted in the use of features of DRTs to support students’ mathematical learning. We started with the categorization provided by the external evaluation, which sorted the videos into low, medium, and high implementation. Two of the authors then engaged in a six-phase coding process:

  • Phase 1: The researchers jointly reviewed five videos and created analytic memos. Initial codes were generated based on the expected use of PCE as presented in the professional development.

  • Phase 2: The researchers reviewed the original five videos and an additional five videos to apply the initial coding scheme and independently generate emergent codes. The researchers generated an initial set of themes based on the codes (see “Appendix B”).

  • Phase 3: The researchers coded another set of five videos. The researchers discussed their coding, resolved disagreements and further refined the themes.

  • Phase 4: The researchers applied themes and codes to another set of five videos, resolved the disagreements through discussion, revisited the prior videos to check that the themes still applied, and generated the final written version of the themes.

  • Phase 5: One researcher applied the themes to the rest of the videos and then the two researchers resolved any remaining issues.

  • Phase 6: The researchers created levels by combining themes and assigned the levels to the classroom videos.

As the coders wrote their analysis, they checked with other team members for trustworthiness.

4 Results: the range of technology use with predict-check-explain

Analysis revealed variations among the 24 lessons in the extent to which technology use provided students the opportunity to develop conceptual understanding. We define three levels of teachers’ instrumentalisation of the PCE routine and dynamic representations, ranging from using dynamic representations in a singular, static way to using them dynamically to support explanations of mathematical concepts (see Table 2).

4.1 Level 1: technological representations used minimally

Teachers at Level 1 minimally integrated DRTs into their instruction. The teachers told students that they were required to use technology to check their predictions; they also projected the outcome of the DRTs to a shared classroom display. However, classroom discourse did not include comparing students’ predictions to what the DRT showed dynamically, instead focusing on procedural considerations.

The following example from Little x Games (Excerpts 1 and 2) illustrates using technology as a static diagram (and thus not taking advantage of either intra- or inter-representational dynamism). Ms. S (all teacher names are pseudonyms) worked with the class on a task involving the cost of purchasing playing cards for a fictional game. The expression the class wrote for the total cost of the cards was 12.50 + 3n + 1.50n, where n represented the number of specialty cards to be purchased. The task then required them to write another expression for the total cost using a single term with n in it. The teacher input the expression 12.50 + 3n + 1.50n into the tool and asked students to predict an equivalent expression, check it using the tool, and then explain. She circulated among the students and stopped to converse with individual students. The following excerpt illustrates Ms. S’s interaction pattern with student groups and individuals.

Ms. S: Okay, what equivalent expression can you write there?

Student: I don’t know.

Ms. S: What’s 3 plus 1.50?

Student: 4.50

Ms. S: Good, perfect, so write that.

Excerpt 1 Level 1 teacher interactions during small group work

Typically, students input the given expression and their predicted expression into the tool. After looking at a student’s predicted expression, Ms. S asked funneling questions (Wood, 1998) that elicited the procedure for combining like terms. She did not draw attention to the blocks representing the expressions in the tool, nor did she ask students to animate the DRT to show a range of values of n for the two expressions. There was little room for students to evaluate the structural nature of the two expressions or to consider whether they were always equal in value. The following excerpt illustrates how Ms. S then led a whole-class discussion.

Ms. S: What you should have seen is [pointing to the tool], you can’t combine these into just one term [teacher changes the projection view from the software to the workbook, as she writes the right side of the equation: 12.50 + 3n + 1.50n = 12.50 + 4.50n], because that starter deck, does it have a variable?

Student: No.

Ms. S: It’s not a like term, is it? So you can do it to 2 terms. You could have $12.50 – and most of you did this – plus 4.50n, for each additional, because it came with a box. I wanted my cards in a box. I didn’t want them all over the place, and my rubber band keeps breaking, right? But I couldn’t just add them all together.

Excerpt 2 Level 1 teacher interactions during whole class discussion

Similar to her approach with individual students, she did not use the dynamic nature of the DRTs to explain why two expressions were equivalent (or not) based on the structure of the block expressions. Furthermore, she did not orient students’ attention to variance and invariance of term blocks and expression height as n changed. Instead, she quickly moved to the workbook to show the static algebraic form of the expressions and shifted attention to procedures for expressions having like and unlike terms.

4.2 Level 2: representations used to check predictions for correctness without invoking technological scaffolding for concepts

Implementation at this level drew more on the DRTs than at Level 1. Teachers acknowledged that either correct or incorrect predictions were acceptable, because the technology could be used to check them. Teachers had students generate ideas and test them on their own using dynamic features of the DRT. However, teachers suggested DRT use in limited ways, such as checking only whether predictions were correct. Further, when it came to explaining, they provided hints or explanations that did not draw students’ attention to the mathematical structures that the representations revealed.

4.2.1 Instructional practice around “Predict”

The Predict phase is designed to create an opportunity for students to express an intuition about the mathematical representations or the behavior of the simulation. An important feature of teachers’ practice was communicating the idea that it is acceptable to make a prediction that is not accurate, because the prediction could be checked with the technology.

Excerpt 3 illustrates how a teacher launched a PCE task in the ratio unit by communicating what students were expected to do during a Predict activity. Before this excerpt, Ms. Q asked a student to read the task in the student workbook. She then introduced the Container, shown earlier in Fig. 1, demonstrated how to work with it, and communicated her expectations of students. Finally, she introduced predicting as follows:

Ms. Q: Here’s what I want you to do. Remember when you predict something, it’s just a guess. We don’t check our guesses until afterwards. It’s like our hypothesis. You guys are gonna walk through number 3, 4, 5 with your group, so you guys can talk quietly. Remember you’re just right next to each other, so you don’t have to be loud, but when it says Predict, do make a guess. Don’t check it until you get to the Check phase. You are not gonna get into trouble if your prediction is wrong.

Excerpt 3 Level 2 teacher introducing Predict

The teacher communicated that a prediction is a hypothesis that students were going to test with the tool. Moreover, by calling a prediction a "guess" (a looser term than "hypothesis"), she made explicit to students that "you are not going to get in trouble" for an inaccurate prediction, setting a positive norm for student participation. Students in mathematics classrooms often avoid participation for fear of making a "mistake." This introduction was instead aligned with the SunBay notion that a prediction is expected to be brief and fallible, so that any student, regardless of prior knowledge, can engage in the activity and refine their ideas through an iterative process of checking and reasoning with the technology.

4.2.2 Instructional practices around “Check”

During Check, teachers implementing at Level 2 encouraged all students to check their predictions by attending to the representations in the technology. Sometimes teachers prompted students to check using inter-representational relationships, but more often they focused on intra-representational relationships only. For example, in the equivalent ratios unit, students can tell whether two ratios are equivalent or not by looking at whether the markers fall on the same location on the spectrum bar (see Fig. 1). In the algebraic expressions units, students can tell whether two expressions are equivalent by looking at the height of block expressions as variable terms within them are animated.

Excerpt 4 illustrates how a teacher interacted with student groups when they were working with the ratio tool to check their predictions.

Ms. R (with group 1): How did you figure this out?

Student: We compared the spectrum on here.

Ms. R: You used the spectrum. Good. Have it in the simplest form.

Ms. R (with group 2): Those are not equivalent. Look at it. You’re looking at it and they’re not equivalent.

Student: Yeah, I know, that’s what I’m trying to figure out.

Ms. R: The blend bars are not together.

Excerpt 4 Level 2 teacher interactions when checking predictions

The teacher had students check the correctness of their predictions using intra-representational relationships that required little interpretation, but she did not probe student thinking further for inter-representational relationships, such as connections to a container, that could have yielded mathematically important insights.

4.2.3 Instructional practices around “Explain”

While teachers implementing at this level encouraged the use of technology at the Check phase, their Explain phase was similar to Level 1. They tended to ignore the technology or use it in a limited way. They invoked procedures such as "cross multiply" or "simplify fractions", or applied rules such as "combine like terms" or "use the distributive property", instead of focusing on the structures of the representations.

4.3 Level 3: multiple representations used across PCE focusing on mathematical structure

Implementation at this level showed teachers using DRTs to support students in creating their own predictions, investigating multiple representations to check those predictions, and building explanations by attending to the connections and covariation drawn from these investigations. In contrast to Level 2, instructional practices for Predict and Check were enriched by asking students to consider the inter-representational relationships afforded by multiple representations. The Explain phase was quite distinct from the prior two levels: teachers elicited student explanations based on inter-representational relationships, and these observations yielded important insights about mathematical structure and properties.

4.3.1 Instructional practice for predict and check

Teachers at Level 3 set an expectation that predictions need not be correct and should be checked by engaging with the DRT. Teachers demonstrated and encouraged students to investigate the different available representations. Teachers prompted students to use more than one representation to visualize the mathematical ideas behind the numeric or symbolic representations students used in their predictions and then asked for explanations based on inter-representational dynamism, such as how those numeric representations were related to other visual representations.

In lessons from the True Color Mural unit on ratio, students checked the spectrum bar to find out whether their ratio was correct or not. They typically did this by looking at whether the divider in each blend bar fell at the same place in the spectrum. Teachers at this level further asked students to use the container tool dynamically, providing new mathematical insights. The container tool can help students develop an understanding of the multiplicative relationship present in ratios through visually organizing drops of paint in each blend (see Figs. 1b and 4).

4.3.2 Instructional practice for explain

Teachers at Level 3 called for the use of inter-representational relationships and their different mathematical and pedagogical affordances. The following excerpt illustrates Ms. P’s use of inter-representational dynamism to challenge student thinking. Ms. P facilitated a conversation following student work on finding ratios that are equivalent to the ratio 5:7. Two students started with 5 drops of black and 7 drops of white. Then they added some black drops and white drops, until the shade they created looked the same as the shade of 5:7. When the teacher asked them to report on their findings, they claimed that 16:22 was equivalent to 5:7. (Note that the ratios are very close in value, hence their shades are similar, but they are not equivalent.)
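In numerical terms the near-miss is small: a 5:7 blend is 5/(5 + 7) = 5/12 ≈ 0.417 black, whereas 16:22 is 16/(16 + 22) = 8/19 ≈ 0.421 black, so the two mix markers sit at nearby but distinct positions on the spectrum (equivalently, 16 · 7 = 112 ≠ 110 = 22 · 5).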

Ms. P: I like how this group is checking the artwork. Why are you checking the artwork, Holly or Ben?

Holly: That way we can try to match it with them.

Ms. P: Okay, they’re matching the color to the other ones, they’re testing it. What other strategies are you using? Ken, what strategy [are you using]?

Ken: The arrows. [referring to the triangular portion of the spectrum bar]

Ms. P: The arrows.

Ken: Line up.

Ms. P: Line up, okay. Ken is checking if his fourth blend lines up with the spectrum. Any other methods of testing? Anyone else? So basically the spectrum and the artwork. Good. Did anyone check the container? Okay, I want you to check the container. Make that first row 5 to 7. See if it works, because remember, we need to know if it’s 5 to 7, if it matches 10 to 14. A lot of hands, good job. Who wants to make blend 4? Blend 4.

[Maria comes up to the board and puts ratio 16:22 on the projected app.]

Ms. P: Let’s leave it there, let’s leave it like that. Okay. Alright, let’s look carefully. Let’s look carefully. Okay, look at the triangle [points to the triangle on the spectrum bar]. Maria, look carefully at this triangle. Tell me if it never moves? Let’s look at—ready, what happened to that? [teacher uses the app to go back and forth between 5:7 and 16:22, and draws the class attention to the motion of the triangle on the spectrum bar]

Class: It moved.

Ms. P: It moved. Even though it’s very close, look at the blend. You see how it moved a little bit to the left? So blend 4, it’s a little bit darker. Look, Maria, look how I’m going to be on blend 3 right here. Look at how the blend 4 moves a little bit to the right. So if we’re buying paint, it probably won’t show the difference, but it’s a little tiny bit darker because the spectrum moved to the right, the triangle.

Excerpt 5 Level 3 teacher use of dynamically linked multiple representation, part 1

The teacher used the same representation that the student used, the spectrum bar, to check whether the two ratios were equivalent. However, as illustrated in Excerpt 6, when another student offered an equivalent ratio, the teacher moved beyond how equivalence was shown on the spectrum bar, orienting students’ attention to how the container representation showed the ratios and pressing them to justify the equivalence by connecting multiple representations.

Ms. P: Now, Jose. Put the numbers how you think, create the same blend exactly so the spectrum triangle does not move.

[Jose comes up to the board and puts ratio 40:56 on the projected app]

Ms. P: Now, let’s test it. What do you think? Raise your hand if you agree that that’s the same color. Look, let’s look at the triangle. It did not move. Let’s look at the artwork. Now, let’s go to the container. Let’s make—in your iPad, you can actually make it smaller.

[…]

Look at blend 1, look at blend 2. 5 black to 7 white. Look at blend 1. Still what do I have in each row? 5 black to 7 white. What do I have on each row for blend 3? What do I have? Jason, on each row?

Jason: 5 and 7.

Excerpt 6 Level 3 teacher use of dynamically linked multiple representation, part 2

After the class verified that all the equivalent blends had containers consisting of copies of the row of 5 black and 7 white drops, the teacher facilitated a conversation about the relationship between the ratios in numerical form and the ratios presented in other representations.

Excerpt 7 provides another illustration of how a teacher probes student reasoning with the container tool in the ratio unit. During a group discussion where students figured out the ratio of black and white that made the same shade of gray, several students offered their observations using the spectrum bar. After she revoiced students’ explanations from other representations back to the class, Ms. A reoriented student attention to the container. She constructed two blends of paint in the ratios 4:10 and 16:40. Using the container tool, she reorganized the drops of paint using the rectangular arrays shown in Fig. 5. Notice that each row of the container on the right has the same number of black and white drops as found in the one row of the container on the left. That is, each container can be seen as having “the same” mix of colors, while the container on the right has more paint.

Fig. 5

Container representation of two equivalent ratios
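In numerical terms, Fig. 5 shows that 16:40 = (4 · 4):(4 · 10): the larger container is four identical copies of the 4-black, 10-white row, and both blends are 4/14 = 16/56 = 2/7 black, so they have the same shade while the larger one holds four times as much paint.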

Ms. A: How does the container show that it’s the same blend?

Student C: If you were to add four of the first container, it will look like the second container.

Ms. A: If you were to add four of the first container, it would look like the second container?

Student C: Uh hum

Ms. A: How?

Student C: Because the second container is just the four of the first container put together.

Ms. A: Okay. So what student C just said is this. The second container, he said that the second container is four of the first container, okay? If the first container has one row, does container 1 look like one row of these? Raise your hand. Does the first container look like one row of this container? Student D?

Student D: Yeah

Ms. A: Yes it does and it’s four of them in here. They match exactly. So how many times as much paint is this one gonna make? What do you think?

Student E: How much…

Ms. A: How many times? How many times as much?

Student E: Four

Ms. A: Four times as much paint. Same shade. Questions?

Excerpt 7 Level 3 teacher interactions during Explain

Ms. A selected the container as the focus of the whole-class discussion and connected it to the different strategies students used. She then asked a series of probing questions about the structure of the containers representing the two ratios, and therefore about why the ratios are equivalent. Unlike teachers at the previous level, who relied on the spectrum bar for checking correctness, Ms. A also used the container representation to explain mathematical structure and deepen students’ understanding of the multiplicative relationship underlying ratio.

5 Discussion

Our analysis shows that, while teachers consistently used the SunBay technology, they varied in how they instrumentalised the multiple representations found in the DRTs: the levels described differ not only in tool use, but also in pedagogical approach and in the type of opportunities students had to acquire procedural knowledge or develop conceptual understanding.

The findings suggest that PCE, a coherent and consistent routine throughout the SunBay materials, can provide a structure for teachers to instrumentalise technology as a mathematical tool in their teaching, but that the current amount and form of PD may not be sufficient to support widespread use of best practices with dynamic technologies. The SunBay PD strategy was to support teachers’ learning of the PCE routine in two ways, first as learners and then as teachers, and then to tie the two together. This strategy was found helpful in strengthening the multiple relationships between technology, content, and pedagogy. However, to help teachers go beyond the first two levels of implementation, we hypothesize that teachers require extended experience as learners to develop new understanding from interacting with dynamic technology. Further work as learners could allow teachers to better understand new ways of doing and expressing mathematics and to notice student thinking as students interact with technology, thereby developing both the rationale and the methods for fully using dynamic representations to deepen student learning.

In what follows we explore potential underlying factors for teachers working at different levels. It was not possible to examine these factors using the data from this study, but such an examination could be the subject of future studies and would certainly be important for those who conduct PD that promotes the use of DRTs, particularly for teachers at the middle grades.

One potential factor in the observed teacher differences was teacher belief. As designers of the DRTs and of the curriculum with the embedded PCE routine, we saw how challenging it was for many teachers to fully embrace the approach. We know that teachers’ beliefs and goals influence their instruction (Charalambous, 2015; Fennema et al., 1996), and we can imagine ways in which these may have influenced teachers’ decisions about how to use the DRTs. Some teachers may have seen themselves as effective at teaching the procedures needed to demonstrate proficiency on standardized tests and did not want to abandon those techniques while using SunBay Math; these teachers were likely at Level 1. Some teachers may have had conceptual goals but were not able to use the technology to support achieving those goals. These teachers, likely at Level 2, often trod well-worn pedagogical paths such as using funneling questions, which led students to produce the answers the teacher expected. Such teachers may have (at least implicitly) believed that what we call "funneling questions" are appropriate in-the-moment scaffolding through which students achieve desired learning goals.

Another challenge was teachers’ understanding of the connections between the tools and the mathematical content of the curriculum units. It is likely that the PD was insufficient for teachers to develop this understanding. Time was one constraint, but also, in our experience, teachers often believe that they should have already mastered the mathematics content. As a result, in the PD context they may not want to show that they have more to learn, which may prevent them from delving deeply into the mathematical affordances of the DRTs. Creating an environment in which teachers feel supported in displaying what they do not know about content requires setting norms and developing trust, both of which take time and require activities that explicitly address them (Shechtman et al., 2019). If teachers do not develop deep knowledge of the content and its relationship to the DRTs as a foundation, it is unlikely that they will be able to use new pedagogies that exploit the DRTs, and particularly inter-representational dynamism, effectively.

Even for teachers who had or developed these foundational understandings, there is the challenge of learning new pedagogies. Research suggests that a practice-based approach can be productive for this (Grossman et al., 2009), and while the PD model did allow time for addressing pedagogy, there may not have been enough time for the kind of repeated engagement in teaching practices that would allow deep and meaningful changes. This is consistent with research finding that teachers need time to develop a rich repertoire of practice, without which they may revert to pre-existing teaching methods (Clark-Wilson & Hoyles, 2019). An implication is that an individual teacher may not consistently perform at the same level, as performance depends on the mathematics being taught, the pedagogical practices the teacher is comfortable with, and the tools being used in the classroom.

Effectively instrumentalising dynamic representational technologies and a PCE pedagogy requires deepening teachers’ understanding of technology, content, and pedagogy, and of the interactions among them. The examples shared illustrate that the PCE routine structures classroom activity for both students and teachers. Classroom norms were created that streamlined the work with technology and determined the roles of students and teacher, the times when technology should and should not be used, and the ways in which it should be used. Use of PCE can provide students the opportunity to develop conceptual understanding by investigating multiple representations, and it structures the work for the teacher by integrating technology, content, and pedagogy.

6 Conclusion

This study explored ways in which middle school teachers and students engaged with technology-based dynamically linked multiple representations as they investigated important mathematics concepts. We attempted to answer the question: How did participating teachers use the technology of dynamically linked multiple representations and the curriculum’s recommended pedagogical strategies to support student learning? The results of our analysis point to three main levels of use of dynamic representations in teaching:

  1. Teachers exploited technology minimally and instead focused on procedures independent of the representations.

  2. Teachers exploited representations in the technology mostly to check predictions for correctness without invoking technological scaffolding for concepts.

  3. Teachers exploited multiple representations in the technology during prediction, checking, and explanations focusing on mathematical structure.

We attribute the limited use of dynamic representations to gaps in teachers’ knowledge of the mathematical affordances of the DRTs and to the extent of their authentic adoption of PCE. We also speculate on the role of teachers’ beliefs and goals. Dynamic multiple representations carry meaning beyond what has traditionally been required in mathematics classrooms in the United States, and they must be learned and embraced by teachers as a means to deepen students’ conceptual understanding. This extended work should be part of ongoing teacher PD to support teachers’ learning.

While our results are preliminary and based on limited data, they illuminate the complexity and depth of teacher knowledge required to successfully exploit dynamic, representation-based features of technology to support students’ conceptual understanding. It would be worthwhile to investigate, with a larger sample, relationships between the patterns found in this study and other teacher characteristics, such as mathematical knowledge for teaching (Ball, Hill, & Bass, 2005) or beliefs about teaching and students. Furthermore, since teachers in the study used specific curricular tasks structured around PCE, the patterns might not generalize to teachers who use dynamic technology with less structured tasks. Further studies are needed to explore how teachers exploit dynamic multiple representation technology combined with different curricular tasks.