1 Introduction

Over two decades of research have established that computers, on their own, do little or nothing to change the nature of learning (Penuel 2006). However, the use of technology becomes far more promising when technologies are integrated into classrooms in ways that support teachers’ ideas about and knowledge of both the technology and the particular content area being taught (Mishra and Koehler 2006). Digital games in particular have demonstrated potential for supporting student learning across disciplines (Gresalfi 2015; Barab et al. 2007; Pareto et al. 2011; Squire 2006). The diversity of game designs makes it challenging to pinpoint exactly why games support learning, but much has been said about the potential of games to motivate and capture student attention (Dickey 2007; Garris et al. 2002; Lepper and Malone 1987), to situate disciplinary learning in realistic contexts (Barab et al. 2005; Clarke and Dede 2009), and to offer consistent and substantive feedback about reasoning (Gresalfi 2015; Mayer and Johnson 2010; Nelson 2007; Rieber 1996).

Despite their potential, integrating digital games into instruction also creates new challenges for teachers and requires a shift in instructional practices; the new technology cannot simply be substituted for past practices. One challenge is that students usually play digital games individually and reach different points in the game at different times, making student progress a challenge to monitor and whole-class conversations difficult to structure. Additionally, teachers are often unsure about how to support students to share their thinking without the traditional artifacts of worksheets or overhead projectors. As a consequence, the mismatch between current pedagogical practice and the practices afforded (or demanded) by new technologies creates barriers to integration into classrooms (Ertmer 2005; Straub 2009). Integrating digital games into schools is not simply a matter of making the tools available (Ertmer et al. 2012, 2014; Takeuchi and Vaala 2014). How and when games are used in relation to other instruction, the role that teachers take as they are playing the game, and how the game is integrated into the overall classroom ecology all play a role in whether and what students ultimately learn.

Indeed, research that simply examines the “efficacy” of games could miss the potential of games to transform the overall classroom learning ecology. While we know that teachers are using games, there are few studies of how games are integrated into instruction. A recent survey of teachers (Fishman et al. 2014) found that 57% of respondents used games in their classrooms at least once a week, and over 80% reported being moderately comfortable using games in their classrooms. But teachers also reported many barriers to implementing digital games, including the challenge of finding games that connect to the school’s curriculum (47% of respondents) and uncertainty about how to integrate games into instruction (33%). What game integration looks like for these teachers, and how using games affects teaching practice, remains unclear. Therefore, rather than examining the fidelity of game implementation, part of the goal of the current study is to explore what teaching with videogames looks like when teachers choose how to integrate games into their classrooms.

One factor that can contribute to the integration of games in classrooms is teachers’ experience with the games. More knowledge of and experience with a technological innovation contributes to successful implementation and affects teacher practice in a number of ways (Ertmer and Ottenbreit-Leftwich 2010; Ertmer et al. 2006; Mumtaz 2000; Sheingold and Hadley 1990). For instance, more familiarity with a technology gives teachers a sense of what to expect when using the tool in a classroom, which reduces teachers’ anxieties during implementations. Experience with a technology potentially reduces the stress caused by unexpected technical issues as well. As with any new classroom technology, the more teachers use it, the more they understand how students interact with the technology and what aspects are difficult for students to understand. Knowing how students use the technology can lead to more organized and focused classroom discussions based on students’ needs.

This paper will contribute to our emergent understanding of what teaching using videogames can look like, focusing on a specific example of a videogame that was designed to incorporate teacher-student interactions, rather than to replace instruction. In this context, the teacher’s role is central to implementing the game successfully. The game that is the focus of this study is called Boone’s Meadow, an interactive problem solving experience that involves using mathematical ideas of ratio and proportion, important and difficult concepts for middle school students to understand. We explore how one teacher uses the game across 2 years, and examine how the teacher’s role in supporting problem solving during gameplay changes as the teacher gains experience with the technology. Specifically, we ask:

1. How does the teacher’s support of students’ mathematical thinking change as the teacher gains experience with the game?
2. Who has the mathematical agency to solve problems, and does that change as the teacher gains experience with the game?
3. How do the teacher’s interactions with students around the narrative of the game change as the teacher gains experience with the game?

2 The Game

The game students played in this study is called Boone’s Meadow. The game presents a problem solving adventure that leverages concepts of ratio and proportion and builds on the storyline of a project-based mathematics activity from the Adventures of Jasper Woodbury, called “Adventure at Boone’s Meadow” (Cognition and Technology Group at Vanderbilt 1997). We leveraged this activity for the game in part because of the history of research and development behind the original project-based learning unit (Bransford et al. 2000; CTGV 1997; Van Haneghan et al. 1992; Van Haneghan and Stofflett 1995), which suggested its effectiveness at supporting problem solving and learning. In adapting the storyline to the richer affordances of an interactive game, we made modifications to be more consistent with game conventions, including adding more choice points and therefore different possible outcomes.

The game begins when students are told that an endangered eagle has been shot in Boone’s Meadow—a place that cannot be reached by car and takes 6 h to reach on foot. Figure 1 shows the map of the game world, with the veterinary clinic, the gas station, and Boone’s Meadow as the three main points. In exploring the problem and resources, players meet three characters who own different ultralight flying machines, which fly at different maximum speeds, operate with different fuel efficiencies, hold different amounts of gas, and carry different weights. Students must decide which route to take, which plane to fly, the length and time of the journey, how much gasoline will be required (and where to stop to get it), who will pilot the plane, and whether any additional cargo is necessary (or feasible) given the weight limit of the small aircraft. The problem that students are solving is rich and complex: they need to determine what information is relevant and necessary to solve the problem, and, once they have determined this, they must use the information to make a final determination of which plane and route is best, how long the trip will take, and how much gas they will need. Figure 2 shows the Route Planning Tool where players calculate and input their decisions. Players have two attempts to save the eagle, and they can use the second try to find a more optimal solution (taking less time, using less gas, and spending less money) or to test another route.

Fig. 1 The map of the game world in Boone’s Meadow

Fig. 2 The Route Planning Tool, where players plan and calculate how to save the eagle in Boone’s Meadow

The rationale for our design came from a commitment to seeing learning as participation in a set of practices; because our goal is to empower learners to become agentic problem solvers, we design games that create opportunities for students to see and experience the world in that way (cf. Greeno and Gresalfi 2008; Barab et al. 2010). For example, it is one thing to be able to generate a proportional ratio (such as 1:8 = x:1); it is quite another to use that idea to figure out how much gas it would take to travel a particular distance (if your car gets 8 miles per gallon, how much gas would you use to travel one mile?). Although the calculations are the same, the activity fundamentally transforms the mathematics from a mere calculation to a legitimate problem solving task (Greeno 1991; Boaler 2002; Lave et al. 1984). Thus, central to our work is designing game-based learning environments in which what you know, what you do, and who you become are interrelated (Barab et al. 2010).
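To make this distinction concrete, the unit-rate reasoning in the gas example above can be written as a proportion (the 16-mile extension below is an illustrative number, not a value from the game):

$$\frac{1\ \text{gallon}}{8\ \text{miles}} = \frac{x\ \text{gallons}}{1\ \text{mile}} \;\Rightarrow\; x = \frac{1}{8}\ \text{gallon per mile},$$

so that, for example, a 16-mile trip at 8 miles per gallon would require \(16 \times \tfrac{1}{8} = 2\) gallons. In the game, this same reasoning is what lets players move between a plane’s fuel efficiency and the fuel needed for a chosen route.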

A typical implementation of the game takes between 3 and 5 class periods to complete, including both time when students are playing the game and time when they are discussing it, either in small groups or in whole class discussions. An implementation begins with a letter that introduces the game to the students and invites them to play. The teacher and the students review the letter to ensure that they understand the purpose of the game. Students then move to computers to play the game. Students are allowed to talk to each other while they play, and there is often quite a bit of chatter and laughter during gameplay. Each day of gameplay usually begins with a whole class discussion reviewing what students know about the narrative of the game, along with a review of the mathematics relevant to that day’s play.

3 Methods

3.1 Setting/Participants

This paper focuses on Ms. Lynn (a pseudonym), a 7th grade mathematics teacher who, at the time of the study, had 7 years of teaching experience. Year 1 of the study was her first time using Boone’s Meadow, and she had very little experience using games of any kind in class. The middle school in which Ms. Lynn worked was ethnically diverse, served primarily a low-income community (92% free and reduced lunch), and enrolled many students who did not speak English as a first language (30% English language learners). The school is located in a medium-sized city in the Southeastern United States. Ms. Lynn used the game in her classroom for 4 days during the fall of years 1 and 2. In year 1, Ms. Lynn’s focal class included 29 students, and in year 2, 32 students.

Ms. Lynn participated in a one-day professional development session held by the researchers the summer before she implemented the game for the first time. The PD session provided an overview of students’ thinking about ratio, what teachers wanted students to learn from the game, the problems students solve in the game, and how teachers might fit the game into their instruction. Teachers in the PD session had an opportunity to play through the game and discuss their plans for implementation.

Ms. Lynn also received a set of teacher materials, which the research team designed. The teacher materials included pacing suggestions, such as which “missions” in the game students should complete each day, questions to guide whole class discussions before and after gameplay (involving ratio, components of the narrative, and the use of tools in the game), suggestions for thinking conceptually about ratio, and supplemental mathematics problems to discuss rates, ratio, and proportion with students. Ms. Lynn used the example problems from the teacher guide quite frequently, but she adjusted the pacing and discussions to meet her students’ needs.

3.2 Data Collection

Each day during the game implementation, a camera was set up in the back of the room to capture the teacher’s talk and actions. A researcher panned and zoomed the camera to follow Ms. Lynn’s movements. The entire class period on game days was recorded, so even if students only played the game for 10 min, the entire 50–140 min class period was recorded (length of classes varied because of modified schedules and time students spent traveling and settling in between classes). Although an analysis of the whole class discussions before and after gameplay time might show additional changes in the teacher’s practice between years 1 and 2, for this paper we were interested in how changes in the teacher’s actions during gameplay supported students’ problem solving around the game. Therefore, our analyses focused specifically on individual teacher-student interactions while students were actually playing the game.

Students were also given pre- and post-tests to assess their understanding of ratio and proportion. Below we detail how the assessments were developed. While data were collected in three class periods, we chose to analyze Ms. Lynn’s first period class for this paper because it was the class with the greatest pre- to post-test change in both years. Since research already tells us about the difficulties of using games in classrooms, we wanted to focus on the class that demonstrated the greatest learning gains in order to describe what worked. We interviewed Ms. Lynn informally after the game implementations in both years, and we used her responses to triangulate our findings.

3.3 Analysis

3.3.1 Assessments

The assessments used in this project were developed by a team of mathematics education faculty and Ph.D. students, drawing on example items from Lamon (2012), Lobato et al. (2010), and Schwartz et al. (2011). The assessment was then vetted by teachers during the PD session, who offered feedback about the wording of the items and their relation to ratio and proportion as it was taught in their schools. The assessment ranged from procedural items (e.g., generating an equivalent ratio), to application items (e.g., comparing relative rates), to complex problem solving items aligned with the problems in the game. A team of four researchers developed a system for scoring the pre- and post-tests. Most questions were scored on a scale of 0–2: 0 for a totally incorrect or missing answer; 1 for correct procedures or evidence of thinking but with some sort of error (either a calculational error or missing labels, so that it was not clear what the numbers referred to); and 2 for a completely accurate answer. Using the pre- and post-test scores, we calculated the average pre- to post-test change for each class. Two researchers scored all assessments, and instances of uncertainty were discussed until agreement was reached. We used these results to determine focal classes for further analysis.
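As an illustration of the aggregation step only (the student identifiers and scores below are hypothetical, not actual study data), the average pre- to post-test change for a class can be computed roughly as follows:

```python
# Minimal sketch of the pre/post aggregation described above.
# Student identifiers and scores are hypothetical, for illustration only.

def average_change(pre_scores, post_scores):
    """Average pre- to post-test change for one class.

    Each argument maps a student id to that student's total rubric score
    (the sum of 0-2 item scores). Only students with both a pre and a
    post score are included in the average.
    """
    changes = [post_scores[s] - pre_scores[s]
               for s in pre_scores if s in post_scores]
    return sum(changes) / len(changes)

# Hypothetical example: three students in one class period.
pre = {"s01": 4, "s02": 6, "s03": 3}
post = {"s01": 7, "s02": 8, "s03": 6}
print(average_change(pre, post))  # 2.666..., the class's average gain
```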

3.3.2 Videos

To answer our questions about how the teacher implemented Boone’s Meadow in her classroom, we transcribed the talk from the teacher videos from the focal classes in years 1 and 2. With our research team, we watched the teacher videos several times along with the transcripts to identify major themes around how the teacher supported students’ mathematical problem solving during gameplay. This helped us begin to identify codes that we could explore more deeply with each research question to determine what Ms. Lynn did differently during gameplay in years 1 and 2.

For our first research question, we asked how the teacher supported mathematical reasoning during gameplay. To answer this question, we analyzed the talk in the interactions between the teacher and students during gameplay. The teacher’s talk was coded by utterance, defined as a turn of talk (a switch in speakers) or a change in the person the teacher was addressing (if Ms. Lynn said something to student A and then said something else to student B, that counted as two separate utterances). Drawing on Gresalfi and Barab’s (2011) work, we distinguished among four types of mathematical engagement: (1) procedural—following procedures correctly; (2) conceptual—understanding the concepts underlying procedures or ideas; (3) consequential—examining how the procedures used relate to the outcomes; and (4) critical—questioning why one procedure should be used over another. Four researchers coded all the transcripts, and instances of uncertainty were discussed until agreement was reached.
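To show how utterance-level codes were turned into the counts reported in the findings, a minimal tallying sketch is given below (the coded records are hypothetical placeholders; the actual coding was done by hand on the transcripts):

```python
from collections import Counter

# Each record is (year, context, engagement_type) for one coded utterance.
# These records are hypothetical, for illustration only.
coded_utterances = [
    (1, "discussion", "procedural"),
    (1, "gameplay", "procedural"),
    (2, "gameplay", "procedural"),
    (2, "gameplay", "conceptual"),
]

# Tally utterances by year, context, and engagement type; counts like these
# underlie the year-by-context comparisons reported in the findings.
tallies = Counter(coded_utterances)
print(tallies[(2, "gameplay", "procedural")])  # 1
```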

Our second research question asked about the agency to solve problems in the game. To answer this question, we coded the gameplay transcripts by utterance again for teacher agency or student agency. We thought about agency by asking who has the ability to decide how to approach a mathematics problem, which procedures to use, and what to do. We counted an utterance as teacher agency if the teacher gave an answer or specified a procedure to follow, meaning the teacher was the person who solved the problem. We coded an utterance as student agency if the teacher asked an open question that gave the student an opportunity to decide how to approach the problem and which procedures to use. Four researchers coded all the transcripts again, and uncertainties were discussed until we all reached agreement.

For our third research question, we asked about how the teacher interacted with students around the narrative of the game, since the narrative provided the main source of feedback for students’ problem solving in the game. We looked at teacher talk during gameplay and whole class discussions and coded the teacher’s utterances for narrative immersion, or when the teacher explicitly made connections to the game world in her interactions with students. We also looked for instances when the teacher interacted with students around saving or killing the eagle, since those are the major narrative outcomes of the problems in Boone’s Meadow.

4 Findings

We first analyzed the pre- and post-test changes for Ms. Lynn’s classes in years 1 and 2. Both years showed significant pre- to post-test change, with a larger gain in the first year of implementation (paired t test, p < 0.004 in year 1 and p < 0.04 in year 2). Although Ms. Lynn devoted four days to the Boone’s Meadow unit in both years, more instructional time was devoted to the activities in year 2, because less time was spent transitioning between classes or talking about school issues unrelated to the content of the game. Thus, in year 2 there was both more time for class discussions and significantly more gameplay time than in year 1 (see Fig. 3).
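For readers who want to reproduce this kind of comparison, a paired t test on matched pre- and post-test totals can be run as sketched below (the scores are hypothetical; the study’s data are not reproduced here):

```python
from scipy import stats

# Hypothetical matched pre- and post-test totals for one class period;
# each position refers to the same student on both assessments.
pre = [3, 5, 4, 6, 2, 5, 4, 3]
post = [6, 7, 5, 8, 5, 7, 6, 5]

# Paired (dependent-samples) t test, as used for the pre- to post-test
# comparisons reported above.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```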

Fig. 3 Graph of discussion versus gameplay time over all days of gameplay in years 1 and 2

4.1 Question 1: Supporting Mathematical Thinking

Ms. Lynn’s support for students’ mathematical engagement shifted between years 1 and 2. In year 1, most of the mathematical talk occurred during whole class discussions, while in year 2, the mathematical talk largely happened while students were actually playing the game. In both years, Ms. Lynn provided few opportunities for students to engage conceptually, consequentially, or critically with the mathematics of the game, with fewer than 30 utterances in each of those categories. However, Ms. Lynn frequently engaged procedurally with the mathematics around the game with her students (293 of Ms. Lynn’s utterances in year 1 and 306 utterances in year 2 presented opportunities for students’ procedural mathematical engagement). Figure 4 shows the number of utterances coded as procedural engagement in years 1 and 2, separated according to whether the utterance occurred during discussion time or gameplay time. While the overall counts of utterances are similar, more of the mathematical engagement occurred during gameplay time (and less during discussion time) in year 2.

Fig. 4 Graph of number of utterances coded as procedural engagement during discussions versus gameplay in years 1 and 2

4.2 Question 2: Problem Solving Agency

Ms. Lynn had about the same number of mathematical problem solving utterances coded for agency in years 1 and 2 (Fig. 5). However, she gave more problem solving agency to the students in year 2, meaning she let the students initiate the procedures they used to solve problems. When agency was distributed to the teacher, Ms. Lynn scaffolded students’ mathematical thinking. Specifically, when students asked for help, Ms. Lynn responded by asking open questions that allowed students to think about their own solutions. However, if confusion continued, Ms. Lynn scaffolded students’ problem solving by slowly taking back some of the agency and narrowing the question. The following exchange from year 1 exemplifies the distribution of teacher and student agency in Ms. Lynn’s interactions with students during gameplay:

Fig. 5 Graph of number of utterances coded as teacher or student agency during gameplay in years 1 and 2

1. Student 1: Is this how fast they go? Right here?
2. Lynn: Mm hmm. Oh, and it says it right here. So first, you gotta figure out how much gas you’re gonna use. Then, you’re gonna figure out how much time it will take.
3. Student 2: Any number I want. I just put it in.
4. Lynn: Okay. So, 65 miles.
5. Student 2: Yes.
6. Lynn: And you get 8 miles per gallon. So, do the calculations like we did on the math review. [crouching behind S2, points to screen] Okay. So, you’re going 65 miles or you did, 65 miles is how, how far you’re going. And, uh, I’m sorry, you’re going 65 miles, not 60. Right? Yeah. You got it. … Okay. Do you see what you’re doing?
7. Student 1: This is confusing.
8. Lynn: Okay. So, you’re going 60 miles. And you get 7 miles per gallon. So, just like we did on our warm-up, you need to figure out how many gallons you need.
9. Student 1: You gotta divide don’t you?
10. Lynn: That’ll work, yeah. [seems like she’s looking at what S1 does on a calculator then reads it aloud] Seven remainder two. Don’t forget your two. Don’t forget your remainder two.
11. Student 1: Okay.
12. Lynn: Two. Two-eighths. What does two-eighths simplify to? Seven and two-eighths? Two and eight have a common factor of [walks to another student].

In the exchange above, Ms. Lynn helped two students think about fuel usage and time traveled for one of the ultralights along the route the students chose. This is an example of Ms. Lynn giving students agency, then taking back some of the agency to scaffold their problem solving if students continued to struggle. In line 2, Ms. Lynn oriented the students towards calculating fuel used and time, but she did not tell them how to find the answers. She let the students think about how to solve the problem, giving agency to the students. However, the student’s response of “any number I want” seemed to indicate that they needed more help thinking about the problem, so Ms. Lynn pointed out what numbers they should pay attention to and reminded them of a similar problem in a warm-up activity that morning. Ms. Lynn’s responses in lines 6 and 8 specified some information students could think about to help solve the problem, but she still gave them the agency to come up with their own procedures and answers. In fact, in line 9, Student 1 suggested using the division procedure. At that point, Ms. Lynn took back some of the agency to help her students calculate precise answers in lines 10–12. During this whole exchange, Ms. Lynn scaffolded students’ problem solving by first giving students the agency to solve the problems, then specifying some elements to pay attention to when students seemed confused, and finally helping students calculate precise answers using the procedure the students suggested (division).

4.3 Question 3: Supporting Narrative Engagement

As with the mathematical talk examined in question 1, the amount of teacher talk supporting narrative engagement (explicitly making connections to the game world) increased during gameplay time in year 2 (see Fig. 6). These increases in math talk and game talk also reflect the overall increase in gameplay time in year 2. The number of episodes in which Ms. Lynn interacted with students around saving or killing the eagle, the major narrative outcomes of the game, also increased in year 2. In year 1, Ms. Lynn had 11 interactions with students about their outcomes during gameplay; in year 2, she had 29 such interactions.

Fig. 6 Graph of number of utterances coded as narrative immersion during discussions versus gameplay in years 1 and 2

5 Discussion and Conclusions

In our discussion, we relate the findings to our interviews with Ms. Lynn about her experience implementing the game in her classroom. Our original question was about how experience implementing the digital game more than once affected teacher practice. We operationalized teacher practice by focusing on the nature of the teacher’s talk about mathematics, the agency to solve problems, and the nature of the teacher’s talk about the narrative of the game. We examined changes in Ms. Lynn’s math and game talk during discussion and gameplay time in her first 2 years of implementing Boone’s Meadow.

In our informal interview with Ms. Lynn after her second year of using the game in her classroom, she reported on her thoughts about the implementations. Ms. Lynn felt like the second year with the game went much better than the first year, specifically because she felt that students were much more engaged in the game and the mathematics in year 2. “Most of the days I am on the whole time. I am helping a lot, I am talking a lot, and I feel like they are not doing enough of the thinking. And I felt like that was switched, they were thinking the entire time. And that’s one thing that I liked this year versus last year…they would come in and they were working and trying to understand and very rarely did they need an adult.”

However, there were actually quite a few similarities between years 1 and 2 with the game. The ratio of game time to discussion time was similar for both years, but with more of each in year 2. The amount of talk the teacher devoted to mathematical engagement was almost the same across both years, with lots of procedural engagement and very little conceptual, consequential, and critical engagement in the mathematics. We found that the amount of behavioral management and talk related to technical issues also remained largely unchanged.

The biggest differences in teacher talk between years 1 and 2 can be seen in the context of the talk, that is, whether it occurred during class discussions or gameplay time. While the total amount of procedural mathematical engagement remained the same, much more of that discourse took place while students were actually playing the game in year 2. Ms. Lynn’s reflection about the game experience was consistent with this finding: “I do think that them having to think through the mathematics in order to save the eagle, that made them really want to get it right…. but it’s almost like [the math] is instinctively there but they didn’t even process that’s what was happening.” This offers some insight into that change; while Ms. Lynn knew the context of the game better, having gone through it the prior year, and valued the ways that the game framed students’ mathematical engagement, she also worried that they were not thinking explicitly about their mathematical work. Ms. Lynn’s experience with the game also allowed her to give more mathematical agency to her students during year 2. She still scaffolded their problem solving when students struggled, but she allowed students to explore more of their own solutions first.

Ms. Lynn also engaged her students in much more narrative immersion during gameplay in year 2. That is, she talked more about the details of the game that framed students’ mathematical engagement. This might be due to her increased familiarity with the game, or to her increased valuing of the narrative, which she felt created an important context for her students’ thinking. Overall, Ms. Lynn allowed much more of the mathematical and immersive game talk to occur around students’ actual gameplay experiences. We believe this shift in the context of discourse reflects Ms. Lynn’s increased experience with integrating the technology into her classroom. Ms. Lynn was also able to have more interactions with her students around the major narrative outcomes of the game in year 2. The teacher was clearly more comfortable and familiar with the game during the second year, which allowed her to use students’ gameplay time more productively, through mathematical engagement and narrative connections.

Despite the increases in Ms. Lynn’s interactions with students during her second year of using the game, the pre- to post-test gain was greater in year 1 than in year 2. While we do not know what caused the differences in learning gains, we conjecture that this difference is due in part to the changes we made to the game in year 2. Most of the mathematical problem solving in Boone’s Meadow, including planning the route, picking a plane, and calculating fuel used, time traveled, and payload, occurs in the Route Planning Tool (see Fig. 2). In year 1, the Route Planning Tool included a button labeled Formula Help, which displayed formulas for calculating fuel and time, such as “Time = Distance/Speed” and “Fuel Used = Distance/Fuel Efficiency.” In year 2, the Formula Help option was removed and a new Ratio Tool was added, because we had found that students were focusing on memorizing the formulas as procedures rather than developing a conceptual understanding of ratio. The introduction of the Ratio Tool allowed students to find answers to problems in the game more easily, but it also encouraged them to look for patterns rather than to calculate ratios (Gresalfi and Barnes 2016). That is, the procedural skills students developed in year 1 transferred more readily to the questions on the tests than the pattern recognition skills students focused on in year 2, which might be why we saw a higher pre- to post-test gain in year 1. These results point to the need for improvements to the assessments. Specifically, for future iterations of the design, the assessments should include questions that probe for conceptual understanding of ratio along with more open-ended problem solving items that allow students to solve ratio problems without the need to memorize procedures.
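As a concrete illustration of the calculations the year 1 Formula Help supported (the numbers here are illustrative, not values from the game), applying those two formulas to a 60-mile route flown at 30 miles per hour with a fuel efficiency of 6 miles per gallon gives:

$$\text{Time} = \frac{\text{Distance}}{\text{Speed}} = \frac{60\ \text{miles}}{30\ \text{mph}} = 2\ \text{h}, \qquad \text{Fuel Used} = \frac{\text{Distance}}{\text{Fuel Efficiency}} = \frac{60\ \text{miles}}{6\ \text{miles/gallon}} = 10\ \text{gallons}.$$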

The case we examined in this paper was unique in a number of ways. First, the students in Ms. Lynn’s class, in both year 1 and year 2, were far below grade level in their mathematics achievement; most students relied on a multiplication chart to recall basic multiplication facts, and seemed to have developed very little multiplicative reasoning whatsoever. This might help to account for what seems like an inordinate amount of time spent on procedural engagement. However, in that context, it is an interesting shift that the teacher saved her procedural mathematics talk to the times when students were engaged in the game, rather than “pre-teaching” before game play commenced. This case was also unusual in the amount of commitment the teacher had to providing her students with an opportunity to engage in mathematical problem solving in this environment. This commitment could be seen in the teacher’s drive to learn about the game and make sure her students were making sense of the mathematics in the game. She saw the game play as an important and unusual opportunity for her students, despite comments she received from the mathematics coordinator that the game had caused her class to fall behind: “…to me, I feel like the experience is so valuable, that it is worth the time, and we will skip something else that is less valuable.”

Findings from this study highlight one of the many factors that influence teaching with videogames. In this case, experience using the game clearly impacted the teacher’s ability to support mathematical thinking during the game, her ability to make connections to the narrative of the game, and her willingness to give more problem-solving agency to the students. We know from our observations and from Ms. Lynn’s interviews that students were much more engaged in year 2 and spent more time thinking carefully about the problems in the game. Integrating games into classrooms requires teachers to shift their instructional practices, which is not an easy task. These findings suggest that educational game designers should consider how to support teachers, especially as teachers may have different levels of experience with a game, which affects teacher practice and, ultimately, student learning. This may seem obvious, but given the present lack of research focused on teacher practice with games, we stress the need for a better understanding of how teachers actually use games in their classrooms, rather than just the number and types of games teachers use. In explaining what and how students learn from educational games, the teacher, and the teacher’s familiarity with the game, clearly play an important role.

Furthermore, even when teachers receive similar resources and training with a digital game, they may implement the same game very differently in their classrooms (Bell and Gresalfi in press). Therefore, game designers must not only include teacher materials but craft the materials to be adaptable to teachers’ needs and experiences. In Ms. Lynn’s first year with the game, it was helpful for her to have materials explaining how long the game would take, the pacing of the game, when to have discussions, and sample questions to ask. However, given that Ms. Lynn increased her interactions with students during gameplay time in her second year with the game, Ms. Lynn would have benefited from supplemental teacher materials providing suggestions for how to support students’ engagement specifically during gameplay time rather than just during whole class discussions, such as questions to probe for conceptual, consequential, and critical mathematical engagement. We have recently added to the teacher materials, providing specific examples of different ways teachers can support mathematical engagement during one-on-one interactions with students during gameplay, which we will analyze and refine further as we continue to test iterations of the design.