
1 Introduction

The increasing emphasis on practice-based teacher education in the United States has resulted in a focus on assessments that provide information about pre-service teachers’ abilities to actually do the core tasks of teaching. This means combining instructional techniques and skills with complex, specialized knowledge of the content and insight into students’ thinking and development. Such assessments match the new practice-focused learning goals of teacher education. Research suggests that specific feedback about practice helps pre-service teachers use that feedback to improve their practice (Grossman 2010).

Many approaches to assessment have focused on appraising pre-service teachers in real contexts of practice, such as in field placements and during student teaching. They have included microteaching, field-based performance tasks, and systematic field observation of lessons (e.g., Hammerness et al. 2005; Elliott 2003). Observation tools have been developed (e.g., Danielson 2007) and portfolios (e.g., Darling-Hammond and Pecheone 2010) have been used as means to gather information about teachers’ skills.

A more recent addition has been the use of simulations to assess pre-service teachers’ developing skills. Simulations are used in many other professional fields as a means to assess skill with the practices of the profession. For example, in many medical schools, doctors in training engage in simulations of physical examinations, patient counseling, and medical history taking by interacting with “standardized patients,” adults who are trained to act as patients who have specified characteristics. Evaluation of medical students’ interactions with standardized patients makes possible common and sustainable appraisal of candidates’ knowledge and skills (Boulet et al. 2009). Simulations have not been widely used in education in the U.S., but there is growing interest in their use for learning (Dieker et al. 2014; Dotger 2015) and assessment (Shaughnessy and Boerst 2017). Although the use of simulations in education may provoke skepticism, simulations address challenges inherent in field-based assessments, provide a sustainable and fair way to assess pre-service teachers’ knowledge and capabilities, and offer a complement to other forms of assessment in which contextual variables impact implementation and in turn affect the ability to assess pre-service teachers’ skills (Shaughnessy et al. accepted). Here, we focus on simulations as a means of learning about pre-service teachers’ developing capabilities.

This chapter aims to advance work on assessments of teaching practice in teacher education by focusing on the design of simulation assessments to appraise pre-service teachers’ developing capabilities. We focus on the teaching practices of eliciting student thinking and interpreting student thinking. Eliciting student thinking makes the nature of students’ current knowledge available to the teacher. This is essential for engaging students’ preconceptions and building on their existing knowledge in instruction (Bransford et al. 2000). Interpretation is just as crucial because teachers must be able to comprehend students’ ideas and their implications for subsequent teaching. Eliciting and interpreting are foundational skills for formative assessment, which has been shown to substantially impact student learning (Wiliam 2010). There has been much recent attention to developing skill in this area (e.g., Gupta et al., this volume).

Although many teaching practices could be examined, we believe that eliciting and interpreting student thinking are particularly important foci. First, what students think is foundational to teaching: skilled teaching builds on and is responsive to students’ understandings. Second, these practices are foundational to many other teaching practices (e.g., skillfully leading a discussion depends on being able to elicit student thinking). Based on our experience designing and using simulations, we show the potential of such an assessment to evoke, document, and appraise pre-service teachers’ skills and the design decisions entailed in developing such an assessment. Throughout, we use “pre-service teacher” to refer to individuals who are enrolled in a teacher preparation program and “student” to refer to children in elementary school classrooms.

2 A Simulation Assessment of Teachers’ Capabilities

In our assessment, pre-service teachers engage in three stages of work. First, pre-service teachers are provided with student work on a problem and given 10 minutes to prepare for an interaction. The task for the pre-service teacher during the interaction is to determine the process the student is using to solve the problem and the student’s understanding of the core mathematical ideas involved in the process.

Second, pre-service teachers interact with a “student.” The role of the “student” is carried out by a teacher educator whose words and actions are guided by a detailed profile of a particular student’s thinking and rules that govern this student’s interactional norms. To ensure standardization of the role, the “student” is trained to follow the highly specified rules for reasoning and responding, including responses to questions that are commonly asked by pre-service teachers. Pre-service teachers have five minutes to interact with the “student,” eliciting and probing the “student’s” thinking to understand the steps the student took, why the student performed particular steps, and the student’s understanding of the key mathematical ideas involved.

In the third part, pre-service teachers respond verbally to a set of questions that are designed to probe their interpretations of the “student’s” process and understanding and their predictions about the “student’s” performance on a similar problem. The assessment takes approximately 25 minutes and is scored in the moment against criteria for proficient performance that capture mathematically and pedagogically key aspects of the work.

3 Considerations in the Design of Simulation Assessments

Three considerations guide our design of the simulation assessment. First, we must identify and articulate the focus of an assessment; that is, we must elaborate the teaching practice that we are appraising (e.g., eliciting student thinking) through a decomposition of the practice (Grossman et al. 2009). The decomposition reflects what it means to “do” this aspect of teaching. Our approach to decomposition starts with identifying requisite parts of the focal teaching practice. Importantly, the goal is to determine a set of techniques associated with the practice that can be taught to novices and appraised (Boerst et al. 2011). For example, eliciting student thinking is a teaching practice, whereas formulating a question to pose to a student is one of a set of techniques entailed in the more complex practice of eliciting student thinking. The work of decomposing a teaching practice is influenced by the work of Cohen et al. (2003), who depict the work of teaching as interactions with students and content in learning environments. In this view, teachers must integrate simultaneous and flexible attention to content, and to students as they engage with that content, in contexts that influence the nature of the work.

Second, we consider the assessment situation. Because we seek evidence of capabilities with teaching practice, assessment situations must be designed to prompt and document the teaching skills of teachers. The mathematical knowledge for teaching (Ball et al. 2008) entailed in the situation must be carefully considered as a part of this design work. In other words, we must design situations that allow pre-service teachers to demonstrate their capabilities with teaching practice in connection with content that students learn and use. Further, our design must create a residue of interactive teaching practice that might otherwise be fleeting or unavailable.

Third, teacher education assessments require assessors to make inferences: assessors use the things that pre-service teachers say, do, or make to hypothesize about what they know or can do more generally (Mislevy et al. 2004). Once we have documented pre-service teachers’ performances in an assessment situation, we must make inferences about pre-service teachers’ skills based on their performances. To make such inferences, we draw upon our conceptions of teaching practice (in this case, eliciting and interpreting student thinking) and how pre-service teachers develop teaching proficiency, as well as research on the mathematical knowledge needed for teaching (Ball et al. 2008). In sum, our assessment development process considers teaching practice itself and how it can be decomposed for the purposes of assessment, the assessment situation and the opportunities it creates for pre-service teachers to demonstrate their skills, and the practice-focused developmental frame that supports inferences about pre-service teachers’ skills.

4 Constructing the Situation to Reveal Pre-service Teachers’ Eliciting Capabilities

In our simulation assessment, the “student” profile (see Fig. 9.2) is crucial both for providing opportunities for pre-service teachers to demonstrate their capabilities with eliciting student thinking and for enacting the assessment. There are three main considerations in the design of the student profile: (a) the mathematics topic; (b) the characterization of the student’s process and understanding; and (c) the student’s way of being. We next describe each of these considerations.

4.1 The Mathematics Topic

The mathematics content embedded in the student work sample shapes pre-service teachers’ opportunities to demonstrate their capabilities with eliciting and interpreting student thinking. When designing the assessment scenario, we select mathematics content that is high-leverage for elementary mathematics teaching (Shaughnessy et al. 2012) to provide insight into pre-service teachers’ capabilities in the context of mathematics content that we expect them to understand well.

4.2 The Characterization of the Student’s Process and Understanding

Our knowledge of teaching and the knowledge, skills, and dispositions that pre-service teachers bring to teacher education have led us to identify a second set of features to consider in the design of the assessment: the student’s process for solving the problem, the student’s understanding of the process and related mathematical ideas, and the accuracy of the student’s answer.

A fundamental diagnostic problem of teaching is that students use an array of methods that often stretch beyond those that teachers prefer or even understand themselves. As we noted earlier, teaching requires a learner-centered orientation in which teachers actively seek information about student thinking, especially in situations where the approach is unfamiliar. This is particularly demanding for pre-service teachers, who are likely to know less about non-standard approaches.

It is crucial that teachers are able to determine the processes that students use to solve mathematics problems. In the strand of number and operations, these processes include standard algorithms, alternative algorithms, and invented approaches. In our experience, pre-service teachers in the U.S. are often highly proficient with standard algorithms, but their understandings of these processes are tacit and often either not well developed or not well remembered after more than a decade of procedure-focused use. Further, pre-service teachers are often unaware of alternative approaches. As a result, they often have less of a sense of what is important to ask when students are using alternative algorithms or invented strategies and may revert to directing the student to more familiar territory through prompts such as, “Why aren’t you doing … [referencing an element of the standard algorithm]?” Even when students use the standard algorithm, pre-service teachers face other challenges, such as not eliciting pertinent information from students because of assumptions that they make about what students think about parts of the process. Thus, standard algorithms, alternative algorithms, and invented approaches all provide productive arenas for assessing skill in eliciting and interpreting student thinking.

In terms of our focus on understanding, research indicates that it is crucial to attend to students’ understandings of the processes that they are using (Fuson 2003; Steffe and Cobb 1988). At the start of a teacher education program, pre-service teachers attend more to students’ processes than to their understanding of those processes (Shaughnessy and Boerst 2017). Thus, we have found that it is important to articulate the student’s understandings in the profile and to track pre-service teachers’ skill with eliciting those understandings from the student in the simulation.

With respect to the accuracy of the answer (i.e., the correctness of the final answer), we have found that our pre-service teachers are more likely to ask questions about answers that are wrong than answers that are right. Further, pre-service teachers may be likely to discount processes and understandings when faced with an incorrect answer. This may lead them to generate interpretations that fail to capture what students do know and are able to do. Of course, these categories are interrelated. For instance, pre-service teachers may be less likely to ask about the understanding behind correct answers, perhaps presuming that understanding must be there to produce the correct answer. In sum, for each assessment, we articulate the student’s process, understanding, and accuracy as a critical set of assessment features.

4.3 The Student’s Way of Being

Students differ in terms of how they think about or approach mathematics problems. But just as importantly for the work of eliciting student thinking, they differ in terms of their dispositions, interactional styles, and use of language. We have termed these unique personal traits the “student’s way of being.” In a recent study conducted in classrooms, we found that about one-third of students (N = 44) gave a full explanation of their process for solving a problem after being asked just one question about their written work by a pre-service teacher (Shaughnessy et al. accepted). Further, almost all of these students articulated their understanding of the process and core mathematical ideas without being prompted. In classrooms, students do of course vary in how much they share about their thinking. But for an assessment, having “students” disclose relatively little about their process and understanding unless directly asked makes it possible to learn more about pre-service teachers’ eliciting skills. When students are reserved, pre-service teachers have to ask more questions, which makes their skill with the practice of eliciting student thinking more visible. We explicitly design for the student’s way of being because of its impact on teacher-student interactions and the nature of the eliciting and interpreting that can happen.

4.4 The “Student” Profile

We summarize information about the mathematics topic, the characterization of the student’s process and understanding, and the student’s way of being in a “student” profile. In the example assessment, we selected multi-digit addition, specifically the problem 29 + 36 + 18 (see Fig. 9.1). In this example assessment, the “student” uses an algorithm, sometimes known as the column addition method, to solve the problem. The “student” adds the digits in each column (2 tens + 3 tens + 1 ten = 6 tens and 9 ones + 6 ones + 8 ones = 23 ones). The “student” interprets the 623 in the written work as 6 “tens” and 23 “ones.” The “student” knows that 23 ones can also be thought of as 2 tens and 3 ones. Then, the “student” combines the 6 tens and the 2 tens (from the 23 ones). This yields the final answer of 83. The “student” has conceptual understanding of the procedure, and the final answer is correct. The profile also includes scripted responses to anticipated questions; these responses are based on what has been articulated with respect to the mathematics topic, the characterization of the student’s process and understanding, and the student’s way of being. Figure 9.2 contains an abbreviated version of a “student” profile.
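To make the arithmetic in the profile concrete, the “student’s” column addition method can be written out as follows (a worked restatement of the steps described above, added here for clarity):

$$
\begin{aligned}
29 + 36 + 18: &\quad 2\ \text{tens} + 3\ \text{tens} + 1\ \text{ten} = 6\ \text{tens}, \qquad 9 + 6 + 8 = 23\ \text{ones},\\
6\ \text{tens} + 23\ \text{ones} &= 6\ \text{tens} + (2\ \text{tens} + 3\ \text{ones}) = 8\ \text{tens} + 3\ \text{ones} = 83.
\end{aligned}
$$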

Fig. 9.1

A student’s work on a multi-digit addition problem

Fig. 9.2

An excerpt from the “Student” profile

5 Considering a Pre-service Teacher’s Eliciting Performance

We next present a vignette based on a pre-service teacher’s performance. This pre-service teacher begins the interaction by asking the “student” to talk about his process. In the vignette, we use T to refer to the pre-service teacher and S to refer to the “student.”

T: I was wondering when looking at the problem where you started? What numbers did you start with?

S: So, I added the tens. So I added the two, and the three, and the one and I got six.

T: Okay. And how did you know that was six?

S: ‘Cause I know my facts. I mean, so two and three makes five, and one more makes six.

T: Okay. And that’s why you wrote down the six right there?

S: Yeah, that’s right.

The pre-service teacher elicits that the “student” first added the digits in the tens column, the sequence in which the “student” added the numbers within the tens column, and the sum that resulted (6). She also elicits that the student believes that he is “adding the tens.” The pre-service teacher continues to ask questions about the process.

T: And then what was your next step?

S: Then I added up the ones.

T: Okay. And how did you add up the ones?

S: So, nine, and six, and then eight to get twenty-three.

T: Okay. What if you started by adding eight, six, and nine? Would you still get twenty-three?

S: Yes. It doesn’t matter which way you do it.

T: Alright. And then you have the twenty-three here [points to it]. So, what does that twenty-three mean?

S: That’s twenty-three ones.

At this point, the pre-service teacher has elicited that the second step in the process was to add the digits in the ones column and the order in which they were added. She has also pressed to see whether the “student” believes that the sum will be the same if the digits are added in a different order. She continues to ask questions, focused on the combining step.

T: Twenty-three ones? Alright. And then how did you get that eighty-three?

S: Well, when you add stuff, you can’t have more than one digit in a place– in an answer. Like this just looks wrong. So, you have to regroup it so that the answer will look right.

T: So, how did you regroup it?

S: So, this twenty-three, this two right here is two tens. And this, like I told you before, was also tens [pointing to the 6]. So you put the tens together and that’s how you get the eight tens.

T: You got the eight tens. So, did you add six plus twenty-three ones?

S: I added the six plus the twenty ones to give me eight tens. Then I still had the three ones.

The pre-service teacher continues to press on why the “student” knew to combine the six and the two.

T: Three ones. So why didn’t you add the twenty-three? How did you know that the two meant tens when you just told me before you had twenty-three ones?

S: Yeah, so in twenty-three ones, this part of it is ten and this part of it is the ones that are left after you made all your tens.

T: And how do you know that’s two tens?

S: Because when you’re adding it and you get past nine, then the next number is gonna be in the teens, so that you know that that digit is– actually stands for ten.

T: Okay. And then eighty-three is your final answer. What does that eight mean?

S: Eighty.

T: And then that three, what is that three referring to? What does that three mean? What is that value?

S: Three ones.

By the end of the interaction, the “student” has revealed why he combined the six and the two and his understanding of the value of the eight and the three in eighty-three.

5.1 Scoring of the Eliciting Performance

We conceive of the work of eliciting student thinking as involving: (a) formulating questions designed to elicit and probe student thinking; (b) posing questions; (c) listening to and interpreting what students are saying; and (d) developing additional questions to pose (TeachingWorks 2016). This work is iterative: teachers listen to and interpret what students are saying, generate and pose questions to learn more about the students’ thinking, listen to and interpret the responses, and so forth. Teachers make sense of what students know and can do based on evidence from interactions and other artifacts of student work. Importantly, students are at the center of this work. It is their thinking that is sought and intended to be understood, and the work is situated in mathematical contexts that focus dialog, shape interpretation, and influence follow-up questions.

Because the simulations make use of highly specified protocols for the student’s processes, understandings, and ways of being, we are able to use observational checklists as scoring tools as a simulation unfolds. Our observational checklists for the eliciting portions of the assessment are based upon an articulation of “high-quality” eliciting of student thinking. For example, high-quality eliciting of student thinking entails launching the interaction in a way that focuses on the mathematics of the student’s approach (i.e., formulating and posing an initial question designed to elicit student thinking); developing additional questions that elicit the student’s process for solving the problem and probe the student’s understanding of the process and of key mathematical ideas; and listening to the student, which can be demonstrated through posing additional questions tied to things that the student says and does. The checklist includes specific things that pre-service teachers might do (e.g., elicits where the 8 comes from) and specific responses that the “student” is prepared and trained to provide when prompted by the pre-service teacher (e.g., “I combined the 6 and the 2”).

As summarized in Fig. 9.3, in the simulation we are able to see evidence of this pre-service teacher’s skill in formulating an initial question (“What numbers did you start with?”) that is general, open-ended, and focused on an important piece of the mathematics at hand. We also have evidence of the pre-service teacher’s skill in posing the question to a student, where skilled delivery is sensitive to how students might hear and respond to the question. While we are not able to directly see the pre-service teacher’s skill in interpreting the student’s thinking in the moment, we are able to see that follow-up questions are responsive to what the student has said, which is an indicator that the pre-service teacher is listening to the student. Further, the questions focus strategically on particular ideas that the student has shared or not shared, such as parts of the process about which the student has said little and mathematical ideas related to the student’s process (e.g., whether it is possible to add the numbers in a column in a different order).

Fig. 9.3

Abbreviated scoring checklist for eliciting: example performance

We use specific pre-service teacher performances and trends across performances to improve our articulation of high-quality eliciting within a particular scenario and, by implication, the components of the scoring tool. For example, pre-service teachers might repeatedly probe a student’s understanding of a particular mathematical idea that we had not initially identified on the observational checklist, but that seems quite reasonable to include. We also use their performances to improve the student role protocol so that the student will engage in the situation in ways that allow pre-service teachers to demonstrate their eliciting skills. Our goal is to design the situation such that we are able to appraise the eliciting and interpreting skills of our pre-service teachers. If pre-service teachers are not probing the student’s understanding of particular parts of the process or are incorrectly interpreting the student’s understanding, we do not assume that they are unskilled at eliciting and interpreting student thinking. Instead, we consider whether we need to make changes in the way that the “student” responds to specific questions. Even subtle shifts in the “student’s” language can make it more likely that a pre-service teacher will ask important questions about understandings. We use the performances to identify changes that we believe will increase the likelihood that pre-service teachers are able to demonstrate their eliciting skills.

6 Constructing the Situation to Reveal Pre-service Teachers’ Interpreting Capabilities

The follow-up interview is designed to assess pre-service teachers’ capabilities with interpreting student thinking and their mathematical knowledge for teaching. Interpretation is the work that teachers do to give meaning to what they see and hear. Two crucial areas for interpretation are: (1) the student’s process, and (2) the student’s understanding of that process and the underlying mathematical ideas. The follow-up interview is designed to focus on both of these aspects, including the use of evidence to support the interpretations. Pre-service teachers are asked to talk about what they learned from the simulation about the student’s process for solving the problem. Later, in the context of a related problem for which pre-service teachers anticipate the student’s process, we ask pre-service teachers to anticipate the student’s understanding. We ask about specific mathematical ideas and/or steps in the process because in earlier work we found that asking a targeted question can reveal more about the capabilities of pre-service teachers than a general question.

At the same time, the follow-up interview is constructed to reveal evidence of pre-service teachers’ mathematical knowledge for teaching. We target four aspects. First, we elicit whether pre-service teachers can solve the problem themselves and judge the accuracy of the student’s solution. Second, we ask pre-service teachers to construct a problem that they could use to confirm their understanding of the student’s process. We learn whether pre-service teachers are able to identify the features of the task, including the traits of the numerical example, that must remain consistent to confirm the student’s process or understanding. Third, pre-service teachers are asked to apply the student’s process to a similar problem that we provide, as illustrated below. Fourth, pre-service teachers are asked to generalize whether the process will generate a correct answer for a particular category of problems, and why.
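As an illustration of the third aspect, consider applying the “student’s” column addition method from the example assessment to a similar problem such as 47 + 25 + 16 (a hypothetical problem chosen here for illustration, not one drawn from the actual assessment):

$$
\begin{aligned}
47 + 25 + 16: &\quad 4\ \text{tens} + 2\ \text{tens} + 1\ \text{ten} = 7\ \text{tens}, \qquad 7 + 5 + 6 = 18\ \text{ones},\\
7\ \text{tens} + 18\ \text{ones} &= 7\ \text{tens} + (1\ \text{ten} + 8\ \text{ones}) = 8\ \text{tens} + 8\ \text{ones} = 88.
\end{aligned}
$$

The same reasoning bears on the fourth aspect: because adding within columns and then regrouping simply rearranges the place-value parts of the addends, the process yields a correct answer for any whole-number addition of this kind.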

6.1 Considering a Pre-service Teacher’s Interpreting Performance

The questions and the pre-service teacher’s responses to them are summarized in Table 9.1.

Table 9.1 A pre-service teacher’s responses to follow-up questions

6.2 Scoring of the Interpreting Performance

We use an observational checklist as the interview unfolds. The observational checklist, completed for the example assessment, is shown in Fig. 9.4. It shows that this pre-service teacher is able to describe the student’s process and to anticipate the student’s understanding of two key mathematical ideas, using evidence from the interaction with the “student.” Further, this pre-service teacher demonstrates developed mathematical knowledge for teaching by generating a follow-up problem that can be used to confirm the student’s process and articulating a rationale for that problem, applying the student’s process to a similar problem, and thinking critically about the mathematics of the student’s process and the mathematical cases to which it will generalize.

Fig. 9.4

Abbreviated scoring checklist for the follow-up interview: example performance

7 Simulation Assessments: The Potential and Next Steps

As illustrated in this chapter, simulation assessments hold promise for assessing pre-service teachers’ developing capabilities with important interactional practices of teaching, including eliciting and interpreting student thinking. But for the use of such assessments to become more widespread, there needs to be additional conversation in the field about the design and use of simulation assessments. This chapter is designed to support such conversations.

In our current work, we are continuing to explore the design of simulation assessments. In our early work, we designed assessment simulations relying on the wisdom of practice, that is, insights generated through our own experiences working with students, analyzing data collected from students, and knowing a variety of ways in which students approach different mathematical situations. These insights have allowed us to articulate how students at a given grade level could reasonably be expected to talk about the problem and the ways in which they could reasonably be able to convey their understanding. We used these insights to construct the student profile after specifying the mathematical topic/practice, the characterization of the student’s process and understanding, and the student’s way of being. Currently, we are exploring ways to draw on two additional sources of information for our design work: (1) interviews with students around the selected problem; and (2) learning progressions research, which details how students at a particular point in a learning progression understand particular content. These are promising possibilities for strengthening the development of the student profile.