1 Introduction

Recent science education reform documents (National Research Council 2007, 2012) have called for a focus on scientific practices as a vehicle for developing students’ scientific literacy. This literacy includes an understanding of the “content” of science, but also emphasizes students’ understanding of the nature of science as a human endeavor, including what scientists actually do and how knowledge is constructed in science. An interconnected understanding of the content and nature of science can support a person’s ability to reason critically and make informed decisions when science intersects with their everyday lives. Such scientifically literate citizens would be able to read articles about science in lay publications critically, engage in public discussions about scientific issues, recognize faulty scientific arguments, and make scientifically informed choices when voting and spending money. Scientifically literate citizens would also know how to find, read and evaluate information about a scientific subject when their existing content knowledge is inadequate for engaging in the previously described practices.

In order to develop such scientific literacy, A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (NRC 2012) recommended that K-12 science education experiences incorporate the scientific practices that are common to scientists across many different fields. The way these practices are defined in this document, and in the recently published Next Generation Science Standards (NGSS, NGSS Lead States 2013), is likely to have a great influence on the form and language of future science curricula across the United States, and perhaps internationally. In order to become fluent in these practices and effectively interpret curricula in which scientific practices are emphasized, teachers will need support. As eloquently argued in the Framework:

teachers are the linchpin in any effort to change K-12 science education. And it stands to reason that in order to support implementation of the new standards and the curricula designed to achieve them, the initial preparation and professional development of teachers of science will need to change (NRC 2012; p. 255).

Preservice elementary teachers are perhaps most in need of this support, as they commonly have weak knowledge of science content and practices (Davis et al. 2006; Smith and Anderson 1999; Zembal-Saul 2009). People learn how to engage in practices of any sort by participating with others in a community that engages in those practices (Lave and Wenger 1991). Considering that preservice elementary teachers have little to no experience participating in a scientific community, it is not surprising that their knowledge of scientific practices is often limited. One place where they can gain experience with scientific practices is in their science teaching methods courses. In these courses, preservice teachers can read about, discuss, and engage in scientific practices within their community of fellow preservice elementary teachers. They can also apply their understanding of these practices by writing reflections about what they have read and about their experiences engaging in the practices in class; designing lessons that engage their students in scientific practices; teaching those lessons; and analyzing video of their teaching to identify instances where their students were engaged in the practices.

Learning about scientific practices in preservice elementary science methods courses is certainly not a new idea born of the Framework or NGSS, nor are studies of preservice teachers’ understandings and applications of scientific practices a novel undertaking. But the publication of these new, influential documents calls for new research investigating preservice elementary teachers’ understandings about the practices as they attempt to use these documents as frameworks for teaching science. Using a variety of modalities (e.g. responses to text, video, in- and out-of-class experiences, written lessons, enacted lessons), teacher educators must assess their preservice teachers’ ideas about scientific practices on an ongoing basis in order to effectively select materials, design experiences that work with and challenge those ideas, and develop their teaching practices. Teacher educators who are informed about their preservice teachers’ ideas are thus in a better position to help them construct more sophisticated understandings about scientific practices and enact those understandings effectively in the classroom. In turn, these future teachers may be better able to support their students’ scientific literacy. To this end, this study seeks to provide initial insights into the ideas that preservice elementary teachers hold about scientific practices.

2 Conceptual Framework

2.1 Scientific Practices as Defined by the National Research Council (2012)

Although various scientific practices have been defined and distinguished from one another in multiple ways, this study uses the demarcation and definition of scientific practices described in A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (NRC 2012) as its conceptual and analytical framework. Drawing on 60 years of work by historians, philosophers, psychologists and sociologists who have studied what science is and what scientists do, the NRC (2012) derived these practices from those in which scientists actually engage as part of their daily work. These practices include:

  1. Asking questions

  2. Developing and using models

  3. Planning and carrying out investigations

  4. Analyzing and interpreting data

  5. Using mathematics and computational thinking

  6. Constructing explanations

  7. Engaging in argument from evidence

  8. Obtaining, evaluating and communicating information (NRC 2012; p. 42)

The NRC (2012) argues that these practices should not be taught in isolation from one another, nor separate from science content. Instead, they should be used as vehicles for “developing a deeper understanding of the concepts and purposes of science” (NRC 2012; p. 43). They further argue that teaching science as a set of practices:

minimizes the tendency to reduce scientific practice to a single set of procedures, such as identifying and controlling variables, classifying entities, and identifying sources of error. This tendency overemphasizes experimental investigation at the expense of other practices, such as modeling, critique, and communication (NRC 2012; p. 43).

In order to support the development of students’ scientific literacy, the Framework (NRC 2012) argues that teachers should not simply engage students in these practices, but should ask students to reflect on them as well. For example, as they engage in the practices, students should not only identify the specific practices in which they have engaged, but also consider how their experiences in class are similar to and different from: (1) what scientists actually do and (2) how knowledge is constructed in science. By engaging in the practices in authentic ways and making explicit connections between their classroom experiences and the nature of science and scientific knowledge, over time students can develop an understanding of science that prepares them for scientifically literate citizenship as adults.

2.2 Literature Around Preservice Teachers’ Ideas About Scientific Practices

As discussed above, although the Framework (NRC 2012) is the first widely disseminated document to name and define eight scientific practices in which all K-12 students should engage as part of their regular science instruction in schools, teacher educators have emphasized scientific practices with their preservice teachers for many years. Along the way, researchers have also investigated preservice teachers’ understandings of these practices. In their review of the literature around challenges that new teachers face, Davis, Petish and Smithey (2006) found that for the most part, preservice elementary teachers “lack adequate understandings of science content” (p. 615) and have “unsophisticated understandings of inquiry and related skills” (p. 616). However, they also found that preservice teachers reported both positive and negative experiences with science, and pointed out that “positive experiences in traditional science courses may lead them to naïve understandings of the nature of science” (p. 617).

In a science content class designed for preservice elementary teachers, Smith and Anderson (1999) engaged their students in several scientific practices, including developing and using models, analyzing and interpreting data, constructing explanations, engaging in argumentation, and evaluating information. Surprisingly, their students who reported previous positive experience with science had difficulty using models to support their reasoning, interpreting data, giving priority to evidence in their arguments and explanations, and evaluating the validity and reliability of evidence. On the other hand, the students who reported previous negative experience with science insisted on making personal sense of the data they generated, persisting when their models did not coordinate with the evidence or their understanding of a concept, rather than referring to information sources such as texts.

Other studies of preservice and inservice elementary and secondary teachers highlight some of the struggles associated with learning about particular scientific practices (rather than scientific inquiry in general). In regard to the practice of analyzing and interpreting data, Bowen and Roth (2005) found that both elementary and secondary preservice teachers had difficulty making sense of data from tables and graphs, despite the fact that many held bachelor of science degrees. Several other studies (see Footnote 1) have found similar difficulties in preservice and inservice teachers’ understandings of the practice of developing and using models. Specifically, while preservice and inservice teachers generally understood that models can be used to represent phenomena when communicating explanations to others, many had difficulty understanding how models are also useful in generating explanations. In regard to the practice of engaging in argument from evidence, Zembal-Saul et al. (2002) found that while their preservice secondary teachers were able to craft arguments in which a claim was based on evidence, their arguments displayed a number of limitations. Specifically, some arguments relied on single (rather than multiple) pieces of evidence, while others failed to account for counterarguments.

These studies highlight the challenges that understanding scientific practices poses for preservice (and even inservice) teachers and for the teacher educators who support them. It is important to note, however, that in every study mentioned above, whenever researchers introduced an intervention to support the teachers’ understanding of scientific practices, the teachers were able to improve their understanding and use of the practices, to varying degrees.

3 Methods

3.1 Research Question

This study seeks to answer the question, “What are preservice elementary teachers’ ideas about scientific practices?” There are several sub-questions relevant to preservice elementary teachers’ ideas about scientific practices, including what they think scientists actually do when they are “engaging” in the practices, how they enact the practices in their lesson plans and teaching, and what value they assign to teaching science through a practices-based approach. Each question is important for informing science teacher educators who work with this population. As a starting point, this study specifically seeks to investigate preservice elementary teachers’ ideas about what the practices are—what scientists (or students) actually do when they engage in the practices, as revealed by their enactments of those practices in their written reflections, lesson plans and teaching.

3.2 Methodology

This research employs a retrospective, qualitative case study design. Yin (2009) defines case study as “an empirical inquiry that investigates a contemporary phenomenon in-depth and within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident” (p. 18). Because the ideas that preservice teachers construct about scientific practices are inseparably embedded within the particular context of their methods class experiences (which can vary significantly from class to class, and are expected to interact with the phenomenon), case study is an appropriate design for this inquiry. This design is called ‘retrospective’ because the data were not originally collected for research purposes. Rather, a reading of the data prior to the study raised the research question, which led to the design of the study.

3.3 Context

The participants in this study (n = 18) were senior undergraduate elementary education majors at a large university in the Northeastern US. The participants’ demographics were typical of preservice elementary teachers: most were Caucasian, aged 21–22, and all were women. They were enrolled in a one-semester elementary science teaching methods course in which the researcher/author was the instructor. The course coincided with a semester-long, twice-weekly preservice field experience in an elementary classroom. As part of the requirements of the course, participants taught a series of three connected science lessons on a single topic (which was chosen by the cooperating teacher) to the students in their field experience classroom.

In the methods course, participants learned about scientific practices throughout the semester by engaging in various experiences. They were first introduced to scientific practices early in the semester by reading about them in the Framework (NRC 2012). In class, a “jigsaw” method was used to help the preservice teachers make meaning of what they read. To begin, the participants were assigned to read about all eight practices in the Framework for homework. In class, students were divided into “expert” groups, each assigned to focus on one particular practice. In these expert groups, the participants verbally negotiated meaning and wrote a single group response to two prompts: “Summarize this practice in language that makes sense to you” and “Considering the grade and topic you’ll be teaching, how might you specifically engage your students in this practice?” These responses were written in an online document that was shared with the entire class and the instructor. The participants were instructed to write their responses so that they could be used as tools for teaching their peers about the practices, and as a resource that all participants could access throughout the semester. While the participants met in these expert groups, the instructor traveled from group to group checking on their progress, reading what they had written, posing questions and providing feedback about their ideas.

Next, the participants were regrouped in “jigsaw” groups in which at least one “expert” for each practice was present. The experts took turns teaching their peers about their assigned practice, sharing with them the responses their group had written. Again, the instructor traveled to each jigsaw group, providing feedback and posing questions for the group to consider. All participants were encouraged to add to or revise the shared document based on these jigsaw group discussions, so that the document represented the class’s shared understandings of the practices, not just the understandings of the participants in each expert group. Following the jigsaw groups, the class reconvened as a whole group to debrief the practices overall and address any questions that arose within the jigsaw groups. Again, all participants were encouraged to add to or revise the shared document based on this whole group discussion.

Throughout the semester, participants also read about the practices of engaging in argument from evidence and constructing explanations in their textbook, What’s Your Evidence? Engaging K-5 Students in Constructing Explanations in Science (Zembal-Saul et al. 2012). This book emphasizes science instruction that includes public sensemaking of data in order to generate explanations for scientific phenomena. It provides a framework for constructing explanations using claims, evidence and reasoning (adapted from Toulmin’s 1958 argument pattern). Whenever the participants were assigned to read the book for homework, they were also assigned to identify key ideas and pose questions about the reading in their course journal. These journal responses served as formative assessment and helped to shape the in-class discussions. Although the book shares the language of “constructing explanations” and “engaging in argument” with the Framework (NRC 2012), it does not specifically address the Framework, which had not yet been published when the book was written.

Throughout the semester the participants engaged as learners (rather than teachers) in seven in-class investigations, each of which included one or more scientific practices (see Table 1). In each of these investigations, a scientific question was posed to the participants (see Table 2). Participants collected data in small groups, then engaged in whole-group, public sensemaking and argumentation in order to co-construct a claim, based on evidence and reasoning, that answered the investigation question. Following each investigation, the participants verbally identified each of the practices that were included in the investigation, as well as the specific instances from the investigation that constituted an example of each practice, and then justified why each instance constituted an example of the identified practice. For example, following the “Bulbs and Batteries” investigation, participants agreed that they had engaged in the practice of analyzing and interpreting data when they were looking for similarities among the bulb/battery configurations that lit the bulb, because “analyzing” includes searching for patterns. In cases when participants initially disagreed about whether they had engaged in a particular practice, they presented and defended arguments for and against their ideas, guided by questions posed by the instructor, until consensus was reached.

Table 1 In-class investigations and corresponding scientific practices emphasized
Table 2 Investigation questions posed to participants

Mid-semester, the participants wrote lesson plans on topics assigned to them by the cooperating teachers of their field experience classrooms. These lesson plans included specific objectives for engaging students in practices of science appropriate to that topic, and corresponding activities within the lesson designed to meet those objectives. For example, in a lesson plan on categorization within the animal kingdom, one participant listed as an objective, “Students will analyze and interpret data as they look for similarities and differences between animal characteristics found on the inside and outside of the animal bodies”. The detailed steps of the lesson plan then included a section called “Analyzing Data” that included specific analysis questions that the participant planned to ask the students to help them make sense of their data.

Near the end of the semester, participants taught and video recorded their lessons in their field experience classrooms, and used StudioCode© software to select and annotate instances from the video in which their students were engaged in scientific practices. In their annotations for each video instance, they were instructed to identify the practice in which the students were engaged and to justify how the video constituted evidence of their students engaging in the identified practice. For example, one participant provided a video instance from a lesson on animal adaptations in which the class was engaged in a whole-group discussion about the function of various animal structures, which she identified as the practice of engaging in argument from evidence. She annotated this instance:

In this clip, students are debating about why a giraffe has a long tongue. One student explains that he agrees with another student that a giraffe has a long tongue because in one of the photos they were given, a giraffe is extending its tongue to drink. This is argumentation because he is agreeing with one student and disagreeing with another at the same time by using evidence.

3.4 Data Sources

Several types of artifacts from the preservice elementary science teaching methods course were analyzed in this study. Each data source is described more thoroughly in the previous section.

  1. A written assignment in which participants summarize and apply their understandings of the practices as described by the Framework (early semester)

  2. Participants’ written lesson plans, in which they were assigned to include objectives and experiences for engaging their students in the practices of science (mid-semester)

  3. Instances (selected by participants) from video of their teaching, where their students were engaged in scientific practices (end of semester)

  4. Participants’ written annotations of these video clips, in which they were assigned to identify the practice, and justify how the video constitutes evidence of their students engaging in that practice (end of semester)

3.5 Methods of Analysis

Each data set (described in the previous section) was analyzed thematically (Braun and Clarke 2006) to identify and interpret participants’ ideas about what it means to engage in each of the eight scientific practices defined by the NRC (2012) Framework. Drawing on the methods of grounded theory analysis, the constant comparative method (Glaser and Strauss 1967) was used to generate initial codes for interpreting participants’ ideas. To generate initial codes, each written data set (written assignment, lesson plan, video annotation) was inspected sentence by sentence. As advised by Charmaz (2006), initial codes stuck closely to the participants’ actual words and reflected action. While researchers can never completely abandon their own tacit conceptual frameworks, codes were generated as inductively as possible from the data, without consciously imposing any preconceived ideas upon their meaning. Examples of initial codes generated from the raw data are shown in Table 3.

Table 3 Example of initial coding (Engaging in Argument from Evidence)

Within each data set, initial codes were sorted and collated by practice. Related codes were then clustered within each practice (across data sets), and themes were generated to summarize the meaning of related codes. An example of a cluster of initial codes and the resulting theme is shown in Table 4. Each theme was compared to the original data to reevaluate its validity. Any themes that were not supported by the original data were discarded or revised. Because the participants were no longer in contact with the researcher, member checking was not an available analysis strategy. Next, related themes within each practice were clustered and categories were generated to summarize the relationship between the themes. An example of related themes and the resulting category is shown in Table 5.

Table 4 Example of generating themes (Asking Questions)
Table 5 Example of generating categories (Planning and Carrying Out Investigations)

4 Findings and Discussion

The findings are summarized practice-by-practice in Table 6. The first column lists the individual practice to which the participants’ ideas refer. The second column lists the categories of the participants’ ideas about that practice. The third column contains the themes of participants’ ideas, generated from the data using the methods described in the previous section. No findings are reported for a particular practice if data were not sufficient to support claims about themes among participants’ ideas. In addition to the participants’ most common ideas, these findings also include less common, but particularly promising or problematic ideas. As a result, some of the themes reported contradict one another. Following Table 6, the findings are further unpacked, including examples of how these ideas were enacted in participants’ lesson plans and teaching videos.

Table 6 Preservice elementary teachers’ ideas about scientific practices

Regarding the practice of asking questions, in the participants’ early semester written assignment about the practices, they attended both to questions that scientists pose about phenomena and to questions scientists pose about each other’s ideas (see Table 6, “Asking Questions”, line 10). But the three participants who incorporated this practice into their lesson plans or identified instances of it in their teaching videos always identified students posing questions suitable for scientific investigation, never students posing questions about one another’s ideas. While this study collected limited data about these preservice teachers’ ideas about this practice, posing questions about others’ ideas may be an aspect of the practice that preservice teachers need more support to understand and enact in their teaching.

Turning next to Developing and Using Models, the data included both promising and problematic ideas about modeling. For example, in the initial written reflection about the practices, participants identified developing questions and explanations and communicating ideas to others as goals of developing and using models (see Table 6, Developing and Using Models). But the same data revealed limited understandings of what ‘counts’ as a model. For example, one participant wrote, “My students may be able to make simple physical scale models, depending on what the topic is. For instance, my topic is seeds so they could plant their own seeds.” Also, after the “States of Matter” investigation, several participants suggested that subjecting the mystery substance to various “state tests” (e.g. the smash test, the pour test) was an example of making a model, but they were not able to explain what phenomenon they had modeled. Many other participants objected to this idea and were able to convince their peers that no model had been constructed during the investigation (the investigation did not include constructing particulate models of the states of matter). Preservice teachers, then, may need more support in understanding exactly what a model is.

Participants’ limited understanding of the purposes of modeling was also revealed during the first in-class investigation (phases of the moon). They were asked to bring a list of materials that they would like to use to help them construct an explanation of this phenomenon. Several participants listed Oreo cookies as the sole material on the list. While these cookies are certainly useful for demonstrating the shapes of the moon at different times of the month (a common activity used by elementary teachers), they are not very useful for generating evidence or explanations for the moon phases. None of the participants listed spheres or light sources. Most of the participants brought no list at all and explained that they did not understand what they were being asked to do. When it was explained again that they were going to have to use materials to figure out what causes the phases of the moon, a few students eventually suggested items such as spheres and paper cutouts representing the Earth and Moon, and flashlights to represent the sun. These participants had never been asked to develop their own model to help them construct an explanation; they had always been provided with one and asked to use it to communicate an explanation that scientists had constructed. This finding mirrors that of Harrison (2001), who found that only 5 of 22 teacher participants expressed the idea of using models as tools for thinking, not just communicating.

Only two participants included Developing and Using Models in their lesson plans or identified instances of modeling in their teaching videos, although this fact is likely related to the topic of the lesson, not the participants’ understanding of the practice. That is, modeling was more appropriate to some topics (such as understanding the cause of the phases of the moon) than to others (such as repelling and attracting magnets in a 2nd grade lesson). Of the two participants who included modeling in their lessons, both taught about the phases of the moon. One participant taught a lesson in which her students used the model to generate their own explanations of the cause of the phases, while the other used the model to communicate the scientifically accepted explanation to her students. Because these two participants’ lessons were so closely related to the investigation they did in class, it is difficult to know how they might enact their understanding of this practice when teaching a different topic.

In regard to Planning and Carrying Out Investigations, participants’ written reflections characterized investigations solely as experiments. None mentioned other types of investigations, such as observational inquiries in astronomy. All participants included the practice of Planning and Carrying Out Investigations in their lessons. In contrast with their written reflections, however, some of the activities that they characterized as “investigations” were not experiments but purely observational inquiries, such as observing the motion of the sun in kindergarten. It is unclear whether these few participants consciously understood that not all investigations are experimental in nature, or whether they simply characterized all scientific activity as an “investigation.”

Participants’ ideas about analyzing and interpreting data were somewhat mixed. Their written reflections included the idea that graphs and tables can be used to interpret data (not just organize it) (see Table 6, Analyzing and Interpreting Data, rows 5–7), and they frequently included “analyzing data” as an objective in their lesson plans, but many of these plans devoted little attention to the mechanics of the practice. In some cases, the videos of their teaching revealed that after their students carried out investigations and organized the resulting data into tables and graphs, the preservice teachers expected their students to immediately generate a claim, without leading them through any sensemaking procedures such as comparing and contrasting. This enactment of the “analysis” objective in their lesson plans calls into question whether these participants actually understood what it means to analyze and interpret data. Then again, this discrepancy may reveal the limitations of their pedagogical knowledge, rather than their ideas about the practice itself. That is, they may understand what it means to analyze and interpret data, but not how to teach students to engage in this practice. It is important to note that some participants’ lesson plans and teaching did sufficiently attend to making sense of data, including scripting specific analysis questions to pose to the students.

In regard to Using Mathematics and Computational Thinking, participants’ written reflections included organizing data with graphs and tables as part of engaging in this practice (Table 6, Using Mathematics and Computational Thinking, row 10). They also noted that graphs and tables are useful not just for organizing and representing data, but also for interpreting it. Only one participant included this practice in her lesson plan. In her lesson on natural selection for 5th graders, the students constructed pie graphs to represent the percentages of a population (butterflies) that demonstrated a particular physical trait (color) over several generations. They used the changes from one graph to another to explain how the trait essentially disappeared from the species over time.

Themes generated from participants’ written reflections, lesson plans and teaching analyses suggest that they likely conflated the interrelated practices of Constructing Explanations and Engaging in Argument from Evidence. For example, their characterizations of the two practices in their reading reflections were practically identical (see Table 6, Constructing Explanations and Engaging in Argument from Evidence). Furthermore, every participant who identified evidence of engaging their students in the practice of “constructing explanations” used video in which they facilitated an “argument” among the class, regardless of whether that argument resulted in consensus around an explanatory or non-explanatory claim. For example, one video instance identified by a participant as “constructing explanations” showed the participant’s students trying to answer the question, “Why does the moon seem to change shape?”, while another showed a participant’s students trying to answer the question, “Was new stuff made when we mixed the chemicals together?” While both of these questions indeed require students to engage in argument from evidence, only the question about the moon phases results in a causal explanation. That is, the question about the chemicals does not ask why or how new stuff was (or was not) made, but simply whether new stuff was made. The resulting claim, then, is not an explanation because it is not causal (Osborne and Patterson 2011). Therefore, while this particular video clip could certainly be identified as an example of Engaging in Argument from Evidence, it should not be used as an example of Constructing Explanations, because the claim that the class constructed does not qualify as an explanation. It is important to note that the distinction between these two practices, and the criteria for what counts as an explanation, have been the subjects of ongoing debate in the science education community in recent years (see Osborne and Patterson 2011; Berland and McNeill 2012; Osborne and Patterson 2012, for an example of this discourse). This debate and its relationship to these findings and to teacher education in general are further discussed in the implications section below.

Some themes among participants’ ideas spanned multiple practices (see Table 6). For example, the idea of communicating with others was common in the written reflections about the practices of Asking Questions, Developing and Using Models, Analyzing and Interpreting Data, Using Mathematics and Computational Thinking, Constructing Explanations, and Engaging in Argument from Evidence. The idea of asking and answering questions appeared in the data around the practices of Asking Questions, Developing and Using Models, Planning and Carrying Out Investigations, Constructing Explanations, and Engaging in Argument from Evidence. Evaluation/critical thinking was another idea that spanned several practices, including Asking Questions, Planning and Carrying Out Investigations, Analyzing and Interpreting Data, Constructing Explanations, and Engaging in Argument from Evidence. It is important to note, however, that ideas captured in the participants’ written reflections about the practices based on their reading of the Framework (NRC 2012) were not always enacted later in their lesson plans or teaching. The implications of this finding are discussed in the next section.

5 Implications

The findings of this study have important implications for teacher education, including implications about how teacher educators might use the Framework to help develop preservice teachers’ ideas about the practices, and how those outside of teacher education (such as philosophers of science) can contribute to teacher education, and in turn to science education in general. First, in several places among the data, it was unclear whether the participants’ lesson plans and teaching analyses encompassed their comprehensive understanding of the practices, whether they reflected their pedagogical knowledge about teaching the practices, or both. For example, if a preservice teacher lists “Analyze Data” as a “step” in her lesson plan, but does not elaborate on that step in the written plan, nor actually lead her students through analysis procedures when she teaches the lesson, does that mean that she does not fully understand what it means to analyze data, or does it mean that she has not yet mastered how to teach data analysis techniques to her students? The answer to this question will vary from teacher to teacher, so it is important for teacher educators to use various methods to assess preservice teachers’ ideas about how to engage in the practices versus their ideas about how to teach the practices. The results of these assessments can help teacher educators distinguish between areas where preservice teachers need more support making sense of the practices themselves, and areas where they only require pedagogical support.

Second, using the Framework (NRC 2012) as an introduction to the practices in elementary science methods courses may be a way for preservice teachers to begin to make connections between the practices, to improve their understandings of the nature of science, and to develop their scientific literacy. For example, while science content is commonly taught in disconnected “silos” in K-12 curricula, participants in this study drew several connections across practices, based on their reading of the Framework. Regarding their understandings about the nature of science as a human endeavor, the participants in this study identified “communicating with others” as a common theme among the practices, in contrast to the common layperson’s image of a scientist working in isolation. The findings of this study also include the participants’ common ideas about evaluation and critical thinking, skills necessary to the development of their scientific literacy. These data do not support a claim that simply reading the Framework will achieve the goals of making connections between the practices, improving understandings of the nature of science, and developing scientific literacy, but they do support the claim that using the Framework as a reference may help preservice elementary teachers begin to develop these understandings.

Third, these findings warn teacher educators of the dangers of relying too heavily on the Framework itself. Although the text is written in language that is meaningful to science educators, some preservice elementary teachers may have difficulty making personal meaning of the practices that they are then able to enact in their lesson plans and teaching. For example, in this data set the participants’ written reflections on the Framework (NRC 2012) contained somewhat sophisticated ideas about the practices, yet their lesson plans and teaching videos did not always reflect this level of sophistication (for an example, see the discussion about “Asking Questions” in the previous section). Their written reflections about the practices, then, may represent their ability to identify the main points in the text of the Framework (NRC 2012), but not necessarily reflect their understanding of how to enact those practices. Preservice teachers will likely need many more opportunities to apply these understandings in their methods class before they are able to make useful meaning of the ideas in the Framework (NRC 2012). Identifying the practices in which they engaged during the in-class investigations was one such opportunity for these participants, but the nature and topics of these investigations offered opportunities to make sense of some practices more than others. For example, in every investigation participants engaged in argument from evidence, but only one investigation offered participants an opportunity to develop and use models. Engaging in investigations in class is a common practice in elementary science methods classes. When methods instructors make decisions about which investigations to use in class, they may want to consider how the collection of activities provides opportunities for participants to engage in and make sense of each of the practices, especially those known to pose a challenge to preservice teachers.

This study, along with previous research in this area (see Footnote 2), indicates that preservice teachers may need extra support in understanding the practices of modeling and data analysis. In regard to modeling, in-class experiences should challenge preservice teachers not only to use existing models as tools for communication, but also to develop their own models as cognitive tools in the process of constructing explanations for phenomena as often as possible. In regard to data analysis, methods instructors might place an increased emphasis on specific data analysis questions—those posed by the methods instructor during in-class investigations, by expert teachers in videos of classroom instruction, and by the preservice teachers themselves in their lesson plans. This study contributes a new finding to the literature by pointing out the additional difficulty that preservice teachers may have in distinguishing between engaging in argument and constructing explanations (discussed below).

Lastly, these findings remind the science education community to continue the discussion about argument and explanation in K-12 education. The distinction between the practices of “Engaging in Argument from Evidence” and “Constructing Explanations”, along with what counts as an explanation, has been the subject of ongoing debate in the science education community in recent years (see Footnote 3), although the Framework (NRC 2012) does not specifically address the distinction between these two practices. In short, some authors (Osborne and Patterson 2011; Braaten and Windschitl 2011) argue that explanations and arguments are distinctly different kinds of knowledge claims: in order to qualify as an explanation, a knowledge claim must explain how or why something happens, not simply justify that something is true. These authors further argue that this distinction is necessary to make with students. Other authors (Berland and Reiser 2008; Berland and McNeill 2012) define “explanation” more liberally. In their work with students and teachers, arguments containing a claim, evidence and reasoning count as “scientific explanations”, even if the claim itself is not a causal explanation. Furthermore, Berland and McNeill (2012) argue that this distinction may not be important to make with students, and that overemphasizing it may contribute to students relegating these two closely interrelated practices to completely unrelated silos. The distinction between an argument and an explanation was not specifically addressed in these participants’ methods course, and the more liberal definition of explanation was usually applied when analyzing in-class investigations. It is no surprise, then, that the participants in this study conflated the two practices.

If teachers are expected to master these practices and use the terminology appropriately with their students, the science education community must first decide whether it is necessary to make this distinction with students, and, if so, publish texts that explicitly clarify the distinction in terms that are easily interpreted by teachers, even those without strong preparation in science. The science education community may need to seek the expertise of philosophers of science to help make this distinction more understandable. In this way, philosophers of science may have the opportunity to make an important contribution to the preparation of science teachers, and in turn, to impact the way that science is taught in schools.