Introduction

When inquiry investigations were first promoted for school science in the mid-1990s, many pre-service teacher education “science methods” courses were premised on the assumption that an undergraduate background in science was sufficient for teachers to implement those inquiry activities. Yet those of us teaching those courses often encountered difficulties promoting inquiry science to both new and experienced teachers. In Canada, the Council of Ministers of Education released the “Pan-Canadian” framework document for science curriculum in 1997 (CMEC, 1997), and that document was influential in promoting inquiry science in many provincial curricula.

Notably, the inquiry science described in provincial curricular documents is “open” inquiry (in contrast with the “guided” inquiry prevalent in the United States), and in our experience most of our academic colleagues teach about engaging public school students with this type of inquiry in their science “methods” courses. Despite methods course instruction focusing on inquiry investigations for the last decade and a half, conversations amongst our peers at conferences suggest that Canadian public school students experience few, if any, science inquiry investigations in most middle and high schools (with some notable exceptions, such as those described in Jones, Kaplanis, Melville, & Bartley, 2009), which limits the science literacy skills they develop.

Hodson (1998) has described science literacy as having three areas of focus:

(i) ‘Science’ – referring mainly to what can be thought of as the “products” of science, such as laws, theories and inventions.

(ii) ‘About Science’ – including learning ideas about the nature of science and the methods used for scientific inquiry.

(iii) To ‘Do Science’ – referring to the expertise, confidence and motivation of scientists – much of which appears to be tacit knowledge – required to develop and communicate knowledge in science and technology.

Generally it is recognized that schools tend to emphasize instruction in ‘Science’ and provide little education about the second and third categories. However, the inquiry investigations promoted in provincial curriculum documents do focus on the other two categories, and one might expect that teachers would be teaching in those domains. Despite this, research suggests that teachers have “difficulty creating classrooms that are inquiry-based” (Crawford, 2007, p. 613) and that inquiry investigations are actually implemented infrequently (see Brown & Melear, 2006; Salish I Final Report, 1997, Footnote 1) despite mandates to do so.

Research into pre-service teachers’ own competency with science inquiry identifies numerous issues, ranging from difficulty asking questions amenable to inquiry to difficulty representing and drawing conclusions from data, as well as the atheoretical manner in which student teacher participants often approached their inquiry tasks (see the overview by Bowen & Bencze, 2008). In general, pre-service teachers had difficulty with almost all of the features identified by the National Research Council (1996) as characterizing inquiry instruction, including identifying researchable questions, designing and conducting experiments, developing explanations, thinking critically about the relationship between evidence and explanations, and communicating scientific procedures and explanations. This is perhaps for the understandable reason that “Most teachers have not had opportunities to learn science through inquiry or to conduct scientific inquiries themselves” (NRC, 2000, p. 87). Teachers’ undergraduate experiences in university science programs tend to be lecture- and confirmatory-laboratory-oriented (Woolnough & Allsop, 1985), thereby influencing their perspectives on teaching (Beisenherz & Dantonio, 1991, p. 44).

This general lack of inquiry science experience affects the belief system each pre-service teacher has about their own science teaching (Bryan, 2003; Guillame, 1995) and influences their confidence (Cheng, 2002). Bianchini, Johnston, Oram, and Cavazos (2003) describe the challenges faced by first-year science teachers as they try to teach in contemporary ways, reporting that overcoming pre-conceived notions about inquiry-based science teaching was, perhaps, their greatest challenge. Our own pre-service teachers generally find it difficult to describe to us any successful and engaging experiences they had with science inquiry investigations as students at any level of schooling.

Overall, this suggests that “science methods” courses need to play a role in addressing this issue. The idea that it might be beneficial for pre-service teachers to engage in inquiry activities as part of their preparation to become teachers is not a new one. Duschl (1983, p. 753) recommended that pre-service teachers engage in “independent semester long science investigations or replications of previous investigations”, but there are few reports of this happening.

Altogether, this suggests that as science teacher educators we should provide inquiry science experiences within the context of our own pedagogy, which may in turn help pre-service teachers develop an understanding of inquiry from the perspective of a learner of science as well as from that of a teacher of science. In this chapter we report on three examples of activities drawn from our own “science methods” courses in which we attempted to model what was expected of the pre-service teachers in science classrooms while at the same time providing them, as students themselves, with experiences of engaging in an inquiry environment. Our approach in these example activities is consistent with an experiential need identified by John Loughran:

…in teacher preparation there is an acknowledgment of the need for student-teachers to be familiar with new teaching procedures and strategies, yet attempts to do so often flounder because these teaching approaches are ‘delivered’ through lectures, handouts and reference material as opposed to creating situations through which students genuinely learn about the teaching by experiencing it as both a learner and a teacher. (Loughran, 2001, p. 4)

Korthagen, Loughran, and Russell (2006) identified the need for pre-service teachers to “genuinely engage in experiencing the various aspects of teaching in an environment where [engagement in experience] is the focus, rather than in an environment where successful teaching and ‘controlling’ students is the dominant concern” (p. 1029). Each of the examples we describe here sought to offer an environment where pre-service teachers could immerse themselves in various aspects of teaching in a supportive and reflective inquiry-based context. Moreover, keeping in mind Cheng’s (2002) comments on the role of previous science experiences, we wanted our pre-service teachers to engage with inquiry-based investigations in which they were able to ask questions and experiment (reasonably) freely, with the aim that they would feel confident enough to make inquiry-based science investigation lessons a significant part of their own teaching.

The activities described in this chapter are ones we developed to address problems we had encountered with other inquiry activities we had tried (such as those described in Bowen & Bartley, 2007; Bowen & Bencze, 2008). As our awareness of the problems student teachers had with inquiry activities developed, we revised our classroom activities with the intent of engaging pre-service teachers more effectively. The activities we describe in this chapter generally focus on the development of pre-service teachers’ “data literacy” (see Bowen & Bartley, 2013) because the idea of “data” underlies the practice of science (Latour, 1987). In addition, current curricular directions in the United States place an increased emphasis on data literacy (see NRC, 2011) and, thus, we see working with data as fundamental to the science inquiry process.

In each of the following science inquiry examples we describe what we hoped to accomplish with the activity, how the pre-service teachers engaged in it, what they accomplished while participating, and what each of us as science “methods” instructors learned from it. Example I discusses student teachers engaging in an introduction to self-directed inquiry using Jello (Bartley). Example II examines the outcomes of pre-service teachers sampling and counting grass in an uneven area (Bowen). Example III describes a three-part activity in which pre-service teachers engaged with “science fairs”: first producing a project of their own, then judging student projects at a local science fair, and finally critically discussing an academic publication about science fairs (MacDonald). As part of the discussion of these examples we highlight how our pre-service science teacher students engaged in these different investigation activities and the insights that we gleaned from their participation.

Data Methodology

While attending a Canadian conference (Footnote 2), we (the authors) discussed the different approaches we used in our methods courses and the subsequent student learning. We decided that a paper discussing some of these methods might be useful for other “methods” instructors and that our collective experiences might provide insights into issues arising in science “methods” courses. In our discussions we realized that we had each kept written records as our recent science methods classes had progressed. Not only did we have copies and records of our students’ assignments, we also had notes we had recorded by hand for individual classes: on our teaching outlines, in teaching diaries, or in emails we had exchanged with various others (including amongst the authors, with our students, or with other instructors or administrators). We realized, in essence, that we had a data set we could examine to determine the effectiveness of our individual teaching practices in relation to each other’s successes or failures – particularly for activities intended to teach our students about independent inquiry investigations, with which our discussions had revealed we were each having difficulty in our individual settings. Consequently, we each wrote about an example teaching activity that our peers might find useful in the conduct and planning of their own “science methods” courses, and we then shared these accounts with each other. In our collective reading of these cases we gained insights into our own practices and those of others, but we also gained insights into broader issues of science teacher preparation through the juxtaposition of findings across the different cases. Subsequently, drawing on our reading of each other’s experiences teaching inquiry science methods through experiential approaches, we constructed a conference proposal from these experiences (Footnote 3), which we presented the following year at the same conference.

Each instructor used the following resources to write about their curricular examples:

  • the description (written and verbal) provided to students about the activity

  • notes of comments recorded during classes with students

  • notes of comments and student engagement recorded following the classes with students

  • student work collected from the activity

  • notes of comments following the return of graded student work

  • notes drawn from class records and course outlines

The instructor of each example elaborated on it, providing the context of its use, the student engagement, and the implications drawn from that engagement (Footnote 4). As experienced instructors (each with 10+ years instructing “science methods”) we each described scenarios with the intention both of improving our own practice and of providing critical descriptions of classroom practice that might engender discussions amongst our peers, so that our collective practices may improve.

Interpretations of data records from the individual studies were strengthened through collaborations within our group. Bowen and Bartley had each attended the other’s activities in previous offerings of their courses and used that experience in examining the various records each used to write his individual case study. MacDonald’s data were collected in his own class, but his interpretations were checked by Sherman, who had previously worked on that same sort of activity with students at his institution.

After we produced these three examples, we collectively used them as a data source, which we analyzed using an interaction analysis approach (Jordan & Henderson, 1995) drawing on grounded theory (Strauss & Corbin, 1990), interpreting each other’s work and constructing insights by looking across the case studies. Collectively, we found that the examples we had individually provided gave us insights into issues found throughout science education, and we write about these in the overall conclusions. Thus, inasmuch as our individual examples represent the data/information we individually collected on the activities in our classes at our different institutions, each example also acted as a data source for the co-authors in drawing our overall insights and conclusions. At the very end of the chapter we discuss the influence this has had on the classes we now teach and the new activities some of us are trying, either in addition to or in place of those in which we concluded issues were present more broadly than in just our own individual classes.

Example I: An Introduction to Investigations (Bartley)

Background

My approach to this investigation activity is informed by the work of Tamir (1991), who provides two illuminating tables (depicted below). In the first table Tamir describes the roles of scientists and technicians, and of teachers and students, by posing the question, “Who does what in the science laboratory?” (p. 16)

Activity                                Scientist’s lab    School lab
Identifying problem for investigation   Scientist          Textbook or teacher
Formulating hypotheses                  Scientist          Textbook or teacher
Designing procedures and experiments    Scientist          Textbook or teacher
Collecting data                         Technician         Student
Drawing conclusions                     Scientist          Student or teacher

Tamir argues that student work will often correspond to that of a technician, representing a lower status and level of engagement than that of the scientist advocated in the documents guiding science teaching and learning, e.g., the National Science Education Standards (National Research Council, 1996) and Science for All Americans (AAAS, 1990), and in various provincial science education documents in Canada.

Tamir (1991) also categorized inquiry investigations according to the degree of openness of the problem choice, the experimental design and the choice of conclusions.

Levels of Inquiry in the Science Laboratory

Level of inquiry   Problems   Procedures   Conclusions
Level 0            Given      Given        Given
Level 1            Given      Given        Open
Level 2            Given      Open         Open
Level 3            Open       Open         Open

These levels represent different degrees of openness, from Level 0, where problem, procedures and conclusions are all given and students simply collect the data, to Level 3, where students do everything themselves. Tamir describes most teachers as typically operating at Levels 0 and 1, while Levels 2 and 3 offer students more authentic learning experiences; these higher levels correspond to “open inquiry” types of investigation activities.

Context

The secondary physics and chemistry methods courses at Lakehead University (in Ontario) are taught in a single group. Over the last two decades instructional time for this course has varied from 54 to 81 h and is currently 72 h (20 % of the total program instructional time). Compared to many jurisdictions, students in science methods classes at Lakehead often have strong science backgrounds (which might be considered “well-qualified”), ranging from being in the final year of an honours science degree program to having doctoral/post-doctoral experience. However, few students reported prior experience with designing and performing investigations. In reference to Tamir’s levels of science inquiry, most students had experience with Level 1 science investigations, some had experience with Level 2 investigations, while a few had some Level 3 investigation experience (thesis work at honours (bachelors), masters or doctoral level). Apparently little has changed since Woolnough and Allsop (1985) wrote, “most science teachers have themselves been brought up on a diet of content dominated cookery book type practical work” (p. 80). My challenge was to enable pre-service teachers to experience Level 2 or 3 investigations, and thereby feel able to engage their own future students in such activities (which most provincial curriculum documents call for at middle and high-school levels).

The Activity

For this chapter I present a single activity (one of several like it that I conduct over the school year). The first 20 or so minutes of the first class are spent on an ice-breaking activity in which participants introduce themselves to adjacent classmates and work together to melt ice cubes as rapidly as possible. This leads to a discussion of the science involved in melting the ice cubes, and of various other issues.

The follow-up activity – “Gelatin and the Bath” (See Fig. 13.1) – is usually presented in the second or third class of the course. Pre-service teachers are provided with three packets of Jello™ and are advised that they have 80 min this class, 45 min the next class, and 30 min the following class, 1 week later, to complete the activity.

Fig. 13.1 Gelatin and the bath activity sheet (Ainley, Brown, Butler, Carrington, & Ellis, 1988)

Students are allowed any equipment available in the lab and are provided with a broad range of balances (milligram to 2 kg ranges). In addition, they are asked to consider how they would approach teaching this activity with 13-year-old students, as this was a planned component of the activity. The activity involves trying to find the minimum amount of Jello that will “set” a tub half full of water; the “answer” will be in the form of packs or mass of Jello (which is therefore the variable of interest). During the activity the pre-service teachers initially need to establish criteria for what it means for the Jello to be “set”. Because there are different possible definitions of “set”, it is a variable (a covariate) that has a considerable effect upon the result, yet it is of no direct interest in their final claims about the amount of Jello needed. In essence, this is an activity about the scientific practice of extrapolating from a known (how much Jello is needed to set small amounts of water) to an unknown (how much Jello is needed for half a bathtub full of water).

Data and Discussion

Much of the activity sheet, and my introduction to the investigation, ensured that pre-service teachers were well aware that this was not a confirmatory experiment. However, some groups flirted briefly with purely analytical objections, arguing that a scaling model might not apply here:

If we only have three packs of Jello we cannot make the full size bath tub.

What if the Jello has to be close to a surface to set?

After about 15 min they perceived the problem as sufficiently defined, without any remaining tensions concerning the size of the bathtub, and their recognition that the problem did indeed have an approachable solution led them to move to the experiment:

Find out how big the bath tub is, say x Litres, then x/2 is the volume that we are working with.

Producing a model bathtub and scaling up led some to work with 100 mL beakers and 50 mL of water, while others used 250 mL or 400 mL beakers with 200 mL of water.

The bigger sample is much better. We can get a better model of the bathtub than if we use small samples.

The small samples enable us to do many experiments. We can be more certain about our results with many experiments.

For some the definition of “set” was uncomplicated, as was its measurement.

Our definition of set is when the Jello will support a coin such as a penny for at least a minute.

The Jello is set when we can turn the beaker upside down and the Jello does not fall out.

For others, there was a necessity to seek an authoritative definition of “set”. The manufacturers of Jello™ provide a toll-free questions/concerns/help phone line, and each year at least one group member contacts the company to elicit its definition of “set”. Some years this has been fruitful; in other years it has been of little value beyond comic relief, as the person at the call centre has not taken the request for a definition of “set” as a serious question. One of the more valuable responses was:

The food technician on the phone told me that they carefully remove the Jello from the container and place it on a plate or similar flat surface. Then they cut the Jello with a sharp knife. If the cut line is straight and the knife blade is not wet, then the Jello is set.

On being asked if it would be possible to compare results from one group to another, most groups concluded that such a question would require its own sub-set of experiments and that, had I needed such a comparison, it should have been built into the question. Providing a solution concentration (g of Jello per litre of water) was deemed an appropriate starting point for later discussions.

In my 17 years of working with this activity there has been one year with the unique result that none of the five groups was able to produce an answer to the question: either all of their samples had set, or none of them had. As an instructor it was a teachable moment: university science graduates had failed to complete an experiment deemed appropriate for middle school students, and they wanted to explain what had happened.

My experiences in working with around 110 groups on this activity have given me many rich discussions about why this is not a trivial task. The stronger teams will usually set up five or six trials after the first session (varying the mass of Jello per unit volume of water) and have some samples “set” and some not. For a second set of trials, the pre-service teachers then use samples with concentrations between the most dilute “set” sample and the most concentrated “not set” sample, using up the remaining Jello™ for these trials.

Typical results come in around 80 packets of Jello™ (mass of each packet = 80 g) for a volume of about 150 L (≈40 US gallons). Follow-up conversations discuss “accuracy”, “precision”, and why both would be problematic given the latitude in the definitions of “set.” At my prompting the discussion also involves whether providing a range of concentrations would be appropriate or feasible.
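A rough sketch of this bracketing-and-extrapolation arithmetic is given below; the trial concentrations, threshold, and function names are our own illustrative assumptions, not data from any class.

```python
# Minimal sketch of the "Gelatin and the Bath" arithmetic, assuming
# illustrative trial values (not data from any actual class).

def packets_needed(threshold_g_per_l, tub_volume_l=150.0, packet_mass_g=80.0):
    """Extrapolate a small-scale 'set' threshold to a half-full tub."""
    return threshold_g_per_l * tub_volume_l / packet_mass_g

# First round of trials: (grams of Jello per litre of water, did it "set"?)
trials = [(20.0, False), (30.0, False), (40.0, False),
          (50.0, True), (60.0, True)]

# Bracket the threshold between the most concentrated sample that did
# not set and the most dilute sample that did, as the stronger teams do.
highest_not_set = max(c for c, did_set in trials if not did_set)  # 40 g/L
lowest_set = min(c for c, did_set in trials if did_set)           # 50 g/L
threshold = (highest_not_set + lowest_set) / 2                    # 45 g/L

print(f"Estimated threshold: {threshold:.0f} g/L")
print(f"Packets for a 150 L half-tub: {packets_needed(threshold):.0f}")
# ~84 packets here, in line with the "around 80 packets" typical result.
```

Note that the covariate discussed above (the chosen definition of “set”) enters this sketch only through which trials get marked True, which is precisely why different groups’ “answers” can legitimately differ.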

This activity is a particularly useful introduction to self-directed inquiry. The task demands careful analysis, much deliberation and calculation, and a sense of humour. My experience is that students remember this activity and can discuss it well into their teaching careers.

One year my class was able to spend some time with two seventh grade classes working through the activity. Having completed the activity themselves, my pre-service teachers thought they had a good handle on what the issues might be with real students in a school setting. The ensuing hour could best be described as a wonderful learning experience, as they realized not only that their own specialized science language was beyond this audience, but also that teaching students to think about their own ideas required patience and gentle tenacity. Timing was also an issue, as the allocated 65 min passed very quickly. Questions of “fair test”, dilution and “set” reappeared consistently, with mixed levels of resolution. At the end of the class the school students thanked the pre-service teachers and left. We then held a debriefing session in which there was a sense of amazement at the kinds of questions school students would come up with given an audience, for example: “What is the influence of shaking the packet before dilution?”, “How would you find the coolest location in the classroom for setting?” and “Could I define ‘set’ as strong enough for me to stand upon without splatting?” A second debriefing came after the teacher had spoken with the grade 7 classes about their experiences. In that debriefing the teacher’s primary suggestion was that the pre-service teachers be less directive and think more carefully about the questions they used to probe student comprehension. Given a week to reflect upon this, we returned to support another class in the activity. While the ability and experiences of this class were similar to the first group’s, the revised approach of the pre-service teachers led to a very different and more positive learning experience for all in the room.

Example II: Counting Grass (Bowen)

At the beginning of a fall semester middle school/secondary science methods course, in which most students have a university background in biology (as opposed to chemistry or physics), I have my post-baccalaureate B.Ed. students participate in a short inquiry investigation. I have them go outside to a bounded area of grass and address the following science “problem”, which is framed as “authentic” for them: “On the designated patch of grass outside, with your partners (in teams of two or three), estimate the total number of blades/amount of grass in the area.” The students are also told that, back in the classroom, they are to “Provide a written step-by-step illustration of how your group calculated the total amount of grass in the patch”, with the goal of providing a “compelling and convincing” argument about how much grass is found in the area. Finally, after each group’s data have been collected, they are combined with those of previous years and students are asked: “Using the cumulative data set, choose and draw a graph that shows the most useful/interesting summary of the data that is possible. When you have completed your graph, write a paragraph to describe your interpretation of the data that you depicted in your graph.”

For this activity I have chosen a piece of land (approx. 11 m by 4 m) almost completely bounded by concrete curbing (with a small 2 m section that has rocks intruding on it from an adjacent area). There is a small worn “path” that crosses it diagonally, a stump, two trees, a hydro pole, and a small dirt patch. In addition, the grass is obviously “patchy” with higher densities in some areas and lower densities (mixed with other small plants) in others. Three sides are essentially straight, and one has a gentle arc. As a middle-school teacher I have successfully used a similar activity with grade 8 students (Note: their site had fewer complexities; no gentle arc, concrete around the complete area, and less “patchiness”).

For the pre-service teachers this is framed as an ‘authentic’ type of activity that a field scientist would do. They are provided examples of scientists such as ecologists monitoring long-term changes correlated with other factors (such as earthworm density, vertebrate herbivore density, or the addition of nutrients to the soil). The ‘authenticity’ is also embedded in students recognizing (a) that there is a finite amount of grass in the area, and (b) that neither I, nor anyone else, actually knows what that finite amount is. I provide the students with clipboards, blank paper, and measuring instruments (field tapes and metre sticks). Notably, all of the measuring instruments have both metric and imperial scales on them. [Canada adopted the metric system in 1974 and it is all that has been used in science and other classrooms since that time.]

My purposes in having my secondary science methods students engage in this activity are multi-fold. Firstly, a critique of another diagnostic/teaching exercise that I have used in the past (the Lost Field Notebook (LFN) exercise; see Roth, McGinn, & Bowen, 1998) is that its data are decontextualized for the students, lying outside of experiences they have had. In the LFN activity students are provided a map of an “ecozone” (no scale is provided) with unevenly shaped areas that break up the ecozone. Within each area is a pair of numbers: one for bramble density, the other for light intensity. Students are asked to analyze and make a decision about the relationship between light intensity and bramble density. For individuals with a BSc in science, responses are often quite poor, although science professors have indicated that they would expect a high level of response from anyone with a BSc degree (unpublished data). In order to address this data analysis issue (and the critique that the decontextualized nature of the LFN activity contributed to the poor responses), I developed the grass-counting activity as one involving the type of data that field scientists would collect (i.e., it was “authentic”) and that was do-able in a 3-h class. The grass-counting exercise provides students a data set that derives from their own first-hand experience, because they themselves have collected it. Thus, it provides me a diagnostic on their overall data collection and representation skills (how they define variables, how attentive they are to detail, what units they use, what sampling regimes they use, their use of significant figures, etc.). I further use this activity as a starting point to discuss the various forms of inquiry (as depicted in Tamir’s table), how one would evaluate investigation activities (particularly open-ended ones), what makes an activity “science”, characteristics of science (i.e., the Nature of Science), student motivation (when they have input into the design of activities), and so forth. This is all described to the students in advance so that they understand that the activity is the foundation for later activities and discussions in their “methods” course.

Student Engagement: Reports

For this example I will discuss a summary of the methods (collection and analysis) used by 19 groups of students (41 students in total). Student participation outside was quite focused and generally enthusiastic. All students seemed engaged and interested, and they were encouraged to talk with each other if they had difficulty deciding how to do something. This was intended to raise the standard of the work and to set the tone for engagement in class activities throughout this and following semesters.

Of the 19 project reports, two general strategies were used for sampling. One strategy, the “whole area” strategy (Fig. 13.2), involved treating the entire area as “one sector”, randomly sampling within that sector, and then extrapolating to the surface area of the entire sector. Eleven groups used this approach. To develop accurate estimates one would expect a reasonably high number of samples to be needed (given the quite noticeable patchiness of the grass, with a high density of small non-grass plants in some areas). Across these 11 groups the arithmetic mean was 3.45 samples for the total area (median 3; range 1–10 samples), with four groups collecting only 1 or 2 samples and three groups collecting 4 or more. These groups also “eyeballed” decisions about how to compensate for weeds or grass patchiness; statements such as “we decided to subtract 10 square feet from area to compensate for uneven grass growth”, “Assuming ~6 % weeds”, and “We estimated the number of weeds to be 1/3 of the area” typified this.

Fig. 13.2 Examples of sampling strategy maps. (a) Simple “subsection strategy” sampling map showing three sampling areas in zones delineated by physical features. (b) Detailed “whole area” sampling map showing “scale” and four sampling areas

The other strategy, the “subsection strategy”, involved dividing the total area into subsections and then making estimates within each of those areas. Eight groups used this strategy, with two different approaches. In the first approach, groups divided the area into zones on the basis of physical features (such as the path) or for geometric measuring/calculating purposes (for example, creating rectangles) (see Fig. 13.2a for an example). The problems with this approach could be mitigated by higher sampling (which one group did, measuring 1 cm² of grass five times in each zone), but most groups conducted only one or two measures of grass density. The second approach involved dividing the area into zones based on visible differences in grass density, with sampling in each zone; here sampling ranged from 1 to 11 samples per zone (note that the one-sample-per-zone project sampled large areas, ~100 cm²). Notably, the projects that divided the area into zones on the basis of visible differences in grass density also sampled and counted larger areas of grass (in all but a few cases, 25 cm² or more per sample), whereas among the single-zone (“whole area”) projects, eight groups sampled 8 cm² or less. A sketch contrasting how the two strategies assemble an estimate follows.
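In the sketch below, all densities and zone areas are hypothetical placeholders, not values from the student reports.

```python
# Minimal sketch contrasting the two sampling strategies; all counts
# and areas are hypothetical placeholders, not class data.
from statistics import mean

def whole_area_estimate(densities_per_cm2, total_area_cm2):
    """'Whole area' strategy: one pooled mean density, extrapolated
    uniformly over the entire patch."""
    return mean(densities_per_cm2) * total_area_cm2

def subsection_estimate(zones):
    """'Subsection' strategy: a mean density per zone, each scaled by
    its own zone area, then summed."""
    return sum(mean(samples) * area_cm2 for samples, area_cm2 in zones)

TOTAL_AREA_CM2 = 1100 * 400  # the ~11 m x 4 m patch

# (blade-density samples per cm^2, zone area in cm^2) for two zones.
dense_zone = ([2.4, 2.1, 2.6], 250_000)
sparse_zone = ([0.7, 0.9], 190_000)

print(whole_area_estimate([2.4, 2.1, 2.6, 0.7, 0.9], TOTAL_AREA_CM2))
print(subsection_estimate([dense_zone, sparse_zone]))
```

With patchy grass, the subsection estimate weights each density by the area it actually represents, which is why low sampling rates penalize the whole-area approach more heavily.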

When students were instructed to conduct their measurements in metric, all but one group did so; however, when that instruction was not given, five of eight groups did not measure in metric. This was surprising because, according to curriculum guides and their own commentary, all of their instruction in schools (both public school and university) was done using the metric system and, notably, all commercial product sizes and signs in stores in Canada have also been metric for 30+ years; that the “default” practice of so many students was imperial measurement was thus quite unexpected. Moreover, even when students did use metric (13 groups did), there were quite a number of calculation errors in extrapolating from the sample size (such as the number of blades of grass in 100 cm²) to the number of blades of grass in a square metre. Four groups made errors in this calculation, usually neglecting to “square” the linear scaling factor (for example, multiplying the number of blades of grass by 10 instead of 100 when going from a 10 × 10 cm sample to 1 m²).
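The squaring error is easy to see in worked form; the blade count below is a hypothetical placeholder.

```python
# Minimal sketch of the cm^2-to-m^2 extrapolation, showing the unit
# conversion step several groups got wrong; the count is hypothetical.

CM2_PER_M2 = 100 * 100              # 1 m^2 = 10,000 cm^2

sample_area_cm2 = 10 * 10           # a 10 cm x 10 cm sample = 100 cm^2
blades_in_sample = 230              # hypothetical count

# Correct: the linear factor of 10 must be squared for an area.
blades_per_m2 = blades_in_sample * CM2_PER_M2 / sample_area_cm2   # x 100

# The common error: scaling by the linear factor only.
wrong_blades_per_m2 = blades_in_sample * 10                       # x 10

plot_area_m2 = 11 * 4               # the ~11 m x 4 m patch
print(f"Correct:    {blades_per_m2 * plot_area_m2:,.0f} blades")
print(f"With error: {wrong_blades_per_m2 * plot_area_m2:,.0f} blades")
# The x10 slip understates the final estimate tenfold.
```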

Prior to the activity, the central role that inscriptions play in science in constructing compelling and convincing arguments had been discussed. Despite this in-class discussion of the importance of visual representations in science, two groups did not provide a drawing to accompany their calculations and description. Of the 17 groups who did, 10 provided essentially simple sketches of varying detail. Fifteen of the 17 included scales/measurements, only 9 labelled or included features (such as trees or the rocky area), and only 4 gave any indication of where samples of grass density were taken.

Student Engagement: Graphing the Year-to-Year Data

Twenty students were instructed: “Using the cumulative data set, draw a graph that shows the most useful/interesting summary of the data that is possible and provide an interpretation.” Students could produce more than one graph if they chose. In that year there were a total of 22 “grass estimates” (11 each from this and the previous year), including the students’ own estimates (which they watched being entered into the table as they gave me the numbers). As an instructor I found this quite instructive, as the issues that had been present in the diagnostic Lost Field Notebook exercise were played out again but went further, indicating even more serious problems. Students appeared to have no conceptual framework guiding their choice of graphs. Overall, students produced ten bar charts, one pie chart, one stem-and-leaf plot, and nine line graphs/scatterplots. The bar charts were often a compilation of the 2 years of data with ordered (but uneven) categories (see Fig. 13.3a). There was no bar chart showing the average number of blades of grass for each of the 2 years (which is what I had expected students to produce, and which would be the most sensible comparison of the data). The pie chart was a compilation of data from the 2 years (showing 22 “slices”). The line graphs/scatterplots either (a) ordered the 11 data categories from smallest to largest and plotted them with a joining line (Fig. 13.3c), (b) did the same but with “trend lines” drawn, or (c) plotted the data in the order shown in the data table, sometimes broken into the 2 years, with the x-axis being the row number in the data table and the y-axis the number of blades of grass (see Fig. 13.3b), in several cases with a trend line drawn. In the majority of cases there were also issues with the interpretive statements the students made about their graphs. In general, the graphs indicated serious issues in the choice of graph: for example, a scatterplot is not possible with these data, and there is no “science” reason to draw an “ordered” line chart, particularly for comparing the data across the 2 years, or to draw a “trend line”.
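The comparison I had expected – a bar chart of each year’s mean estimate – can be sketched as follows; the 11 estimates per year are hypothetical placeholders, since the class data are not reproduced here.

```python
# Minimal sketch of the expected year-to-year comparison graph; the
# estimates are hypothetical placeholders, not the class's actual data.
from statistics import mean, stdev
import matplotlib.pyplot as plt

year1 = [52_000, 61_000, 48_000, 75_000, 58_000, 66_000,
         49_000, 71_000, 55_000, 63_000, 59_000]  # 11 group estimates
year2 = [57_000, 64_000, 51_000, 69_000, 60_000, 73_000,
         54_000, 68_000, 62_000, 58_000, 65_000]

means = [mean(year1), mean(year2)]
spreads = [stdev(year1), stdev(year2)]  # show spread, not a "trend line"

plt.bar(["Year 1", "Year 2"], means, yerr=spreads, capsize=6)
plt.ylabel("Estimated blades of grass (mean of 11 groups)")
plt.title("Year-to-year comparison of grass estimates")
plt.show()
```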

Fig. 13.3 Various graphical representations of grass counts in two different years – (a, b and c) left-to-right respectively – note that each shows an x-axis count of “11” (which actually represents the number of student groups in each year)

Conclusions

An ongoing concern of mine has been data literacy issues among my secondary “science methods” students. Engaging in this short inquiry investigation with them – an activity that every science professor I have discussed it with believes should be well within the scope of someone with even a minor in science – highlights for me that we need to engage pre-service teachers in a “science methods” course (not a “science teaching methods” course) as part of their preparation to be science teachers, to help them develop their skills in the applied practices of science investigations. My students’ difficulty with this activity, in both the sampling and the graphing, suggests to me that it would be difficult for most of these pre-service teachers to effectively engage their own students in any form of independent inquiry investigation. Anecdotally, over the years I have found that the students who were most comfortable with (and most skilled at) data/investigation activities such as this one were also the ones who seemed most inclined to have their own students engage in science investigations when they became teachers; this highlights the importance of developing these data/investigation skills in their B.Ed. program. If we really intend to have middle and high school students engage in investigations where a variety of possible data collection and representation methods are at play, then the data literacy of their teachers needs to be developed. Although I am satisfied with the role this “grass count” activity plays in other aspects of the “methods” course (such as discussions of inquiry, evaluation of inquiry, and the role that student control can have on motivation), even the other data-literacy-oriented activities I conduct throughout the year may well be insufficient to address the depth of the problems this activity reveals. As a result of these findings (and others) I have become an advocate for a course in B.Ed. programs that focuses on inquiry and data literacy through hands-on investigations (such as the elective documented by Melear, Goodlaxson, Warne, & Hickok, 2000).

Example III: The “Science Fair” Project (MacDonald)

This teaching example reports on an activity carried out at a teacher education institution in Nova Scotia with secondary-level pre-service teachers. The example presents aspects of the pre-service teachers’ perspectives on science inquiry as revealed by their engagement in a three-part assignment – intended both to develop their understanding of inquiry at the higher levels of Tamir’s (1991) scale and to reveal the ways they engage in scientific inquiry – that was part of a “science methods” course. The three parts were: (1) pre-service teachers were asked to conduct their own inquiry investigation and present their findings at a university course-based science fair; (2) pre-service teachers were asked to participate as judges at a school-based science fair and describe in writing the projects they felt were the best exemplars of science inquiry; and (3) pre-service teachers were asked to read Bencze and Bowen (2009) (Footnote 5) and make connections between that paper and the science fair at which they acted as judges. The following is a synopsis of the assignment as provided to students:

This Assignment Has Three Parts as Follows:

(1) You will carry out an extended open-ended science investigation. You may work together with a partner if you wish. You should choose a question to investigate and gather data over time (8 weeks are available). You should strive to formulate an original question about the everyday world around you that you can explore using easily accessible materials and equipment (most material and equipment requests can be accommodated using our existing science resources). You will present the findings of your investigation as a poster presentation (poster board available in our resource centre) as part of an in-class science fair held near the end of this course. You should keep a journal that records your activities in this project over time. Your project will be evaluated using the Canada Wide Science Fair Evaluation Rubric (Note: CWSF judging has changed since this activity).

(2) In the second part of this assignment you will participate in a local science fair as a judge. The local Junior School will hold a science fair and everyone in our Science Education course has been invited to participate as judges. At the science fair you will be given a judging assignment that will involve you in interacting with several young people in short (i.e. 10–15 min) discussions about their projects and using the school science fair rubric (i.e. Canada Wide Science Fair Evaluation Rubric) to evaluate projects. After the completion of the science fair, you should write a short essay (e.g. 1–2 pages) in which you describe key aspects of the projects that impressed you as being good examples of science inquiry.

(3) After the completion of the science fair, you are asked to write a reflective essay in which you use specific examples from the science fair to respond to the arguments of Bencze and Bowen (2009). It will be important to make specific reference to cases from the science fair where you act as a judge as you respond to this paper. Try to respond to the following questions: What did/did not surprise you in the science fair? What would you change about the science fair, if you could change anything? Do you agree with the perspective toward science fairs presented by Bencze and Bowen?

There were 50 pre-service teachers involved in this activity, all students in a 2-year post-baccalaureate teacher preparation program. All of the participants had successfully completed a BSc degree at a Canadian university that involved at least 30 credits of undergraduate science coursework, including lab-based science courses. Approximately half of the participants reported having participated at least once in a middle-school and/or high-school science fair as a student.

Inquiry Perspectives Revealed by Pre-service Teachers’ Science Fair Investigations

The science fair projects carried out by the pre-service teachers were disappointing to me, as their professor, in the sense that they were not high-level inquiry projects as judged by the Canada Wide Science Fair Evaluation Rubric. As the professor, I rated all of the projects presented by the pre-service teachers in the science education course. Most of the projects, whether studies or experiments, were low level (i.e., Level 0 or 1) on Tamir’s scale. In short, none of the pre-service teachers in the class carried out a high-level (i.e., Level 2 or 3) inquiry project, even though such projects had been presented to, discussed with, and modeled for the students prior to the assignment. A list of typical questions investigated is presented below (Table 13.1).

Table 13.1 Example “Science Fair Project” questions

This list of questions reveals that none of the pre-service teachers chose to investigate an original question (i.e., one to which they did not already know the answer). The pre-service teachers in this example seemed to choose one of two ways to engage in their own science fair inquiry activity. One category of students used the opportunity to recreate science fair projects they had found reported on the internet. This group rationalized their decision by saying that it was a valuable way for them to better understand the kinds of projects they might encounter in young people’s school-based science fairs. The second category of students used the opportunity to refine an undergraduate university lab to “give it science fair qualities”. This group rationalized their decision as an opportunity to develop a resource that might be useful in their future teaching. Judging from both verbal comments and various written submissions, essentially none of the pre-service teachers felt that high-level inquiry-based thinking was an important or necessary component of their teacher education program.

The pre-service teachers who had some experience in university-based science inquiry (e.g., as research assistants) also reported that they did not consider high-level science inquiry to be especially important at this point in their development as teachers. Many of the pre-service teachers felt the amount of time and energy they would need to invest in a high-level inquiry project would be too much for a three-credit science methods course within their B.Ed. experience. Pre-service teachers who described this belief tended to choose science fair projects based on university science labs they had completed as part of their undergraduate degree (sometimes with small modifications).

Conversations held with the pre-service teachers after the completion of their science fair projects revealed that most of them did not consider themselves to have ever engaged in authentic science inquiry, either in school-based or university-based science experiences, so perhaps it should not be surprising that these pre-service teachers were not able to produce high level inquiry projects despite their science degrees.

Inquiry Perspectives Revealed by Pre-service Teachers’ Experiences as Science Fair Judges

The pre-service teachers all reported that they enjoyed their experiences as judges in the school-based science fairs. The pre-service teachers visited three separate school science fairs displaying projects completed by students in grades 7–10. A portion of a typical response follows:

To finish off this reflection I want to touch on a couple of the questions presented to us. I was surprised that no one did a presentation on something that I had never heard of before. Each topic was something that I knew what the result was going to be before I started. It did not surprise me on how well the students did on the presentations. Being involved in the school previously I had known the high expectations the teachers have for the students. If it was possible to change one thing about science fair I would want all the students to do something original. I was not able to see any innovations and I think this would be a perfect section for this part. However, I do not think that innovations and original ideas are the most important part of a science fair. Rather, I think that motivating students to want to do more science is the most important thing. The students that I talked to were all very excited to be participating in the science fair.

This response is “typical” in that, although the pre-service teachers noticed that student projects were typically not original, they did not consider originality an important aspect of the quality of the student projects. This suggests that their orientation was towards having students engage in more traditional “confirmatory” investigations rather than more original ones.

After reading the article by Bencze and Bowen (2009) and being asked to comment on it, virtually all of the pre-service teachers disagreed with the perspective presented in this article. Overall, they tended to dismiss the issues identified in the paper and chose to focus on what they perceived as significant benefits for school students during science fair engagements. For instance, one pre-service teacher wrote in his reflection:

The last thing that I want to talk about is the stressful and frustrating part mentioned by Benzce & Bowen (2009). Being in the classroom before this science fair was carried out, I was able to see that the students were given months for figuring out their topics and ideas and they had many deadlines along the way so they did not complete the activity in one night. The science fair provided an excellent opportunity for students to work in a hands-on way and display their multiple intelligences. Overall, I think that science fairs are great tools that allow the students to have some fun with science.

All of the pre-service teachers highlighted the “fun” aspect of science fairs. I now suspect pre-service teachers are not fully ready to think about science fairs in a critical way. This suggests they need more experiences interacting with young people involved in science fair activities in order to develop a more critical eye about student participation in science fairs. My own experience as a science educator in Nova Scotia is that many teachers in schools tend to remove themselves from the science fair process (either as judges or as support people) because they do not feel qualified to support inquiry of any kind. Most school-based science fairs and regional science fairs select judges who are not active teachers. I think this suggests that school teachers also need to develop a deeper understanding of inquiry and need to become more involved in the nature of their students’ thinking as the students engage in science fair activities.

Discussion and Implications from the “Science Fair” Activity

It seems that my pre-service teachers considered the motivational features of science fairs to be the most important component of this kind of learning experience. All of the pre-service teachers engaged in only relatively low-level inquiry projects when asked to conduct their own investigations. In fact, all of these teachers reported that they did not consider their own performance as science investigators to be tightly connected to their future performance as teachers. In short, they seemed to be saying that one does not need to be able to do inquiry in order to teach inquiry effectively.

In their reports of what they noticed in their science fair judging experience, pre-service teachers tended to focus on project features such as length of time, STSE connections, independence, and the quality of the written and oral reports made by young people. While these features are useful to know about, none of them focuses on the level of inquiry displayed.

Finally, after reading Bencze and Bowen (2009) and reflecting on this piece of literature in the context of their school-based science fair experiences, none of the students considered the issues described in the paper to be relevant. The pre-service teachers all tended to emphasize the “fun” and “hands-on” dimensions of science fair experiences for young people as being most important, and tended to ignore the more complex issues of privilege, power, and money that had been identified in the article.

I think this teaching example suggests that we need to rethink the ways a science fair experience for pre-service teachers does or does not promote inquiry-based learning. A science fair tends to involve individuals in an almost completely independent activity, unlike much of the practice of science. While the promotion of inquiry-based learning is a reasonable goal, it seems that more scaffolding, perhaps through guided-inquiry activities, is needed to help build a more complex and nuanced understanding of inquiry investigations, both for young people and for pre-service teachers, before either is asked to engage in independent inquiry investigation activities.

How the pre-service teachers involved perceived the importance of inquiry thinking as part of their development as teachers remains a challenge. How can these pre-service teachers be encouraged to consider their own inquiries as important to their teaching? Perhaps rethinking the focus of the inquiry may be a useful way forward. Rather than asking pre-service teachers to conduct original inquiries on science themes, perhaps teacher educators should ask them to conduct inquiries on topics they consider more relevant to their development as teachers. One way this might be addressed is to require teachers to conduct action research studies into their own practice during the field experience components of their teacher education program. Of course, this would require that pre-service teachers be introduced to and educated in action research methodologies, a significant departure from the typical curriculum of many teacher education institutions in Canada.

Insights from the Cross-Case Examination

All of the examples reported pre-service teachers engaging (reasonably) enthusiastically in the various types of inquiry activities. The different investigation activities can be thought of as lying along a trajectory of complexity, beginning with Bartley’s example, where students were expected to extrapolate, in some fashion, from known data to an unknown situation. Farther along that trajectory, in Bowen’s example, the students were provided a research question and were expected to develop a methodology, collect data, and draw conclusions. Finally, at the terminus of that trajectory, MacDonald’s students engaged in activities most resembling “authentic” science in that they were expected to engage in an open-inquiry activity, evaluate other science investigations, and then reflect on the benefits of those activities. The three activities thus map onto Tamir’s “Levels of Inquiry” scale (Tamir, 1991; described in Bartley’s section), facilitating a discussion about where “breakdowns” in student performance are found.

All three activities are ones that science faculty (i.e., professors of science) would expect graduates of their programs to engage with successfully – in the case of MacDonald’s activity, at a high level on Tamir’s (1991) scale (unpublished data). Bartley’s example, which examined the very beginnings of data literacy – framing variables, deciding on criteria, and extrapolating from small to large samples – reported both considerable student enthusiasm and engagement and satisfaction on the part of the instructor with the progress made in this introductory activity. In Bartley’s example, different approaches to solving the stipulated problem were acceptable, use of outside resources was allowed, and collaboration and discussion across groups occurred; thus, apart from other NOS perspectives (Lederman, 1992), the social nature of science communities was also conveyed (Latour, 1987; Latour & Woolgar, 1979). From a complexity perspective, Bartley’s example would appear less difficult than the other two because of the provision of initial data (the ratio of water to Jello at specific volumes) from which students could extrapolate to larger quantities.

The Bowen and MacDonald teaching examples both described student engagement in forms of ‘authentic’ inquiry (Levels 2 and 3 respectively; Tamir, 1991) that were more complex than Bartley’s because of the need to establish the preliminary relationships (from which, in Bowen’s example, extrapolation could then occur). Both examples reported numerous issues with data literacy and inquiry arising from the pre-service teachers’ participation. In MacDonald’s example there was an attitudinal issue: the students themselves did not think that competency with inquiry/data literacy was relevant to their role as a teacher (mirroring attitudes reported by Melear et al. (2000), who also engaged pre-service science teachers in inquiry investigation activities in a course specifically focused on those), and this attitude may have influenced the depth of their engagement and the quality of their work. In contrast, the Bowen example described an activity where student time was essentially unlimited – students had as much time as they wished to conduct the activity in the detail they desired – and there was considerable engagement, with most students participating enthusiastically and positively. Despite this, the pre-service teachers’ grass count studies often showed low sampling rates (whether they categorized different zones by grass density, counted grass in small samples to be extrapolated upwards, or both) and numerous other methodological issues. In this instance it is hard to argue that there was a time or resource issue, as the pre-service teachers had access to the internet in the classroom as well as resource books; neither a lack of time and resources nor lagging interest can be responsible for limiting the quality of their work. In both the Bowen and MacDonald examples one gets the sense that the pre-service students were under-challenging themselves conceptually from a science perspective, and often it seemed as if they were not approaching the problems from a conceptual perspective at all; this might be understandable in MacDonald’s example, where a time limit was set, but less so in Bowen’s example, where time restrictions were not imposed. Although it is arguable that explicit instruction could have been used, in both cases the activity had a diagnostic aspect in that the instructors were attempting to determine what students would implement of their own accord. Both instructors engaged in explicit instruction in later activities, but with apparently limited success at addressing the inquiry issues, as demonstrated when students were again expected to engage in an investigation they themselves had designed. In some ways, what really seems to limit many pre-service science teachers in their own investigation efforts is their perspective that doing such activities is not something their own students will need to do, nor should be expected to do under their direction as teachers. That they would have this perspective is not all that surprising given their own experiences as students and the way in which science is presented in textbooks (Binns & Bell, 2015).

Studies reporting on issues with pre-service teachers’ competencies with data literacy, some from authors of this paper, have often suggested that it is important to develop these competencies in methods courses and that perhaps even a separate course “on” inquiry is warranted. Claudia Melear and her colleagues at the University of Tennessee have conducted such courses for over a decade (see Bowen & Bencze, 2008 for a summary of this work, and other references therein) and have reported some success with this approach, although only a small subset of pre-service teachers in each Bachelor of Education program participated in their course. However, in this chapter one of us has suggested an approach wherein competency with research practices, including data literacy, could be developed in pre-service teachers through the use of action research. There are differing views amongst the authors on this, as another of us (GMB) has been at two universities where action research was part of the Bachelor of Education program and in both cases found considerable disinterest amongst the pre-service science teachers in participating in that type of research (with many being outright disdainful of that research approach when asked why they had not taken the action research course). At Bowen’s current institution an attempt was made to implement action research as an activity in the program-wide BEd “seminar” course, and the comments that pre-service science teachers made about this activity in their “methods” course were almost universally negative. This does not exclude including action research in a science “methods” course as an approach to addressing data literacy issues, but it suggests that particular care must be taken in designing such a course so that pre-service science teachers engage positively.

It is possible that the reasons for the negative outcomes described by both Bowen and MacDonald reside in why these pre-service teachers decided to go into teaching instead of staying in science: we speculate that many of these individuals are not really interested in research and related issues, such as dealing with the abstractions, intangibles, unknowns and uncertainties embedded in science research. Recently, when some of the findings of this study were presented to his department in a professional development session, a chemistry department head commented that it was only ever the weaker students in his department, those who did not really seem interested in science, who went into education programs, so he did not find the findings all that surprising (unpublished data). If this is indeed more broadly the case, it would certainly help explain the relative disinterest in science inquiry investigations reported here by MacDonald (and by others of us elsewhere).

When one remembers that these pre-service teachers are being certified to teach high school science, including International Baccalaureate and other senior courses, it is evident that there are insufficiencies in the data literacy competencies revealed in one of the examples (Bowen’s), competencies that were only beginning to be addressed in the other two. If we truly expect high school teachers to teach inquiry-oriented courses then, at least in the Canadian context, where most of our science education students have completed 4-year BSc degrees, we need to adopt other approaches to help develop the associated data literacy skills. Whether that involves specialized science inquiry investigation courses in BEd programs, the adoption of action research courses during the BEd degree, prerequisites such as honours thesis or statistics courses in the science degree, or some combination of these, there are issues with data literacy that science methods professors need both to better understand and to develop better strategies to address. However, in this era of declining enrolments, when administrators are often less discriminating than in the past about who is admitted to our programs, we recognize that we will certainly face challenges in driving such an agenda.

It is worth noting that in this chapter we have described only three individual activities in our courses, courses which include other activities designed to address issues with conducting inquiry investigations and with data literacy; students’ success in these individual activities does not necessarily reflect their competencies with inquiry at the end of their programs. What we hope we have done in this chapter is provide an indication of where students’ problems with inquiry begin, so that our peers can better consider how to address these issues with their own students. In keeping with recommendations regarding the need for “multiple experiences, spanning several semesters, in which potential teachers of science are routinely expected to engage in authentic science activity and the use of inscription…” (Lunsford, Melear, Roth, Perkins, & Hickok, 2007, p. 561), both Bartley and Bowen believe that the follow-up activities they engage in over at least two semesters lead to greater data literacy by the end of their programs, and a greater orientation towards engaging their own future students in inquiry investigations, than was evident at the end of the example activities described here. In contrast with the inquiry-focused “Knowing and Teaching Science: Just Do It” (non-methods) course reported on by Melear (see Brown & Melear, 2006; Lunsford et al., 2007; Melear et al., 2000), which seemingly did not lead participants to focus much on teaching inquiry in their own courses, all of us believe that integrating a series of inquiry activities in our methods courses leads our programs’ graduates towards a stronger inclination to conduct higher-order inquiry investigations in their own future classrooms (an area of future investigation for us).

Coda: Changes in Perspectives and Practices

In the beginning the four of us thought that our struggles were individual…that the student outcomes were based on our individual interactions, our particular pool of teacher candidates, the activities we designed, the way we enacted those activities. All too often conversations at conferences were not about teaching methods courses, not about the challenges we faced when teaching those courses, not about what we were experiencing with our students. In Bowen’s case, he experienced these issues with engagement in higher-level inquiry activities at several of the institutions he worked at (some of which have been reported in other literature). Realizing that the challenges with inquiry are more than “just mine” provided a considerable incentive to work at resolving the issue. Our expectation as science “methods” instructors was that our students should easily be able to engage in inquiry activities at reasonably high levels, and some of us (particularly Bowen, MacDonald and Sherman) were quite surprised over the years at the struggles our students had with those sorts of activities. We realized that our “assumption” – that students coming into education programs with BSc degrees should be able to do inquiry, and so teaching them to teach it should be reasonably straightforward – was deeply flawed. Clearly our “methods” teaching had to address inquiry investigations in a more fundamental fashion, starting from the assumption that knowing about inquiry is not the same as being able to do inquiry.

All of us have subsequently worked at developing other activities to further our interest in improving data literacy and inquiry competency in our “methods” students…particularly hands-on activities combined with explicit instruction on inquiry. Shortly after completing this work, Sherman and MacDonald both entered administrative roles, so both now teach “methods” much less frequently than previously (although there is much to be said for having an administrator who strongly supports science methods professors in engaging their pre-service students in inquiry activities, as sometimes there is considerable student resistance). With insights gained from this work, Bowen and Bartley developed a series of activities on improving data literacy, tested in their methods courses, on which they now conduct workshops at the national conference of the National Science Teachers Association and at regional venues. Those activities were recently developed into a book on data literacy (Bowen & Bartley, 2013) which is now used widely in science teacher professional development workshops and action research courses in the United States. Further “teacher professional” publications on these issues are also forthcoming.