Recent educational reports in the USA (Duschl, Schweingruber, & Shouse, 2007), the UK (Osborne, 2007), and elsewhere in Europe have called for a science education that places an emphasis on scientific literacy, and makes the connection between science and everyday life. The focus of this approach is on the social aspects of science, aiming to prepare young people for life beyond school. Aikenhead (2006) has attempted to define the term by explaining scientific literacy as acquiring knowledge for science—that is, both knowledge of the content and knowledge about science, which he sees as the social processes of science. Likewise, in national reform documents, the core of scientific literacy is related to understanding knowledge and processes of science, and the application of this knowledge (AAAS, 1993; National Research Council, 1996). For example, the most recent US science education reform document states that:

Expectations of what it means to be competent in science and understanding science have also broadened. […] Learners who understand can use and apply novel ideas in diverse contexts, drawing connections among multiple representations of a given concept. They appreciate the foundations of knowledge and consider warrants for knowledge claims. Accomplished learners know when to ask a question, how to challenge claims, where to go to learn more, and they are aware of their own ideas and how these change over time. (Duschl et al., 2007, p. 19)

This call for an emphasis in science education on understanding the evidence and claims of science and being able to assess them is associated with the shift from studying science as exploration and experiment to studying science as argument and explanation (Duschl et al., 2007). For instance, Duschl (1990) argues that if we do not present science as a process of revision and substitution of knowledge claims, we run two risks: first, developing in students the perception that “scientific knowledge growth is governed by the addition of new ideas, facts and theories to old ones” (p. 54) and, second, portraying science as an activity in which scientists always agree. Hence the emphasis on teaching argument and explanation can contribute to students’ appreciation of both the power and the limitations of scientific knowledge claims. Such an understanding is increasingly required within the context of a society where scientific and socio-scientific issues (SSI) dominate the cultural landscape, where social practices are constantly examined and reformed in the light of scientific evidence, and where the public maintain an attitude of ambivalence (Giddens, 1990) or anxiety about science (Beck, 1992).

The work presented in this chapter is informed by these recent trends in science education, and the goals of my research in the context of socio-scientific issues have been related to argumentation (a component of scientific literacy) and to engaging students with problems from their local or national communities as a means to improve their science learning experiences. In the early stages of my work, I explored SSI contexts: (a) as a way to improve elementary school students’ systems thinking (Evagorou, Korfiati, Nicolaou, & Constantinou, 2009) and decision-making (Nicolaou, Korfiati, Evagorou, & Constantinou, 2009), and (b) as a way to support students’ collaborative argumentation (Evagorou & Osborne, 2007, 2008) when supported by the use of technology, and more specifically when scaffolded with tools from the WISE platform (Linn, Eylon, & Davis, 2004). WISE (Web-based Inquiry Science Environment) is an online platform that was developed to scaffold teachers and learners as they learned to take advantage of and manage new Internet technologies (Cuthbert & Slotta, 2004). The tools within WISE are designed based on four metaprinciples: (a) making science accessible to all students; (b) making thinking visible by modeling students’ ideas and evaluating how these ideas are transformed and synthesized to form new knowledge; (c) helping students learn from others by encouraging them to build on ideas presented by others and to question peers or experts; and (d) promoting autonomy and lifelong learning (Linn et al., 2004). Early research in WISE focused on improving the platform, adding new tools or redesigning the existing tools that were integrated in the environment, in order to improve learning and scaffold students in more effective ways. Results from early studies were used as a means for refining the four metaprinciples (Linn et al., 2004). The first projects implemented in WISE placed an emphasis on teaching heat and temperature through the use of simulations. Some of the projects implemented later included learning skills such as evaluating data gathered from websites, connecting claims to evidence, or discussing findings with peers. More recent work has focused on using the tools within WISE to support students while collaboratively constructing arguments (Clark & Sampson, 2008). The choice of WISE is based on evidence from previous research that the tools can support students as they collaborate with peers in their effort to investigate an issue (e.g., Bell, 2004).

For the first WISE project that was implemented in Cyprus, the problem that served as the curricular and instructional focus was not a real issue. Our group designed the project as a way to introduce systems thinking, and it focused on ways to control the population of mosquitoes in a swamp. The elementary school students explored the issue through information provided to them; they used WISE tools to collect and organize information, and then reached a decision, using their systemic knowledge to build computer models that would predict the long-term effects of their proposed solutions (e.g., use chemicals or introduce new species). For the second WISE project we featured a problem that had emerged as a national issue in the UK. The focal issue related to the decline of the indigenous red squirrel because of the introduction of the grey squirrel. This project was implemented in the UK with middle school students, who used the WISE platform and tools to collect and organize information about squirrel ecology as well as to prepare and discuss online their arguments regarding ways to control the population of the grey squirrels.

These learning environments helped students to engage with the different SSI, to improve their systems thinking, to build their content knowledge related to the system, and to strengthen their argumentation practices. The implementation of the two projects also helped us to understand some deficiencies in the design. For example, the students did not have any personal experiences with the issues presented, since these did not constitute real or authentic problems for them or their local area. Hence, the students did not have opportunities to collect evidence from the field to support their arguments, and were not motivated to further explore the problem. Another design problem was that the curriculum materials were designed by researchers, but not in collaboration with teachers, who hold a more practical understanding of the classroom reality. Based on these experiences and the ideas and questions generated from my involvement in these projects, I started developing a new research project that received funds from the Cyprus Research Promotion Foundation. This research project, the focus of this chapter, is called Technoskepsi, from the Greek words technologia (technology) and skepsi (thinking), and it aimed to collaboratively develop (with a group of teachers) curriculum materials making use of technologies (the WISE platform and handhelds) in order to support elementary school students’ argumentation through the context of an SSI. An additional element of the learning environment was the integration of formal and nonformal settings for students’ investigations, which would provide field experiences and an authentic aspect of the science learning process (Braund & Reiss, 2006) for the students.

Project Goals

The Technoskepsi project had four goals, spanning both design and research: (1) to design, in collaboration with teachers, curriculum materials that would support younger students’ argumentation within an authentic SSI; (2) to explore elementary school students’ arguments and collaborative argumentation; (3) to explore the use of WISE and handhelds and the effect they might have on students’ learning; and (4) to explore supplementing formal with nonformal settings, and to understand their impact on students’ argumentation, decision-making, and emotions toward science.

One of the main goals of the Technoskepsi project was to utilize the knowledge and experience of a group of teachers in order to design curriculum materials that would support elementary school students’ argumentation and decision-making within an SSI, making use of technology. The decision to involve teachers in the process of designing the curriculum is associated with results from the implementation of curriculum materials from previous projects, which indicated that teachers felt they did not have ownership of the materials because they were not involved in the design process. Furthermore, feedback from the same teachers indicated that researchers design curriculum materials for classes without having firsthand experience of those classes. In order to address these issues I recruited four elementary school teachers to work collaboratively on the design of the curriculum. The teachers, two female and two male, had a special interest either in the use of technology or in science education, and were already working as elementary school teachers in Cyprus at the time. Table 8.1 shows the qualifications and years of teaching experience for the four teachers who participated in the design group.

Table 8.1 Overview of teachers’ qualifications and teaching experience

Initially the four teachers participated in four, 3-h meetings aiming to familiarize them with the goals of the research project and its main theoretical perspectives: (a) argumentation, students’ difficulties, and methodological issues; (b) socio-scientific issues and their importance for science learning; (c) scaffolding science learning with the use of technology; (d) sociocultural theories of learning; and (e) project-based learning. Between meetings, the teachers had to read papers, evaluate curriculum materials, and prepare short activities on the topics discussed. An online environment was designed in which the teachers could find all resources and also discuss issues with the other members of the group. In one of the meetings, Aris suggested the design of curriculum materials that would focus on the study of a socio-scientific problem that was causing problems for the community of his school. This problem was associated with a pig farm near the community. The issue was generating a lot of media coverage, primarily because of complaints from community members related to the farm’s smell. Following Aris’s suggestion, we collected information regarding how pig farms function, what kinds of pollution they can potentially cause (water, soil, and air), the European legislation regarding farms and optimum waste management techniques, and the latest techniques that can be used to minimize the smell and pollution issues. After having acquired this information, we worked collaboratively to design the structure for an instructional unit.

The second goal of the project was to explore elementary school students’ arguments and collaborative argumentation. Argumentation is considered a major aspect of the resolution of scientific controversies (Fuller, 1997; Taylor, 1996). It is seen as “a social process, where cooperating individuals try to adjust their intentions and interpretations by verbally presenting a rationale for their actions” (Patronis, Potari, & Spiliotopoulou, 1999, pp. 747–748). Therefore, argument and argumentation have two different aspects, an individual and a social one. The individual aspect of argument refers to articulating a point of view (Jimenez-Aleixandre & Erduran, 2008), while the social aspect involves two or more people and aims to persuade others (Bricker & Bell, 2008). In science, arguments are commonly constructed to explain a phenomenon, a theory, or a new discovery, and argumentation is seen as part of the process of knowledge construction in science. More specifically,

… argumentation in scientific topics can be defined as the connection between claims and data through justifications or the evaluation of knowledge claims in light of evidence, either empirical or theoretical. (Jimenez-Aleixandre & Erduran, 2008, p. 13)

However, argumentation is not a skill specific to science; on the contrary, it is central to people’s ability to solve problems, make judgments and decisions, and formulate ideas and beliefs (Kuhn, 1991). Argumentation is essentially a thinking/reasoning skill. According to Kuhn (2005), thinking is the process that enables us to make informed choices between conflicting claims, and understanding this process leads a person to value thinking. Usually, when learners are constructing arguments, they need to evaluate alternative perspectives and opinions and select a solution that is supported by evidence and explanation (Cho & Jonassen, 2002). Hence, argumentation is an important skill for everyday life because we are frequently faced with situations in which we have to evaluate alternative solutions or scenarios and decide on a course of action based on evidence. The ability to identify alternative solutions—a skill associated with argumentation—can potentially help people move toward more informed decisions in their everyday life.

Even though argumentation in science education has been an emphasis of many studies (Erduran, Simon, & Osborne, 2004; Jimenez-Aleixandre, Rodriguez, & Duschl, 2000; Jimenez-Aleixandre & Pereiro-Munoz, 2002; Kuhn, 1991; Sandoval, 2003; Osborne, Simon, & Erduran, 2004a, b), little is yet known about how young students develop their arguments, especially in the context of socio-scientific issues, and how they work collaboratively in order to construct those arguments. Technoskepsi explored the construction of arguments and collaborative argumentation by younger students, and some of the findings are reported later in this chapter.

The third goal of the project was to explore the use of WISE and handhelds and the effect the combined use of these technologies might have on students’ learning and development of argumentation. Results from previous studies agree that even with specially designed instruction, students do not construct the high-quality arguments that might be desired of them (Erduran et al., 2004; Jimenez-Aleixandre & Pereiro-Munoz, 2002). Students’ failure to construct high-quality arguments can be explained partly by the dominance of a pedagogy which is authoritative and rooted in education as a form of transmission (Simon et al., 2005) and which provides students with few opportunities to engage in the process of argumentation (Jimenez-Aleixandre et al., 2000). It has been suggested that online learning environments have the potential: (a) to scaffold the teaching process and help teachers move away from authoritative pedagogy and (b) to scaffold argumentation in more constructive ways (Andriessen, Baker, & Suthers, 2003; Bell & Linn, 2000; Clark & Sampson, 2008). According to Zurita and Nussbaum (2004), educators favoring the use of handhelds in education suggest that handhelds “support constructivist educational activities through collaborative groups, increasing motivation, promoting interactive learning, developing cognitive skills, and facilitating the control of the learning process and its relationship with the real world” (p. 235). The choice of the WISE platform and the handhelds is explored further in the intervention section.

Finally, the last goal of the project was to explore supplementing formal with nonformal settings, and to understand their impact on students’ decisions, argumentation, and emotions toward the lesson. The decision to make use of nonformal settings is associated with the claim that engaging students in authentic practices can provide a context that potentially increases their motivation (Edelson & Reiser, 2006). Blumenfeld, Kepler, and Krajcik (2006) add that motivation sets the stage for cognitive engagement and leads to achievement by increasing the quality of the cognitive engagement.

Setting

As described earlier, a group of four teachers proposed a set of activities they thought would be appropriate for 10–12 year old students for the socio-scientific issue proposed by one of the teachers, Aris. Three of the teachers changed school districts the following year and were no longer able to work on the project. Aris, in contrast, stayed at the same school and offered to implement the curriculum with two of his classes. By that time, Aris was in his tenth year teaching at the elementary school level. Aris graduated from a prestigious 4-year Bachelor’s degree program in Elementary Education with a specialization in Science Education. Immediately after graduation, he began teaching. During his career, he worked at three different elementary schools and taught all grade levels and subjects. Four years into his teaching career, Aris took two years off from teaching to pursue a Master’s in Science Education and worked part-time as a researcher at the same university. During that time, Aris was involved in argumentation and computer modeling projects with elementary school students. After finishing his master’s degree, he returned to teaching at the elementary school at which the current project was conducted. When we started working together on the Technoskepsi project, he was in his third year at that school, which will be referred to as MA Elementary, and he was teaching sixth grade (student age: 11–12) language, mathematics, and science and fifth grade (student age: 10–11) science.

MA Elementary serves the local community of a small suburban town, with a total of 160 students (K-6) and 15 teachers. The curriculum materials were implemented in two of the classes in which Aris was teaching science: a fifth grade class and a sixth grade class. The fifth grade, similar in terms of students’ abilities to the sixth grade, had 17 students and served as the setting to pilot test the curriculum. Based on observations made during the fifth grade implementation, we enacted curricular changes that were implemented with the sixth grade class. Hence, the sixth grade served as the class in which we collected the main data for the Technoskepsi project. Aris’s sixth grade, with 18 students (10 boys and 8 girls), was a low-achieving class. Since in the Cypriot educational system there are no national exams or formal grades until middle school, I asked Aris to describe his students’ general abilities in mathematics, Greek language (the official school language), and science, and I also administered an argumentation questionnaire to identify their argumentation levels. According to the teacher, the majority of the sixth graders were low achievers in all three main subjects, with two of the students (immigrants) having Greek as a second language. Additionally, Aris reported that some of the students exhibited behavioral problems and it was difficult to include them in group activities. Furthermore, the argumentation questionnaire showed that most of the students had difficulties in either choosing or constructing the best argument (Evagorou, 2008).

Aris organized his students in groups at the beginning of the year, and the groups designed investigations and interacted with technology as part of their science curriculum throughout the year. Five desktop computers were available in the class, and students had already worked on collaborative computer programs prior to the initiation of the Technoskepsi project (e.g., they started a wiki describing the history of their village and also used modeling programs). More information regarding how the students worked during the implementation of the current project is presented in an upcoming section.

Teacher–Researcher Relationships

I had known Aris for several years, and we had worked together in the past as researchers in science education and technology projects. We shared similar ideas regarding teaching and learning and the same passion for the use of technology as a tool for learning. After the initial drafting of the curriculum with the group of teachers, we worked with a biologist and an educational technology expert from my institution to design the final version of the curriculum materials. The biologist assisted us with the content knowledge that was necessary to understand the air, water, and soil pollution that the waste from the pig farm could potentially cause, as well as various waste management techniques. Implementation of the curriculum started in January and finished in March. Materials were implemented with the fifth grade class every Monday and then with the sixth grade class every Friday of the same week. Aris met with his students for two, 40-min sessions each week. During the implementation, I was a participant observer, sometimes coteaching with Aris, sometimes participating in the group discussions and offering technical support, and at other times simply observing the teaching process. After each implementation in the fifth grade, Aris and I held meetings during which we would discuss difficulties that the students had with the curriculum and possible changes to the activities to better meet student needs. During the week, we made the changes to the curriculum and implemented the modified learning experiences with the sixth grade class at the end of the week. After each implementation in the sixth grade, we held a shorter meeting in which we discussed the lesson.

Intervention

As described earlier, the first form of the curriculum was designed collaboratively by a group of teachers, and the final form was designed by myself and the teacher who implemented it, with the help of a biologist and an expert in educational technology. The learning environment was partly designed within the WISE platform (Linn et al., 2004) and poses the following guiding question to the students: What are the effects of the pig farm on your area and what course of action do you suggest? The socio-scientific topic was chosen because it was relevant to the students’ everyday lives and was an issue that could potentially engage these students in the investigation and challenge them to construct arguments considering all aspects of the topic (moral, financial, environmental, social). We designed the Technoskepsi curriculum materials to meet the following instructional objectives:

  1. Help the students to develop an understanding of argumentation and how argumentation is different from simply expressing opinions.

  2. Develop argumentation skills and be able to use evidence to justify their claims.

  3. Engage in scientific investigations and collect evidence from the field.

  4. Engage in investigations regarding an authentic socio-scientific issue and understand and appreciate the social and scientific factors that contribute to the controversy.

  5. Understand the systemic nature of their environment and the short- and long-term effects that various decisions have on their environment.

  6. Develop an understanding of waste management techniques and the impacts they can have on the environment.

In order to achieve these objectives we designed eight, 80-min lessons. We examined resources from the Cyprus Department of Agriculture with information regarding pig farms, interviewed environmentalists, and visited a pig farm in order to develop a holistic approach to the problem. Our decision was to develop the curriculum based on project-based learning (Krajcik et al., 1998), sociocultural theories of learning (Rogoff, 2003), and what we already know regarding how people construct arguments. For example, Kuhn (1991, 2005) suggests that most people tend to be certain of their theories, even when they are using pseudoevidence; they tend to reason better on the subjects for which they have personal knowledge; and they assimilate new information into existing theories, expressing considerable certainty that new evidence supports their theories even when it is contradictory.

Another decision was to use the WISE platform, which incorporates knowledge representation and discussion-based tools that can potentially scaffold students in their effort to work collaboratively to construct arguments (Bell & Linn, 2000; Evagorou & Avraamidou, 2008). More specifically, one way to support students in constructing higher-level arguments is through scaffolding provided by the use of computer-based tools (Cho & Jonassen, 2002). Several technology-based classroom interventions have been designed with the aim of argument construction in science classrooms (Sandoval & Reiser, 2004). Research findings on technology-based environments and argumentation suggest that technology has the capacity to support high-quality argument construction within the classroom by taking account of the epistemic and social factors that support and promote argumentation (Bell, 2004). Such environments are designed to support students’ argument construction in two ways: (1) by incorporating knowledge representation tools (Edelson, Pea, & Gomez, 1996) and (2) by incorporating discussion-based tools (Scardamalia, 2003).

Knowledge-representation tools have been designed to help students construct arguments by connecting evidence to the appropriate claim. For example, in these tools, evidence might be represented with a specific shape and color, and claims with a completely different shape and color. These tools address the difficulty that students have with evaluating evidence and claims and the fact that they usually tend to provide a claim with no evidence or with a single piece of evidence rather than multiple pieces. Furthermore, as Suthers (1999) states, knowledge representation tools mediate discourse “by providing learners with the means to articulate emerging knowledge in a persistent medium, inspectable by all participants, where the knowledge then becomes part of the shared context” (p. 4). Suthers (1999) also explains how the visual presence of the knowledge unit in a shared representational context can serve as a reminder of the work that needs to be done by the learners. For example, a linear text, like an online discussion, does not provide any hints as to whether learners need to do something specific. On the contrary, a graphical representation tool illustrates how learners need to find connections between different pieces of knowledge. An example of a knowledge representation tool within WISE that we used to scaffold our students is SenseMaker (Fig. 8.2).
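
To make the idea of a knowledge-representation tool more concrete, the sketch below is a minimal, hypothetical Python illustration of the kind of structure such a tool might maintain: fixed claim "frames" to which students attach their pieces of evidence, with unsupported claims remaining visible as a reminder of work still to be done. The class, method names, and example content are illustrative assumptions on my part, not the actual WISE or SenseMaker implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ArgumentMap:
    """Hypothetical sketch of a SenseMaker-like claim-evidence map."""
    claims: List[str]                                   # claims are set by the designers
    evidence_by_claim: Dict[str, List[str]] = field(default_factory=dict)

    def attach_evidence(self, claim: str, evidence: str) -> None:
        """Students place a piece of evidence under one of the predefined claims."""
        if claim not in self.claims:
            raise ValueError(f"Unknown claim: {claim}")
        self.evidence_by_claim.setdefault(claim, []).append(evidence)

    def unsupported_claims(self) -> List[str]:
        """Claims with no evidence yet: the visible reminder Suthers (1999) describes."""
        return [c for c in self.claims if not self.evidence_by_claim.get(c)]

# Example with the two opposing claims of the pig farm unit (illustrative content).
pig_farm_map = ArgumentMap(claims=["Move the pig farm", "Keep the pig farm"])
pig_farm_map.attach_evidence("Keep the pig farm",
                             "Waste management techniques can reduce the smell")
print(pig_farm_map.unsupported_claims())   # -> ['Move the pig farm']
```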

Discussion-based tools can facilitate communication, either through online asynchronous communication or face-to-face synchronous communication, with other learners, promoting dialogic argumentation. Discussion-based tools are based on the recognition that the construction of knowledge is not an individual process but rather a collective process in which ideas and arguments come together, such as in the work undertaken in scientific research groups (Scardamalia, 2003). Furthermore, according to Lampert, Rittenhouse, and Crumbaugh (1996), discussion-based tools allow students to take more time before formulating a contribution or an argument, something that usually does not happen within classrooms, and in that way help them contribute to discussions in more coherent ways. These tools can also reduce the social and emotional obstacles to expressing opinions in public for students who lack the appropriate representations, and thus enable more students to take part in the discussions. Research associated with these environments has demonstrated that such technology-enhanced learning environments can be used to successfully scaffold dialogic argumentation (Sandoval & Reiser, 2004).

Finally, another form of technology we decided to use was handheld data collection devices. Recently, mobile devices have increasingly taken the place of ordinary computers in classrooms. What is important about the use of mobile technologies for education is that tools which first existed only on expensive desktop machines are now available on inexpensive handheld units (Soloway et al., 2001). In the case of Technoskepsi, the use of handhelds provided a number of affordances to support student learning experiences. Handhelds are highly portable, are relatively inexpensive (compared to other data collection systems), have a relatively long battery life (compared to laptops), and facilitate easy synchronization and data sharing. Given the opportunity for students to visit and collect data from a field site, the handhelds proved to be ideal tools for supporting the students’ work.

The curriculum materials were made up of eight interconnected lessons that ranged from an introduction to argumentation and an introduction to the problem, through a visit to the nearby pig farm, to a whole-classroom discussion of the decisions the groups reached. Table 8.2 presents the structure and content of the curriculum materials.

Table 8.2 Overview of the Technoskepsi curriculum

As shown in Table 8.2, the first lesson was an introductory lesson in argumentation, in which the students had to discuss in their groups a different socio-scientific issue, that of constructing a zoo in their area, and then present their arguments. The material for this argument was adapted from the IDEAS pack (Osborne et al., 2004a). The teacher focused the discussion on what a good argument should look like and how an argument differs from an opinion. The emphasis during the whole-classroom discussion was on the use of evidence to support one’s claim and on what kind of evidence we should trust.

The second lesson was an introduction to the problem—that of the excessive smell from the pig farm and the protests of the people in the community. The problem was presented through newspaper clippings and recorded interviews from people in the community, an environmentalist, and the owner of the pig farm. After studying the problem the students (working in groups of three) were asked to state their opinion and whether they would suggest closing down the pig farm. In Table 8.2, this argumentation task is shown in the third column as Argument 1. Figure 8.1 presents a screenshot from the learning environment showing a Note Window and the introductory page presenting the problem.

Fig. 8.1 The introductory page of the online part of Technoskepsi

During the third lesson the students had to work in their groups in order to familiarize themselves with all aspects of the problem (e.g., environmental and financial issues) and understand the possible effects the waste from the pig farm could have on soil, air, and water, as well as the kinds of solutions that the various waste management techniques could offer. The online platform (WISE) was used to scaffold students in two different ways during that time: (a) to help them collect all the available information needed to construct their arguments and (b) to scaffold them in the process of constructing and sharing their arguments. In order to achieve the first goal, various “note” windows were designed (see Fig. 8.1) that scaffolded students in collecting information and evidence from the online learning environment. In order to achieve the second goal, knowledge representation tools and discussion-based tools were used. Figure 8.2 presents a screenshot of the knowledge representation tool (SenseMaker) that was used in the online learning environment.

Fig. 8.2 The knowledge representation tool, SenseMaker

SenseMaker allows students to coordinate their evidence with the appropriate claim, a function that addresses one of the difficulties that students face with argumentation. More specifically, the rectangles represent the claims, and these are set in the learning environment; the students then have to type in their evidence (represented by the underlined text) and place it under the appropriate claim. At the end of the third lesson the groups had to submit online the argument they had formed after studying all the available evidence within WISE. This is shown in Table 8.2 in the third column as Argument 2.

After carefully reading and discussing the online data, we explained to the students that they would visit the area of the pig farm to collect data. However, an important goal was to prepare them for the field investigation; hence, following the teacher’s suggestion, we designed Lesson 4 in order to familiarize the students with the data collection techniques they would use in the field. During Lesson 4, the students developed familiarity with the handheld devices, used the water, air, and soil quality kits in some investigations in the school yard, and prepared a list of questions for their visit to the pig farm.

Lessons 5 and 6 (two, 80-min lessons) consisted of visits to the pig farm. During that time, the students had to collect evidence to support their argument (e.g., water, soil, and air quality; an interview with the pig farmer regarding his waste management techniques; the location of the pig farm relative to inhabited areas), and use their handhelds to store the data collected. After they returned to the classroom, students transferred the data collected in the field from the handheld devices to their computers.

The aim of Lesson 7 was to help students unpack their experiences from the farm visit and to scaffold them in using the evidence collected from the field to further support or dispute their arguments. After revisiting their arguments, the groups had to submit a new argument online and share it with the other groups using the discussion-based tool. The groups then commented on another group’s argument. The purpose of this activity was to help the students strengthen their arguments based on the feedback from another group. The final outcome of this lesson was Argument 3 (as shown in Table 8.2, third column), which was submitted by each group.

Finally, during the last lesson the groups presented their arguments during a whole-class discussion, and students engaged in a debate. Additionally, the class talked about the different kinds of justifications and how to rank them based on importance. Students then decided which points to include in a letter that was addressed to the local authorities and presented the outcomes of their investigations.

Research

Research Questions

The research associated with the Technoskepsi project explores how various types of technologies can be used to support argumentation and decision-making within a socio-scientific issue, both in formal and nonformal settings. More specifically, the research questions guiding this study are: (a) Is the specially designed curriculum material (combining investigations in formal and nonformal settings, and the use of technology) successful in engaging 11–12 year old students in argumentation? (b) How do students’ arguments and decisions develop or change after the outdoors visit? (c) What is the contribution of the learning environment to 11–12 year old students’ attitudes and emotions toward science?

This study is significant because it describes how different technological tools could be used to support investigations and argumentation in formal and nonformal settings. Even though many studies in science education place an emphasis on argumentation, previous studies have not identified how students’ arguments develop and change (and why), especially after a nonformal investigation of an authentic problem. Although some research has been published on the argumentation practices of relatively young learners (e.g., Naylor, Keogh, & Downing, 2006), most of the work in this area has focused on older students (e.g., Jimenez-Aleixandre et al., 2000; Osborne et al., 2004b; Sandoval & Reiser, 2004). Therefore, the current study’s focus on elementary students’ argumentation has the potential to offer new insights for the field. Furthermore, students’ attitudes after participating in such argumentation projects supported by technology have not been previously documented.

Methods

The students worked in groups of three both indoors and outdoors for a period of eight, 80-min lessons, and their interactions during the lessons were video recorded. Furthermore, students’ artifacts, including online submissions, online discussions, and final presentations, were also collected. After the end of instruction, all students were interviewed in order to identify their emotions and attitudes regarding the implementation of the learning environment. The video interactions from the implementation and the interviews were transcribed, and a qualitative case study research approach (Creswell, 1998; Merriam, 2002) was used in order to analyze students’ construction of arguments and their responses to the interview. Table 8.3 presents the research questions and corresponding data.

Table 8.3 Research questions and corresponding data

Data Collection and Analysis

Analyzing Students’ Written Arguments

In order to analyze students’ written arguments, a modified version of Toulmin’s (1958) Argumentation Pattern (TAP) devised by Erduran et al. (2004) was applied in order to assess the structure of the arguments. In Toulmin’s framework, the essential elements are claims, data, warrants, and backings. According to this framework, data are “the facts we appeal to as a foundation for the claim” and warrants are “general hypothetical statements, which can act as bridges” (pp. 97–98). In other words, according to TAP, data are the facts that those involved in the argument appeal to in support of their claim. A claim is the conclusion whose merits are to be established. Warrants are the reasons that are used to justify the connections between the data and the conclusion, and backings are the basic assumptions that provide the justification for particular warrants. Additionally, in more complex arguments, Toulmin identifies two more features in his framework: qualifiers, which specify the conditions under which the claim is true, and rebuttals, which specify the conditions under which the claim may not be true. The elements of argument are also presented in Table 8.4, with an explanation of each of the terms.

Table 8.4 Definitions of the elements in Toulmin’s framework of argumentation

Summarizing, in terms of Toulmin’s framework, quality means having all the different components that Toulmin suggests. However, how can one decide which argument is of better quality than another? In order to address this methodological issue, Erduran et al. (2004) devised five argumentation levels to “measure” or explain the quality of argumentation, especially as a measure of interactive discourse, since the main identifier of quality in their levels is the presence or absence of rebuttals (Erduran, 2008). These levels are based theoretically on Toulmin’s framework and are informed by empirical evidence on how young students construct arguments (e.g., Osborne et al., 2004a, b). The authors suggest the following levels of argumentation, which were used in the analysis of students’ artifacts in this study:

  • Level 1: Arguments that are a simple claim versus a counter-claim or a claim versus a claim.

  • Level 2: Arguments consisting of a claim versus a claim with either data, warrants, or backings, but which do not possess any rebuttals.

  • Level 3: Arguments consisting of a series of claims or counter-claims with either data, warrants, or backings, with the occasional weak rebuttal.

  • Level 4: Arguments with a claim with a clearly identifiable rebuttal. Such an argument may have several claims and counter-claims.

  • Level 5: An extended argument with more than one rebuttal. (Erduran et al., 2004).

According to these levels, a sophisticated argument is one that consists of more than one rebuttal (Level 5), pointing to the circumstances under which the claim would not hold true, whereas an argument that consists of only a claim is a Level 1 argument. The value of this modified version of Toulmin’s framework lies in the fact that it enables an identification of the level, or what might be termed the quality of argumentation, and can be used to evaluate both interactive or oral argumentation and written arguments, even though the presence of rebuttals in written arguments should not be expected to be as frequent. This modified version of TAP by Erduran et al. (2004) is the main framework that guides the analysis of the data in this study, and the choice of this framework is based mainly on the fact that it has been previously applied to the analysis of students’ dialogs for a similar age group to the one in the current study (e.g., Osborne et al., 2004a, b), and it has been widely used by science education researchers (e.g., Jimenez-Aleixandre et al., 2000; Osborne et al., 2004a, b). However, we appreciate that this is not a sufficiently elaborated representation of the levels of argumentation that can occur in a science class. When constructing an argument, for example, one person can propose an argument that consists of a claim and a single piece of data, and another person might propose the same claim but support it with multiple data. Are these two arguments at the same level of sophistication, or should they be placed at different levels? Is the presence of more data an indication of the quality of the argument? The Erduran et al. (2004) framework does not discriminate between the two (Evagorou et al., 2009); hence, an additional measure of the quality of the students’ written arguments was the number of pieces of evidence in each one of the arguments. Furthermore, the arguments were also analyzed in terms of their socio-scientific nature, using the following coding categories: social, environmental, moral, and financial. Finally, all written arguments (Argument 1, Argument 2, Argument 3; see Table 8.2) were also analyzed in terms of the decision that the groups made (e.g., to move or not to move the pig farm) in order to identify the impact of the outdoors visit on the decision.
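
To make the coding scheme concrete, the sketch below is a hypothetical Python illustration of how a single coded written argument, its approximate Erduran et al. (2004) level, its evidence count, and its socio-scientific codes could be represented. The coding in the actual study was carried out qualitatively by the researchers, not by software; the class names, thresholds, and example content here are illustrative assumptions only.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CodedArgument:
    """One written group argument, as a researcher might code it."""
    claims: int = 0                                        # claims / counter-claims
    evidence: List[str] = field(default_factory=list)      # data, warrants, or backings
    weak_rebuttals: int = 0
    clear_rebuttals: int = 0
    ssi_aspects: List[str] = field(default_factory=list)   # e.g., ["S", "E", "F", "M"]

def argumentation_level(arg: CodedArgument) -> int:
    """Rough approximation of the Erduran et al. (2004) levels for a written argument."""
    if arg.clear_rebuttals >= 2:
        return 5          # extended argument with more than one rebuttal
    if arg.clear_rebuttals == 1:
        return 4          # claim with a clearly identifiable rebuttal
    if arg.weak_rebuttals >= 1:
        return 3          # occasional weak rebuttal
    if arg.evidence:
        return 2          # claim(s) with data, warrants, or backings, no rebuttal
    return 1              # bare claim versus claim / counter-claim

# Illustrative example loosely based on Group 1's final argument:
# Level 2, several pieces of evidence, financial, social, and environmental aspects.
group1_final = CodedArgument(
    claims=1,
    evidence=["people will lose their jobs",
              "we will have no meat to eat",
              "waste management can reduce the smell"],
    ssi_aspects=["F", "S", "E"],
)
print(argumentation_level(group1_final), len(group1_final.evidence))   # -> 2 3
```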

Analyzing Video Interactions and Interviews

The video interactions from the whole-classroom discussion were transcribed and analyzed in order to identify any arguments constructed and presented by the students, using the Erduran et al. (2004) argumentation framework. The interviews were transcribed and open coded in order to identify students’ emotions and attitudes from the implementation of the learning environment.

Results and Discussion

Students’ Levels of Arguments and Changes in Decision

The analysis of students’ artifacts, as presented in Table 8.5, indicates that the students engaged in argumentation during the implementation of the curriculum materials, with some of them providing higher-level arguments by the end of the instruction. More specifically, the table presents the levels of arguments for all six groups, for their three arguments during the lessons, and their decision in each one of the arguments (whether or not to move the pig farm). The first argument is the opinion that the students offered at the beginning of Lesson 2; the second argument is the one offered by the groups after familiarizing themselves with the problem and ways of solving it, using the WISE platform and all available information during the indoor investigation. The third argument is the final argument presented by the groups after the visit to the pig farm.

Table 8.5 Levels of arguments and decision

As shown in the table above, during the first lesson all the students could provide an argument, but some of these were unsupported claims (Level 1 arguments) or claims supported by a single piece of evidence (Level 2), usually based on everyday experience. By the end of the implementation, as shown in the table, only two of the groups improved their final argument in terms of the level of argumentation. However, looking in detail at the arguments offered by the groups, it was evident that even though there was no improvement in the levels of the arguments, there was improvement in the content of the arguments and in the number of pieces of evidence offered by the groups. An example is the pair of arguments constructed by Group 1 in the first and last lessons.

  • “We think that the pig farm should be removed from the area because it is causing problems” (Group 1, Level 2 argument/Lesson 1).

  • “We should not close the pig farm because: if we do so many people will lose their jobs, we will have no meat to eat, people will lose their jobs, it might smell but there are various ways of waste management that can help reduce the smell.” (Group 1, Level 2 argument, Final Lesson).

Based on the issue identified above, and the fact that the Erduran et al. (2004) levels of argumentation cannot always capture the improvement in students’ arguments, especially written ones (Evagorou et al., 2009), the data were also analyzed based on the number of pieces of evidence provided by the groups (Table 8.6), and the socio-scientific nature of the argument (social, S; environmental, E; financial, F; moral, M) as shown in Table 8.7.

Table 8.6 Levels of argumentation and number of pieces of evidence for each group
Table 8.7 Levels of argumentation and socio-scientific nature of argument

Comparing the first and the second argument constructed by the groups, it is evident that even though only Groups 1 and 3 improved their levels of argumentation, all six groups improved in terms of the number of pieces of evidence they included in their arguments. This finding suggests that the learning environment supported the students in collecting and including new pieces of evidence in their arguments, even though the structure of the argument (e.g., the inclusion of rebuttals) did not necessarily change. Comparing the first argument with the argument submitted after the outdoors visit, it is evident that only three groups improved their arguments, both in terms of the level of argumentation and the number of pieces of evidence, something that suggests that the outdoors visit did not necessarily help the students to improve their argument. This finding can be explained by looking into (a) how their decision (whether or not to move the pig farm) changed after the outdoors visit, and (b) the socio-scientific nature of the groups’ arguments. Table 8.7 presents the socio-scientific nature of the arguments for each one of the groups.

As shown above, for the first argument all groups offer an argument which focuses on the environmental aspect of the problem (that the smell is bothering the people in the village), an aspect that they were experiencing in their everyday lives. Only two of the groups (Group 2 and Group 3) offer additional moral and social aspects in their arguments. For the second argument the socio-scientific aspect is more complex, since the students offer evidence that links to the social, environmental, and financial aspects of the problem as well. Examples of arguments that were offered by the students and illustrate those aspects are presented below:

  • The pig farms should close because there is a lot of bad smell in the air. They should build the pig farm away from inhabited areas. (Group 5, Argument 1)

  • We believe that the pig farm should close because the smell is very bad and influences the people at the village. On the other hand though the people need the meat (Group 6, Argument 1)

For the last argument, the one constructed after the pig farm visit, the nature of the arguments for all groups again focused on the environmental aspect of the problem—the smell—an aspect that is associated with the experience they had when visiting the pig farm. The analysis of the data above shows that all the groups improved their arguments in terms of the number of pieces of evidence they included after the use of WISE (Argument 2), but returned to their original argument after the pig farm visit. This finding suggests that the experience of the visit, and especially the excessive smell, pushed students to ignore the evidence they had collected from WISE, a finding that is supported by previous studies in argumentation (e.g., Kuhn, 1991).

The analysis of the students’ arguments in terms of the decisions (see Table 8.5) shows that the WISE learning environment supported students in collecting all the available information and in changing their initial decision, which was to move the pig farm to a different area, to a decision not to move the pig farm. More specifically, for the first argument all groups decided that the pig farm should be moved to a different area, whilst for the second argument only one group (Group 4) supported the same idea. An example of how groups changed from the first to the second argument is that of Group 6:

  • We believe that the pig farm should close because the smell is very bad and influences the people at the village. On the other hand though the people need the meat (Group 6, Argument 1)

  • The pig farm should not be moved because then we will not have meat, a lot of people in our area will stay without a job, we can use the waste to produce energy, and we can find ways to minimize the bad smell […] (Group 6, Argument 2)

However, what is more interesting is the change in decision after the visit to the pig farm. After the pig farm visit, four groups reverted to their original decision to move the pig farm to a different area, and only two groups kept the decision they had constructed after studying the evidence explaining the problem with the pig farm. All these arguments focused on the environmental aspect of the problem—the bad smell—based on the students’ experience from the visit, and their experience from living in the community close to the pig farm. These findings support findings from previous studies showing that students easily ignore evidence if it is not in accordance with their own claims (e.g., Kuhn, 1991). In the case of the issue under study, a problem of personal interest in the area, and especially after experiencing the bad smell during the visit, the students ignored the evidence they had previously collected, as well as the evidence from the field regarding the water and soil pollution. Furthermore, the analysis of the whole-classroom discussion shows how the students, during their presentations, focused on a specific aspect of the problem—the bad smell—and, even though they recognized that there were solutions to the problem, insisted on moving the pig farm because of the smell.

Students’ Attitudes and Emotions Regarding Project

After the implementation of the curriculum, all students were interviewed either by their teacher or by the researcher. Students were asked to express how they felt about the research project, what they liked, what they did not like, and whether the experience was different from what they usually do in their class. All of the students who participated in the interviews offered positive appraisals of the project. They expressed excitement in reflecting on the use of the handheld devices, enjoyed the interactions and experiences in the WISE platform, and appreciated the opportunity to visit a field site.

The excerpt below, taken from a postintervention interview, provides an example of a typical student reaction to the use of handhelds. This excerpt is representative of many student comments offered relative to their experiences with the handhelds.

Researcher: How did you feel when you first used the handhelds?

Erena: I was happy because it was the first time that I had seen such a thing and I wanted to use it. I was smiling, I was happy.

Researcher: Do you remember similar feelings from school?

Erena: Yes, when we went to the pig farm.

Researcher: Other than from the learning environment, do you remember having similar feelings?

Erena: Yes, every time I have my birthday.

Researcher: Do you think that it matters that you were happy?

Erena: Yes, because when I am happy it means that I want to learn more.

Researcher: What other feeling do you remember having?

Erena: I was anxious to go and visit the pig farm.

This excerpt highlights an interesting pattern that emerged across the data set. All of the students indicated that they had seen handheld devices before (including Erena, despite her statement at the beginning of this excerpt), but they had never considered using them for the purposes of science. Most students discussed handhelds as something their parents used for business and as a device that they might be able to use to play games. The idea of using these devices for school and science was clearly novel to the students but also very well received.

The excerpt below provides an example of a typical student reaction to participating in the research project, with references to the aspects of the project that the student enjoyed the most. This excerpt is representative of many student comments offered relative to their experiences with the project.

Researcher: What did you like about the learning environment?

Kyriaki: That we visited the pig farms and someone explained the process. I also like that we did research and presented the outcomes to the other groups.

Researcher: What do you mean when you say you did research?

Kyriaki: We searched online for information, we interviewed the pig farmer, we collected information from other resources, we visited the pig farm to see what is happening. And at the end we presented the outcome to our class.

This excerpt highlights an interesting pattern that emerged across the data set, with most of the students indicating that they enjoyed participating in the Technoskepsi research project because they engaged in research (searching for information from various resources). They were asked to express their opinion, they visited the pig farm and had the chance to talk with the farmer, and they could see whether the information they had collected from the other resources was trustworthy.

Finally, the excerpt below is representative of students expressing that they enjoyed working in groups and expressing their opinions.

Researcher: What did you like about the learning environment?

George: That we could work in groups for so long and ask questions and talk to each other about this topic. We could express our opinion.

The analysis of the interviews and classroom observations suggests that important aspects of the learning environment were the positive feelings that students expressed both during and after the instruction, especially about the use of the handhelds, working collaboratively in groups, expressing their opinion, and visiting the pig farm to collect data for their research.

Implications for …

Teaching and Learning

An important implication of this study is that students can improve their written arguments when supported by an online learning environment such as the one in Technoskepsi, even though the quality of argument, and hence improvement in argumentation, is an issue that needs to be further explored and is discussed in the implications for research section. Associated with this issue is the question of how we can enable teachers to evaluate students’ arguments and provide feedback. One of Aris’s concerns during the instruction, even though he was familiar with argumentation frameworks, was how to evaluate his students’ SSI arguments and provide feedback during the lessons. He was concerned with what was “wrong” and what was “right” in the discussions, and how to frame that for the class. Hence, one of the implications from this study is associated with finding ways to support teachers, not only to teach argumentation, but also to find consistent ways to evaluate argumentation, especially in socio-scientific contexts.

Another important finding in this study that has implications for teaching is that students can easily revert to their original argument even when they have opposing evidence. It is interesting how most of the groups changed back to their original argument/decision to move the pig farm because of the bad smell after the visit to the pig farm. This finding suggests that the teacher’s role during the instruction should be to scaffold students to weigh the evidence and decide based not on selected evidence but on all the available evidence. Furthermore, the specially designed inquiry-based instruction supplementing formal with nonformal studies seems to have the potential to support students’ argumentation while concurrently contributing to increasing student “motivation” for participation in science activities. Based on findings from the students’ interviews, various characteristics of the research project were considered positive and, as the students stated, helped them engage in the learning process. These characteristics (collaborative work, engaging with authentic problems, meeting the “actors” of an issue, use of novel technology) could be incorporated by teachers when they design their lessons in order to help students feel happy during the lesson and to motivate them. According to Blumenfeld et al. (2006), motivation sets the stage for cognitive engagement and leads to achievement by increasing the quality of the cognitive engagement.

Research and Methodology

An important aspect of the Technoskepsi project was supplementing formal with nonformal settings when trying to engage students with an SSI. The findings from this study suggest that using nonformal settings has a considerable impact on students, both in terms of learning and in terms of emotions. Most of the groups changed their decision after the visit to the pig farm, something that suggests that the visit (the nonformal setting) had a greater impact than the indoor investigation on how they talked about the SSI and on what kind of evidence they used to support their arguments. Furthermore, most of the students, when interviewed, stated that they enjoyed the visit to the pig farm and the opportunity to talk to the people who were involved in the issue they were studying (e.g., the farmer). Future research should focus explicitly on the impact that nonformal visits have on students’ SSI arguments and the kinds of evidence they choose to use, and try to explore possible reasons for the change in decisions/arguments, based on the students’ experiences or personal identities.

A methodological implication that derives from the analysis of the data concerns what counts as quality of argument in SSI, especially in the case of the short, written arguments (artifacts) that were the main data sources of this study. In the methods section I explain how I decided to use the Erduran et al. (2004) modified levels of argumentation as the framework to guide the “measuring” of the quality of students’ arguments and argumentation, even though I was aware of the limitations. Two of the limitations of the framework identified by previous studies are that (a) it is not easy to distinguish between warrants, data, and justifications (e.g., Duschl, 2008; Erduran et al., 2004), something that has an effect on the inter-rater reliability of the coding (this issue was addressed through the coding of part of the data by a second researcher), and (b) it does not account for the content of the argument but only for its structure (e.g., Clark & Sampson, 2008; Osborne et al., 2004a, b), something that was evident in the analysis. Two additional limitations of the Erduran et al. framework identified through this analysis are that it fails to account for the number of pieces of evidence in the students’ arguments as an additional characteristic of the levels, and that it is a framework designed to evaluate dialogic argumentation rather than written arguments, since the focus is on rebuttals. According to TAP, rebuttals specify the conditions in which the claim is not true, and they are more easily found in dialogic argumentation, in which claims are challenged by someone else, hence the person who is arguing offers rebuttals to justify and protect their argument. But how easy is it to include rebuttals in written arguments? An argument for attending instead to the evidence within written arguments comes from research on the role of knowledge: Carey (1985) argued that both children’s conceptual change and their growth in scientific reasoning are fundamentally driven by a growth in domain-specific knowledge. More specifically, Kuhn (1991), who studied the skill of argumentation, found that people who have knowledge of the subject seem to be more able to provide an alternative theory, and that they tend to reason better on the subjects for which they have personal knowledge. Based on the above, I suggest that the number of pieces of evidence should be a measure of the quality of argumentation, since it indicates an improvement in knowledge, and knowledge and skills are interrelated. Hence, a research implication from this study is the need for a framework designed to evaluate written arguments, one that includes an intermediate level between Level 2 and Level 3 in which the emphasis is on the number of pieces of evidence, before the use of rebuttals is evaluated.
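
One hypothetical way to operationalize the modification proposed above is sketched below: an intermediate level between Levels 2 and 3 for written arguments, awarded when a claim is supported by multiple pieces of evidence even though no rebuttal is present. The thresholds, the numeric label 2.5, and the function name are illustrative assumptions on my part, not part of the Erduran et al. (2004) framework or of the study's actual coding procedure.

```python
def written_argument_level(evidence_count: int,
                           weak_rebuttals: int,
                           clear_rebuttals: int) -> float:
    """Illustrative scoring of a written argument with a proposed intermediate level."""
    if clear_rebuttals >= 2:
        return 5.0
    if clear_rebuttals == 1:
        return 4.0
    if weak_rebuttals >= 1:
        return 3.0
    if evidence_count >= 2:
        return 2.5   # proposed intermediate level: multiple pieces of evidence, no rebuttal
    if evidence_count == 1:
        return 2.0
    return 1.0

# Group 1's first argument (one justification, no rebuttal) would stay at Level 2,
# while its final argument (several pieces of evidence, still no rebuttal) would be
# credited with the intermediate level rather than appearing unchanged.
print(written_argument_level(1, 0, 0), written_argument_level(3, 0, 0))   # -> 2.0 2.5
```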

Conclusions

The Technoskepsi project was designed to explore young students’ argumentation within a socio-scientific context, and to explore how students argue when they study an authentic problem through formal and nonformal investigations. The analysis produced evidence regarding how elementary school students change their arguments during their investigations of an authentic socio-scientific problem, and raised questions about how to design such learning experiences to support more integrated arguments that include all aspects of the problem. Supplementing formal with nonformal investigations also raised questions about the affordances of nonformal experiences and how they can change students’ attitudes and emotions toward science. An important aspect of the project was the collaboration of teachers and researchers in the development and implementation of the project, which allowed the design of a curriculum that was based on the needs of the teacher and the students and not on the needs of the researchers. One of the limitations is that the specific curriculum was implemented in collaboration between a teacher and a researcher, an opportunity that is not given to many teachers, but I believe that this is still a good example of how a socio-scientific context can be used as a means to improve students’ argumentation and decision-making.