Introduction

Previous research has noted that the design of effective learning environments involves multiple dimensions of social infrastructure such as school curriculum, assessment, school culture, social practices, and belief systems (Bielaczyc 2006; Hung et al. 2009). However, despite the movement toward restructuring schools, two persistent themes seem resistant to change: “often the work students do does not allow them to use their minds well”, and “the work has no intrinsic meaning or value to students beyond achieving success in school” (Newmann and Wehlage 1993, p. 8). Since many schools still face “the mile wide, inch deep problem” (Bransford et al. 2002, p. 24) in curricula, it is not uncommon to find teachers who focus on covering much factual knowledge within a short period of time, but fail to provide students with opportunities to gain deep understanding. Newmann and Wehlage (1993) argue that schools need to be restructured to provide authentic instruction for deep learning, which can be characterized by five standards: higher-order thinking, depth of knowledge, connectedness to the world beyond the classroom, substantive conversation, and social support for student achievement.

In an effort to redesign learning environments where deep authentic learning is encouraged and fostered in Singapore, we adopted the Knowledge Building Community Model (KBCM hereinafter). The KBCM has been implemented in several schools in America and has gained increasing attention in the Asia–Pacific region (So et al. 2007). Bereiter and Scardamalia, together with their colleagues, have developed an innovative pedagogical approach called knowledge building that emphasizes the collective cognitive responsibility for advancing knowledge (Scardamalia 2002). In short, knowledge building can be defined as: “…the production and continual improvement of ideas of value to a community, through means that increase the likelihood that what the community accomplishes will be greater than the sum of individual contributions and part of broader cultural efforts” (Scardamalia and Bereiter 2003, p. 1370). What makes the knowledge building approach more innovative and powerful is the use of a computer-supported collaborative learning tool called Knowledge Forum to support knowledge building activities in an online environment. Knowledge Forum—formerly known as CSILE (Computer Supported Intentional Learning Environment)—provides a set of embedded textual scaffolds that students can choose from a drop-down menu before they start typing their ideas or answers. The scaffolds can be sentence openers like “My theory is…(followed by the student’s text)” or “I need to understand…(followed by the student’s text)”. The purpose of these scaffolds is to direct students to generate, refine, and reflect on their ideas to better support a knowledge building discourse.

The knowledge building approach has challenging implications for educational practices in Asia–Pacific countries. It is well known that competition among students and preparation for national examinations are highly emphasized (Chan and van Aalst 2007) in Asian schools. This emphasis can result in the mile wide, inch deep curricula mentioned earlier. First, to overcome this problem, the KBCM aims to transform educational settings by moving away from didactic, teacher-centered instruction toward learner-centered, constructive pedagogies. Second, both students and teachers are responsible for learning by engaging in the collective cognitive process of pursuing and building knowledge together. Lastly, the knowledge building approach shifts the focus of learning from task-oriented to idea-centered so that higher-order thinking skills are engaged and exercised for both individual and collective understanding.

The study reported in this paper is part of a large research project in Singapore that employed the KBCM with the aim of transforming science learning away from teacher-centered toward student-centered practices. The emphasis of the present study was to observe how the KBCM was enacted in classroom contexts where students had little previous exposure to such approaches. While many of the previous knowledge building research studies have focused on online discourse activities mediated in the Knowledge Forum environments (e.g., Hakkarainen 2003; van Aalst and Chan 2007; Zhang et al. 2007), our research examined the full learning discourse of both online and offline activities. Our belief is that to create a truly pervasive knowledge building culture—where a community of learners works together toward a complex level of understanding—it is critical to understand both classroom and online discourses.

Hence, the main purpose of this paper is to illustrate discourse patterns between classroom and online knowledge building activities to better understand the nature of pervasive knowledge building discourse. Specifically, this research aims to answer two research questions: (a) what is the nature of online and offline discourses in classes where science lessons based on knowledge building principles are enacted to foster deep learning and (b) what are the similarities and discrepancies of online and offline discourse from qualitative and quantitative views? Adopting socio-cultural perspectives, discourse in this study is broadly defined to include both textual and verbal forms, and both linguistic and social practices (Hicks 1995–1996). However, it should be noted that the purpose of this research is not to decide which discourse type is inherently more meaningful or better, but to describe the nature of discourse exchanges enacted in different contexts.

Theoretical background

Collaborative knowledge building

The main purpose of employing the KBCM in this study was to help students learn how to collaboratively work with others, how to inquire into knowledge, and how to reflect on their thinking process. Specifically, the theoretical background of this research is twofold: knowledge building and collaborative learning. Knowledge building exploits both the technological and social affordances of a communal knowledge space. On the social dimension, it relies on collaborative learning, which has been shown to be more effective than traditional whole-class methods of teaching (Johnson and Johnson 1999; Slavin 1995). On the technological dimension, an asynchronous discussion platform, Knowledge Forum, allows many-to-many communication. Thus, it empowers students and reduces domination of the discourse by teachers.

According to Scardamalia, twelve principles of knowledge building address socio-cognitive and technological dimensions: (1) real ideas, authentic problems, (2) improvable ideas, (3) idea diversity, (4) rise-above, (5) epistemic agency, (6) community knowledge, collective responsibility, (7) democratizing knowledge, (8) symmetric knowledge advancement, (9) pervasive knowledge building, (10) constructive uses of authoritative sources, (11) knowledge building discourse, and (12) embedded and transformative assessment (see Scardamalia 2002, for more discussion). Among these principles, the present study was particularly interested in examining pervasive knowledge building discourse.

A collaborative learning environment is an important component of a knowledge building culture, as knowledge building is predicated on individuals in a community working together to build a communal knowledge pool. Collaborative learning refers to any of a variety of teaching methods in which students are placed in pairs or small groups to help one another learn academic content. In collaborative classrooms, students are expected to explain, discuss, assess each other’s current knowledge, and fill in the gaps in each other’s understanding. For effective and successful implementation of collaborative learning, five elements must be in place: (1) positive interdependence, (2) individual accountability, (3) face-to-face promotive interactions, (4) social skills, and (5) group processing (Johnson and Johnson 1999).

Collaborative knowledge building involves the collective efforts of learners in the community engaging in a common task in which each individual learner depends on and is accountable to each other, and is steeped in the socio-constructivist theory of Vygotsky (1978). In collaborative knowledge building, groups of learners work together, not merely to help each other learn the content knowledge, but also to search for and/or create new knowledge, understanding, meaning, solutions or artifacts of their learning (Paavola et al. 2004), in the vein of scientific inquiry and research.

Discourse as a mediator of knowledge building

Several dimensions of learning can be examined to understand the enactment and embodiment of the Knowledge Building Community in classrooms. In this research, we adopted discourse as a mediator of knowledge building. That is, we posited that discourse is a fundamental form of learning that reveals how knowledge building is enacted and embodied by a community of learners in both online and offline situations. Knowledge building discourse can be defined as discourse aimed at progressively advancing mutual understanding, rather than simply producing descriptive information for others to agree or disagree with (Scardamalia and Bereiter 2006).

Classroom discourse exchanges are typically initiated and terminated by teachers, thus providing few opportunities for students to exercise higher levels of cognitive skills. How, then, can we transform such didactic discourse into knowledge building discourse? The importance of knowledge building discourse has been emphasized in previous studies. Bereiter et al. (1997) showed how progressive knowledge building discourse was successfully realized for science learning in elementary classrooms. They argued that knowledge building discourse is a fundamental skill to learn in a knowledge-based society. Cummings (2003) discussed how knowledge building discourse was enacted in classroom situations without Knowledge Forum. The study found that students who had used Knowledge Forum previously also showed some forms of knowledge building discourse in face-to-face situations, which might be an indicator of pervasive knowledge building. Similarly, Caswell and Bielaczyc (2001) observed that students carried out continuous debates on scientific problems in online and offline contexts, thus improving both individual and collective understanding.

Methods

Research context

The data presented in this paper came from the coding and analysis of ten transcribed science lessons. The transcripts were taken from 7 hours of video and depict the discourse activities from three Grade 4 classes (hereinafter Class A: n = 38; Class B: n = 38; Class C: n = 36) doing a science unit on the digestive system.

In terms of academic abilities, Class A included mostly high-achieving students, while Classes B and C included mixed-ability students. The streaming was based on the school curriculum assessment results of the previous year. The same classes also used web-based Knowledge Forum (see Fig. 1) in connection with classroom learning. Students in this research engaged in the progressive knowledge building process including four main phases: (a) idea generation, (b) idea connection, (c) idea improvement, and (d) rise-above. Students’ improvable ideas are at the center of this inquiry process. In addition, meta-cognitive reflective thinking is continuously emphasized throughout the inquiry cycle to help students learn by reflecting on their own learning process. In the learning of the digestive system, the progressive inquiry process was triggered by big questions: “What happens when a small intestine is missing?” and “Where does the apple go after swallowing it?” Previous research has shown that students in general hold robust alternative views of scientific concepts, and their understanding of concepts and knowledge in science is largely influenced by everyday experiences, language, and culture (Baker and Taylor 1995; Cakici 2005). It was hoped that the use of progressive knowledge building approaches in both online and offline settings would help students gain deep understanding of science theories and concepts.

Fig. 1
figure 1

The Knowledge Forum view

To analyze offline and online discourses, we gathered online postings and coded them with a set of categories for question and answer exchange used for the classroom discourse analysis between student and teacher, and between student and student. Once this was done, we compared the data to see whether there were any significant similarities or differences concerning: (a) the diversity of student interaction in classroom and online learning environments, (b) the estimated quality and structure of verbal and written interactions, and (c) the number of words students used to answer and ask questions in classroom and online environments.

Discourse analysis: types and tokens in natural languages

We initiated this work by creating a coding scheme using MAXqda™, a qualitative data analysis software program. The framework of the coding scheme consisted of six main categories with a total of thirty-one sub codes sorted under them. The main categories for the teacher discourse were (1) feedback types to students, (2) knowledge building principles, and (3) content learning. The main categories for the student discourse were (4) verbal exchange between student and teacher related to knowledge building principles, (5) verbal exchange between student and teacher related to content learning, and (6) knowledge building activities between student and student. Here, knowledge building principles include the twelve principles proposed by Scardamalia (i.e., real ideas, authentic problems, improvable ideas, idea diversity, and so on). Knowledge building activities involve idea generation, idea connection, idea improvement, and rise-above in the progressive knowledge inquiry process.

The foundation for the coding categories was the theory of types and tokens in natural languages (Bromberger 1992). As an example, one can say that “Question” is a type within natural languages, where the sentence “What year did Singapore become an independent country?” is a token that belongs to the type “Question”. In practice, this meant that we had to identify type values that could apply to a wide range of common discourse activities. Overall, this approach resulted in quantitative data in terms of the number of occurrences of verbal activities as well as qualitative data in terms of the structure and patterns of particular discourse interactions.
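The type–token logic above amounts to tallying how many token utterances instantiate each type. A minimal sketch (with hypothetical utterances and type labels, not the study’s actual coded data) might look like:

```python
from collections import Counter

# Hypothetical coded segments: each token (an actual utterance)
# is tagged with the type it instantiates.
coded_segments = [
    ("What year did Singapore become an independent country?", "question"),
    ("1965.", "answer"),
    ("Why is digestive juice important?", "question"),
    ("It breaks food down into simpler forms.", "answer"),
]

# Tally token occurrences per type -- the quantitative side of the
# analysis, while the tagged utterances themselves remain available
# for qualitative inspection of structure and patterns.
type_counts = Counter(type_label for _, type_label in coded_segments)
print(type_counts)  # Counter({'question': 2, 'answer': 2})
```

In the actual study this tallying was done within MAXqda™ rather than in code; the sketch only illustrates how one scheme yields both occurrence counts and inspectable discourse segments.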

Lexical analysis

In addition to the broader qualitative discourse analysis, we chose to conduct lexical analysis (Lee and Fielding 2004) of the text segments coded as (1) teacher’s content-related question, (2) student’s content-related answer, (3) student’s clarification question, and (4) student’s clarification answer. Lexical analysis can be described as “talk as numbers”: it focuses on quantity in discourse exchange rather than quality. When applied to text or speech, the purpose is to look at the average number of words used for certain verbal activities or how often a specific word appears. This approach can be particularly relevant when used in combination with a broader discourse analysis, since it can provide some hard facts that support findings or suggestions from qualitative-oriented discourse analysis.
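The word-count measure used in this lexical analysis is simply the mean number of words per coded segment. A small sketch with hypothetical answers (not drawn from the actual transcripts) shows the computation:

```python
# Hypothetical student answers coded as "content-related answer".
answers = [
    "Gullet, stomach, then small intestine",
    "It digests food",
    "Enzymes",
]

# Count whitespace-delimited words in each coded segment,
# then average across segments.
word_counts = [len(answer.split()) for answer in answers]
average_words = sum(word_counts) / len(word_counts)
print(word_counts, average_words)  # [5, 3, 1] 3.0
```

Any tokenization convention (e.g., whether to strip punctuation) would need to be held constant across classes for the averages to be comparable.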

Reliability and agreement

We used intersubjective identification (Smaling 1992), in which independent researchers had to correlate an anonymized set of retrieved segments with the correct code types in order to give a general validation of the coding system. The codes and their applications were semantically delineated with few possibilities for subjective interpretation. This method allowed us to accept the assignment of token sentences to code types on the basis of this validation of the coding system.

Results

The discourse interaction that we chose to examine most closely was the exchange of question and answer between student and teacher, and between student and student. We wanted to approach this micro-discourse within the broader discourse from both quantitative and qualitative points of view. This upcoming section presents findings of discursive and lexical analyses of both classroom lessons and Knowledge Forum postings to describe the nature of knowledge building discourse in different learning contexts.

Classroom lesson discourse

Discursive analysis of classroom lessons

Overall, we noticed that the exchange of question and answer was tightly clustered together and followed certain well-identified routines. Additionally, we noticed a lack of teacher wait time on a few occasions where students were asked to explain or clarify ideas. The most common discourse types were teacher’s content-related question and student’s content-related answer: That is, a teacher asks a question—a student answers the question shortly and quickly—the teacher repeats the student’s answer—the student is encouraged to respond with another short and quick answer—the teacher continues to repeat the answer or asks a new question. As shown in Table 1, these almost ritualized routines are commonly known as I–R–E (Initiation, Response, Evaluation) or I–R–F–R–F (Initiation, Response, Feedback, Response, Feedback), which are common patterns of classroom discourse (Mehan 1979; Mortimer and Scott 2003; Scott and Mortimer 2005). This routine reflects a pattern of expected behaviors. Without additional explanations from the teacher, the students automatically start to interact according to the implicit rules that define the situation.

Table 1 Transcript segments illustrating IRE patterns in classroom discourse

Besides this IRE pattern, we also noticed that teachers lacked patience when it came to allowing students to explain or clarify their ideas. Only on some occasions did teachers ask follow-up questions like “what do you mean when you say…?” or “why do you think that…?”, and only then were students actually allowed to clarify their ideas or answers. These follow-up questions are crucial if students are to acquire the ability to reflect on their own ideas and thinking. Instead, we noticed a focus on memorizing and repeating content. A related problem, as pointed out in numerous other studies, is the teacher’s good will to help students answer a question by answering it for them. Black (2001) has described this in a clear manner. The posing of questions by the teacher is a natural and direct way of checking on learning, but may be unproductive in some cases. When a teacher answers her or his own question after only two or three seconds, there is no possibility that a student can think out what to say. In consequence, students do not even try—“if you know that the answer, or another question, will come along in a few seconds, there is no point in trying… [t]he question–answer dialog becomes a ritual, one in which all connive and thoughtful involvement suffers” (Black 2001, p. 17). An example of this way of communication between students and teacher is illustrated in Example 1 below:

Example 1 Excerpts where the students were interrupted or not given time to clarify their answers

Teacher:

When we talk about the digestive system, we are actually digesting the food. What do you mean when you say digest the food?

Student:

Ehhh, how to say ah, ehhh…

Teacher:

Let’s say I am very, I am very full, I have just eaten but someone said I am hungry, there is no food left. The food has disappeared; the food has been digested by your body

Student:

(looking at other students, puzzled) Keep quiet lah!

Teacher:

Ok, the machine that digests it is the digestive system. It’s too full now because of the digestive. So, when you digest the food, you actually break the food down into simpler forms. The whole apple like that disappears. Where does it disappear to?

Student:

Gullet, stomach, then small intestine

A closer look at the video of this situation indicated that the hesitating answer might be an expression of frustration at not being able to come up with the answer fast enough. Given the observed IRE routine of short and quick answers, one might understand the sense of pressure that the student felt to keep up with the tempo. It is interesting to notice how the verbal exchange soon returns to the routine of teacher’s question and student’s short and quick answer. However, it is important to point out that these examples are not intended to criticize the teacher. The excerpts mentioned earlier might be understood within a macrostructural framework where the lack of lesson time and the pressuring reality of large student groups affect how lessons are conducted (Heinze and Erhard 2006; Tobin 1986).

Lexical analysis of classroom lessons

The most noticeable finding from the lexical analysis was the low average number of words that students used to answer content-related questions in classroom activities, ranging from 2.5 words (Classes A and B) to 2.25 words (Class C) per answer. Specifically, in Class A, 153 of the 430 coded segments (35.6%) were teacher’s content-related question and student’s content-related answer, with students averaging 2.5 words per answer to the teacher’s content-related questions. In Class B, 235 of the 410 coded segments (57.3%) were teacher question and student answer to teacher’s questions, again with an average of 2.5 words per answer. In Class C, 84 of the 307 coded segments (27.3%) were teacher question and student’s content-related answer, with students averaging 2.25 words per answer. On the whole, Class C differed from the other classes in that fewer answers and questions were spread over the lessons and students worked more on their own or in groups.

The way that students use their language can be seen as an indicator of how robust or anchored their knowledge is (Cazden 1988). In that sense, the data from the lexical analysis—with the remarkably low average number of words used by the students to answer questions—indicate that there might be a need to provide a forum where students have enough time to exercise their ability to formulate more complex and reflective ideas.

Knowledge Forum discourse

Overall, the lexical and discursive analyses of the Knowledge Forum online activities revealed quite different patterns when compared to the classroom activities. The average number of words for answering questions increased significantly. There were new types of questions and answers that rarely occurred during the classroom lessons. We chose to name these new types of questions and answers clarification questions and clarification answers. For example, “what do you mean when you say this…?” is a request to a student to clarify their answer, and “I mean that…” is an attempt to state ideas or answers more clearly.

Discursive analysis of online postings

The discourse of the online activities differs from the classroom activities in such a way that there is more diversity in the types of written exchange among participants. As shown in Example 2, the questions range from pure content-related questions (e.g., ‘Why is the digestive juice important in the digestive process?’) to clarification questions (e.g., ‘What do you mean by your question/answer?’ or ‘Can you state your question/answer more clearly?’).

Example 2 Clarification question and answer in Knowledge Forum

The type of discourse exchange exemplified in the excerpts was common in Knowledge Forum, but rarely appeared in the classroom discourse. As an isolated occurrence, this type of written exchange might appear trivial. Nonetheless, if this is allowed for an extended period of time as a natural part of learning processes, then one might have a reason to believe that it would give students useful opportunities for self-reflection and formulation of their ideas and knowledge. Hopefully, such training may improve the quality and carefulness in formulating ideas as students are progressing through different levels of education.

It should be noted that this way of learning and sharing ideas online was fairly new to the participants in this study. Hence, they had not developed any conventions for online exchange and etiquette (even though we noticed some indication of such a development as the project proceeded). Additionally, we noticed a few cases where pressuring questions for clarification created responses that might be interpreted as upset or embarrassed (e.g., defensive or angry answers written in capital letters followed by numerous exclamations). It is difficult to be certain about the emotional state behind online responses. However, it is well known that being pressured to answer clearly and accurately can be highly frustrating if students do not have the conceptual tools to re-formulate thoughts. The worry of being unable to answer is a common human fear and has been pointed out as having a negative impact on learning (Erikson 1963; Papert 1996). In the end, these few occurrences should not be considered as a negative effect of the use of Knowledge Forum, but rather as an indicator that further training of these abilities is needed. They can also be seen as an indicator that there is a need to create a more open and relaxed knowledge building culture where learning is in the center, not the competition to outdo one another.

Finally, there were some occurrences where students seemed uncertain about concepts or answers, but the discussion was not fully developed as presented in Example 3.

Example 3 Unfinished discussion that might result in misconceptions

Even though we could not determine whether any of these opposing ideas had greater authority or influence over the other students, the danger of incorrect group thinking (Barron 2003) needs to be taken seriously. It should be noted that this is not a situation unique to online learning, but a problem that may occur in any setting. In fact, it is easier to discover unwanted group thinking in online settings than in classroom situations, since discussions are saved in textual form for later review.

Lexical analysis of online postings

In Class A, the lexical analysis showed that the average number of words students used for content-related answers was 22.4 words for 21 answers posted to teacher questions, and an average of 6.0 words for answers to other students’ content-related questions. There were a total of 5 clarification questions, but only 1 clarification answer (7 words). In Class B, the average number of words that students used for content-related answers was 18.6 words for 3 answers posted to teacher questions, and an average of 11.2 words for 69 answers to other students’ content-related questions. There were a total of 50 clarification questions and 16 clarification answers with an average of 10.9 words per answer. In Class C, the average number of words that students used for content-related answers was 36.7 words for 25 answers posted to teacher questions, and an average of 10.7 words for 18 answers posted to other students’ content-related questions. There were a total of 27 clarification questions and 9 clarification answers with an average of 9.3 words per answer. Table 2 presents the summary of lexical data of online and offline discourse in terms of content-related question/answer and clarification question/answer between student and teacher, and between student and student.

Table 2 Summary of lexical data: Occurrences and word counts for questions and answers

Discussions and conclusion

This research examined discourses in classroom and online learning environments where the Knowledge Building Community model was enacted to foster deep understanding in science learning in Singapore primary classrooms. In this section, we revisit each research question and discuss the implications of the findings. The first research question is “what is the nature of online and offline discourses in classes where science lessons based on knowledge building principles are enacted to foster deep learning?” On the whole, the discourse analysis of the verbal activities in classroom lessons showed clear signs of IRE patterns with quick and short answers. The dominance of this practice was also reflected in the low average number of words that students used to answer questions. The teachers sometimes initiated a conversation that could have led to deeper discussions, but most of the time it was interrupted before students had a chance to formulate an answer that could have led to a more meaningful verbal exchange.

In the analysis of online discourse expressed in Knowledge Forum postings, we observed patterns of clarification questions and clarification answers among students that rarely appeared in classroom discourse. However, it is important to note that online discourse is not completely unproblematic. We observed instances of incorrect group thinking and fear of appearing ignorant online. There is a need to moderate online activities and provide scaffolding for how questions and answers should be exchanged. This is an unavoidable effect of students starting to learn collaboratively online. It might even be seen as an indicator revealing what has been lacking in traditional teacher-centered learning. From practical perspectives, there might be other reasons to use online platforms like Knowledge Forum. It is a common problem in many countries that only a small number of students are active during teacher-centered lessons. Due to large class sizes, often only the most verbal or extroverted students are allowed to speak up. Knowledge Forum and similar online collaborative learning environments may provide more students with an equal opportunity to express their ideas and thoughts.

The second research question is “what are the similarities and discrepancies of online and offline discourse from qualitative and quantitative views?” The major structural difference between online and offline discourses may be described in terms of time. While working in an online environment such as Knowledge Forum, the students are given enough time to think through and formulate their answers and questions, something that was rarely observed during the classroom lessons. This also resulted in a higher diversity of question and answer types. In the lexical data, there was a significant difference between online and offline discourses. Overall, there was a significant increase in the number of words that students used to express their ideas in an online setting. There is also a clear tendency toward higher diversity in the types of written interaction online when compared to those in the classroom lessons. The most obvious difference is the demand from classmates to clarify answers and ideas, which rarely occurred during classroom activities.

This research aims to advance our understanding of knowledge building discourse by examining patterns of discourse by young learners in both online and offline settings. We believe that education is not only about learning specific content. It is important for students to acquire the skills to gather information, produce new ideas, and reformulate existing ideas and answers in a clear and accurate manner. These skills naturally improve as students progress through different stages of development, but ought to be exercised intentionally through all levels of learning. There are reasons to believe that online collaborative learning environments like Knowledge Forum allow students to exercise their ability to construct more complex answers, questions, and ideas. The analyses in the present study show that the diversity of question types increased significantly when students were working online. We observed questions about meaning and intentions, which stimulated the students to reflect on and reformulate their own answers and ideas.

As we mentioned in the introduction, the findings presented in this study should not be used to decide which discourse type is inherently more meaningful or better. Instead, our main purpose was to describe the nature of discourse enacted in different modes of communication and to look for ways to better integrate online and offline discourse toward more pervasive knowledge building discourse across different settings. Finally, we suggest that future research into the discourse patterns of knowledge building in primary science classrooms should look into how students can be equipped with linguistic tools and skills, specifically those relevant to the language of school science, that will help them become better knowledge builders. More research in this area is warranted to further examine how to make the transition between offline and online learning more seamless, and how to create an online learning culture conducive to pervasive knowledge building discourse.