Introduction

Arguing that one of the main goals of education is to develop students’ ability to manage information in order to solve problems or make everyday decisions, many recent educational reform initiatives, especially within science learning, have advocated the adoption of inquiry-based learning (IBL) in schools (Brand and Moore 2011). IBL is a pedagogical approach that presents students with authentic problems, along with the materials that allow them to draw their own conclusions (Hmelo-Silver et al. 2007). As such, IBL leaves more room for student initiative and creativity than traditional textbook exercises do (Yerushalmy et al. 1990). Furthermore, previous research indicates that IBL is more effective in developing scientific thinking skills than traditional, expository teaching approaches (Kuhn 2010). Similarly, recent meta-analyses show that IBL can even lead to higher student achievement, provided that students are given sufficient support (Alfieri et al. 2011; Furtak et al. 2012; Lazonder and Harmsen 2016).

The increased interest in IBL has resulted in a relatively large body of research on professional development initiatives that aim to facilitate teachers’ adoption of this pedagogical approach (e.g., Brand and Moore 2011; Crawford 2007; Levy et al. 2013; Lotter et al. 2014; Morrison 2014; Nadelson et al. 2013; Voet and De Wever 2017a). These initiatives often start from the ‘teach as you preach’ principle, which is about putting into practice what teachers are expected to do in their own classrooms (see the review by Capps et al. 2012). This principle has its roots in social learning theory, which states that “most of the behaviors that people exhibit are learned, either deliberately or inadvertently, through the influence of example” (Bandura 1971, p. 5). In line with this theoretical framework, Lortie (1975) found that teachers’ long ‘apprenticeship of observation’ during their own time as students caused them to teach very similarly to how they themselves were taught. Professional development initiatives therefore generally assume that, in order to get teachers to organize IBL in their classrooms, they first of all need to be immersed in IBL, by working through a substantial amount of content in a way that mirrors this pedagogical approach (McDermott 1990). Instead of merely delivering information about IBL, immersion in IBL thus aims to provide teachers with ‘good practices’, in the hope that teachers will subsequently adjust their teaching based on their observations (Struyven et al. 2010).

However, an overview of good practices is, by itself, not enough to reach sustainable change. In fact, research has shown that teachers’ behavior in class is strongly connected to their educational beliefs (see the reviews by Kagan 1992; Pajares 1996). In short, educational beliefs are propositions about schooling, teaching, learning, students, and subjects, which are held consciously or unconsciously (Pajares 1992). These propositions are evaluative, in the sense that they are accepted as true by the individual, and are therefore imbued with emotive commitment (Borg 2001). Unlike knowledge, they are not necessarily based on evidence, and may very well defy logic (Richardson 1996). Because the work of teachers is, by nature, ill-defined, teachers generally resort to educational beliefs to coordinate their behavior (Nespor 1987). Even so, there do appear to be certain limits to the power of educational beliefs, as studies indicate that complexities of classroom life may constrain teachers’ ability to provide instruction in line with their beliefs about education (Fang 1996; Mansour 2013). Educational beliefs should therefore be understood as ‘intuitive screens’, through which teachers interpret new information and organize their work (Goodman 1988), and not be confused with actual classroom behavior.

Not much is known about the exact impact of immersion in IBL on teachers’ educational beliefs, even though some have argued that it is likely to fail to produce the desired results without a meta-commentary that makes the underlying ideas explicit (Swennen et al. 2008). According to a review by Capps et al. (2012), one of the main reasons behind this lack of knowledge is that very few professional development initiatives have systematically assessed teachers’ educational beliefs. As such, the present study aims to investigate whether immersion in IBL is able to alter the educational beliefs that teachers have formed during their long careers as students.

Toward an operationalization of educational beliefs

In his review of research on teachers’ beliefs, Pajares (1992) argued that: “The construct of educational beliefs is itself broad and encompassing. For purposes of research, it is diffuse and ungainly, too difficult to operationalize, too context free” (p. 316). In other words, research on teachers’ beliefs should specify exactly what sort of educational beliefs are under investigation.

The present study focuses on beliefs about IBL in school history, in which inquiry is based on the central assumption that knowledge is constructed from evidence, rather than extracted from it (Wilson and Wineburg 1993). This is because historical sources are human creations, made from a particular point of view, and thus inevitably represent a partial account of the past (Rouet et al. 1998). Historical explanation therefore requires the mediation of a historian, who has to sift through, interpret, evaluate, and integrate the available evidence (Kuhn et al. 1994). This does not, however, imply that historical explanations are merely opinions, but rather that their value depends on the arguments and evidence used to support them (van Drie and van Boxtel 2008). As a result, IBL in history is both interpretative and argumentative in nature.

When it comes to translating these principles of historical inquiry into IBL activities, research suggests that, next to conceptions of their work environment, teachers’ decision to organize IBL in class is largely driven by their beliefs about knowledge goals and their self-efficacy for organizing IBL (Voet and De Wever in press). The present study focuses on the latter two types of beliefs, as these can act as a drive that stimulates teachers to work around constraints presented by their working context.

Looking first at teachers’ beliefs regarding knowledge goals in history, a review of the literature shows that previous work in the field of history education has commonly distinguished two types of knowledge: substantive and procedural knowledge (e.g. Havekes et al. 2012; Lee 2004, 2005; VanSledright and Limón 2006). Substantive knowledge involves a framework of the past (i.e., knowledge of historical periods, evolutions, and patterns), whereas procedural knowledge refers to a conceptual understanding of how the past is investigated (i.e., knowledge of heuristics, inquiry standards, and meta-concepts). As teachers generally attach different weights to both substantive and procedural knowledge goals, this helps to explain why some teachers are more inclined to organize IBL activities in class (Bouhon 2009). In particular, it turns out that teachers who tend to value the development of procedural knowledge are more inclined to engage their students in historical inquiries (Husbands 2011; Voet and De Wever in press).

Teachers’ self-efficacy is a second type of belief that influences their decision to use IBL. In essence, self-efficacy represents a judgement of one’s capability in light of a task analysis (Pajares 1996; Tschannen-Moran and Woolfolk Hoy 2001). As such, some have argued that self-efficacy can hardly be operationalized as a generalized personality trait, but instead has to be defined against the backdrop of a specific task (Pajares 1996). Research suggests that, for history teachers to feel able to organize IBL activities in class, they must first of all consider themselves competent to conduct their own historical inquiries (Martin and Monte-Sano 2008). In other words, teachers’ self-efficacy for conducting historical inquiries appears crucial to their use of IBL in class. This is further supported by previous research on teachers’ behavior, which reveals a strong relation between teachers’ self-efficacy beliefs and actual classroom behavior, and suggests that self-efficacy positively influences teachers’ persistence and resilience in the face of adverse conditions (Tschannen-Moran and Woolfolk Hoy 2001).

In short, the present study thus aims to find out how immersion in IBL may impact teachers’ beliefs about knowledge goals in history, as well as self-efficacy with regard to conducting historical inquiries.

Improving the inquiry experience through technology

The effectiveness of IBL largely depends on the support that is provided (Alfieri et al. 2011; Furtak et al. 2012; Lazonder and Harmsen 2016). In the context of professional development initiatives, where teachers may not yet have mastered inquiry (Capps et al. 2012), it is therefore important to provide sufficient guidance during immersion in IBL, in order to ensure a positive inquiry experience on the part of the participants. In light of this, Lazonder and Harmsen (2016) point out that support can vary from less specific forms of guidance, such as simple process constraints that are used to structure the inquiry, to more specific guidance, such as scaffolding that takes over more demanding parts of the inquiry. In practice, however, it is often challenging to build these forms of guidance into IBL activities (Kim et al. 2007).

A possible solution to this practical conundrum may lie in the use of educational technology, especially since several researchers have argued that one of technology’s main assets within the context of history education is its ability to support IBL activities (Copeland 1985; Voet and De Wever 2017b). In particular, technology offers the possibility to create multimedia that can take the form of investigation tools, record-keeping tools, or knowledge sources (Edelson et al. 1999). In this way, technological tools may help to design authentic inquiries, create powerful visualizations, support collaboration among learners, and foster autonomous and metacognitive learning practices (Donnelly et al. 2014). As a result, there has been considerable interest in the use of technology for supporting IBL (see Linn et al. 2013; van Joolingen and Zacharia 2009), with some arguing that technology may also enhance interest and motivation for IBL (Blumenfeld et al. 1991), or in other words, positively influence teachers’ beliefs with regard to IBL.

Furthermore, an added benefit of technology-enhanced learning environments in history is that they may include sources other than the documentary evidence common to pen-and-paper inquiries (De La Paz and Felton 2010; Reisman 2012). Technology-enhanced learning environments may present students with sources that contain more varied information about the past, such as sound recordings and film fragments (van Drie and van Boxtel 2008), or may even make use of digital source archives created by libraries, universities, or government agencies (Swan and Hicks 2007). In sum, the use of a technology-enhanced learning environment offers several benefits for professional development initiatives that aim to immerse participants in IBL.

Research questions

The present study is situated in the context of teacher education, and investigates how immersion in IBL, through a technology-enhanced learning environment, may impact student teachers’ educational beliefs. More specifically, the focus lies on educational beliefs that are relevant to IBL’s implementation in the classroom. As such, the research questions concentrate on the impact of immersion in IBL on:

  • student teachers’ beliefs related to substantive and procedural knowledge goals in history, and

  • student teachers’ self-efficacy beliefs with regard to conducting historical inquiries.

Design and methods

This section provides more information about the context of the study and its participants, as well as the technology-enhanced learning environment used to immerse student teachers in IBL. With regard to the latter, the focus particularly lies on clarifying the design principles that formed the basis for the learning environment, as well as the ways in which technology was used to support and enrich the inquiry experience. Afterwards, this section also gives an overview of the instruments and methods of analysis that were used to gather and interpret the data.

Context and participants

The present study took place in the context of teacher education in Flanders (Belgium), within the integrated teacher training program, which prepares students to teach in the first four grades of secondary education (student ages: 12–16 years). This teacher training program is offered at university colleges, with the sole entry requirement being that students have finished secondary education. At the start of this program, students select two school subjects in which they will be trained. The program lasts 3 years, during which students are taught the content as it is covered in secondary education, but also follow courses on teaching methodology. After successful completion of the program, students are awarded the degree of bachelor in education (De Wever et al. 2011).

In total, 302 student teachers from 12 university colleges participated in the present study. All students had selected history as one of their subjects, and were in the first year of their training program. Student teachers’ mean age was 20 years (SD = 2 years). Of all students, 185 were male and 117 female. Although little is known about the knowledge that students have about IBL in history when they enter the teacher training program, previous research has shown that IBL does not appear to be common practice in Flemish history classrooms (Van Nieuwenhuyse et al. 2015; Voet and De Wever in press). Thus, it is likely that most students have relatively limited knowledge of IBL in history when they enter the training program.

Design principles of the IBL-activity

The design of the IBL activity used during the professional development initiative was based on three core design principles.

The first design principle centered around authenticity, or creating a learning activity that resembles the work historians do. This implies a discovery-oriented approach to IBL, which calls for personal questioning, exploration, and discovery, rather than an information-oriented approach, which is limited to seeking already-existing answers (Spronken-Smith et al. 2011). Within the present study, this discovery-oriented approach resulted in an emphasis on (1) an ill-structured problem leaving room for different plausible conclusions, and (2) knowledge transformation, requiring students to form their own interpretations of evidence (see also, Voet and De Wever 2017a). This was achieved through the use of an evaluative problem statement, which required students to draw and support their own conclusions about the past (see “Designing the IBL-activity” section). In comparison with other question types, an evaluative question is therefore more likely to stimulate historical reasoning (van Drie et al. 2006).

The second design principle emphasized collaboration during inquiry, as it has been argued that historical reasoning is primarily a social activity, in which agents shape each other’s thoughts through spoken or written communication (van Drie et al. 2006). The social interaction involved in collaboration also stimulates students to elaborate their knowledge, by explaining their understanding to one another, which in turn results in more coherent arguments (Teasley 1995).

Finally, the third design principle consisted of scaffolding the inquiry. As the combination of ill-structured problems and knowledge transformation generally results in challenging tasks, learners require sufficient support in order to ensure a positive experience with IBL (Lazonder and Harmsen 2016). In keeping with research that has attempted to reduce the complexity of inquiry by breaking it down into several stages or phases (e.g., Bell et al. 2010; Pedaste et al. 2015), the present study used a macro-script that sequenced key activities (i.e., formulating historical questions, evaluating sources, forming arguments) within a workflow (for more information about scripts, see Dillenbourg and Hong 2008).

Designing the IBL-activity

The IBL activity used during the professional development initiative focused on the topic of the English Peasants’ Revolt in 1381 (for more information, see Dobson 1970; Dyer 1994). This topic was selected because it is not part of the curriculum within Flemish history textbooks, and student teachers were therefore unlikely to have much prior knowledge of this historical event. Furthermore, the name of the revolt has been heavily debated within academic history, as the lower classes were not the only ones to rise during the revolt. As such, this topic offered sufficient room for student teachers to form their own conclusions about the following problem statement: “Do you think that ‘Peasants’ Revolt’ is a fitting name for the uprisings that took place in England in 1381?”

In keeping with the authenticity design principle, a variety of information sources, which historians could also encounter in their search for information, were selected as evidence for the IBL activity. This selection included fragments from: the Wikipedia article on the Peasants’ Revolt, a TV documentary titled ‘The great Rising of 1381’, a medieval chronicle by Benedictine monk Thomas Walsingham, and two historical monographs by Dobson (1970) and Dyer (1994). These sources were furthermore selected because they offered different, and sometimes even contradictory, points of view about the name of the revolt. Finally, because these five sources contained sufficient information to solve the inquiry, student teachers were not allowed to consult additional sources. This also made it possible to control the information that student teachers used to solve the inquiry.

The macro-script for the IBL activity (see “Design principles of the IBL-activity” section) was created using the Web-based Inquiry Science Environment (WISE). WISE is an online platform for designing, developing, and implementing IBL in the classroom, and has been well received by both research and practitioner communities (for more information on WISE, see Linn and Eylon 2011; Slotta and Linn 2009). A screenshot of this learning environment is presented in Fig. 1. On the left-hand side of the computer screen, there is a navigation panel that guides students through key inquiry activities, but also allows them to revisit these activities. There are some constraints, however, as student teachers cannot visit an activity before completing the previous ones (i.e., constraints are indicated by a grayed out button in the navigation panel). On the right-hand side of the screen, students can go through the content that corresponds to each key activity, and, if required, enter and store their notes. In total, the IBL activity consisted of nine key activities: (1) studying information on the historical context (including concepts like class system, feudalism), (2) studying an explanation of how historians conduct an inquiry (i.e. centering around the interpretation of information and use of arguments), (3) translating the problem statement into historical questions, (4–8) analyzing and evaluating a particular source, and (9) writing down a conclusion.

Fig. 1

Overview of the WISE learning environment

Running the intervention

A pilot study was organized to test the IBL activity that was designed for the present study. Its main purposes were to check whether student teachers understood all of the materials, and how much time was required to complete the activity. This pilot study was carried out with 36 student teachers, who did not participate in the main study, but whose characteristics were similar to the main study’s participants. Student teachers’ reactions during and after the activity made it clear that the difficulty level of the materials was appropriate, and that 4 hours were sufficient for carrying out the IBL activity.

Each university college that participated in the present study was therefore asked to allocate four consecutive hours for its implementation. At the start of the IBL activity, all student teachers received an informed consent form that provided a summary of the study, explaining that its main purpose was to investigate the impact of a technology-enhanced inquiry environment on their learning process. The informed consent form also assured student teachers that data would not be passed on to third parties, and that results would be anonymized in case of publication. Finally, students were given the option to refuse to have their data used for the study. During the IBL activity, student teachers worked in randomly composed dyads, completing each of the key activities that were part of the macro-script described in the previous section. The IBL activity’s effects on student teachers’ beliefs were captured using an individual pre- and posttest, which took place right before and after student teachers’ work in the learning environment.

Finally, it is also important to note that the present study was part of a larger research project on the use of a technology-enhanced inquiry environment in history teacher education. One of the goals of this research project was to examine how different forms of support might influence student teachers’ reasoning during the assignment. In the present study, student teacher dyads were randomly distributed across four conditions. Depending on the condition to which they were assigned, some dyads received additional support for using sources, forming arguments, or both, whereas others did not. However, analyses indicate that this extra layer of support that was added to the macro-script (see “Design principles of the IBL-activity” section) had no significant impact on the evolution of student teachers’ educational beliefs (see Appendix 1). These conditions will therefore not be further discussed, and a parsimonious model without conditions will be used for the analyses.

Instruments

Three scales were used to capture educational beliefs at pre- and posttest. As a review of the literature yielded no scales that could be used to capture beliefs about knowledge goals in history, two 5-item scales were constructed for substantive and procedural knowledge goals, respectively. Items for these scales were constructed based on the review study by VanSledright and Limón (2006). To measure teachers’ self-efficacy for conducting historical inquiries, the Perceived Competence Scale (PCS) was adapted to the topic of IBL in history. Previous uses of this 4-item scale have generally yielded a good Cronbach’s Alpha, with values higher than 0.80 (e.g., Williams and Deci 1996; Williams et al. 1998). An overview of the scales and their items can be found in Appendix 2. To further explore possible changes in student teachers’ educational beliefs, the posttest also asked for student teachers’ reactions to two questions: “Did the task change your view of historical research?” and “Did the task change your view of history education?”. In both cases, student teachers were prompted to: “Please explain why, and, in case of a positive answer, give a clear description of these changes”.
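
To make the internal-consistency measure concrete, the sketch below computes Cronbach’s Alpha from an item-score matrix. This is a minimal Python illustration with invented responses; the study itself relied on standard statistical software, and the function below is not the instrument’s actual scoring code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's Alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each separate item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented responses of five respondents to a 4-item, 6-point scale
scores = np.array([
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [6, 5, 6, 5],
    [4, 4, 4, 3],
    [2, 3, 2, 2],
], dtype=float)
alpha = cronbach_alpha(scores)  # 0.96 for this toy matrix
```

Values above 0.80, such as those reported for the PCS, indicate that a scale’s items covary strongly enough to be summed into a single score.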

Analyses

The first part of the analysis focused on an inspection of the scales’ factorial validity and internal consistency. In order to determine the factorial validity of the data, the pretest dataset (N = 302) was randomly split into two subsets (N = 151), of which the first was used to conduct an exploratory factor analysis (EFA), and the second to conduct a confirmatory factor analysis (CFA). The EFA was carried out with SPSS 24, after the number of factors to retain had been determined through a scree plot and Horn’s parallel analysis, of which the latter is one of the most strongly recommended techniques in this regard (Courtney 2013). Horn’s parallel analysis was conducted using the ‘Paramap’ package in R 3.3.2, while the scree plot was retrieved from the SPSS output. Afterwards, the ‘Lavaan’ package in R 3.3.2 was used to conduct a CFA, of which the fit indices were evaluated using the commonly used cutoff scores proposed by Hu and Bentler (1999). After factorial validity had proven to be satisfactory, the complete pretest dataset (N = 302) was used to check the internal consistency of the scales. Separate Cronbach’s Alphas were calculated for each scale, at both the pre- and posttest.
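
Horn’s parallel analysis compares the eigenvalues of the observed correlation matrix against a chosen percentile of eigenvalues obtained from random data of the same dimensions. The following Python sketch illustrates that retention logic on simulated two-factor data; it is a simplified stand-in for the ‘Paramap’ routine, and the dataset is invented.

```python
import numpy as np

def parallel_analysis(data: np.ndarray, n_iter: int = 100,
                      percentile: float = 95, seed: int = 0) -> int:
    """Number of factors whose observed eigenvalues exceed the chosen
    percentile of eigenvalues from random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    real_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        rand = rng.standard_normal((n, p))
        rand_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(rand, rowvar=False)))[::-1]
    threshold = np.percentile(rand_eigs, percentile, axis=0)
    return int(np.sum(real_eig > threshold))

# Simulated responses driven by two latent factors (three items each, plus noise)
rng = np.random.default_rng(1)
factors = rng.standard_normal((300, 2))
noise = 0.4 * rng.standard_normal((300, 6))
data = np.hstack([np.tile(factors[:, :1], 3), np.tile(factors[:, 1:], 3)]) + noise
n_factors = parallel_analysis(data)  # 2: only two eigenvalues beat the random ones
```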

The scales were then used in the second part of the analysis to determine the impact of immersion in IBL, through a technology-enhanced inquiry environment, on student teachers’ educational beliefs. The hierarchical nature of the data, with students being nested in dyads and dyads within university colleges, was taken into account through the use of multilevel modeling. MLwiN 2.32 was used to estimate a model of the difference score for each scale, and to calculate estimates for student teachers’ scores at pre- and posttest.
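
Stripped of the multilevel machinery, the difference-score model asks whether the mean pre-to-post change differs from zero. The sketch below illustrates that core question with a simple one-sample t-statistic on invented scores; it deliberately ignores the dyad and college levels that the MLwiN model accounts for.

```python
import numpy as np

def difference_score_t(pre, post):
    """One-sample t-statistic testing whether mean(post - pre) differs from 0.
    A non-nested simplification of the study's multilevel difference-score model."""
    d = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1  # t-statistic and its degrees of freedom

# Invented pre/post scale scores for four student teachers
pre = [3.0, 3.5, 4.0, 3.0]
post = [4.0, 4.5, 5.0, 5.0]
t_stat, df = difference_score_t(pre, post)  # t = 5.0, df = 3
```

Because students sat in dyads within colleges, such a flat test would overstate the effective sample size, which is why the study modeled the nesting explicitly instead.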

In the third and final part of the analysis, the first author went through student teachers’ written comments and identified comments that (1) explicitly mentioned an increased understanding of history’s nature, (2) described positive changes in beliefs with regard to IBL, or (3) contained negative reactions to IBL in history. A second researcher then independently coded the data, in order to calculate inter-rater reliability. Cohen’s Kappa was calculated using the ‘irr’ package in R 3.3.2, and indicates good inter-rater reliability for all three codes: increased understanding of history’s nature (K = 0.81), positive changes in beliefs with regard to IBL (K = 0.75), and negative reactions to IBL in history (K = 0.86).
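
Cohen’s Kappa corrects the raters’ raw agreement for the agreement expected by chance alone. The following is a minimal Python sketch of that computation; the study used the ‘irr’ package in R, and the codes below are invented for illustration.

```python
def cohens_kappa(rater1, rater2):
    """Cohen's Kappa for two raters' categorical codes on the same items."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    labels = sorted(set(rater1) | set(rater2))
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n  # raw agreement
    expected = sum((rater1.count(lab) / n) * (rater2.count(lab) / n)
                   for lab in labels)                           # chance agreement
    return (observed - expected) / (1 - expected)

# Invented codes assigned by two raters to six student-teacher comments
coder1 = ["positive", "positive", "negative", "negative", "none", "none"]
coder2 = ["positive", "positive", "negative", "none", "none", "none"]
kappa = cohens_kappa(coder1, coder2)  # 0.75 for this toy example
```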

Results

In this section, the results concerning the factorial validity and internal consistency of the scales used to measure student teachers’ educational beliefs are examined first. Afterwards, the scales are used in quantitative analyses that determine the impact of immersion in IBL, through a technology-enhanced inquiry environment, on student teachers’ educational beliefs. The results are then further investigated through analyses of student teachers’ written comments about their experience with the inquiry learning environment.

Examining the scales

The exploratory factor analysis (EFA) of the scales started with determining the number of factors to retain (for more information on the selection of statistical techniques, see “Analyses” section). A scree plot (see Fig. 2) of the data points toward a 3-factor solution, as there is a clear inflection in the plot right after the third factor.

Fig. 2

A scree plot of the data points toward a 3-factor solution

A parallel analysis was then run to compare the data’s real eigenvalues to a set of randomly generated correlation matrices (N = 100, percentile of eigenvalues = 95). Table 1 presents an overview of the results, and shows that only the first three factors’ eigenvalues are larger than the corresponding random eigenvalues, thus confirming that a three-factor solution is the best fit for the data.

Table 1 Parallel analysis of the eigenvalues

As such, a three-factor solution was extracted during the EFA. Table 2 presents an overview of the factor loadings. This table shows that all items loaded as intended, with low cross-loadings on other factors. Thus, factorial validity appears to be good for all three scales, which were constructed to measure procedural knowledge goals (PKG), substantive knowledge goals (SKG), and self-efficacy for inquiry (SEI), respectively.

Table 2 Loadings of the three-factor solution

A CFA was then conducted to further examine this three-factor structure of the data. A plot of the CFA is presented in Fig. 3. An evaluation of the results based on the cutoff values (CFI and TLI ≥ 0.95, RMSEA ≤ 0.06, SRMR ≤ 0.08; for more information, see “Analyses” section) indicates a very good fit (CFI = 1, TLI = 1.03, RMSEA = 0 [0; 0.04], SRMR = 0.06). In other words, the results of the CFA further confirm the three-factor solution, as well as factorial validity.

Fig. 3

Plot of the confirmatory factor analysis

Finally, Table 3 presents an overview of the scales’ internal consistency, and indicates that, for each scale, Cronbach’s Alpha at both pre- and posttest ranges from acceptable to good. This table also presents each scale’s mean. It appears that, at the time of the pretest, student teachers already attributed relatively high values to both substantive and procedural knowledge goals, although substantive knowledge goals were clearly rated higher. Likewise, student teachers’ pretest scores for self-efficacy were also relatively high. At the posttest, there was an increase in each of the three scales’ average scores, which is further investigated in the following section.

Table 3 Overview of the scales, each ranging from 1 (very unimportant/untrue) to 6 (very important/true)

Impact of immersion in IBL on student teacher beliefs

A multivariate multilevel analysis was run to examine each of the three scales’ difference scores from pre- to posttest. The results are reported in Table 4. First of all, the significant intercepts for procedural knowledge goals (X2 = 56.65, df = 1, p < 0.001) and self-efficacy for inquiry (X2 = 23.45, df = 1, p < 0.001) indicate that the increase in scores is significantly different from 0, thus pointing toward a significant change in these beliefs from pre- to posttest. In contrast, student teacher beliefs about substantive knowledge goals did not change significantly from pre- to posttest (X2 = 1.03, df = 1, p = 0.31). Last, the model indicates that, for all scales together, there is significant variance on the student level (X2 = 246.19, df = 1, p < 0.001), but not on the dyad level (X2 = 1.46, df = 1, p = 0.23), or school level (X2 = 0.82, df = 1, p = 0.37).

Table 4 Multivariate multilevel model of difference scores (pre-post)

The multilevel estimates of the pre- and posttest scores, retrieved from multilevel growth curve models based on the scales’ scores (see Appendix 3), are presented in Fig. 4. Similar to before, the data indicate a significant increase for both procedural knowledge goals (X2 = 54.31, df = 1, p < 0.001) and self-efficacy for inquiry (X2 = 30.88, df = 1, p < 0.001), while the change is not significant for substantive knowledge goals (X2 = 1.18, df = 1, p = 0.28).

Fig. 4

Evolution in student teacher beliefs (***p < 0.001, SKG substantive knowledge goals, PKG procedural knowledge goals, SEI self-efficacy for inquiry)

Looking further into the impact of immersion in IBL

Although the analyses of the pre- and posttest scores paint a generally positive picture of the impact of immersion in IBL through the technology-enhanced inquiry environment, a qualitative analysis of the comments that student teachers gave during the posttest provides a more nuanced picture.

First of all, a large group of 65 students noted explicitly that immersion in inquiry-based learning had positively changed their opinion about IBL, with answers such as: “[This task has shown me that] students should form their own opinion or even take a stance [about topics in history]. It is a good idea to teach them how to conduct their own inquiries.”, or “Inquiry is very useful to show students that different sources may have other things to say about a certain topic”, or even “Now I realize why our professor always emphasizes the use of sources”. In addition, several students also appeared to have developed a belief that IBL is more likely to lead to learning, compared to more traditional approaches. For example, one student teacher noted: “The potential of inquiry and ICT became a lot clearer to me. Students learn not only textbook content, but also certain skills. It is also a fun approach.” Similarly, another wrote: “When we used to be in history class, we did not pay much attention. This, however, requires a lot of your time and effort.”

However, immersion in IBL through the technology-enhanced inquiry environment did not appear to have the same impact on all student teachers’ beliefs: the posttest answers of a group of 25 student teachers (about 8% of the sample) indicated that they were not in favor of organizing IBL in the classroom. To these student teachers, IBL appeared incompatible with their own, often content-oriented, conception of history education. For instance, one student teacher stated: “I think that students should mainly understand the historical roots of today’s society. Letting them conduct inquiries such as the one we did today, is not necessary.” Likewise, another student teacher argued that students mainly need to know, rather than experience, that history is interpretative: “School history should only concern itself with the major topics (in terms of time, space, and causality). Inquiries are not required, as long as the teacher explains that the content may not be entirely true”. In relation to this, some student teachers also seemed to assume that students would simply not be interested in IBL. To illustrate, one student teacher wrote: “You cannot have students conduct inquiries, because, in reality, only a few students are interested by history. I do believe that you can learn them that not everything should be trusted, but I would not go any further than that.” Likewise, another reported: “Personally, I think that a teacher who tells captivating stories is really important to history.”

Table 5 looks further into how the quantitative data of this subgroup of 25 student teachers compared to the rest of the sample. Their scores for beliefs about procedural knowledge goals were lower than those of the rest of the sample at both pre- and posttest. However, this difference was significant only at the posttest (χ2 = 7.98, df = 1, p = 0.005), not at the pretest (χ2 = 3.75, df = 1, p = 0.05).

Table 5 Multivariate multilevel model for the subgroup expressing negative reactions to IBL

Finally, the qualitative data also provide more information as to why immersion in IBL may have had a positive impact on student teachers’ self-efficacy for inquiry. In total, 109 student teachers explicitly stated that the experience had improved their understanding of how historical inquiry works. In several cases, student teachers stressed that they now had a better understanding of history’s interpretative nature. As one stated: “It has taught me that a historian’s opinion is actually important. I used to think that sources should be approached objectively to gather reliable facts. But if you do not form your own opinion, you cannot think critically about or evaluate certain sources.” Another response, stressing a different aspect of the interpretative work involved in history, read: “I used to believe that interpretation was not important in an analysis of sources. Now I realize that sources almost never provide a direct answer to a question, and that a source’s content can be interpreted in different ways.”

In relation to this, student teachers also seemed to have gained a better idea of the complexity of IBL in history. Responses included: “Now I realize that historical inquiry is more than just searching for sources and copying their contents. It is up to you to determine what you believe, by comparing as much information as possible” and “I noticed that there are often different points of view or versions. This has shown me that you have to question everything, and that the world is full of stories that may not entirely correspond to what really happened.”

Discussion and conclusion

The present study examined how immersion in IBL in history education influences student teachers’ beliefs about knowledge goals in history, as well as their self-efficacy for inquiry. During their work on the IBL-activity, student teachers collaborated in dyads to conduct their own inquiries within a technology-enhanced learning environment.

Even though some scholars have argued that immersion in IBL may fail to have an effect without a meta-commentary explaining its underlying ideas (Swennen et al. 2008), the findings of the present study show that even a relatively short professional development initiative that immerses student teachers in IBL can have a significant positive effect on their educational beliefs. After working in a technology-enhanced learning environment that immersed them in historical inquiry, student teachers attributed a significantly higher value to procedural knowledge goals (i.e. emphasizing the development of historical reasoning skills), and felt more capable of conducting historical inquiries. As could logically be expected, immersion in IBL did not affect student teachers’ beliefs regarding substantive knowledge goals (i.e. those focused on acquiring the content of history). These significant effects are particularly relevant to professional development initiatives, as beliefs about procedural knowledge goals and self-efficacy related to organizing IBL are both predictors of teachers’ implementation of IBL (Voet and De Wever in press).

However, the positive effect of immersion in IBL should still be interpreted with some caution, as the results also indicate that this approach may not be equally effective for every student teacher. In particular, immersion in IBL appears to have had little impact on the educational beliefs of a subgroup of about 8% of the student teachers, who often started the intervention with content-oriented conceptions of school history that were largely incompatible with IBL. A more reflective approach therefore appears to be required to alter the deeply rooted educational beliefs of this subgroup. According to previous research, this could be achieved through conceptual change strategies, which (1) help to make often implicit beliefs explicit, (2) reveal the inadequacy or disadvantages of those beliefs, and (3) help to integrate alternative and logically sound perspectives (Kagan 1992; Korthagen 2013). In other words, a meta-commentary that makes explicit the ideas underlying IBL does appear to be required for changing this particular subgroup of student teachers’ beliefs.

Finally, the results suggest that immersion in IBL not only positively influences student teachers’ beliefs, but may also contribute to a better understanding of how disciplinary knowledge is constructed. This finding thus provides further evidence for the common assumption that engagement in IBL in history is a vital means of learning about the nature of the discipline itself (Levy et al. 2013). Several student teachers stressed that the work in the technology-enhanced inquiry environment had improved their understanding of history’s interpretative nature. At first sight, this implies an evolution toward a more nuanced vision of history, which recognizes that history is inevitably constructed by historians. However, it is not yet clear whether this change could also result in what Maggioni et al. (2009) described as subjectivism, or a belief that all of history is merely an opinion. Additional research is therefore necessary to get a better picture of the impact of immersion in IBL on student teachers’ understanding of the nature of history.

In relation to this, another important limitation of the present study is the relatively short duration of the intervention. The question remains whether additional immersion in IBL would have further impacted student teachers’ educational beliefs, and whether it would have been able to alter the beliefs of the subgroup of student teachers reporting a negative view of IBL in history. Another question concerns the stability of the changes found in student teachers’ educational beliefs, seeing that previous research has shown that the reality of the classroom often has a negative impact on student teachers’ drive to try out innovative approaches such as IBL (Fehn and Koeppen 1998; Voet and De Wever 2017a). A longitudinal design therefore seems advisable for future research, as it would make it possible to answer both of these questions.

A third limitation is that, while the present study shows that immersion in IBL positively affects student teachers’ beliefs, it does not pinpoint the exact contribution of each of the intervention’s design principles to this effect. In order to shed more light on this issue, future research could compare different configurations of these design principles through (quasi-) experimental designs.

A final limitation is that the present study only focuses on student teachers’ educational beliefs, even though research has shown that these are not always in line with their practice (Fang 1996; Mansour 2013). It would thus be interesting for future research to also examine student teachers’ practice, for example during teaching internships.

Despite these limitations, the present study offers an important contribution to research on immersion in IBL in teacher education, as it shows that this approach may have a larger impact on student teachers’ beliefs than is sometimes assumed. At the same time, however, the study also shows that this impact may differ depending on student teachers’ initial conception of education, and the extent to which it allows room for IBL. The implications of these findings are discussed in the next section.

Implications

The findings of the present study hold several implications for professional development with regard to IBL, both in terms of practice and future research.

With regard to practice, the results first of all indicate that immersion in IBL is, in general, an effective approach for positively influencing teachers’ beliefs about IBL. In relation to this, the overview of the principles underlying this approach, as well as their translation into a technology-enhanced inquiry environment, can inform professional development initiatives on how to design activities for immersing student teachers in IBL. Furthermore, as the present study also indicates that immersion in IBL may not be effective for student teachers who hold conceptions that are largely incompatible with IBL, it seems advisable to first engage student teachers in a reflection on their educational beliefs. As mentioned above, an approach generally recommended by research on teacher education draws on conceptual change strategies, which aim to alter beliefs by making them explicit, pointing out their flaws, and offering logically sound alternatives.

With regard to future research, the limitations mentioned in the previous section point toward a need for more longitudinal research on the effects of immersion in IBL on student teachers’ educational beliefs, which also takes their actual classroom practice into account. This would make it possible to further examine most of the questions that the present study is unable to answer.