1 Introduction

Research indicates that teaching and learning analytics (TLA) can support teachers in their pedagogical decision making (Sergis and Sampson 2017). However, several issues need to be tackled before data and analytics use can become an everyday practice at schools. In this article we differentiate between data-driven and data-informed decision making: the former describes a process that starts from data, mostly (big) data collected and analyzed automatically, while the latter starts with an inquiry question that determines which data are needed to answer it; these data then have to be purposefully collected or extracted. Both processes have their advantages and drawbacks (large data sets vs scarce data from physical classrooms; data corresponding to the individual needs of the instructor; etc.).

Mandinach (2012) identifies two key components of data-driven decision making: technological tools to support the inquiry process, and human capacity (teacher data literacy). While educational data mining and data collection from online learning platforms have gathered momentum, developments on the human side have been slower. Teachers' data use faces several challenges (Marsh and Farrell 2014). Kaufman et al. (2014) and Mandinach and Jimerson (2016) list, among others: continuous learning (data-use related knowledge), data skills (including technology-enhanced data collection), data use becoming an integrated component of an educator's work, and sustainability of impact (enhanced through continuous support, collaboration and data teaming). Mandinach (2012) also points out the possibly low quality of educational data that teachers can collect manually. It is therefore essential to provide teachers with more technological support for systematic teacher-led data collection (from face-to-face, hybrid and online classrooms), as well as guidance on the interpretation of such data in teacher inquiries, which the present study focuses on.

It is clear that data-informed decision making depends not only on teachers' readiness to collect and use data to transform instruction but also requires certain skills and competencies. Jimerson and Wayman (2015) outline six: (1) asking the right questions; (2) integrating data use with curriculum, instruction, and assessment; (3) analyzing and interpreting data; (4) linking data to classroom practice; (5) computer skills; and (6) collaborating around data. Brown et al. (2017), in their conceptual model, combine data use and teacher inquiry and clearly indicate that teachers need scaffolding both in the research aspect of their inquiries and in working with data. Teachers' awareness of the technological tools that could be used for purposeful data collection also seems to be limited. Ebbeler et al. (2016) likewise emphasize that teachers need additional support to be able to collect and use data from their classrooms.

The aim of the current study is to explore how 12 Estonian teachers perceive their own technology use and teacher inquiry (TI) practices, and how far these perceptions correspond to reality. It also examines how well their data-analysis skills meet their actual needs and what challenges they encounter within a technology-enhanced TI process. Finally, the study explores how teachers fall into different groups based on their skills and concerns connected with technology-enhanced TI.

2 Related Work

Teacher inquiry (TI) has been identified as a powerful tool for teacher professional development and the continuous improvement of teaching and learning (Mandinach and Schildkamp 2021). However, large-scale adoption of systematic evidence-informed TI has not become a reality despite multiple efforts to offer TI models (Hansen and Wasson 2016; Sergis and Sampson 2017) that would assist practitioners in the process. Specific barriers, especially related to teachers' data literacy competences, have been found to deter teachers from engaging with inquiry to improve their teaching practice. To alleviate these barriers and support teacher inquiry, versatile dashboards, systems and devices have been developed; however, in addition to sophisticated technologies and appropriate data, teachers must have some level of data literacy to use data effectively and responsibly (Mandinach and Schildkamp 2021).

Mandinach and Gummer (2016a) define teachers' data literacy as 'the ability to transform information into actionable instructional knowledge and practices by collecting, analyzing, and interpreting all types of data (assessment, school climate, behavioral, snapshot, longitudinal, moment-to-moment, etc.) to help determine instructional steps' (p. 14). They propose a conceptual framework for data literacy for teachers, consisting of five main steps: i) framing the question (articulating a problem and understanding the context), ii) using data (of different types and from different sources, also understanding data accuracy and using technologies to support data use), iii) transforming data into information (generating hypothetical connections to instruction, understanding how to interpret data), iv) transforming information into a decision (diagnosing what students need, making instructional adjustments), and v) evaluating outcomes (re-examining the original question, considering the need for iterative decision cycles, monitoring changes in student performance, etc.) (Mandinach and Gummer 2016b).

As to decision making, Light et al. (2005) present a framework linking data, information and knowledge to demonstrate how raw data are made meaningful by relating them to the context at hand. Wise and Jung (2019) propose a Situated Model of Instructional Decision-Making, which also divides the decision-making process into two parts: the sense-making step includes reading the data to get oriented, finding relative reference points, and explaining patterns; the pedagogical response involves either taking action, waiting to see, or reflecting on pedagogy, and checking impact. However, little research can be found on how teachers actually follow these steps in their decision-making processes and how they apply their knowledge of the classroom setting to the collected data. One good example of instructors' use of analytics (at university level) can be found in Li et al. (2022), who investigated teachers' interpretations of analytics.

Although educator experience and professional judgment are important factors in teachers' decision making, for evidence-based decision making they must be used in conjunction with data. The main task is fitting the pieces of the puzzle together to gain a more holistic understanding of the context and inform practice going forward. Research also finds that, to facilitate and optimize students' learning processes and to consider learners' individual needs, effective data use requires multiple sources of qualitative as well as quantitative data, going beyond performance data (Lai and Schildkamp 2013; Mandinach and Gummer 2016a). As a next step, students should also be involved more often in the process of data use to enhance ownership, student learning, and student achievement (Mandinach and Schildkamp 2021).

The Analytics Model for Teacher Inquiry (AMTI) (Saar et al., in press) is an attempt to synthesize TI and data use so as to provide practitioners with concrete examples and explanations of what to pay attention to in each step of the TI process. The data sense-making and interpretation steps in the model guide teachers to look for patterns in the data and then apply pedagogical knowledge to the extracted information to gain knowledge about the inquiry topic and inform decision making. The model also emphasizes the link between data types and the inquiry question, and the choice of suitable technological options for the data-collection process.

It could be argued that teachers' modest technological competences are another reason for the low adoption of data-informed inquiries. Estonia, for example, is considered an e-country: Estonian schools have good access to the internet and are fairly well equipped with computers, not to mention students' own devices. It could therefore be assumed that teachers at Estonian schools take advantage of technology in the teaching and learning process. What the situation is in reality, and whether teachers also research the impact of technology use in their classrooms, has so far remained unclear.

Moreover, most research into teachers' use of data focuses on automatically generated data from learning management systems or various dashboards (Dawson et al. 2019). More specifically, the majority of studies use learners' online learning data to help teachers provide feedback or guidance. However, such data often may not correspond to teachers' actual needs. To inform technology development, more research is needed to explore what data teachers would actually collect from their classrooms, how they would like the data to be analyzed and presented to them, and what insights they expect from these analytics. Teacher-led data collection from teachers' own classrooms, however, is often limited to one data source at a time (e.g. assessment data or a student survey) and is very time-consuming and often difficult to analyze (due to the lack of suitable data-analytics tools for multi-modal data).

In the light of the problems outlined above, the current study explores teachers' sense-making and interpretation of data they collected during their own teacher inquiry interventions, and how teachers' inquiry skills correspond to their perceived technological and technological pedagogical knowledge. To understand teachers' thought processes in TI, the steps in the AMTI were followed to document the inquiry processes of 12 teachers (from setting the goals to decision making). Their use of technology in the process was also explored and their concern rates for TI calculated. The research questions defined for the study are:

  1) To what extent do teachers' perceptions of their pedagogical, technological and TI skills correspond to their use of technology in the process of TI?

  2) How do teachers make sense of and interpret the data collected during TI?

  3) How does teachers' implementation of TI relate to their concern levels about TI?

The contribution of our work lies in i) findings about technology-enhanced teacher inquiry, based on 12 iterative teacher action research cases; and ii) establishing 'level groups' of teachers involved in TI, with the aim of demonstrating their different needs for assistance in TI.

3 Methodology

The paper describes a multiple case study design (Yin 1981), carried out within the Erasmus+ Illumine project. Estonian school teachers, voluntarily participating in the project, conducted teacher inquiry action research (12 case studies for our research) over a whole school year. The Illumine project aimed at teacher professional development - introducing evidence-based practices into teaching and promoting TI - as well as at researching teachers' skills in using technology for teaching and TI. To this end, it provided monthly 2-h online workshops for teachers to introduce Science of Learning (SoL) strategies (Beardsley 2020) and technologies for applying these strategies. The aim was not just to try out evidence-based teaching strategies but also to conduct TI and assess the impact of these strategies on students' learning. Teachers were supported through all the steps of their TI, their technological and TI knowledge and skills were studied, and they were involved in co-designing materials for other teachers who might be interested in SoL and TI.

All case studies followed a similar TI routine; however, the teachers could pick, from among different research-based teaching strategies, an intervention they wanted to adapt or apply in their own teaching. The teachers worked in teams of 2–4 people to plan their interventions and implemented these either alone (at different schools) or in pairs (when they worked at the same school). Teachers were asked to document their TI in research lesson diaries based on the AMTI (Saar et al., in press), which outlines eight TI steps (motivation for and purpose of the inquiry, inquiry question(s), data needs, data collection tools, sense-making and interpretation of the data, and decision making based on the collected and analyzed data) and provides explanations and examples for teachers.

After individually analyzing their collected data, the teachers shared their experience in groups and received peer feedback. Most of the time, however, the feedback sessions consisted of 2–3 teachers sharing their practice, while others had not had time to analyze their data and could not present. One reason for the incomplete data analyses was the pandemic-time restrictions and absenteeism at schools, which greatly limited the possibilities for classroom data collection. Listening to others' presentations nevertheless inspired the teachers to continue with their own analyses and complete the intervention for the next workshop. Between the workshops, several individual online support sessions were carried out with participants who required assistance or encouragement.

Sample:

Twelve Estonian school teachers (Table 1) who participated in the Illumine project agreed to take part in the study and signed a written consent form. It was also agreed that the data collected from their own classrooms would be used only for their own learning - to make sense of the new strategies applied in their teaching. Any data shared in the workshops and in this study was to be anonymized.

Table 1. The sample of the Illumine participants involved in the study.

Data Collection:

First, to find out about teachers' perceptions of their technological and TI knowledge, an online survey was carried out based on the technological pedagogical content knowledge (TPACK) questionnaire (Schmidt et al. 2009). In addition to the 7 × 4 Likert-scale (1–5) questions from TPACK, four questions about teachers' knowledge of the SoL strategies and four questions about their understanding of TI were included in the survey. However, as the present study focuses not on teachers' content knowledge but on the application of TI with the help of technology, our data analysis centers on the pedagogical, technological and TI aspects of the survey only.

The survey was carried out prior to the workshops, where teachers covered different SoL strategies and familiarized themselves with the steps of TI and with technology use for teaching and data collection. The pedagogical knowledge (PK), technological knowledge (TK) and technological pedagogical knowledge (TPK) mean scores (5-point Likert scale) were calculated based on the corresponding TPACK questions (Schmidt et al. 2009). Four additional questions about TI were added: 'With the help of technology, I consistently collect and analyze data about my students'; 'Last month, using technology, I collected and analyzed data about my own teaching'; 'I have sufficient pedagogical knowledge to make sense of and interpret these data'; 'I adapt my teaching decisions based on the evidence obtained'.
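To illustrate this scoring step, the sketch below computes such subscale means from 5-point Likert responses. It is a minimal sketch: the item keys, groupings and answers are invented for the example and do not reproduce the actual survey items or data.

```python
# A minimal sketch of the subscale scoring, assuming 5-point Likert responses.
# Item keys, groupings and answers below are invented for illustration; they
# are not the actual TPACK/Illumine survey items or data.

def subscale_mean(responses: dict[str, int], items: list[str]) -> float:
    """Average the 1-5 Likert responses over the items of one subscale."""
    return sum(responses[item] for item in items) / len(items)

# Hypothetical item groupings (TPACK uses several items per construct).
SUBSCALES = {
    "PK":  ["pk1", "pk2", "pk3"],
    "TK":  ["tk1", "tk2", "tk3"],
    "TPK": ["tpk1", "tpk2"],
    "TI":  ["ti1", "ti2", "ti3", "ti4"],  # the four added TI items
}

teacher_responses = {  # one teacher's invented answers (1-5 each)
    "pk1": 4, "pk2": 5, "pk3": 4,
    "tk1": 3, "tk2": 2, "tk3": 3,
    "tpk1": 4, "tpk2": 3,
    "ti1": 2, "ti2": 3, "ti3": 4, "ti4": 4,
}

scores = {name: subscale_mean(teacher_responses, items)
          for name, items in SUBSCALES.items()}
print(scores)  # {'PK': 4.33..., 'TK': 2.67..., 'TPK': 3.5, 'TI': 3.25}
```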

To investigate teachers' actual use of technology in TI and to identify any challenges in the TI process, the participants' research lesson diaries were collected. All in all, 18 research diaries covering the first and second interventions were submitted by the 12 teachers. To ensure a clear understanding of all the entries in the diaries, interviews were conducted with the participating teachers after the first TI intervention, in which the teachers were asked to explain their TI steps using their research lesson diaries. Field notes from the interviews were added to the diaries by the researcher.

Finally, teachers' concerns about TI were measured using the Stages of Concern questionnaire from the Concerns-Based Adoption Model (CBAM) (www.air.org), initially developed by Hall (1991) to assess teachers' concerns, as change facilitators, about innovations in teaching. The twelve teachers filled out the 35 Likert-scale items after their second intervention (7 months after they had started TI in the Illumine project).

Data Analysis:

For data analysis, the present study employed mixed methods: descriptive statistics from the survey were compared with the results of the qualitative analysis of the research lesson diaries and the field notes from the interviews. The application of technology for teaching and for data collection, as well as the data types collected, were counted separately. Finally, the results from the teachers' research lesson diaries were compared with their concern stages from the CBAM. The four phases of the data analysis were:

Phase 1: Content analysis (Schreier 2012) was applied to the 18 research lesson diaries (with field notes), based on the eight TI steps of the AMTI (Saar et al., in press). The initial coding frame was tested on the results of the first intervention and re-coded within a month for reliability. No changes were made to the coding frame when coding the second intervention results. A comparison between the relevant data of the first and second iterations was also conducted to detect possible changes in the teachers' TI practice.

Phase 2: As the focus of this study was on the data interpretation steps of TI, the relevant sections of the research lesson diaries of all 18 interventions were analyzed separately using inductive thematic analysis (Braun and Clarke 2006) to identify salient emergent themes (within these TI steps) with robust support. These were compared with the constructs of the AMTI to confirm or adapt the data-analysis steps proposed in it and to identify any new steps that might emerge as necessary.

Phase 3: The descriptive statistics about each participant's perceived pedagogical, technological and TI knowledge were then compared with their actual use of technology and TI, as apparent from the research lesson diaries (Table 2). Both successes and failures in the use of technology and TI were detected, which led us to the understanding that teachers need different kinds of assistance in TI, depending on their concerns.

Phase 4: The concern profile scores for each teacher were calculated using the methodology suggested in Hall (1991). The intensity of a teacher's concern is indicated by the percentile score (the higher the score, the more intense the concern). The 'peaks and valleys' in the percentiles were then identified to interpret the scores based on the Stages of Concern (CBAM): Unconcerned (0) indicates that the innovation (in this study, TI) is not an area of intense concern for the teacher, as their attention is focused elsewhere at the moment. Information (1) shows interest in learning more about the innovation. The Personal (2) stage is linked to the teacher's ability and role in facilitating TI (doubts, lack of confidence). Management (3) involves the time, resources and energy necessary to facilitate TI. Consequence (4) concerns the impact of TI on students. Collaboration (5) indicates teachers' willingness to involve others in TI and coordinate TI facilitation. Refocus (6) expresses teachers' readiness and determination to promote and develop TI.

These data are presented in two ways: data about each individual teacher (Fig. 1) and data about each concern stage (Fig. 2). As the scores are not absolute but relative to the other stage scores in each profile, they do not indicate differences between the concern levels of different individuals but rather the relative intensity of each concern for a particular teacher. Therefore, the shape of the profile is more meaningful than how high or low a score falls on the graph.
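To make the profile logic concrete, the sketch below converts raw stage scores to percentiles via a lookup table and picks the peak stage. It is only an illustration: the norm values and raw scores are invented placeholders, not the published CBAM percentile tables or our measured data.

```python
# A rough sketch of the Stage-of-Concern profile logic: raw stage scores are
# converted to percentiles via a norm table, and the 'peak' stage (the most
# intense concern) is the stage with the highest percentile. The norm values
# and raw scores below are invented placeholders, not the published CBAM norms.

STAGES = ["Unconcerned", "Information", "Personal", "Management",
          "Consequence", "Collaboration", "Refocus"]

# Hypothetical norm table: ascending (raw-score threshold, percentile) pairs.
NORMS = {stage: [(5, 10), (10, 30), (15, 50), (20, 70), (25, 90), (35, 99)]
         for stage in STAGES}

def to_percentile(stage: str, raw: int) -> int:
    """Return the percentile for the highest threshold the raw score meets."""
    pct = 0
    for threshold, percentile in NORMS[stage]:
        if raw >= threshold:
            pct = percentile
    return pct

def concern_profile(raw_scores: dict[str, int]) -> dict[str, int]:
    return {s: to_percentile(s, raw_scores[s]) for s in STAGES}

def peak_stage(profile: dict[str, int]) -> str:
    """The stage with the highest percentile: the dominant concern."""
    return max(profile, key=profile.get)

raw = {"Unconcerned": 12, "Information": 18, "Personal": 28, "Management": 9,
       "Consequence": 22, "Collaboration": 14, "Refocus": 36}
profile = concern_profile(raw)
print(profile, "-> peak:", peak_stage(profile))  # peak: Refocus
```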

4 Results and Discussion

Table 2 provides the main findings from the TPACK survey and the research lesson diaries of the interventions. The average scores for PK, TK, TPK and TI reveal that five participating teachers rated their technological competences even higher than their pedagogical or teacher inquiry competences, although the difference is small. However, the mean score for the perceived teacher inquiry competences of three teachers was three or below. Does this mean that Estonian teachers are rather good at using technology but not so confident in teacher inquiry?

Table 2. An overview of the results from the survey and research lesson diaries.

Interestingly, two of the teachers who perceived their TK to be very high (4.75–5 points out of 5) did not use technology in their first iteration at all. Of the other high-TK teachers, one used technology for teaching but not for data collection, and only one used technology both for teaching (quizzes) and for data collection (quizzes and a survey). However, all the other teachers applied technology for both teaching and data collection, even when their TK score was between 2.7 and 3.2. Moreover, all teachers, regardless of their TK score, used technology for data collection in their second iteration. This implies that even teachers who do not feel confident with technology can still find technology suitable for their needs and can benefit from technology-enhanced TI. Admittedly, some teachers struggled with using digital technologies that were new to them and required assistance, but no one gave up. The gap between perceived TK and actual technology use is in line with the results of the Schmid et al. (2020) study, where some (STEM pre-service) teachers who reported higher TK also integrated technology more in their lesson plans, while for other teachers (language and social studies) TPACK profiles were unrelated to technology use in lesson plans. The teachers participating in our study explained that pen and paper was often simply more convenient to use (e.g. when access to digital devices required prior arrangement).

Things were more difficult with teacher inquiry. One teacher skipped the first iteration of the project and learned from others' experience, although she participated in the planning phase. For the others, the first apparent challenges emerged in wording the inquiry questions: during the first intervention, only five teachers managed to develop inquiry questions for their intervention. Six failed in their initial attempt but were able to adjust the wording when guided to do so, and had no problems with the inquiry questions for the second iteration. The main issues with the inquiry questions (apparent from the content analysis) were: i) the wording of the question (e.g. 'How many words can be remembered?' when actually researching how free recall can help retain vocabulary in long-term memory) and ii) matching the inquiry question with the data to be collected, e.g. when a teacher asked about student engagement but did not collect any data about it (just assessment data). Similar difficulties were pointed out by Luckin et al. (2016), who noticed teachers experiencing difficulties in formulating sufficiently narrow questions for their inquiries.

Classroom data collection itself did not pose many problems, apart from some obstacles with technology use, which were overcome. However, the data analysis step proved to be a challenge, as the teachers often did not find the time or energy for it. Six relevant data analyses of the first iteration were initially submitted (only one of them was really thorough, which inspired others to try as well), but five teachers did not manage to complete theirs within several months. Nevertheless, most teachers became inspired and more confident about data analysis by the second intervention, when seven teachers analyzed their data (although it took 5–6 h for some, which was considered too long) and five teachers finished the analysis of their first intervention.

In the end, the results demonstrate clear improvement in wording the inquiry questions: in their second interventions, all nine teachers who attempted one were able to formulate a clear and measurable inquiry question. Although assessment data still prevailed, more surveys were carried out to get feedback from students about their perception of the class and the material covered. An apparent change could also be detected in the use of technology for data collection (no pen and paper during the second intervention). This could be due to the teachers' inquiry topics (four of them concentrated on reframing their students' academic stress mindset and therefore surveyed the students). It may also have resulted from learning and collaboration: as the teachers shared their initial practices and planned the interventions in groups, following the steps of the AMTI, where technology use for data collection is suggested, their intervention plans for the second iteration were more detailed.

The content analysis of the research lesson diaries was aligned with the eight steps of the AMTI: first the teachers' motivation and purpose, then the inquiry question, data needs and technology for data collection, and finally making sense of and interpreting the collected data for evidence-based decision making. The diaries and field notes reveal the following opinions and skills:

For motivation, three categories were identified: the main driving forces for teachers to try out novel things and carry out teacher inquiry seem to be curiosity ('trying out new things and analyzing the outcomes'), willingness to improve and become more efficient ('better results with minimal effort'), and the desire to provide students with a better experience. As to their purpose, teachers mainly wanted to find out whether students' motivation and results would improve ('better retention') and whether 'my gut feeling and the theory about the teaching strategies match'. Initially there were some issues with the inquiry questions, which were discussed with the teachers during individual interviews to help them improve the questions. This yielded good results, as the inquiry questions for the teachers' second interventions were all clear and measurable. Deriving from the strategies used in the Illumine project, the teachers' inquiry questions could be divided into questions about retention of the material covered, learning from mistakes, overcoming fear (of testing or public speaking), engagement, and motivation.

In our study, teachers initially collected mostly assessment data, which is consistent with research findings about teachers' data use (Mandinach and Schildkamp 2021). However, the interviews also provided opportunities to explain which data sources could better suit a teacher's data needs for answering their inquiry questions. Assessment seems to be such an important aspect of school that all teachers in the second intervention still used assessment data. The only change that could be detected was that, in addition to assessment data, more teachers (5 out of 9, compared with 4 out of 11 in the first intervention) used several data sources (e.g. also survey data and observation). It also appeared that the teachers, though confident in their technological knowledge, did not always use technology when they might have. Some explained this by the distance learning imposed by the coronavirus pandemic (too much screen time already), others by the 'convenience' of using pen and paper in the classroom (though manually collected data were rather difficult to analyze). However, as pointed out by Schmid et al. (2020), the mere use of technology does not in itself indicate quality; rather, it should be aligned with the purpose of the lesson. So teachers' use of technology depends on many aspects other than their TK and TPK alone.

Although initially four themes (patterns, explanations, conclusions and suggestions) emerged from the thematic analysis of the sense-making and interpretation step descriptions in the research lesson diaries, they were eventually regrouped into two themes: patterns and reference points in the data (results rising or falling, vocabulary retained longer), and possible explanations based on pedagogical knowledge and experience ('because self-tests worked as distributed practice', 'when "common denominators" were used these helped to retain information', 'recall activities when assigned as homework were often skipped'). This is consistent with the Data Literacy for Teachers framework (Mandinach and Gummer 2016a) and the AMTI (Saar et al., in press), which emphasize a multi-step approach to data analysis (from 'data' to 'information' to 'knowledge'). In earlier research (Saar et al. 2022), teachers have been observed 'jumping from data straight to conclusions', which might lead to misinterpretation of the data. Using the AMTI as a guide might have helped the teachers in this study avoid that trap.

The sense-making step in TI revealed that teachers are good at noticing student progress and recurring mistakes. All teachers noticed some improvement in their students' results or perceived higher engagement or motivation, although they doubted whether these outcomes resulted from the new strategies used or from other factors interfering with the intervention (e.g. Covid-time absenteeism hindered the comparison of student results; excitement about participating in an experiment made students more willing to succeed; etc.).

When interpreting the collected data, the teachers could link their pedagogical knowledge to the data and explain the results. However, analyzing data from multiple sources (e.g. peer assessment of public speaking and a student survey about stress levels) was considered too time-consuming (taking 5–6 h), and teachers expressed the need for technology to assist with this. The teachers also admitted that there were too many variables in the classroom and that the results (e.g. student progress) might not always be due to the teaching strategy applied but could derive from other activities (e.g. 'after getting bad self-test results a student might have changed to a different learning strategy than the one researched'). Teachers also repeatedly noted that data collection became difficult as students were absent from classes at different times, which also hindered interpretation. It seemed to the teachers that their students became more involved; however, this might have been due to the novelty of the new strategy (in the teachers' opinion), and more time would be needed to investigate it.

Decision making - the collected data helped the teachers reflect on student progress and adapt their own teaching methods and learning tasks. Teachers also reported being more aware of their own teaching and feeling satisfied with good results. Sometimes, however, the decisions the teachers made did not relate to their inquiry question (e.g. when researching how free recall helps retain new vocabulary longer, one teacher decided to increase the number of words to be memorized).

In general, the results from both interventions reveal that teachers are good at finding patterns in the data: this step did not pose any problems in the research lesson diaries of the 18 interventions. All 18 included assessment data, which teachers are accustomed to using. In addition, changes in perceived stress level (n = 4), vocabulary retention (n = 9), recurring mistakes (n = 3) and student engagement (n = 3) were explored, although not always detected. As to reference points, all teachers used some assessment criteria and comparisons between students.
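As a minimal illustration of the kind of tallying behind such counts, the sketch below aggregates technology use and data types across diary records; the entries are invented examples, not the study data.

```python
from collections import Counter

# A minimal sketch of the tallying, assuming each diary is reduced to whether
# technology was used for teaching / data collection and which data types were
# collected. The three entries below are invented examples, not the study data.

diaries = [
    {"tech_teaching": True,  "tech_collection": True,  "data": ["assessment", "survey"]},
    {"tech_teaching": True,  "tech_collection": False, "data": ["assessment"]},
    {"tech_teaching": False, "tech_collection": False, "data": ["observation"]},
]

tech_use = Counter()
data_types = Counter()
for diary in diaries:
    tech_use["teaching"] += diary["tech_teaching"]      # True counts as 1
    tech_use["collection"] += diary["tech_collection"]
    data_types.update(diary["data"])

print(tech_use)    # Counter({'teaching': 2, 'collection': 1})
print(data_types)  # Counter({'assessment': 2, 'survey': 1, 'observation': 1})
```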

Usually, the teachers could come up with possible reasons for the results (deriving from their pedagogical knowledge). For example, they concluded that frequent (unassessed) testing improved results, probably because it involved distributed free recall. They were also quite surprised that students' stress mindset could be reframed just by watching a video about 'stress enhancing performance'. However, the teachers noticed that this was true mostly of younger students, and concluded that older students were already familiar with the concept of 'enhancing stress'.

The post-workshop discussions also revealed that the teachers gained more confidence to carry on with evidence-based practice and to try out different research-based teaching strategies. The only concern was that the analysis of data from multiple sources (e.g. assessment and survey data) took too much time (5–6 h), which is why some of the teachers did not find it possible to implement a second intervention. However, they were willing to continue with evidence-based practice if provided with technology that could ease the data-analysis step.

The main takeaways about teacher inquiry from the participating teachers were that:

  • teacher inquiry should be extended over a longer period (at least several months);

  • in classes with fewer students it is complicated to carry out an inquiry involving several sessions (students are absent at different times);

  • it takes time to adjust to any new strategy (so data from the first time a new strategy is used might not provide a truthful picture);

  • during an intervention, both the teacher and the students should use familiar technology to avoid technical setbacks (which might obscure the effect of the researched teaching strategy);

  • teachers require a better overview of technologies that could assist in data collection and analysis;

  • teachers need tips on how to make data analysis less time-consuming and how to present the analysis;

  • TI also directly impacts students - they were excited to participate in a teacher inquiry and showed higher motivation to progress.

Finally, to better understand teachers' concerns with their teacher inquiry process, we also administered the CBAM questionnaire. The results (Fig. 1) show that refocus and personal concerns are relatively high among all teachers: seven teachers have their highest concern (relative to their other concerns) at the refocus stage of TI (an expected result, as they all voluntarily participated in the Illumine project, which focused on SoL and TI). The personal stage received the highest or second-highest concern for 11 teachers, indicating concern about their own skills and role in TI, i.e. a high willingness to improve and develop personal TI skills. The concern of six teachers (T1, T4, T6, T9, T10, T11) also lies with the impact of TI on their students.

Four teachers (T3, T10, T11, T12) have their highest scores at the unconcerned stage, which indicates that their priorities lie elsewhere at the moment. The information, management and collaboration stages were not the highest concern for any teacher, and the consequence stage received the highest score from only one teacher (T11). As to the lowest concerns, six teachers demonstrate low concern for (time) management (indicating that TI is less time-consuming than they had initially predicted) and five are not concerned about collaboration and involving others (this being their lowest concern).

Fig. 1. The concern levels of teachers applying TI (by concerns).

To identify groups of teachers with different concerns about TI, we should look at the highs of each teacher in Fig. 2. From this it becomes apparent that T2 and T7 could be described as having reached stage 5 or 6, as their main concerns are linked to collaboration and refocusing. Teachers 1, 6 and 9 fall into stages 4 or 5 (their concerns being mostly linked to the consequences of TI for their students, as well as collaboration). Teachers 4, 10 and 11 represent stage 4 (consequence) and teachers 3 and 5 stage 2 (personal), while T8 is somewhere between stages 1 and 2 (needing more information and concerned about their own effectiveness in TI).

Fig. 2. The concern levels of teachers applying TI (by teachers).

When we compare the CBAM results with the teachers' success in applying meaning to the data collected during their TI interventions, the following trends become apparent (Table 3) (TI1 and TI2 stand for the scores of the TI interpretation process from Table 2):

Table 3. The concern levels of teachers compared with their TI process outcomes (see Table 2).

Teachers 1, 2 and 9 had their highest concerns linked to stages 4–6, and they also managed well with their TI and with technology use both for data collection and for data analysis. We would therefore group these three teachers as experts, who have acquired the necessary skills for TI and could focus on instructing and helping others. In our understanding, they can carry out TI on their own and have also demonstrated interest in continuing with TI topics.

Although the concern levels of two teachers (T6 and T7) would place them in the experts' group, their own TI still needs considerable attention (one could not work with the data thoroughly and the other could only partially interpret the data). We would therefore group them, together with T4, T10 and T11, as independent users, who would still need some assistance in TI (even if it were just finding time for proper data analysis). These five teachers have a good command of TI skills; however, either their disposition towards TI (the benefits do not yet outweigh the efforts), their confidence in their own (digital and technological) skills, or other responsibilities (not enough time for data analysis, which, as seen in the project, takes a long time) significantly hinder their TI adoption. Handy tools for data collection and analysis would, hopefully, help them manage TI far more efficiently.

The third group (explorers) consists of teachers (T3, T5, T8, T12) who are focused more on the development of their own TI skills and would probably need different assistance with TI than the teachers in the previous groups. One assumption is that if the data-literacy competencies of these teachers improved, or if they could use supportive technologies, they might fare far better with TI. However, this is a question for further research and would need thorough investigation.
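To make the grouping rule concrete, the following much-simplified sketch combines a teacher's peak concern stage with a binary flag for successful data interpretation; the threshold and the example profiles are illustrative assumptions, as the actual grouping also drew on qualitative evidence from the diaries.

```python
# A much-simplified sketch of the grouping heuristic: combine a teacher's peak
# concern stage (0-6) with a binary flag for successful data interpretation in
# the interventions. The threshold and the example profiles are illustrative
# assumptions; the actual grouping also drew on qualitative diary evidence.

def support_group(peak_stage: int, ti_success: bool) -> str:
    """Assign a support group from the peak CBAM stage and TI success."""
    if peak_stage >= 4 and ti_success:
        return "expert"            # e.g. T1, T2, T9 in this study
    if peak_stage >= 4:
        return "independent user"  # high-stage concerns, own TI needs attention
    return "explorer"              # focused on developing own TI skills

# Illustrative (invented) peak stages and success flags:
examples = {"T2": (6, True), "T6": (5, False), "T8": (1, False)}
for teacher, (stage, ok) in examples.items():
    print(teacher, "->", support_group(stage, ok))
# T2 -> expert, T6 -> independent user, T8 -> explorer
```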

5 Conclusions, Limitations, Further Research

The study investigated the match between teachers' perceived technological and teacher inquiry knowledge and explored teachers' actual application of TI and of technology within it. Twelve Estonian teachers who participated in the Illumine project carried out 18 teacher inquiries, testing teaching strategies that were novel to them. Despite some initial setbacks with inquiry questions, technology use, and time-consuming data analysis, all 12 claimed to have improved their TI skills, and they used technology for both instruction and data collection (though not always for analysis).

In general, the teachers understood the steps of their TI well, although documenting their thoughts was said to be challenging. Their own comments about their TI were enthusiastic: they were proud of their achievement (being able to carry out and document a teacher inquiry) and found some of the outcomes rather surprising. For example, few had believed that it would be possible to reframe students' mindset about academic stress just by having them watch one video. The teachers agreed that this strategy does not work with all students, but even helping some students was seen as a good achievement.

All in all, the main problem areas in teacher inquiry seem to be connected with wording the research question and matching it with the data to be collected; in addition, more information and skills are needed to use technology that would assist teachers in data collection and analysis. Another difficulty lies in data interpretation, i.e. applying pedagogical knowledge to the collected data, especially if the data come from different sources and concern different aspects of learning (e.g. engagement and results). This indicates the need for tools that could help with the analysis of multi-modal data. Based on their TI process and CBAM results, we divided the participating teachers into three broad groups according to their TI concerns: experts, who can work on their own; independent users, who still need assistance, especially in data analysis; and explorers, who are taking their first steps and need more guidance than the previous groups.

In conclusion, it can be said that even when teachers are highly motivated to engage in TI, they do not have enough skills and knowledge (or time) and therefore require assistance. Data teams and other forms of collaboration among teachers were seen as really helpful in this matter: 'it was good to discuss the results in a group - otherwise you just sit alone on top of the data'. Additional observations by the researcher are:

  • for some teachers it takes longer to get used to the TI process: some need a 'preparation time' - to observe others before engaging in an inquiry themselves - and for others it takes months to analyze their data;

  • the main progress could be detected in wording the inquiry question and matching the data types with it;

  • teachers are good at using technology for instruction but are not used to applying it for data collection and analysis;

  • data analysis is currently too time-consuming, especially when the data come from different sources and need to be compared;

  • the technological knowledge items in the TPACK survey lack questions about technological knowledge for data collection and analysis.

The current study also has some limitations. First, all the teachers involved had volunteered to participate in a TI project; they were thus all similarly interested in developing their TI skills, and a study with a different set of teachers would probably yield different results. Also, as inter-rater reliability could not be calculated, the reliability of the codes was checked within a month by the same researcher. The internal validity of the study is supported by using a model (the AMTI) for documenting TI and assisting teachers in carrying it out; however, there was no time to confirm the external validity, which needs to be done in the future.

Lessons learned from the study that answer our RQs:

  1. Teachers' perceptions of their pedagogical, technological and TI skills do not always correspond to their use of technology in the process of TI. However, this has more to do with teachers' decision not to overuse technology than with their competences.

  2. Teachers use technology for instruction much more than for data collection (or analysis), probably because they are not yet used to the idea and because the use of digital tools often requires some pre-arrangement (as not all classrooms are equipped with them).

  3. Teachers' TI skills improved during the seven months of participation in the TI project. The main improvement was in wording the inquiry questions, but also in working with data. The main obstacle is still the lack of time for a thorough analysis.

  4. Although it took some effort and explanation for the teachers to understand how to apply pedagogical knowledge in data analysis, they were usually good at identifying pedagogical reasons that could explain the data, and they used two steps in data analysis (probably because this was prescribed by the AMTI).

  5. Based on their TI success and concern stages, we could divide the participants into three groups needing different kinds of assistance with TI: experts (confident on their own), independent users (needing technology and tools) and explorers (looking for more information about TI and concerned about their personal skills).

In the future, more research into the differences in teachers' concerns about TI is necessary, as the current study revealed possible level groups but the sample was very small and consisted of volunteers interested in the topic. More technological support is also required, especially when teachers use data from different sources. Multi-modal TLA should thus also focus on teachers purposely collecting their own data rather than relying on 'big data'. Teachers also require technologies for data collection and for documenting TI, to make the process less time-consuming. One solution could be to promote the existing versatile technologies (dashboards, apps, LMSs, etc.) among teachers and make them accessible for teachers' use. Another research line could explore how students have been, and could be, engaged in TI processes.