Abstract
Laboratories are considered to play a unique role in circuits teaching. Laboratories can be traditional, with physical components and desks, or virtual, with graphical simulators. Using these facilities in teaching, students can conduct experiments or measurements while exploring the features of electric circuits. However, an intriguing research question is whether physical components or graphical simulators are more appropriate to build knowledge, enhance skills, and improve attitudes. Thus, the aim of this article is: 1) to perform a review in order to explore the characteristics of the studies that compare tangible and graphical user interfaces, and 2) to apply a meta-analysis to the effects of the interfaces under study. The meta-analysis included 88 studies with pre/post-test designs and 2798 participants, which emerged from: a) 4 databases, and b) the forward snowballing method. The review showed that the majority of researchers have focused on knowledge gain, while few have examined skills and attitudes. The meta-analysis showed that the combination of user interfaces (tangible/graphical) appears to be the most beneficial for students in the domain of electric circuits teaching.
1 Introduction
Today, when the use of technology and its tools has penetrated formal and informal education, two ways of teaching laboratory courses can be identified: a) the traditional one, with a tangible user interface (TUI), which “in general may be considered as physical objects whose manipulation may trigger various digital effects, providing ways for innovative play and learning” (Sapounidis et al., 2016, p. 273), and b) the virtual one, with a graphical user interface (GUI), which provides simulations through technology (Maatuk et al., 2022; Sapounidis et al., 2019; Thees et al., 2022; Xie et al., 2020). Moreover, the use of simulations has also been established in courses where participation in labs with a TUI is considered essential, such as science teaching and electric circuits (Pan et al., 2022; Salta et al., 2022). In real experimentation, the circuit might be connected to a computer or contain components such as microcontrollers. Thus, if the user presses a button, or changes the position, values, or orientation of a real circuit’s components, this can directly affect the output of the circuit; for this reason, the term tangible user interface (TUI) is adopted throughout this article.
Real laboratories, in many cases, might contain tangible components along with measuring instruments, experimental setups, and expensive specialized equipment (Altmeyer et al., 2020; Finkelstein et al., 2005; Wörner et al., 2022). Their main advantage is direct physical contact with the activities to be studied (Akçayir et al., 2016). According to some researchers, physical contact might activate multiple senses and can therefore have a beneficial effect on a student’s cognitive domain (e.g. Sapounidis et al., 2015). However, TUIs have three main disadvantages: a) preparing the activity/experiment usually takes more time, b) special equipment is often needed, which is difficult to replace in case of damage due to its high cost, and c) in some scientific fields the results of real experimentation are difficult to observe (e.g. Evangelou & Kotsis, 2019; Olympiou & Zacharia, 2018; Zacharia & Olympiou, 2011). For example, students cannot directly see the particles in a subatomic particle interaction experiment.
In contrast, the GUI offers a safer and more immediate visualization of the phenomena, providing an unlimited number of modifications and repetitions of the experiment at no cost, while the time needed to implement the experiment is reduced in comparison to a real lab (Olympiou & Zacharia, 2012; Potkonjak et al., 2016; Puntambekar et al., 2021; Villena-Taranilla et al., 2022; Zacharia & Constantinou, 2008). Therefore, it is considered that virtual laboratories and simulations might overcome the disadvantages of real laboratories and tangible experimentation (Falloon, 2019; Tselegkaridis & Sapounidis, 2021). Yet, virtual labs may not offer students a complete picture of the subject. For instance, the consequences of wrong settings in a GUI are not easily perceived, in contrast to a TUI, where one wrong setting can destroy the equipment.
Undoubtedly, teaching through virtual labs and GUIs has been common practice for years and was not created for the needs of the pandemic (Baran et al., 2020; Chernikova et al., 2020; Foronda et al., 2020; Reeves & Crippen, 2021; Tselegkaridis & Sapounidis, 2022a). More specifically, the use of this technology intensified during the pandemic period (Xie et al., 2020). However, as societies return to a new normality and students return to physical labs using tangible components, we do not know which of the conditions -GUIs, TUIs, or a combination of the two interfaces- is the most beneficial for students’ learning (Kapp et al., 2020; Renken & Nunez, 2013; Sapounidis & Demetriadis, 2013; Sullivan et al., 2017). In detail, the relevant studies appear to be quite limited and present contradictory results. Therefore, the present article focuses on electric circuits and presents a meta-analysis, illuminating the characteristics of the studies along with the impact of the interfaces on students’ knowledge, attitudes, and skills (DerSimonian & Laird, 2015; Munn et al., 2018). As far as we know, this is the first meta-analysis performed in this field. Thus, this article broadens the agenda in teaching electric circuits while strengthening our understanding of the interfaces’ impact.
The rest of the article is organized as follows: Sect. 2 provides the theoretical background of previous research, while Sect. 3 focuses on the methodology of the review. Sect. 4 presents the results, Sect. 5 discusses the findings, and finally, Sect. 6 draws the conclusions.
2 Background
Researchers have looked into the factors that can contribute to students’ performance in education (Lazonder & Harmsen, 2016). Consequently, some of them explored the level of guidance along with teachers’ understanding of students’ difficulties (Kapici et al., 2022). The results indicate that if teachers understand their students’ difficulties while learning, they can support them more efficiently (Engelhardt & Beichner, 2004; Gaigher, 2014; Hmelo-Silver et al., 2007; Moodley & Gaigher, 2019). According to some other researchers (e.g. Alfieri et al., 2011; Bretz, 2019; Minner et al., 2010), teaching science through laboratories might have a positive effect on students as long as they do not participate passively but in the context of inquiry-based learning. Moreover, this way of learning seems to have a positive impact on students’ attitudes toward science (Chen et al., 2014; Hofstein & Lunetta, 2004).
Usually, researchers examine how laboratory activities affect learning objectives (Sapounidis et al., 2023; Wörner et al., 2022). Learning objectives can be classified into the following domains: attitudes, knowledge, and skills (Baartman & De Bruijn, 2011). In this direction, Unlu and Dokme (2011b) investigated the attitudes of 66 middle school students toward electric circuits through a 3-week intervention. The sample was divided into three groups: TUI, GUI, and mixed. The findings showed that students’ attitudes scored statistically higher when they took part in activities with either the GUI alone or the mixed interface. The research of Faour and Ayoubi (2018), conducted in a middle school with a sample of 50 students, compared TUI and GUI in terms of participants’ attitudes. The intervention lasted 10 weeks and the findings showed no statistically significant differences between the two groups. Also, the research of Kapici et al. (2020), carried out in a middle school with a sample of 143 students, investigated students’ attitudes. The sample was divided into three groups (TUI, GUI, and mixed) and the intervention lasted 6 weeks. The results showed no statistically significant differences between the three groups.
The research implemented by Farrokhnia and Esmailpour (2010) was carried out with the participation of 100 university students and investigated the skill of constructing a real circuit. The researchers measured the time it took the students to build the circuit with real components. The results showed that students who worked with the GUI acquired the same skills as students who worked with the TUI, since they did not need more time to construct the real circuit. Moreover, Kapici et al. (2022) conducted a study with 116 middle school students, comparing participants’ inquiry skills and the level of guidance in TUI and GUI. The students took part in an intervention that lasted 4 weeks. Four groups were created: TUI with a low level of guidance, TUI with a high level of guidance, GUI with a low level of guidance, and GUI with a high level of guidance. Students took a pre/post-test of 28 questions covering observation, classification, designing experiments, and forming hypotheses. The results showed no statistically significant differences in inquiry skills between the two interfaces and the levels of guidance.
The research implemented by Tsihouridis et al. (2013) was carried out with the participation of 73 high school students in order to investigate students’ knowledge. The intervention lasted 11 h and two groups were created, TUI and GUI. The results showed no statistically significant difference in knowledge between the two groups. Furthermore, the research of Taramopoulos et al. (2012), carried out with the participation of 32 middle school students, depicted similar results: the intervention lasted 17 h and there were no statistically significant differences between the students of the two groups working with TUI and GUI respectively. Nevertheless, the research of Zacharia and Michael (2016), conducted in a primary school with a sample of 55 students, compared TUI, GUI, and mixed interfaces in terms of participants’ knowledge. The intervention lasted 3 weeks and the results showed that the mixed interface was more conducive to knowledge of electric circuit concepts than the use of TUI or GUI alone.
What is more, the research of Tsihouridis et al. (2015) investigated the effect of sequence in the mixed interface on students’ conceptual understanding. The intervention lasted 7 h and 66 middle school students took part. According to its findings, the sequence that started with TUI and moved to GUI performed slightly better than the one that started with GUI and moved to TUI. This reveals some indication that the sequence between the interfaces might play an important role.
Additionally, Falloon (2020) examined the case where learning is transferred from simulation to activities with real components. Forty children aged five took part in simple circuit activities. The findings showed that the students successfully transferred their knowledge from one interface to the other.
Last but not least, Zacharia and de Jong (2014) showed that when the taught subject is simple circuits, no interface is more favourable to the students. However, when the subject is more complex circuits, the GUI seems to prevail over the TUI. This may be because modifications/changes are easy to make in the GUI, while in the TUI this process is more difficult.
In conclusion, the results in each learning domain appear conflicting, without indicating any interface as more efficient. Hence, there is no indication of whether any domain is enhanced more by a particular interface. Thus, the aim of this article is to investigate the following Research Questions (RQ):
-
RQ1: What are the characteristics of the comparative (TUI-GUI) studies in the teaching of electric circuits?
-
RQ2: What are the results of these studies through a meta-analysis?
These two RQs will enrich the research in this field and provide useful information and directions to educators and researchers.
3 Methods
The review was conducted according to the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Statement (Page et al., 2021).
3.1 Eligibility criteria
To enhance the reliability of the study, we followed the same strategy as other meta-analyses, in which only peer-reviewed journal articles were included (e.g. Sapounidis et al., 2023; Tingir et al., 2017). Moreover, only articles in English that contained quantitative comparisons between TUI and GUI in the teaching of electric circuits were considered. No restrictions were placed on the year of publication. Articles related to distance education or remote laboratories were not included, as we needed a clearer picture of the TUI-GUI comparison. Moreover, the following exclusion criteria (EC) were used: a) EC1 Off-topic articles, b) EC2 Experimental study that did not involve electric circuits, c) EC3 Study with a non-experimental design, and d) EC4 Study that was not accessible for retrieval.
3.2 Information sources and search strategy
Two strategies were used to search for articles: a) database search with a key phrase, and b) the forward snowballing method, which is searching within the citing papers (Kondaveeti et al., 2021; Mourão et al., 2020).
Four well-known databases were used: Web of Science, Scopus, ERIC, and IEEEXplore. The search was conducted between 3 and 22 December 2022 and used the Boolean search string “(real OR hands-on OR physical) AND (virtual) AND (experiment* OR environment*) AND (circuit)”, adapted to each database.
3.3 Selection, data collection, and risk of bias
Two reviewers worked independently, examining the records and applying the inclusion/exclusion criteria. They arrived at the same results for most articles. In cases where there was uncertainty about a particular article, the reviewers engaged in discussions, presented their arguments, and together made the final decision.
It was crucial for each included article to report quantitative (pre/post-test) data, so that sufficient data could be collected to perform the meta-analysis. An article usually contributed multiple data points, as multiple pre/post-tests were often used.
As shown in Table 1, out of a total of 3247 results from the 4 databases, 14 met the criteria. However, 4 were duplicates, so 10 articles emerged.
Two of the articles that emerged from the database search were selected for forward snowballing because of the number of citations they had received, 522 in total. In detail, articles were sought among those that had cited Kapici et al. (2019) and Zacharia (2007). As shown in Table 2, from these 522 citing papers, 19 new articles that met the criteria emerged. However, 13 were duplicates, so 6 articles were included, bringing the total to 16, from which we extracted data.
The 16 included articles described a total of 88 pre/post-tests, and these were used to perform the meta-analysis. Finally, Fig. 1 shows the PRISMA flowchart.
4 Results
To be able to answer RQ1, we recorded data such as the sample size, the duration of the intervention, the school level, and the learning objective.
4.1 The features of the included studies
The findings (Table 3) show that approximately 44% of the articles were published in the period 2017–2022. The oldest article that emerged from the search is from 2005. Half of the articles were conducted in Asia, 31% in Europe, 12.5% in North America and 6.5% in Africa.
As shown in Fig. 2, 19% of the studies took place in a primary school, 38% in a middle school, 13% in a high school and 30% in a university.
Regarding duration, 12.5% of the investigations lasted up to 2 weeks, 50% up to 2 months, and 37.5% up to six months. About 38% of the studies had a sample size of up to 50 participants, 31% up to 100, and 31% over 100. Only one study had a sample size of over 225 participants. Figure 3 shows the sample size histogram.
Half of the articles (50%) compare TUI and GUI, while the other half (50%) compare TUI, GUI, and mixed interfaces.
The findings (Table 4) show that 75% of the articles set knowledge gain as the learning objective, 15% attitudes, and 10% skills. Multiple choice questions were used as an evaluation tool in 38.5% of cases and open tests in 34.6%. A Likert scale was used in 11.5% of cases, while true/false and matching questions were used in 7.7%.
4.2 Meta-analysis
4.2.1 Overall effect
From the 16 included articles, a total of 88 studies implementing pre/post-test designs were included in the meta-analysis, comprising a total of 2798 students. The random-effects model was applied, according to which the effect size is considered a random variable (Borenstein et al., 2010; Rice et al., 2018): it is expected that there is not one true effect, but a distribution of true effects. The Comprehensive Meta-Analysis (CMA) software was used to perform the calculations for the meta-analysis (Borenstein et al., 2022). Under the random-effects model, the summary estimate of the effect size (Hedges’ g) of the use of laboratory environments on academic outcomes for electric circuits was + 1.669, with a 95% confidence interval of + 1.43 to + 1.907 and Z = 13.713, p < 0.001. Eleven of the studies reported a negative effect size. In contrast, under the fixed-effect model, the summary estimate was 0.860, with a 95% confidence interval from + 0.808 to + 0.913. Figure 4 shows the forest plot of the random-effects overall effect resulting from the meta-analysis.
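The random-effects pooling described above can be illustrated in a few lines of code. The following Python sketch (an illustrative approximation, not the CMA software’s actual implementation, using hypothetical input values) computes Hedges’ g for a two-group comparison and pools a set of effect sizes with the classic DerSimonian-Laird random-effects estimator:

```python
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Standardized mean difference (Hedges' g) with small-sample correction."""
    # Pooled standard deviation of the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                       # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # Hedges' correction factor J
    g = j * d
    # Approximate sampling variance of g
    var = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var

def dersimonian_laird(gs, variances):
    """Pool effect sizes under the random-effects (DerSimonian-Laird) model."""
    w = [1 / v for v in variances]           # fixed-effect weights
    fixed = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)
    q = sum(wi * (gi - fixed)**2 for wi, gi in zip(w, gs))  # Cochran's Q
    df = len(gs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_star = [1 / (v + tau2) for v in variances]  # random-effects weights
    mu = sum(wi * gi for wi, gi in zip(w_star, gs)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return mu, (mu - 1.96 * se, mu + 1.96 * se), tau2
```

When all studies report identical effects, τ² collapses to zero and the random-effects estimate coincides with the fixed-effect one; heterogeneous effects inflate τ² and widen the confidence interval, which is why the random-effects interval reported above is the appropriate one given the heterogeneity found in the next subsection.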
4.2.2 Heterogeneity
Heterogeneity was tested with Cochran’s Q statistic. The null hypothesis states that there is a common true effect size across all studies. Cochran’s Q was Q = 1733.915, df = 87, p < 0.01, indicating inconsistent true effects across studies. The degree of heterogeneity is measured by the I2 statistic, which was 94.982%, indicating the percentage of the total variance that is due to heterogeneity (Borenstein, 2020; Borenstein et al., 2017; IntHout et al., 2016). The estimates were τ2 = 1.188 and τ = 1.090.
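As a worked illustration of these statistics (using hypothetical effect sizes, not the actual study data), Cochran’s Q, I2, and τ2 can be computed directly from a list of effect estimates and their within-study variances:

```python
def heterogeneity(effects, variances):
    """Cochran's Q, I^2 (percent), and tau^2 (method-of-moments estimate)."""
    w = [1.0 / v for v in variances]         # inverse-variance weights
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    # Q: weighted squared deviations from the fixed-effect pooled estimate
    q = sum(wi * (e - pooled) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    # I^2: share of total variation attributable to between-study heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    return q, i2, tau2
```

For example, two hypothetical studies with effects 0.0 and 2.0 and equal variances of 0.04 yield Q = 50, I2 = 98%, and τ2 = 1.96; an I2 near 95%, as reported above, similarly means that almost all observed variance reflects genuine between-study differences rather than sampling error.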
4.2.3 Publication bias
Publication bias was detected via the asymmetric funnel plots of standard error: Fig. 5 includes only data from the observed studies, while Fig. 6 includes data from both the observed and the imputed studies. Figure 7 shows the funnel plot of precision versus standard differences in means.
Egger’s test confirmed the graphic inspection (t[87] = 15.647, p < 0.01). Rosenthal’s fail-safe N was 1,889, suggesting that about 1,889 null studies would have to be added to the meta-analysis before the cumulative effect size became statistically insignificant. The complete meta-analysis under the fixed-effect model showed a positive association of 0.860 between the use of laboratory environments and students’ academic outcomes. Duval and Tweedie’s Trim and Fill method suggested that, if the asymmetric studies were removed, this association would be reduced to 0.567.
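Rosenthal’s fail-safe N can be sketched as follows (an illustrative computation with hypothetical z-scores, assuming the classic Stouffer combination at a one-tailed α = .05 threshold of z = 1.645):

```python
import math

def fail_safe_n(z_scores, z_alpha=1.645):
    """Rosenthal's fail-safe N: how many unpublished null (z = 0) studies
    would be needed to drop the Stouffer combined z below z_alpha."""
    k = len(z_scores)
    total_z = sum(z_scores)
    # Solve total_z / sqrt(k + n) < z_alpha for n
    n = (total_z / z_alpha) ** 2 - k
    return max(0, math.ceil(n))
```

With two hypothetical studies of z = 3.0 each, 12 null studies would suffice, since the combined z then falls to 6/√14 ≈ 1.60, below 1.645. A large fail-safe N relative to the number of included studies, as reported above, suggests the overall effect is robust to unpublished null results.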
4.2.4 Analysis of subgroups
The degree of heterogeneity (I2 statistic) has shown that there are sources of variance in the observed effect sizes other than sampling error. This leads us to assume that some subgroup dimensions might have different effect sizes and act as moderators. Thus, it is hypothesized that the use of laboratory activities for electric circuits has different effect sizes across the following dependent variables:
-
Learning Objective: 1) attitudes, 2) knowledge, 3) skills
-
School level: 1) primary, 2) middle or high school, 3) university
-
User Interface: 1) TUI, 2) GUI, 3) Mixed
4.2.5 Analysis for learning objective
As shown in Table 5, the fixed-effect analysis showed that the I2 statistic for attitudes, knowledge, and skills is 85.859%, 94.826%, and 44.984% respectively. Consequently, the random-effects model seems appropriate; it provided effect sizes of 0.316 for attitudes, 2.318 for knowledge, and 0.848 for skills. The effects are statistically significant at p < 0.001.
The comparison between the effects for attitudes and knowledge gives Q = 61.114, p < 0.05, showing that the difference in effect sizes between attitudes and knowledge is statistically significant. Similarly, the comparison between knowledge and skills gives Q = 10.893, p < 0.05, and the comparison between attitudes and skills gives Q = 7.848, p < 0.05; both differences in effect sizes are statistically significant. Therefore, knowledge acquisition benefits the most from the use of laboratories for electric circuits.
4.2.6 Analysis for school level
As shown in Table 6, the fixed-effect analysis showed that the I2 statistic for primary school, middle or high school, and university is 89.073%, 87.840%, and 89.271% respectively. Consequently, the random-effects model seems appropriate; it provided effect sizes of 0.415 for primary school, 1.046 for middle or high school, and 4.377 for university. The effects are statistically significant at p < 0.01.
The comparison between the effects for primary and middle or high school gives Q = 12.416, p < 0.05, showing that the difference in effect sizes is statistically significant. Likewise, the comparison between primary school and university gives Q = 131.247, p < 0.05, and the comparison between middle or high school and university gives Q = 104.817, p < 0.05; both differences in effect sizes are statistically significant. Therefore, university students gained the most benefit from the use of laboratories for electric circuits.
4.2.7 Analysis of user interface
As shown in Table 7, the fixed-effect analysis showed that the I2 statistic for TUI, GUI, and Mixed is 94.918%, 93.618%, and 94.263% respectively. Consequently, the random-effects model seems appropriate; it provided effect sizes of 1.144 for TUI, 1.171 for GUI, and 2.585 for Mixed. The effects are statistically significant at p < 0.001.
The comparison between the effects for TUI and GUI gives Q = 0.010, p > 0.05, showing that the difference in effect sizes between TUI and GUI is not statistically significant. In contrast, the comparison between GUI and Mixed gives Q = 24.408, p < 0.05, and the comparison between TUI and Mixed gives Q = 23.603, p < 0.05; both differences in effect sizes are statistically significant. Therefore, the important finding is that the mixed user interface is the most beneficial in laboratory teaching of electric circuits.
5 Discussion
Regarding the features of the emerged articles, almost half of them were published in the period 2017–2022. This possibly indicates growth in the field of designing educational interventions in electric circuits with different interfaces. Considering that during the Coronavirus Disease 2019 (COVID-19) pandemic many educational activities were carried out virtually, more comparative (TUI-GUI) studies are likely to be carried out and published in the near future.
About thirty percent of the research studies were conducted at a university. Our results also showed that several studies on teaching electric circuits were conducted in Turkey, while few were conducted in the United States. Nevertheless, in the related field of science, technology, engineering, and mathematics (STEM) education, the United States takes a leading role (Tselegkaridis & Sapounidis, 2022b). This disparity might be attributed to differences between the fields and could reflect the efforts of Turkish researchers to develop this specific area. However, the included articles may not provide the overall picture of the field, or some studies may have been omitted. This may be due to the way the search was conducted, that is, the specific keywords we used.
According to our findings, there were no participants younger than 10 years old. This aligns with the findings of Wörner et al. (2022), who demonstrated that no experiments in science education are conducted with children younger than third grade. A possible explanation may be that the subject of electric circuits is not extensively taught in early childhood. According to Brenneman et al. (2019), early childhood teachers rarely receive adequate preparation to implement STEM activities. Additionally, according to Lu et al. (2022) and Ültay and Aktaş (2020), STEM education and research mostly focus on secondary and high school education.
Moreover, an important issue for any educational research is the duration of the intervention. About thirty-eight percent of the studies lasted a semester, while about ten percent lasted up to 2 weeks. Another equally important issue for safe inference is the sample size. About thirty-eight percent of the studies used up to 50 students, a small number considering that the intervention includes at least 2 groups of students for TUI and GUI. Consequently, statistical analysis of these sample sizes imposes some limitations.
About thirty percent of the studies used open-ended questions. In general, this is a factor that might contribute to the reliability and quality of the findings, as long as a rubric is used to grade the test. Nevertheless, the included articles did not mention the use of a rubric. Also, in forty percent of the studies, multiple-choice questions were used as an assessment tool.
According to the findings of Table 3, half of the articles compared the TUI with the GUI. In the domain of knowledge, some studies conclude that the GUI has a better learning effect than the TUI (e.g. Faour & Ayoubi, 2018; Kollöffel & de Jong, 2013; Tekbıyık & Ercan, 2015). However, other studies found no difference between TUI and GUI (e.g. Amida et al., 2020; Kapici et al., 2022). We notice that no study has reached a general conclusion that TUI is better than GUI. This finding may be attributed to the types of the exercises conducted during the interventions (Mathur & VanderWeele, 2020; Nakagawa et al., 2022). The other half of the articles compared mixed interfaces. In the domain of knowledge, the findings showed that mixed interfaces probably have a better learning result than TUI or GUI alone (e.g. Kapici et al., 2019; Manunure et al., 2020; Unlu & Dokme, 2011a; Zacharia, 2007).
From the above it can be concluded that the GUI leads to the same or better results than the TUI. Nevertheless, this feature should be investigated in several circuits with different activities. Moreover, students who engage in activities in a tangible and graphical way seem to benefit more in the domain of knowledge (Alkhaldi et al., 2016; de Jong et al., 2013; Wang & Tseng, 2018).
Regarding skills, the interface seems to play no role as no differences were observed between TUI and GUI (e.g. Kapici et al., 2022). Also, the findings showed that students’ attitudes are not affected by the laboratory interface, since either in a TUI, GUI, or mixed interface, the results showed no differences (e.g. Faour & Ayoubi, 2018). However, as the number of the emerged articles was small, further development of such research is needed in order to enrich the field and draw more secure conclusions. As well, it should be noted that although all the studies had electric circuits as their subject, they did not have the same activities or common pre/post-tests designs.
The meta-analysis of the 88 studies with pre/post-test designs, despite the lack of homogeneity and the detected publication bias, provided strong indications of significant effects of laboratory activities on learning outcomes. Specifically, under the random-effects model, the summary estimate of the effect size was + 1.669, with a 95% confidence interval of + 1.43 to + 1.907 and Z = 13.713, p < 0.001. This finding is consistent with other meta-analyses on the use of technology in education (e.g. Sapounidis et al., 2023; Schmid et al., 2014; Tingir et al., 2017). Knowledge appears to benefit significantly more than skills and attitudes, while older students seem to benefit the most. As far as the effect of interfaces is concerned, a mixed model achieved better results than activities involving only TUI or GUI. The meta-analysis thus supports the same findings as the preceding review.
6 Conclusions
This article aims to shed light on aspects of the use of user interfaces in teaching electric circuits, through a meta-analysis. To achieve this, we searched for and selected 16 articles that described experimental interventions and provided quantitative comparisons between TUI and GUI. Our findings show that, out of the 16 articles, nearly half were published in the last 6 years. Moreover, 6 of the articles had a sample size of up to 50 students, while in another 6 articles the intervention lasted up to one semester. In addition, the majority of researchers compared interfaces with a focus on knowledge building, while few studied students’ skills and attitudes. After all, strengthening students’ attitudes towards science may not have a direct impact on students’ achievement, but it may positively affect their later engagement with science.
In addition, a meta-analysis was conducted using 88 pre/post-tests. Despite the limitations, the findings demonstrated that the use of laboratory activities had a positive impact on students’ learning outcomes, regardless of the interface. Specifically, the meta-analysis revealed the following results: a) the mixed use of interfaces yielded the best outcomes, b) older students achieved better results, and c) knowledge showed the most significant improvement compared to attitudes and skills. However, the literature needs to be enriched with more studies so that safer conclusions can be drawn.
Data availability
Data will be made available upon reasonable request.
References
Akçayir, M., Akçayir, G., Pektaş, M., & Ocak, A. (2016). Augmented reality in science laboratories: The effects of augmented reality on university students’ laboratory skills and attitudes toward science laboratories. Computers in Human Behavior, 57, 334–342. https://doi.org/10.1016/j.chb.2015.12.054
Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1–18. https://doi.org/10.1037/a0021017
Alkhaldi, T., Pranata, I., & Athauda, R. I. (2016). A review of contemporary virtual and remote laboratory implementations: Observations and findings. Journal of Computers in Education, 3(3), 329–351. https://doi.org/10.1007/s40692-016-0068-z
Altmeyer, K., Kapp, S., Thees, M., Malone, S., Kuhn, J., & Brünken, R. (2020). The use of augmented reality to foster conceptual knowledge acquisition in STEM laboratory courses—Theoretical background and empirical results. British Journal of Educational Technology, 51(3), 611–628. https://doi.org/10.1111/bjet.12900
Amida, A., Chang, I., & Yearwood, D. (2020). Designing a practical lab-based assessment: A case study. Journal of Engineering, Design and Technology, 18(3), 567–581. https://doi.org/10.1108/JEDT-08-2019-0194
Baartman, L. K. J., & De Bruijn, E. (2011). Integrating knowledge, skills and attitudes: Conceptualising learning processes towards vocational competence. Educational Research Review, 6(2), 125–134. https://doi.org/10.1016/j.edurev.2011.03.001
Baran, B., Yecan, E., Kaptan, B., & Paşayiğit, O. (2020). Using augmented reality to teach fifth grade students about electrical circuits. Education and Information Technologies, 25(2), 1371–1385. https://doi.org/10.1007/s10639-019-10001-9
Başer, M., & Durmus, S. (2010). The Effectiveness of computer supported versus real laboratory inquiry learning environments on the understanding of direct. Eurasia Journal of Mathematics, Science & Technology Education, 6(1), 47–61.
Borenstein, M. (2020). Research Note: In a meta-analysis, the I2 index does not tell us how much the effect size varies across studies. Journal of Physiotherapy, 66(2), 135–139. https://doi.org/10.1016/j.jphys.2020.02.011
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2010). A basic introduction to fixed-effect and random-effects models for meta-analysis. Research Synthesis Methods, 1(2), 97–111. https://doi.org/10.1002/jrsm.12
Borenstein, M., Higgins, J. P. T., Hedges, L. V., & Rothstein, H. R. (2017). Basics of meta-analysis: I2 is not an absolute measure of heterogeneity. Research Synthesis Methods, 8(1), 5–18. https://doi.org/10.1002/jrsm.1230
Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2022). Comprehensive Meta-Analysis. Biostat Inc. www.Meta-Analysis.com
Brenneman, K., Lange, A., & Nayfeld, I. (2019). Integrating STEM into preschool education: Designing a professional development model in diverse settings. Early Childhood Education Journal, 47(1), 15–28. https://doi.org/10.1007/s10643-018-0912-z
Bretz, S. L. (2019). Evidence for the Importance of Laboratory Courses. Journal of Chemical Education, 96(2), 193–195. https://doi.org/10.1021/acs.jchemed.8b00874
Chen, S., Chang, W. H., Lai, C. H., & Tsai, C. Y. (2014). A Comparison of students’ approaches to inquiry, conceptual learning, and attitudes in simulation-based and microcomputer-based laboratories. Science Education, 98(5), 905–935. https://doi.org/10.1002/sce.21126
Chernikova, O., Heitzmann, N., Stadler, M., Holzberger, D., Seidel, T., & Fischer, F. (2020). Simulation-based learning in higher education: A meta-analysis. Review of Educational Research, 90(4), 499–541. https://doi.org/10.3102/0034654320933544
de Jong, T., Linn, M. C., & Zacharia, Z. C. (2013). Physical and virtual laboratories in science and engineering education. Science, 340(6130), 305–308. https://doi.org/10.1126/science.1230579
DerSimonian, R., & Laird, N. (2015). Meta-analysis in clinical trials revisited. Contemporary Clinical Trials, 45, 139–145. https://doi.org/10.1016/j.cct.2015.09.002
Engelhardt, P. V., & Beichner, R. J. (2004). Students’ understanding of direct current resistive electrical circuits. American Journal of Physics, 72(1), 98–115. https://doi.org/10.1119/1.1614813
Evangelou, F., & Kotsis, K. (2019). Real vs virtual physics experiments: comparison of learning outcomes among fifth grade primary school students. A case on the concept of frictional force. International Journal of Science Education, 41(3), 330–348. https://doi.org/10.1080/09500693.2018.1549760
Falloon, G. (2019). Using simulations to teach young students science concepts: An Experiential Learning theoretical analysis. Computers and Education, 135(March), 138–159. https://doi.org/10.1016/j.compedu.2019.03.001
Falloon, G. (2020). From simulations to real: Investigating young students’ learning and transfer from simulations to real tasks. British Journal of Educational Technology, 51(3), 778–797. https://doi.org/10.1111/bjet.12885
Faour, M. A., & Ayoubi, Z. (2018). The effect of using virtual laboratory on Grade 10 students’ conceptual understanding and their attitudes towards physics. Journal of Education in Science, Environment and Health, 4(1), 54–68.
Farrokhnia, M. R., & Esmailpour, A. (2010). A study on the impact of real, virtual and comprehensive experimenting on students’ conceptual understanding of DC electric circuits and their skills in undergraduate electricity laboratory. Procedia - Social and Behavioral Sciences, 2(2), 5474–5482. https://doi.org/10.1016/j.sbspro.2010.03.893
Finkelstein, N. D., Adams, W. K., Keller, C. J., Kohl, P. B., Perkins, K. K., Podolefsky, N. S., Reid, S., & Lemaster, R. (2005). When learning about the real world is better done virtually: A study of substituting computer simulations for laboratory equipment. Physical Review Special Topics - Physics Education Research, 1(1), 1–8. https://doi.org/10.1103/PhysRevSTPER.1.010103
Foronda, C. L., Fernandez-Burgos, M., Nadeau, C., Kelley, C. N., & Henry, M. N. (2020). Virtual simulation in nursing education: A systematic review spanning 1996 to 2018. Simulation in Healthcare, 15(1), 46–54. https://doi.org/10.1097/SIH.0000000000000411
Gaigher, E. (2014). Questions about answers: Probing teachers’ awareness and planned remediation of learners’ misconceptions about electric circuits. African Journal of Research in Mathematics, Science and Technology Education, 18(2), 176–187. https://doi.org/10.1080/10288457.2014.925268
Hmelo-Silver, C. E., Duncan, R. G., & Chinn, C. A. (2007). Scaffolding and achievement in problem-based and inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42(2), 99–107. https://doi.org/10.1080/00461520701263368
Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: foundations for the twenty-first century. Science Education, 88(1), 28–54. https://doi.org/10.1002/sce.10106
IntHout, J., Ioannidis, J. P. A., Rovers, M. M., & Goeman, J. J. (2016). Plea for routinely presenting prediction intervals in meta-analysis. BMJ Open, 6(7), e010247. https://doi.org/10.1136/bmjopen-2015-010247
Jaakkola, T., & Nurmi, S. (2008). Fostering elementary school students’ understanding of simple electricity by combining simulation and laboratory activities. Journal of Computer Assisted Learning, 24(4), 271–283. https://doi.org/10.1111/j.1365-2729.2007.00259.x
Jaakkola, T., Nurmi, S., & Veermans, K. (2011). A comparison of students’ conceptual understanding of electric circuits in simulation only and simulation-laboratory contexts. Journal of Research in Science Teaching, 48(1), 71–93. https://doi.org/10.1002/tea.20386
Kapici, H. O., Akcay, H., & Cakir, H. (2022). Investigating the effects of different levels of guidance in inquiry-based hands-on and virtual science laboratories. International Journal of Science Education, 44(2), 324–345. https://doi.org/10.1080/09500693.2022.2028926
Kapici, H. O., Akcay, H., & de Jong, T. (2019). Using Hands-On and Virtual Laboratories Alone or Together-Which Works Better for Acquiring Knowledge and Skills? Journal of Science Education and Technology, 28(3), 231–250. https://doi.org/10.1007/s10956-018-9762-0
Kapici, H. O., Akcay, H., & de Jong, T. (2020). How do different laboratory environments influence students’ attitudes toward science courses and laboratories? Journal of Research on Technology in Education, 52(4), 534–549. https://doi.org/10.1080/15391523.2020.1750075
Kapp, S., Thees, M., Beil, F., Weatherby, T., Burde, J. P., Wilhelm, T., & Kuhn, J. (2020). The effects of augmented reality: A comparative study in an undergraduate physics laboratory course. CSEDU 2020 - Proceedings of the 12th International Conference on Computer Supported Education, 2(January), 197–206. https://doi.org/10.5220/0009793001970206
Kollöffel, B., & de Jong, T. A. J. M. (2013). Conceptual understanding of electrical circuits in secondary vocational engineering education: Combining traditional instruction with inquiry learning in a virtual lab. Journal of Engineering Education, 102(3), 375–393. https://doi.org/10.1002/jee.20022
Kondaveeti, H. K., Kumaravelu, N. K., Vanambathina, S. D., Mathe, S. E., & Vappangi, S. (2021). A systematic literature review on prototyping with Arduino: Applications, challenges, advantages, and limitations. Computer Science Review, 40. https://doi.org/10.1016/j.cosrev.2021.100364
Lazonder, A. W., & Harmsen, R. (2016). Meta-Analysis of inquiry-based learning: Effects of guidance. Review of Educational Research, 86(3), 681–718. https://doi.org/10.3102/0034654315627366
Lu, S. Y., Lo, C. C., & Syu, J. Y. (2022). Project-based learning oriented STEAM: The case of micro–bit paper-cutting lamp. International Journal of Technology and Design Education, 32(5), 2553–2575. https://doi.org/10.1007/s10798-021-09714-1
Maatuk, A. M., Elberkawi, E. K., Aljawarneh, S., Rashaideh, H., & Alharbi, H. (2022). The COVID-19 pandemic and E-learning: Challenges and opportunities from the perspective of students and instructors. Journal of Computing in Higher Education, 34(1), 21–38. https://doi.org/10.1007/s12528-021-09274-2
Manunure, K., Delserieys, A., & Castéra, J. (2020). The effects of combining simulations and laboratory experiments on Zimbabwean students’ conceptual understanding of electric circuits. Research in Science and Technological Education, 38(3), 289–307. https://doi.org/10.1080/02635143.2019.1629407
Mathur, M. B., & VanderWeele, T. J. (2020). Sensitivity analysis for publication bias in meta-analyses. Journal of the Royal Statistical Society Series C: Applied Statistics, 69(5), 1091–1119. https://doi.org/10.1111/rssc.12440
Minner, D. D., Levy, A. J., & Century, J. (2010). Inquiry-based science instruction-what is it and does it matter? Results from a research synthesis years 1984 to 2002. Journal of Research in Science Teaching, 47(4), 474–496. https://doi.org/10.1002/tea.20347
Moodley, K., & Gaigher, E. (2019). Teaching electric circuits: Teachers’ perceptions and learners’ misconceptions. Research in Science Education, 49(1), 73–89. https://doi.org/10.1007/s11165-017-9615-5
Mourão, E., Pimentel, J. F., Murta, L., Kalinowski, M., Mendes, E., & Wohlin, C. (2020). On the performance of hybrid search strategies for systematic literature reviews in software engineering. Information and Software Technology, 123. https://doi.org/10.1016/j.infsof.2020.106294
Munn, Z., Peters, M. D. J., Stern, C., Tufanaru, C., McArthur, A., & Aromataris, E. (2018). Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Medical Research Methodology, 18(1), 143. https://doi.org/10.1186/s12874-018-0611-x
Nakagawa, S., Lagisz, M., Jennions, M. D., Koricheva, J., Noble, D. W. A., Parker, T. H., Sánchez-Tójar, A., Yang, Y., & O’Dea, R. E. (2022). Methods for testing publication bias in ecological and evolutionary meta-analyses. Methods in Ecology and Evolution, 13(1), 4–21. https://doi.org/10.1111/2041-210X.13724
Olympiou, G., & Zacharia, Z. C. (2012). Blending physical and virtual manipulatives: An effort to improve students’ conceptual understanding through science laboratory experimentation. Science Education, 96(1), 21–47. https://doi.org/10.1002/sce.20463
Olympiou, G., & Zacharia, Z. C. (2018). Examining Students’ Actions While Experimenting with a Blended Combination of Physical Manipulatives and Virtual Manipulatives in Physics. In Research on e-Learning and ICT in Education (pp. 257–278). https://doi.org/10.1007/978-3-319-95059-4_16
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. The BMJ, 372. https://doi.org/10.1136/bmj.n71
Pan, Z., Wang, Z., Yuan, Q., Meng, Q., Liu, J., Shou, K., & Sun, X. (2022). A spatial augmented reality based circuit experiment and comparative study with the conventional one. Computer Animation and Virtual Worlds, 33(3–4), 1–13. https://doi.org/10.1002/cav.2069
Phanphech, P., Tanitteerapan, T., & Murphy, E. (2019). Explaining and enacting for conceptual understanding in secondary school physics. Issues in Educational Research, 29(1), 180–204.
Potkonjak, V., Gardner, M., Callaghan, V., Mattila, P., Guetl, C., Petrović, V. M., & Jovanović, K. (2016). Virtual laboratories for education in science, technology, and engineering: A review. Computers and Education, 95, 309–327. https://doi.org/10.1016/j.compedu.2016.02.002
Puntambekar, S., Gnesdilow, D., Dornfeld Tissenbaum, C., Narayanan, N. H., & Rebello, N. S. (2021). Supporting middle school students’ science talk: A comparison of physical and virtual labs. Journal of Research in Science Teaching, 58(3), 392–419. https://doi.org/10.1002/tea.21664
Reeves, S. M., & Crippen, K. J. (2021). Virtual laboratories in undergraduate science and engineering courses: A systematic review, 2009–2019. Journal of Science Education and Technology, 30(1), 16–30. https://doi.org/10.1007/s10956-020-09866-0
Renken, M. D., & Nunez, N. (2013). Computer simulations and clear observations do not guarantee conceptual understanding. Learning and Instruction, 23(1), 10–23. https://doi.org/10.1016/j.learninstruc.2012.08.006
Rice, K., Higgins, J. P. T., & Lumley, T. (2018). A re-evaluation of fixed effect(s) meta-analysis. Journal of the Royal Statistical Society: Series A (statistics in Society), 181(1), 205–227. https://doi.org/10.1111/rssa.12275
Salta, K., Paschalidou, K., Tsetseri, M., & Koulougliotis, D. (2022). Shift From a traditional to a distance learning environment during the COVID-19 pandemic: University students’ engagement and interactions. Science and Education, 31(1), 93–122. https://doi.org/10.1007/s11191-021-00234-x
Sapounidis, T., & Demetriadis, S. (2013). Tangible versus graphical user interfaces for robot programming: Exploring cross-age children’s preferences. Personal and Ubiquitous Computing, 17(8), 1775–1786. https://doi.org/10.1007/s00779-013-0641-7
Sapounidis, T., Demetriadis, S., & Stamelos, I. (2015). Evaluating children performance with graphical and tangible robot programming tools. Personal and Ubiquitous Computing, 19(1), 225–237. https://doi.org/10.1007/s00779-014-0774-3
Sapounidis, T., Stamelos, I., & Demetriadis, S. (2016). Tangible user interfaces for programming and education: A new field for innovation and entrepreneurship. In Advances in Digital Education and Lifelong Learning (Vol. 2, pp. 271–295). Emerald Group Publishing Ltd. https://doi.org/10.1108/S2051-229520160000002016
Sapounidis, T., Stamovlasis, D., & Demetriadis, S. (2019). Latent class modeling of children’s preference profiles on tangible and graphical robot programming. IEEE Transactions on Education, 62(2), 127–133. https://doi.org/10.1109/TE.2018.2876363
Sapounidis, T., Tselegkaridis, S., & Stamovlasis, D. (2023). Educational robotics and STEM in primary education: a review and a meta-analysis. Journal of Research on Technology in Education, 1–15. https://doi.org/10.1080/15391523.2022.2160394
Schmid, R. F., Bernard, R. M., Borokhovski, E., Tamim, R. M., Abrami, P. C., Surkes, M. A., Wade, C. A., & Woods, J. (2014). The effects of technology use in postsecondary education: A meta-analysis of classroom applications. Computers and Education, 72, 271–291. https://doi.org/10.1016/j.compedu.2013.11.002
Sullivan, S., Gnesdilow, D., Puntambekar, S., & Kim, J. S. (2017). Middle school students’ learning of mechanics concepts through engagement in different sequences of physical and virtual experiments. International Journal of Science Education, 39(12), 1573–1600. https://doi.org/10.1080/09500693.2017.1341668
Taramopoulos, A., Psillos, D., & Hatzikraniotis, E. (2012). Teaching Electric Circuits by Guided Inquiry in Virtual and Real Laboratory Environments. In Research on e-Learning and ICT in Education. https://doi.org/10.1007/978-1-4614-1083-6
Tekbıyık, A., & Ercan, O. (2015). Effects of the physical laboratory versus the virtual laboratory in teaching simple electric circuits on conceptual achievement and attitudes towards the subject. International Journal of Progressive Education, 11(3), 77–89.
Thees, M., Altmeyer, K., Kapp, S., Rexigel, E., Beil, F., Klein, P., Malone, S., Brünken, R., & Kuhn, J. (2022). Augmented Reality for Presenting Real-Time Data During Students’ Laboratory Work: Comparing a Head-Mounted Display With a Separate Display. Frontiers in Psychology, 13(March), 1–16. https://doi.org/10.3389/fpsyg.2022.804742
Tingir, S., Cavlazoglu, B., Caliskan, O., Koklu, O., & Intepe-Tingir, S. (2017). Effects of mobile devices on K–12 students’ achievement: A meta-analysis. Journal of Computer Assisted Learning, 33(4), 355–369. https://doi.org/10.1111/jcal.12184
Tselegkaridis, S., & Sapounidis, T. (2021). Simulators in educational robotics: A review. Education Sciences, 11(1), 1–12. https://doi.org/10.3390/educsci11010011
Tselegkaridis, S., & Sapounidis, T. (2022a). A Systematic Literature Review on STEM Research in Early Childhood. In STEM, Robotics, Mobile Apps in Early Childhood and Primary Education (pp. 117–134). Springer, Singapore. https://doi.org/10.1007/978-981-19-0568-1_7
Tselegkaridis, S., & Sapounidis, T. (2022b). Exploring the features of educational robotics and stem research in primary education: A systematic literature review. Education Sciences, 12(5), 305. https://doi.org/10.3390/educsci12050305
Tsihouridis, C., Vavougios, D., & Ioannidis, G. S. (2013). The effectiveness of virtual laboratories as a contemporary teaching tool in the teaching of electric circuits in Upper High School as compared to that of real labs. International Conference on Interactive Collaborative Learning (ICL), September, 816–820.
Tsihouridis, C., Vavougios, D., Ioannidis, G. S., Alexias, A., Argyropoulos, C., & Poulios, S. (2015). The effect of teaching electric circuits switching from real to virtual lab or vice versa - A case study with junior high-school learners. Proceedings of 2015 International Conference on Interactive Collaborative Learning, ICL 2015, September, 643–649. https://doi.org/10.1109/ICL.2015.7318102
Ültay, N., & Aktaş, B. (2020). An example implementation of STEM in preschool education: Carrying eggs without breaking. Science Activities, 57(1), 16–24. https://doi.org/10.1080/00368121.2020.1782312
Unlu, Z. K., & Dokme, I. (2011a). The effect of combining analogy-based simulation and laboratory activities on Turkish elementary school students’ understanding of simple electric circuits. Turkish Online Journal of Educational Technology, 10(4), 320–329.
Unlu, Z. K., & Dokme, I. (2011b). The effect of three different teaching tools in science education on the students’ attitudes towards computer. Procedia - Social and Behavioral Sciences, 15, 2652–2657. https://doi.org/10.1016/j.sbspro.2011.04.164
Villena-Taranilla, R., Tirado-Olivares, S., Cózar-Gutiérrez, R., & González-Calero, J. A. (2022). Effects of virtual reality on learning outcomes in K-6 education: A meta-analysis. Educational Research Review, 35(June 2021). https://doi.org/10.1016/j.edurev.2022.100434
Wang, T. L., & Tseng, Y. K. (2018). The Comparative effectiveness of physical, virtual, and virtual-physical manipulatives on third-grade students’ science achievement and conceptual understanding of evaporation and condensation. International Journal of Science and Mathematics Education, 16(2), 203–219. https://doi.org/10.1007/s10763-016-9774-2
Wörner, S., Kuhn, J., & Scheiter, K. (2022). The best of two worlds: A systematic review on combining real and virtual experiments in science education. Review of Educational Research, XX(X), 1–42. https://doi.org/10.3102/00346543221079417
Xie, X., Siau, K., & Nah, F. F. H. (2020). COVID-19 pandemic–online education in the new normal and the next normal. Journal of Information Technology Case and Application Research, 22(3), 175–187. https://doi.org/10.1080/15228053.2020.1824884
Zacharia, Z. (2007). Comparing and combining real and virtual experimentation: An effort to enhance students’ conceptual understanding of electric circuits. Journal of Computer Assisted Learning, 23(2), 120–132. https://doi.org/10.1111/j.1365-2729.2006.00215.x
Zacharia, Z., & Constantinou, C. P. (2008). Comparing the influence of physical and virtual manipulatives in the context of the Physics by Inquiry curriculum: The case of undergraduate students’ conceptual understanding of heat and temperature. American Journal of Physics, 76(4), 425–430. https://doi.org/10.1119/1.2885059
Zacharia, Z., & de Jong, T. (2014). The effects on students’ conceptual understanding of electric circuits of introducing virtual manipulatives within a physical manipulatives-oriented curriculum. Cognition and Instruction, 32(2), 101–158. https://doi.org/10.1080/07370008.2014.887083
Zacharia, Z., & Michael, M. (2016). Using physical and virtual manipulatives to improve primary school students’ understanding of concepts of electric circuits. In New Developments in Science and Technology Education (pp. 125–140). https://doi.org/10.1007/978-3-319-22933-1_12
Zacharia, Z., & Olympiou, G. (2011). Physical versus virtual manipulative experimentation in physics learning. Learning and Instruction, 21(3), 317–331. https://doi.org/10.1016/j.learninstruc.2010.03.001
Author information
Contributions
All authors contributed to the study conception and design. All authors read and approved the final manuscript.
Ethics declarations
Conflict of interest
The authors have no conflicts of interest to declare that are relevant to the content of this article.
Future work and recommendations
First, it is important that future work frame the level of guidance in a clear and specific way; doing so systematically would strengthen the findings and expand research in this field. In addition, research should move beyond simple electric circuits to the whole range of electric circuits. Furthermore, researchers should focus on students’ skills and attitudes, as our understanding of these aspects remains very limited. Finally, researchers should pay attention to sample size and intervention duration, so that their findings gain depth and quality.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Tselegkaridis, S., Sapounidis, T. & Stamovlasis, D. Teaching electric circuits using tangible and graphical user interfaces: A meta-analysis. Educ Inf Technol 29, 8647–8671 (2024). https://doi.org/10.1007/s10639-023-12164-y