Abstract
Affect and metacognition play a central role in learning. We examine the relationships among students’ affective state dynamics, metacognitive judgments, and performance during learning with MetaTutorIVH, an advanced learning technology for human biology education. Student emotions were tracked using facial expression recognition embedded within MetaTutorIVH, and transitions between emotions theorized to be important to learning (e.g., confusion, frustration, and joy) were analyzed with respect to their likelihood of occurrence. Transitions from confusion to frustration occurred at a likelihood significantly above chance, although no differences in performance were observed in the presence of these affective states and transitions. Results suggest that the occurrence of emotions has a significant impact on students’ retrospective confidence judgments, which they made after submitting their answers to multiple-choice questions. Specifically, the presence of confusion and joy during learning had a positive impact on students’ confidence in their performance, while the presence of frustration and of transitions from confusion to frustration had a negative impact on confidence, even after accounting for individual differences in multiple-choice confidence.
1 Introduction
Research has shown that affect and metacognition play a significant role in learning. When students accomplish learning goals, they are likely to experience joy [1], while negative emotions during learning, such as frustration and confusion, can lead to disengagement with the learning material and prevent effective learning [2, 3]. For advanced learning technologies (ALTs) to interact effectively with students, they must be able to take actions that address students’ affective states and to capture the relationship between these affective states and students’ metacognitive monitoring processes [4, 5]. Developing an understanding of the relationship between students’ affect and their cognitive and metacognitive self-regulated learning (SRL) processes can contribute to the design of practical, scalable ALTs [6, 8]. This work moves toward affect-aware ALTs by using automated affect detection through facial expression recognition of emotion. Automatic affect detection has been an area of active research, building on theoretical frameworks such as the Facial Action Coding System [7] and on machine learning-based affect detectors [e.g., 3, 5, 9]. While automatic affect detection has been used to accurately predict learning outcomes from lower-level action units [e.g., 10], previous research using automatically detected affect has not considered the metacognitive processes that are also influenced by student affect and that are integral to the self-regulated learning processes shaping students’ performance during learning with ALTs.
2 MetaTutorIVH Study
A total of 66 students enrolled in a mid-Atlantic North American university participated in this study. Data from 12 students were removed due to calibration issues, resulting in 54 students (72% female). Students’ ages ranged from 18 to 29 (M = 20.5, SD = 2.34). An 18-item multiple-choice pre-test assessing prior knowledge of the biology concepts covered during learning with MetaTutorIVH indicated students had low to moderate prior knowledge (6 to 14 questions correct; M = 11.0 [61.1%], SD = 1.46 [8.1%]).
2.1 MetaTutorIVH
MetaTutorIVH is an ALT with which students learn human biology concepts through text and diagrams while making metacognitive judgments, answering multiple-choice questions, and observing a virtual human. Students interacted with MetaTutorIVH over the course of 18 counterbalanced, randomized, self-paced trials, each consisting of a complex biology question, metacognitive judgment prompts, a virtual human, and science content presented as text and diagrams. In each trial, a student was first presented with a science question and then made an ease-of-learning metacognitive judgment. The student was then presented with the content page (Fig. 1) containing the science text and diagram, as well as the virtual human. Students decided when to progress from the content page to the 4-foil multiple-choice question, which was followed by a retrospective confidence judgment (RCJ) in which students evaluated their confidence in their multiple-choice answer. Each trial concluded with the student providing a justification for their answer and an RCJ for that justification.
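The trial structure described above can be summarized as an ordered sequence of phases. This is purely an illustrative sketch: the phase names below are our own labels, not identifiers from MetaTutorIVH.

```python
# Illustrative summary of the phases in a single MetaTutorIVH trial,
# in the order described above. Phase names are our own labels,
# not identifiers from the actual system.
TRIAL_PHASES = [
    "science_question",           # complex biology question shown
    "ease_of_learning_judgment",  # EOL metacognitive judgment
    "content_page",               # science text, diagram, virtual human
    "multiple_choice_question",   # 4-foil multiple-choice question
    "answer_confidence_rcj",      # RCJ on the chosen answer
    "justification",              # written justification of the answer
    "justification_rcj",          # RCJ on the justification
]

print(len(TRIAL_PHASES))  # 7 phases per trial
```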
Facial expression features were extracted automatically using a facial expression recognition system, FACET [13]. FACET extracts facial measurements from video streams that correspond to the Facial Action Coding System [11]. A discretization process filtered out subject and measurement variance to provide conservative estimates of emotion events that are stable across students. Once the evidence scores were converted to discrete events, a sequence of emotions was created for each student. Because students engaged in complex learning processes on the content page (Fig. 1), only emotion events occurring during the content page were considered. We examined page-level sequences, each of which is the sequence of emotions a student produced on a single content page; these sequences are short (M = 5.97, SD = 6.47) due to the brief time students spent per content page (M = 100.5 s, SD = 47.3).
3 Results
The observed rates of occurrence for each emotion during student interaction with the content page indicate that joy (M = 0.67, SD = 0.61) and frustration (M = 0.66, SD = 0.60) were the most frequently occurring emotions, while contempt (M = 0.36, SD = 0.42) was the least frequently occurring emotion.
We computed a likelihood metric for transitions, calculated similarly to Cohen’s kappa (see [6] for additional details), for key learner-centered transitions averaged over page-level sequences. The likelihoods of transitions between confusion and frustration were both significantly above zero (confusion to frustration: average likelihood = 0.40, SD = 0.30; frustration to confusion: average likelihood = 0.19, SD = 0.20), which is not surprising given the strong correlation between these emotions (r(54) = 0.70, p < 0.001). However, transitions from confusion to frustration had a significantly higher likelihood across the 54 students than transitions from frustration to confusion (t(53) = 5.89, p < 0.001), indicating that, while the two emotions are correlated, confusion preceded frustration more often than frustration preceded confusion during learning.
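The likelihood metric itself is not spelled out here; a common formulation in the affect-dynamics literature (D’Mello and Graesser [1]) compares the conditional probability of the transition against the base rate of the target emotion, so that zero corresponds to chance. A minimal sketch, assuming that formulation:

```python
def transition_likelihood(seq, a, b):
    """L(a -> b) = (P(next=b | cur=a) - P(b)) / (1 - P(b)).
    Values above 0 mean the transition occurs more often than expected
    from the base rate of b alone; 0 is chance level. This follows the
    kappa-like formulation common in the affect-dynamics literature,
    which may differ in detail from the study's exact computation."""
    if len(seq) < 2:
        return None
    nexts = seq[1:]
    p_b = nexts.count(b) / len(nexts)          # base rate of b
    after_a = [nxt for cur, nxt in zip(seq, seq[1:]) if cur == a]
    if not after_a or p_b == 1:
        return None                            # undefined for this sequence
    p_b_given_a = after_a.count(b) / len(after_a)
    return (p_b_given_a - p_b) / (1 - p_b)

seq = ["confusion", "frustration", "confusion", "frustration", "joy"]
print(transition_likelihood(seq, "confusion", "frustration"))  # 1.0
```

In practice this statistic would be computed per page-level sequence and averaged within each student before testing against zero, as described above.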
Consistent with the frequencies observed at the student level, joy was observed in 51.4% of trials, frustration in 45.3%, confusion in 35.4%, and a transition from confusion to frustration in 21.2% of trials. A mixed-effects logistic regression model, fit in R with the lme4 package [12], predicting multiple-choice correctness from the presence of confusion, frustration, joy, and a transition from confusion to frustration as fixed effects, with random intercepts for students, found no significant fixed effects in a likelihood-ratio test against a null model containing only the random intercepts for students (χ2(4) = 3.73, p = 0.44). This model indicates that the presence of these emotions and of the transition from confusion to frustration had no effect on multiple-choice performance after accounting for individual differences.
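As a check on the reported likelihood-ratio test, the p-value of a χ2 statistic with an even number of degrees of freedom has a closed form. The sketch below (a stdlib-only helper of our own, not code from the study) reproduces p ≈ 0.44 from χ2(4) = 3.73:

```python
import math

def chi2_sf_even_df(x, df):
    """Survival function (upper-tail p-value) of the chi-square
    distribution for even df, via the closed form
    sf(x) = exp(-x/2) * sum_{k < df/2} (x/2)^k / k!."""
    assert df % 2 == 0 and df > 0
    half = x / 2.0
    return math.exp(-half) * sum(half**k / math.factorial(k)
                                 for k in range(df // 2))

# Likelihood-ratio test of the full model against the intercepts-only
# null model, using the chi2(4) = 3.73 statistic reported in the text.
p = chi2_sf_even_df(3.73, 4)
print(round(p, 2))  # 0.44, matching the reported p-value
```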
The relationship between the presence of learner-centered emotions and RCJs was examined with a linear mixed-effects model using fixed effects for the presence of confusion, frustration, joy, and a transition from confusion to frustration, and random intercepts for each student. A significant effect of the emotion fixed effects was found through a nested F-test comparing the full linear mixed-effects model against a reduced model containing only the random intercepts for students (F(4, 915) = 2.45, p = 0.045, R2 = 0.36). The fixed effects are reported in Table 1 and indicate that the presence of confusion and joy had a positive impact on students’ RCJs, specifically multiple-choice confidence, while the presence of frustration and a transition from confusion to frustration had a negative impact.
Additionally, to assess the relationship between multiple-choice performance (a binary measure of correctness) and multiple-choice confidence (i.e., an RCJ), a Welch’s two-sample t-test accounting for unequal variance between groups indicated that students’ confidence was significantly greater (t(842) = 9.1, p < 0.001, Cohen’s d = 0.60) in trials where the multiple-choice question was answered correctly (M = 84.0, SD = 15.5) than in trials answered incorrectly (M = 74.5, SD = 16.2).
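The reported effect size can be recovered directly from the group means and standard deviations above. This is a minimal sketch assuming an equal-n pooled standard deviation; the study's exact pooling may differ slightly.

```python
import math

def cohens_d(m1, sd1, m2, sd2):
    """Cohen's d using the pooled standard deviation of two groups
    (equal-n pooling assumed; the study's exact computation may
    weight the groups differently)."""
    pooled_sd = math.sqrt((sd1**2 + sd2**2) / 2)
    return (m1 - m2) / pooled_sd

# Group statistics reported in the text: confidence on correctly vs.
# incorrectly answered multiple-choice trials.
d = cohens_d(84.0, 15.5, 74.5, 16.2)
print(round(d, 2))  # 0.6, consistent with the reported d = 0.60
```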
4 Conclusion
Using an automatic affect detection system embedded in MetaTutorIVH, we analyzed learner-centered emotions and their influence on students’ learning and RCJs. Joy and frustration were the most frequently occurring emotions when examining the absolute frequency of discrete emotion events. Analysis of the affective dynamics revealed transitions between confusion and frustration to be significantly more likely than chance, with transitions from confusion to frustration being especially prominent. The presence of learner-centered emotions (joy, confusion, frustration) and of transitions from confusion to frustration during complex learning had no detectable effect on multiple-choice performance. Additional analyses revealed positive effects of confusion and joy on RCJs and negative effects of frustration and of transitions from confusion to frustration. These results can inform the design of ALTs that assist learners in both cognitive and metacognitive processes by monitoring and intervening based on their affective expressions.
References
D’Mello, S., Graesser, A.: Dynamics of affective states during complex learning. Learn. Instruct. 22(2), 145–157 (2012)
Baker, R., D’Mello, S., Rodrigo, M.M.T., Graesser, A.: Better to be frustrated than bored: the incidence, persistence, and impact of learners’ cognitive affective states during interactions with three different computer-based learning environments. Int. J. Hum. Comput. Stud. 68(4), 223–241 (2010)
Lallé, S., Conati, C., Carenini, G.: Predicting confusion in information visualization from eye tracking and interaction data. In: International Joint Conference on Artificial Intelligence, pp. 2529–2535 (2016)
du Boulay, B., Avramides, K., Luckin, R., Martínez-Mirón, E., Rebolledo Méndez, G., Carr, A.: Towards systems that care: a conceptual framework based on motivation, metacognition and affect. Int. J. Artif. Intell. Educ. 20(3), 197–229 (2010)
D’Mello, S., Graesser, A.: Feeling, thinking, and computing with affect-aware learning technologies. In: Calvo, R., D’Mello, S., Gratch, J., Kappas, A. (eds.) The Oxford Handbook of Affective Computing, pp. 419–434. Oxford Library of Psychology (2014)
Azevedo, R., Mudrick, N., Taub, M., Wortha, F.: Coupling between metacognition and emotions during STEM learning with advanced learning technologies: a critical analysis, implications for future research, and design of learning systems. In: Michalsky, T., Schechter, C. (eds.) Self-regulated Learning: Conceptualization, Contribution, and Empirically Based Models for Teaching and Learning. Teachers College Press, New York (2017)
Ekman, P., Friesen, W.V.: Measuring facial movement. Environ. Psychol. Nonverbal Behav. 1(1), 56–75 (1976)
Calvo, R.A., D’Mello, S.: Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans. Affect. Comput. 1(1), 18–37 (2010)
Botelho, A.F., Baker, R.S., Heffernan, N.T.: Improving sensor-free affect detection using deep learning. In: André, E., Baker, R., Hu, X., Rodrigo, M.M.T., du Boulay, B. (eds.) AIED 2017. LNCS (LNAI), vol. 10331, pp. 40–51. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-61425-0_4
Sawyer, R., Smith, A., Rowe, J., Azevedo, R., Lester, J.: Enhancing student models in game-based learning with facial expression recognition. In: Proceedings of the Twenty-Fifth Conference on User Modeling, Adaptation, and Personalization, pp. 192–201. ACM, Bratislava, Slovakia (2017)
Ekman, P., Friesen, W.V.: Facial Action Coding System (1977)
Bates, D., Mächler, M., Bolker, B.M., Walker, S.C.: Fitting linear mixed-effects models using lme4. J. Stat. Softw. 67, 1–48 (2015)
iMotions Biometric Research Platform 6.0, iMotions A/S, Copenhagen, Denmark (2016)
Acknowledgements
This research was supported by funding from the National Science Foundation (DRL #1431552). The authors would also like to thank members of the SMART Lab and IntelliMedia Group for their contributions to this project.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Sawyer, R., Mudrick, N.V., Azevedo, R., Lester, J. (2018). Impact of Learner-Centered Affective Dynamics on Metacognitive Judgements and Performance in Advanced Learning Technologies. In: Penstein Rosé, C., et al. Artificial Intelligence in Education. AIED 2018. Lecture Notes in Computer Science(), vol 10948. Springer, Cham. https://doi.org/10.1007/978-3-319-93846-2_58
Print ISBN: 978-3-319-93845-5
Online ISBN: 978-3-319-93846-2