Abstract
This study analyzes students’ behavior in our remote laboratory environment and aims at identifying behavioral patterns during a practical session that lead to better learning outcomes, in order to predict learners’ performance and to automatically guide students who might need more support. Based on data collected from an experiment conducted in an authentic learning context, we discover recurrent sequential patterns of actions that lead us to the definition of learning strategies as higher-level indicators. Results show that some of these strategies are correlated with learners’ performance on the final assessment test. For instance, constructing a complex action step by step, or reflecting before submitting an action, are two strategies applied more often by higher-performing learners than by others. These findings led us to equip our remote lab environment with new guiding and tutoring tools for both students and instructors.
1 Introduction
Research on predictors of success in learning has been a hot topic for decades [1]. Predictors are characteristics of learners such as work style preference, self-efficacy [2], or background and expectations [3]. However, the development of Educational Data Mining and Learning Analytics provides new capabilities to explore learners’ behavior and to study its influence on their performance.
In the field of virtual and remote laboratories, interactions between students and apparatus are behavioral data that can be used as inputs for the above techniques. Based on data from an experiment conducted in a real setting, we adopt sequential pattern mining approaches to investigate learners’ behaviors during practical sessions and to discover sequences of actions, or learning strategies, that are representative of their level of performance.
2 Experimental Settings
The experiment involved 85 first-year students enrolled in an introductory course on Shell commands and was conducted over 3 weeks. Learners had 24/7 access to our remote lab dedicated to computer education to complete their tasks. This web-based environment offers on-demand virtual resources and features advanced learning capabilities [4]. Each learner is provided with her own set of resources and can interact with them through a web-based Terminal. The learning features include real-time communication, collaborative work, awareness tools, as well as tools for deep analysis of working sessions.
These features are made possible by a learning analytics framework able to collect most of the users’ interactions with the platform. Our analysis focuses on the Shell commands submitted by learners, from which a total of 9183 xAPI statements was collected. Our objective was to explore these interactions to investigate their possible correlation with the assessment score (AS), i.e., the score learners obtained at the final assessment test. As the distribution of AS revealed three distinct categories (low - L, medium - M, and high - H), we also studied correlations with these categories (AScat).
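For illustration only, the grouping of learners into the three AS categories can be sketched as below. The paper does not give the actual cut points, so tertile-based thresholds are assumed here purely as an example.

```python
# Hypothetical sketch: bin assessment scores (AS) into the three observed
# categories (L, M, H). Tertile cut points are an assumption, not the
# thresholds used in the study.
from statistics import quantiles

def categorize_scores(scores):
    """Map each score to 'L', 'M', or 'H' using tertile cut points."""
    low_cut, high_cut = quantiles(scores, n=3)
    return ['L' if s <= low_cut else 'M' if s <= high_cut else 'H'
            for s in scores]

scores = [4.0, 12.5, 15.0, 8.0, 18.5, 10.0]  # invented example scores
print(categorize_scores(scores))  # one category label per learner
```

Any monotone binning of AS would fit the analysis; tertiles are merely the simplest choice that yields three groups.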
3 Pattern Mining Analysis
3.1 Nature of Actions
To abstract away from our specific learning context, we applied a pattern mining analysis not to the commands themselves, but to their nature, their relationships, and the result of their execution. Our analysis identified 8 exclusive natures of actions: Sub_S, Sub_F, ReSub_S, ReSub_F, VarSub_S, VarSub_F, Help and NewHelp. The natures Sub_* refer to an action whose type is different from the previous one, and which is evaluated as right (Sub_S) or wrong (Sub_F). The natures ReSub_* consider an action that is identical to the previous one (i.e., same type and parameters), while the natures VarSub_* represent a command of the same type as the previous one, but with different parameters. Finally, Help denotes a help-seeking action about the type of the previous action, while NewHelp indicates a help access unrelated to the previous action.
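The labelling rules above can be sketched as follows; the `Action` record and its field names are assumptions made for illustration, and only the eight labels and the decision rules come from the text.

```python
# A minimal sketch of the nature-of-action labelling. The Action record and
# its fields (kind, params, success) are hypothetical; the eight exclusive
# labels and the rules follow the description in the text.
from dataclasses import dataclass

@dataclass
class Action:
    kind: str        # command type, e.g. 'ls', 'grep', or 'help'
    params: str      # parameters of the command
    success: bool    # result of execution (ignored for help actions)

def nature(prev, curr):
    """Label `curr` with one of the eight exclusive natures, given `prev`."""
    if curr.kind == 'help':
        # Help about the previous action's type, or an unrelated help access.
        return 'Help' if prev and curr.params == prev.kind else 'NewHelp'
    suffix = '_S' if curr.success else '_F'
    if prev and curr.kind == prev.kind:
        # Same type: identical parameters -> ReSub, different ones -> VarSub.
        return ('ReSub' if curr.params == prev.params else 'VarSub') + suffix
    return 'Sub' + suffix  # type differs from the previous action

print(nature(Action('ls', '-l', True), Action('ls', '-la', True)))  # VarSub_S
```

Applying `nature` pairwise over a session turns the raw command log into the symbolic sequence on which the pattern mining operates.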
3.2 Patterns of Actions
To discover statistically significant sequences of actions, we analyzed sequences of length two and three, applying to each sequence an analysis of variance (i.e., one-way ANOVA) for AScat and a Pearson correlation test for AS. We identified 13 significant patterns, most of them being used more often by high- and medium-level students than by others, and presenting a significant weak (i.e., \(0.1< |r| < 0.3\)) or medium (i.e., \(0.3< |r| < 0.5\)) correlation with AS. Moreover, the patterns share common semantics depicting students’ behaviors: they can be viewed as learning strategies followed by learners to solve a problem.
3.3 Learning Strategies
Starting from the 13 significant patterns, we specified 7 learning strategies: confirmation, progression, success-then-reflection, reflection-then-success, fail-then-reflection, trial-and-error, and withdrawal. Confirmation is the successful resubmission of the same action, while progression depicts a sequence of successful actions of the same type, but whose parameters are different. Success-then-reflection expresses a successful action followed by access to a related help. Conversely, reflection-then-success appears when students first access help about a certain type of action, and then submit the matching action successfully. Fail-then-reflection shows a help access related to an action that failed. Finally, withdrawal matches an action of a different type than the previous one whose submission failed. Analysis results for each strategy are presented in Table 1.
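An illustrative encoding of some of these strategies as rules over short nature sequences is sketched below; the exact rules used in the study are not spelled out in the text, so the mapping here is an assumption.

```python
# Illustrative (not the authors' exact) rules mapping unigrams and bigrams
# of action natures to strategies. Trial-and-error is omitted for brevity.
STRATEGY_RULES = {
    ('ReSub_S',): 'confirmation',
    ('Sub_F',): 'withdrawal',
    ('VarSub_S', 'VarSub_S'): 'progression',
    ('Sub_S', 'Help'): 'success-then-reflection',
    ('Help', 'Sub_S'): 'reflection-then-success',
    ('Sub_F', 'Help'): 'fail-then-reflection',
}

def count_strategies(natures):
    """Count rule matches over unigrams and bigrams of the nature sequence."""
    counts = {}
    for size in (1, 2):
        for i in range(len(natures) - size + 1):
            strategy = STRATEGY_RULES.get(tuple(natures[i:i + size]))
            if strategy:
                counts[strategy] = counts.get(strategy, 0) + 1
    return counts

session = ['Sub_F', 'Help', 'Sub_S', 'VarSub_S', 'VarSub_S', 'ReSub_S']
print(count_strategies(session))
```

Per-learner strategy counts obtained this way are the quantities whose correlation with AS and AScat is reported in Table 1.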
The strategies progression, success-then-reflection, reflection-then-success and fail-then-reflection present significant results. The first three make it possible to assign students to a performance category, and seem to be behavioral traits of high- and medium-level students. Under the progression strategy, high-level students seem to decompose their problem into steps of increasing complexity. The three other strategies are related to reflection through the use of help.
Moreover, the 4 strategies above are all positively correlated with AS: the results do not reveal particular behaviors of low-level learners. Another interesting result concerns the withdrawal strategy, which is applied homogeneously by all students and is thus irrelevant for predicting performance or making a decision.
4 Results Exploitation
The results of our analysis allow for on-the-fly detection of learners’ behaviors and open the door for new opportunities. Indeed, the continuous improvement of TEL-based systems, according to experimental findings resulting from their usage, is a critical part of the re-engineering process [5]. Applied to learning analytics, this enhancement cycle makes it possible to discover new design patterns and to generate new data for research about and improvement of TEL [6].
Thus, with respect to the above methodology, we integrated into our remote lab two new features built on two distinct design patterns. The first feature relies on an intelligent tutoring system (ITS) able to guide learners during their practical sessions according to the learning strategies they follow. For instance, when a learner fails several times to execute a command, the ITS suggests that she read the matching manual, so that she becomes engaged in the reflection-then-success strategy leading to better performance. The second design pattern is an awareness system intended for teachers, highlighting students who seem to present weaknesses. For instance, if several learners follow the withdrawal strategy on the same command, the system notifies the teachers so they can make a collective intervention. These new features are already implemented in our system and will be evaluated in the near future to assess their impact on both learners’ and teachers’ behaviors.
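The ITS trigger described above can be sketched as a simple rule; the failure threshold and the hint message are assumptions made for illustration, not details taken from the paper.

```python
# A hypothetical sketch of the ITS intervention: after repeated failures of
# the same command, nudge the learner toward the manual so she can engage in
# the reflection-then-success strategy. Threshold and wording are assumed.
FAIL_THRESHOLD = 3  # consecutive failures before intervening (assumed)

def tutor_hint(history):
    """Return a hint if the last FAIL_THRESHOLD actions are failures of the
    same command type; `history` is a list of (command, success) pairs."""
    recent = history[-FAIL_THRESHOLD:]
    if (len(recent) == FAIL_THRESHOLD
            and all(not ok for _, ok in recent)
            and len({cmd for cmd, _ in recent}) == 1):
        cmd = recent[0][0]
        return f"Having trouble with '{cmd}'? Try reading 'man {cmd}' first."
    return None

print(tutor_hint([('grep', False), ('grep', False), ('grep', False)]))
```

The teacher-facing awareness system would follow the same pattern, aggregating such triggers across learners before notifying the instructor.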
5 Conclusion
This paper aimed at revealing relationships between learners’ behaviors during practical learning situations and their academic performance. We adopted a sequential pattern mining approach to reveal correlations between the learning strategies we identified and high-level students. The data we analyzed only relate to interactions between learners and the resources required to achieve the practical work; work is in progress to extend our analysis model to other tracking data collected by the system, in order to analyze their causal nature in depth, but also to compute a predictive model reducing the failure rate. These data include social traces resulting from cooperative and collaborative tasks, which will allow us to study new research questions about learners’ behavior in practical work situations, in a socio-constructivist context.
References
1. Blikstein, P.: Using learning analytics to assess students’ behavior in open-ended programming tasks. In: Proceedings of the 1st International Conference on Learning Analytics and Knowledge, pp. 110–116. ACM (2011)
2. Wilson, B.C., Shrock, S.: Contributing to success in an introductory computer science course - a study of twelve factors. In: Proceedings of the 32nd SIGCSE Technical Symposium on Computer Science Education, pp. 184–188. ACM (2001)
3. Rountree, N., Rountree, J., Robins, A., Hannah, R.: Interacting factors that predict success and failure in a CS1 course. ACM SIGCSE Bull. 36(4), 101–104 (2004)
4. Broisin, J., Venant, R., Vidal, P.: Lab4CE: a remote laboratory for computer education. Int. J. Artif. Intell. Educ. 27(1), 154–180 (2017)
5. Corbière, A., Choquet, C.: Re-engineering method for multimedia system in education. In: Proceedings of the Sixth IEEE International Symposium on Multimedia Software Engineering, pp. 80–87. IEEE (2004)
6. Inventado, P.S., Scupelli, P.: Data-driven design pattern production: a case study on the ASSISTments online learning system. In: Proceedings of the 20th European Conference on Pattern Languages of Programs. ACM (2015)
© 2017 Springer International Publishing AG
Venant, R., Sharma, K., Dillenbourg, P., Vidal, P., Broisin, J. (2017). A Study of Learners’ Behaviors in Hands-On Learning Situations and Their Correlation with Academic Performance. In: André, E., Baker, R., Hu, X., Rodrigo, M., du Boulay, B. (eds) Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science(), vol 10331. Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_66
Print ISBN: 978-3-319-61424-3
Online ISBN: 978-3-319-61425-0