1 Introduction

Do my students understand? This question lingers in every instructor’s mind after each lesson. With the adoption of online teaching under COVID-19 safe management measures, it is no longer feasible to observe individual students’ expressions in class to gauge their understanding. One option is to collect reflections from students after each lesson and extract relevant feedback, so that doubts or misconceptions can be addressed in a timely manner. In general, reflection is important for interpreting and internalizing academic activity [3]. More specifically, Karm [3] has shown that, in order to support the development of university instructors and academics in the teaching profession, it is crucial to engage students in active reflection.

Reflection is currently used mainly to support learning, whether the learner is a student, an instructor, or an instructor trainee. Interestingly, one article describes how reflection can be used as a feedback tool to improve the curriculum [5], suggesting that it has the potential to evolve into a tool for instructors and not merely a learning tool for students.

Reflection journals or learning logs are commonly used as platforms for students to express what they have experienced, what they might have learned, and their doubts or questions. Even though many insights, such as misconceptions or doubts, can be extracted, it remains a challenge to effectively analyse these free-form reflection texts.

In this paper, we propose an automated reflection framework that enables reflection to be used as an agile course evaluation tool for instructors, rather than only as a learning tool for students. In the next section, we explain how reflection can be used as a course evaluation tool. Section 3 presents the proposed automated reflection framework, while Sect. 4 covers our recommendations and conclusion.

2 Reflection as an Agile Course Evaluation Tool

Sharp and Lang [6] have proposed a conceptual framework for integrating agile principles into teaching and learning. According to the authors, “given that instructors face large amounts of uncertainty regarding the needs and capabilities of the students prior to or at the beginning of a course, it appears that agile principles may be useful in the course development process”. This is in line with observations from Gibson et al. [2], who reported that more frequent early reflections throughout a course may help teachers and students improve through receiving more immediate and regular feedback. This approach also allows concerns to be raised throughout the learning journey, rather than at the end of a long period of teaching. In other words, it can be more useful to collect reflections at the end of each lesson, instead of at the end of a course, to allow agility in evaluating course delivery.

Instead of the end-of-course assessment alignment suggested by Ozdemir et al. [5], Chen et al. [1] analyzed students’ journals on a weekly basis and discovered additional topics that are of interest to the students but not explicitly covered by the weekly syllabus. This has the potential to guide instructors in developing future teaching content and activities that are tailored toward students’ needs. Lo et al. [4], on the other hand, used an automated method to effectively extract questions or doubts. This gives instructors an option to adjust teaching materials dynamically based on students’ reflections after each lesson. With the doubts identified, a list of topics that require further explanation or clarification can be extracted, and additional materials can be designed to address the misconceptions before the start of the next lesson.
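Lo et al. [4] describe their own automated extraction approach; purely as an illustrative sketch of the idea (not their method), a rule-based pass over reflection text could flag candidate doubts by looking for question marks and uncertainty cues. The cue list and example reflection below are assumptions made for illustration only.

    import re

    # Hypothetical cue phrases signalling doubt or confusion (an assumption,
    # not the cue set used by Lo et al. [4]).
    DOUBT_CUES = ("not sure", "don't understand", "do not understand",
                  "confused", "unclear", "struggling with")

    def extract_doubts(reflection: str) -> list[str]:
        """Return sentences that look like questions or expressions of doubt."""
        # Naive sentence split on ., ? or ! followed by whitespace.
        sentences = re.split(r"(?<=[.?!])\s+", reflection.strip())
        return [s for s in sentences
                if s.endswith("?") or any(cue in s.lower() for cue in DOUBT_CUES)]

    # Example with a fabricated reflection snippet.
    print(extract_doubts(
        "I enjoyed the lab on recursion. I am still not sure how base cases "
        "are chosen. Could you go through tail recursion again?"
    ))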

As discussed above, when reflection is used as a tool for course evaluation rather than contributing directly to assessment, it is usually not evaluated stringently, and there is currently no framework for analysing the data. Although various rubrics and frameworks have been proposed to evaluate the depth of reflection for learning, to the best of our knowledge, no comprehensive framework has been proposed for using reflection as a course evaluation tool.

3 Proposed Automated Reflection Framework

The course feedback collected at the end of the term usually only benefits the next cohort of students and has no direct impact on the current cohort. In this paper, we propose adopting both end-of-term course feedback and individual lesson reflections as a tool for agile course evaluation. The core source of input is the students’ self-reflections from each lesson, which collectively complement the end-of-term course feedback (as shown in Fig. 1).

The analytics output component in the proposed automated reflection framework (Fig. 1) consists of two core analyses: objective-based and doubt/misconception-based. The objective-based analysis allows instructors to compare the themes stated in the course or lesson objectives with the content mentioned in the students’ lesson reflections and course feedback. This provides insight into whether the objectives are well covered and whether any new themes emerge that need further clarification. The doubt/misconception-based analysis is useful for identifying topics that require further attention. By aligning with lesson objectives, personalised learning advice can be offered to individual students on top of addressing the common misconceptions identified. Finally, the results from the Reflection Analytics Model (Fig. 2) are presented on an analytics dashboard that enables users to perform both the objective-based and the doubt/misconception-based analyses.

Fig. 1. Proposed automated reflection framework
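As a minimal sketch of how the objective-based analysis could be realised (one possible implementation, not one prescribed by the framework), lesson objectives and reflection texts can be projected into a shared TF-IDF space with scikit-learn and compared by cosine similarity to estimate how well each objective is covered. The objectives and reflections below are illustrative placeholders.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    objectives = [
        "explain the difference between supervised and unsupervised learning",
        "apply k-means clustering to a small data set",
    ]
    reflections = [
        "I learnt how k-means groups points, but the choice of k is unclear.",
        "Supervised learning with labelled data finally makes sense to me.",
    ]

    # Fit one vocabulary over both objectives and reflections.
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(objectives + reflections)
    obj_vecs, ref_vecs = matrix[:len(objectives)], matrix[len(objectives):]

    # coverage[i, j]: similarity of objective i to reflection j; the maximum
    # over reflections is a crude indicator of how well objective i is covered.
    coverage = cosine_similarity(obj_vecs, ref_vecs)
    for objective, scores in zip(objectives, coverage):
        print(f"{objective[:45]:45s} max similarity = {scores.max():.2f}")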

Along with our automated reflection framework, we have created a Reflection Analytics Model (Fig. 2) that summarises the various automated analysis approaches used in recent research. The model consists of five main sections: Information Retrieval, Pre-Processing, Feature Extraction, Reflection Classification Model, and Analytics Dashboard.

Fig. 2. Reflection analytics model that represents the automated analysis process.
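The model does not prescribe specific algorithms for each stage. As a hedged end-to-end sketch, the five stages could be chained as below, with TF-IDF features and a logistic regression classifier standing in for whichever feature extraction and classification methods are actually adopted; the retrieval function, training texts, and labels are placeholders.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    def retrieve_reflections(lesson_id: int) -> list[str]:
        """Information Retrieval: load reflections for one lesson (placeholder
        returning hard-coded text instead of querying a learning system)."""
        return ["The proof was clear.", "I am lost on induction, please revisit it."]

    def preprocess(texts: list[str]) -> list[str]:
        """Pre-Processing: lower-case and strip whitespace (deliberately minimal)."""
        return [t.lower().strip() for t in texts]

    # Feature Extraction + Reflection Classification Model, here trained to
    # label a reflection as containing a doubt (1) or not (0).
    model = Pipeline([
        ("features", TfidfVectorizer()),
        ("classifier", LogisticRegression()),
    ])
    train_texts = preprocess(["great lesson", "i am confused about recursion",
                              "everything made sense", "please explain joins again"])
    model.fit(train_texts, [0, 1, 0, 1])

    # Analytics Dashboard: in practice the predictions would feed a dashboard;
    # here they are simply printed per reflection.
    texts = preprocess(retrieve_reflections(lesson_id=3))
    for text, label in zip(texts, model.predict(texts)):
        print(label, text)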

4 Recommendations and Conclusion

Through our proposed automated reflection framework, we recommend collecting reflections/feedback both at the end of individual lessons and end of term. We propose to use the reflection after each lesson as an enabling tool for all types of courses and provide some guiding questions to aid in extracting relevant information.

The two generic questions are:

  • Question 1: What have you learnt from the session?

  • Question 2: What needs further explanation? What could have been done better?
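As a small illustration of how responses to these two questions might be stored for later analysis, a reflection record could look like the following sketch; the field names and example values are our own hypothetical choices.

    from dataclasses import dataclass

    @dataclass
    class LessonReflection:
        """One student's reflection for one lesson; field names are illustrative."""
        student_id: str
        lesson_id: int
        learnt: str             # response to Question 1
        needs_explanation: str  # response to Question 2

    entry = LessonReflection(
        student_id="S001",
        lesson_id=4,
        learnt="How to normalise a database schema.",
        needs_explanation="The difference between 3NF and BCNF is still unclear.",
    )
    print(entry)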

The above qualitative and open-ended questions can work hand in hand with quantitative feedback collected using Likert-type items or scales. Based on the findings of Lo et al. [4], although quantitative feedback can provide an understanding of students’ overall self-assessment, written student reflections are able to identify themes and concepts that do not emerge from quantitative analysis.
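A minimal sketch of how the two channels could sit side by side, assuming Likert responses coded 1 to 5 and a hand-picked keyword list for crude theme counts; all data and keywords below are fabricated for illustration.

    from collections import Counter
    from statistics import mean

    # Illustrative data: Likert-type ratings (1-5) and free-text responses to
    # Question 2 for one lesson.
    likert_ratings = [4, 5, 3, 4, 2, 5, 4]
    question2_responses = [
        "Recursion base cases are still unclear.",
        "Nothing, the pace was good.",
        "Please explain recursion trees again.",
        "More examples on complexity analysis would help.",
    ]

    # Quantitative channel: a single summary statistic of overall self-assessment.
    print(f"Mean rating: {mean(likert_ratings):.2f}")

    # Qualitative channel: crude theme counts over assumed keywords; a real
    # analysis would use the classification model from the framework.
    themes = ["recursion", "complexity", "pace"]
    counts = Counter(theme for response in question2_responses
                     for theme in themes if theme in response.lower())
    print("Theme mentions:", dict(counts))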

The focus of the end-of-term data collection is on delivery, assessment design, and the depth or coverage of teaching materials. The proposed framework helps instructors to incrementally improve the course by comparing the actual delivery of course materials, as captured in students’ reflections and feedback, against the lesson or course objectives set at the beginning of the course.

In this paper, we have proposed an Automated Reflection Framework that enables an end-to-end analysis of reflection to provide agility in course evaluation. This automated extraction of reflection allows a systematic analysis of information, provides agility in course adjustment (through reflection analysis from individual lessons), and improves the learning experience for the current cohort of students.