
1 Introduction

Preparing lifelong learners, who develop self-regulation skills and continuously grow as professionals, is a key challenge in higher education. One way to develop self-regulation skills is to include work-based activities that allow students to engage in professional practice for a substantial part of their studies. However, it is challenging to gain a holistic view of the placement experience and to link placement to continuous personal development. This requires noticing temporal patterns related to placement engagement, which can trigger reflection and promote self-regulation [1]. A common way to support the discovery of temporal patterns is by using visual dashboards. The effectiveness of visualisations depends on the capability to discover relevant patterns and link them to self-regulated learning (SRL). Unlike (usually static) visualisations, intelligent data analysis allows automatic discovery of temporal patterns, so that interactive nudges can be provided to trigger reflection. This calls for the identification of aspects and patterns of the learners’ data which are beneficial for self-regulation. Our work investigates this in a case study of medical education. At our institution, students are fairly independent in their choice of workplace-based assessment (WBA) topic and timing, which necessitates developing SRL skills. An interactive visualisation backed by analytics could help students with sense-making and action planning for their learning.

We investigate two temporal analytics methods, burst detection and process mining, and address the following research questions: (i) What patterns can be derived from WBA data using burst detection and process mining? (ii) Which patterns identified using temporal analytics are judged as useful by educators?

2 Methods

The goal of analysing the temporal aspect of workplace-based assessment data is to identify patterns and processes which can support students’ self-regulated learning. We conduct both cohort-level and individual-level analysis.

Data. We use WBA data for a cohort of first-year medical students, consisting of 2360 unique assessments undertaken by 228 students between January and June 2017. During work placements, students are assessed on a list of mandatory and optional clinical skills that they need to acquire throughout their degree. As students could freely choose the number of assessments they wished to undertake, there are considerable differences in assessment counts between students.
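To make the data layout concrete, the sketch below assumes a hypothetical export `wba_export.csv` with one row per assessment and illustrative column names (`student_id`, `skill`, `skill_type`, `timestamp`); the actual export format may differ. It simply summarises the spread in assessment counts across the cohort.

```r
# Minimal sketch of the assumed WBA data layout and per-student counts.
# File name and column names are illustrative assumptions, not the actual schema.
library(dplyr)

wba_data <- read.csv("wba_export.csv", stringsAsFactors = FALSE)
wba_data$timestamp <- as.Date(wba_data$timestamp)

# Number of assessments per student, showing the variation across the cohort
counts_per_student <- wba_data %>%
  count(student_id, name = "n_assessments")

summary(counts_per_student$n_assessments)
```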

Burst Detection. Students can freely choose when to undertake a WBA during their placements. Although they are encouraged to do the assessments regularly, students often complete a number of assessments in a very short period of time (usually on the same day), resulting in a ‘burst’, i.e. a spike in assessment activity. Such burstiness might be a possible parameter for identifying at-risk students. We implemented the burst detection algorithm of [2] and applied it to the whole cohort and to each student separately (cf. Fig. 1). At the cohort level we identified 10 bursts, and noted that the most noticeable bursts corresponded to the ends of placements.

Fig. 1. Burst detection examples. Solid line = # assessments, dotted line = cutoff. A burst occurs at any time point where the solid line is above the dotted line.
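The algorithm of [2] is not reproduced here; as a simplified illustration of the cutoff idea shown in Fig. 1, the sketch below flags a day as part of a burst whenever its assessment count exceeds a cutoff, taken here as the mean daily count plus two standard deviations (an assumed heuristic, not the cutoff used in our analysis). It reuses the hypothetical `wba_data` layout from the sketch above.

```r
# Simplified cutoff-based burst flagging (illustrative; not the algorithm of [2]).
detect_bursts <- function(timestamps) {
  timestamps <- as.Date(timestamps)
  all_days   <- seq(min(timestamps), max(timestamps), by = "day")

  # Daily assessment counts, including zero-activity days
  counts <- table(factor(as.character(timestamps), levels = as.character(all_days)))
  daily_counts <- as.integer(counts)

  # Assumed cutoff: mean daily activity plus two standard deviations
  cutoff <- mean(daily_counts) + 2 * sd(daily_counts)

  data.frame(
    day   = all_days,
    count = daily_counts,
    burst = daily_counts > cutoff  # TRUE where the solid line exceeds the dotted line
  )
}

# Cohort-level bursts
cohort_bursts <- detect_bursts(wba_data$timestamp)

# Per-student bursts
per_student <- lapply(
  split(wba_data, wba_data$student_id),
  function(d) detect_bursts(d$timestamp)
)
```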

Process Mining. Process mining transforms temporal data into an event log which generalises unique individual paths through a task into common pathways. It originated in the business domain and is used extensively in healthcare (e.g. [3]). It has been applied to a limited extent in education, particularly in the field of educational data mining (e.g. [4]). Until now its applicability has not been investigated for WBA data. We used the bupaR process mining package in R for the processing (Footnote 1). The WBA event log yielded almost no common processes (225 unique paths for 228 students). A coarser granularity (e.g. skill category, assessor role) might result in more common pathways. We were still able to use the event log to obtain some pre-defined metrics, including a summary of the trace lengths (i.e. the number of assessments per student; cf. Fig. 2a) and the percentage of students who have completed a given clinical skill (cf. Fig. 2b).

Fig. 2. Example visualisations of queries against the WBA event log.
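For reference, a minimal sketch of how such event-log queries could be expressed with bupaR and the edeaR descriptives package is shown below. The column names follow the hypothetical layout introduced earlier, and the exact calls used to produce Fig. 2 may differ.

```r
# Minimal bupaR/edeaR sketch (column names are illustrative assumptions).
library(bupaR)
library(edeaR)

wba_log <- simple_eventlog(
  eventlog    = wba_data,
  case_id     = "student_id",  # one trace per student
  activity_id = "skill",       # each assessed clinical skill is an activity
  timestamp   = "timestamp"    # assessment date/time
)

# Distribution of trace lengths, i.e. assessments per student (cf. Fig. 2a)
trace_length(wba_log, level = "case")

# Share of students in whose trace a given clinical skill appears (cf. Fig. 2b)
activity_presence(wba_log)

# Number of distinct paths through the assessments (225 for our 228 students)
number_of_traces(wba_log)
```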

3 Evaluation and Conclusions

Our initial evaluation involved semi-structured interviews with two educators (one clinical education expert responsible for developing and running the clinical skills education programme, and one technology-enhanced learning expert responsible for the e-portfolio and TEL outreach). We asked: (i) Is a particular analytics method producing any useful insights? (ii) Is it useful for students or for educators? The materials used in the interviews are made available (Footnote 2).

Feedback on Burst Detection. It is important to know when students are completing assessments and whether they are consistent. Burst detection would be useful from an administrative perspective, especially if mapped against the beginning and end of placements. It could also be used for quality assurance of individual placements (e.g. whether students are given a range of opportunities for assessment). The method would be less useful for students. Furthermore, interpretation of the results should take into account that some clinical skills are commonly assessed together, so several assessments in a day might not reflect a true burst.

Feedback on Process Mining. In general, process mining would be useful from an administrative perspective, such as assessing placement quality. The skill-type analysis was judged as particularly useful, as it shows that students are not engaging with optional skills. As students progress through the degree, they are expected to recognise WBA as a learning opportunity and engage with the optional skills more. Additional information about the expected entrustability level and skill category, as well as a comparison to the cohort, would be useful.

Issues Surrounding Temporal Analytics. Both educators pointed out that the temporal analytics are useful but do not provide enough context about the assessments. One educator said that it is important to look into the textual feedback from the assessor and the student’s response to it, for a more holistic picture of the student’s learning process. Generally, the temporal analytics considered in this paper were judged to be useful for placement quality assessment. The analytics could also be visualised for students; however, this raises the question of whether students would be able to interpret and act on the information shown to them. Data interpretation would need to be integrated within the curriculum, so that students would be able to use it to support their self-regulation.

Conclusions. We applied burst detection and process mining to workplace-based assessment data, and found notable patterns: (i) at the cohort level the most noticeable bursts corresponded to the ends of placements, (ii) the number of completed assessments per student varied considerably, and (iii) students rarely chose to complete assessments for optional clinical skills. The analytics were evaluated by two educators as particularly useful for assessing the quality of clinical placements. Two issues were identified: (i) the lack of context provided by the count data, and (ii) the potential difficulty students may have in interpreting such data visualisations. In future work we want to address these issues by incorporating text analytics and by adding data interpretation to the curriculum.