
1 Introduction and Context

Over a period of six years (2014–2019), three separate but related projects were undertaken to ascertain the perspectives of both staff and students regarding the collection and use of learning-related data, also referred to as learning analytics (West et al., 2016, 2018; West, Luzeckyj, Searle, Toohey, & Price, 2018). Each of these projects was driven by the fact that few previous studies had explored stakeholders’ views on how and why they would, or would not, use learning analytics resources. The projects focused on two key stakeholder groups: staff and students. A key area we considered was linked to our concerns about the appropriate use of data, its security, and gaining informed consent to use it. This chapter brings the findings of these previous projects together to provide a comparison of the two sets of views. This perspective is important and unique, as we were unable to find current literature in which both staff and student perspectives were sought and compared.

As part of these previous studies, several literature reviews were undertaken, each highlighting the lack of stakeholder input (West et al., 2019; West, Luzeckyj, Toohey, & Searle, 2017). In reviewing the literature for this chapter, we found that only a few recent studies undertaken at scale with university staff or students complement our previous work.

To date, the literature has moved beyond discussions around reducing attrition and the development of small-scale localised activities to considering how to scale LA and develop greater institutional capacity (Colvin et al., 2016; Dawson et al., 2018; SHEILA Project, 2018; West et al., 2016) and regional or national capacity (Knox, 2017; Sclater, Peasgood, & Mullan, 2016; SHEILA Project, 2018; Siemens, Dawson, & Lynch, 2013). Dyckhoff (2011) conducted a meta-analysis of case studies presented at an e-learning conference in Germany to identify teacher perceptions of using LA to evaluate technology-enhanced learning and teaching effectiveness. Two recent studies reflect on the importance of determining stakeholder views to ascertain institutional readiness in relation to LA adoption (Joksimović, Kovanović, & Dawson, 2019; West, 2019), while others argue for the need to consider teaching contexts and approaches as part of LA adoption (Arthars et al., 2019; Herodotou, Rienties, Verdin, & Boroowa, 2019; Lodge, Cooney Horvath, & Corrin, 2019; West, 2019). One study explores the perspectives of university leaders and “presents and unpacks a leadership model for LA implementation to provide a more nuanced understanding of the factors impacting on organisational uptake” (Dawson et al., 2018, p. 237).

The few papers examining staff and student perspectives do not necessarily gather their insights into LA itself but rather discuss how it may be used in academic contexts. For example, Bakharia et al. (2016) explored “the pedagogical concerns and needs faced by teachers in their local contexts and how learning analytics may usefully provide actionable evidence that allows them to respond to those concerns or needs” (p. 330). Others considered how students responded to dashboards (Lim, Dawson, Joksimovic, & Gašević, 2019) or involved students in the design of dashboards (de Quincey, Briggs, Kyriacou, & Waller, 2019). Only a few studies have actually asked students about the use of dashboards and their views on them (Brooker, Corrin, Fisher, & Mirriahi, 2017; Roberts, Chang, & Gibson, 2017; Schumacher & Ifenthaler, 2018). According to an exploration of trends and issues in student-facing learning analytics reporting systems conducted by Bodily and Verbert (2017), dashboards are a common feature in the learning analytics literature as they inform users about what has occurred as well as the context. Dashboards are therefore a key method for translating data into usable forms for both academic staff and students.

We found only one study which explored students’ actual perceptions in relation to the collection and use of their data. This 2018 study surveyed entry-level students studying through the Open University in the UK with the intention of developing “a better understanding of students’ awareness of the collection, analysis and use of their digital data in relation both to how deeply they use online services and media and to their own practices of privacy self-management” (Slade, Prinsloo, & Khalil, 2019, p. 2). The study considered how individuals may think about exchanging aspects of their privacy (as data is collected) for personal benefits. Slade et al. (2019) determined that students are willing to entrust their data to others if they receive personalised benefits, although they wish to control how the data is collected and used. The contexts in which data is collected and used matter to students, who can be naïve and/or inexperienced in collecting or interpreting data and so need to trust the service provider collecting their data.

2 Approach

Uniquely, this chapter draws on three studies undertaken with university staff and students to reflect on and compare the perceptions of both stakeholder groups. The first project, funded by the Australian Government, involved surveying staff across 25 Australian institutions about their perceptions of LA (West et al., 2016). The second project was funded and endorsed by the Innovative Research Universities (IRU), an Australian network comprising ‘seven comprehensive universities committed to inclusive excellence in teaching, learning and research in Australia’ (Innovative Research Universities, 2019). This project involved conducting focus group workshops with staff from three of the IRU institutions (West, Luzeckyj, et al., 2018). Building on interrogation of the survey responses from the first study, it aimed to gain further insight into teaching staff perspectives on the use of learning analytics to improve teaching practice.

The IRU also funded and endorsed the third project, which considered learner-facing analytics and analysed student perspectives (West et al., 2019). It broadly aimed to gain insight into how students understood LA; their concerns in relation to LA; the LA tools they believed would support them to succeed in their studies; and how these might best be implemented (what sort of policies, information and training students thought might be useful).

Survey data from the projects was processed using SPSS version 25 and Microsoft Excel. Further details regarding the quantitative analysis are provided in context in the sections below. The focus groups were audio-recorded and then transcribed, with participants de-identified. This approach garnered a broad mix of views and insights into academic challenges and teaching approaches. The focus groups enabled researchers to further explore areas where survey respondents had indicated concerns, or where responses either broadly agreed or disagreed. They also provided the opportunity to identify potential explanations for responses and to delve more deeply into areas of interest or complexity identified in the survey results.
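To illustrate the kind of quantitative processing involved, the sketch below shows how a frequency distribution such as Table 10.1 could be tabulated from raw survey responses. This is a minimal illustration only: the projects themselves used SPSS and Excel, and the column and category names here are hypothetical.

```python
# Minimal sketch of tabulating survey responses into a frequency
# distribution (the projects used SPSS v25 and Excel; the column and
# category names here are hypothetical).
import pandas as pd

# Hypothetical raw responses: one row per respondent.
survey = pd.DataFrame({
    "involvement": [
        "reading about LA", "using LA for decision-making",
        "reading about LA", "no involvement", "reading about LA",
    ]
})

# Counts and percentages per involvement category, as in Table 10.1.
counts = survey["involvement"].value_counts()
percentages = (counts / len(survey) * 100).round(1)
print(pd.DataFrame({"n": counts, "%": percentages}))
```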

All focus group transcripts were read and coded by two of the researchers who analysed them using thematic analysis (TA). TA is a qualitative research method where data is explored to allow themes to emerge (Fereday & Muir-Cochrane, 2006). Braun and Clarke (2006) define themes as important elements in the data and suggest themes demonstrate where “some level of patterned response or meaning within the data set” occurs (p. 82).

These staff and student projects differed in a number of ways. In the student project, several activities (focus groups, followed by a survey and then a second round of focus group exploration) were brought together as one piece of student-related research. The two rounds of focus groups and the survey each included different questions.

This chapter focuses on reporting and comparing results from these three Australian studies. It discusses the approach and results from the staff studies before discussing the exploration of student perceptions. A comparison of findings from the two cohorts is then undertaken before we identify recommendations and draw conclusions. The comparison of these two important stakeholder groups provides unique insights and allows roadblocks to be identified so they may be addressed. It also informs institutional practice in LA development so it may move beyond smaller local projects.

3 Staff Perspectives on LA

The first two projects explored academics’ attitudes to, experiences of, and involvement with LA. The first project involved an online survey of Australian academics; the second had several phases, including a series of focus groups and interviews.

The survey, conducted between September and November 2014, used a design specific to this study and, as discussed in West et al. (2016), set out to explore a broad set of research questions:

  • In which LA-related activities have teaching staff been involved?

  • In which retention applications of learning analytics are participants most interested?

  • How are institutions supporting learning analytics use amongst teaching staff?

The survey employed a purposive, snowball sampling strategy to recruit self-selecting individuals. The invitation to participate was sent to staff in at least 25 institutions, with 401 individuals viewing the first question. Of those, 48 (12%) either answered no or answered only the demographic questions and were excluded. Of the remaining 353 participants, 276 indicated they were directly involved in teaching and were included in the study. These respondents came from 21 distinct institutions. Sixty-seven percent of respondents reported a primary work role of “teaching students”, with the balance in other teaching-related roles such as “learning support”, “academic development” and “student support”. Seventy-one percent of respondents were at lecturer or senior lecturer level, and 70% had been employed at their current institution for 5 or more years.

The survey included the following definition of LA: “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs”. Participants demonstrated a high level of interest in LA, with 60% of respondents having been involved in LA in some way. Thirty-seven percent of respondents reported they had been reading about LA as part of their professional development or using it to help with analysis and decision-making (Table 10.1).

Table 10.1 Frequency distribution of involvement in selected learning analytics activities (n = 276)

Staff interest in LA is also demonstrated by how often respondents discussed LA with colleagues in a range of roles, as shown in Fig. 10.1, where it can be seen that discussions are held most often with teaching staff and program or course coordinators.

Fig. 10.1 Frequency of learning analytics discussion with select groups of colleagues (vertical bar graph; “never” was the most frequent response for all groups except teaching staff)

The exploration of participants’ interest in a range of LA applications (Fig. 10.2) suggested their focus mainly related to the identification of “at-risk” students and how that could trigger or inform their response to those students. Other applications attracting a high level of interest included teachers using LA to evaluate and improve their own teaching practices, and students using LA to monitor their own progress and identify actions they could take. In both cases the responses may reflect either a limited understanding of the value of LA or the respondents’ specific interest in how to use it.

Fig. 10.2 Participant levels of interest in selected potential applications of learning analytics (horizontal bar graph; identification of at-risk students attracted the most “a lot of interest” responses)

However, teaching staff rated their institution’s capacity to support their use of, and interest in, LA as poor or very poor (Fig. 10.3); in particular, respondents were concerned about their universities’ provision of information on how LA would be used and its potential impacts.

Fig. 10.3 Rating of institution at meeting participant needs and expectations in selected areas (horizontal bar graph across not sure, poor, fair and good categories)

Following reflection on the original survey data, a second study was undertaken. This involved a series of focus groups, held at several of the IRU institutions. These discussions further explored areas of interest raised in the surveys.

Each session lasted 90 minutes and was facilitated by the project team at the host institution. All sessions were structured in two parts, with a predetermined set of questions and activities, and a similar process was followed at each site to ensure consistency. In part one, participants were asked to:

  1. Individually record (on post-it notes) the LA (data-related) questions they would like answered, or to have insight into, in relation to teaching/learning in their classes

  2. Discuss their questions and ideas and consider the types of data that might be required to answer those questions

As shown in Table 10.2, the most common questions included on the “post-it” note responses related to data allowing teachers to see or track student activity. Questions related to ethical or operational issues were the least common.

Table 10.2 Categorised comments collected on post-it notes

The types of questions participants indicated they would like answered relating to students’ activities included students’ interactions with the LMS, for example, how often and for what duration they logged in; the time spent on individual tasks; and the use of particular learning resources.

The recorded discussions mirrored these themes but included additional detail; for example, participants regarded having a better idea of what students were actually doing, and how they were moving through their units, as particularly important. Participants also discussed their interest in identifying how much time students were spending on tasks and questioned LA’s capacity to realistically measure and reflect student engagement at anything more than a very superficial level. Staff comments also indicated interest in more sophisticated data providing insights into how students use the resources made available to them and how they move through a topic and develop skills, for example:

I’d really like to be able to get my hands on what they’re doing, particularly for the second lot of exercises where they’re starting to do the skill development, and just to measure engagement to start with would be really good. (Participant in teaching staff focus group at Institution 3)

It is interesting to note that at each of the institutions, questions concerning student preparedness for higher education study (e.g. background, how prepared they were for the unit, what kind of experience they had in relation to the discipline) were raised, particularly with regard to levels of English and mathematics proficiency. Very few questions were raised with regard to the teachers’ own learning and teaching proficiency, learning and teaching practice, or the curriculum. This may be due to teachers’ concerns regarding students’ lack of preparedness for university, or to a failure to appreciate that LA can provide insights into how they might change their teaching and curriculum practices.

Part two of the focus group research required the same participants to respond to seven pre-determined LA reports/visualisations (hereafter referred to as reports) by:

  1. Grading the potential usefulness of the report(s) on a scale of 1 to 5

  2. Describing the perceived value of the report(s), and any enhancements they would like to see included, by writing comments on the reports

  3. Discussing each report’s potential usefulness to their own teaching contexts

The reports selected for this exercise were in use or about to be introduced by at least one of the universities involved in the project. When presented to participants, each report included the title and a short explanation of its function and purpose to support focus group participants’ understanding. Table 10.3 provides each report’s title and a brief description of it.

Table 10.3 Report descriptions

Participants’ perceptions of the usefulness of the reports varied according to several factors, including their pedagogical approach, their role in relation to the purpose of the report, and broader institutional contexts. The relationship between the underlying data in a report and the pedagogical approach the teacher uses also influences perceptions of that report’s usefulness. Participants suggested LA reports needed to fit their pedagogical approach, be easy to use, and save time.

Staff used the opportunity of attending the focus groups to highlight other concerns they had about LA. Time pressure, both the time needed to learn about the reports and their uses and the time required to then engage with the data, was raised in a number of discussions, as it was perceived as adding to staff workload. These concerns indicate two important considerations for the development of reports and LA more generally: reports and other LA outputs need to be as simple as possible, and they need to be easily accessible with a very clear purpose. It was also clear that staff wanted LA reports to present a clear value proposition for the teacher, either saving them time or assisting with something that is a key part of their role.

4 Students’ Perspectives on LA

Given the success of the research with academic staff, a similar approach was taken with students from six of the seven IRU institutions. Through this project we explored students’ attitudes to and experiences of LA through a survey and focus groups conducted at each member university. The aims of this work were to explore:

  • Their understanding of data the university collects about them

  • Their level of comfort concerning the use of data to help support their learning

  • How useful they believe a range of LA-driven ‘interventions’ will be to their learning experience

  • Levels of concern regarding the data collected about them

  • When they would like to be reminded about university data policies and practice

This research was conducted during 2018, with the initial survey distributed via email in Semester 1 to all undergraduate and postgraduate on-shore coursework students in the six IRU universities (approximately 158,000 students), who could choose to participate. A total of 2017 valid responses were obtained (approximately 1% of the total student population), making this data set one of the largest of its kind exploring student perceptions. The respondents broadly reflected the general university cohort in that most were domestic (83%), undergraduate (76%) students studying full-time (79%). It is important to note, however, that a greater proportion of the respondents were female (70% vs 60% of the total university population) (Table 10.4).

Table 10.4 Demographic distribution of respondents

To explore student awareness of the range of data that the university collects about them and their learning experiences, the survey provided a list of 23 different options (see Table 10.5). Students were generally aware and accepting of the data that their universities were collecting about their learning experiences. It was practically taken as a given that data was being collected relating to their engagement with the learning management system (LMS) (95% of respondents) and submissions within the system, including assignments (99%), quizzes (98%), grades (95%) and participation on discussion boards (91%). Awareness of their university’s capacity to collect more detailed data about behaviour within the LMS was less widespread: 85% were aware that the university could track their participation in online lectures, tutorials or web conferencing, 78% that access to lecture capture recordings was collected, and 75% that their access to video and audio learning materials was recorded. Outside the learning environment, students indicated a reduced awareness of data collected by support services such as academic skills services (75%), employment services (63%) and library support workshops and training (74%).

Table 10.5 Student awareness of data collected by the university and comfort level associated with each data item

Students’ general awareness of university data collection related to core learning and student support; however, it also extended to monitoring of their wider engagement with a university’s wireless network (82% of students assumed the university was monitoring usage). Significantly fewer students thought that location data from mobile phones (37%), social media (49%) or university mobile app usage (65%) was collected.

The survey also permitted the research team to explore the level of comfort students felt with each of the 23 potential data sources on a 5-point Likert scale, with Very Comfortable scoring 5 and Very Uncomfortable scoring 1. A summary variable, Comfort Level, was calculated as the mean score of the responses for each item (Table 10.5). In a pattern similar to that of awareness, students were most comfortable with the collection of data directly related to engagement in key learning systems, with the highest comfort associated with access to the LMS (4.10/5) and with learning outputs (grades, assignments and quizzes; range 3.85–4.25/5), while the use of video and audio learning materials was the lowest in this group at 3.84. Students reported a lower level of comfort when data collection involved their submission of personal information (demographic and previous academic information; range 3.47–3.68/5). However, the items where students reported the lowest levels of comfort were those not directly related to learning, including location data from their mobile phone (2.27/5), data about university social media groups (2.82/5) and university mobile app usage (3.06/5). These data points may help analysts determine whether technical connectivity or access to networks is causing an issue, identify areas for campus improvement, and illustrate cultural or behavioural issues around the propensity to share and post in electronic environments. Some staff believe this data can provide insights into student behaviour (attendance, focus in class, etc.) that may help them understand students better.
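As a concrete illustration, the Comfort Level summary described above is simply an item-wise mean of the Likert responses. The sketch below shows one way this could be computed; it is an illustrative example only (the study itself used SPSS), and the item names and response values are hypothetical.

```python
# Illustrative sketch of the Comfort Level summary variable: the mean of
# 5-point Likert responses per data item (Very Comfortable = 5,
# Very Uncomfortable = 1). Item names and values are hypothetical.
import pandas as pd

# One row per student, one column per data item.
responses = pd.DataFrame({
    "lms_access":            [5, 4, 4, 5, 3],
    "mobile_phone_location": [2, 1, 3, 2, 3],
    "social_media_groups":   [3, 2, 3, 4, 2],
})

comfort_level = responses.mean().round(2)  # mean score per item, out of 5
print(comfort_level.sort_values(ascending=False))
```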

Conducting focus groups allowed further contextualisation of the two main areas identified as points of concern for students in the survey: the degree of comfort students felt about universities using various types of data to help support their learning, and how the data was used by institutions. Students were particularly concerned that demographic information might be used to categorise or profile them. The following statements reflect these sentiments: “you’re putting them in a category they might not want to be in” (FG 4); “that you’re specifically identified as an accounting student from the [campus X] with an international background. So, you specially represent a certain group” (FG 2).

Students expressed a desire to understand why the university would be interested in these kinds of information and sought confirmation of the data’s relevance. Specifically, they questioned the collection of their location data and the use of social media, wireless network devices and mobile apps, all of which aligned strongly with items that scored low on comfort. Students frequently described the collection of this information as “creepy” (FG 4) and in particular associated it with “being watched” (FG 2).

To further explore student perceptions of the usefulness of data, the survey asked students to provide their perspectives on a number of practices (see Table 10.6). Students were highly supportive of data collection that might potentially lead to the provision of additional materials or services to support their learning. They were far more comfortable being contacted about their learning than about other issues (such as health or wellbeing), but preferred to be contacted by an academic staff member they knew. One participant from FG 2 explained why they thought the academic was the best point of contact:

… you have the trust with the teacher, you go first, or your teacher first comes to you like what’s going on. And if you do have mental health issues or you’re actually struggling with understanding the subject, the teacher can guide you and same with here, with the lecturer. Comes to that bond or that trust between you and your lecturers.

Students were less positive about receiving information comparing their performance and engagement with those of other students in the class (range 61–72%), though they were more positive about data leading to a potential prediction of their grades (86%) and indications of areas in which they could change their behaviour to improve their grades (82%) or pass the subject (84%).

Table 10.6 Degree of usefulness of specific practices (higher percentages illustrate more positive responses)

Reflecting holistically, students were concerned about the security of their data and the relevance of such data to their study or experience. A clear majority (90%) indicated concerns about third parties receiving their data. This finding is not surprising in the context of broader public concerns about data security. It is also important to note that this question covered substantial variation in both the potential information shared and the end users, ranging from sharing required for the operation of third-party teaching arrangements and reporting to government, through to external organisations with which it would be illegal for the university to share student data. Interestingly, fewer than 50% of respondents expressed concern about options that involved their data being used by the university to tailor student support or to improve learning and teaching or services (Table 10.7).

Table 10.7 How concerned respondents feel about how their data is managed and used (the higher percentage indicates greater concern)

The desire for transparency and consent to data collection was further explored in the survey (see Table 10.8). Students clearly wish to be given the opportunity to consent to access to their data more often than just on enrolment, with more than 60% of respondents indicating that they would prefer to be notified either annually or at the commencement of each semester.

Table 10.8 Preferred timing of notification of university data policies and procedures

The issue of compulsory or non-compulsory provision of dashboards was also explored. Students were asked to indicate the options they would prefer if dashboards were available. As shown in Fig. 10.4, only 23% of respondents agreed with the idea of a compulsory dashboard to display their information, while the majority (73%) were not in favour. Options with the ability either to opt out of the dashboard (63%) or to turn it on and off (79%) were viewed favourably, indicating that participants clearly wanted a choice.

Fig. 10.4 Responses regarding dashboard availability (vertical bar chart of yes/no responses for three dashboard options; the compulsory dashboard drew the most “no” responses and the dashboard that can be turned on and off the most “yes” responses)

Students are generally in support of initiatives that have the potential to support and provide feedback on their performance, particularly if there is perceived to be a short-term or quick-fix correction that might help them achieve their goals.

Where concern exists, it manifests in what could be termed university (administrative) over-reach, where support staff monitor or act on data coming from sources that students consider to be their own and distinct from dedicated academic platforms.

5 Comparing Responses from Staff and Students – the ‘Standout’ Messages

The two projects bring insights from different points of view, that of the learner and that of the university teacher, around three key areas: (1) awareness and knowledge of learning analytics, (2) concerns, and (3) how data might be used to support learning. This provides a unique opportunity to explore differences and commonalities between two critical stakeholder perspectives, which translate into practical actions for more effective use of LA by higher education institutions (HEIs). These include the need to ensure appropriate governance of data collection through the development of strategies and frameworks; improved services provided by IT and the other departments and teams who collect and manage data; and the creation of timely, coherent training, delivery and communications strategies.

5.1 Awareness of Learning Analytics and Data Collection

The teacher survey included several questions which broadly translate to the concept of awareness and knowledge of learning analytics. Specifically, teachers were asked how frequently they were involved in discussions related to learning analytics, with whom, and about their involvement in learning analytics-related activity. The scale used included daily, weekly, fortnightly, monthly, less than monthly and never. In this construct, awareness and knowledge would presumably be higher where engagement in either discussion or related activity was taking place on a regular basis rather than never; awareness can thus be seen as operating on a continuum from broad awareness through to a high level of awareness.

Figure 10.1 (above) provides a summary of the key areas where teaching academics have engaged with others in LA discussion with at least some frequency (i.e. more than never) and on a more regular basis (i.e. at least monthly). It shows teaching staff are talking about LA with a range of stakeholders and indicates that at least 68% of the academics surveyed had at least some awareness of LA. More regular engagement would suggest a higher level of awareness, as is the case for the 33% of academics who were engaged in discussions with their teaching colleagues at least monthly. It is also likely that awareness is higher among those engaging with a broader range of stakeholders; for example, those engaged in LA communities of practice (46%) are likely to have a high level of awareness and knowledge.
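The summary bands used here (“more than never” and “at least monthly”) are simple collapses of the six-point frequency scale. The sketch below shows one way such a collapse could be computed; it is illustrative only, with hypothetical response data.

```python
# Illustrative sketch of collapsing the six-point frequency scale
# (daily ... never) into the two summary bands used in this section.
# Response data is hypothetical.
import pandas as pd

freq = pd.Series([
    "daily", "weekly", "never", "monthly",
    "less than monthly", "fortnightly", "never",
])

AT_LEAST_MONTHLY = {"daily", "weekly", "fortnightly", "monthly"}

more_than_never = (freq != "never").mean() * 100       # any engagement
at_least_monthly = freq.isin(AT_LEAST_MONTHLY).mean() * 100

print(f"more than never: {more_than_never:.0f}%")
print(f"at least monthly: {at_least_monthly:.0f}%")
```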

Teaching academics were also asked about their participation in LA activity and to identify the type of activity undertaken. Table 10.9 provides a summary of engagement where the frequency indicated was more than never. Again, it could be assumed that those using LA for analysis and decision-making have a higher level of awareness and knowledge of LA.

Table 10.9 Summary of frequency of engagement in LA activities

Focus groups with staff across the universities indicated that they did not really understand the term “LA” but had heard it used reasonably often within their institution and the sector. This is likely reflected in the relatively high level (68%) of discussion taking place. However, taking such awareness to the level of application was far less frequent (41%), suggesting a lower level of knowledge about application. This points to the need for a concerted effort to develop awareness of LA among all academic staff across institutions to ensure LA is effectively leveraged to support practice improvement.

While it is acknowledged that the two studies were some years apart, the student survey included questions regarding the kinds of data students thought the university was collecting about them (as presented in Table 10.5, above). The findings demonstrate that students appreciate that data related to core learning and support activities is collected and that various systems are used to provide it. When this was explored further via focus groups, students indicated that they did not really understand the term “LA”, which coincides with what staff had said.

Clearly there was broad awareness amongst students about the type of data that was being collected about them particularly related to assessment and general activity in the LMS. However, awareness of data collection dropped as data became less obviously connected to the LMS and the online learning environment.

It is evident from both studies that there is general awareness of data collection, but what data underpins learning analytics is less clear to either group. Moreover, learners and university teachers in the two studies offered a different focus on the use of learning analytics. Higher education institutions therefore need to articulate more clearly what data is collected and the purposes for which it is collected and used, as well as providing clear definitions of terms such as learning analytics, so that both learners and teachers are more aware of what the terminology means and how data is used.

5.2 How LA Might Be Used to Support Learning

As interest in LA has developed, so too has academic interest in its application to support learning and teaching. One area staff were interested in was determining how much time students were spending on tasks; however, they questioned LA’s capacity to realistically measure and reflect student engagement at anything more than a very superficial level. Some also indicated concern regarding the inclusion or evaluation of activities which do not take place in the LMS. As seen in Table 10.6 (above), students rated seeing how much they accessed the LMS reasonably highly (81% positive responses), but not as highly as many of the other aspects we asked about. The concerns raised by staff are echoed in recent studies where researchers attempted to move beyond counting clicks by exploring other means to assess or validate student online engagement. Fincham et al. (2019, p. 501) used “robust empirical validation” to test a theoretical model of various forms of engagement (academic, behavioural, cognitive and affective) to determine the potential for predicting learning outcomes. In a separate study, Jovanović, Gašević, Pardo, Dawson, and Whitelock-Wainwright (2019) involved students in self-reporting activities related to cognitive load and self-efficacy; they integrated the trace data with academic performance and found associations between the two. However, these studies are not at scale; they appear complex and time-consuming, and they were not carried out across institutional contexts.

From our survey of university teaching staff, and as seen in Fig. 10.2 (above), academics indicated that the main areas of interest for using LA (where more than 50% of staff indicated interest) related to:

  • Identifying at-risk students with a view to staff responding to address the risk

  • Teaching staff evaluating and improving their own teaching practice

  • Students monitoring their own progress and identifying actions that they can take

  • Development of the broad knowledge base about how effective learning can occur

  • Informing potential initiatives to promote student retention (e.g. mentoring, student support)

  • Informing design and layout of online learning sites and environment.

These areas of activity suggest staff wish to see LA applied to the improvement of teaching practice and to what they perceive they have control over, rather than to broader institutional concerns. It is important to note, however, that the findings from this research identified a link between institutional context and leadership, and the development and advancement of thinking about the use of LA.

Discussion in the student focus groups also reflected the importance of context and leadership. Students indicated that they were unsure of how data was being used or whether it could be used to support their learning. However, with more discussion about the context and potential applications, they started to identify how it could be useful to them. The literature on leadership in LA includes a paper by Dawson et al. (2018), who suggest that, as a result of different approaches to research and implementation in LA coupled with the complexity of the field of education, leadership in LA needs to be both transformational and shared across the institution, though there was no discussion of including students in the mix of leaders.

In the survey, students were given a range of LA applications and asked to indicate which ones they thought would be useful to their learning. In considering this data, it was apparent that some of the items aligned with what academics had indicated more broadly. Table 10.10 aligns student responses (from Table 10.6) with the LA applications identified by staff (as indicated in Fig. 10.2).

Table 10.10 Comparison of staff areas of interest and areas of usefulness identified by students

Student responses reflected a very pragmatic approach to LA: they were highly interested in things that could be done to support their learning and in being prompted to take some action, including prompts about additional learning materials and the provision of additional services. As indicated in Table 10.10, the top seven items (excluding progress) related to student self-monitoring and taking action. These were all rated useful by over 90% of students, but staff identified them as areas of interest in only 64% of cases (see Fig. 10.2 above). The survey also highlighted that while some applications were seen as useful, students held a level of concern around their use.

5.3 Concerns

Both groups were asked about their concern around the use of data for learning analytics although the questions were presented in slightly different ways. Students were asked about their level of concern with data being used in various ways while for academic staff the question was framed around ethical concerns.

Academics were asked to indicate their level of concern, on a scale of high, some, or low/no concern, in relation to a range of data issues and applications. Table 10.11 highlights those issues where academics had higher levels (some or high) of concern and which are of relevance to students:

Table 10.11 Areas of concern (higher percentages indicate higher levels of concern)

Other items where academics indicated concern are not relevant to this comparison; they included items related to workload changes, engagement in training and professional development, and accreditation-related issues.

Students were also presented with a range of data issues and applications and were asked to indicate their level of concern on a scale from not at all concerned to very concerned. Table 10.7 summarises areas related to data collection and use where students indicated any level of concern beyond “not at all concerned”.

It is clear from this summary that the highest level of concern for academics was around how data was being used and the associated transparency. While students were not asked explicitly about this in the survey, it was part of the focus group discussions used to unpack the reasons for concern around particular elements. The overall theme, and very strong message, from students was the need for the institution to be clear and transparent about what data is being collected, why it is being collected and how it will be used.

Despite evidence that users do not engage with Terms & Conditions of online services, HEIs should strive to be transparent. Students should know what data are collected, by whom, for what purposes, who will have access to this data downstream and how data might be combined with other datasets (and for what purposes). As such this can be seen as the primary focus for both groups and these findings coincide with those identified by Slade et al. (2019, p. 243) who suggest “a unique opportunity to create a trusted relationship between institutions and students” exists through the use of LA.

The issue of profiling was specifically raised during focus groups, with students expressing strong concerns that data would be used in this way. The following statements from student focus groups reflect this sentiment:

“You’re putting them in a category they might not want to be in.” (Student, FG2)

“I don’t know, maybe there’s just a bit of stigma attached to the word profile. Don’t like the idea of being profiled.” (Student, FG3)

Profiling also rated highly as a concern for staff, with 81% indicating some level of concern. There is also strong alignment between academic staff and students around issues related to data security and any sharing of that data with third parties.

Looking back across the key areas of awareness, usefulness of LA data and concern, there appears to be a mismatch, at least to some extent, between what academics are interested in doing and students’ level of concern around certain applications. For example, academics are interested in utilising data to explore ways to improve their teaching and curriculum, while over half of students are concerned about their data being used for research purposes. Depending on how academics proceed with investigating improvements to teaching and curriculum, this could be seen as educational research or, at a minimum, as taking a research approach.

5.4 Practical Actions for More Effective Use of LA

As both students and staff were involved in these projects, many of the results have the potential to be developed into policy, strategies and actions for HEIs. Given that both staff and students indicated they did not really understand the term “LA” and that it was not consistently applied, HEIs could consider developing teams across central academic development, data collection and information technology areas responsible for developing resources and training. This could help all parties build confidence and could further encourage work with staff and students so they can collaboratively develop data and digital literacies, improving their understanding and agency when using LA. This collegial group could also take responsibility for managing and maintaining institutional policies and governance practices, ensuring these address the various areas of staff and student concern outlined in Tables 10.10 and 10.11. This would ensure that data use is transparent, that consent to collect and use data is appropriately sought, and that students are neither profiled nor given a sense that they are being stalked, but are instead supported and helped to improve their learning.

The explicit tools these groups develop should, as discussed, be created in collaboration with students and, as indicated in Tables 10.2 and 10.10, be based on ways of determining and tracking how students are using resources and how these support their progression. Tools, including third-party add-ons to the LMS, that help determine which students require greater support or which need to focus on specific areas in their learning would also be helpful. The tools described in Table 10.3 provide a useful place to start but, as indicated by staff, these need to be easy to learn, simple to use and time saving.

Given our findings indicate differences between staff perceptions of the usefulness of different reports and students’ feelings of being watched, it is essential that HEIs carefully consider students’ sense of privacy and ownership of data. Methods for monitoring students’ activities which occur offline, through changed assessment approaches, improved scaffolding of learning activities, and opportunities for students to identify and manage their own learning pathways in their own time and in their own ways, must also be thought through. Making these changes may also require the identification of different, more appropriate pedagogical approaches and concurrent academic staff support and training.

6 Conclusion

Our findings indicate both matches and mismatches in what students and staff understand and consider important in relation to LA. Academics tend to see the application of LA through their own interests and needs, which include identifying students at risk, evaluating and improving teaching practice, supporting students to monitor their own learning and identify actions they might take to improve results, and specific LA-related research interests.

Academic staff may place a lower priority on what students see as particularly useful to them (areas where LA can help them improve their learning), or it may be that academic staff are unsure how to address students’ needs in relation to LA, given both groups seem to lack an understanding of what the term means. While there is some crossover here, the findings from our research reinforce the need to gather and include student input and balance it with staff interests. Doing so would require addressing the (as discussed earlier) limited research which considers what students say they want and need. Not taking students’ perspectives into account is dangerous, as it raises the risk of LA development missing the mark of what is useful to student success from the students’ own standpoint.

The use of LA in educational contexts is challenging and requires considered leadership approaches, attention to informed consent and privacy, ethical frameworks, and power. Staff raised a range of concerns in relation to the use of data and transparency, while students suggested they have concerns about the data that is routinely collected about them (such as demographic data used for government reporting purposes), even though there is evidence that users do not engage with the Terms & Conditions of online services. In higher education institutions, however, the imperative is to strive to be transparent, perhaps requiring a cultural shift to ensure permissions are acquired with informed consent and data is appropriately collected and used. In addition, broader consideration needs to be given to pedagogical approaches which utilise LA and ensure students are central in their learning and embraced as co-creators of knowledge (rather than just recipients of it). Achieving these outcomes will work toward the goal of progressing LA to broader institutional and more widespread use, a goal that will only be achievable if all parties (staff, students and those in leadership positions) focus on similar outcomes which are appropriately funded. The research projects discussed in this chapter provide a beginning by bringing together the perspectives of two of these important groups, with a number of recommendations on how the findings might translate into action. Collaboration of staff across various areas within institutions, and with students, is key to the successful implementation of LA.