1 Introduction

With an increased focus on digital learning in higher education comes a proliferation of data generated by teachers and students in their day-to-day educational practice. More attention is being paid to the different ways in which this data can be used to support the goals of the student, the teacher and the institution. Learning analytics seeks to make this a reality by providing information and analysis about key teaching and learning processes that result in decisions and actions to improve student outcomes. The educational drivers for learning analytics centre on enriching the student learning experience; the economic drivers centre on introducing efficiency and cost-effectiveness into education. These different framings create tensions when attempting to realize the potential of education to “help people to live well in a world worth living in” (Kemmis, 2014, p. 21). In this chapter we describe the current practices at an Australian university six years into an implementation of learning analytics that combines top-down and bottom-up approaches to maximize uptake by different stakeholders. Heath and Leinonen (2016) have already offered empirical insight into the implementation of learning analytics at this site. This chapter extends that work to explore the adoption of learning analytics that has unfolded since, offering a source of ideas about ways of implementing institutional approaches to learning analytics along with a discussion of implications for future practice.

In their review of the various models informing adoption of large-scale learning analytics initiatives, Colvin, Dawson, Wade, and Gašević (2017) argue there are three broad focus areas: input models, process models and output models. Input models focus on first establishing the elements necessary to facilitate implementation of learning analytics programs. Process models focus on the “how” of learning analytics implementation, describing the sequential steps to put in place. Output models focus on the outcomes associated with the implementation of learning analytics. Input and process models can be either linear or non-linear, recognizing the complex and interrelated nature of learning analytics adoption. Output models tend to be linear, with universal outcomes assumed at different levels of implementation maturity. Common elements across these models identified by Colvin et al. include strategy, leadership, staff and institutional capacity, technological readiness and organizational culture. These models are often conceptual in nature and built largely on data collected from learning analytics specialists. To address this, Colvin et al. (2015) set out to understand learning analytics implementation as enacted practice across the Australian higher education sector. The results of this study showed nuanced relationships between the various conceptions of learning analytics and its implementation. While some of the important features identified were consistent with the conceptual models, the study also identified institutional context and the conception of learning analytics at the particular site as important elements. The study by Klein, Lester, Rangwala, and Johri (2019) reached a similar conclusion regarding institutional context, with localized structures and resources seen to affect the adoption of learning analytics tools at a large public university in the United States. In terms of the way learning analytics is conceived at the local site, Colvin et al. (2015) found two broad clusters of learning analytics implementation: either it was purely a tool focused on retaining students, or it was partly this as well as being targeted towards drawing out and informing teaching practice and student learning. Such nuance reflects the interplay of the different economic and educational drivers for learning analytics. It seems that, in order to get large-scale learning analytics initiatives off the ground, student retention needs to be a key factor, whatever the broader aims.

2 The Story So Far

This chapter focuses on the use of learning analytics at the University of Wollongong (UOW). UOW is a regional university in eastern Australia with approximately 33,000 undergraduate and postgraduate students enrolled and approximately 2800 staff employed across five faculties and centralised services. UOW has a network of four regional campuses and three metropolitan campuses in the greater Sydney area. Beyond Australia, UOW has campuses in the United Arab Emirates and Hong Kong and a presence in China, Malaysia and Singapore. A strategy for the roll-out of learning analytics capabilities at the Australian campuses has been in place at UOW since 2013. To guard against the perception of learning analytics as a force outside an individual’s control and as the sole factor influencing educational practice, a multi-faceted approach has been taken to the use of learning analytics at UOW from an early stage. It is the academic endeavor, rather than technology and data, that has driven learning analytics at UOW (Heath & Leinonen, 2016). Work has been undertaken under the executive sponsorship of the Deputy Vice-Chancellor (Academic), with a governance structure established to provide guidance so that ethics and privacy are treated not just as problems to be overcome, but as opportunities to refine and improve learning analytics tools and processes (Drachsler & Greller, 2016; Sclater, 2016; Slade & Prinsloo, 2013).

Few frameworks for large-scale adoption of learning analytics seem to draw on student perspectives (Colvin et al., 2017). At UOW a student survey was conducted to find out about the types of functionality desired from learning analytics, perspectives on privacy matters and preferences for interventions arising from learning analytics (Heath & Fulcher, 2017). The results informed both the strategy and the accompanying learning analytics data use policy at UOW. Feedback was provided on the draft policy during consultation rounds with staff and students. The policy was also influenced by literature emerging at the time on these matters (JISC, 2015; Macfadyen, Dawson, Pardo, & Gašević, 2014; Prinsloo & Slade, 2013). The policy is primarily concerned with assisting UOW staff to carry out learning analytics activities appropriately, effectively and responsibly. Guidelines for taking action from learning analytics insights were also developed in conjunction with the policy to provide a framework for integrating the interpretation of, and action on, learning analytics insights into the flow of existing learning, teaching and student support. UOW took the position that students could not opt out of learning analytics. This was debated at length by the governance committees, but the decision was ultimately based on the university’s duty of care to do what it can to maximize the likelihood of student success. In the interests of transparency, different communication channels have been used to ensure students are aware of the use of learning analytics. The learning analytics data use policy also states that students have the right to see the data about them used in learning analytics activities and to correct any inaccuracies about themselves.

The learning analytics team have been positioned as part of the centralized teaching and learning unit at UOW, which collaborates with staff and students to support sustainable improvements in teaching practice and student learning. This organizational structure meant that budget and technology decisions for learning analytics were made at an institutional level. It created a top-down environment in which the implementation of learning analytics tools was aligned with structures, resources and leadership across the institution. The focus here has been on early alert of students predicted to fail or withdraw from enrolled units. The learning analytics team distribute a series of reports at key points in the academic semester to coordinators of large first-year undergraduate units to help draw attention to patterns that may not be readily apparent. Unit coordinators are academic staff with specialist knowledge in the content area of the unit. They are responsible for the design of the unit, the sequencing and content of classes, the design of assessment tasks and the management of assessment marking and feedback. They are also responsible for the quality of teaching and learning in that unit and are required to monitor student performance and unit feedback to guide its ongoing enhancement and development. Coordinators of other units have been able to opt in to receive reports during semester for their units. Figure 6.1 below outlines the uptake of learning analytics at UOW by unit coordinators since initial trials in 2014. By 2019, most undergraduate students (78%) at UOW were enrolled in at least one unit receiving learning analytics support, compared with 34% of postgraduate coursework students. This reflects the focus of learning analytics at UOW on supporting undergraduates’ transition into university study.

Fig. 6.1 Uptake of unit-level learning analytics at UOW. (Line graph of the percentage of students in at least one unit receiving learning analytics support, 2014–2019, with consistently higher uptake for undergraduate than postgraduate courses.)

2.1 Centralized Support

It is our experience that the academic issues learning analytics identifies for UOW students, and the interventions that stem from a student being identified on a learning analytics report, are often general to the student rather than specific to the unit in which they are identified. Accordingly, student support staff separate from the unit coordinators perform proactive outreach early in the academic semester in response to the insights generated by the learning analytics team. One of the ethical issues related to analytics about students is the danger of reducing each person to a single metric (Ifenthaler & Schumacher, 2016; Roberts, Howell, Seaman, & Gibson, 2016); the data only ever conveys limited information about each student. For this reason, the purpose of the action taken with students who may be encountering difficulties getting their studies underway is to better understand their particular situation and provide tailored support. Utilising separate student support staff in the faculty has also helped to address a common complaint from academic staff: the amount of time involved in making sense of the results and knowing what action to take with students (Howell, Roberts, Seaman, & Gibson, 2018; Klein et al., 2019). Given the contextual knowledge academic staff often have about the academic support needs of individual students, they have still been encouraged to take action where appropriate. This has been complemented by the outreach performed by the central student support unit, who have expertise in using positive reinforcement to help influence student behaviour and who ensure a baseline of action has been taken with students who may be struggling.

2.2 System-Generated Reports

The indicators used to identify students who may be at risk of dropping out are based on data warehouse infrastructure that brings together key components of the teaching and learning ecosystem at UOW (Heath & Leinonen, 2016). This includes data from the Learning Management System (LMS), library usage (aggregated), peer-assisted supplemental instruction sessions and the student information system. While external software vendors have started offering similar functionality, the internal data warehouse has been used at UOW to drive this work because it brought together data missing from any single software component of the UOW learning platform. Having dedicated resources familiar with local conditions responsible for the ongoing maintenance of the technical infrastructure used for learning analytics at UOW has ensured responsiveness to changes in the teaching and learning environment. It has also meant increased transparency in the techniques used to identify students who could benefit from additional assistance, which addresses another ethical concern of learning analytics (Lawson, Beer, Rossi, Moore, & Fleming, 2016; Lester, Klein, Rangwala, & Johri, 2017; Roberts et al., 2016; Slade & Prinsloo, 2013). Following guidance from the learning analytics ethical use of data advisory group, a deliberate effort was made to use indicators based on student behaviours rather than inherent student characteristics such as race, gender and socioeconomic background. Apart from yielding more accurate predictions about student outcomes, this has also helped reduce bias in the analytics techniques: a student coming from a background not typically associated with higher education participation is not automatically at risk of early withdrawal (Gašević, Dawson, Rogers, & Gašević, 2016; Lawson et al., 2016). Figure 6.2 below is a sample of a report provided to unit coordinators at the end of week 3 (of a 13-week academic semester). This point in the semester was chosen because a few weeks had passed, generating digital traces of student behaviour, yet it was still early enough for students to withdraw from units without incurring financial consequences. Students are ranked on the report by the number of risk criteria they meet: the greater the number of risks, the higher the student appears on the report. Trials are underway at UOW to extend the information provided in this report with predictive modelling techniques, with a key challenge being to strike a balance between accuracy and transparency so that it is clear why a student has been identified (Hill, Fulcher, Sie, & de Laat, 2018).

Fig. 6.2 Sample learning analytics report sent to unit coordinators early in semester. (Spreadsheet with columns for student name, student number, tutorial group, campus, contact, student email, disability, weighted average mark, and six risk criteria.)
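To make the ranking logic behind these reports concrete, the sketch below shows one way a week-3 report could be assembled from behavioural flags. It is a minimal illustration only, assuming hypothetical column names, risk flags and thresholds; it does not reproduce the actual UOW criteria or data warehouse queries.

```python
import pandas as pd

# Hypothetical enrolment/activity extract; the real UOW report draws on the
# institutional data warehouse and uses its own behavioural risk criteria.
students = pd.DataFrame({
    "student_id": ["s001", "s002", "s003", "s004"],
    "lms_logins_wk1_3": [0, 14, 2, 25],            # LMS activity to end of week 3
    "assessment_1_submitted": [False, True, False, True],
    "pass_sessions_attended": [0, 2, 0, 3],        # supplemental instruction (PASS)
    "library_use_aggregated": [0, 5, 1, 8],
})

# Behaviour-based risk flags (illustrative thresholds only).
flags = pd.DataFrame({
    "no_lms_activity": students["lms_logins_wk1_3"] == 0,
    "missed_assessment_1": ~students["assessment_1_submitted"],
    "no_pass_attendance": students["pass_sessions_attended"] == 0,
    "no_library_use": students["library_use_aggregated"] == 0,
})

report = students[["student_id"]].copy()
report["risk_count"] = flags.sum(axis=1)           # number of criteria met
report = report.join(flags)

# Students meeting the most criteria appear at the top of the report.
report = report.sort_values("risk_count", ascending=False)
print(report.to_string(index=False))
```

The key design point is that every flag is derived from observed behaviour rather than demographic attributes, consistent with the advisory group guidance described above.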

The reports, such as the sample provided in Fig. 6.2, were designed with scalability in mind. Regardless of the unit taught, the coordinator received a similar-looking report at the end of week 3 displaying results specific to their student cohort. The reports have been formatted with a cover page describing their purpose and suggesting how they can be used by the unit coordinator. An appendix has also been included defining each of the risk criteria in the report for the unit coordinator. These elements have served as nudges to help unit coordinators make sense of the report results, in combination with their contextual awareness of the unit being taught, and then decide what action to take (Dietz-Uhler & Hurn, 2013; Gašević et al., 2016; Gašević, Dawson, & Siemens, 2015). Having “analytics products” such as the standard reports produced by the learning analytics team has given recipients a consistent experience because the reports look and feel similar. While this has yielded capabilities that have been extended across UOW, it has limited the depth of potential insights generated from the data available on the student learning experience in a particular context. This exposes opportunities for further exploration to address bespoke information needs, information displayed in novel ways, or issues unique to particular cohorts of students. This is where the concept of “Research Sprints” has been implemented by the learning analytics team at UOW, to help answer grassroots questions individuals have about their teaching and learning context.

3 Research Sprints

A common criticism of learning analytics is the mismatch between the analytics available and the front-line teaching and learning context (Ali, Asadi, Gašević, Jovanović, & Hatala, 2013; Gašević et al., 2016; Klein et al., 2019; Macfadyen & Dawson, 2012). This is consistent with observations made during the implementation of systematised learning analytics reports at UOW. In response to this, the learning analytics unit at UOW has developed an adapted version of the Research Sprint cycle (Rose, 2016). The Research Sprint process has aimed to uncover deeper insights in the available data to help answer particular pedagogical questions. It comprises six steps as outlined in Fig. 6.3:

  1. Intentional and selective recruiting: making potential stakeholders (unit coordinators, instructional designers etc.) aware of the availability of a short burst of data science expertise (via the centralized learning analytics unit) and then selecting, from a set of potential opportunities, those which align with and can be accommodated by the unit’s staffing and resourcing;

  2. Identifying a set of questions and assumptions: this step requires close consultation with the stakeholder to refine and understand the problem they seek to solve, the question they seek to answer and the types of data they consider useful in providing insights;

  3. Data preparation and analysis: this step requires good institutional and technical knowledge on the part of the data analyst so that they can advise on the type and nature of the available data and its propensity to generate a useful response to the query posed;

  4. Discussion of results: the stakeholder and data analyst undertake this step together. The data analyst can advise on the cleanliness and currency of the data, and on the descriptive and inferential statistics used or the models generated to help explain the results. Together with the stakeholder, who draws on their more in-depth and personal knowledge of the context from which the raw data were taken, the analysis can be further refined and some ‘meaning making’ can take place;

  5. Drawing conclusions: this step extends the previous one, with a focus on working out what can and cannot be said about what the data show and the level of confidence (and therefore trust and credibility) in those findings; and,

  6. Communication of results so that learnings can be applied to practice: this step relies heavily on the joint skills and effective collaboration of the data analyst and stakeholder. Together they decide which forms of communication and methods of presentation (tables, graphs, dynamic or interactive displays, narrative forms) best communicate the findings of the sprint, and to whom those findings should be communicated.

Fig. 6.3 Research sprint cycle. (Adapted from Rose (2016); a process diagram showing the iterative loop between identifying questions and assumptions and discussing results.)

The Research Sprint process requires an iterative approach so that the revelation of information and patterns guides the next step in the data analysis, as indicated by the backward arrows from Step 4 to Step 2 in Fig. 6.3. The term ‘sprint’ was used because the agreement between the learning analytics team and the commissioning unit has been for a 2-week period. This has proven long enough to find insights and short enough to adapt to new ideas as data exploration has unfolded. Even when no insights have emerged, the questions posed have still been brought to a close. This process has provided structure as well as the flexibility to explore data and pose challenging questions using a grassroots approach, so that analytics are immediately relevant for each particular context. Three examples of completed Research Sprints follow to show the range of uses of Research Sprints, explaining the ideas underpinning these steps and how they supported the sprint methodology. The examples also highlight how the sprints provided something not available through the routine system-generated reports described above: analysis highly relevant to the requesting stakeholder, but not necessarily of immediate significance beyond the requesting unit. The first and second examples indicate the sorts of questions posed by academic units and the third, more detailed example, describes the response to a question from a central non-teaching unit. The examples are:

  1. The First Year Chemistry Curriculum.

  2. The French Language Curriculum.

  3. The Analysis of Coursework Student Course Progress.

3.1 The First Year Chemistry Curriculum

The First Year Chemistry teaching team were interested in investigating the influence of Peer Assisted Study Sessions (PASS) on academic outcomes for students enrolled in first-year chemistry units (CHEM101 and CHEM102) over the years 2015–2017. PASS is an academic support program of supplemental instruction that uses successful later-year students to facilitate peer-learning sessions in addition to the scheduled formal university classes (Dawson, van der Meer, Skalicky, & Cowley, 2014). The PASS program is often attached to what might be termed ‘high risk’ units; at UOW, ‘high risk’ applies primarily to those units which have historically high rates of student failure or early withdrawal.

The program integrates academic skills with course content in a series of peer-facilitated sessions that are voluntarily attended by students enrolled in these courses (Dawson et al., 2014, p. 610).

Each weekly PASS class is attended by a group of students enrolled in the target unit and is facilitated by a PASS Leader. PASS leaders are generally academically successful students with good interpersonal skills who recently successfully completed the unit. The PASS leader is responsible for facilitating,

discussion around course content and related study skills, and for preparing learning activities such as worksheets, group work, problem-solving exercises, or mock exams for their students (Dawson et al., 2014, p. 610).

The involvement of the First Year Chemistry team in a range of initiatives to re-develop the curriculum over the past several years, along with their sophisticated understanding of higher education pedagogy and the role of PASS, made this team an ideal candidate for intentional and selective recruitment to a Research Sprint (Step 1 of the Research Sprint process). While the system-generated learning analytics products described in the first part of this chapter draw entirely on information applicable to all units (student attributes and student academic outcomes), this analysis differs because it used data applicable only to certain units. It involved integrating data from a separate system that collected information on student participation in PASS. An important part of Step 3 of this Research Sprint was therefore the data preparation and analysis needed to ensure that the information that could inform the analysis was available and stored in a format suitable for analysis.

Because Faculties resource aspects of the PASS programs for units within their disciplines, it is in their interests to evaluate the impact of such supports for students. In this analysis, only students who participated in both of the first-year chemistry units under consideration (CHEM101 and CHEM102) were included. These units were offered in consecutive academic semesters, and it was student performance in the second of the two units that was the outcome considered (that is, at the end of each student’s first academic year in Chemistry). Interestingly, a range of variables other than PASS attendance (such as the composite mark for CHEM101, online learning site access in the first semester, student age, and markers of past and current academic aptitude) accounted for 74% of the variance in the final composite mark for the second-semester chemistry unit (CHEM102). There was a small but significant effect associated with PASS attendance in second semester: an increase of about 0.1 in the final composite mark (a mark out of 100) for each week of PASS attended, so that 10 weeks of PASS attendance was associated with an increase of one mark in the final composite mark for CHEM102. The analysts recognised that these correlations cannot be taken as indicative of causation, especially because it is well recognised that PASS attendance may be a marker of other forms of engagement that are the true ‘causative’ factors. The conduct of this Research Sprint set the scene for the Research Sprint on the French Language Curriculum, which had undergone re-design with the aim of managing the risks that might be realised with, amongst other things, the reduction of access to PASS for French language students.
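As a rough illustration of the kind of model behind these figures, the sketch below fits an ordinary least squares regression of the CHEM102 composite mark on prior performance, engagement and PASS attendance. It is a sketch only: the variable names are hypothetical, the data are synthetic, and the actual variables and modelling choices used in the sprint are not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Synthetic stand-in for the linked CHEM101/CHEM102 cohort data.
df = pd.DataFrame({
    "chem101_mark": rng.normal(65, 12, n).clip(0, 100),
    "lms_hours_s1": rng.gamma(2.0, 10.0, n),   # online activity, semester 1
    "age": rng.normal(19, 2, n),
    "aptitude_index": rng.normal(70, 10, n),   # marker of prior aptitude
    "pass_weeks": rng.integers(0, 13, n),      # weeks of PASS attended
})
df["chem102_mark"] = (
    0.6 * df["chem101_mark"] + 0.1 * df["lms_hours_s1"]
    + 0.2 * df["aptitude_index"] + 0.1 * df["pass_weeks"]
    + rng.normal(0, 6, n)
).clip(0, 100)

# OLS model of the second-semester composite mark.
model = smf.ols(
    "chem102_mark ~ chem101_mark + lms_hours_s1 + age + aptitude_index + pass_weeks",
    data=df,
).fit()

print(model.rsquared)               # share of variance explained
print(model.params["pass_weeks"])   # marginal effect per week of PASS attended
print(model.pvalues["pass_weeks"])  # whether the PASS effect is significant
```

In a model of this shape, the coefficient on the PASS attendance term corresponds to the reported effect of roughly 0.1 marks per week attended, while the R-squared corresponds to the share of variance explained by the full set of predictors.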

3.2 The French Language Curriculum

In 2018, the learning analytics team carried out a Research Sprint in collaboration with the French Language teaching team. There were cost pressures surrounding this context, including fewer teaching hours and the prospect of less Faculty financial support for supplemental instruction (PASS). This informed changes to the curriculum for the French major within the Bachelor of Arts, which involved students undertaking regular oral assessment hurdle tasks (through the LMS, Moodle). A hurdle assessment task is one which must be completed to a specified standard before the student can progress to the next learning activity or next assessment task. Such tasks function as a type of formative feedback and are helpful in enabling students to experience success through the iterative and sequential development of areas of skill. The course teaching team were interested in finding out more about online student activity. The curiosity of this teaching team and their openness to ‘finding out’ (even if that meant that their hopes were not realised) made this team another good candidate for Step 1 (the intentional and selective recruitment) of the Research Sprint process. The teaching team were particularly interested in finding out whether students interacted with each other as intended by the educational design of the units. A further question related to whether there was any relationship between online peer interactions and student academic performance. The teaching team’s capacity to explain both the rationale for their inquiries and the complex educational design of the core French language units greatly assisted in the identification of a set of questions and assumptions (Step 2 of the Research Sprint process).

The principal source of formative feedback (using an online tool, Poodle) for these assessment tasks was peer feedback. The system was arranged so that first year students (those enrolled in 100-level units) received their feedback from second and third year students (those enrolled in 200- and 300-level units), and those in second year (200-level students) received their feedback from students in their final year (300-level students). The analysis provided by this Research Sprint offered a range of useful information about student online interaction generally. Of particular interest was the connection between the number of online audio posts and final marks (keeping in mind that students were posting online as a form of learning activity in the units they were enrolled in and that students at 200- and 300-level were providing feedback to students in earlier-year cohorts’ classes). Results of the Research Sprint showed that students who attained high marks in their French unit tended to have a broad range of audio post counts (from small to large). Those with low marks (including those with a fail grade) tended, almost exclusively, to have a pattern of low interaction. More generally, almost all students with a high number of online activity hours had good final marks, whereas all students who failed (achieved less than 50/100 as their final composite result) had a low number of activity hours. The French Language teaching team took this information to be indicative of the success of the peer-facilitated formative feedback approach and saw benefits of this approach both for those students giving peer feedback and for those receiving it.
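One simple way to surface the pattern described here is to compare online engagement between students who passed and those who failed. The sketch below does this on synthetic data with hypothetical column names; it is not the analysis conducted in the sprint, only an illustration of the kind of grouping involved.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 120

# Synthetic stand-in for Moodle activity joined to final composite marks.
df = pd.DataFrame({
    "audio_posts": rng.poisson(8, n),
    "activity_hours": rng.gamma(2.0, 5.0, n),
    "final_mark": rng.normal(62, 15, n).clip(0, 100),
})
df["outcome"] = np.where(df["final_mark"] >= 50, "pass", "fail")

# Distribution of online engagement by pass/fail outcome.
summary = df.groupby("outcome")[["audio_posts", "activity_hours"]].describe()
print(summary)

# Share of low-activity students (bottom quartile of hours) in each outcome group.
low_cut = df["activity_hours"].quantile(0.25)
print(df.assign(low_activity=df["activity_hours"] < low_cut)
        .groupby("outcome")["low_activity"].mean())
```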

In terms of the Research Sprint process, undertaking the data analysis outlined in Step 3 required access to an additional source of data beyond the data warehouse accessed for much of the routine learning analytics work. This additional data came from aspects of the LMS not already integrated into the data warehouse, where students had uploaded digital recordings of their own oral assessment tasks and (for 200-level and 300-level students) their peer feedback on those recordings using Poodle. Responding to this Research Sprint therefore necessitated a new approach to gaining access to the relevant data sources and turning that information into a form useful for analysis. The iterative nature of the Research Sprint process also came into play here, with several cycles between question, data analysis and discussion of results. This Research Sprint is an example of analysis based, at least partially, on the same data sets used to create the system-generated reports mentioned above, but designed to address questions that those standard reports could not answer.

3.3 The Analysis of Student Course Progress

The Course Progress Policy at UOW aims to support students to achieve success in their studies and to complete their qualification within a reasonable timeframe and without incurring unnecessary tuition fee debt. The Policy sets out the requirements for achieving satisfactory course progress (achieving passing grades in over 50% of the credit points in which the student is enrolled in each academic semester) and the processes for informing students of, and referring them to, intervention strategies to assist in the achievement of satisfactory course progress. While these are specific requirements of the Australian National Code of Practice for Providers of Education and Training to Overseas Students 2018, they are good practices to apply to the institution’s support of all students.
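Expressed as a rule, satisfactory course progress under the policy amounts to passing more than 50% of the credit points in which a student is enrolled in a semester. The snippet below is a minimal, hypothetical encoding of that check; the data structure and credit-point values are illustrative only, not the university's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class UnitResult:
    credit_points: int
    passed: bool

def satisfactory_progress(results: list[UnitResult]) -> bool:
    """Passing grades in over 50% of enrolled credit points for the semester."""
    enrolled = sum(r.credit_points for r in results)
    passed = sum(r.credit_points for r in results if r.passed)
    return enrolled > 0 and passed > 0.5 * enrolled

# Example: 24 credit points enrolled, 12 passed -> exactly 50%, not satisfactory.
semester = [UnitResult(6, True), UnitResult(6, True),
            UnitResult(6, False), UnitResult(6, False)]
print(satisfactory_progress(semester))  # False
```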

The 2014 review of the Course Progress Policy resulted in some significant changes to the policy which, it was hoped, would have a positive impact on a student’s ability to progress through a course of study. In early 2018 a Research Sprint conducted by the learning analytics team at the request of the Chair of the Coursework Exclusion Appeals Committee analysed the progression of the cohort of students who first enrolled in an award course at UOW in 2014. This was the first cohort to which the revised policy provisions applied. The aim of this Research Sprint was to gain insight into the effect of the Course Progress policy on student progress (and whether it was effective in achieving its objectives). The Research Sprint also served to inform the 2018 review of the UOW Course Progress Policy as well as the ongoing development of strategies to support students affected by the Course Progress Rules.

In terms of Step 3 of the Research Sprint process (data preparation and analysis), this Research Sprint was interesting because the same infrastructure used for the other two Research Sprints was used for a different purpose. Having the data warehouse infrastructure in place for a number of years meant historical data was available for a deeper analysis of student trajectories. Here, each student’s pathway through their course was considered up to the point at which they either: (1) changed to a course at a different level (postgraduate/undergraduate), (2) completed their course, or (3) had their most recent enrolment status update. The Research Sprint found that outcomes worsened at each successive stage of the course status pathway, with completion rates roughly halved for students who did not achieve passing grades in over 50% of enrolled credit points compared to those who did. Completion rates halved again for students who did not achieve passing grades in over 50% of enrolled credit points in consecutive semesters.
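The headline finding here is essentially a completion-rate comparison across stages of the course-status pathway. The sketch below shows, on a small synthetic cohort with hypothetical column names, how such a comparison could be computed; it is an illustration of the grouping involved rather than the sprint's actual analysis.

```python
import pandas as pd

# Hypothetical cohort extract: one row per student, with the number of
# semesters in which they failed to pass >50% of enrolled credit points,
# and whether they ultimately completed their course.
cohort = pd.DataFrame({
    "student_id": range(12),
    "unsatisfactory_semesters": [0, 0, 0, 0, 0, 0, 1, 1, 1, 2, 2, 2],
    "completed":               [1, 1, 1, 1, 1, 0, 1, 0, 0, 1, 0, 0],
})

# Completion rate at each stage of the course-status pathway.
completion_by_stage = (
    cohort.groupby("unsatisfactory_semesters")["completed"]
          .agg(students="count", completion_rate="mean")
)
print(completion_by_stage)
```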

The outcome of the First Year Chemistry Research Sprint demonstrated small effects of ‘interventions’ such as PASS. The French Language Curriculum Research Sprint demonstrated that the online peer interactions of students were taking place in the way anticipated and that there was a relationship between poor academic outcomes and low levels of online interaction. Each of these findings, while complex to analyse, was easy to interpret. For the Course Progress Research Sprint, Step 4 (discussion of the results) and Step 5 (drawing conclusions) required intense and close interaction between the data analyst and the key stakeholder. Underlying this were the procedures used to implement the Course Progress Policy (especially in relation to the changes in student course status from ‘active’, to ‘referral’, to ‘restricted’ and ‘excluded’), and the meaning of the data was not immediately obvious. Ultimately, the Chair of the UOW Coursework Exclusion Appeal Committee was pleasantly surprised to find just how effectively the Course Progress Policy was working to assist students to return to a course status of ‘active’ and to eventually complete their studies. Although failure to meet course progress requirements was indicative of a higher risk of non-completion, the vast majority of students whose course status changed to ‘referral’ because of a lack of course progress in one academic semester eventually returned to an ‘active’ course status and went on to graduate. The evidence from this Research Sprint was used in tandem with the work of the Coursework Exclusion Appeal Committee to confirm effective implementation of the Course Progress Policy. By supporting all students to successfully complete their studies in a timely way, the University can help reduce the financial impact of higher education study for individual students and enhance their learning experience.

Step 6 (communication of results so that learnings can be applied to practice) of the Research Sprint process was important for this sprint because the stakeholders in the policy and in the process of managing students’ course progress were very diverse, with a broad range of knowledge backgrounds and purposes. Providing the data analysis in the form of a range of tables and graphs enabled users of this information to quickly and easily interpret the meaning of the analysis, the conclusions reached and the implications for action. Since this report was produced, aspects of it have been provided to the committee responsible for the review of the Course Progress Policy, the central unit responsible for implementing the procedures arising from the policy, and the team leader of the central student support advisers.

4 Conclusion

Top-down aspects of learning analytics at UOW have generated scalable and sustainable practices. The governance structure has had oversight across the university, a data use policy has been put in place to help protect staff and students, and the technical infrastructure has utilised a data warehouse that caters for key aspects of the UOW learning platform. Frontline Research Sprints have helped address questions within particular teaching and learning contexts that learning analytics can help answer but that are not addressed by the standard reports generated for unit coordinators in the top-down approach. The future direction for learning analytics at UOW will likely involve bringing these two aspects closer together. Questions covered in Research Sprints will be used to develop and test new prototypes without the risk and resourcing implications of a full implementation. The intention here is to extend the learning analytics capabilities provided in the top-down approach with functionality shown to be useful for a number of academic staff. Key to this will be a collaborative design process with academic staff to better understand their needs related to learning analytics, so that problems can be re-framed, many ideas created, and a hands-on approach adopted in prototyping and testing new learning analytics capabilities (Plattner, Meinel, & Leifer, 2010; Retna, 2016). Such an approach is consistent with findings from recent studies that reinforce the need for a greater emphasis on the human utilisation of learning analytics over the technical design aspects (Howell et al., 2018; Klein et al., 2019; Leitner, Ebner, & Ebner, 2019). The classroom, in its broadest sense, is where the majority of student retention opportunities lie, and learning analytics is but one tool used in a variety of teaching and learning practices. This also rings true when casting the net wider than student retention and considering how learning analytics is best integrated into classroom practice to support innovations of any kind. Teaching staff are the gateway for this, so it is important to investigate and better understand their needs and practices associated with learning analytics. With a future mandate and resourcing to do so, it would be possible to gather evidence more systematically through stakeholder evaluation. As they stand, the examples described in this chapter offer ideas about ways of implementing institutional learning analytics that complement existing “top-down” approaches and integrate stakeholders into the development process.

4.1 Future Directions

The growing use of data in other aspects of the university also has implications for the future direction of learning analytics at UOW. Until relatively recently, the learning analytics work undertaken at UOW operated with a dual governance structure: as mentioned earlier, one governance committee focused on decision making and management of learning analytics, and a separate group focused on ethical implications arising from secondary use of student data. Other initiatives at UOW are emerging that represent a broader focus akin to ‘academic analytics’ (Siemens & Long, 2011). The potential benefits of decisions informed by analysis of student data traverse different levels of the university: student, teacher, faculty, institution and so on (Ifenthaler, 2017). The number of insights that could be derived from the available data will likely always outweigh what can be reasonably resourced when there is constant pressure to “do more with less”. The rapid pace of technological change requires each of these initiatives based on student data to be treated as a living ecosystem in order to effectively address ethical considerations and ensure responsible use of data that protects all stakeholders: students, teachers, researchers, support staff and administrators. In recognition of this, the governance of all analytics initiatives based on student data at UOW is undergoing the changes necessary to guide future work.

Of relevance for a way forward at UOW is the approach taken at the Open University UK, whereby cross-functional teams are established for each faculty comprising technical, pedagogical and stakeholder management expertise (Rienties, Cross, Marsh, & Ullmann, 2017). Findings from recent studies point to shortages of people who combine practical data science skills with knowledge of learning and teaching (Gašević et al., 2016; Ifenthaler, 2017; Rienties, Herodotou, Olney, Schencks, & Boroowa, 2018). It is unlikely that all of these capabilities will reside in any one individual. In recognition of this, decision making needs to be approached collaboratively, drawing on a variety of expertise to develop evidence-based solutions implemented in ways that meet student needs and facilitate their success (Klein et al., 2019). This is consonant with broader trends in data science, whereby a range of diverse talents is required to ask smart questions and communicate insights and what they mean for practice (Berinato, 2019).

The experience at UOW is consistent with the finding of Colvin et al. (2015) which suggests that the situated practice of learning analytics implementations generates future capacity. It is worth considering one perspective on how that works by reflecting on Boud and Brew’s (2013) work on academic development where they suggest that,

a conscious focus on academic practice qua practice can fundamentally shift one’s perspectives on professional learning. It moves from the consideration of learning as something that individuals do, to seeing learning as a social process occurring within the context of practice. Viewing learning as a constructed and emergent phenomenon arising in and from academic work positions academic development as a process of working with opportunities for learning created by work itself. Some aspects of this work foster, and others inhibit, learning, and an important task for the academic developer is to work with academics to engage with helpful and unhelpful facets of work in relation to their learning (pp. 209–210).

This chapter casts the work undertaken by the learning analytics team in close collaboration with teaching and other staff, particularly but not only the Research Sprints, as an approach which ‘fosters’ learning by staff by engaging one another in a social process within the context of their practice. In other words, each of these Research Sprint projects was itself a form of peer learning taking place in the situated practice of the stakeholders themselves, thereby enhancing the development of future capacity for both the learning analytics team and the stakeholders with whom they collaborate. This builds trust in the practice of learning analytics by making it relevant to academic practice (or university governance practice) itself. It is not clear whether the use of Research Sprints will be sustainable and scalable, or whether the findings from such sprints are applicable beyond the small specialist work group involved in each sprint. What is clear is that this type of endeavor connects with the ‘lived experience’ of teaching and other staff and works to build trust in the practice of learning analytics. Building trust in the use of ‘big data’ should in turn result in more consistent uptake of learning analytics tools.

At UOW, an initial focus on the near real-time provision of reports through the lens of retaining students resulted in system-generated reports scaled across the institution. These have been supplemented by other system-generated reports aimed at identifying overall patterns of student engagement with learning opportunities in each unit. This gave stakeholders a certain understanding of the conceptualisation of learning analytics at UOW, which in turn revealed constraints in the depth of insights provided to teachers in the context of their practice. Research Sprints were formulated in response to this observation as a way to uncover important questions about student learning in particular settings, conduct customised analyses for these questions, and co-construct new knowledge claims that inform practice. This reinforces the importance of putting in place iterative processes that continually refine the development and implementation of learning analytics. Future work aims to bring the top-down and bottom-up elements closer together so that students and teachers have more nuanced, contextualised and thus more trusted tools to enhance educational practice.