
1.1 Introduction to Learning Analytics

A key factor that can significantly improve the quality of education is data (Banihashem & Aliabadi, 2017). Long and Siemens (2014) predicted that Big Data and Analytics would play a significant role in the future of higher education because they make visible patterns that cannot otherwise be seen. Big Data refers to datasets too large for humans to analyze and process; they must be computed by machine to uncover the trends, associations, and patterns specific to human behavior and interactions (Chen, Mao, & Liu, 2014). For educational institutions, data is critical: it helps educators deliver better outcomes to their students. Data is analyzed to answer questions such as why a student failed an exam, why an individual quit a course, or why a particular concept was not understood (Reid-Martinez, 2015). Cooper (2012) stated that “analytics is the process of developing actionable insights through problem definition and applying statistical models and analysis against existing and/or simulated future data.” Analytics, then, is the procedure of uncovering and processing data to reach implementable insight. From 2008, analytics began to be applied in education to optimize students’ learning, and since 2010 Learning Analytics has emerged as an independent area (Reid-Martinez, 2015).

“Learning Analytics” incorporates a wide range of techniques used to gather, store, and report data for administrative, programmatic, and pedagogical purposes. These data support everything from measuring retention and understanding student progress to tracking course tool usage and providing granular, personalized, student-specific insight.

Equally varied are the sources of these data, ranging from student information systems and learning management systems to task-specific learning tools. The reporting on these data is both descriptive and predictive and relies on various tools, from business intelligence suites to custom algorithms.

Learning Analytics’ growth reflects the national surge in evidence-based learning and the growing affordability of digital learning content delivery. It remains to be seen if and how pedagogical interests, technical challenges, and cultural barriers will affect the implementation of Learning Analytics.

History

Learning Analytics grew out of two crucial business needs: understanding how the internal organization behaves and understanding the behavior of the end consumer. Technological advances enabled businesses to use diverse systems such as PeopleSoft to derive information about internal and external behavior through data collection (Shum & Ferguson, 2012). Costello and Mitchell used the phrase Learning Analytics in 2000 to analyze international market opportunities for online learning products (Mitchell, 2000). In 2005, the education technology company Blackboard used the same term to describe the reports produced by its learning management system (LMS) for academics, instructors, and decision-makers (Baepler & Murdoch, 2010).

Knowledge Discovery in Databases (KDD) is an underpinning practice of data mining, which centers on gathering and analyzing considerable amounts of data and incorporates several aspects of computing, including logic programming and decision-tree construction (Romero & Ventura, 2007). According to Baker and Yacef, Educational Data Mining (EDM) can be defined as “an emerging discipline, concerned with developing methods for exploring the unique types of data from educational settings and using those methods to better understand students, and the settings in which they learn” (Baker & Yacef, 2009). Baker and Yacef trace the roots of EDM back to 1995.

The Learning Analytics field received a significant boost in implementation across education with affordable LMSs such as Blackboard, Moodle, and Desire2Learn, which provide excellent visual representations of large quantities of student information (Shum & Ferguson, 2012).

The Motivation for Learning Analytics

Data in educational settings is undoubtedly not new. For decades, institutional researchers, managers, and other interested stakeholders have used educational data to analyze and evaluate schools and programs, track graduation and retention levels, and make enrollment and resource predictions.

Data from intelligent agents, learning objects, games, and simulations at the learner level have allowed scientists to inform their pedagogical and cognitive research.

Although educational data is not new, the term “Learning Analytics” has emerged as an essential focal point for higher education in recent years and is often driven by language such as “in the service of improving learning and education” (Macfadyen & Dawson, 2012) or “evaluating the effectiveness of online instruction in delivering user-centric quality undergraduate education” (UCOP, 2010).

Administrators increasingly use business intelligence software to predict student recruitment and retention trends. Faculty use LMS data to gain new insights into their classes and students. Vendors enter this market hoping to provide services that institutions cannot build themselves. Politicians, corporate executives, and accreditors look to evidence of student learning to ensure quality and validate the public investment.

The current surge of interest in Learning Analytics can be attributed to the availability, scale, and granularity of educational data from multiple new sources, processed with business intelligence tools and predictive statistical methods and aimed at various new goals and insights. Contemporary Learning Analytics encompasses new technologies, applications, reporting methods, descriptive and predictive methods, data mining, and academic and nonacademic accountability pressures.

Across businesses, industries, government, and other sectors, vast amounts of information are accumulated and processed to build an understanding of people’s activities and to optimize processes and results.

Organizations’ expansion into other sectors depends mainly on analyzing customers’ purchasing patterns, which is achieved using business intelligence and data mining software. In higher education, massive datasets provide information about learners, their study environments, and their patterns of studying. Universities are at a nascent stage, trying to understand how to use the available data to enhance their students’ educational experience.

Defining Learning Analytics

Learning Analytics blends various scholarly disciplines, notably educational data mining and predictive modeling. In 2007, Campbell, DeBlois, and Oblinger (2007) argued that ‘academic analytics’ could help address growing difficulties in US higher education, such as poor retention rates. In 2010, Learning Analytics began to establish itself as a discipline (Ferguson, 2012); academic analytics, by contrast, was regarded as a tool for institutional business, focused on recruitment more than on learning. In 2010, the Society for Learning Analytics Research (SoLAR) was founded, and it gave the frequently cited definition of Learning Analytics (Siemens et al., 2011):

Learning analytics is the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

There are overlaps among the three rapidly evolving fields of educational data mining, Learning Analytics, and academic analytics. EDM is mainly oriented to technological problems, generating value from the Big Data linked to learning. Learning Analytics is about improving the elements of learning, whereas academic analytics focuses more on the use of data for marketing and management (Ferguson, 2012; Long & Siemens, 2011). The Learning Analytics field is influenced by many disciplines, including education, psychology, linguistics, philosophy, sociology, the learning sciences, statistics, information technology, computer science, and Artificial Intelligence. The two disciplines from which key researchers in the field predominantly come are information technology and education (Dawson, Gašević, Siemens, & Joksimovic, 2014).

Components of Learning Analytics

The three elements (Fig. 1.1) that work together in Learning Analytics are Data, Analysis, and Action (CommLab India, n.d.). ‘Data’ covers all of the information gathered from the learners and the learning experiences in which they engage. ‘Analysis’ brings the collected data together and assesses the impact of the training on the organization’s productivity and progress. ‘Action’ comprises the decisions you take and the changes you make based on the data analysis.

Fig. 1.1 Learning analytics components. (CommLab India, n.d.)

Data

Data is a crucial asset in analytics: it is the raw material that becomes analytical insight. Learning Analytics data captures student knowledge, the learning environment, learning experiences, and learning outcomes, and it is typically collected while the learning process is going on. Data comes from various sources, such as Student Information Systems (SIS), which provide demographic and academic data; Learning Management Systems (LMS), which provide student activity reports and performance information; and other systems providing different types of information.
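As a minimal sketch of how two such sources can be combined, the Python fragment below joins a hypothetical SIS extract with a hypothetical LMS activity extract; the file names and column layouts are illustrative assumptions, not a prescribed schema.

    import pandas as pd

    # Hypothetical extracts: demographics from the SIS, activity counts from the LMS.
    sis = pd.read_csv("sis_students.csv")  # columns: student_id, program, entry_grade
    lms = pd.read_csv("lms_activity.csv")  # columns: student_id, logins, forum_posts, quiz_avg

    # Join the two sources on the student identifier: one analytics record per learner.
    records = sis.merge(lms, on="student_id", how="left")

    # Students with no LMS row have no recorded activity; make that explicit.
    records[["logins", "forum_posts"]] = records[["logins", "forum_posts"]].fillna(0)
    print(records.head())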

Analysis

Analysis applies algorithms to the gathered data to extract actionable results. Data analysis incorporates machine learning techniques built on mathematical and statistical algorithms. In general, more sophisticated algorithms yield better insights; on the other hand, complex algorithms place higher demands on the volume of data to be processed, the type of data, the timeframe for analysis, and so on. The resulting analysis can be categorized into Descriptive Learning Analytics, which understands the past and provides reactive solutions to influence future learning, and Predictive Learning Analytics, which understands the present and provides proactive solutions to improve the existing learning process. Selecting the right data and algorithms is what makes a Learning Analytics implementation succeed.
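The contrast between the two categories can be sketched in a few lines of Python with scikit-learn. This is an illustrative sketch only: the engagement features and the historical failed_course label are assumptions, not fields any particular LMS provides.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    records = pd.read_csv("merged_records.csv")  # hypothetical merged SIS + LMS extract

    # Descriptive: summarize what has already happened, per program.
    print(records.groupby("program")["quiz_avg"].mean())

    # Predictive: fit a model that flags likely at-risk students from engagement.
    X = records[["logins", "forum_posts", "quiz_avg"]]
    y = records["failed_course"]  # 1 = failed, 0 = passed (historical outcome)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))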

Action

Taking action is the ultimate aim of any Learning Analytics process; failure to act is a complete failure. It does not matter how good your predictions are if you do not want to, or cannot, act on them, and your Learning Analytics effort is incomplete without changes based on the reports it produces. The results of follow-up actions determine whether the analytical effort succeeds or fails. Action is a matter of leadership and culture: the institution’s leadership team must fully embrace the behaviors analytics calls for, and schools, companies, and other organizations need to build an internal data-driven culture. Intervention following data analysis should become the organization’s typical pattern of behavior, with the right internal processes in place to make those interventions happen.

What do we do if our analytics reveal at-risk students, missed learning processes, or low-quality training content? Do we let that go without any action? If so, we face further ethical considerations.

What can I use learning analytics for?

Learning Analytics can be used by instructors in various ways (Academic Technology, 2020).

  1. Assess Learning Behavior: Learning Analytics collects user-generated data from learning activities and reveals engagement trends. Analyzing these data exposes students’ behavioral learning styles, letting us measure engagement rather than focus only on performance, and giving the instructor insight into how students understand their course material.

  2. Evaluate Social Learning: Learning Analytics can analyze a learner’s behavior on any interactive social network, such as online conversations in Canvas, to determine the benefits of social learning. It records and tests student-to-student and student-to-instructor interactions to understand whether students benefit from social learning in their course.

  3. Improve Learning Materials & Tools: Learning Analytics can track students’ use of learning materials and tools to identify potential problems or deficiencies and to offer an objective assessment of those materials and tools. It helps teachers change strategies deliberately: through aggregated student data, they can see ways to enhance the learning process or their course structure.

  4. Individualized Learning: Learning Analytics allows instructors to adapt and customize course content for each student. Data is collected and analyzed against the individual user profile to produce a more personalized learning experience, with continuous feedback to individual students to enhance their learning.

  5. Predict Student Performance: Based on existing data on learning engagement and performance, Learning Analytics employs statistical and machine learning models to predict future learning performance and to flag potentially at-risk students for personalized support. The emphasis is on using data so the teacher can intervene quickly and help the student correct course before it is too late.

  6. Visualize Learning Activities: Learning Analytics tracks the learning tasks users perform in a digital environment and provides visual reporting on the learning process (a minimal plotting sketch follows this list). These reports help students and teachers encourage learning, improve practices, and enhance performance; the goal is to strengthen learning habits, understanding of behaviors, and self-reflection among students.
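The sketch below illustrates the kind of visual report meant in point 6, using matplotlib; the weekly activity file and its columns are hypothetical.

    import pandas as pd
    import matplotlib.pyplot as plt

    activity = pd.read_csv("weekly_activity.csv")  # hypothetical: week, logins, forum_posts

    # One line per activity type, so students and teachers can spot drop-offs early.
    fig, ax = plt.subplots()
    ax.plot(activity["week"], activity["logins"], label="logins")
    ax.plot(activity["week"], activity["forum_posts"], label="forum posts")
    ax.set_xlabel("week of term")
    ax.set_ylabel("events per student")
    ax.legend()
    fig.savefig("learning_activity.png")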

Is academic success predictable through analytics?

One of the core ideas of Predictive Learning Analytics (Sclater, Peasgood, & Mullan, 2016) is that measures of student participation, drawn from the VLE, assessment submissions, and additional data, can serve as a proxy for learning and, therefore, for probable academic success. Many studies confirm that more engaged students are likely to perform better. For example, a model developed at St. Louis University in Missouri showed that students’ exposure to learning material and the gradebook contributed to their final grade; however, this was considered an unsurprising finding that offered no useful insights to assist students (Buerck & Mudigonda, 2014). In some studies, the highest rates of participation are not associated with the best outcomes: some of the most engaged students are weaker students working very hard to improve their performance.

Nonetheless, a case study in the report (Sclater et al., 2016) offers clear details and actual results: at the University of South Australia, 730 students across different courses were classified as at risk. Of these, 549 were contacted; their average Grade Point Average (GPA) was 4.29, while the at-risk students who were not contacted averaged a GPA of 3.14. This seems to be a crucial finding, suggesting that intervention approaches for failing students may be highly valuable to institutions: students identified as at risk but left alone not only fare much worse but are also substantially more likely to fail. However, the scarcity in the literature of such data and of rigorous, empirical, reproducible studies still makes it difficult to justify strong claims about the impact of Learning Analytics (Sclater et al., 2016).

Types of Learning Analytics

Four types of Learning Analytics can be distinguished (Fig. 1.2) (Lewis, n.d.).

Fig. 1.2 Four types of learning analytics. (Mehta, 2017)

Descriptive analytics

Descriptive analytics addresses questions about what happened. It collects multi-source data to provide insights into past performance, which can inform decisions about future training programs. For example, if the data suggest high dropout levels, you can take steps to improve the content or adopt a more engaging learning approach. These findings let you improve training programs and even eliminate courses that waste the organization’s money and resources. However, descriptive analytics is limited to showing that something happened without specifying why.
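A descriptive analysis can be as simple as the following sketch, which computes per-course completion and dropout rates; the enrolment file and its completed flag are hypothetical.

    import pandas as pd

    enrolments = pd.read_csv("enrolments.csv")  # hypothetical: course_id, student_id, completed

    # What happened: completion and dropout rates per course, nothing more.
    summary = (enrolments.groupby("course_id")["completed"]
               .agg(enrolled="count", completion_rate="mean"))
    summary["dropout_rate"] = 1 - summary["completion_rate"]
    print(summary.sort_values("dropout_rate", ascending=False).head())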

Diagnostic analytics

Diagnostic analytics addresses why something happened. It requires recognizing the relevant factors and locating trends to learn about a particular problem or opportunity. For example, diagnostic data may indicate that a customer-service eLearning course was rated low by senior managers while new hires found it useful. Further diagnosis may reveal that the course content is too basic for senior management, suggesting that the organization needs an advanced-level customer-service course. Such deeper investigation shows that learners’ individual needs must be addressed and the learning experience tailored; this helps ensure that the training curriculum is not repetitive while benefiting all learners.

Predictive Analytics

Predictive analytics states what is likely to happen, using current data to forecast the future. It is essential to bear in mind that forecasts are only estimates whose precision depends heavily on data quality and stability, so the data should be carefully evaluated. Predictive analytics can help anticipate the difficulties learners may encounter, allowing L&D managers to build early interventions and tailored support. It can also be used to boost training efficiency and engagement. A post-course survey may show, for example, that some learners dislike taking the eLearning program on a desktop; because they are time-pressed and always on the go, they prefer to access the training anytime, anywhere, on their mobile devices. In this situation, learner profiles and predictive analytics let you zero in and deliver microlearning solutions to suit those needs.

Prescriptive Analytics

Prescriptive analytics seeks answers to questions about what should be done: in addition to forecasting what will happen, it recommends how to respond. Prescriptive analytics also helps you manage training programs strategically. Suppose, for example, an eLearning program must be developed for employees in the manufacturing industry, and surveys of past courses reveal two things: the courses are theoretically excellent, but students would benefit from learning how to transfer or apply the material to their work. Simulations can then be introduced gradually to help students apply the learning in a simulated environment, increasing the effectiveness and benefit of the training program.

Again, data powers the world today. Learning Analytics offers decision-makers a deeper understanding of how training courses align with learner objectives and needs. L&D leaders and their stakeholders have a vast opportunity to make data-driven decisions and, above all, to use Learning Analytics. If you have not started using Learning Analytics to boost your training programs’ efficiency and ROI, it is time for your organization to focus on it seriously.

1.2 Learning Analytics: A New and Rapidly Developing Field

Learning Analytics can be regarded as a complex and demanding field of study with several application areas, of which the processing of large-scale data is the most critical. It is a particularly effective approach in open and distance learning and in institutions with many participants. To apply this new methodology effectively, researchers should know, and even be trained in, the supporting fields involved. Figure 1.3 below shows nine of these fields; as it illustrates, Learning Analytics requires skills ranging from data analysis to advanced web programming languages and research methodology. Effective assessment can help clarify and improve the learning process (Astin et al., 1996).

Fig. 1.3 Required fields for learning analytics. (Firat & Yuzer, 2016)

A single researcher, sufficient for conventional science, is therefore not enough for Learning Analytics; instead, a larger team of experts drawn from the contributing fields is needed. In other words, Learning Analytics may be viewed, by its very nature, as an interdisciplinary research field focused on education.

Elias (2011) notes that Learning Analytics is closely linked to Web analytics, academic analytics, educational data mining, action analytics, and business intelligence. These relationships are summarized in Fig. 1.4, which indicates that business intelligence applies data analysis and algorithms to an institution’s processes to support strategic decisions, while Web analytics concerns the collection and analysis of data on website usage by visitors, learners, or customers.

Fig. 1.4 Learning analytics related fields. (Firat & Yuzer, 2016)

Academic analytics relates to the transfer of business intelligence concepts and instruments into academia (Goldstein & Katz, 2005). Data mining refers to the large-scale collection, analysis, interpretation, and recording of data. Compared with educational data mining, academic analytics applies statistical predictive models chiefly to enhance institutional decision-making processes.

Audience and Uses

“The goal of Learning Analytics is to enable teachers and schools to tailor educational opportunities to each student’s level of need and ability” (Johnson, Smith, Willis, Levine, & Haywood, 2011). Learning Analytics offers significant applications to learners, educators, and program coordinators. Its benefits include personalization of learning material, enhanced student motivation through immediate feedback, early identification of at-risk students, and data-driven curriculum and content design (Siemens et al., 2011).

Students

Institutions that use Learning Analytics give their students resources for receiving feedback on their progress in a course. Learning Analytics dashboards can present information on seminar attendance and time spent on online activities, forum participation rates, results of online assessments and quizzes, and marks for standardized written assignments and exams. Students can use the system’s suggestions to become more effective, focused learners. For example, if a questionnaire identifies that a student has a weak understanding of a particular subject area, the dashboard can recommend specific exercises and readings to strengthen it. Alternative learning approaches, based on the past experiences of student peers or of learners with a similar profile, can also be suggested. If a student starts falling behind, the system may suggest strategies for catching up with the rest of the class (Siemens et al., 2011).

Educators

Teachers may use Learning Analytics tools to track and gain insight into the various factors known to influence learners’ ongoing participation in a course, and use this to adapt their teaching, change their assignments, and intervene as appropriate (Siemens et al., 2011). The analytics algorithms align current online behavior with predictive models of success built from previous cohorts, drawing on a mixture of features: the courses a professor teaches, the demographics and learning methods of students, and the technology they employ. The analytics also provide insight into online student engagement, for example students’ sentiments about a subject, the liveliness of discussion around a topic, or the involvement of various learners (Siemens et al., 2011).

Program Coordinators

Program coordinators may use analytical data to assess the performance of a group of students or of all their students. They can evaluate the information to assess what works in a specific classroom and whether a specific learning plan improves student learning. The program’s quantitative learning data may be disaggregated by student subgroup, for example by teacher or year, to see how students perform without a course prerequisite or to compare success rates. Learning system data can help analyze how students respond to individual interventions and how an intervention can be improved, and can inform recommendations on strategies, courses, and administrative systems to enhance teaching, learning, and graduation rates (Bienkowski, Feng, & Means, 2014).

Researchers

Researchers use learner data from multiple systems to test learning theories and examine the utility of different teaching methods and elements of course design. Researchers working with online learning systems may run experiments in which students receive different teaching or learning methods at random, and program developers may give users alternate versions of the software: Version A or Version B. Such an “A/B testing” process can answer questions about student learning, such as whether students learn better when work on a given type of problem is concentrated in one session (“massed practice”) or divided over time (“spaced practice”). What about students’ retention of the skill? What sort of practice schedule best supports retention, for what kinds of students, and in which contexts (Bienkowski et al., 2014)?
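A hedged sketch of the analysis behind such an A/B comparison is shown below: a Welch two-sample t-test on post-test scores, assuming a hypothetical results file with a version column.

    import pandas as pd
    from scipy import stats

    results = pd.read_csv("ab_results.csv")  # hypothetical: student_id, version, post_test_score

    a = results.loc[results["version"] == "A", "post_test_score"]
    b = results.loc[results["version"] == "B", "post_test_score"]

    # Welch's t-test: does mean post-test performance differ between the two versions?
    t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
    print(f"mean A = {a.mean():.1f}, mean B = {b.mean():.1f}, p = {p_value:.3f}")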

How learning analytics works

With online education and students accessing increasingly digital content, the data sources available for evaluating study are expanding.

Data Sources

Much of this is big data, considered too large for traditional database systems to handle (Manyika et al., 2011). The principal source used for Learning Analytics is the Virtual Learning Environment (VLE), which lets students view timetables, grades, and course details, access learning materials, communicate with each other through forums, and submit assignments. The VLE is commonly used across modules, courses, and organizations. When it is the primary tool, as in distance learning programs or MOOCs, it is likely to provide a rich source of data, but Learning Analytics can also offer valuable insights into student participation in courses where the VLE is less central. The second primary data source is the Student Information System (SIS), which contains data on students, including prior qualifications, socio-economic status, ethnic group, module selection, and grades. All this is useful material that can be combined with VLE activity data to predict academic performance.

VLE and SIS data can be supplemented by other sources. Some institutions have monitoring systems that record students’ campus visits or their presence in specific places such as libraries, lecture halls, and refectories, reported from swipe cards, proximity cards, other entry systems, or students’ connections to institutional Wi-Fi. Library information can also be added to the mix: in some institutions, data such as library visits, book borrowing records, and access to digital journals is presented to personal tutors on dashboards to facilitate discussions with individuals or groups.

For example, past data analytics for a module may show that frequent access to library resources is associated with success. If the current cohort’s statistics reveal that fewer students than expected have visited the library so far, a teacher can bring this to the group’s attention to change the students’ behavior, citing evidence from previous students to back up the suggestion.
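The following sketch shows the sort of check just described, first measuring the historical association and then flagging a current cohort that is behind pace; the file names, columns, and halfway-through-term heuristic are all illustrative assumptions.

    import pandas as pd

    history = pd.read_csv("module_history.csv")  # hypothetical: student_id, library_visits, final_mark

    # Past cohorts: how strongly do library visits track final marks?
    print("correlation:", round(history["library_visits"].corr(history["final_mark"]), 2))

    # Current cohort: flag if visits so far fall short of the historical pace.
    current = pd.read_csv("current_cohort.csv")  # hypothetical: student_id, library_visits
    expected_so_far = history["library_visits"].median() / 2  # e.g., halfway through term
    behind = (current["library_visits"] < expected_so_far).mean()
    print(f"{behind:.0%} of the cohort is behind the historical library-use pace")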

Technical infrastructure

The emerging state of the related technology and the absence of consolidation in the industry are problems for organizations wanting to invest in Learning Analytics. Competition for market supremacy is taking place among VLEs (e.g., Blackboard), SISs (e.g., Tribal), business intelligence and visualization applications (e.g., IBM Cognos, Tableau, QlikView), and emerging personalized Learning Analytics packages (e.g., Civitas Learning). The HeLF survey (Newland, Martin, & Ringan, 2015) found that institutions are considering a variety of solutions, including internal developments. This mirrors Jisc’s finding of very limited common ground among the Learning Analytics frameworks of the 13 leading institutions it consulted (Sclater, 2014). One of the higher education community’s key demands was for a basic Learning Analytics solution with which institutions could experiment. Figure 1.5 shows the resulting architecture.

Fig. 1.5 Learning analytics – technical infrastructure. (Sclater et al., 2016)

It illustrates how data from sources such as the VLE, SIS, library systems, and students’ self-declared data flow into the Learning Analytics warehouse. The architecture’s cornerstone is the Learning Analytics processor, where predictive analytics are run and action is driven by the alert and intervention system. Staff analytics are presented in a series of dashboards, and a student app lets students view their data and compare it with others’. The app shows students how their performance and accomplishments measure up against other students’, helping them plan and set their learning expectations, and reinforcing their motivation when those expectations are met. Meanwhile, a student consent service helps protect privacy by requiring students to authorize the collection and use of their data.
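The alert-and-intervention step can be pictured as a simple thresholding rule over the processor’s risk scores, as in the hypothetical sketch below; the threshold, score format, and intervention record are assumptions, not part of the Jisc architecture.

    # Hypothetical sketch: turn predicted risk scores into intervention records.
    RISK_THRESHOLD = 0.7

    def check_students(scores: dict) -> list:
        """scores maps student_id -> predicted probability of failing."""
        interventions = []
        for student_id, risk in scores.items():
            if risk >= RISK_THRESHOLD:
                interventions.append({
                    "student_id": student_id,
                    "risk": risk,
                    "action": "contact by personal tutor",
                })
        return interventions

    print(check_students({"s001": 0.82, "s002": 0.35, "s003": 0.71}))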

The platform will be freely accessible to organizations within the first 2 years as a mix of commercial and open-source components providing an open, multi-tenanted cloud solution. This lets organizations share the same highly scalable architecture while keeping full control over their own data. Institutions can also select individual components if they do not want to deploy the entire solution.

1.3 Benefits and Challenges of Learning Analytics

With the vast development of comprehensive administrative systems, the collection and interpretation of academic and personal information within educational environments, and educational data management generally, are becoming complex. Several concepts relate directly to the collection of this information: data mining, academic analytics, and Learning Analytics. These terms, however, are still often misunderstood and lack universally accepted, applicable definitions. Educational data mining (EDM) refers to extracting valuable information from a wide variety of diverse educational datasets. Academic analytics (AA) involves discovering clear trends in education to provide insight into student problems (e.g., retention, success rates). Learning Analytics emphasizes insights into, and answers about, real-time learning processes based on digital learning environments, administrative systems, and social media; such dynamic information is used to interpret, model, predict, and optimize learning, learning environments, and educational decision-making in real time (Ifenthaler, 2014; Ifenthaler et al., 2014; Ifenthaler & Widanapathirana, 2014).

Benefits of Learning Analytics

The benefits of Learning Analytics can be related to four stakeholder levels (Ifenthaler, 2015) (see Fig. 1.6): the mega-level (governance), the macro-level (institution), the meso-level (curriculum, teacher/tutor), and the micro-level (learner, online learning environment, OLE). Real-time access to, and analysis and modeling of, the applicable educational information is an essential prerequisite for these benefits.

Fig. 1.6 Learning analytics associated with stakeholder levels. (Ifenthaler, 2015)

The mega-level promotes cross-institutional analytics by integrating data from all layers of the system. Such rich datasets allow trends to be recognized and validated within and across institutions, providing useful insights for educational policymaking. The macro-level enables institution-wide analytics to better understand learner cohorts, automate related processes, and allocate vital resources toward minimizing dropout, growing retention, and improving performance.

The meso-level supports curriculum and learning design and offers facilitators (i.e., teachers, tutors) comprehensive insights into learning processes. This knowledge can enhance the overall quality of courses (e.g., sequencing of learning processes, alignment with higher-level outcomes) and improve learning content (e.g., aligning it with expected outcomes and related assessments).

Micro-level analytics supports learners through recommendations and support functions in the OLE. Learners benefit from these flexible, versatile scaffolds and are expected to perform better.

Another critical factor for enhancing the benefits of Learning Analytics is information not directly connected to educational data, such as the learner’s physical environment or current emotional state. Such data can be collected through reactive prompts within the OLE and linked to the available educational information.

Challenges

More educational data does not always improve educational results. Learning Analytics has its drawbacks, and data obtained from multiple educational sources may carry many meanings. Serious problems and obstacles also accompany the implementation of Learning Analytics (Ifenthaler, 2015):

  1. Not all educational data are equal and valid. The validity of data, and of their analysis, is therefore critical for producing useful summative, real-time, and predictive insights. This opens a new interdisciplinary field of study spanning cognitive science, educational technology, learning design, psychometrics, data processing, web development, artificial intelligence, and statistics. The challenge is to examine complex processes within Learning Analytics frameworks and to understand their immediate and long-term impact on learning and teaching.

  2. Ethical problems arise from the use of educational data for Learning Analytics, which involves collecting and storing personal data and analyzing and presenting them to different stakeholders. Procedures for accessing and using educational data therefore need to be established before Learning Analytics frameworks are introduced. This also covers the accuracy of the applied algorithms and the weighting of educational results in predictive modeling. Anonymizing collected personal data is just a small step toward a more robust educational data management system.

  3. Restricted access to educational data creates drawbacks for the stakeholders concerned. Invalid predictions, for example, may lead to wrong decisions and unforeseen problems: a misalignment of prior knowledge, pathways, and learning outcomes could increase course changes, and delayed identification of at-risk learners may cause discontinuities. Defining Learning Analytics threshold standards could prevent vast gaps between educational institutions and give all stakeholders equal opportunities.

  4. Preparing stakeholders for meaningful application of Learning Analytics insights is vital. Stakeholder professional development ensures that issues are recognized and opportunities become concrete actions. The expanded implementation of Learning Analytics therefore needs a new generation of experts with specialized interdisciplinary skills, as well as new logistical and testing facilities to support its operations.

  5. Distributed networks and unstructured information cannot be directly related to the educational data gathered within an institution’s environment. Accumulating such information and associating it, unregulated, with existing educational data raises the risk of critical bias and of invalid analyses, predictions, and decisions. The task is to develop mechanisms that filter biased information and alert stakeholders.

  6. Learning Analytics has yet to determine an appropriate and economical data collection and response period (seconds, minutes, hours, days, weeks), including minimum requirements for valid forecasts and meaningful interventions. The shortage of data is a major obstacle for prospective Learning Analytics algorithms.

  7. Qualitative analysis of rich semantic data (e.g., forum content, open-ended evaluation responses) allows a deeper understanding of learners’ knowledge and needs than digital trace data (e.g., click streams) alone. Developing automated natural language processing (NLP) capability is therefore a basic requirement; the critical challenge is to validate the algorithms, link them with quantitative educational data, and run the NLP in real time.

Summarizing Benefits and Challenges

Current research suggests that Learning Analytics supports the continuous development of higher education institutions. This section discusses the opportunities data analytics offers higher education and the challenges that hold back its role, adoption, and usage across the different fields of higher education (El Alfy, Gómez, & Dani, 2017).

Table 1.1 summarizes the challenges Learning Analytics poses for various stakeholders, and Table 1.2 summarizes its benefits for different stakeholders.

Table 1.1 Challenges of learning analytics for stakeholders
Table 1.2 Benefits of learning analytics for stakeholders

Applications of Learning Analytics

This section discusses different applications of Learning Analytics (Guay, 2016). The most common use of learning data is a dashboard that compares one learner’s path with that of larger groups, ranging from a single team to a classroom or even a whole organization. Such tools exist in environments like Moodle (the Enriched Rubric module for learning analytics and SmartKlass™), Brightspace, etc.

The learner dashboard charts the learner’s position along various dimensions: progress level, activity histograms, and number of visits. One of its key goals is to involve students in managing and planning their own activities. The instructor dashboard identifies students in difficulty based on pre-defined criteria, helping the teacher act quickly and provide the necessary assistance. Some methods go as far as claiming that the frequency and length of visits in the first 3 weeks of a course are valid indicators of success or failure in that course.

Another application is the quick detection of learning difficulties in individual students. Language problems (dysorthography and dyslexia), arithmetic problems, and even problems with motor skills (dyspraxia) can skew learning performance. Rapid diagnosis and remediation can make a huge difference, particularly if all factors emerging from the assessment are taken into account in the intervention.

In some instances, teachers and supervisors may also use a more comprehensive data pool covering all students within a single organization. Managers can then provide employee-specific training, and instructors can compare a specific student’s absentee rate in their course with colleagues’ courses.

Class-group analysis can also serve as a fundamental quality-control tool. Given that this is no longer an individual coaching effort, best practice includes separating personal data from learning paths to maintain anonymity, ensure confidentiality, and protect the privacy of all associated parties, including families and learner associates.

Some proponents of Learning Analytics concentrate on automated learning pathways based on personal preferences and material level. This task requires access to a detailed and consistent representation of curriculum structure, a list of associated skills, and a wide set of digital resources with clearly specified parameters, information that is seldom available in electronically interoperable formats.

The outcomes of these analyses should be treated as statements or, at most, suggestions rather than as a solid framework of concrete facts or reliable prescriptions. Predictive trends rest on a simplistic model of learning that does not consider individuals’ inherent complexity, personal experiences, or the broader learning context.

1.4 Ethical Concerns with Learning Analytics

Learning Analytics provides fantastic benefits for learners but can have legal implications that should not be overlooked (Yupangco, 2018). Many organizations use Learning Analytics to track and understand learners’ actions without recognizing the ethical implications. There is little oversight of the amount of personal information accessible to the LMS, and online learning experts are still working through the data processing and management problems. As a result, there are no established ethical standards for companies to follow; each company must therefore think carefully about how students’ personal information is used, who will have access to it, and what will be shared. The challenges Learning Analytics poses are fresh and complex, and when you use online learning in your institution and collect student data, you need to ensure the information is treated ethically.

1.4.1 What Are the Ethical Concerns with Learning Analytics?

The central practical difficulty of Learning Analytics is student privacy, which raises plenty of questions (Yupangco, 2018):

  • Who has access to the data of students?

  • How much do you need to tell LMS users about the processing of their data?

  • Do you need permission from learners to use their data?

  • Where are the data to be stored? How secure do they have to be?

  • Who controls the data of individuals?

  • What about data misinterpretation or other data errors?

  • Is there an ethical obligation to react to the data we have?

Let us look quickly at each of these issues.

Access to data

Who has access to the collected data? Should managers or course designers have the same access as teachers? Should instructors have access to all or only some of the data?

If the students are geographically distributed, it might help to know which city they live in, but that does not mean teachers should know students’ street addresses. More sensitive information should be off-limits entirely: credit card numbers (if students paid for courses), SSNs, passwords, etc.

Transparency

How much do you tell students about your collection of information and how it can be used? Most students know that organizations gather, monitor, and analyze certain amounts of their information; it is a pervasive part of contemporary online culture. However, your students probably do not know how much of their data you use, particularly in an educational or training environment.

Consent

What type of consent should you ask for? Can you use any (or all) of their data without their consent? Should students be able to remain anonymous online? There is a consensus that some kind of student sign-off is ethically necessary, but there is no agreed standard for what the sign-off should contain.

Location and Security

Where should data be stored? Most organizations do not regulate the storage, location, or protection of the data they collect. Data are frequently stored not only outside the institution but outside the country, and the security and privacy laws of your own country may not apply where the data are located, which means your students’ data could be used or sold without their permission.

Ownership of records

Who is entitled to determine the use of the data? Can personal information or online learners’ behaviors be used for unrelated purposes, such as research or marketing? Can students check how their data are used? How long will the data be preserved before being deleted?

Misinterpretation

Learning Analytics also relies heavily on interpreting data and connecting the dots, which means teachers often have to rely on experience and assumptions. They can misinterpret the data or see patterns that do not exist. In cases of misinterpretation or inaccurate information, what are the consequences for responsibility and liability?

Obligation

Some learning experts believe that once we have the information, we are ethically obliged to act. However, it could be argued that not all data require action. Which data require action, and which do not? Are we responsible for acting on the data we hold?

Any organization that gathers student data for Learning Analytics has to deal with these issues. The repercussions may seem dizzying, but you do not have to work through the problems alone.

1.4.2 How to Protect Learners’ Privacy?

What steps must be taken to protect students? Use these guidelines as a starting point for a Learning Analytics code of conduct in your organization (Yupangco, 2018).

Set the scope and purpose

You need a good idea of what data is being processed, how, and for what purposes. By determining your Learning Analytics’ scope and purpose, you establish ethical limits that can be explained and defended if students ask how their information is used.

Be transparent and get consent

Provide students with documentation that clearly explains your data collection and analysis processes. Explain how and why the data will be used, and how it will not be used. Get each student’s permission before any information is collected. There may, however, be legal situations in which students cannot opt out; make such cases explicit.
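As one concrete, hedged illustration of honoring consent in practice, the sketch below filters an analytics extract down to students with a recorded opt-in; the consent log and its column are hypothetical.

    import pandas as pd

    records = pd.read_csv("merged_records.csv")  # hypothetical analytics extract
    consent = pd.read_csv("consent_log.csv")     # hypothetical: student_id, analytics_opt_in

    # Keep only students who explicitly opted in; no record means no consent.
    allowed = consent.loc[consent["analytics_opt_in"], "student_id"]
    records = records[records["student_id"].isin(allowed)]
    print(f"{len(records)} consented records retained")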

Protecting confidentiality

Restrict access to student data. Not everyone needs full access; grant staff and administrators only the permissions they need. Anonymize learner data wherever possible, and when contracting with third parties for data storage and analysis, make sure student information remains protected.
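Anonymization can start with something as small as the following sketch, which replaces real student IDs with salted, non-reversible tokens before data is shared; the salt variable and ID format are illustrative.

    import hashlib
    import os

    # The salt must be stored securely and never shared with the data recipient.
    SALT = os.environ.get("ANALYTICS_SALT", "change-me")

    def pseudonymize(student_id: str) -> str:
        """Replace a real student ID with a stable, non-reversible token."""
        return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:16]

    print(pseudonymize("s1234567"))  # the same input always yields the same token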

Trigger positive interventions

You collect data to help students succeed. Set specific guidelines for how and when teachers will intervene to help struggling students. Specify the type and purpose of the interventions and who will conduct them. Communicate learners’ own obligations to act on the feedback data they are given.

1.4.3 Are You Ethically Prepared?

Learning Analytics opens a Pandora’s box of ethical issues for which you must plan. While broad debate about ethical responsibilities in the online learning industry continues, you should create a fundamental code of conduct that protects your organization and your learners from misuse of information (Yupangco, 2018).

1.5 Use of Learning Analytics

1.5.1 Use of Learning Analytics at the University of Edinburgh

The University invests in Learning Analytics to inform course design and to enhance the student experience and attainment. Learning Analytics and its associated methods for analyzing online student data have significant potential to address challenges facing educational institutions and educational science. Through advanced data mining techniques combined with existing educational theory, practice, and research, Learning Analytics provides new, real-time strategies for tackling fundamental problems such as student development and retention, the establishment of metrics for twenty-first-century skills, and customized and adaptive learning.

The University of Edinburgh hosts a wide variety of Learning Analytics activities (The University of Edinburgh, 2020), crossing many educational, organizational, practice, and research boundaries, as seen in Fig. 1.7 below. Led by the Vice Principal of Digital Education, members, researchers, and professionals from the Centre for Research in Digital Education, the School of Informatics, Student Systems, Information Services, the Institute for Academic Development, and the university’s support and study divisions cooperate on a range of project initiatives funded from both internal and external sources. Only some projects use student data from the University of Edinburgh; many are purely exploratory and have no direct effect on its students, although some discoveries and innovations may be adopted by the University in the future. While the university cooperates widely with other universities, no data is shared.

Fig. 1.7 Activities in the field of learning analytics at The University of Edinburgh (2020)

1.5.1.1 Learning Analytics Activities

  1. Student-focused activities local to the University of Edinburgh

    (a) The Learning Analytics Report Card (LARC): This project asks: “How can University teaching teams develop critical and participatory approaches to educational data analysis?” It aims to develop ways to include students, as research collaborators and active participants, in the selection and study of their data, and to encourage a deeper understanding of computational analysis in education. The research was funded by a Principal’s Teaching Award grant and involved specific courses in the Master’s in Digital Education.

    (b) VLE Analytics: Information Services has explored various Learning Analytics methods in the Learn and Moodle virtual learning environments, working with a limited number of specific courses. Projects and tools include those that allow students to display their data and understand their activities and learning patterns. These studies have provided useful insight into student attitudes towards data and privacy and inform many other projects.

    (c) MOOC Analytics: The University of Edinburgh is one of the pioneers of massive open online courses. Information Services analysts, the Centre for Research in Digital Education, the School of Informatics, and the Institute for Academic Development systematically examine the digital trace, longitudinal, and performance data of students registered in MOOCs. The analysis covers study trends, the impact of social networks on student performance, and the relation of demographic data to MOOC learners’ performance and experience. Experts from the University of Edinburgh have also partnered extensively with the Technical University of Delft, the University of Michigan, the Massachusetts Institute of Technology, the University of South Australia, the University of Texas at Arlington, and the University of Memphis.

  2. Student-focused activities in collaboration with other institutions

    (a) Video analytics: Video analytics work is carried out mainly in collaboration with the University of South Australia, the University of Sydney, the University of New South Wales, and the University of British Columbia. Analytics are developed to investigate the effects of educational conditions and experiences using Online Video Annotations for Learning (OVAL), a video-annotation tool. The analytics are based on digital traces of interaction with OVAL, used in studies with performance and engineering students and with faculty members in academic development.

    (b) Flipped classroom analytics: The work on analytics in flipped classrooms focuses primarily on developing methods for understanding the kinds of tactic and strategy changes that learners adopt over the academic semester, based on analysis of the digital traces recorded by VLEs. Such analytics are used to enhance teaching and advance the learning experience. This work is conducted in collaboration with the University of Sydney, the University of South Australia, and the University of Belgrade.

    (c) Multimodal data of self-regulated learning: This research aims to establish measures of students’ cognition, metacognition, emotion, and motivation during learning, supported by the European Association for Research on Learning and Instruction (EARLI) as a Centre for Innovative Research, to encourage the development of more effective and adaptive educational technologies. It is conducted in collaboration with Radboud University Nijmegen, the University of Oulu, North Carolina State University, and Technische Universität München.

    (d) Learning dashboard effectiveness: This project identifies common problems faced by teachers and students during online learning and determines which types of Learning Analytics help teachers deal with these problems effectively. The project has created a web-based analytics tool named ‘Loop’ that enables teachers to work with Learning Analytics more easily to enhance their teaching and learning practices. The work is carried out in collaboration with the University of Melbourne, the University of South Australia, and Macquarie University.

    (e) Learning beyond the LMS: When educators use new technologies to facilitate learning, assessing the content and nature of student participation in activities that lie outside the organization’s Learning Management System (LMS) poses challenges. This project extends the field of Learning Analytics by creating an open-source toolkit for detailed analysis of learners’ participation in connected learning environments. The research is conducted in partnership with the University of Sydney, the University of Texas at Arlington, the University of South Australia, and the University of Technology Sydney.

  3. Institution-focused activities local to the University of Edinburgh

    (a) Teaching and learning dashboards: In 2015, senior management asked Student Systems to

      • develop the use of student data to help improve the student experience, learning and teaching, and operational effectiveness;

      • focus on what will make a difference at the school level – support, help to develop insights, and share practice;

      • focus on accessibility, visualization, and data transparency to help simplify and manage complexity;

      • test the use of dashboards to achieve these targets.

      Prototypes were developed using QlikView and BI tools in the second half of 2015 and demonstrated at various forums to senior officials from schools and colleges. The dashboards received clear and constructive reviews from academics. Funding was secured to make the dashboards operational for 2016/17, and they are also used to give the schools more insight. These dashboards complement the work being done to build Learning Analytics that directly support individual students and better course design.

    (b) Learning Analytics project with Civitas: A 2-year pilot was conducted with Civitas Learning, a leading U.S. Learning Analytics company, using data from fully online, in-house Master’s-level programs and courses. A governance group was set up to guide the project. The choice of the online Master’s programs as the pilot field was sensible: they form an easily identifiable, isolated pilot group that is still sufficiently large to work with, and a data-rich environment with a clear student presence in the digital learning system. The project built expertise in creating Learning Analytics models, strengthened teachers’ and students’ understanding of the field, improved awareness of where shortcomings in data collection occur, and led to a policy supporting Learning and Teaching Analytics.

  4. Institution-focused activities in collaboration with other institutions

    (a) SHEILA Erasmus+ Project: To help European universities use and protect their online students’ digital data maturely, the SHEILA project built a policy development framework that promotes formative assessment and personalized learning, drawing on stakeholders’ direct participation in the development process. It ran from 2016 to 2018.

    (b) Adoption of Learning Analytics: Although interest in Learning Analytics is expanding rapidly, there are limited resources to inform institutions on how best to start and use Learning Analytics (Ifenthaler & Gibson, 2020). This is a significant challenge as universities try to engage with Learning Analytics and develop institutional capability. The research focused on how Learning Analytics informs teaching practice and personalized education, and on applications that improve retention and identify at-risk students. The work was carried out in collaboration with Macquarie University, the University of Melbourne, the University of New England, the University of Technology Sydney, and the University of the Sunshine Coast.

Ethics and Privacy Protection

The university is committed to the fair use of data and to practices compliant with the national and European legislation that protects personal privacy. Analytics at the University are primarily used to recognize and improve students’ achievements and learning experiences, increase teaching capability, and inform institutional data development. The University is actively involved in national and international initiatives promoting ethics and privacy protection in Learning Analytics, and all research activities in this field are conducted under the UK Research Integrity Office’s Code of Practice for Research. The University was notably involved in creating Jisc’s Code of Practice for Learning Analytics, and its decisions on Learning Analytics comply with that code’s standards. Where external organizations are contracted to provide Learning Analytics services, agreements are concluded that comply with the applicable UK and European regulations on the use and processing of personal data, and the data to be analyzed are anonymized with state-of-the-art protocols before being shared with the contracting organizations.

1.5.2 Use of Learning Analytics in Other Recognized Institutions

Several institutions have used Learning Analytics to enhance student success and retention; Table 1.3 below shows some of them (Dietz-Uhler & Hurn, 2013). As the table shows, many successful organizations have used or developed Learning Analytics tools that often give students and teachers a “dashboard.” For example, Purdue University developed SIGNALS to collect data and provide a dashboard for students and teachers to track student progress. Many schools, such as UMBC, use a Learning Analytics tool built into their LMS to track students’ progress. As the third column of Table 1.3 shows, most of these institutions use data to improve their students’ performance in a course.

Table 1.3 List of institutions that implemented Learning Analytics successfully

Other educational institutions use analytics successfully to improve teaching, learning, and student performance. Campbell et al. (2007) highlight institutions that have succeeded in predicting student performance using different kinds of data. The University of Alabama, for instance, used first-year student data files to establish a retention model based on different factors, including English course grade and the time taken. For advising and retention, Sinclair Community College established the Student Success Plan (SSP); the collection and analysis of data allowed students to be monitored, and student performance improved.

1.6 Conclusion

The world today is powered by data. Learning Analytics gives decision-makers greater insight into how training programs’ goals and learning requirements match each other. In the last decade, Learning Analytics has become an important research field for technology-enhanced learning, distinguished by its commitment to providing value for learners in formal, informal, or blended environments. It is used to understand and improve both learning and its environments. There are now opportunities to exploit the power of feedback loops at the level of individual teachers and students. Measuring and making visible students’ progress and evaluations helps students develop skills in tracking their progress and evaluating how their efforts increase their achievements. Teachers gain insights into students’ success that help them adapt their lessons or launch initiatives such as tutoring and tailor-made tasks. Learning Analytics lets educators quickly see the effects of modifications and strategies and provides feedback for quality improvement.

It is clear that Learning Analytics is gaining traction and is likely here to stay. It offers many benefits, in particular enabling students to excel. Educational institutions hold large quantities of data, and the capacity to use those data to inform what we do in the classroom, whether face-to-face or online, lies at the heart of Learning Analytics. Institutions such as Purdue University, Rio Salado Community College, and the University of Michigan have paved the way and shown students and faculty the great benefits of Learning Analytics.

Learning Analytics has a significant role to play in the future of higher education. Its importance can be seen in its role in driving higher education reforms and helping educators improve teaching and learning. Learning Analytics can penetrate the fog of confusion about how resources are distributed, build competitive advantages, and, above all, increase the quality and value of learning. If your organization has not begun to use Learning Analytics to increase the efficiency and return of its training programs, it is time to think seriously about doing so. The advantages of Learning Analytics will continue to help improve the various areas of higher education where problems arise.

1.7 Review Questions

Reflect on the concepts of this chapter guided by the following questions.

  1. What do we mean by ‘Learning Analytics’?

  2. For the student or workplace learner, how can data about their behavior help them improve their performance?

  3. Who benefits from Learning Analytics?

  4. What will be the fundamental role and impact of Learning Analytics in education?

  5. Define Learning Analytics.

  6. What is Learning Analytics? Illustrate its main components.

  7. List the various types of Learning Analytics and explain each in brief.

  8. How does Learning Analytics work?

  9. How can learners’ privacy be protected?

  10. What are the ethical concerns with Learning Analytics?