Abstract
One of the important issues in e-learning environments is visualizing patterns in a way that learners or instructors can understand. Most studies about dashboards aim at supporting awareness and reflection, self-regulation, and monitoring. Within the scope of this chapter, detailed information about visualization and dashboards is given, research on visualization and dashboards is examined, design principles are introduced, and existing challenges and future directions of visualization and dashboards are discussed.
1 Introduction
Instructional technologies provide researchers with significant opportunities to facilitate learning and to design more effective learning environments. Nowadays, digital learning environments in particular are used intensively as instructional technology. Large amounts of data about learners’ learning experiences are recorded in digital learning environments. Log data contain important information for learners, instructors, institutions, and decision-makers (Yoo et al., 2015) and can thus be used to improve learners’ experiences in digital learning environments (Sin & Muthu, 2015). There are, however, major challenges in digital learning environments, such as a lack of quality log data and the difficulty of analyzing and understanding the data (Kuosa et al., 2016). At this point, educational data mining and learning analytics give researchers various opportunities. Educational data mining is a method used to discover meaningful structures and latent patterns in e-learning environments (Baker & Siemens, 2014). It enables the discovery of meaningful patterns based on the system usage of learners, instructors, and administrators in order to increase the quality of online learning environments (Romero & Ventura, 2010). Learning analytics use dynamic data in digital learning environments and enable (a) real-time modeling of learners and learning, (b) estimation and optimization of learning processes, and (c) assessment, revealing, and analysis of data for learning environments and educational decision-making (Ifenthaler, 2015). Analyzing log data can offer opportunities such as supporting self-regulation, enabling more effective learning experiences through personalized learning, and increasing awareness of the learning process (Drachsler et al., 2014). The aim of learning analytics is to present highly adaptable and personalized learning environments (Ifenthaler & Widanapathirana, 2014).
One of the important issues in digital learning environments is visualizing patterns in a way that learners or instructors can understand (Duval, 2011). Patterns related to learners’ learning experiences in digital learning environments are presented via dashboards. A dashboard is a learning analytics application used to support learners’ learning processes by providing information about their learning experiences (Yoo et al., 2015). Most studies about dashboards aim to support (i) awareness and reflection, (ii) self-regulation, and (iii) monitoring (Jivet et al., 2017). Learners’ expectations of a dashboard system based on learning analytics include (a) support for planning and organization, (b) adaptive recommendations, (c) individual analysis of the learning process, and (d) opportunities for self-assessment (Schumacher & Ifenthaler, 2018).
On the other hand, many studies have noted deficiencies in the field of visualization and dashboards (Bodily et al., 2018; Sarikaya et al., 2018; Sedrakyan et al., 2019). Within the scope of this chapter, (a) detailed information about visualization and dashboards is given, (b) research in the literature on visualization and dashboards is examined, (c) design principles are introduced, and (d) existing challenges and future directions of visualization and dashboards are discussed.
2 Background
2.1 Visualization
Looking into the history of visualization, it is often expressed as “information visualization.” “Information visualization is the use of computer-supported interactive visual representations of abstract data to amplify cognition” (Card et al., 1999). Information visualization aims to help discover relations among data via visuals (Herman et al., 2000). In other words, it aims to present unstructured information in databases and data warehouses with various visual techniques in a way that individuals can understand. Although ICT can facilitate visualization, the activity itself happens in the mind (Mazza, 2010). Important factors in the effectiveness of visualization are the data, the task, the internal representation, and cognitive abilities such as working memory capacity and domain knowledge (Zhu, 2007). As can be seen, visualization is related to many fields, and many principles should be taken into consideration when selecting and presenting visualization techniques. Visualization should be familiar and interesting to learners in order to help them understand and interpret data (Kuosa et al., 2016). Within the scope of this chapter, visualization is discussed in the context of digital learning environments. Having a visual overview of the learner’s learning experience can be useful for instructors and learners (Duval, 2011). Visualization helps to promote not only problem-solving but also reflection (Rieber, 1995). In addition, learners’ awareness of the learning process increases through visualization (Khan & Pardo, 2016), and students who are familiar with visualization techniques find them easier to use in digital learning environments (Sedrakyan et al., 2017). The use of these visualization techniques in digital learning environments has become widespread, especially with learning analytics.
Learning analytics use algorithmic analysis and information visualization to increase self-awareness and reflection based on tracked learning activities (Duval et al., 2012). Information about learners’ learning experiences and performance in digital learning environments is presented to stakeholders via dashboards.
2.2 Dashboard
A dashboard is a rich computer interface with reports, visual indicators, and warning mechanisms that gathers information dynamically; the concept originates in the field of business (Malik, 2005). Technologically, dashboards are multilayered applications built on business intelligence and data integration infrastructure (Lempinen, 2012). In this chapter, dashboards are treated as learning analytics applications for learners’ learning experiences in digital learning environments. Dashboards used in digital learning environments build on three research areas: information visualization, learning analytics, and educational data mining (Schwendimann et al., 2016). Latent learning patterns of learners in e-learning environments can be discovered with educational data mining algorithms, and these patterns are presented to learners using visualization techniques and dashboards through learning analytics. In this way, e-learning environments can be improved and made more effective. A dashboard presents a visual representation of the learner’s or course’s current and historical status to support flexible decision-making in learning environments (Few, 2006). Thus, dashboard research aims to identify which data are meaningful to different stakeholders in education and how the data can be presented to support meaningful processes (Schwendimann et al., 2016).
In addition, through the information on a dashboard, students’ performance patterns can be explored, problems can be anticipated and addressed, and motivational structures can be identified (Podgorelec & Kuhar, 2011). Dashboards provide learners with real-time feedback, suggestions, and/or visualizations in order to support student reflection and knowledge awareness (Bodily et al., 2018). In the literature, dashboards have been presented to learners for comparison with peers, monitoring the achievement of learning outcomes, and self-monitoring (Jivet et al., 2017). The benefits of dashboards in digital learning environments are summarized below (Yoo et al., 2015):
- Allow teachers to know students’ learning situations in a real-time and scalable way.
- Improve students’ self-knowledge levels.
- Make more intelligent decisions with the help of data mining algorithms.
- Help students improve their motivation and self-directed learning ability and achieve their learning goals.
Early examples of dashboard research are Course Signals (Arnold & Pistilli, 2012) and GLASS (Gradient’s Learning Analytics System) (Leony et al., 2012). Course Signals gives appropriate feedback to students and provides information about their performance via signal lights. GLASS was developed so that learners can compare themselves with their peers through visualizations of learner performance. In addition to these studies, SAM (Student Activity Meter) (Govaerts et al., 2012), a visualization tool for awareness and self-reflection, LOCO-Analyst (Ali et al., 2012), and eLAT (Dyckhoff et al., 2012) can be given as examples of early dashboard studies. Many studies on dashboards have been conducted since. The purposes of the dashboard studies in the literature are presented below (Sedrakyan et al., 2019):
- Increase awareness about the learning process.
- Support cognitive processes.
- Identify students at risk.
- Provide immediate feedback.
- Monitor achievement status.
- Provide procedural information.
- Support decision-making.
- Inform.
- Display learner relations.
- Compare.
- Reflect.
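Several of these purposes, such as identifying students at risk and providing immediate feedback, echo the Course Signals approach described above, which maps a predicted risk level to a traffic-light signal. A minimal, hypothetical sketch of that idea is shown below; the thresholds are illustrative assumptions, not the actual Purdue model:

```python
def signal_light(risk_score: float) -> str:
    """Map a predicted risk score in [0, 1] to a traffic-light signal.

    Thresholds are illustrative assumptions; a real system would
    calibrate them against historical performance data.
    """
    if risk_score < 0.33:
        return "green"   # on track
    elif risk_score < 0.66:
        return "yellow"  # potential problems, worth an intervention
    return "red"         # high risk of failing the course

# Example: three learners with different predicted risk levels
print([signal_light(r) for r in (0.10, 0.50, 0.90)])
# → ['green', 'yellow', 'red']
```

In practice, the risk score itself would come from an educational data mining model trained on log data; the signal mapping is only the presentation layer of the dashboard.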
Visualization techniques frequently utilized in dashboard studies include bar charts, line graphs, tables, pie charts, indicators, alert mechanisms, and network graphs (Podgorelec & Kuhar, 2011; Schwendimann et al., 2016). Studies most frequently focus on learners (Arnold & Pistilli, 2012; Jin, 2017; Khan & Pardo, 2016; Papanikolaou, 2014; Park & Jo, 2015; Şahin & Yurdugül, 2019), instructors (Ali et al., 2012; Dyckhoff et al., 2012; Mottus et al., 2015; Van Leeuwen et al., 2015; Zhang et al., 2018), both learners and instructors (Sedrakyan et al., 2018; Kuosa et al., 2016; Govaerts et al., 2012; Leony et al., 2012), and other stakeholders, such as administrators (Rienties et al., 2016). In addition, there is research on mobile dashboard design (Fulantelli et al., 2019; Kuhnel et al., 2018; Seiler et al., 2019).
To determine how and to what extent dashboard information will contribute to stakeholders, a benefits matrix may help, as suggested in Table 27.1. The individual cells may be completed from an individual perspective and provide a first decision point for implementing dashboard features (Ifenthaler, 2020).
3 Challenges and Future Directions
Learning analytics were presented as an emerging topic for adoption within 4–5 years in the 2011 Horizon Report prepared by the New Media Consortium (Johnson et al., 2011), and they still feature in the 2020 Horizon Report as “Analytics for Student Success” (Brown et al., 2020). Dashboards are used to support decision-making and to increase awareness, motivation, and learning (Sarikaya et al., 2018). Graphics, tables, and visuals on dashboards can be configured through visualization techniques. However, inappropriate designs of dashboards and digital learning environments can negatively affect learners’ learning processes. There are also findings that visualization techniques can produce undesirable effects or strengthen negative beliefs about learning and teaching (Gašević et al., 2015). To prevent such effects, appropriate visualization techniques and dashboard designs should be developed. In this context, there are several difficulties in visualization and dashboard design; those reported in the literature are presented in Table 27.2.
As can be seen in Table 27.2, some difficulties were expressed by different researchers. These difficulties and some solution proposals are as follows.
3.1 Linking Learning Theories
In the literature there is a gap between dashboard design and the learning sciences (Sedrakyan et al., 2016; Sedrakyan et al., 2019). Much dashboard research considers only a final evaluation of the dashboard design, while the design and development process is ignored (Bodily et al., 2018). To make dashboard designs more effective and to improve digital learning environments, dashboard designs should be linked with learning theories, instructional design, and learning design (Ifenthaler et al., 2018). Learning theories are important for explaining the phenomenon of learning and also help to derive design principles for learning environments, content, and tasks (Ertmer & Newby, 1993). In addition, more appropriate instruction and learning designs can be structured by linking learning analytics with learning theories (Ifenthaler et al., 2018; Wong et al., 2019).
3.2 Determining Effective Metrics and Effective Visualization Techniques
Another challenge in dashboard design is deciding which visualization techniques to use and which learner metrics to show. It is important to identify which metrics from students’ data are valuable to display (Yoo et al., 2015). A large amount of data about learners is stored in unstructured form in digital learning environments, so it is necessary to determine which of these metrics affect learner performance and to present those to the learners. To determine these metrics, feature selection algorithms used in the preprocessing stage of educational data mining can be utilized; feature selection contributes to reducing the number of metrics/variables (Şahin et al., 2017). It is important to identify the metrics that matter for learners’ learning experiences or performance, as well as to present them with appropriate visualization techniques and visuals. In this context, the most important question is which visualization techniques are more appropriate. For this purpose, it is necessary to conduct research on which visualization techniques are more appropriate for students (Sedrakyan et al., 2018). For an effective dashboard design, a theoretical link should be established with human cognition and perception, situation awareness, and visualization technologies, and the design should be structured based on this theoretical framework (Yoo et al., 2015). In addition, contextually appropriate presentations, visual language, and social framing should be taken into consideration in dashboard design (Sarikaya et al., 2018).
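As a simple illustration of the feature selection idea mentioned above, the sketch below ranks candidate learner metrics by the absolute Pearson correlation of each metric with a performance score and keeps the top k. This is only one of many feature selection strategies (a filter method; wrapper and embedded methods are alternatives), and the metric names and values are invented for illustration:

```python
from statistics import mean, stdev

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / ((len(xs) - 1) * stdev(xs) * stdev(ys))

def select_top_k(metrics, performance, k=2):
    """Rank candidate metrics by |correlation| with performance; keep the top k."""
    ranked = sorted(metrics,
                    key=lambda name: abs(pearson(metrics[name], performance)),
                    reverse=True)
    return ranked[:k]

# Hypothetical log-derived metrics for five learners
metrics = {
    "logins":        [3, 8, 5, 9, 2],
    "forum_posts":   [0, 4, 2, 5, 1],
    "video_minutes": [10, 11, 9, 12, 10],
}
performance = [55, 90, 70, 95, 50]
print(select_top_k(metrics, performance, k=2))
# → ['logins', 'forum_posts']
```

In a real dashboard pipeline, such a filter would run during preprocessing, and only the selected metrics would be passed on to the visualization layer.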
3.3 Data Security and Privacy
Other issues that should be discussed in the design of dashboards are data design, sharing, security, reliability, and confidentiality (Sarikaya et al., 2018). These issues are discussed not only for dashboard designs but also in learning analytics (Mah et al., 2019) and related fields (Bertino & Ferrari, 2018; Chen & Zhao, 2012; Song et al., 2012). Examples of questions under this topic include which learner information will be used, who will be granted access, and how users’ permissions will be obtained (Ifenthaler & Schumacher, 2019).
3.4 Adaptive Dashboard Design
Dashboards should be configured to allow a high level of customization (Schumacher & Ifenthaler, 2018). In other words, dashboards appropriate to the needs and characteristics of learners should be presented to them. For example, learners with high extrinsic motivation can be shown comparisons with their group, while learners with high intrinsic motivation can be shown their daily individual performance. However, in order to develop adaptive designs, learning patterns, profiles, or preferences should be discovered based on the individual characteristics of the learners. Personality types, sources of motivation, learning strategies, etc. can serve as indicators of these individual characteristics. In addition, adaptive dashboard designs can be targeted at individuals or at groups.
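The motivation-based example above can be sketched as a simple rule table mapping a learner profile to the dashboard widgets to display. The profile attributes and widget names below are hypothetical; a production system would derive profiles from discovered learning patterns rather than hard-coded rules:

```python
def choose_widgets(profile: dict) -> list:
    """Select dashboard widgets from a (hypothetical) learner profile."""
    widgets = ["progress_overview"]          # shown to everyone
    if profile.get("motivation") == "extrinsic":
        widgets.append("peer_comparison")    # comparison with the group
    elif profile.get("motivation") == "intrinsic":
        widgets.append("daily_performance")  # individual daily performance
    if profile.get("at_risk"):
        widgets.append("intervention_tips")  # immediate, actionable feedback
    return widgets

print(choose_widgets({"motivation": "extrinsic", "at_risk": True}))
# → ['progress_overview', 'peer_comparison', 'intervention_tips']
```

The same rule-table pattern extends naturally to group-level adaptation: the profile passed in could describe a cohort rather than an individual learner.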
3.5 Amplifying Cognition
Cognitive considerations such as attending to cognitive load, human perception, and data literacy, avoiding visual clutter, and dividing data into interpretable sections have direct effects on design (Ahn et al., 2019). Therefore, the visuals and graphics presented should amplify the cognition of the learners. (a) Increasing resources, (b) reducing search, (c) enhancing the recognition of patterns, (d) perceptual inference, (e) perceptual monitoring, and (f) a manipulable environment amplify cognition (Card et al., 1999). In this context, it is necessary to develop designs that present related information together, provide both visual and textual notifications, convey meaningful information, adapt visuals and information to the learners, and offer graphics that are easy for learners to use. The Gestalt visualization principles of proximity, similarity, enclosure, closure, continuity, and connection remain valid in dashboard design (Few, 2013). Concepts such as ease of use or perceived usefulness belong to acceptance structures and are covered under the next topic; the presentation of different visuals or information to learners is part of adaptive dashboard design.
3.6 Acceptance Structures
The structures in technology acceptance models should be taken into consideration in dashboard designs (Sedrakyan et al., 2018). Findings about why stakeholders use a developed e-learning environment, or why they do not, can be revealed through acceptance structures, and determining these reasons sheds light on more appropriate environment designs. The important point is not only to develop a learning environment but also to have it used by its stakeholders in accordance with its purpose. Davis (1989) defined the structures “perceived usefulness” and “perceived ease of use” in the context of the Technology Acceptance Model (TAM). Venkatesh et al. (2003) proposed the unified theory of acceptance and use of technology (UTAUT), comprising “performance expectancy,” “effort expectancy,” “social influence,” and “facilitating conditions.” From this perspective, learning environments should offer learners features such as being easy to use, having high perceived usefulness, and increasing learners’ performance. These models can guide researchers as to which visualization techniques can be utilized or what information should be presented to learners. In addition, presenting sequential patterns via dashboards, intervention design, and dashboard interaction as an interaction type can be seen as future directions for dashboards. Monitoring their own sequential behavior, or the behavior of successful students, can guide learners in what they should do in the next step. The ultimate purpose of an intervention is to increase student achievement or improve students’ learning experience (Pardo & Dawson, 2016). A structured intervention model developed with learning analytics to support the learning and teaching process can improve learners’ performance (Wu et al., 2015). At this point, intervention design and the integration of this design into systems are issues to be considered.
In digital learning environments, there are different types of interactions, such as learner-learner, learner-instructor, learner-content, and learner-assessment. In addition, learner-dashboard interaction has recently started to be considered a type of interaction in digital learning environments (Khan & Pardo, 2016; Rei et al., 2017). Learners’ time spent in the dashboard, their behavior after interacting with the dashboard, the charts and graphics they viewed, etc. can provide important information about the learning and teaching process.
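Such learner-dashboard interaction metrics can be derived directly from event logs. A minimal sketch is shown below, assuming a time-ordered log of (timestamp, event, target) tuples; the field names and event types are assumptions, not a standard log format:

```python
from collections import Counter

def dashboard_metrics(events):
    """Compute time spent on the dashboard and per-chart view counts.

    `events` is a time-ordered list of (timestamp_seconds, event, target)
    tuples for one learner; event names here are illustrative assumptions.
    """
    time_spent = 0
    views = Counter()
    opened_at = None
    for ts, event, target in events:
        if event == "dashboard_open":
            opened_at = ts
        elif event == "dashboard_close" and opened_at is not None:
            time_spent += ts - opened_at   # close a dashboard session
            opened_at = None
        elif event == "chart_view":
            views[target] += 1             # count views per chart

# Hypothetical log: two dashboard sessions by one learner
log = [
    (0,   "dashboard_open",  None),
    (20,  "chart_view",      "progress_bar"),
    (45,  "chart_view",      "peer_comparison"),
    (60,  "dashboard_close", None),
    (300, "dashboard_open",  None),
    (330, "chart_view",      "progress_bar"),
    (360, "dashboard_close", None),
]
```

Calling `dashboard_metrics(log)` needs a `return time_spent, views` at the end of the function; with that, the example log yields 120 seconds of dashboard time and view counts of 2 for `progress_bar` and 1 for `peer_comparison`, the kind of aggregate that could itself be fed back into the analytics described above.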
4 Design Principles
There is no clear evidence about which features should be included in the design of visualizations (Zhu, 2007). It would be very useful to prepare guidelines for designing effective and goal-oriented visualizations (Duval, 2011). Guidelines and design principles can contribute to the field by (a) guiding designers and researchers, (b) determining the issues to be considered in design, (c) supporting the evaluation of developed environments, and (d) fostering effective and appropriate designs for stakeholders. Few (2013) suggests three design principles: (a) the most important information should stand out in a dashboard that usually fits the limited space of a single screen, (b) the information on the dashboard should support students’ awareness and aid prompt understanding using various visualization technologies, and (c) the information should be arranged in a meaningful way, with information elements supporting the students’ decision-making goal and ultimate goal. Some design and evaluation principles for dashboards were defined by Yoo et al. (2015):
- Goal orientation
- Information usefulness
- Visual effectiveness
- Appropriateness of visual representation
- User-friendliness
- Understanding
- Reflection
- Learning motivation
- Behavioral change
- Performance improvement
- Competency development
In this context, design principles have been put forth in order to guide researchers and designers as shown in Table 27.3.
References
Ahn, J., Campos, F., Hays, M., & DiGiacomo, D. (2019). Designing in context: Reaching beyond usability in learning analytics dashboard design. Journal of Learning Analytics, 6(2), 70–85.
Ali, L., Hatala, M., Gašević, D., & Jovanović, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58(1), 470–489.
Arnold, K. E., & Pistilli, M. D. (2012, April). Course signals at Purdue: Using learning analytics to increase student success. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 267–270).
Baker, R. S., & Siemens, G. (2014). Educational data mining and learning analytics. In K. Sawyer (Ed.), Cambridge handbook of the learning sciences (2nd ed., pp. 253–274). Cambridge University Press.
Bertino, E., & Ferrari, E. (2018). Big data security and privacy. In A comprehensive guide through the Italian database research over the last 25 years (pp. 425–439). Springer.
Bodily, R., Ikahihifo, T. K., Mackley, B., & Graham, C. R. (2018). The design, development, and implementation of student-facing learning analytics dashboards. Journal of Computing in Higher Education, 30(3), 572–598.
Brown, M., McCormack, M., Reeves, J., Brooks, D. C., Grajek, S., Alexander, B., Bali, M., Bulger, S., Dark, S., Engelbert, N., Gannon, K., Gauthier, A., Gibson, D., Gibson, R., Lundin, B., Veletsianos, G., & Weber, N. (2020). 2020 EDUCAUSE horizon report, teaching and learning edition. EDUCAUSE.
Card, S. K., Mackinley, J. D., & Shneiderman, B. (1999). Readings in information visualization: Using vision to think. Morgan Kaufmann.
Chen, D., & Zhao, H. (2012). Data security and privacy protection issues in cloud computing. In 2012 international conference on computer science and electronics engineering (Vol. 1, pp. 647–651). IEEE.
Dabbebi, I., Iksal, S., Gilliot, J. M., May, M., & Garlatti, S. (2017). Towards adaptive dashboards for learning analytic: An approach for conceptual design and implementation. In 9th international conference on computer supported education (CSEDU 2017) (pp. 120–131). Porto, Portugal. https://doi.org/10.5220/0006325601200131
Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13, 319–340.
Drachsler, H., Dietze, S., Herder, E., d’Aquin, M., & Taibi, D. (2014). The learning analytics & knowledge (LAK) data challenge 2014. In Proceedings of the fourth international conference on learning analytics and knowledge (pp. 289–290).
Duval, E. (2011). Attention please! Learning analytics for visualization and recommendation. In Proceedings of the 1st international conference on learning analytics and knowledge (pp. 9–17).
Duval, E., Klerkx, J., Verbert, K., Nagel, T., Govaerts, S., Parra Chico, G. A., Alberto, G., Odriozola, S., Luis, J., & Vandeputte, B. (2012). Learning dashboards & learnscapes. In Educational interfaces, software, and technology (pp. 1–5).
Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Educational Technology & Society, 15(3), 58–76.
Ertmer, P. A., & Newby, T. J. (1993). Behaviorism, cognitivism, constructivism: Comparing critical features from an instructional design perspective. Performance Improvement Quarterly, 6(4), 50–72.
Few, S. (2006). Information dashboard design: The effective visual communication of data. O’Reilly Media, Inc.
Few, S. (2013). Information dashboard design: Displaying data for at-a-glance monitoring (Vol. 81). Analytics Press.
Fulantelli, G., Taibi, D., Ammirata, F., & La Mattina, C. (2019). A mobile learning analytics tool to foster students’ self reflection. In EdMedia+ innovate learning (pp. 721–726). Association for the Advancement of Computing in Education (AACE).
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71.
Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The student activity meter for awareness and self-reflection. In CHI’12 extended abstracts on human factors in computing systems (pp. 869–884).
Herman, I., Melançon, G., & Marshall, M. S. (2000). Graph visualization and navigation in information visualization: A survey. IEEE Transactions on Visualization and Computer Graphics, 6(1), 24–43.
Ifenthaler, D. (2015). Learning analytics. In J. M. Spector (Ed.), The SAGE encyclopedia of educational technology (Vol. 2, pp. 447–451). Sage.
Ifenthaler, D. (2020). Change management for learning analytics. In N. Pinkwart & S. Liu (Eds.), Artificial intelligence supported educational technologies (pp. 261–272). Springer.
Ifenthaler, D., & Schumacher, C. (2019). Releasing personal information within learning analytics systems. In D. G. Sampson, J. M. Spector, D. Ifenthaler, P. Isaias, & S. Sergis (Eds.), Learning technologies for transforming teaching, learning and assessment at large scale (pp. 3–18). Springer.
Ifenthaler, D., & Widanapathirana, C. (2014). Development and validation of a learning analytics framework: Two case studies using support vector machines. Technology, Knowledge and Learning, 19(1-2), 221–240.
Ifenthaler, D., Gibson, D. C., & Dobozy, E. (2018). Informing learning design through analytics: Applying network graph analysis. Australasian Journal of Educational Technology, 34(2), 117–132. https://doi.org/10.14742/ajet.3767
Jin, S. H. (2017). Using visualization to motivate student participation in collaborative online learning environments. Journal of Educational Technology & Society, 20(2), 51–62.
Jivet, I., Scheffel, M., Drachsler, H., & Specht, M. (2017). Awareness is not enough: Pitfalls of learning analytics dashboards in the educational practice. In European conference on technology enhanced learning (pp. 82–96). Springer.
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 horizon report. The New Media Consortium.
Khan, I., & Pardo, A. (2016). Data 2U: Scalable real time student feedback in active learning environments. In Proceedings of the sixth international conference on learning analytics & knowledge (pp. 249–253).
Kuhnel, M., Seiler, L., Honal, A., & Ifenthaler, D. (2018). Mobile learning analytics in higher education: Usability testing and evaluation of an app prototype. Interactive Technology and Smart Education, 15(4), 332–347.
Kuosa, K., Distante, D., Tervakari, A., Cerulo, L., Fernández, A., Koro, J., & Kailanto, M. (2016). Interactive visualization tools to improve learning and teaching in online learning environments. International Journal of Distance Education Technologies (IJDET), 14(1), 1–21.
Lempinen, H. (2012). Constructing a design framework for performance dashboards. In Scandinavian conference on information systems (pp. 109–130). Springer.
Leony, D., Pardo, A., de la Fuente Valentín, L., de Castro, D. S., & Kloos, C. D. (2012, April). GLASS: A learning analytics visualization tool. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 162–163).
Mah, D. K., Yau, J. Y. K., & Ifenthaler, D. (2019). Epilogue: Future directions on learning analytics to enhance study success. In Utilizing learning analytics to support study success (pp. 313–321). Springer.
Malik, S. (2005). Enterprise dashboards: Design and best practices for IT. Wiley.
Mazza, R. (2010). Visualization in educational environments. In C. Romero, S. Ventura, M. Pechenizkiy, & R. S. J. d. Baker (Eds.), Handbook of educational data mining (pp. 9–26). CRC Press.
Mottus, A., Graf, S., & Chen, N. S. (2015). Use of dashboards and visualization techniques to support teacher decision making. In Ubiquitous learning environments and technologies (pp. 181–199). Springer.
Papanikolaou, K. A. (2014). Constructing interpretative views of learners’ interaction behavior in an open learner model. IEEE Transactions on Learning Technologies, 8(2), 201–214.
Pardo, A., & Dawson, S. (2016). Measuring and visualizing learning in the information-rich classroom. In P. Reimann (Ed.), Learning analytics (pp. 41–55). Routledge.
Park, Y., & Jo, I. H. (2015). Development of the learning analytics dashboard to support students’ learning performance. Journal of Universal Computer Science, 21(1), 110.
Podgorelec, V., & Kuhar, S. (2011). Taking advantage of education data: Advanced data analysis and reporting in virtual learning environments. Elektronika ir Elektrotechnika, 114(8), 111–116.
Rei, A., Figueira, Á., & Oliveira, L. (2017, September). A system for visualization and analysis of online pedagogical interactions. In Proceedings of the 2017 international conference on E-education, E-business and E-technology (pp. 42–46).
Rieber, L. P. (1995). A historical review of visualization in human cognition. Educational Technology Research and Development, 43(1), 45–56.
Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K., & Murphy, S. (2016). Analytics4Action evaluation framework: A review of evidence-based learning analytics interventions at the Open University UK. Journal of Interactive Media in Education, 2016(1), 1–11.
Romero, C., & Ventura, S. (2010). Educational data mining: A review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 40(6), 601–618.
Şahin, M., & Yurdugül, H. (2019). An intervention engine design and development based on learning analytics: The intelligent intervention system (In2S). Smart Learning Environments, 6(1), 18.
Şahin, M., Keskin, S., & Yurdugül, H. (2017). Determination of learner characteristics and interaction variables for e-learning design by feature selection algorithms. In 11th international computer and instructional technologies symposium (ICITS 2017). Malatya, Turkey.
Sarikaya, A., Correll, M., Bartram, L., Tory, M., & Fisher, D. (2018). What do we talk about when we talk about dashboards? IEEE Transactions on Visualization and Computer Graphics, 25(1), 682–692.
Schumacher, C., & Ifenthaler, D. (2018). Features students really expect from learning analytics. Computers in Human Behavior, 78, 397–407.
Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., & Dillenbourg, P. (2016). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30–41.
Sedrakyan, G., Järvelä, S., & Kirschner, P. (2016). Conceptual framework for feedback automation and personalization for designing learning analytics dashboards. In EARLI SIG 2016, Date: 2016/11/30-2016/12/01, Location: Oulu, Finland.
Sedrakyan, G., Leony, D., Muñoz-Merino, P. J., Kloos, C. D., & Verbert, K. (2017). Evaluating student-facing learning dashboards of affective states. In European conference on technology enhanced learning (pp. 224–237). Springer.
Sedrakyan, G., Malmberg, J., Verbert, K., Järvelä, S., & Kirschner, P. A. (2018). Linking learning behavior analytics and learning science concepts: Designing a learning analytics dashboard for feedback to support learning regulation. Computers in Human Behavior, 105512.
Sedrakyan, G., Mannens, E., & Verbert, K. (2019). Guiding the choice of learning dashboard visualizations: Linking dashboard design and data visualization concepts. Journal of Computer Languages, 50, 19–38.
Seiler, L., Kuhnel, M., Ifenthaler, D., & Honal, A. (2019). Digital applications as smart solutions for learning and teaching at higher education institutions. In Utilizing learning analytics to support study success (pp. 157–174). Springer.
Sin, K., & Muthu, L. (2015). Application of big data in education data mining and learning analytics--a literature review. ICTACT Journal on Soft Computing, 5(4), 1035–1049.
Song, D., Shi, E., Fischer, I., & Shankar, U. (2012). Cloud data protection for the masses. Computer, 45(1), 39–45.
Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2015). Teacher regulation of cognitive activities during student collaboration: Effects of learning analytics. Computers & Education, 90, 80–94.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27, 425–478.
Wong, J., Baars, M., de Koning, B. B., van der Zee, T., Davis, D., Khalil, M., … Paas, F. (2019). Educational theories and learning analytics: From data to knowledge. In Utilizing learning analytics to support study success (pp. 3–25). Springer.
Wu, F., Huang, L., & Zou, R. (2015). The design of intervention model and strategy based on the behavior data of learners: A learning analytics perspective. Hybrid Learning: Innovation in Educational Practices, 9167, 294–301.
Yoo, Y., Lee, H., Jo, I. H., & Park, Y. (2015). Educational dashboards for smart learning: Review of case studies. In Emerging issues in smart learning (pp. 145–155). Springer.
Zhang, J.-H., Zhang, Y.-X., Zou, Q., & Huang, S. (2018). What learning analytics tells us: Group behavior analysis and individual learning diagnosis based on long-term and large-scale data. Educational Technology & Society, 21(2), 245–258.
Zhu, Y. (2007). Measuring effective data visualization. In International symposium on visual computing (pp. 652–661). Springer.
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Sahin, M., Ifenthaler, D. (2021). Visualization and Dashboards: Challenges and Future Directions. In: Sahin, M., Ifenthaler, D. (eds) Visualizations and Dashboards for Learning Analytics. Advances in Analytics for Learning and Teaching. Springer, Cham. https://doi.org/10.1007/978-3-030-81222-5_27
Print ISBN: 978-3-030-81221-8
Online ISBN: 978-3-030-81222-5