Introduction

In this digital age, everyday access to technology leaves large digital data footprints. Our world is now data driven: business and marketing industries analyze customers’ purchasing behaviors to predict their interest in future products (Fritz 2011). Amazon predicts the types of books we may want, and Netflix suggests movies according to our favored genres (Dietz et al. 2018). The potential of data analytics has thus attracted academia to tap into similar data-driven approaches. Studies investigating learning analytics (LA) approaches in higher education have reported positive findings in regard to capturing students’ needs, improving learning outcomes, and supporting student success initiatives (Denley 2014; Dietz-Uhler and Hurn 2013; Gasevic et al. 2016; Smith et al. 2012). By utilizing an LA approach at the course level, instructors can gain insights into students’ learning behaviors based on patterns found within data, patterns that may not be apparent using traditional methods, and use these insights to inform decisions on needed interventions (Muljana and Placencia 2018).

In course design practice, an LA approach offers benefits for making course-design decisions that meet learners’ needs and enhance the learning experience (Dietz-Uhler and Hurn 2013). Research has shown that involving multiple stakeholders, including administrators, instructors, and support personnel such as instructional designers (IDs), in the decision-making process is imperative in determining the effectiveness of implementation (De Freitas et al. 2015; Ifenthaler 2017; Macfadyen et al. 2014). While a wealth of studies paints a picture of faculty members’ and administrators’ perspectives on LA (De Freitas et al. 2015; Dietz-Uhler and Hurn 2013; Herodotou et al. 2019; Schumacher and Ifenthaler 2017; van Leeuwen 2018), little is known about the perception and current state of LA-related practice among IDs. Because LA-related research is still at an early stage, existing studies are mostly quantitative, prompting calls for qualitative studies to extend the body of literature (De Freitas et al. 2015; Papamitsiou and Economides 2014). In response to this call, this qualitative study focuses on the overall perspective of IDs regarding LA approaches in course design within the higher-education setting, seeking to investigate the current state of LA adoption among IDs and the factors influencing their adoption.

Literature review

The proliferation of the learning management system in higher education

The proliferation of the learning management system (LMS) has contributed to the emergence of LA in higher education (Ferguson 2012). A national survey reveals that 99% of 800 institutions use an LMS (Dahlstrom et al. 2014). Another discovery from the survey highlights faculty beliefs in the advantages of the LMS for enhancing teaching and learning. Specifically, 74% of faculty members believe that the LMS can help enhance teaching, and 71% believe in the value of LMS features for optimizing student learning (Dahlstrom et al. 2014). Essentially, an LMS does not only deliver content but also tracks learning activities and real-time progress (You 2016). As a result, early detection of struggling students is now more feasible (Macfadyen and Dawson 2010) by analyzing course activity data within the LMS (Casey and Azcona 2017; Valsamidis et al. 2012). Available data include: (a) number of times resources are accessed; (b) date and time of access; (c) number of discussion posts generated; (d) number and date/time of messages to the instructor; (e) assignment timestamps; (f) types of resources accessed; and (g) points from discussion forums, assignments, and tests (Dietz-Uhler and Hurn 2013).
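As a minimal sketch, the records listed above might be represented and summarized as follows. This is not any specific LMS export format; the column names (student_id, event_type, resource, timestamp) and values are hypothetical placeholders and would differ across platforms.

```python
# Hypothetical LMS activity records; layout is illustrative only.
import pandas as pd

events = pd.DataFrame(
    [
        {"student_id": "s1", "event_type": "resource_access", "resource": "week1_reading", "timestamp": "2024-01-10 09:15"},
        {"student_id": "s1", "event_type": "discussion_post", "resource": "forum_week1", "timestamp": "2024-01-11 20:02"},
        {"student_id": "s2", "event_type": "resource_access", "resource": "week1_reading", "timestamp": "2024-01-14 23:40"},
        {"student_id": "s2", "event_type": "assignment_submit", "resource": "assignment1", "timestamp": "2024-01-15 23:59"},
    ]
)
events["timestamp"] = pd.to_datetime(events["timestamp"])

# (a) number of times resources are accessed, per student
access_counts = (
    events[events["event_type"] == "resource_access"]
    .groupby("student_id")["resource"]
    .count()
)

# (c) number of discussion posts generated, per student
post_counts = (
    events[events["event_type"] == "discussion_post"]
    .groupby("student_id")["resource"]
    .count()
)

print(access_counts, post_counts, sep="\n")
```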

To further examine student engagement, qualitative information can be generated by thematizing discussion posts and questions and by analyzing individual contributions to collaborative projects. Such indicators also aid instructors in assessing learning progress, identifying struggling students, and determining remedies (Casey and Azcona 2017; Dietz-Uhler and Hurn 2013; Macfadyen and Dawson 2010; Muljana and Placencia 2018), as well as providing direction for course revisions and future design (Lockyer and Dawson 2011; Persico and Pozzi 2015; Muljana and Placencia 2018).

Instructional design and learning analytics

“… there is no formula for great design. [Instead,] it is a problem-solving process…” (Christensen 2008, p. 29). This quote represents the complex decision-making process IDs undergo during the design and development of instruction. In design practice, no single instructional design model can address every learning issue and design constraint. Therefore, the design process is rarely smooth, systematic, or linear (Tracey and Boling 2014). IDs use their knowledge, skills, experience, and pedagogical beliefs to make judgments and consider contextual factors (Tessmer and Richey 1997) such as the nature of the learners, the nature of the learning task, and the learning environment to inform design decisions (Gagné 1985; Keller 1987; Morrison et al. 2013; Leshin et al. 1992; Smith and Ragan 1993). In higher education, IDs also make design decisions based on evidence such as students’ traits, prior performance, subject difficulty, and course completion rates. Additionally, designing instruction should focus on students’ learning experiences, such as what students actually do during the learning process (Nguyen et al. 2017; Rienties et al. 2017). Course elements can therefore be carefully designed to affect students’ learning behaviors, engagement, and learning outcomes within the learning context (Mangaroska and Giannakos 2018). This is where data analytics plays an essential role in providing such evidence (Dietz et al. 2018).

The definition of LA that we adopted for this study is “the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purpose of understanding and optimizing learning and the environment in which it occurs” (Siemens and Long 2011, p. 32). One key point of this definition is that measurement should focus on learners and their outcomes within a learning context. Another point emphasizes the improvement of learning and its environment. In other words, the use of an LA approach must be aligned with the pedagogical intent and context; otherwise, the interpretation of data lacks meaning (Gašević et al. 2015).

The data acquired from an LA approach can be used as an evaluation means to inform pedagogical decisions on improving the course design (Lockyer et al. 2013). The first type of analytics, called checkpoint analytics, can indicate whether students have acquired the learning prerequisites by analyzing their records of access to the course materials (Lockyer et al. 2013). The second type, called process analytics, additionally provides insights into the learning process and student engagement (Elias 2011; Lockyer et al. 2013). The data used for process analytics include, for example, discussion posts that can be analyzed to discover the level of engagement (Lockyer et al. 2013) and thematized to reveal knowledge acquisition (Muljana and Placencia 2018).
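The two analytic types might be operationalized along the lines of the sketch below. The log records, resource names, and roster are hypothetical and are not drawn from the cited frameworks; the sketch only illustrates the distinction between a checkpoint query and a process-oriented engagement count.

```python
# Hypothetical LMS log records for illustrating checkpoint vs. process analytics.
import pandas as pd

events = pd.DataFrame(
    {
        "student_id": ["s1", "s1", "s2", "s3", "s3"],
        "event_type": ["resource_access", "discussion_post",
                       "discussion_post", "resource_access", "discussion_post"],
        "resource":   ["prerequisite_module", "forum_week2",
                       "forum_week2", "prerequisite_module", "forum_week2"],
    }
)
roster = ["s1", "s2", "s3"]

# Checkpoint analytics: has each student opened the prerequisite material?
accessed = set(
    events.loc[
        (events["event_type"] == "resource_access")
        & (events["resource"] == "prerequisite_module"),
        "student_id",
    ]
)
missing_prerequisite = [s for s in roster if s not in accessed]

# Process analytics: a rough engagement indicator from discussion activity.
posts_per_student = (
    events[events["event_type"] == "discussion_post"]
    .groupby("student_id")
    .size()
    .reindex(roster, fill_value=0)
)

print("Have not accessed prerequisite:", missing_prerequisite)
print(posts_per_student)
```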

Some learning data can be obtained from an LMS and analyzed from three perspectives: formative or real-time, summative, and predictive for future improvements (Ifenthaler 2017; Ifenthaler and Widanapathirana 2014). A formative LA approach yields insights into learners’ performance and learning paths, allowing real-time adjustment. For example, Yen et al. (2015) explore various ways of adjusting instructional strategies to influence learning behaviors and assess the changes in the LMS log data related to cognitive load. The analysis of dynamic data provides insights for simultaneously guiding students to focus on the discussion topic and minimizing the split-attention effect in discussion participation. Since most IDs work closely with faculty (Ritzhaupt and Kumar 2015), this type of formative LA can be performed in collaboration with the faculty during the implementation phase. As a result, the course design becomes more iterative and adjustable according to real-time progress.

Through summative evaluation, instructional strategies and the impact of interventions are assessed at the end of a learning event and before determining future improvements (Ifenthaler and Widanapathirana 2014). Analyzing correlations between behaviors captured through LMS data and course achievement, Firat (2016) discovers that students who spend more time accessing the materials in the LMS tend to achieve better GPAs, which implies that course elements should be designed deliberately to encourage students to access and review course materials regularly. In addition to analyzing the LMS usage data, Firat (2016) acquires students’ insights and thematizes them into six considerations for course design; namely, the course should: (1) be appropriate for effective use; (2) employ attractive design; (3) foster active interaction; (4) provide learning reinforcement; (5) support the use of social media; and (6) provide flexible access. These findings imply that IDs should thoughtfully consider these six conditions while designing and developing courses. Another summative evaluation tactic can be performed by retrieving the course usage data at the end of the semester, which can highlight the least and most accessed course resources (Muljana and Placencia 2018). Faculty and IDs can work together to reflect on these data and find out why some resources were least accessed, informing a design decision for the next course design iteration.
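A minimal end-of-semester sketch of this summative tactic follows: ranking resources by access count and checking how time spent on materials relates to achievement. The usage records and grades below are illustrative placeholders, not data from Firat (2016) or any other cited study.

```python
# Placeholder end-of-semester usage records and grades for illustration only.
import pandas as pd

usage = pd.DataFrame(
    {
        "resource": ["syllabus", "week1_video", "week1_video", "practice_quiz",
                     "week2_reading", "practice_quiz", "week1_video"],
        "student_id": ["s1", "s1", "s2", "s2", "s1", "s3", "s3"],
        "minutes":   [3, 22, 18, 15, 30, 12, 25],
    }
)
grades = pd.Series({"s1": 92, "s2": 78, "s3": 85}, name="final_grade")

# Least and most accessed resources across the semester
resource_ranking = usage.groupby("resource").size().sort_values()

# Relationship between total time on materials and final grade
time_per_student = usage.groupby("student_id")["minutes"].sum()
correlation = time_per_student.corr(grades)

print(resource_ranking)
print(f"Correlation between time on materials and grade: {correlation:.2f}")
```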

Additionally, analytical data can reveal exemplary learning behaviors carried out by high-performing learners. You (2016) examines the impact of early predictions based on mid-semester learning data and discovers significant behaviors predicting course achievement. Findings reveal that (1) class attendance during synchronous class sessions; (2) assignment submission timestamps; (3) frequent access to the course materials; and (4) the number of downloads of course materials are the stronger predictors of course achievement. Abu Saa et al. (2019), in a systematic review of predictive analytics, find further existing studies in agreement: the more frequently students access the course materials, practice solving problems, and submit assignments, the more likely they are to enhance their learning achievement (Abdous et al. 2012; Burgos et al. 2018; Hung et al. 2012; Lara et al. 2014; Xing et al. 2015; Zafra and Ventura 2012). This, again, informs IDs to consider designing instruction that motivates students to review materials regularly and complete assignments on time.
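The sketch below shows one way predictors of this kind might feed a simple early-warning classifier. It is not the model used in You (2016) or the reviewed studies; the feature names, values, and the choice of logistic regression are assumptions made only for illustration.

```python
# Hypothetical mid-semester behavioral features and end-of-term outcomes.
import pandas as pd
from sklearn.linear_model import LogisticRegression

midsemester = pd.DataFrame(
    {
        "sessions_attended":    [10, 4, 9, 3, 8, 5],
        "days_before_deadline": [2.0, 0.1, 1.5, 0.0, 2.5, 0.3],  # avg submission lead time
        "material_accesses":    [40, 12, 35, 9, 28, 15],
        "material_downloads":   [18, 5, 16, 3, 12, 6],
        "passed_course":        [1, 0, 1, 0, 1, 0],  # outcome used for training
    }
)

X = midsemester.drop(columns="passed_course")
y = midsemester["passed_course"]

# Fit a simple classifier on historical records (toy data, no held-out test set).
model = LogisticRegression(max_iter=1000).fit(X, y)

# Estimate each student's probability of passing so instructors and IDs
# can consider intervening before the end of term.
midsemester["predicted_pass_prob"] = model.predict_proba(X)[:, 1]
print(midsemester[["predicted_pass_prob"]])
```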

Based on the three perspectives above, analysis of learning data is advantageous at all times and well suited to the characteristically iterative design process (Muljana and Placencia 2018). Particularly in higher education, dynamic data are helpful in grounding the improvement of course and program designs (Dietz et al. 2018). Simply put, LA provides an evidence-based approach that sustains improvements in the course design process (Dietz et al. 2018; Persico and Pozzi 2015). However, barriers to implementing an LA approach exist alongside the advantages. Translating data into actionable interventions that help students perform better is non-trivial (Kitto et al. 2018; Wise and Jung 2019; Wise and Vytasek 2017), and sorting and assessing the meaningful information from the data so that it aligns with pedagogical goals is not a simple task (Li et al. 2017). Additionally, there is a lack of the necessary technology to support LA implementation and a shortage of personnel who are well-versed in data science as well as in teaching and learning (Ifenthaler 2017). LA-related topics on ethics and student privacy are still an ongoing concern (Dietz et al. 2018; Verbert et al. 2012). In terms of the research gap, it is not clear in the current literature to what extent IDs in higher education take advantage of LA’s strengths to improve course design, what kinds of barriers they encounter, and what viewpoints they hold regarding the LA approach.

Theoretical framework

We adopted the Technology Acceptance Model (TAM) to guide our investigation. TAM, posited initially in 1989, has been used to explore users’ acceptance of technology (Davis 1989; Venkatesh 2000). TAM consists of two primary components affecting user acceptance, namely perceived usefulness and perceived ease of use. Scholars have included additional determinants underlying these two constructs to expand the model, resulting in TAM 2 and TAM 3 (Venkatesh and Bala 2008). In the past, TAM was rarely utilized to examine technology adoption in education (Park 2009). More recently, additional studies have verified TAMs’ suitability for understanding the acceptance and adoption processes for an extensive range of technological interventions aiming to improve learning and instruction (Al-Alak and Alnawas 2011; Fathema et al. 2015; Lai et al. 2012; Luo et al. 2019; Pituch and Lee 2006; Tarhini et al. 2013a, b).

A limited number of LA-related studies have used TAM to examine individual beliefs regarding the adoption of LA. A quantitative study validated the Learning Analytics Acceptance Model—adapted from TAM—to reveal factors affecting the adoption of an LA tool (Ali et al. 2013). TAM 3 also grounded a quantitative study investigating factors influencing the adoption of data mining tools among 206 employees in a corporate setting (Huang et al. 2012). However, further exploration of various factors determining LA adoption using the TAM framework is much needed to address the current research gap (Papamitsiou and Economides 2014).

Venkatesh and Bala (2008) established an overarching framework representing the aggregate concept accumulated from TAM-related studies over the years. Underlying all TAMs (TAM, TAM 2, and TAM 3) are four synthesized determinants that explain perceived usefulness and perceived ease of use: (a) individual differences, which represent demographics and personality; (b) system characteristics, which include features of the system; (c) social influence, which covers social pressures or processes; and (d) facilitating conditions, which consist of organizational support and resources. A previous qualitative study successfully adopted these four synthesized determinants to comprehend students’ perceptions toward the use of Twitter to enhance instruction (Luo et al. 2019). For our study, instead of using a specific TAM to explain participants’ perceptions, which is narrowly defined and more suitable for quantitative studies (Venkatesh and Bala 2008), we used the four synthesized determinants to formulate the overarching research question: how do IDs in higher education perceive their experience integrating an LA approach into course design through the lens of the four synthesized determinants?

Methods

This study employed a phenomenological approach, allowing us to address the research question by obtaining participants’ insights resulting from their experiences (Goulding 2005). We specifically intended to gain an understanding of “phenomena from the perspectives of people involved” (Welman and Kruger 1999, p. 189) and from the experiences of these people with the issue being explored in the study (Groenewald 2004). Put simply, we bracketed the phenomena by focusing on IDs’ experiences in course design through the lens of LA integration in higher education, guided by the abovementioned framework, and sought to establish implications for informing scholars, administrators, faculty, and personnel who are the key players in implementing effective LA.

Participants

We employed purposive sampling by calling for research participation through a professional organization and social media groups joined by IDs. We required participants to have course design experience in higher education. The call resulted in 18 participants who worked in a higher-education setting, held course-design-related responsibilities, and had between three and 18 years of relevant experience. Table 1 depicts participants’ demographics.

Table 1 Participants’ Demographics

Data collection and analysis

Data collection took place in three ways that participants could choose from: (1) a focus group; (2) individual semi-structured interviews; and (3) email interviews. Among the 18 participants, seven joined the focus group (lasting 75 min), and three opted for individual interviews (lasting between 45 and 60 min). The focus group and individual interview sessions were video recorded and transcribed verbatim. Probing questions were additionally asked during each session to acquire more in-depth insights (see Table 2 for the interview and probing questions); these questions were also designed to attain in-depth insights from the participants regarding any analytics tools available at their fingertips, such as those provided by the LMS. The remaining eight participants preferred email interviews, which were mostly followed up with probing questions via multiple email exchanges to obtain elaborative responses.

Table 2 Interview Question and Probing Questions

During the analysis stage, data explicitation went through three phases: (1) structural coding; (2) in vivo coding; and (3) pattern coding. In the structural coding phase, we segmented all data into four groups (i.e., individual differences, system characteristics, social influences, and facilitating conditions) and concurrently chunked each group into broad structures (Saldaña 2013). Structural coding served to identify broad segmented topics before conducting further qualitative analysis. This was aligned with the phenomenological process in which we established an initial, overall sense and understanding of participants’ insights (Groenewald 2003; Zinker 1978).

During the in vivo (verbatim) coding phase, units of meaning were delineated. The first author performed this technique by coding the data using participants’ verbatim words and open codes. These in vivo codes were afterward compared with the structural codes. Similar codes from both coding processes were merged. The differing ones were re-examined to determine whether they were newly emerging codes or supporting codes. Since in vivo coding allowed the researcher to “prioritize and honor the participant’s voice” and can “limit the researcher’s perspective on the data” (Saldaña 2013, p. 91, 94), it helped bracket the researcher’s biases and perspectives. This tactic is also appropriate for phenomenology (Saldaña 2013). Moreover, since the broad segments were already generated through structural coding, this second level of coding offered a triangulation through reanalysis and further careful review (Saldaña 2013), in addition to serving as a double-coding tactic (Baxter and Jack 2008; Krefting 1991). Specifically, this double-coding tactic was performed two weeks after the first coding phase was completed, ensuring consistency within the same coder (Elliott 2018). Next, the codes were shared and debriefed with the second author to check for mutual understanding and agreement.

In the next phase, the researchers performed pattern coding to identify responses with commonalities (Saldaña 2013). Relationships between codes were discovered through an analytical process using Boolean terms such as “and,” “or,” and “not.” Codes conveying the same or similar meaning were then clustered into a single theme.

The overall data collection and analysis procedure was aligned with the data explicitation steps suggested by Groenewald (2004) and Hycner (1999): (a) bracketing and phenomenological reduction; (b) delineating units of meaning; (c) clustering units of meaning to form themes; (d) summarizing and validating each interview; and (e) extracting general and unique themes from all interviews and making a composite summary (see Table 3).

Table 3 Alignment with Groenewald’s (2004) and Hycner’s (1999) data explicitation

Promoting trustworthiness

We employed multiple data collection methods, namely focus groups, individual interviews, and email interviews, to increase credibility (Lincoln and Guba 1985). A memo was composed after each interview session (including email interviews) to summarize key points from the insights, further optimizing confirmability (Phillippi and Lauderdale 2018). Member checks were performed during sessions and on transcriptions, email responses, summarized key points from the memos, and demographic profiles. We held virtual debriefing sessions throughout the study to seek alternative approaches and check our subjectivity (Shenton 2004). Debriefing sessions were also conducted to discuss the themes and categories and to resolve any disagreements among us. Table 4 shows the strategies used for maximizing trustworthiness.

Table 4 Strategies for maximizing trustworthiness

Results

Factor 1: individual differences

Prior exposure and pre-perception

Two IDs did not possess adequate knowledge about this approach, while seven had practiced it to some extent. The remaining IDs were either still learning or had heard about it. Those with limited knowledge were wary of the approach. As Darren admitted, “I can’t say exactly what it could do. But, I can see the power in what it could do.” In contrast, those who had recently learned or employed some LA approaches tended to show a positive attitude and attested to the benefits of LA for course design. These IDs particularly emphasized the role of LA in assessing student-to-content engagement. Joy’s statement provides a good example: “I have a plug-in [in] my campus account that will let me see where students use [or access] the most just by the page.”

Pre-existing perceptions also influenced designers’ perspectives on LA. Twelve out of 18 IDs expressed positive pre-existing perceptions about the benefits. They agreed that LA was useful for understanding students, their context, and their learning progress, in addition to evaluating engagement and course efficacy. As Wylda noted, “… it could come in the most handy when doing an audience analysis.” Stanley added that LA could promote students’ problem-solving skills because “… it captured information on when the user could recognize that something was a problem but could not properly identify what kind of problems.”

Table 5 displays a cross-analysis of three participants who each reported different levels of prior exposure, knowledge, and experience with LA. The table shows that the more prior exposure, knowledge, and experience IDs possess, the more positive their perception is and the more openly they express their hope to adopt the approach.

Table 5 A cross analysis from three representative participants

Pedagogical belief

More than half of the IDs referred to their philosophical beliefs associated with pedagogy and assessment. The two primary tenets noted by IDs were technology integration aligned with pedagogical intent and the critical role of assessment-driven learning improvement. Those who believed “pedagogy drives the technology” interestingly expressed mixed views. Some IDs argued that learning improvement did not require LA. As Chun explained, “we could use our instructional design way to … find out ways in terms of non-technology [non-LA] way.” Cora conveyed another view demonstrating an openness to the LA approach: “I find that the learning analytics data… is critical. We use ours to map to a set of competencies.” Despite having mixed opinions, IDs concluded that institutions should focus on pedagogical needs and context when informing their decisions on LA adoption. They believed in the strategies rather than plainly placing the emphasis on the tools.

The essential role of assessment also received mixed perspectives. Trent professed, “I believe in a philosophy if you can’t measure it, you can’t manage it appropriately. So, I don’t try to use any resources or assessment where I can’t get some type of analytics.” This perspective was shared by two other IDs. On the other hand, one ID noted that an LA technique was not needed to assess learning improvement and therefore believed in non-analytics assessment.

Factor 2: system characteristics

Lack of data availability and legibility

Ten IDs highlighted the limited data available to them, which affected their intention and actual practice of employing an LA approach. Hence, they expressed the importance of the data types they hoped to access. Tiana wished for aggregate data that she would “certainly review and try to make conclusions regarding what was happening and what needed improvements.” Sierra asked for specific data associated with learning progress, professing that “the LMS I’ve worked with are all roughly the same, offering extremely broad, high-level opportunities rather than granular and meaningful.” Seven IDs also expressed concerns regarding data legibility. The LMS built-in analytics tools merely displayed raw data that were difficult to understand. Cora told us, “There’s the free version which provides you with access to enormous datasets on any number of topics raw data…. That information isn’t really useful … because it’s not pretty.”

Lack of user-friendly analytics tools within LMS

Two IDs highlighted the lack of usability of the LMS built-in analytics tools. Tess noted that the current analytics tools in the LMS were “meager at best,” thereby discouraging adoption. Essentially, IDs wished for a tool that allowed them to view comprehensive data yet was intuitive and easy to read without computing advanced statistics. This technological limitation seemed to influence their decision-making about adopting an LA approach for improving course design. Describing the readiness to pursue LA, Giuliana noted, “I don’t feel that’s an area we have even begun to tap into as a community of instructional designers yet and I don’t think the technology is quite there yet.”

Although analytics software packages offering advanced visualization tools exist, they come at additional cost. Unfortunately, IDs did not normally control software purchasing decisions. As Georgette confirmed, “… the fact that they made analytics an add-on product and has made it costly impacts what we can do. … I can’t see that it would be financially feasible at this time.”

Factor 3: social influences

Internal drive within the institution

Seven IDs first encountered the notion of LA from faculty or fellow IDs when these colleagues attempted to draw on historical student data to help diagnose learning performance issues. As Fay verified, “… that is the conversation that opened me to this [LA] idea.” Five IDs pointed out that institutional initiatives, such as “student success” initiatives, significantly influenced how they viewed LA and their degree of participation in LA-related events. For example, in order to provide suitable student support services, some institutions started to look at analytics. Cora offered a possible explanation that this type of data “may be the only information for administrators or the people in the Student Success department who don’t work directly with the learning environments or those students.”

External influence from experts and professional groups

Four IDs were attracted to LA due to the external influence of well-known experts, practitioners, and scholars in the LA field. For example, Fay and Wylda respectively told us that they “watched some early work by George Siemens” and “saw John Whitmer speak.” Many IDs also attended professional conferences or joined special interest groups. Reyna confirmed that a conference presentation triggered her curiosity when she was “attending an AECT conference a few years ago.” Special interest communities have also hosted online venues such as forums, blogs, and social media groups, which reportedly influenced IDs’ understanding of LA. Georgette affirmed, “… so we’re posting all kinds of things pertaining to various types of analytics without … really realizing they are analytics.”

Factor 4: facilitating conditions

Inadequate supporting tools and infrastructure

IDs anticipated an adequate infrastructure that allowed a straightforward approach to accessing data through robust tools and integrated plug-ins. Georgette commented, “… the resource obviously for us would be to have Blackboard learning analytics. …, then, of course, support in the sense of being able to utilize the tool appropriately.” Trent expressed the importance of having “access to tools/resources that are xAPI ecosystem compatible” to allow IDs “to more effectively and efficiently integrate analytics” into course design practice. However, pursuing access to xAPI requires multi-level institutional approval because security risks as well as legal and ethical concerns may arise. As Scott noted, there is “… a rather complex red tape pathway to get any sort of plug-ins.”

A need for professional development opportunities

Twelve IDs believed that a skillset for interpreting the data was required. Therefore, they wished for professional development in this area covering technical, analytical, and practical aspects. “Some training is needed to learn how to use the tool and some training is probably helpful to learn how to read [the data],” noted Georgette. As three IDs further described, such a program should discuss practical LA techniques to promote immediate application. Joy represented this notion: “I would like something that’s very practical and that I can utilize on my job.” Informal professional development was also mentioned. Jiang said, “So I would hope or wish they could provide me with website, or resources where I can read reviews, where the instructions are simple and concise… All [of] the resources only give me part of the answer.”

Lack of institutional buy-in and support

Six IDs asked administrators to advocate for LA-related practices, collaborate with faculty and personnel, provide a sufficient budget for obtaining suitable tools, and recruit personnel well-versed in analytics. Georgette, Sierra, and Tiana respectively represented this hope:

  • “… and then you need the money for this.”

  • “Faculty need to be involved in shaping the goals and providing the resources.”

  • “[We] need personnel to gather the data—no, we don’t have adequate personnel to do this.”

IDs also hoped that faculty were aware of the rationale underlying IDs’ intent to integrate LA into course design: it was truly for enhancing learning outcomes. As Chun affirmed, “… we should be more like a firefighter. We’re helping.” Most IDs agreed that LA was an aid to improve learning environments (hence the firefighter metaphor), not a means of “policing” the strategies when the outcomes are found to be insignificant. This perception symbolizes an invitation by the IDs to establish a partnership with other stakeholders.

Table 6 below depicts the summarized results regarding how IDs in higher education perceived their experience integrating an LA approach into course design through the lens of the four synthesized determinants.

Table 6 Summarized themes and examples

Discussion

The phenomenological approach allowed us to describe the phenomenon as accurately as possible according to the real experiences of the participants (Groenewald 2004). Additionally, the four synthesized aspects underlying TAMs guided us to salient findings concerning IDs’ perspectives toward LA integration. Individual differences, system characteristics, social influence, and facilitating conditions were investigated in our research context, allowing us to uncover IDs’ perspectives on LA adoption in course design practice within the higher-education setting. We expound on the findings in the following discussion and implications.

Instructional designers’ varied perceptions

We discovered differing interpretations of LA amongst IDs. Those who perceived LA as a technological tool tended to speak from a pedagogical perspective; hence, these IDs emphasized the pedagogy-before-technology tenet. Another interpretation among IDs was that LA is simply a measurement technique. Some other IDs might be open to LA because conducting evaluations is part of their tasks. These findings suggest three important points.

First, nine IDs mentioned the importance of having pedagogy drive the technology use. These IDs appeared to comprehend the meaning of LA in line with its definition: the utilization of LA is aimed at understanding the learners and their context in order to improve learning and its environment (Siemens and Long 2011). This type of perception resonates with existing studies positing that learners’ data lose their meaning when analyzed and interpreted incoherently with pedagogical goals and learning context (Gasevic et al. 2016; Gašević et al. 2015; Reimann 2016; Wiley et al. 2020). Our findings verify the pedagogical emphasis: the focal point of an LA approach lies in the extent to which it can help address issues in the context of student learning. Therefore, the alignment between LA and course design can become clear, and the interpretation acquired from the LA approach can be more actionable for supporting the design (Wiley et al. 2020).

Second, four IDs believed that LA is simply a measurement technique. In particular, two IDs reported believing that non-analytic assessment is still useful. This type of perception may suggest that, in practical settings, the data used to inform design decisions may not always be those acquired from an LA approach. In some cases, both LA data and non-analytic data (e.g., survey responses) may be used to corroborate one another. This is consistent with the existing literature. In addition to analyzing LMS usage data, Firat (2016) utilizes an open-ended question to solicit students’ insights regarding effective course elements. In their systematic review of predictive analytics, Abu Saa et al. (2019) suggest that other data collection techniques, such as surveys and course evaluations, should still be considered in future LA-related research.

Third, IDs’ prior exposure and pre-existing perceptions may have shaped their current perspectives and intentions to employ LA-related practices. In our study, IDs with prior exposure to LA were able to articulate the crucial role of LA in learning assessment and, therefore, could make a better connection between LA and instructional design. The more prior knowledge and experience IDs possess, the more positive their perception is and the more openly they express their hope to adopt the approach. This notion resonates with an existing study: prior exposure to analytics tools is positively correlated with perceptions of the overall usefulness of analytics and with the behavioral intent to adopt them (Ali et al. 2013). Additionally, prior exposure to a similar user interface used in the analytics tools is positively correlated with the behavioral intention to adopt the tool (Ali et al. 2013).

Learning analytics in course design practice

Our findings revealed that, despite nuances in IDs’ pedagogical beliefs and perceptions, the integration of learning analytics is slowly taking root in course design practice. While some IDs had yet to utilize tools within an LMS or attempt any proprietary LA tools in their practice, they tended to seek data points from existing available sources to provide valid recommendations for instructors seeking advice on course redesign. This can be seen amongst the many student success initiatives prevalent in higher education institutions, where IDs are often involved as support personnel (Fasse et al. 2009). On a small scale, it takes place in the form of recommending that instructors use just-in-time data in an LMS to alert students who are displaying detrimental learning behaviors (Arnold 2010; Dyckhoff et al. 2012; Muljana and Placencia 2018; Tabuenca et al. 2015). On a larger scale, it could mean using semester-long data to examine subject difficulty, learner characteristics, and issues with achieving positive learning outcomes before any decision on course redesign is made (Dunbar et al. 2014; Ifenthaler 2017; Ifenthaler and Widanapathirana 2014). Although LA as a technique to improve course design may not be verbalized overtly by IDs, data-informed decision making is clearly seen in our respondents’ design practice. Interestingly, this overall landscape of LA in course design practice reported by IDs echoes Williams et al. (2011), who found that formal evaluation rarely occurs in ID practice but exists in less formal ways that also benefit the product of the evaluation process.

While many IDs see the LA approach as an opportunity, challenges still exist. We note these challenges in Fig. 1, which presents all of the discussed granular factors influencing instructional designers’ perceptions, intentions, and actual use of an LA approach in course design, as guided by TAM; Fig. 1 is a treemapping visualization based on the number of participants mentioning the notions related to the four factors. In some circumstances, readily available data (that could have been used to improve course design) may have been overlooked because some IDs fall back on traditional means of collecting student data when faced with an instructional issue. In other instances, the existing analytics data do not come in an easy-to-read, meaningfully visualized format, rendering it nearly impossible for IDs to readily inform the redesign process. Translating such data into a comprehensible visualization requires additional funding, tools, or personnel with more advanced skills that may not be available in higher education institutions. With this in mind, we present implications related to LA-related practices, hoping to inform higher education practitioners seeking to leverage the power of LA in course design.

Fig. 1 Factors influencing instructional designers’ adoption of an LA approach in course design practice. The treemapping visualization is based on the number of participants mentioning the notions related to the four factors
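A treemap of this kind could be reproduced along the lines of the sketch below, assuming the squarify package is available. The counts are illustrative placeholders only; they are not the study’s actual participant counts, which appear in Fig. 1 and Table 6.

```python
# Placeholder counts for illustration only; not the study's reported data.
import matplotlib.pyplot as plt
import squarify  # pip install squarify

mentions = {
    "Individual differences": 12,   # placeholder value
    "System characteristics": 10,   # placeholder value
    "Social influences": 7,         # placeholder value
    "Facilitating conditions": 14,  # placeholder value
}

# Size each tile by the number of participants mentioning notions under the factor.
labels = [f"{factor}\n({n})" for factor, n in mentions.items()]
squarify.plot(sizes=list(mentions.values()), label=labels, alpha=0.8)
plt.axis("off")
plt.title("Factors influencing IDs' adoption of an LA approach (placeholder counts)")
plt.show()
```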

Practical implications

Merely upgrading the infrastructure or expending additional dollars may not always be the solution. In certain instances, IDs may have already identified the type of data that they need to help improve their designs more effectively and efficiently; all they need is to communicate this to the instructor. LA helps IDs to trace and analyze learning behaviors for identifying elements that require adjustment. In other instances, support can take form in providing funding for purchasing the data visualization tools and hiring skillful personnel. After all, institutions may not be prepared for employing LA practices optimally due to a lack of “necessary technology available to implement valid LA frameworks” and short supply of “specialized staff with a strong background in learning and teaching as well as data science” (Ifenthaler 2017, p. 370).

It is equally important to note that decisions regarding LA should be made with various stakeholders in mind. Because IDs, faculty, and administrators, among other stakeholders, may hold disparate beliefs about the LA approach, it is crucial to seek synergy among these groups on campus to ease integration and implementation (De Freitas et al. 2015; Ifenthaler 2017; Macfadyen et al. 2014; Nunn et al. 2016). For institutions perceiving the need to adopt LA, it may be wise to conduct a needs assessment before jumping to an adoption plan. Needs assessments can help identify the current perceptions, knowledge, and skillsets of various stakeholders (Lee et al. 2007), including the IDs who can take advantage of LA. Implementing LA without a thorough understanding of the potential challenges and risks may result in inadequate utilization and a waste of resources.

Professional development for IDs interested in utilizing the LA approach can take a variety of forms. While higher education institutions may not provide on-the-job training specifically addressing issues with the implementation of LA, creating a system that incentivizes IDs to take online courses and dedicate time to attending conferences may be a strategy that helps stimulate their interest and further advance their skills in this area. Additionally, it is crucial for IDs to continuously engage in professional learning communities, whether on campus or virtual. We encourage seasoned practitioners and scholars who have pioneered LA-related practices to share their expertise, lessons learned, and success stories in hopes of promoting LA adoption. As our participants noted, IDs should also take advantage of the numerous informal professional learning groups on social media, which can offer spontaneous and virtually ubiquitous professional development opportunities on the subject of LA-informed instructional design. This informal form of professional development allows IDs to “learn through everyday social practices” (Gray 2004, p. 22) and may provide more flexible learning opportunities tailored to individuals’ needs, interests, and professional goals (Luo et al. 2020; Muljana et al. 2020; Trust et al. 2017).

Limitations and recommendations for future research

We recognize the limitations of the specific context and scope of this study and suggest some directions for future research. How the four factors impact IDs in different contexts, and how they impact other stakeholders in higher education, deserves further exploration, especially given that each group of stakeholders interacts with students in a different capacity. Detailed investigation of the determinant factors affecting each stakeholder group deserves attention in order to gain richer insights. Forthcoming studies exploring challenges (e.g., the ethical aspects), strategies to overcome the barriers, and lessons learned are much needed to determine the effective and appropriate use of data. Scholars may also consider exploring the perspectives of IDs and their current state of LA-related practices in other settings, such as corporate and healthcare contexts, where utilizing data may be the norm.

As learning interventions are contextual, another research approach would be to present a design case that documents an LA-specific design intervention at length. Fellow scholars and IDs may collaboratively validate existing LA frameworks proposed by Ifenthaler and Widanapathirana (2014), Persico and Pozzi (2015), Yen et al. (2015), and Davies et al. (2017). This may help clarify key aspects of the design process driven by the framework, critical decisions made during the design process, the outcomes of the intervention, and the context and conditions under which the effects of the design intervention were made possible.

Conclusion

Our findings suggest that the four synthesized determinant factors established by TAMs have an impact on IDs’ perceptions, intention to adopt, and actual adoption of LA. This study hence offers additional insights into previous research on LA at the intersection of instructional design in higher education through the guidance of TAMs (Ali et al. 2013). While IDs acknowledged the great potential of LA for optimizing the learning experience, adoption is still rudimentary due to the factors alluded to above. Highlighting IDs’ hopes, we emphasize the importance of facilitating conditions that offer a robust infrastructure, promote synergy among stakeholders, and provide professional development opportunities. Our findings signify that it takes multiple entities at an institution working together to figure out the most appropriate use of students’ data for improving learning environments. Overall, our contribution expands the body of literature by highlighting the importance of bolstering a collaborative culture among stakeholders to leverage effective LA adoption (De Freitas et al. 2015; Ifenthaler 2017; Macfadyen et al. 2014).