Introduction

The concept of blended learning (BL) is loosely defined in the literature, with many researchers using different operational definitions. Most agree, however, that BL combines in-person instruction in a shared physical classroom (sometimes referred to as face-to-face or F2F) with online components of teaching (El-Mowafy et al., 2013). The exact balance of these components and the specific instructional approaches under the BL umbrella vary greatly. Research has established the effectiveness of the BL teaching approach in some learning environments (Bernard et al., 2014), leading to a significant rise in the use of BL-type courses in post-secondary settings. A review of the literature suggests that despite the positive outcomes associated with BL in some settings, there are no clear explanations as to why BL is effective. Moreover, the asynchronous learning formats that can be used in combination with synchronous learning in BL can lead to decreased satisfaction and engagement (Maki et al., 2000), social isolation (Hameed et al., 2008), and reduced completion rates. While these outcomes are not unique to asynchronous formats, they are increasingly characteristic of them. Therefore, to limit the negative outcomes while maximizing the positive ones, it is important to understand the mechanisms underlying the effectiveness of BL.

In recent years, many Canadian universities have applied the BL approach to first-year introductory science courses to address the shifting demographics of the Canadian student population (Dale, 2010; Association of Universities and Colleges of Canada, 2011). Fully online and blended courses serve a significant population living in remote regions of Canada. Indeed, 83% of Canadian higher-education institutions provide courses with varying degrees of blending, aiming to provide access for a diversity of continuing-education learners at no additional cost (Canadian Digital Learning Research Association, 2018). In addition, the median age of the Canadian student population is expected to rise over the next decade due to population shifts (Dale, 2010; Association of Universities and Colleges of Canada, 2011) as well as students taking increasingly non-linear paths to obtaining post-secondary degrees and certificates (Shaienks et al., 2008). This presents a challenge, as dropout rates in higher education are at worrying levels and may be higher in older student populations (Canada, aged 19–24: 21% (Shaienks et al., 2008); USA, aged 19–24: 34%, aged 30+: 53% (National Student Clearinghouse, 2010)).

Dropout rates are one relevant area. In a survey of 159 studies on the causes of online course dropout, Lee and Choi (2011) found that age was not among the 69 contributing factors they identified. Instead, personal factors including work commitments, familial and social responsibilities, insufficient financial support, and poor studying environments led students to leave higher education. However, older students more frequently experience these high-risk personal factors, particularly financial demands and familial responsibilities, which tend to be negatively associated with degree completion (Roksa & Velez, 2012; Jacobs & King, 2002). There is evidence that socioeconomic and familial pressures also influence, and may delay, the decision to attend post-secondary education immediately after high school (Goldrick-Rab & Han, 2011).

The rigidity of traditional schedules is also a challenge. Institutional factors identified by Cross (1981), such as class schedules and tuition costs, form another layer of barriers to traditional participation in post-secondary settings. Distance education and BL are among the approaches used by post-secondary institutions to provide increased flexibility for students facing these increasingly common difficulties.

Moreover, BL has been shown to provide meaningful improvements in student outcomes when compared with physically co-located classrooms (Means et al., 2009; Bernard et al., 2014). The reasons for this are not well understood. Means et al. (2013) suggest that the effectiveness of BL is attributable to teaching approaches that succeed in keeping students engaged and on task. Bernard et al. (2014) conclude that the BL approach is particularly effective in Science, Technology, Engineering and Mathematics (STEM) courses as compared with non-STEM courses. These findings suggest that BL is not effective in all circumstances, but that its effectiveness may depend on aspects of both the classroom environment and student attributes.

However, there are many barriers and challenges to implementing BL. The evidence for the efficacy of BL has not led to specific recommendations for teaching practice in BL courses; a majority of faculty experience barriers to teaching online related to lack of training and support (Canadian Digital Learning Research Association, 2018). Moreover, online teaching approaches are not without drawbacks, since students have reported increased feelings of social isolation (Hameed et al., 2008), frustration (Hara & Kling, 2001), reduced interest, and lower satisfaction with online courses compared with in-person courses (Maki et al., 2000). Finally, the research literature is sparse on pedagogy for BL.

It is important to understand the pedagogical variables that lead to effective BL courses in order to enhance student outcomes, while also identifying the negatives so that these can be minimized. Thus, the purpose of this study was to understand how students experienced blended university courses, and to identify specific factors related to student satisfaction in BL. We queried students’ background, engagement, and course satisfaction, and added questions about their preference for in-person or blended course delivery. Based on the findings, we provide recommendations for blended learning instructors in higher education.

Theoretical Frameworks: Sociocultural Theory and Community of Inquiry

The overarching theoretical framework for this work is sociocultural theory (Vygotsky, 1978; see also Cole & Wertsch, 1996). It is important to distinguish sociocultural theory from social constructivism, since the two are often confused. Social constructivism is primarily focused on the role of collaboration between individuals in the creation of knowledge (Palincsar, 1998). In contrast, sociocultural theory places equal importance on both the social environment (people) and the cultural tools and artefacts with which people create knowledge (Vygotsky, 1981; see also Minick, 1987). Thus, when describing and accounting for learning within technology-enhanced BL environments, we must consider the tools with which students and instructors generate knowledge. Another relevant theoretical framework to consider in its complementary relationship to sociocultural theory is the community of inquiry framework (Garrison et al., 2010; see Lipman, 1991 for the origins of the term ‘community of inquiry’). This framework is particularly important to consider when understanding the role of the instructor within the technology-enhanced BL environment. Both theoretical frameworks are discussed next.

A reading of Cole and Wertsch (1996) suggests two key implications of sociocultural theory for making sense of learning within BL environments. First, cultural tools and artefacts such as computers, the internet, and software systems do not simply facilitate mental processing by making cognitive tasks easier or faster (e.g., calculations), but may also influence the quality of mental processing. For example, online instructional materials may facilitate access to learning, but could also broaden the scope of ideas considered while narrowing instructor-student contact. Second, human learning, defined here as the “long-term change in mental representations or associations as a result of experience” (Ormrod, 2016, p. 4), occurs within historical, cultural, and institutional contexts, and any attempt to account for changes in student learning needs to include these components of the academic environment. In describing students’ reactions to BL courses, the course, its history and social climate, instruction, material, and tools must be explicitly considered when making sense of learning outcomes. Indeed, it may be impossible to describe learning outcomes separately from the specific context in which they have occurred. Moreover, there is interdependence between what students can and want to do and learn, and limits on what is feasible for them given the context of a course of study.

The inter-dependence of student and learning context is exemplified by the community of inquiry (CoI) framework (Garrison et al., 2010). Developed specifically to theorize about the processes of collaborative computer conferencing in higher education, the CoI framework outlines three forms of ‘presence’ that influence a student’s educational experience. Although a full presentation of the CoI is beyond the scope of this paper, three aspects are mentioned here. First, the cognitive presence of students can be enhanced within a learning environment by selecting content that is meaningful and holding discussions that challenge students to think deeply. Second, the teaching presence is enhanced by having the instructor select meaningful content but also, importantly, create a safe and interesting climate that permits students to engage in discourse and feel good about contributing to the learning. Third, the social presence of students and instructors is enhanced by a strong climate and discourse among all involved in the experience. These three presences are illustrated by Garrison et al. (2010) in the form of an Euler diagram in which each presence overlaps just enough with the other two to create a complete educational experience. In its formulation, the CoI has many features that can be interpreted as a specific instance of sociocultural theory. In particular, the teaching presence in the CoI underscores the environment created by the instructor and highlights the personal attributes the instructor might have, or need to develop, in bringing about certain environments for students.

The role of the environment is a fundamental part of sociocultural theory. The environment includes the specific presence of instructors and the personal attributes they espouse, which will undoubtedly interact with students’ self-efficacy or confidence for learning (Bandura, 1986). Student engagement, which can be considered similar to Garrison et al.’s (2010) cognitive presence, is increasingly recognized as a vital process to measure when seeking to understand learning outcomes (Bernard et al., 2009; Kuh et al., 2008; Andrews, 2018). In a survey of 18 colleges and universities in the USA, Kuh and colleagues (2008) demonstrated that the National Survey of Student Engagement (NSSE) was a strong predictor of first-year GPAs. Other work by Andrews (2018) identifies student engagement as a small but important predictor of graduation. A growing body of evidence indicates improved student learning outcomes in undergraduate science teaching when student engagement and active learning are incorporated in courses (National Research Council, 2012; Handelsman et al., 2004; Wieman, 2007; Anderson et al., 2011; Freeman et al., 2007, 2014).

Many investigators are also examining student satisfaction as an educational outcome in BL environments, with some finding a positive correlative or predictive relationship between student satisfaction and student learning performance (Nanclares & Rodríguez, 2016; Lo, 2010; Paechter et al., 2010; Wu et al., 2010). Additional evidence indicates that student satisfaction is correlated not only with final marks but also with student motivation (López-Pérez et al., 2011). However, counter to this, Maki et al. (2000) found a dissociation between satisfaction and student learning performance. Thus, student satisfaction and engagement within a course may be predictors of desired learning outcomes but the association is uncertain and requires more study. In this study, satisfaction was evaluated at the end of the course, and included students’ degree of preference for blended learning, whether they believed the course components enhanced each other, and whether they would take another blended course if given the opportunity (Vargas & Nocente, 2016).

At this juncture, it is necessary to note a significant limitation in any discussion of student satisfaction as an educational outcome. The construct of student satisfaction in higher education is rooted in viewing the student as a consumer of education, in much the same way as a business might view its clients as consumers of products (Oliver, 1997). What is assessed is not the actual quality of a product but the perception of quality. In the absence of real alternatives, a customer may indicate satisfaction with a product only because alternatives do not exist. When student satisfaction is measured in response to a learning experience within a formal course of study, it must be recognized that the measure may reflect more of what students have been trained to expect from their courses than what they should be expecting (Duarte et al., 2012). In other words, because the prevalent mode of delivering large-scale instruction has traditionally focused on content and the acquisition of information rather than on the process of questioning and understanding, it may be argued that students have, over time, been trained to accept a lesser form of instruction as satisfactory. Consequently, when students evaluate their satisfaction with a course, they may in essence be indicating satisfaction with a poor substitute for a more intensive, challenging, but less traditional form of learning. However, this less traditional form of learning is rarely offered. Although student satisfaction is a focus in this paper, this is because the construct has become a benchmark in student outcomes, not necessarily because it provides the most relevant information about students’ reactions to the learning experiences that are in their best interest.

Wu et al. (2010) proposed a model of learning satisfaction in BL courses that incorporates student expectations, perceptions of self-efficacy, and components of the learning environment. Their model draws from expectancy-value theories (see Wigfield & Eccles, 2000) and especially Bandura’s notions of self-efficacy and motivation for learning (Bandura, 1997). When applied to data on student learning outcomes, the model accounts for individual competencies, performance expectations, and learning climate, and suggests that performance expectations and learning climate are the most significant predictors of student satisfaction in BL courses. Learning climate is a contextual variable that goes beyond the self and includes other individuals and their interactions. Although Wu et al.’s (2010) results are relevant to understanding the dynamic interplay between students, peers, instructors, and the synthesis of these variables (Palincsar, 1998), the failure to consider the tools and artefacts associated with this learning is problematic, since tools and artefacts can change what and how topics are learned.

The learning climate is often described as resulting from several human factors, including the emotional engagement of the students with each other, the students with the instructor, and the instructor with the students (Skinner et al., 1998; Reyes et al., 2012). A meta-analysis of Distance Education (DE) courses by Bernard et al. (2009) found that the type of interaction students experience in their courses strongly influenced student engagement, with peer interactions having the strongest effect, followed by student-content interactions, and last, student-instructor interactions. Thus, it appears that students’ interactions with peers, and the emotional engagement that follows, contribute most significantly to creating a positive learning environment. However, it should be recognized that the types of experiences students can have with each other and the instructor depend on the tools and artefacts included in the course of study. For example, the presence of chat groups provides an avenue for interaction; in the absence of this technology, interaction might decrease.

Objective

To explore the relationship between students’ background and engagement, this study distinguished three constructs of engagement: behavioural, emotional, and cognitive (see Fredricks et al., 2004, 2005). Behavioural engagement (BE) describes student participation in class activities, and includes all behaviours considered critical for achieving positive academic outcomes. Typical actions might be attending class, completing an assignment or readings, or paying attention in class. Emotional engagement (EE) describes the affective attachment students have to instructors, classmates, or the institution, which influences their willingness to complete course activities. This includes students’ perceptions, positive or negative, of the amount and quality of these interactions. Examples are a student holding an opinion about other students or about their instructor/lecturer, or being interested in particular course material, regardless of the cause of these emotions. Cognitive engagement (CE) refers to a student’s investment in, and willingness to exert, the attention and effort necessary to understand complex ideas and master course-relevant skills, particularly difficult ones. A cognitively engaged student might work toward good study habits, strategize about how to learn effectively, or critically evaluate the material they are studying.

We investigated four undergraduate science courses that employed the BL approach. Using secondary data analysis, our aim was to identify and describe important and desirable aspects and practices of higher education pedagogy in a BL environment for promoting student satisfaction. In investigating these variations on the BL approach, we are cognizant of the potential of behavioural, cognitive, and emotional engagement to interact and change with different tools and artefacts in different courses.

Methods

Participants

Six hundred ninety-two participants took part in the quantitative portion of the study, and 48 participants took part in the qualitative portion, which involved individual in-person interviews. Participants were students who had attended BL-based first-year courses offered in the Faculty of Science of a large, research-intensive university between fall 2014 and winter 2017. Instructors of these courses had recently worked with a teaching support unit to convert physically in-person courses to blended courses via the university’s online learning management system (Moodle). These courses were redesigned “with the aim of improving student engagement and satisfaction” (Vargas & Nocente, 2016), as well as to offer greater time and classroom flexibility for changing student demographics. The redesign included online video lectures, practice materials, and collaborative learning experiences with peers (see Table 1). Specifically, the design delivered lectures and formative assessments via the learning management system, so that students could pause and re-listen to online videos, as well as have access to practice quizzes online. The redesigned courses were from first-year Chemistry, Computing Science, Human Geography and Planning, and Mathematics. Table 1 summarizes the courses and the details of the blended course implementation.

Table 1 Summary of the features of the four courses analysed in the study

All courses were hosted on the University’s Moodle-based learning management system (LMS), where students could access all online content for their courses through a common portal. The course sites allowed students to interact with each other through forums, online messaging, and breakout groups; students could also email instructors directly through the LMS.

Although full demographics were not collected, historically students who enrol in these first-year courses are between ages 17 and 24. Gender information was collected for two courses: Computing Science and Human Geography and Planning. In these two courses 230 of 393 students identified their gender; the gender breakdown was 22.9% female, 34% male, and 1.5% transgender or gender fluid, with 41.6% of students not responding. This study was approved by the Institutional Research Ethics Board and complied with policies associated with external granting agencies.

Materials: Converted Courses

Chemistry

Introductory University Chemistry I is a multi-sectioned first-year level course that serves as a core requirement in many university programs. Course sections included a lecture, laboratory, and seminar. This was the only course among the four that was not fully blended (only four of twenty-five topics were transformed); in-class time was reduced and the blended units were taught online (see Table 1).

Computing Science

Introduction to the Foundations of Computing is a first-year level programming course using the Python language with an emphasis on computational problem solving. Students were paired with a classmate and tasked with programming playable video games. The course was fully blended (see Table 1). Students had online assignments that permitted multiple attempts, and received individual mentor feedback.

Human Geography

Cultures, Landscape & Geographic Space: An Introduction to Human Geography and Planning is a first-year level introduction to geographical techniques, the spatial organization of human landscapes, and the significance of the distribution of human activity. In this course, the traditional three hours of physically present lecture were replaced by one hour of lecture time, one hour of online learning, and a one-hour seminar. Exams could be attempted multiple times. The course was fully blended (see Table 1).

Mathematics

Calculus for the Physical Sciences I is an introduction to calculus for the Physical Sciences. This first-year course provided opportunities for individual and collaborative problem solving in class and during exams through a two-stage summative assessment process. Students wrote an exam individually, and then worked in small groups on a selection of repeated questions. Students would be given the better of the two grades for the repeated questions. Assignments could also be attempted multiple times. The course was fully blended (see Table 1).

Procedure

This study involved secondary analysis of data collected by personnel at the teaching support unit on campus. A member of the unit visited the lectures of the different courses to invite students to complete the survey, explain the purpose of the study, and guarantee the anonymity of results. The response rate exceeded 20% of the total number of students. A link to the letter of information and survey was provided in the learning management system (see Appendix 1); at the end of the survey, students were given the option to sign up for an interview, which comprised the qualitative portion of the study. The original study was designed to approximate an explanatory-sequential design (Creswell & Creswell, 2017): the first phase involved quantitative data collection, and the second phase involved qualitative data collection to better understand the survey results.

Quantitative Survey

Participants responded to a 10-min post-course online survey with questions targeting student engagement and satisfaction. A total of 41 survey items were divided into five sections, as shown in Table 2. The first set of items requested student background information and was adapted from Owston et al. (2013). The second set of items was also adapted from Owston et al. (2013) and included statements designed to measure satisfaction with the specific course delivery (e.g., “The online and F2F course components of this course enhanced each other”) and overall course satisfaction (e.g., “Overall, I am satisfied with this course”). Participants were asked to respond on a 5-point Likert scale, where 1 = strongly disagree and 5 = strongly agree. The third, fourth, and fifth sets of items were adapted from Fredricks et al. (2005) and Owston et al. (2013) to probe students’ emotional, cognitive, and behavioural engagement, respectively, again on the same 5-point scale. At the end of the survey, students were given the option to volunteer for a 40-min semi-structured interview. In our data analysis, student quantitative responses were compared among the four different BL course offerings.

Table 2 Survey questions

Before each semi-structured interview, participants were given a letter of information. Interviews probed prior experience with online or blended learning courses, expectations of the course, general experience (course workload, likes, and dislikes), the perceived value of course resources and activities, difficulties the student faced, student motivation, feelings about other students/teaching assistants/instructor, and suggested improvements to the course. For a detailed list of the 20 questions included in the interview, see Appendix 2. Interviews were conducted with 9 students from the Chemistry course, 22 students from the Computing Science course, 15 students from the Human Geography course, and 2 students from the Math course.

Preparation for Analyses

There were five predictor variables in the quantitative study: background skills, behavioural, cognitive and emotional engagement, and type of blended course. The main outcome variable was student satisfaction with the course. A brief description of each of the predictor variables follows:

Self-reported background skills included GPA, high school average, workload (course load, laboratory load, and employment), motivation, time management skills, and peer support. The behavioural engagement subscale comprised five items (e.g., B32: “I was able to consistently pay attention in this course” and B33: “I followed the course schedule and completed non-graded activities”). The cognitive engagement subscale comprised six items (e.g., B26: “When I read or viewed course materials, I asked myself questions to make sure I understood” and B27: “If I was not understanding what I was learning in this course, I would go back and review the course materials”). The emotional engagement subscale comprised nine items, two of which were reverse scored (B15: “I felt isolated during this course” and B16: “I felt anxious in this course”). Type of blended course (chemistry, computing science, human geography, or math) also served as a predictor. Last, student satisfaction included measures of students’ satisfaction with the course and their preference for BL, as measured by whether they indicated they would take another BL course in the future.

The qualitative analyses included considering participants’ responses to the 20 interview questions mentioned above and in Appendix 2. All of the interviews were audio recorded and transcribed by personnel at the teaching support unit. Using a grounded theory approach (Strauss & Corbin, 1994), interviews were analysed with the NVivo12 software package (QSR, June 26, 2018). Based on research objectives, interview transcripts were analysed according to participants’ background skill, student engagement, and course components. Within each of these three categories, participants’ affective reaction was considered; that is, did students speak positively or negatively about their skills, engagement, and course components.

Survey data were analysed statistically and interview data were evaluated for themes. All statistical analyses were carried out in RStudio version 1.1.419 (RStudio Inc., Boston, MA, USA; R version 3.5.1 “Feather Spray”). The internal consistency of the behavioural, cognitive, and emotional subscales was evaluated using the cronbach function of the psy package. Correlation analysis was carried out using the rcorr function of the Hmisc package, and family-wise error was controlled using a Bonferroni correction. Multiple linear regression analyses were carried out to determine significant predictor variables for student satisfaction with the BL course; only factors significantly correlated with student satisfaction were included in this step. The relative importance of each predictor was assessed with the bootstrapping function of the relaimpo package, using the first, last, lmg, and pratt metrics with 1000 iterations. The categorical variable of student preference for BL course offerings was assessed using chi-squared goodness-of-fit tests.
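To make the pipeline concrete, the following is a minimal sketch in R of the reliability and correlation steps, under stated assumptions: it presumes a data frame `survey` with one numeric column per survey item, and the item groupings for the subscales are hypothetical placeholders (only B11, B15, B16, B26, B27, B32, and B33 are named in the text), not the authors’ actual code.

```r
# Minimal sketch of the reliability and correlation analyses (assumptions noted above).
library(psy)    # cronbach()
library(Hmisc)  # rcorr()

# Hypothetical item groupings for the two multi-item subscales.
ee_items <- survey[, paste0("B", 14:22)]  # emotional engagement (9 items)
ce_items <- survey[, paste0("B", 26:31)]  # cognitive engagement (6 items)

# Reverse-score the two negatively worded emotional items (5-point scale).
ee_items[, c("B15", "B16")] <- 6 - ee_items[, c("B15", "B16")]

# Internal consistency of each subscale.
cronbach(ee_items)$alpha  # reported as 0.84 in the Results
cronbach(ce_items)$alpha  # reported as 0.67 before item B28 was dropped

# Pearson correlations of the predictors with satisfaction (item B11),
# judged against a Bonferroni-adjusted significance threshold.
preds <- data.frame(EE  = rowMeans(ee_items),
                    CE  = rowMeans(ce_items),
                    B32 = survey$B32,
                    B33 = survey$B33,
                    satisfaction = survey$B11)
rc <- rcorr(as.matrix(preds), type = "pearson")
alpha_adj <- 0.05 / 12              # 12 comparisons assumed (0.05/12 ~ 0.0042)
rc$r["satisfaction", ]              # correlation coefficients
rc$P["satisfaction", ] < alpha_adj  # which correlations survive the correction
```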

For the thematic analysis of the interviews, two raters were trained to employ an independent but iterative process for coding each interview transcript, in order to reduce rater bias. Raters reviewed and analysed the transcripts for affective responses in each of the three categories (i.e., background skill, student engagement, course components) and for any other emerging theme relevant to the student learning experience. Coding schemes for the categories were operationally defined (available upon request from the first author). Rater consistency in coding, assessed on 10 of the 48 interviews, was 95.4% by simple agreement, without correcting for chance. Using Cohen’s kappa, a measure that corrects for chance agreement, inter-rater agreement was κ = 0.81, in the highest agreement band described by Landis and Koch (1977).
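As a point of reference, Cohen’s kappa relates observed agreement p_o to expected chance agreement p_e; the back-of-envelope derivation below (not reported by the authors) recovers the chance agreement implied by the reported figures:

```latex
\kappa = \frac{p_o - p_e}{1 - p_e},
\qquad
p_e = \frac{\kappa - p_o}{\kappa - 1}
    = \frac{0.81 - 0.954}{0.81 - 1}
    \approx 0.76
```

That is, with observed agreement of 95.4% and κ = 0.81, the coding scheme implies roughly 76% agreement expected by chance alone.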

Results

One main question guided the study: What is the relationship between students’ background and engagement for predicting student satisfaction, including preference for blended course delivery? In the following paragraphs, findings from the quantitative and qualitative analyses are presented.

Quantitative Analysis

Student preference for course delivery

As shown in Table 3, in response to the survey question “If the same course was being offered in different formats, which course format would you prefer?”, students indicated a statistically significant preference for BL course delivery over the other delivery options across subject matter, χ²(2, N = 37–232) > 46.5, p < 0.001. However, if course delivery options were limited to physically in-person (described in the survey as F2F) or online, then as shown in Table 4, students indicated a statistically significant preference for in-person over online delivery across subject matter, χ²(2, N = 37–232) > 8.34, p < 0.001.
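A goodness-of-fit test of this kind can be sketched as below; the counts are hypothetical, chosen only to be consistent with the Math course’s reported 86% preference for BL among its 37 respondents (see Table 3), with equal preference across the three formats as the null hypothesis.

```r
# Chi-squared goodness-of-fit test for delivery-format preference (item B7).
# Hypothetical counts: 32/37 ~ 86% preferring blended delivery.
prefs <- c(blended = 32, f2f = 3, online = 2)
chisq.test(prefs)  # default null: equal expected proportions (1/3 each)
```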

Table 3 Student responses to item B7
Table 4 Student responses to item B8

Predictors of student satisfaction

The emotional, behavioural, and cognitive engagement subscales were assessed for their association with student satisfaction. All nine items of the emotional engagement subscale were retained, with a Cronbach’s α = 0.84. The six items of the cognitive engagement subscale had a Cronbach’s α = 0.67; however, item B28 was excluded because its deletion improved internal consistency to α = 0.74. The five items of the behavioural engagement subscale had low overall internal consistency (Cronbach’s α = 0.26), so the subscale was not used. Instead, only two indicator items from the subscale were retained as the most relevant to the construct of interest: B32, “I was able to consistently pay attention in this course,” and B33, “I followed the course schedule and completed non-graded activities.” The other three behavioural engagement items were problematic because their wording included additional tasks that potentially confounded the behaviour of interest (e.g., B34: “I followed the course schedule and completed assigned activities [e.g. view videos, quizzes etc.]”).
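The item-deletion decision can be checked with the same psy::cronbach function named in the Methods; a minimal sketch, reusing the hypothetical `ce_items` data frame from the earlier sketch:

```r
# Alpha for the full six-item cognitive engagement subscale vs. with B28 dropped.
library(psy)
cronbach(ce_items)$alpha                                     # 0.67 reported
cronbach(ce_items[, setdiff(names(ce_items), "B28")])$alpha  # 0.74 reported
```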

Pearson correlation coefficients were calculated to examine the relationships of background, behavioural engagement (items B32 and B33), cognitive engagement, and emotional engagement with self-reported student satisfaction. Satisfaction was measured with item B11, “Overall, I am satisfied with this course.” Given the multiple correlations calculated, family-wise error was controlled with the Bonferroni method, yielding an adjusted significance threshold of α = 0.0042. As shown in Table 5, student satisfaction was significantly correlated with emotional engagement, cognitive engagement, and B32, one of the items measuring behavioural engagement. Additionally, there was a significant, moderate positive correlation between emotional and cognitive engagement in all samples (r = .49–.54, p < .001).
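The adjusted threshold is consistent with twelve pairwise comparisons, an inference from the reported value since the exact count is not stated:

```latex
\alpha_{\text{adj}} = \frac{\alpha}{k} = \frac{0.05}{12} \approx 0.0042
```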

Table 5 Factors correlated with student satisfaction

Multiple regression analyses were conducted by course to test whether student engagement and background skills served as statistically significant predictors of participants’ ratings of satisfaction. As shown in Table 5, for Chemistry there were three significant predictors of student satisfaction: emotional engagement, behavioural engagement (B33), and workload (A7); adjusted R² = 0.49, F(4, 227) = 27.88, p ≤ 0.0001. For Computing Science there were two significant predictors, emotional engagement and cognitive engagement; adjusted R² = 0.639, F(3, 145) = 88.29, p ≤ 0.0001. For Human Geography there were two significant predictors, emotional engagement and behavioural engagement (B32); adjusted R² = 0.543, F(3, 214) = 87.05, p ≤ 0.0001. For Math there was one significant predictor, emotional engagement; adjusted R² = 0.518, F(3, 33) = 13.89, p ≤ 0.0001.
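A per-course regression with bootstrapped relative importance, as described in the Methods, might look like the sketch below; the data frame `preds_all`, its `course` column, and the exact predictor set are illustrative assumptions rather than the authors’ code.

```r
# Per-course multiple regression and relative-importance bootstrapping.
library(relaimpo)

chem <- subset(preds_all, course == "Chemistry")
fit_chem <- lm(satisfaction ~ EE + CE + B33 + A7, data = chem)  # 4 predictors, as in F(4, 227)
summary(fit_chem)  # adjusted R^2 reported as 0.49 for Chemistry

# Bootstrapped shares of explained variance (first, last, lmg, pratt metrics).
boot <- boot.relimp(fit_chem, b = 1000,
                    type = c("first", "last", "lmg", "pratt"))
booteval.relimp(boot)  # confidence intervals for each predictor's share of R^2
```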

Qualitative Analysis

As mentioned previously, interview transcripts were analysed according to the following content categories: background skill, student engagement, and course components. Thematic evidence was found for each of these categories as described in the following paragraphs. As well, Table 6 includes sample quotes from students in relation to course components and engagement, especially cognitive and emotional engagement.

Table 6 Sample quotes from interviews

Course components

Students primarily talked about the types of instructional material, especially the online instructional videos and the lecture components. The most frequently mentioned and coded component was online delivery (50.1%). Mentions of online delivery pertained to the instructional videos (35%), followed by lecture (34.6%), laboratory (11.9%), and seminar (11.6%). Most students enjoyed having access to the online videos as a method of content delivery (30 of 43), with 22 students indicating the videos gave them the opportunity to “listen to it over, and over, and over”. However, students also indicated interest in the physically in-person lecture for the purpose of reviewing material and asking questions. As one student stated, “it doesn’t really matter how many times you watch the video, you may not be able to understand it, because sometimes you just need to hear it in different words.” Having the in-person lecture allowed students to “go to the professor and have [them] explain it.” Additionally, students noted the value and usefulness of a strong online instructor presence.

Student engagement

Almost half of all comments (47%) in this category pertained to emotional engagement, often involving the instructor (17.6%) or other students (12.7%). Cognitive engagement and behavioural engagement were mentioned equally often (26.5% each). Approximately half of the comments involving cognitive engagement dealt with motivation (14.2%), and a third of these identified the instructor as playing a role in engagement (5.5%), particularly in motivation and resiliency. Two students indicated that an instructor’s negative response to their attempts to learn from a failed exam resulted in their disengagement from the course and from further efforts to reflect on and correct their mistakes. Two students in the Math course and two students in the Computing Science course indicated that observing the instructor make simple mistakes in problem-solving activities helped build confidence and reduced their anxiety in learning the skill. For example, one student spoke about a past experience with another instructor and the role that instructor had in student engagement:

“My English professor, and I’m usually not an English person, but he’s motivated me to go look up all these things, and do all the work, which is very strange considering I didn’t do that well, and now I’m like, I actually like this. He engages, he, you know, motivates me to look up these things that he didn’t teach them in class, he just showed me the way.”

Student perceptions of peer interactions were largely framed in two narratives: comparative or cooperative. Students framed their negative peer interactions in comparative terms, citing personality, performance, or lack of partner choice as sources of dissatisfaction. For example: “I switched into a lab where I had a friend of mine in the course, … I wanted to work with her … [but] I was paired, automatically paired up with someone else … And didn’t really get a say in that process”. There were also upward comparisons, where the partner was perceived to be more competent but this resulted in worse learning outcomes due to a lack of cooperation, e.g., “I had a partner that was really good at programming, which was nice because he knew what was going on, and he could help me out. But … I’m still learning and he was ten steps ahead of me. So, I would… go back, and I would have to figure it out on my own.” Interactions were positive when framed as cooperative, resulting in productive discussions where students could “bounce ideas” off each other and felt “more comfortable [having discussions] around their peers”, with some students motivated to contribute in a spirit of “not letting my team down”. Two excerpts regarding the two-stage exams in Math exemplify this:

Student A: “You could hear other people talking to their neighbor, do you know what this means, and they’d go, yeah, it’s this, and they’d explain it really quickly. So, having those people in lecture was actually really helpful.”

Student B: “You know always leave the classroom and then you gossip with your friends, oh did we get that one right, so this actually gives you the opportunity to have that valuable discussion in your class, and actually having it reflect on your marks, which I think it’s interesting, so I’m glad there’s a way to kind of make discussing your math test with your friends acceptable.”

Background skills

Over half of all student comments in this category pertained to workload (55.4%), followed by peer support (38.9%). Regarding peer support, the majority of students indicated that they relied on other students when they encountered difficulties and problems in their courses. Additionally, many students found that having supportive peers in class resulted in better discussions and improved understanding of course concepts, leading to feelings of belonging and an overall more positive experience.

Collective Themes of Category Inter-relationships

The coding results revealed complex inter-relationships among themes. The most common theme in students’ comments was that their engagement and sustained motivation depended on the instructor’s behaviour in the classroom and interactions with the students. In response to interviewer prompts, students stated that the instructor’s presence and behaviours mattered to their feelings of engagement or disengagement in course activities. Students also mentioned student-instructor interactions such as the accessibility of instructors via email and office hours; instructor willingness to address and change in response to student feedback; instructor or departmental presence in instructional videos; and the overall quality of interactions with instructors and teaching assistants. In Human Geography, the instructor’s practice of meeting students between quiz attempts produced strong emotional engagement among students, leading to positive attributions about the instructor. Surprisingly, in response to an interviewer probe about whether the course instructor was easily contacted by email, one student responded, “Yes, he was open, and very good at responding [to] emails, it’s like, he, I guess I’ve never emailed him, but he says that he’s very, he wants our emails.”

A second inter-relationship was the effect of the design and organization of online content on students’ engagement and satisfaction with the course. Bernard and colleagues (2009) have also demonstrated the importance of design and organization for students’ course experience, finding that student-content interaction was positively associated with qualitative measures of achievement outcomes. During the interviews, students attributed their positive and negative experiences with the course components (e.g., forum, online videos, seminars, and lecture) to the instructional team. For example, students faulted the instructor for a perceived lack of interest even when the instructor had limited control over and responsibility for specific online components, such as website functionality and third-party quizzes.

A third inter-relationship was the disagreement among students about the perceived value of class activities. For example, two students revealed opposing views on the value of low-risk assignments:

Student A: “There’s also a disconnect between how much work I was putting into a very small assignment. It’s like I have to watch 90 minutes for 2% of my mark? It’s like no, I’m not gonna do that, it’s not worth my time.”

Student B: “[Instructor] repeats how the course is composed of a lot of small assignments, and [this] reinforces the idea of incremental knowledge.”

These disagreements appeared to surface when students had a poor grasp of the learning objective behind a class activity, including pre-assessment tasks, the use of instructional videos, lecture activities, repetitive problem-solving activities, and low-risk formative assessments. Pre-class videos or readings were used to communicate fundamentals and free up class time for more complex concepts. However, many students misunderstood the objective of these online components, believing instead that the online material simply replaced lecture time. These misunderstandings of the purpose and design of class activities indicate that the purpose and expectations of the BL approach need to be clearly communicated to students.

A final inter-relationship pertained to the use of active learning practices and the enjoyment students felt as active participants during physically co-located lectures. Students indicated an overall appreciation for the use of live-response devices in class as a method of periodic engagement. Additionally, students mentioned being strongly in favour of in-class collaborative activities such as “pair and share” and group discussion activities. For example, the following two students contrasted instructors taking a passive role and an active role in the classroom:

Student C: “It’s a little hard … she likes to talk and then she doesn’t ask, do any of you guys have questions, and then she keeps talking without a break.”

Student D: “I liked that the instructor was really engaging, and we had a lot of opportunities to like talk with your neighbour for two minutes and try to come up with an [answer].”

Discussion

The purpose of this study was to better understand students’ experiences after university courses were changed from physically co-located to blended delivery. We studied students’ background, engagement, and course satisfaction, and included questions about their preference for method of course delivery. Before discussing the results, however, important limitations need to be recognized. First, this study involved secondary data analysis, and did not involve experimental manipulation or even control of student characteristics across the four different courses. Our analysis of the qualitative interviews provides clues to students’ quantitative survey responses, but the present research should be followed up with additional classes, diverse samples, and more rigorous techniques for isolating the specific variables within BL courses that do or do not lead to student satisfaction. We make only tentative recommendations in the paragraphs that follow, because causal claims are not warranted by our study design.

Overall, our quantitative and qualitative results provide provisional support for the importance of emotional engagement above all other forms of engagement in predicting students’ self-reported satisfaction with BL courses. Although cognitive and behavioural engagement were also predictors of student satisfaction, they were statistically significant only for particular courses; emotional engagement was a significant predictor in all four courses. This suggests implementing teaching practices that promote positive emotional engagement in the BL environment: for example, students’ emotional engagement appeared to be enhanced when the instructor modelled vulnerability by showing mistakes, was open to meeting, used collaborative active learning, and established clear learning objectives. In the sections that follow, these and other predictors of student satisfaction are summarized. Connecting this result to the theoretical foundations guiding the study, it suggests that capturing students’ cognitive presence is not enough; emotional presence is just as consequential.

Predictors of Satisfaction

The present study found that almost none of the examined background factors consistently predicted student satisfaction. Of the seven items, only A7, “How many courses with a laboratory component are you taking this semester?”, was predictive of student satisfaction, and then only in Chemistry. This is surprising, since other studies have found workload and high school grades to predict student outcomes (Pike et al., 2008; Roksa & Velez, 2012; Jacobs & King, 2002; Goldrick-Rab & Han, 2011; Andrews, 2018). It is possible that the large degree of homogeneity in student responses reduced the predictive value of items A1–A7. For example, 620 of 692 students reported taking 4 or 5 courses a semester (the amount required for full-time student status). Likewise, 415 of 692 reported no work hours outside of school; of those who did report working, 50% indicated working 10 h or less a week, and only 66 students reported more than 20 h.

In terms of engagement, the most frequent predictor of student satisfaction across all four courses was emotional engagement (see Table 5). Measured cognitive engagement and item B32, “I was able to consistently pay attention in this course” (an indicator of behavioural engagement), were inconsistently correlated with student satisfaction across the four courses. Student satisfaction in Chemistry was predicted by behavioural and emotional engagement but not by cognitive engagement. Chemistry was only partially blended and had fewer in-class active learning and collaborative learning opportunities compared with the other three courses. However, this does not account for the failure of cognitive engagement to serve as a predictor, since Human Geography and Math were fully blended and cognitive engagement was not a predictor of student satisfaction in these courses either. It is possible that these courses presented material that was covered in high school or was not very challenging to students. Thus, cognitive engagement may be less important in these courses than in Computing Science, a subject that might not receive much attention in high school.

The regression analyses revealed that only emotional engagement was predictive of student satisfaction in all four courses (see Table 5). This result is supported by previous research linking the effectiveness of BL to its ability to keep students engaged, motivated, and on task through instructional elements that encourage interactions among learners (Means et al., 2013). Although we did not quantitatively measure interaction among learners, we speculate, based on the qualitative findings, that encouraging such interaction may be most useful as a way to create opportunities for building emotional engagement.

Satisfaction and Engagement Through Peer Engagement

Our qualitative thematic results suggest that incorporating collaborative active learning processes in BL courses is positively associated with student satisfaction and with reduced social isolation. Aligned with expectations from the CoI (Garrison et al., 2010), this result is supported by similar findings in online teaching settings (Bernard et al., 2009). One would expect positive student satisfaction with a specific form of course delivery to be associated with a willingness, or even a preference, to participate again in that form of delivery. Confirming this, we found that students in all four courses reported a strong preference for the BL format (item B7: 60.5% overall, versus 33.3% expected by chance). Math students showed the highest preference for the BL format (86%; see Table 3). Interestingly, the Math course was also the course with the largest portion of class time and grade weight devoted to collaborative work (“pair and share” activities and collaborative two-stage exams). Peer interaction and group work were prominent themes in the thematic analysis, suggesting that student-student interactions through collaborative learning are positively associated with student satisfaction. This interpretation is supported by So and Brush (2008), whose findings suggested that perceived levels of collaborative learning were predictive of satisfaction. It is also consistent with the argument by Trninic et al. (2018) and Melser (2004) that the human desire to learn can arise fundamentally from a drive toward working with others, emphasizing the importance of community creation. Furthermore, undergraduate science student retention and learning outcomes improve when student engagement and active learning are incorporated in courses (National Research Council, 2012; Handelsman et al., 2004; Wieman, 2007; Anderson et al., 2011; Freeman et al., 2007, 2014).

The strong preference for BL found in this study is of particular importance, as other studies have indicated that students engaged in BL report reduced satisfaction compared with physically in-person classrooms (Sikora & Carroll, 2002). In addition, studies assessing the effect of technological training indicate that only about 50% of students report a preference for BL over physically co-located delivery (Poon, 2012). However, stronger student-student relationships have been linked to greater peer support and retention, as well as increased satisfaction among visible minority groups (Terenzini et al., 1997). There is evidence that social isolation in DE courses leads to greater confusion, anxiety, and frustration due to a perceived lack of clear feedback from instructors (Hara & Kling, 2001). Hassana and Woodcock (2014) have also suggested that the lack of physically in-person interaction in BL reduces social cues, which might reduce the effectiveness of feedback as well as the communication and validation of students’ experience (Hara & Kling, 2001). Based on the theoretical literature (e.g., the CoI) and our findings, we suggest that incorporating collaborative active learning processes in the BL environment might alleviate the issues reported in DE by fostering student-student relationships, reducing feelings of social isolation, and increasing student satisfaction. Indeed, the CoI framework explicitly identifies the development of social presence as critical to the educational experience.

Emotional Engagement by the Instructor

Our findings suggest that student satisfaction in BL courses is less dependent on the instructor’s ability to deliver content and more dependent on students’ perceptions of the instructor’s emotional openness and vulnerability and the creation of a supportive learning environment. This is an interesting result, in our view, because it underscores that engagement in learning is more than simply being interested in or understanding content. Referring back to the two implications of sociocultural theory drawn from Cole and Wertsch (1996), we can observe an interaction between tools and context within the BL environment; context must be explicitly inclusive of cognitive and emotional aspects. Making sense of human learning within BL environments must consider not only the physically in-person content and online access to instructional materials but also the emotional character of the contact established by the instructor with students in this hybrid medium of instruction. Sociocultural theory emphasizes that human learning occurs within historical, cultural, and institutional contexts. We would add the technological context to this list. Personal interactions within every context, whether historical, cultural, institutional, or technological, may determine instructional effects on learners.

When student interactions with the instructor increasingly build trust for scaffolding and learning, students are willing to ask questions, view feedback as constructive, and attempt more difficult tasks (Carless, 2013). In other words, emotional presence helped fortify the social presence. In our thematic analysis, students readily identified with the vulnerability demonstrated by instructors who adjusted their teaching practices in response to student feedback, as well as by instructors who acknowledged their own mistakes. Students perceived instructors as open when they used examples from personal experiences, repeatedly invited students to email and attend office hours, frequently reminded them of deadlines and exam dates, and reiterated learning objectives. Although this repetitious approach may seem paternalistic and likely to undermine the development of independent learning, several students described these experiences as emotionally engaging.

Students also expressed feeling that their learning was supported when instructors used low-risk summative and formative assessment practices in the form of live-response devices, group discussions, and multi-attempt assessments. These practices elevate personalized experiences and place the emphasis on reflective learning rather than on the punitive consequences of misconceptions or errors. For example, in the Human Geography course, week-long quizzes were administered in which students could meet with the instructor for feedback before attempting the quiz a second time. This is labour-intensive and may not be suitable for all courses. However, similar lower-risk automated online quizzes might replicate these characteristics to some degree, although this could dampen students’ emotional engagement if they perceive the feedback to be depersonalized. These conjectures need to be empirically investigated.

Conclusions

Our findings reflect the utility of frameworks such as the CoI (Garrison et al., 2010) in making sense of students’ satisfaction in BL environments, as well as the overall relevance of sociocultural theory in helping us understand the complexity of learning. Moreover, our findings support some of the recommendations for BL already made by previous researchers such as Bernard et al. (2009), Wu et al. (2010), and Nortvig et al. (2018). However, we emphasize contextualizing these recommendations within a broader awareness that students may need to feel emotionally engaged with each other and with the instructor before they can make the most of the technological tools they have been given to learn.

The first recommendation is that instructors create a strong online presence, not only because it keeps students on task but also because it highlights the vitality of the online community. The second recommendation is to use active learning via collaborative activities, such as think-pair-share and online discussions, to create a dynamic learning environment that promotes connection and positive emotional engagement (Wu et al., 2010; Bernard et al., 2009). These activities should explicitly de-emphasize ‘right answers’ and emphasize the reflective purpose of peer feedback as a method of engaging with course material at higher metacognitive levels. The third recommendation is to explicitly relate learning goals to course content. If students do not understand or see the connection between course goals and the delivered content and learning activities, they may disengage. The qualitative evidence in this study indicated decreased student cognitive engagement as a result of unclear alignment between learning objectives and learning activities; unclear alignment might also reduce emotional engagement if students perceive it as instructor disinterest.