12.1 Introduction

With personal web-enabled devices having become increasingly prevalent, new ideas for integrating technology into university curricula have surfaced over the last two decades. The depth of integration ranges from simply using a browser to conduct research on the Web as part of a learning activity to courses that are offered fully online. The terms mixed-mode learning, blended learning and hybrid learning are often used interchangeably to describe a combination of online and face-to-face teaching (Martyn, 2003; Snart, 2010). In this study, we are concerned with mixed-mode learning in its extreme form, called HyFlex mode (Beatty, 2007), in which students can choose to complete any part of a course in online and/or face-to-face mode. This format not only allows students to choose flexibly between the two modes but also to make use of both modes, if they so desire, for additional learning support. Such flexibility is not offered in traditional blended or flipped classroom instruction, in which the online and face-to-face parts of a course complement each other but usually do not overlap (Garrison & Kanuka, 2004; Hill, 2012; McGee & Reis, 2012; Singh, 2003).

Despite the benefits of added flexibility, HyFlex comes with two unique challenges, in addition to those inherent to online and face-to-face instruction individually. Firstly, students should have the same opportunities for learning in either mode and should not be disadvantaged by choosing one mode over the other. Specifically, students should have equitable access to learning resources, tools to complete learning tasks and learning support. Secondly, the way that active learning strategies for student engagement, such as feedback, classroom response systems or collaborative activities, can be implemented in a face-to-face delivery often differs from how they can be implemented in an online delivery. Therefore, student engagement measures need to be customised for each delivery mode. In the following sections, we describe how we addressed these two challenges for a practical information technology course offered in HyFlex mode at an Australian university, guided by the following key questions: What methods can be employed to ensure equity and enhance engagement in a practice-oriented HyFlex course? What are students’ perceptions of the choice and implementation of these methods?

12.2 Background

HyFlex Mode

The term HyFlex refers to courses that are designed in online mode as well as face-to-face mode (hybrid) and allow students to complete any part of the course in either or both of these modes (flexible) (Beatty, 2013). The motivation for this format is to offer the benefits of online mode, such as the convenience of studying in any place at any time (Bertram, 1999) and making a course available to students with significant commitments besides study (Robinson, 2005), while still retaining the optional benefits of face-to-face instruction, such as immediate feedback (Hattie & Timperley, 2007) and synchronous communication (Vonderwell, 2003). As HyFlex has received widespread recognition only recently, little research on its effectiveness exists to date. Initial studies by Kyei-Blankson and Godwyll (2010) and Nur-Awaleh and Kyei-Blankson (2010) concentrated on student satisfaction with the format per se. Miller, Risser and Griffiths (2013) investigated student satisfaction as well as student performance in the implementation of HyFlex for the lectures of a statistics course. The implementation consisted of live streaming of the face-to-face lectures, polling through a student response system and a chat room as an additional communication channel. The authors found that student performance with respect to learning outcomes or grades was not negatively affected by the format and that the majority of students preferred face-to-face lectures supplemented by additional instructional methods. Abdelmalak and Parra (2016) designed a graduate course in HyFlex format and observed that the flexibility of attendance mode suited the adult learners’ life circumstances. The authors further found that the flexibility accommodated learners’ different learning styles and strategies and empowered learners with more control over their learning. Taylor and Newton (2013) looked at HyFlex in an institution-wide context and provided principles to guide HyFlex curriculum development in higher education institutions. Yuskauskas, Shaffer and Grodziak (2015) reported on the transition of the courses in an undergraduate program to HyFlex mode, finding that the majority of students attended in a mix of online and face-to-face modes.

In general, existing research suggests that HyFlex is a promising format, as students liked the associated flexibility (Abdelmalak & Parra, 2016; Beatty, 2007, 2013; Miller et al., 2013; Romero, Chávez, & Gutiérrez, 2016; Taylor & Newton, 2013) and their choice of attendance mode had no significant impact on their grades (Lakhal, Khechine, & Pascot, 2014; Miller et al., 2013). Factors to consider for a successful implementation of HyFlex are instructor training (Beatty, 2013; Romero et al., 2016; Taylor & Newton, 2013) and suitable technology (Abdelmalak & Parra, 2016; Liu et al., 2014; Miller et al., 2013; Taylor & Newton, 2013). In addition, as the flexibility of HyFlex places the onus on the students to make the “right” attendance choice, students should be made aware of and receive guidance regarding their attendance choices (Abdelmalak & Parra, 2016; Gounari & Koutropoulos, 2015; Inglis, Palipana, Trenholm, & Ward, 2011; Taylor & Newton, 2013). Students have different learning strategies that may be more or less supported by a given delivery mode (Abdelmalak & Parra, 2016; Inglis et al., 2011). Particularly in online mode, reduced interaction with instructors and a lack of self-regulation ability may lead to lower course performance or even withdrawal (Deimann & Bastiaens, 2010; Guglielmino & Guglielmino, 2001; Shedletsky & Aitken, 2001). Therefore, the design of HyFlex courses should focus on two aspects: equity between online and face-to-face mode, to eliminate potential disadvantages as much as possible when students choose one mode over another (Beatty, 2007), and engagement measures appropriate for each mode, recognising the strengths and weaknesses of each mode and providing tailored support for student learning (Platt, Raile, & Yu, 2014; Song, Singleton, Hill, & Koh, 2004).

Equity

Platt et al. (2014) suggested that “a central concern” (p. 290) of the critiques about the wide adoption of online delivery is whether it offers sufficiently equivalent learning experiences compared to traditional face-to-face delivery. Their study revealed that, in students’ overall perception, the online and face-to-face modes offer different learning experiences. In particular, the students perceived that online learning provides more flexibility while face-to-face learning allows a greater level of interaction and knowledge gain. HyFlex mode provides an even higher level of flexibility, but ensuring that online students are not disadvantaged regarding interaction opportunities and knowledge gain remains a challenge.

Based on best practice in online delivery design across Australia and the UK, Stone (2017) developed ten recommendations for improving online learning outcomes. Design ideas suggested to encourage online interactivity include purpose-made short videos, information presented in multiple ways and discussion boards. Carr-Chellman and Duchastel (2000) also pointed out the importance of designing mini-lectures for online students, rather than long recordings of the full face-to-face lectures. Using a variety of media and appropriate technologies is emphasised in the Standards for Online Education developed by Parsell (2014). Video conferencing and discussion boards are useful tools to enhance synchronous and asynchronous interaction between students and instructors and among students in online courses (Carr-Chellman & Duchastel, 2000; Luo & Clifton, 2017). The fundamental aspects of a course should also be made equitable for all students, for example, by providing online submission points for assessments (Griffith University, 2013), including a detailed study guide on the course site (Carr-Chellman & Duchastel, 2000) and developing clear instructions on learning activities (Yuskauskas et al., 2015).

Engagement

HyFlex mode has the unique advantage of engaging students with learning in various ways, especially compared to fully online mode. A recent report by the Australian Department of Education and Training (2017) showed that the nine-year completion rate of HyFlex (called multi-modal therein) study, around 70%, was close to that of face-to-face (internal) study, around 76%, and much higher than that of online (external) study, around only 46%. Yet, the benefits of the diverse teaching methods and the flexibility of HyFlex study can also be diminished by negative factors such as technical complexity and problems with instructional materials (Yoo & Huang, 2013). Another issue is students’ lack of reflection while occupied with learning activities (Landis, Scott, & Kahn, 2015).

Using ePortfolios for reflective learning has been widely studied (Buyarski et al., 2015; Morreale, Van Zile-Tamsen, Emerson & Herzog, 2017; Mummalaneni, 2014). ePortfolios act as collections of various kinds of artefacts and can thus be a perfect match for regular reflections to support students’ learning routines (Roberts, Maor, & Herrington, 2016). Also, ePortfolios can be used as an assessment tool for “planning, synthesising, sharing, discussing, reflecting, giving, receiving and responding to feedback” (Joyes, Gray, & Hartnell-Young, 2010, p. 2). A study by Landis et al. (2015) found that “educators who adopt ePortfolios invariably seem surprised by the importance of reflection” (p. 117), but at the same time, the reflection tasks need to be carefully designed and students need to be supported to “learn to reflect” (p. 117).

Kearney’s (2013) approach of combining authentic assessment with self- and peer-assessment demonstrated that students could “successfully reflect on the value and quality” of learning (p. 888), become more involved in course interaction and increase their ability to learn independently. Authentic assessment refers to assessing students’ technical and generic skills in settings that relate to real-world scenarios. Even without being coupled to self- or peer-assessment, authentic assessment is a valuable way to engage students and enhance their learning (Boud, 2000; Lizzio & Wilson, 2004). Similarly, peer feedback on its own is effective in raising student engagement and supporting learning (Falchikov, 2001; Hanrahan & Isaacs, 2001; Liu & Carless, 2006; Willey & Gardner, 2010).

Setting

The course considered in this study was part of an undergraduate Information Technology degree program at an Australian public university. The course was offered on two campuses and covered information security and information management topics. Across both campuses, 91 students were enrolled in the course, mainly in their second year of the degree. It was the first offering of this course. The decision to conduct the course in HyFlex mode was based on a desire to provide students with more flexibility in their learning.

12.3 Methodology

Table 12.1 provides an overview of the course dimensions. The conceptual content was presented in the form of lectures. Each week, students could attend a two-hour face-to-face lecture and/or view one to two short video lectures of approximately 15 minutes each. The practical content was presented as written instructions, which students could complete face-to-face in a computer laboratory or at home using their own computer. Support was provided by a tutor in the laboratory and via an online discussion forum. The assessment comprised four different types: ten learning journal entries with reflections on the content learned in the lectures, ten laboratory journal entries with reflections on the practical activities, three quizzes with multiple-choice/true–false questions and two assignments to be completed in groups of up to three students, with the first assignment being peer-reviewed by at least one other group. All study materials and assessment submission points were available online through the university’s learning management system. The HyFlex format was explained to students in the first week of the course. In the following, we present our methodology for addressing the challenges outlined above across the course dimensions.

Table 12.1 Overview of course dimensions

12.3.1 Equity

We begin by describing our approach to ensuring equity between face-to-face and online students in completing the learning activities and accessing learning support. Table 12.2 gives an overview of the equity measures in our approach.

Table 12.2 Overview of equity measures

12.3.1.1 Concepts

A major challenge in guaranteeing equity was to achieve a balance between the conceptual content delivered face-to-face and online. Even though each face-to-face lecture was recorded, we developed short video lectures for the concepts of each week, since online students may not find the lecture recordings useful (Abdelmalak & Parra, 2016). We aimed to release the online video lectures around the same time as the face-to-face lecture, which was held on the same day across both campuses. In fact, we usually released the video lectures before the face-to-face lecture, so that students who were interested in watching the video lectures in addition to the lecture could watch online before attending face-to-face. Specifically, this gave them the opportunity to receive face-to-face feedback on questions they had encountered while watching online. Through the lecture recordings, online students could see and hear what was discussed in the face-to-face lecture. Consequently, both online and face-to-face students had access to the face-to-face lecture, either by attending in person or by watching the recording, as well as access to the online video lectures.

Another challenge for equity was how to address students’ questions regarding the lecture content in an equal manner for both online and face-to-face students. Any questions that students posed during the face-to-face lectures were discussed on the spot. While these discussions were included in the lecture recordings, we needed to replicate the opportunity for online students to ask their own questions. We therefore set up a course discussion forum on the Piazza platform. Students could post their questions on this forum and receive replies not only from the instructors but also from other students in the course, as shown in Fig. 12.1. Any correct student answer could be endorsed by the instructors and labelled as such. By allowing other students to reply, rather than just the instructors, the turnaround time to answer a question could be decreased. Studies have shown that the use of Piazza in a course can foster collaborative learning, increase student interactivity and establish a sense of community (Blooma, Kurian, Chua, Goh, & Lien, 2013; Grasso, 2017).

Fig. 12.1 Sample question and answer thread on Piazza, with permission from Piazza

12.3.1.2 Practice

The first issue for equity regarding the practical content was to find suitable software for exercise tasks. As is common in modern information technology curricula, students should have the opportunity to practise the actual implementation of course concepts, that is, writing code. Yet, practising the implementation of security concepts usually requires well-equipped, dedicated security laboratories, since students need elevated privileges to run certain code and isolated software environments to simulate security attacks. Such laboratories were not suitable for our course, as we aimed to provide online students with equal access to the laboratory exercises. As a result, we needed to develop the practical tasks on a more accessible platform that could simulate the concepts delivered in the course and engage students in learning. For the information security parts of the course, we mainly used the SEED lab, which includes a virtual Linux environment (see Fig. 12.2) and learning activities related to information security. For the database management parts, we settled on phpMyAdmin in conjunction with Codeanywhere, which allowed students to create and manage a database in the cloud. In particular, we ensured that both solutions could be obtained free of charge, ran on Windows and Mac machines, and had at least some support documentation on the Web for further reference.

Fig. 12.2 Instructions to start the SEED Linux Environment, with permission from Wenliang Du
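To make the nature of these exercises concrete, the following is a minimal, self-contained sketch of the kind of task students completed in the database management laboratories. It is illustrative only: it uses Python’s built-in sqlite3 module as a stand-in for the MySQL database that students managed through phpMyAdmin on Codeanywhere, and all table, column and account names are hypothetical rather than taken from the actual course materials.

```python
# Illustrative laboratory-style exercise combining the course's database
# management and information security themes. sqlite3 stands in for the
# MySQL backend students used via phpMyAdmin; all names are hypothetical.
import hashlib
import os
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database, no server needed
cur = conn.cursor()

# Task 1: create a table for user credentials.
cur.execute("""
    CREATE TABLE users (
        username TEXT PRIMARY KEY,
        salt     BLOB NOT NULL,
        pw_hash  BLOB NOT NULL
    )
""")

# Task 2: insert a user, storing a salted hash instead of the plain
# password, linking the database content to the security concepts.
def add_user(username: str, password: str) -> None:
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    cur.execute("INSERT INTO users VALUES (?, ?, ?)", (username, salt, pw_hash))

add_user("s1234567", "correct horse battery staple")

# Task 3 (reflective question): why does the parameterised query above
# prevent SQL injection, whereas string concatenation would not?
print(cur.execute("SELECT username FROM users").fetchone())
```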

A second equity issue for the practical content, as with the delivery of concepts, was how to provide timely support for students in both cohorts. To assist student learning in the practical parts, we prepared detailed, step-by-step instructions with ample screenshots for each week. We further included links to external resources on the Web. Having detailed instructions was an important part of our scaffolded approach (Simons & Klein, 2007). Research has shown that students who are given step-by-step instructions the first time they attempt a new task, rather than partial or no guidance, perform better at solving similar tasks in the future (Carroll, 1994; Sweller & Cooper, 1985). Furthermore, students who are learning about the implementation of a new concept can often become demotivated when they encounter unexpected difficulties. This was true in particular for our online students, who did not have the immediate support of a tutor in the laboratory. To address this issue, we set up the course discussion forum (see above), where students could post their questions and receive answers from the instructors or other students.

12.3.1.3 Assessment

Equity in assessment is critical and involves providing assessment platforms that are equally accessible to both online and face-to-face students. We therefore decided that all assessment items could be submitted online so that online students would not be disadvantaged. The weekly learning and laboratory journals were submitted on the PebblePad platform. The quizzes were completed on the university’s learning management platform (Blackboard Learn). Since arranging a proctored setup for monitoring the quiz submissions of online students would have required too many resources, we had limited control over the conditions under which online students completed the quizzes. Thus, we designed the quizzes as open-book, but imposed a maximum completion time and adjusted the difficulty accordingly.

Both assignments could be completed by students individually or in a group of up to three. Since teamwork was an important skill we aimed to build through the assignment projects (Sabin & Sabin, 1994; Williams, Wiebe, Yang, Ferzli, & Miller, 2002), we provided an online platform to allow both online and face-to-face students to collaborate in a group in real time. We investigated the functionality of PebblePad to support real-time, online document collaboration, but found that only one user at a time could edit a document. We determined that this would be too restrictive and instead instructed students to use established online document collaboration software, such as Google Docs or Office 365, to prepare their assignment documents. Students would then submit their final version and receive feedback on it via PebblePad or Blackboard, thus ensuring an equitable setting for face-to-face and online students.

12.3.2 Engagement

The second major focus in our course design was student engagement. We applied a range of techniques to increase engagement in the face-to-face and online delivery modes, as given in Table 12.3.

Table 12.3 Overview of engagement measures

12.3.2.1 Concepts

One approach we used during the delivery of conceptual content was reflective questions. In the lectures, we encouraged active learning by regularly stopping the presentation and asking students to reflect on the content that was just presented. For example, we would ask students to explain in their own words the difference between two concepts just introduced, to connect concepts from theory to the real world, or to relate some concepts to their own experience. This approach had two aims (Campbell & Blair, 2018): on the one hand, we wanted to stimulate students’ higher-order thinking abilities, and on the other hand, we wanted to find out whether we had explained certain concepts clearly enough.

The main challenge in replicating this active learning technique for the online students was that they would be watching the corresponding video lectures remotely and asynchronously, that is, without access to direct interaction during their viewing. We addressed this challenge by using the online tool Vizia to embed our active learning questions in the video lectures. Each embedded question would pause the video lecture and overlay the question text (see Fig. 12.3). Students would have to choose one of the given answers. The video lecture would then resume and explain the correct answer. All answers were automatically recorded in a spreadsheet. We used this data to analyse which questions were answered mostly incorrectly by the students, indicating that those concepts were perhaps not explained clearly enough in the video lecture.

Fig. 12.3 Reflective question embedded in video lecture
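The analysis of the recorded answers can be sketched as follows. Since we do not document Vizia’s export schema here, the file and column names in this snippet are assumptions; the point is simply the logic of flagging questions with high error rates as candidates for clearer explanation.

```python
# Minimal sketch: flag embedded questions that most students answered
# incorrectly. The CSV layout (question_id, chosen_answer, correct_answer)
# is an assumption, not Vizia's actual export format.
import csv
from collections import defaultdict

attempts = defaultdict(int)
errors = defaultdict(int)

with open("vizia_answers.csv", newline="") as f:
    for row in csv.DictReader(f):
        qid = row["question_id"]
        attempts[qid] += 1
        if row["chosen_answer"] != row["correct_answer"]:
            errors[qid] += 1

# Questions with an error rate above 50% may indicate concepts that were
# not explained clearly enough in the video lecture.
for qid in sorted(attempts, key=lambda q: errors[q] / attempts[q], reverse=True):
    rate = errors[qid] / attempts[qid]
    if rate > 0.5:
        print(f"{qid}: {rate:.0%} incorrect ({errors[qid]}/{attempts[qid]})")
```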

Another approach we applied for student engagement was learning journals for the weekly lectures. To reinforce the concepts presented during the face-to-face or video lectures (Green, Wyllie, & Jackson, 2014), students were asked to complete a learning journal. Each entry would reflect on the previous lecture and was incentivised with marks towards students’ overall course grade, up to a maximum of ten entries. We specifically wanted the format and content of the learning journal to be flexible. We did not want to force students to write a summary of each lecture, but rather encouraged them to make entries that would reflect their personal learning journey. We therefore chose to set up the learning journal as a blog on the PebblePad platform, since blog entries could take varied formats, such as a simple paragraph of text or an uploaded screenshot.

12.3.2.2 Practice

Hands-on, problem-based learning exercises are critical to student engagement, especially in the IT discipline (Ali, 2005; Kay et al., 2000). We therefore designed the practical learning activities with two goals in mind: firstly, we wanted the students to apply the theoretical concepts they learned in a practical scenario in order to reinforce their learning, and secondly, we wanted the activities to resemble real-world tasks as much as possible in order to increase students’ employability skills. The practice activities mainly included programming exercises or real-world simulations (see Fig. 12.4 for an example). By using the practical activities as a bridge between theoretical concepts and real-world tasks, we hoped that students would find the activities relevant and thus be more motivated to complete them. At the same time, we aimed to develop our practical exercises to be as close to students’ daily lives as possible, so that they could find interesting applications of what they had learnt.

Fig. 12.4 Firewall simulation software (Warner et al., 2010)

As mentioned above, we paid particular attention to providing the students with detailed instructions. The majority of the students would be exposed to new software for the first time and be required to complete programming activities. This could certainly have been overwhelming and demotivating for some. Therefore, the step-by-step instructions guided the students through the activities and provided proper scaffolding. Moreover, the instructions for each week included several reflective questions, which required students to critically assess certain steps they completed. For example, students were asked to explain the output of a certain function, describe possible implications of security breaches, or research real-world examples for an activity. Students would compile their answers in a weekly laboratory journal on PebblePad, which was marked with feedback by the instructors.

12.3.2.3 Assessment

The two major assessment items in the course were the two assignments. We aimed to make the students’ tasks in both assignments as authentic to real life as possible. For the first assignment, students were required to draft a security plan for a scenario in which a university implements a new student grading system. Their security plan had to be modelled on an industry template. We further involved our university’s Cyber Security team in the development to align the assignment with current industry practice.

The second assignment was a continuation of the first and required students to implement and justify database management techniques for the new student grading system, using industry-standard programming and security tools. We again related the scenario to the university setting, rather than another industry setting, since students would be most familiar with this type of organisation, particularly if they had no previous work experience. This also helped students reflect on and understand how the information they interact with on a day-to-day basis is managed and secured.

As a measure to further support their learning, students would peer review other students’ submissions of the first assignment. Instead of simply submitting their assignment and receiving feedback from the instructors, students conducting the peer review would, on the one hand, view other students’ submissions and see how they approached the task, and, on the other hand, receive feedback from their peers in addition to that from the instructors. Thus, the process would help to deepen their understanding of the assignment task. We implemented the peer review process on the PebblePad platform, in a manner that allowed students to review and comment on other students’ assignments anonymously, which cannot be achieved through standard assignment handling platforms. Each student or group of students would upload one submission. Each submission was then grouped with, on average, two other submissions into a set. Each student was assigned to one set and asked to review the submissions in that set. Since the sets of submissions could contain a mix from both campuses, there was a high chance that each group would receive peer reviews not only from students at their own campus but also from students at the other campus. As previous studies have shown that the use of marking rubrics can improve the peer review process (Ashton & Davies, 2015; Gielen & De Wever, 2015), we provided students with a rubric to review their peers’ submissions. The rubric consisted of 19 items, where each item could be marked as fully, partially, or not completed. Additionally, students could give qualitative feedback. The same rubric was later used by the instructors to determine the actual final mark. The instructors on one campus reviewed sample submissions from the other campus, and vice versa, to align the marks across campuses.
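For illustration, the allocation of submissions into review sets could be implemented along the following lines. This sketch mirrors the outcome described above (sets of about three submissions, mixed across campuses) rather than PebblePad’s internal mechanism, and the identifiers are hypothetical.

```python
# Sketch of the peer-review allocation: shuffle submissions from both
# campuses together, then chunk them into sets of ~3. Shuffling makes a
# mixed-campus set likely, as described above.
import random

def build_review_sets(submissions, set_size=3, seed=2018):
    """submissions: list of (submission_id, campus) tuples."""
    rng = random.Random(seed)  # fixed seed keeps the allocation reproducible
    pool = list(submissions)
    rng.shuffle(pool)
    return [pool[i:i + set_size] for i in range(0, len(pool), set_size)]

# 46 submissions, as in the course, spread over two campuses:
subs = [(f"sub{i:02d}", "Campus A" if i % 2 else "Campus B") for i in range(46)]
for review_set in build_review_sets(subs)[:2]:
    print(review_set)
```

In practice, one would additionally ensure that no student is assigned the set containing their own submission.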

12.4 Evaluation

We conducted two student surveys at different times throughout the course to receive feedback from the students on our approach and its impact on their learning journey. The first survey was designed specifically for this course and took place in Week 5; we call it the mid-term survey. It included five Likert-scale rankings on the perceived value of different aspects of the course, one choice of preference between the face-to-face lecture and the video lectures, and two short-answer questions asking which aspects of the course were most useful and most in need of improvement, respectively. It was completed by 66 of the 91 enrolled students across both campuses. The second set of surveys, the university-wide student evaluations of course and teaching, took place at the end of the trimester; we call these the final surveys. These surveys gather quantitative and qualitative feedback from the students. We added two custom quantitative questions to receive specific feedback on some of our initiatives in this course. They were completed by 31 students across both campuses. We summarise the findings from the surveys and our direct observations below.

12.4.1 Equity

First, we looked into the effectiveness of using video lectures for content delivery, especially for the online students. In the mid-term survey, 58% of the 66 participating students preferred the video lectures for their learning, whereas the other 42% preferred the face-to-face lectures. This is not surprising, given the flexibility that the video lectures offered to the learners. Indeed, 21% of the surveyed students ranked the video lectures as the most helpful aspect of the course, tied in first place with the laboratories. On the other hand, this also shows that a significant proportion of students were still in favour of the face-to-face classroom learning environment. Most interestingly, quite a few students appreciated a combination of face-to-face and online delivery. Since the presentation format of the concepts differed between the video lectures and the face-to-face lectures, students found value in using both learning resources. Some students used the video lectures as pre-class learning and/or recapping resources. For example, one student commented that they found “the short videos recapping the important topics of the lecture” a most helpful aspect of the course. These findings highlight the value of HyFlex mode for student learning.

The results from the mid-term survey were confirmed in the final survey, where 61% of the 31 participants strongly agreed that the video lectures had assisted their learning and 68% strongly agreed that the teaching (lecturers, tutors, online, etc.) in this course was effective in helping them to learn. It was also interesting to note that the standard recordings of the face-to-face lectures were overall much less popular than the video lectures, as shown in Fig. 12.5. The figure compares the view counts of the recordings of the face-to-face lectures, combined across both campuses, with those of the video lectures. The counts are cumulative and include repeat views by the same user. This observation corroborates previous work (Devlin, 2013; Mayes, Luebeck, Ku, Akarasriworn, & Korkmaz, 2011; Parsell, 2014) finding that students prefer specially designed short videos over recordings of face-to-face lectures.

Fig. 12.5 View count per lecture week, with permission from Piazza

Second, we evaluated the use of the Piazza discussion forum in providing timely feedback to students’ questions and supporting online students. Within 12 weeks, our Piazza forum received 43 posts, comprising both questions and notes. All questions posted on the forum received answers, and a number of them were discussed intensively. The average response time was 38 min, which means students received support in a reasonable timeframe, comparable to what one would expect in some face-to-face laboratory sessions (where one tutor assists around 20 students). As one way to engage students in self-learning and peer support, we encouraged peers to respond to and discuss students’ questions, and it was encouraging to see that 31% of the questions received a student response.
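As an aside, the forum metrics reported here can be computed along the following lines, assuming the posts have been exported to a CSV file; the file name and column names below are assumptions for illustration, not Piazza’s actual export format.

```python
# Sketch: compute the average first-response time and the share of
# questions answered first by a student. Column names are assumptions.
import csv
from datetime import datetime

response_minutes, student_answered, questions = [], 0, 0

with open("forum_posts.csv", newline="") as f:
    for row in csv.DictReader(f):
        questions += 1
        asked = datetime.fromisoformat(row["posted_at"])
        answered = datetime.fromisoformat(row["first_answer_at"])
        response_minutes.append((answered - asked).total_seconds() / 60)
        if row["first_answer_role"] == "student":
            student_answered += 1

print(f"average response time: {sum(response_minutes) / questions:.0f} min")
print(f"questions first answered by a student: {student_answered / questions:.0%}")
```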

Overall, 65% of the whole cohort visited the forum and viewed an average of 19 posts. Although only 22% of those who visited the forum made contributions, the percentage engaged in Q&A is comparable to (if not higher than) that in a medium or large face-to-face class and similar to the participation in Piazza recorded by Grasso (2017). Figure 12.6 shows the number of unique users per day who visited the discussion forum. The first spike in users, denoted by a circle, coincides with the due date of the first assignment, and the second, denoted by a square, with the day before the due date of the second assignment. Grasso (2017) also found that student activity increased before the due dates of major assessments. The data suggests that our students found the Piazza discussion forum useful enough to discuss their activities there.

Fig. 12.6 Piazza unique users per day, with permission from Piazza

Third, during practical laboratory sessions, we carefully monitored students’ experience in following the instructions. The majority found the instructions easy to follow and well scaffolded, and the others could quickly catch up with some assistance from the tutors. For instance, one comment from the mid-term survey reads: “I really like the comments in laboratory questions saying for ‘If you cannot remember how to do X then refer to Laboratory X or X’. This is fantastic for the pre-conditions. Keep it up!” Notably, there was only a single question on the discussion forum regarding the laboratory instructions, suggesting that they generally provided enough clarity.

12.4.2 Engagement

First, with respect to the effect of the reflective questions during the concept presentations on students’ active learning, 21% of respondents in the mid-term survey strongly agreed and another 58% agreed that they were satisfied with the presentation of the lectures. In particular, we received unsolicited positive comments from students in the final surveys on their effectiveness. Examples include: a student “enjoyed stoppages in lectures to discuss matters in more details”, and the lecturer “encouraged questions being asked to help us understand better”. We were unable to obtain statistics on student participation for the quizzes embedded in the video lectures, as the Vizia platform is no longer maintained.

Second, we evaluated the weekly learning journals as a way to promote reflective learning. Student feedback on the learning journals was overall very positive. They were ranked as the third-most useful aspect of the course in the mid-term survey, behind video lectures and laboratories, with 9% of respondents mentioning them. Furthermore, in the final surveys, 19% of respondents mentioned the learning journal in their response to the qualitative question “What did you find particularly good about this course?” 66% of all students in the course completed every weekly learning journal entry, with an average grade across all students of 12.2 out of 15 possible marks. In particular, we observed that the majority of students created detailed and individual summaries of their learning experiences, for example, an elaborate mind map as shown in Fig. 12.7. This suggests that the students appreciated the blog format on PebblePad, which allowed them to submit their entries in different forms.

Fig. 12.7 Mind map as an example of a learning journal entry

Third, we evaluated students’ perceptions of the practical exercises, which were focussed on hands-on, real-world activities and included reflective questions for summative assessment. In the mid-term survey, 21% of the surveyed students rated the laboratories as the most helpful aspect of the course, tied in first place with the video lectures. One student commented in the final survey: “As much as people complain, I found the weekly assessment to be really helpful. It forced you to actually cover the content”.

Lastly, we evaluated the success of implementing peer review through the PebblePad platform. In total, 70% of submissions received two or more peer reviews. Table 12.4 shows how the peer review marks determined by students differed from the actual marks determined by the instructors. A total of 46 submissions were uploaded, each of which could receive between 0 and 15 marks. Since a submission usually received more than one peer review, its peer review mark was calculated as the average of all peer review marks given to it. The mean difference between peer review marks and actual marks is 2.04, with a standard deviation of 1.56 and a maximum difference of 5.75. The mean difference corresponds to 14% of the total marks possible for the assignment. When distinguishing between positive and negative differences, we found that 26 of the 46 submissions received higher peer review marks than actual marks and 19 received lower peer review marks. The mean and standard deviation are slightly higher for the former and slightly lower for the latter. Thus, students generally valued submissions within one grade step of how the instructors valued them, with no distinct tendency by the peer reviewers to under- or overvalue, similar to the finding by Luo, Robinson, and Park (2014). This implies that students generally had a good understanding of the requirements of the assignment.

Table 12.4 Measures of the difference between peer review marks and actual marks for Assignment 1 (Marks scale: 0–15)
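For transparency, the statistics underlying Table 12.4 can be reproduced with the following sketch. The marks data here is invented for illustration; per submission, the peer review marks are averaged before taking the difference from the instructor’s mark.

```python
# Sketch of the Table 12.4 statistics on a 0-15 marks scale.
# The data below is illustrative, not the actual course data.
from statistics import mean, stdev

# submission -> (list of peer review marks, instructor mark)
marks = {
    "sub01": ([12, 13], 11.0),
    "sub02": ([9, 10, 8], 12.5),
    "sub03": ([14], 13.0),
}

diffs = [abs(mean(peers) - actual) for peers, actual in marks.values()]
print(f"mean difference: {mean(diffs):.2f}")
print(f"std deviation:   {stdev(diffs):.2f}")
print(f"max difference:  {max(diffs):.2f}")

# Signed differences separate peer over-marking from under-marking:
signed = [mean(peers) - actual for peers, actual in marks.values()]
print(f"{sum(d > 0 for d in signed)} over, {sum(d < 0 for d in signed)} under")
```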

12.4.3 Summary and Limitations

It is encouraging to see that our methodology was effective in helping students learn. From our evaluation, we can say that the interventions we implemented in this course fulfilled our expectations. Note that we restricted our evaluation in this study to the learning aspects and left out most technical details of the learning platforms. Our methodology was also well received by students, who rated their overall satisfaction with the course at an average of 4.4 on a 5-point Likert scale across both campuses.

The mid-term and final surveys also helped us identify some key areas where the course can be improved. For instance, the two most important aspects students mentioned in the mid-term survey were the provision of more detail in the lectures and a greater focus on Linux content in the lectures and laboratories. In the final surveys, the most common theme in students’ feedback was to make the assessment specifications clearer. These were valuable comments and will help us improve the course in future iterations.

Since students could flexibly choose between the face-to-face and online modes throughout the entire course, there were no defined face-to-face or online cohorts. This made it difficult to perform global comparisons between the two modes, such as comparing late submissions or final grades. However, from our observations as lecturers, we found on both campuses no relation between the students’ chosen lecture mode and, for example, their grade for the weekly learning journal. For the next iteration of the course, we plan to gather more data on students’ chosen modes in order to gain further insights into possible relations.

Moreover, in this study we analysed the effectiveness of our interventions for the HyFlex cohort across both campuses. It would be interesting for future work to conduct this analysis at the campus level as well, in addition to the cross-campus analysis, and to compare the results between campuses. This could then also include comparisons over time and with other HyFlex or mixed-mode courses that implemented similar interventions.

12.5 Conclusion

We shared in this study our experience with implementing a variety of equity and engagement methods to assist student learning in a HyFlex course. The main contribution of our work is the focus on equity and engagement along all major dimensions of the course, with the practical orientation of the course being a major challenging factor. The evaluation of our approach showed that students were overall satisfied with our choice and implementation of equity and engagement methods. Students particularly liked the short video lectures, the practical laboratory exercises and the reflective learning journals. With HyFlex still being a relatively new delivery format, further investigation is required: more fine-grained analyses of each individual method across online and face-to-face modes, as well as a more holistic analysis of the relations between students’ learning styles, their choice of mode throughout the course, and their course performance and satisfaction.