Abstract
Digital game-based learning (DGBL) has become a prominent innovative approach used in higher education to gamify the learning experience. Research studies using DGBL as a framework for game design have shown promising results in increasing student engagement, motivation, interaction, satisfaction, and, above all, learning outcomes among undergraduate students in blended and face-to-face learning environments. Although a plethora of digital and online games exists, game-based student response systems (SRSs) were the educational technology tool most commonly reported in the research literature for engaging students in active learning roles. The positive impact of game-based SRSs in traditional classrooms is well documented. However, scant research exists on the effectiveness of game-based SRSs in distance education courses designed for first-year learners, and further studies are needed to address this gap in the literature. This review synthesizes the research evidence on the impact of game-based SRSs in terms of (1) enhancing teaching practices, increasing student engagement and motivation, and improving learning outcomes and (2) understanding students’ perceptions of their learning experience with game-based SRSs as a teaching and learning tool in undergraduate courses in higher education settings. Finally, recommendations for future practice are discussed.
1 Introduction
Digital game-based learning (DGBL) is becoming a popular innovative pedagogical approach to teaching and learning in higher education, and much research has been done to determine its effectiveness for improving teaching and learning outcomes (Chang et al., 2018; Erhel & Jamet, 2016; Hung et al., 2018). Generally speaking, DGBL offers a promising solution to instructional problems in undergraduate courses. Empirical studies using DGBL as a framework for testing the effectiveness of digital or online games as a pedagogy have shown increases in student motivation, engagement, and learning (Chang et al., 2018; Hung et al., 2018; Turner et al., 2018; Yildirim, 2017). A growing body of research on DGBL exists, and various types of experiential educational games are used in higher education classrooms, such as video games (Erhel & Jamet, 2016), online/digital computer games or simulations (Chang et al., 2018; Turner et al., 2018), 3D game models (Chang et al., 2018), tutorial games, board games, and interactive games using technology (Elelmahdi et al., 2018; Hung et al., 2018; Karaaslan et al., 2018).
Furthermore, using DGBL as the framework for game design, empirical studies of digital games and gamified online quizzes as formative assessments have shown positive results in increasing student engagement and motivation, promoting class participation and interaction, providing meaningful feedback, and encouraging continuous practice and exposure to the course content. As a result, students retain the information learned, thus improving learning outcomes on summative assessments (Karaaslan et al., 2018; Sanchez et al., 2020). Despite the plethora of educational games that exist, many educators and scholars are turning to game-based student response systems (SRSs) as a pedagogical strategy for gamifying the learning experience (Chien et al., 2016; Hunsu et al., 2016; Laici & Pentucci, 2019; Liu et al., 2019; Sanchez et al., 2020; Subhash & Cudney, 2018).
2 Research Background
The purpose of this systematic review is to synthesize the research findings on the impact of game-based SRSs in undergraduate courses in higher education settings for enhancing teaching practices, increasing student engagement and motivation, as well as improving learning outcomes. The researcher’s background is in online teaching and learning, so the intent was to find primary-level studies exploring the use and benefits of game-based SRSs in distance education courses. However, as evidenced in this review, empirical studies investigating the impact of game-based SRSs in undergraduate courses were primarily in traditional and blended learning environments; therefore, further research studies are needed to determine the impact of game-based SRSs in distance education courses designed for undergraduate students. Nonetheless, the research findings from this synthesis can be helpful in identifying effective game-based SRSs used to promote student participation, engagement, motivation, and active learning in undergraduate courses. Most importantly, this synthesis should serve as the foundation for future research and teaching practice decisions and actions regarding the use and design of game-based teaching and learning pedagogies for not only traditional learners, but also e-learners enrolled in undergraduate distance education courses.
2.1 The Use of Game-Based SRSs in Undergraduate Courses
Recent empirical studies using experimental or quasi-experimental approaches to investigate the impact of game-based SRSs, also known as classroom response systems (CRSs), audience response systems (ARSs), or interactive response systems (IRSs), in undergraduate courses have shown them to be an effective educational technology tool that college faculty can use during classroom instruction to engage students in active learning roles (Cheng & Wang, 2019; Datta et al., 2015; Funnell, 2017; Liu et al., 2018; Persaud & Persaud, 2019; Rahmahani et al., 2019; Sanchez et al., 2020; Tivener & Hetzler, 2015; Voith et al., 2018; Wang, 2018; Yabuno et al., 2019). A primary function of an SRS is to facilitate interaction between students and instructor using hand-held electronic devices such as a smartphone, laptop, tablet, or notebook. With the effective use of SRSs, both instructor and students have access to the students’ responses to multiple-choice, true-or-false, closed-ended, and open-ended questions, allowing faculty to provide ongoing formative feedback (Laici & Pentucci, 2019). Accordingly, game-based SRSs have shown positive results in improving instruction and learning outcomes.
Although there are different types of game-based SRSs available to students and college faculty, the most popular ones reported in the research literature were Socrative (Abdulla, 2018; Aslan & Seker, 2017; Badia Valiente et al., 2016; El Shaban, 2017; Guarascio et al., 2017; Munusamy et al., 2019; Pérez Garcia & Marin, 2016; Sprague, 2016), Kahoot! (Rahmahani et al., 2020; Yabuno et al., 2019), Poll Everywhere (Meguid & Collins, 2017; Wong, 2016a, 2016b), Plickers (Elmahidi et al., 2018), clickers (Voith et al., 2018), TopHat (Feraco et al., 2020; LaDue & Shipley, 2018; Ma et al., 2018), Mentimeter (Joshi et al., 2021; Wood, 2020), TurningPoint (Lee et al., 2015; Stowell, 2015), KeyPad (Sawang et al., 2017), NearPod (Tornwall et al., 2020), Quizizz (Asiksoy & Sorakin, 2018), and Quizlet and Powtoon (Karaaslan et al., 2018). Despite the overwhelmingly positive results reported in the literature, it is apparent that some SRSs are more effective than others depending on the classroom size, course content, duration of teaching sessions, student and faculty comfort with technology, quality of game design, development of quiz questions, and students’ individual learning preferences (Hunsu et al., 2016; Laici & Pentucci, 2019; Wang & Tahir, 2020). Moreover, not a single study was conducted in an online or distance education course designed for undergraduate learners; this scarcity of research highlights the need for future studies (Karaaslan et al., 2018).
In this systematic review, the researcher sought to identify and better understand primary-level studies using DGBL as the conceptual framework for investigating the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes, with a focus on undergraduate courses in higher education settings (traditional, blended, and distance education). This review not only seeks to measure the impact of game-based SRSs on student learning outcomes using aggregate results; it also includes studies exploring students’ perceptions, attitudes, experiences, or satisfaction with game-based SRSs as a teaching and learning tool, to provide a more in-depth analysis that could inform future practice. The questions that guided this review were the following: (1) What is the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes in undergraduate courses in higher education settings? (2) How is the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes measured in undergraduate courses in higher education settings? (3) How are the main effect sizes observed among studies on learning with game-based SRSs influenced by methodological features of the study design? (4) How do undergraduate students perceive the use of game-based SRSs as a teaching and learning tool?
3 Methods
Using a mixed methods research synthesis (MMRS), an integrated research design for combining different types of studies and data, this systematic review followed the methods prescribed by Petticrew and Roberts (2005), Gough et al. (2017), and Thomas et al. (2017a, 2017b). The data included in the review are findings extracted from qualitative, quantitative, and mixed methods studies, and mixed synthesis techniques were used to integrate the primary-level studies within the MMRS. To carry out the MMRS, the researcher followed the eight stages described by Heyvaert et al. (2017), each broken down into a more detailed systematic process adapted from Newman and Gough (2017), as shown in Fig. 1: (1) writing the review protocol (including review objectives, review questions, and MMRS design); (2) sampling; (3) searching for primary-level studies; (4) applying inclusion and exclusion criteria; (5) critically appraising the methodological quality of the primary-level studies; (6) extracting relevant data from the primary-level studies; (7) interpreting, synthesizing, and integrating the data; and (8) writing and disseminating the MMRS report.
3.1 Search Strategy
The search for literature was carried out between March and May 2020 and in December 2021 using electronic databases accessible through the university’s library, including Academic Search Complete, the Directory of Open Access Journals, ERIC, EBSCO, JSTOR, ScienceDirect, ProQuest, PsycINFO, and Taylor & Francis. The search was filtered to peer-reviewed journals published between 2015 and 2021 with a focus on game-based SRSs and their impact on teaching, student engagement, motivation, and student learning outcomes in undergraduate courses in higher education settings, as well as studies exploring students’ perceptions, experiences, attitudes, and satisfaction with game-based SRSs as a teaching and learning tool. The results were further filtered to include publications in the fields of education, social sciences, literacy studies, nursing sciences, math and statistics, computer science and engineering, gaming, educational technology and Web 2.0 tools, and undergraduate studies in foreign countries.
3.2 Search Terms
This review focused on two main areas: (a) the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes in undergraduate courses in higher education settings and (b) students’ perceptions, experiences, attitudes, or satisfaction with using a game-based SRS as a teaching and learning tool. Keywords were therefore used to identify relevant articles in bibliographic databases. An example of a bibliographic database search is given in Box 1. This search was used to find primary-level studies that investigated the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes in undergraduate courses (Chien et al., 2016; Hunsu et al., 2016; Laici & Pentucci, 2019). The search was built from terms for the population of interest (undergraduate students), the intervention of interest (game-based SRSs), and the outcomes of interest (teaching effectiveness, student engagement, motivation, and learning outcomes), combined using the Boolean operators “OR” and “AND”.
Box 1: Search string example to identify studies that address the question, “What is the empirical research evidence on the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes among undergraduate students in higher education settings?” ProQuest Database: TI=(“student response systems” AND “undergraduate”) OR TI=(“student response systems” AND “online education” AND “undergraduate”) OR TI=(“student response systems” AND “distance learning” AND “undergraduate”) OR TI=(“student response systems” AND (“distance learning” OR “distance education” OR “online learning”) AND (“undergraduate students” OR “college students” OR “university students”))
To refine the search results, the keywords “student response systems” and “undergraduate” were used. Using these broad keywords with Boolean operators helped identify publications that considered the impact on instruction, student engagement, motivation, and learning. To focus the search on empirical studies investigating the impact of game-based SRSs on student learning outcomes in undergraduate courses in the United States and abroad, the keywords “undergraduate”, “undergraduate students”, “college students”, and “university students” were used.
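As an illustration only, the population, intervention, and outcome blocks described above can be assembled into a single Boolean string programmatically; the term lists, function name, and quoting style below are the reviewer's own sketch, not any database's exact syntax:

```python
# Illustrative sketch: building a Boolean search string from the three
# term blocks (population, intervention, outcomes) described in the text.
# The specific term lists here are assumptions for demonstration only.
population = ["undergraduate students", "college students", "university students"]
intervention = ["student response systems", "game-based student response systems"]
outcomes = ["teaching effectiveness", "student engagement", "motivation", "learning outcomes"]

def or_block(terms):
    """Quote each term and join the alternatives with OR."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Alternatives within a block are ORed; the three blocks are ANDed together.
query = " AND ".join(or_block(block) for block in (population, intervention, outcomes))
print(query)
```

The same pattern extends naturally to extra blocks (for example, a learning-environment block with “distance education” OR “online learning”).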
3.3 Selection Criteria
Studies identified by the search were subjected to a screening process to ensure they met the selection criteria. There were two stages in this process. In the first stage, titles and abstracts were reviewed for relevance; if relevant, a full copy of the article was downloaded and saved to a file folder for coding. A Microsoft Excel spreadsheet was used to track article titles, abstracts, and citations. The second stage involved reading and rereading each article, interpreting it, and extracting the information needed to answer the review questions, such as the research method and design, measurements, and reported findings, using the critical appraisal methodology of Petticrew and Roberts (2005). This information was recorded and coded in the spreadsheet using the methods prescribed by Gough et al. (2017) and Thomas et al. (2017a, 2017b). In combination with statistical procedures applied to individual effect sizes, results from the studies were linked to explore themes in the data, compare methodologies and designs, and search for commonality and refutation with theory (Thomas et al., 2017a, 2017b).
Overall, empirical articles were selected if they investigated the impact of game-based SRSs on teaching effectiveness, student engagement, motivation, and learning outcomes for undergraduate students in higher education settings using qualitative, quantitative, or mixed research designs. The exclusion criteria removed reviews and reports, reflections on practice, studies of faculty’s perceptions or experiences with SRSs, research conducted in K-12 settings or in graduate and doctoral courses, and studies that used gaming approaches other than game-based SRSs. A total of 69 articles explored the effectiveness of using a game-based SRS in an undergraduate course; however, five of these were literature reviews, meta-analyses, or systematic reviews, three were reflections on practice, one focused on developing a conceptual framework for game-based SRSs, and one explored the usefulness of game-based SRSs for teamwork such as cooperative and collaborative learning. These ten studies were therefore excluded. After applying these criteria, 59 articles met the inclusion criteria for this review, as shown in Table 1.
4 Data Collection
4.1 Coding Studies
The selected primary-level studies were systematically coded, and information from the studies was recorded and used to answer the review questions. This information included the characteristics of the studies: the type of game-based SRS used, learning environment, research design, measurements used to detect quantitative differences in outcomes between groups receiving and not receiving the intervention (a game-based SRS), effect sizes, and results. First, each study was coded by type: experimental or quasi-experimental research (ER), mixed methods (MM), qualitative (QS), or a study exploring students’ perceptions (SP). Next, the quality and relevance of each study for addressing the review questions were assessed, and the results of each study were synthesized using the critical appraisal steps outlined by Petticrew and Roberts (2005). To produce a better answer to the review questions, the analytic steps recommended by Thomas et al. (2017a, 2017b) and Gough et al. (2017) were used in combination with statistical procedures applied to individual effect sizes, such as searching for themes in the data, exploring differences in research designs and measurements, and comparing research outcomes.
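Where a primary-level study reported only group means and standard deviations, a standardized mean difference such as Cohen's d is the kind of individual effect size that such statistical procedures combine. A minimal sketch, with invented numbers purely for illustration:

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Hypothetical example: an SRS group scoring 75 (SD 10, n 30) versus a
# control group scoring 70 (SD 10, n 30) on the same post-test.
d = cohens_d(75, 10, 30, 70, 10, 30)
print(round(d, 2))  # 0.5, conventionally a "medium" effect
```

A review that extracted d (with its sample sizes) from each included study could then pool the values with standard meta-analytic weighting, which is why the absence of reported effect sizes in most of the included studies limits aggregation.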
4.2 Data Analysis
Empirical articles meeting the inclusion criteria were further analyzed (Table 2) in relation to thematic elements of interest: the type of SRS used in the classroom, learning environment, study design, statistical test, subject, nation, measurements, and whether a statistically significant difference in test scores or course grades (student learning outcomes) was reported. Where an individual study reported an effect size, it was included in the analysis. Of the 59 studies, only 24 used a quasi-experimental, experimental, or mixed methods research design to measure the impact of a game-based SRS on student learning outcomes, as shown in Table 2. Table 3 displays thematic data from the studies exploring students’ perceptions of using a game-based SRS as a teaching and learning tool. Most of these studies used quantitative methods, though a few were qualitative. Students’ feedback from surveys, questionnaires, and interviews is therefore included in the analysis to provide a better understanding of students’ preferences, attitudes, experiences, and satisfaction with game-based SRSs in undergraduate courses.
As mentioned previously, of the 59 articles that met the inclusion criteria, 24 were quasi-experimental or experimental studies that investigated the impact of a game-based SRS on student learning outcomes in an undergraduate course. The remaining 35 studies explored students’ perceptions of game-based SRSs as a teaching and learning tool in an undergraduate course using surveys, questionnaires, and interviews, as shown in Table 3. The survey, questionnaire, and interview questions concerned students’ overall experience, their attitude toward and satisfaction with the game-based SRS, and how it may have affected their learning, participation, interaction, involvement, or engagement in the course. Notably, not a single study was conducted in a distance education or online course, which means little to no data exist to validate or refute the impact of game-based SRSs on undergraduate students’ learning, engagement, and motivation in that setting, a major gap in the research literature.
5 Results
Table 2 shows the number of studies conducted in higher education institutions, geographically dispersed across Australia, Europe, the United States, China, Taiwan, Guyana, India, Ireland, Norway, and Turkey. All of these studies investigated the effectiveness of game-based SRSs on learning outcomes for undergraduate students. However, only two studies reported individual effect sizes, making it difficult to aggregate results by the SRS used in the classroom and limiting the inferences that can be made about the true impact of game-based SRSs on student learning. Despite this limitation, more than half (58%) of the studies used a pre-test/post-test research design, 33% used a post-test-only design, and the remainder used structural equation or predictive modeling to test for differences in test scores or course grades. Notably, 75% of these studies reported significant findings, suggesting that game-based SRSs can be an effective educational tool that college faculty can use to engage students in active learning roles. Six studies did not produce significant findings; these used traditional hand-held clickers, iClicker, Kahoot!, Peardeck, and TurningPoint mobile technologies. Across the research literature on game-based SRSs, Peardeck was not a commonly reported tool. Despite the positive findings reported in traditional undergraduate courses, it is still unclear whether game-based SRSs will have the same impact on learning outcomes for undergraduate students enrolled in distance education courses.
Further, most of the studies used inferential statistics to test their research hypotheses, and most of them (75%) reported statistical findings. However, the studies by Funnell (2017) and Pérez Garcias and Marín (2016) were weakly designed in that they did not use inferential statistics to test for statistical significance or for the effect size of the treatment (a game-based SRS) on student learning outcomes, which makes it difficult to draw accurate conclusions about the true effectiveness of game-based SRSs such as TurningPoint, iClicker, Mentimeter, Socrative, or Kahoot! in undergraduate courses. As noted by Field (2009), descriptive statistics provide only information about frequencies, means, and standard deviations; they do not allow researchers to test hypotheses or make accurate inferences about the population and the effects of the treatments used in a study. These research findings should therefore be used with caution. To summarize the research findings, the researcher returned to the four review questions listed in Section 2.1.
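To make Field's (2009) point concrete, an independent-samples t statistic, the inferential test most of the reviewed studies relied on, goes one step beyond descriptive statistics by scaling the mean difference against its standard error. A minimal sketch with fabricated scores:

```python
import math
from statistics import mean, variance

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic for two score lists."""
    n1, n2 = len(a), len(b)
    # Pool the two sample variances, weighted by degrees of freedom.
    pooled_var = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (mean(a) - mean(b)) / se

# Fabricated post-test scores for an SRS group and a control group.
srs = [1, 2, 3, 4, 5]
control = [2, 3, 4, 5, 6]
print(independent_t(srs, control))  # -1.0
```

The resulting t value, compared against the t distribution with n1 + n2 − 2 degrees of freedom, is what yields a p value; reporting only means and standard deviations, as the weakly designed studies did, omits this step.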
5.1 Impact of Game-Based SRS Tools
5.1.1 Student Learning Outcomes
Previous empirical studies have shown game-based SRSs to be effective in increasing student engagement, enhancing teaching and learning, and improving course performance in undergraduate classrooms (Buil et al., 2019; Cheng & Wang, 2019; Datta et al., 2015; Persaud & Persaud, 2019; Tivener & Hetzler, 2015; Voith et al., 2018; Walklet et al., 2016; Wong, 2016a, 2016b). While some studies show positive effects of SRSs, the empirical evidence indicates that results are mixed depending on the type of SRS used in the classroom. For example, Liu et al. (2018) conducted a quasi-experimental study using a pre-test and post-test design to examine the effectiveness of an interactive SRS called Peardeck in improving English grammar skills among 50 second-year students at the French Engineering Institute. Results from the independent t-test showed no significant difference between the pre-test and post-test scores for the experimental group, suggesting that the SRS was not effective in improving students’ English learning achievement (Liu et al., 2018).
In another study, Funnell (2017) compared the effects of using online audience response systems (ARSs) (Mentimeter or Socrative), clickers, and a mixture of the two for teaching information literacy concepts to medical students at Queen Mary University of London. Based on the results from student evaluations, class observations, and quizzes, ARSs, when used as an active learning pedagogy, were more effective than clickers in terms of increasing student engagement, and having a positive impact on student learning (Funnell, 2017). On the contrary, Voith et al. (2018) studied the use of clickers on learning outcomes in an undergraduate social work course and found significant findings, which suggests that the clickers may be an effective tool in increasing student engagement and promoting learning. Cheng and Wang (2019) had similar results with their study using a classroom response system (a clicker system). Cheng and Wang (2019) examined the effect of using a classroom response system on the learning performance and student participation of 2500 undergraduate students in courses such as Corporate Finance, International Finance, and Introduction to Business Law at a public university in Hong Kong. Findings from regression models were positive, thus, suggesting that the CRS contributed to better academic performance and a higher level of student participation (Cheng & Wang, 2019).
In a similar study, Wong (2016a, 2016b) investigated the effectiveness of using clickers on the learning efficacy and satisfaction of 170 Chinese and international undergraduate students. Data from final exam scores and a student survey were collected to test for differences in course performance and students’ perceptions of the clickers. Over 70% of the Chinese students agreed that the clicker application contributed significantly to their learning, and most felt that it increased their engagement and involvement in class discussions (Wong, 2016a, 2016b). An independent t-test indicated that students using clickers scored higher than students who did not. Datta et al. (2015) likewise compared the efficacy of an interactive clicker with a didactic lecture (a Microsoft PowerPoint presentation) among 192 undergraduate medical students in India. Pre-test, post-test, and retention test scores were collected and analyzed to test for differences between the two teaching methods. Post-test scores for the interactive method were 8–10% higher than for the didactic lecture, and interactive retention test scores were 15–18% higher than conventional scores, suggesting that the interactive clicker had a positive effect on student learning outcomes and student–teacher interactions (Datta et al., 2015).
Furthermore, Tivener and Hetzler (2015) examined student knowledge acquisition and interactivity using an ARS (clicker) among 69 undergraduate students enrolled in an introductory athletic training course. To assess students’ basic athletic training knowledge and interaction, Tivener and Hetzler (2015) collected presurvey and postsurvey data and conducted an ANOVA, finding that the ARS significantly improved student learning and classroom interactivity. However, both the control and experimental groups improved at the same rate, which did not support the hypothesis that differences would exist between the groups using ARS technology (Tivener & Hetzler, 2015). Using a pre-test/post-test design, Persaud and Persaud (2019) compared students’ perceptions of their level of interactivity when an SRS was used to promote interaction in an undergraduate class of 239 students enrolled in an introductory Information Systems course at the University of Guyana. Results from an independent t-test indicated a significant difference between scores, providing strong evidence that the intervention improved student interactivity in a large class (Persaud & Persaud, 2019).
Using a quasi-experimental design, Wang (2018) examined the effectiveness of integrating Kahoot! in the classroom to support collaborative learning among 120 third-year college students. Results from student questionnaires, weekly learning diaries, grades from the interactive response system, and delayed tests revealed that students who used Kahoot! as a group activity performed significantly better than students who used individual accounts. However, users with individual accounts had better learning retention and showed significant improvement on the delayed tests. Nonetheless, students reported that Kahoot! increased class participation, fostered interaction, and made the class more enjoyable (Wang, 2018).
Yabuno et al. (2019) compared the effectiveness of a traditional SRS (iClicker) and a gamified SRS (Kahoot!) among 255 undergraduate students enrolled in a human anatomy course. To test for differences in course performance and student perceptions, students completed a summative assessment and an end-of-course survey. Results from a multiple linear regression model indicated positive correlations between exam performance and performance on both the iClicker and Kahoot! activities (Yabuno et al., 2019). On the end-of-course survey, 80% of students reported that iClickers and Kahoot! were both fun and effective for learning. However, 95% reported that iClickers should be used every day, whereas 70% reported that Kahoot! should be used only once a week (Yabuno et al., 2019).
5.2 Students’ Perceptions
5.2.1 Student Engagement, Participation, and Interaction
Although not included in Tables 1 and 2, it is important to note that 35 (59%) of the 59 empirical studies identified in the research literature focused on students’ perceptions of their learning experience with a gamified SRS as a teaching and learning tool in undergraduate courses. Interestingly, students’ perceptions varied depending on the course and the SRS used in the classroom. For instance, Bicen and Kocakoyun (2017) conducted a study among 130 university students to determine the most preferred mobile application for gamification and found that students preferred Kahoot! to ClassDojo, Classcraft, and Socrative. In another study, Karaaslan et al. (2018) compared the effectiveness of several gamified SRSs (Kahoot!, Quizlet, NearPod, Powtoon, and YouTube) on intermediate-level English language learners’ vocabulary learning performance. The data consisted of a survey of students’ reflections on their vocabulary learning experiences through digital games and activities (Karaaslan et al., 2018). Results indicated that 40 of the 45 students enjoyed learning vocabulary through games, liked the competitiveness, curiosity, and team spirit the games created, and found playing them meaningful (Karaaslan et al., 2018).
Using Poll Everywhere, Deng (2019) examined how an SRS influenced the effectiveness of information literacy instruction among millennial students enrolled in an English Composition program at Eastern Washington University. A librarian designed multiple sets of open-ended and closed Poll Everywhere questions that were integrated into the library instruction sessions and found that the SRS encouraged classroom discussion, reduced redundancy in instruction, and helped make learning more meaningful for millennial students (Deng, 2019). Walklet et al. (2016) examined, in two separate studies, the influence of TurningPoint clickers and Poll Everywhere on the learning experience of 143 undergraduate psychology students. A survey examined students’ perceptions of the clicker quizzes. Overall, 91% of students agreed that the SRS provided meaningful feedback, increased understanding, and promoted peer discussion, but only 62% stated that it increased their confidence on the assessment (Walklet et al., 2016). Students also reported that the clickers took too much time to set up in class and sometimes failed to work. In the second study, Poll Everywhere was used as a formative assessment, and a survey gauged its perceived impact on student engagement and the learning experience. The majority of students agreed that the polling activities improved the learning experience; however, Poll Everywhere was less successful in encouraging attendance and was perceived as less useful for promoting higher-level cognitive skills (Walklet et al., 2016).
In another study, Elelmahdi et al. (2018) examined the effectiveness of using a classroom response system called Plickers as a formative assessment for aiding the learning process among 166 undergraduate students enrolled at the University of Bahrain. To assess students’ perceptions of the use of Plickers, a questionnaire was used. Students’ responses to the open-ended questions indicated that Plickers had a positive effect on students’ learning, especially for millennials, which suggests that gamified tools such as Plickers can be an effective formative assessment (Elelmahdi et al., 2018). In a different study, Buil et al. (2019) studied the gaming effects of using clickers on student learning through flow theory, which was first introduced by Csikszentmihalyi in 1975. Flow theory is rooted in the idea that some activities are intrinsically motivating, which enhances the learning experience. In their study, Buil et al. (2019) analyzed the influence of three flow preconditions for game-based design (balance of skill and challenge, feedback, and goal clarity) on 204 undergraduate business students’ concentration, sense of control, and experience, and in turn on students’ perceived learning and satisfaction with the gamified activity. Based on the results of the Flow State Scale, findings indicate that the balance of skill and challenge had a positive impact on concentration, sense of control, and experience. In addition, the feedback and goal clarity provided by clickers also had a positive impact on students’ concentration and sense of control, but did not have an effect on their learning experience using the SRS (Buil et al., 2019). Lastly, concentration and sense of control had a positive impact on students’ perceived learning, which suggests that their experience predicts both perceived learning and satisfaction (Buil et al., 2019).
In essence, the findings of this study provide strong support for the use of technology as a tool to promote flow experiences and enhance the learning experience.
Further, to investigate the effectiveness of interactive technologies such as Socrative, Florenthal (2018) used the uses and gratifications (U&G) theory, which was originally developed in the 1940s and has been widely used in web-based advertising to engage users. Based on qualitative data collected from open-ended survey questions and a discussion group assignment completed by 40 college students enrolled in a marketing research course, five motivational themes emerged: (1) knowledge acquisition and learning; (2) expression of self and others; (3) interaction, engagement, and enjoyment; (4) convenience; and (5) annoyance (Florenthal, 2018). Overall, students felt that the use of Socrative motivated them to focus in class and pay attention to key concepts, encouraged them to learn and retain information, and enhanced their learning experience by making learning fun, interactive, and engaging. However, 60% of students found aspects of Socrative annoying, and more than half expressed frustration with some of the participation features (Florenthal, 2018). Nonetheless, this study demonstrates that the U&G theory can provide a practical framework for comparing the motivational drivers found in SRS tools. However, empirical evidence is still needed to determine the effectiveness of using Socrative for improving learning outcomes in online courses (Florenthal, 2018).
Carroll et al. (2018), on the other hand, used Bandura’s (1977) social learning theory (SLT) as a framework for investigating the effectiveness of GoSoapBox, an interactive online SRS, for improving learning experiences in an undergraduate sociology and public health course. To assess students’ perceptions of GoSoapBox, a survey was conducted: 50% of students stated that GoSoapBox positively influenced their learning, 32% stated that it kept them engaged, and a third of students stated that they enjoyed the social and interaction features and felt that GoSoapBox helped them learn new concepts and increased their critical thinking skills (Carroll et al., 2018). Although the findings support Bandura’s SLT, the researchers did not investigate the effect the interactive online SRS had on student learning outcomes based on a summative assessment using a quasi-experimental or experimental design, leaving a gap in the research literature.
Overall, results from previous studies demonstrate that when game-based SRS tools are used as formative learning assessment tools, they can be an effective way to increase student engagement, promote participation and interaction, and improve learning outcomes in undergraduate classrooms. Also, reports from surveys, questionnaires, and interviews with undergraduate students have been overwhelmingly positive, which indicates that most traditional students enjoy using a game-based SRS as a teaching and learning tool. However, a major gap in the research still exists: little is known about the effectiveness of game-based SRSs in distance education courses designed for undergraduate students. Therefore, future studies are needed to address this gap.
6 Limitations
This review was limited by the search location within the electronic databases accessible via the library system. The search was also limited to articles published in peer-reviewed journals between 2015 and 2021 to cover the most recent research. The review also concentrated on empirical evidence identifying the effects game-based SRSs can have on undergraduate students’ engagement, motivation, and learning in traditional, blended, and distance learning courses. In addition, findings from single studies might be misleading or confusing if those studies were poorly designed. Lastly, thematic interests were used to evaluate the quality of the research studies in terms of research design, methodology, the instruments used to measure learning gains, the statistical tests used to test the hypotheses, and the accuracy of the research findings. As a result, conducting such a review posed many challenges, including carrying out a quality appraisal, synthesizing findings when qualitative, quantitative, and mixed research methods were used, and determining which parts of the review to include in the publication.
7 Discussion
This review focused on identifying primary-level studies investigating the impact of using game-based SRSs as a formative assessment for promoting student engagement, motivation, and active learning in undergraduate courses in higher education settings (traditional, blended, and distance education). A total of 69 articles, published between 2015 and 2021 on game-based SRSs and their impact on teaching and learning outcomes in undergraduate courses in Australia, the US, Europe, China, Taiwan, India, Guyana, Ireland, Norway, and Turkey, were identified using the search terms during March–May 2020 and December 2021. However, 10 of these studies did not meet the inclusion criteria. Most of the excluded studies were literature reviews, reflections on practice, or focused on developing a conceptual framework for game-based SRSs. In addition, not a single study investigated the use of game-based SRS tools in a distance education course, which indicates that DGBL initiatives and other game-based learning pedagogies for e-learners may be overlooked or underrepresented.
Despite the limited research in distance education or online learning courses, the researcher gained valuable insights into the different types of game-based SRSs used in traditional and blended learning environments, as well as their impact on teaching, student engagement, motivation, and learning. Although much research is reported in the literature, the findings are limited in that only two studies reported effect sizes, which did not allow the researcher to provide aggregate results. Also, more than half of the studies (59%) explored students’ perceptions, attitudes, and experiences using a game-based SRS as a teaching and learning tool in undergraduate courses in the US and internationally. Therefore, further experimental or quasi-experimental studies are needed to draw accurate inferences about the cause-and-effect relationship between the intervention (a game-based SRS) and student learning outcomes (test scores).
8 Conclusion
As is evident from the research literature, the impact of game-based SRSs on student engagement, motivation, interaction, satisfaction, and learning in blended and face-to-face undergraduate classrooms is well documented. However, despite the positive findings, the level of effectiveness in terms of engagement, teaching effectiveness, academic performance, and student preference varied depending on the type of SRS used to facilitate teaching and learning in an undergraduate course. For that reason, it is important to consider factors such as classroom size, comfort with technology, course content, the time and resources required for quality game design, and students’ learning styles before selecting and implementing a game-based SRS (Laici & Pentucci, 2019). In addition, scarce research exists on the effectiveness of game-based SRSs in distance education courses designed for first-year learners. Therefore, further empirical studies are needed to address this gap in the research literature.
9 Recommendations for Practice
Clearly, research findings from the empirical studies indicate that game-based SRS tools can be effective not only in enhancing teaching but also in increasing student participation, engagement, and learning. The majority (75%) of the studies outlined in Table 2 had significant findings; that is, the game-based SRS studied had a positive impact on student learning outcomes, and posttest scores were statistically higher after the intervention was used. When presented with such tools, students are encouraged to engage in class lectures and discussions, connect and interact with their classmates and instructor, and respond to feedback. As a result, students become active learning participants, which, in turn, improves their learning and satisfaction in a course. Based on the evidence from high-quality studies, college faculty should be encouraged to use game-based SRSs in traditional, blended, and distance education courses designed for undergraduate students. Although positive findings in traditional and blended learning environments were evident, little is known about the effects of game-based SRSs in distance education courses. Therefore, higher education leadership should promote game-based learning pedagogies in distance education courses by supporting research in DGBL and game-based SRSs. By understanding how these tools influence engagement and learning, higher education leaders can provide better opportunities for training and development, particularly for undergraduate students.
Data availability
The author included a detailed account of the data available from previous empirical studies for the purpose of this article.
References
Abdulla, M. H. (2018). The use of an online student response system to support learning Physiology during lectures to medical students. Educational Information Technology, 23, 2939–2946. https://doi.org/10.1007/s10639-018-9752-0
Anderson, S., Goss, A., Inglis, M., Kaplan, A., Samarbakhsh, L., & Toffanin, M. (2018). Do clickers work for students with poorer grades and in harder courses? Journal of Further and Higher Education, 42(6), 797–807.
Andzik, N. R., Gist, C. M., Smith, E. E., Xu, M., & Neef, N. A. (2021). The effects of gaming on university student quiz performance. Journal of Effective Teaching in Higher Education, 2(1), 109–119.
Ashtari, S., & Taylor, J. (2021). Winning together: Using game-based response systems to boost perception of learning. International Journal of Education & Development Using Information & Communication Technology, 17(1), 123–141.
Asiksoy, G., & Sorakin, Y. (2018). The effects of clicker-aided flipped classroom model on learning achievement, physics anxiety and students’ perceptions. International Online Journal of Education and Teaching, 5(2), 334–346.
Aslan, B., & Seker, H. (2017). Interactive response systems (IRS) Socrative application sample. Journal of Education and Learning, 6(1), 167–174. https://doi.org/10.5539/jel.v6n1p167
Badia Valiente, J. D., Olmo Cazevieille, F., & Navarro Jover, J. M. (2016). On-line quizzes to evaluate comprehension and integration skills. Journal of Technology and Science Education, 6(2), 75–90.
Bicen, H., & Kocakoyun, S. (2017). Determination of University students’ most preferred mobile application for gamification. World Journal on Educational Technology: Current Issues, 9(1), 18–23.
Buil, I., Catalán, S., & Martínez, E. (2016). Do clickers enhance learning? A control-value theory approach. Computers & Education, 103, 170–182. https://doi.org/10.1016/j.compedu.2016.10.009
Buil, I., Catalán, S., & Martínez, E. (2019). The influence of flow on learning outcomes: An empirical study on the use of clickers. British Journal of Educational Technology, 50(1), 428–439. https://doi.org/10.1111/bjet.12561
Campillo-Ferrer, J., Miralles-Martínez, P., & Sánchez-Ibáñez, R. (2020). Gamification in higher education: Impact on student motivation and the acquisition of social and civic key competencies. Sustainability, 12(4822), 4822. https://doi.org/10.3390/su12124822
Carroll, J.-A., Sankupellay, M., Newcomb, M., Rodgers, J., & Cook, R. (2018). GoSoapBox in public health tertiary education: A student response system for improving learning experiences and outcomes. Australasian Journal of Educational Technology, 34(5), 58–71. https://doi.org/10.14742/ajet.3743
Caserta, S., Tomaiuolo, G., & Guido, S. (2021). Use of a smartphone-based student response system in large active-learning chemical engineering thermodynamics classrooms. Education for Chemical Engineers, 36, 46–52. https://doi.org/10.1016/j.ece.2021.02.003
Chan, S. C. H., Wan, C. L. J., & Ko, S. (2019). Interactivity, active collaborative learning, and learning performance: The moderating role of perceived fun by using personal response systems. The International Journal of Management Education, 17(1), 94–102. https://doi.org/10.1016/j.ijme.2018.12.004
Chang, C.-C., Warden, C. A., Liang, C., & Lin, G.-Y. (2018). Effects of digital game-based learning on achievement, flow and overall cognitive load. Australasian Journal of Educational Technology, 34(4), 155–167.
Cheng, L. T. W., & Wang, J. W. (2018). Enhancing learning performance through classroom response systems: The effect of knowledge in a global economic environment. Journal of Teaching in International Business, 29(1), 49–61.
Cheng, L. T. W., & Wang, J. W. (2019). Enhancing learning performance through classroom response systems: The effect of knowledge type and social presence. International Journal of Management Education, 17(1), 103–118. https://doi.org/10.1016/j.ijme.2019.01.001
Chien, Y., Chang, Y., & Chang, C. (2016). Do we click in the right way? A meta-analytic review of clicker-integrated instruction. Educational Research Review, 17, 1–18.
Datta, R., Datta, K., & Venkatesh, M. D. (2015). Evaluation of interactive teaching for undergraduate medical students using a classroom interactive response system in India. Medical Journal Armed Forces India, 71(3), 239.
Deng, L. (2019). Assess and engage: How Poll Everywhere can make learning meaningful again for millennial library users. Journal of Electronic Resources Librarianship, 31(2), 55–65.
El Shaban, A. (2017). The use of “Socrative” in ESL classrooms: Towards active learning. Teaching English with Technology, 17(4), 64–77.
Elelmahdi, I., Al-Hattami, A., & Fawzi, H. (2018). Using technology for formative assessment to improve students’ learning. TOJET: The Turkish Online Journal of Educational Technology, 17(2), 66.
Erhel, S., & Jamet, E. (2016). The effects of goal-oriented instructions in digital game-based learning. Interactive Learning Environments, 24(8), 1744–1757. https://doi.org/10.1080/10494820.2015.1041409
Feraco, T., Casali, N., Tortora, C., Dal Bon, C., Accarrino, D., & Meneghetti, C. (2020). Using mobile devices in teaching large university classes: How does it affect exam success? Frontiers in Psychology. https://doi.org/10.3389/fpsyg.2020.01363
Field, A. (2009). Discovering statistics using SPSS (3rd ed). Sage.
Florenthal, B. (2018). Students’ motivation to participate via mobile technology in the classroom: A uses and gratifications approach. Journal of Marketing Education. https://doi.org/10.1177/0273475318784105
Flosason, T., McGee, H., & Diener-Ludwig, L. (2015). Evaluating impact of small-group discussion on learning utilizing a classroom response system. Journal of Behavioral Education, 24(3), 317–337. https://doi.org/10.1007/s10864-015-9225-0
Funnell, P. (2017). Using audience response systems to enhance student engagement and learning in information literacy teaching. Journal of Information Literacy, 11(2), 28.
Gough, D., Oliver, S., & Thomas, J. (2017). Introducing systematic reviews. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews (2nd ed., pp. 1–18). Sage.
Green, A. J., Chang, W., Tanford, S., & Moll, L. (2015). Student perceptions towards using clickers and lecture software applications in hospitality lecture courses. Journal of Teaching in Travel & Tourism, 15(1), 29–47.
Guarascio, A. J., Nemecek, B. D., & Zimmerman, D. E. (2017). Experiences in teaching and learning: Evaluation of students’ perceptions of the Socrative application versus a traditional student response system and its impact on classroom engagement. Currents in Pharmacy Teaching and Learning, 9, 808–812. https://doi.org/10.1016/j.cptl.2017.05.011
Heyvaert, M., Hannes, K., & Onghena, P. (2017). Introduction to MMRS literature reviews. In Using mixed methods research synthesis for literature reviews (pp. 1–22). SAGE. https://doi.org/10.4135/978150633324
Hsiung, W. Y. (2018). The use of E-resources and innovative technology in transforming traditional teaching in chemistry and its impact on learning chemistry. International Journal of Interactive Mobile Technologies, 12(7), 86–96. https://doi.org/10.3991/ijim.v12i7.9666
Hung, H.-T., Yang, J. C., Hwang, G.-J., Chu, H.-C., & Wang, C.-C. (2018). A scoping review of research on digital game-based language learning. Computers & Education, 126, 89–104. https://doi.org/10.1016/j.compedu.2018.07.001
Hunsu, N. J., Adesope, O., & Bayly, D. J. (2016). A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect. Computers & Education, 94, 102–119. https://doi.org/10.1016/j.compedu.2015.11.013
Ingalls, V. (2020). Students vote: A comparative study of student perceptions of three popular web-based student response systems. Technology, Knowledge and Learning, 25(3), 557–567.
Joshi, N., Lau, S.-K., Pang, M. F., & Lau, S. S. Y. (2021). Clickers in class: Fostering higher cognitive thinking using ConcepTests in a large undergraduate class. Asia-Pacific Education Researcher, 30(5), 375–394.
Karaaslan, H., Kilic, N., Guven-Yalcin, G., & Gullu, A. (2018). Students’ reflections on vocabulary learning through synchronous and asynchronous games and activities. Turkish Online Journal of Distance Education (TOJDE), 19(3), 53–70.
Kokina, J., & Juras, P. E. (2017). Using Socrative to enhance instruction in an accounting classroom. Journal of Emerging Technologies in Accounting, 14(1), 85–97. https://doi.org/10.2308/jeta-51700
Kramer, M. M., & Stover, S. (2015). Implementing social norm pedagogy to impact students’ personal health behavior. Journal of Educational Technology, 12(3), 1–12.
LaDue, N. D., & Shipley, T. F. (2018). Click-on-diagram questions: A new tool to study conceptions using classroom response systems. Journal of Science Education & Technology, 27(6), 492–507. https://doi.org/10.1007/s10956-018-9738-0
Laici, C., & Pentucci, M. (2019). Feedback with technologies in higher education: A systematic review. Form@re, 19(3), 6–25. https://doi.org/10.13128/form-7698
Lee, U., Sbeglia, G., Ha, M., Finch, S., & Nehm, R. (2015). Clicker score trajectories and concept inventory scores as predictors for early warning systems for large STEM classes. Journal of Science Education & Technology, 24(6), 848–860. https://doi.org/10.1007/s10956-015-9568-2
Liu, C., Sands-Meyer, S., & Audran, J. (2019). The effectiveness of the student response system (SRS) in English grammar learning in a flipped English as a foreign language (EFL) class. Interactive Learning Environments, 27(8), 1178–1191. https://doi.org/10.1080/10494820.2018.1528283
Ludvigsen, K., Krumsvik, R. J., & Breivik, J. (2020). Behind the scenes: Unpacking student discussion and critical reflection in lectures. British Journal of Educational Technology, 51(6), 2478–2494.
Ludvigsen, K., Krumsvik, R., & Furnes, B. (2015). Creating formative feedback spaces in large lectures. Computers & Education, 88, 48–63. https://doi.org/10.1016/j.compedu.2015.04.002
Ma, S., Steger, D. G., Doolittle, P. E., & Stewart, A. C. (2018). Improved academic performance and student perceptions of learning through use of a cell phone-based personal response system. Journal of Food Science Education, 17(1), 27. https://doi.org/10.1111/1541-4329.12131
Meguid, E., & Collins, M. (2017). Students’ perceptions of lecturing approaches: Traditional versus interactive teaching. Advances in Medical Education and Practice, 8, 229–241. https://doi.org/10.2147/AMEP.S131851
Mitchell, G., McVeigh, C., Carlisle, S., & Brown-Wilson, C. (2020). Evaluation of a co-produced delirium awareness programme for undergraduate nursing students in Northern Ireland: A pre-test/post-test study. BMC Nursing, 19(1), 1–9. https://doi.org/10.1186/s12912-020-00427-9
Muir, S., Tirlea, L., Elphinstone, B., & Huynh, M. (2020). Promoting classroom engagement through the use of an online student response system: A mixed methods analysis. Journal of Statistics Education, 28(1), 25–31.
Munusamy, S., Osman, A., Riaz, S., Ali, S., & Mraiche, F. (2019). The use of Socrative and Yammer online tools to promote interactive learning in pharmacy education. Currents in Pharmacy Teaching & Learning, 11(1), 76.
Newman, M., & Gough, D. (2017). Part I: Methodical considerations. In O. Zawacki-Richter, M. Kerres, S. Bedenlier, M. Bond, & K. Buntins (Eds.), Systematic reviews in educational research: Methodology, perspectives and application (pp. 3–22). Springer.
Owen, H. E., & Licorish, S. A. (2020). Game-based student response system: The effectiveness of Kahoot! on junior and senior information science students’ learning. Journal of Information Technology Education: Research, 19, 511–553.
Perera, V., & Hervás-Gómez, C. (2021). University students’ perceptions toward the use of an online student response system in game-based learning experiences with mobile technology. European Journal of Educational Research, 10, 1009–1022. https://doi.org/10.12973/eu-jer.10.2.1009
Pérez Garcias, A., & Marín, V. (2016). Ethics issues of digital contents for pre-service primary teachers: A gamification experience for self-assessment with Socrative. The IAFOR Journal of Education, 4(2), 80–96.
Persaud, V., & Persaud, R. (2019). Increasing student interactivity using a Think-Pair-Share Model with a Web-based student response system in a large lecture class in Guyana. International Journal of Education and Development Using Information and Communication Technology, 15(2), 117–131.
Petticrew, M., & Roberts, H. (2005). Systematic reviews in the social sciences: A practical guide. Wiley.
Rahmahani, D., & Suyoto, & Pranowo. (2020). The effect of gamified student response system on students’ perception and achievement. International Journal of Engineering Pedagogy, 10(2), 45–59. https://doi.org/10.3991/ijep.v10i2.11698
Sanchez, D. R., Langer, M., & Kaur, R. (2020). Gamification in the classroom: Examining the impact of gamified quizzes on student learning. Computers & Education. https://doi.org/10.1016/j.compedu.2019.103666
Sawang, S., O’Connor, P., & Ali, M. (2017). IEngage: Using technology to enhance students’ engagement in a large classroom. Journal of Learning Design, 10(1), 11–19.
Sprague, A. (2016). Improving the ESL graduate writing classroom using Socrative: (Re)considering exit tickets. TESOL Journal, 7(4), 989–998. https://doi.org/10.1002/tesj.295
Stowell, J. R. (2015). Use of clickers vs. mobile devices for classroom polling. Computers & Education, 82, 329–334. https://doi.org/10.1016/j.compedu.2014.12.008
Subhash, S., & Cudney, E. (2018). Gamified learning in higher education: A systematic review of the literature. Computers in Human Behavior, 87, 192–206.
Thomas, C. N., Pinter, E. B., Carlisle, A., & Goran, L. (2015). Student response systems: Learning and engagement in preservice teacher education. Journal of Special Education Technology, 30(4), 223–237. https://doi.org/10.1177/0162643415623026
Thomas, J., O’Mara-Eves, A., Harden, A., & Newman, M. (2017b). Synthesis methods for combining and configuring textual or mixed methods data. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews (2nd ed., pp. 181–211). Sage.
Thomas, J., O’Mara-Eves, A., Kneale, D., & Shemilt, I. (2017a). Synthesis methods for combining and configuring quantitative data. In D. Gough, S. Oliver, & J. Thomas (Eds.), An introduction to systematic reviews (2nd ed., pp. 211–250). Sage.
Tivener, K. A., & Hetzler, T. (2015). The effects of an electronic audience response system on athletic training student knowledge and interactivity. Athletic Training Education Journal, 10(3), 212–218.
Tornwall, J., Lu, L., & Xie, K. (2020). Frequency of participation in student response system activities as a predictor of final grade: An observational study. Nurse Education Today. https://doi.org/10.1016/j.nedt.2020.104342
Turner, P. E., Johnston, E., Kebritchi, M., Evans, S., & Heflich, D. A. (2018). Influence of online computer games on the academic achievement of nontraditional undergraduate students. Cogent Education, 5(1), 66.
Van Daele, T., Frijns, C., & Lievens, J. (2017). How do students and lecturers experience the interactive use of handheld technology in large enrolment courses? British Journal of Educational Technology, 48(6), 1318–1329.
Voith, L. A., Holmes, M. R., & Duda-Banwar, J. (2018). Clicking toward better grades: The use of student response systems in social work education. Journal of Social Work Education, 54(2), 239–249.
Walklet, E., Davis, S., Farrelly, D., & Muse, K. (2016). The impact of student response systems on the learning experience of undergraduate psychology students. Psychology Teaching Review, 22(1), 35–48.
Wang, A., & Tahir, R. (2020). The effect of using Kahoot! for learning—A literature review. Computers & Education, 149, 1–22. https://doi.org/10.1016/j.compedu.2020.103818
Wang, Y. (2018). Interactive response system (IRS) for college students: Individual versus cooperative learning. Interactive Learning Environments, 26(7), 943–957.
Wong, A. (2016a). Classroom response systems and student performance improvement: Local versus international students. Journal of Teaching in International Business, 27(4), 197–208.
Wong, A. (2016b). Student perception on a student response system formed by combining mobile phone and a polling website. International Journal of Education and Development Using Information and Communication Technology, 12(1), 144–153.
Wood, A. (2020). Utilizing technology-enhanced learning in geography: Testing student response systems in large lectures. Journal of Geography in Higher Education, 44(1), 160–170.
Yabuno, K., Luong, E., & Shaffer, J. F. (2019). Comparison of traditional and gamified student response systems in an undergraduate human anatomy course. HAPS Educator, 23(1), 29–36.
Yildirim, I. (2017). The effects of gamification-based teaching practices on student achievement and students’ attitudes toward lessons. Internet and Higher Education, 33, 86–92.
Zhang, L., Cheng, J., Lei, J., & Wang, O. (2020). How the anonymous feature of audience response system influences the interactions of students by different types of questions. Journal of Educational Technology Development & Exchange, 13(1), 39–56. https://doi.org/10.18785/jetde.1301.03
Zou, D., & Lambert, J. (2017). Feedback methods for student voice in the digital age. British Journal of Educational Technology, 48(5), 1081–1091. https://doi.org/10.1111/bjet.12522
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
Ethics declarations
Conflict of interest
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Squire, N. Undergraduate Game-Based Student Response Systems (SRSs): A Systematic Review. Tech Know Learn 28, 1903–1936 (2023). https://doi.org/10.1007/s10758-023-09655-9