1 Introduction

Learning by teaching has been generally accepted as one of the best ways for students to learn for centuries. As of this writing, Google Scholar lists over 7,030 entries for a search on this topic, 1,390 more than two years ago when our first article about Flip-Flop was published (Stelovska et al. 2016). A yearly increase of almost 10% in scholarly publications clearly shows that interest in this arena continues unabated and that the demand for practical applications supporting this idea will keep growing. Many sources (Duran 2016; Hanke 2012; Goodlad and Hirst 1989; Biswas et al. 2015; https://en.wikipedia.org/wiki/Learning_by_teaching) discuss the various pros and cons of learning by teaching: students benefit not only in the amount of knowledge gained, but also in the depth to which they understand the material.

As the theory of learning by teaching informed the concepts of Active Learning (https://en.wikipedia.org/wiki/Active_learning; Bishop and Verleger 2013) and Constructivist Learning (Vygotskii 1978; https://en.wikipedia.org/wiki/Constructivism_(philosophy_of_education)), these approaches have also seen widespread acceptance and integration into curricula. Both methodologies prioritize actively involving students in their own education rather than having them passively absorb lectures. These concepts in general, and our approach in particular, satisfy most of the levels in all the dimensions of Bloom's Taxonomy of Educational Objectives (https://en.wikipedia.org/wiki/Bloom%27s_taxonomy).

In the traditional model of instruction, students attend lectures in person while working on assignments at home. The Flipped Classroom (Bishop and Verleger 2013; https://en.wikipedia.org/wiki/Flipped_classroom) builds on these concepts and reverses this model: students watch lecture videos before class sessions, then work on exercises and ask questions in class where the instructor is available to assist them.

While Massive Open Online Courses (MOOCs) (Baggaley 2013; Christensen et al. 2014; https://en.wikipedia.org/wiki/Massive_open_online_course; Vardi 2012; Hartnett 2013) are often regarded merely as a means of replacing classroom lectures with screencasts and lecture recordings on a larger scale, e.g. by letting students watch and listen to lectures from around the globe, such resources can obviously be used in flipped classrooms as well.

The flipped classroom depends on students actually learning the material presented in the lecture videos. Even if students do watch the videos, there is no guarantee that they do so attentively: they may view the lectures in an environment with distractions or listen to the lecture in the background as they work on other tasks. If students do not understand the concepts covered in the screencast at home, their ability to participate in problem solving during the classroom session is constrained and therefore they learn less from those exercises as well.

The same problems arise in the MOOC context: while students can easily re-watch a section of a lecture video, how do they know whether it is necessary, i.e. whether they understood it or not? Here, immediate feedback via test questions synchronized with crucial points within the screencast is an obvious remedy. But even if creating such quizzes is supported within the underlying technology, it is unlikely that most instructors will embrace additional quiz-authoring chores.

Therefore we propose that students construct quizzes synchronized with lecture videos and, while watching the screencasts, take the quizzes their peers constructed. Our Flip-Flop methodology organizes these activities in a well-structured fashion over the entire length of the course and offers a complete set of supporting online tools. In particular, we introduce Peer Improvement, a component of Flip-Flop that allows the student who just took a quiz to suggest improvements to all the quiz tasks and their elements.

According to the most recent Fall 2017 report by Hill (2017), 87% of institutions of higher learning and 91% of student enrollments rely upon a learning management system (LMS) such as Blackboard (http://www.blackboard.com), Canvas (2018), Moodle (https://moodle.org), or D2L Brightspace (https://www.d2l.com). None of these popular school-wide 'big four' LMSs offers quiz editing features for students, records the authored quizzes, or allows taking peer quizzes. While the name of Quizlet (https://quizlet.com), another recent commercial LMS, seems to indicate that quizzes are its core essence, it mainly supports creating study plans, scheduling study sessions, and taking short quizzes to measure progress; it does not focus on creating peer quizzes. Moreover, Quizlet does not target educational institutions and is used almost exclusively in the language learning community. Quiz It!, another interesting framework that recently won the "Best Education Hack" and "Best Google Cloud Platform Hack" prizes (https://devpost.com/software/quiz-it), attempts to create quizzes automatically using artificial intelligence to analyze the underlying resources. While some platforms, e.g. Kahoot (https://kahoot.com), offer quiz-making based on video lectures, none of them features the comprehensive set of features that our methodology proposes. In particular, the Peer Improvement component mentioned above is not available in any of the aforementioned environments.

The remaining sections of this article are organized as follows: while Sect. 9.2 introduces the general principles of Flip-Flop, Sect. 9.3 describes the technology framework that supports these principles. Section 9.4 then describes a study evaluating the principles behind the Flip-Flop methodology and showcases the quizzes that students created. Finally, Sects. 9.5 and 9.6 discuss planned improvements to the existing software implementation, offer a glimpse of upcoming data analysis efforts, and highlight the potential impact on education and knowledge acquisition per se.

While this article is mainly based on our contributions to the HCII 2018 conference (Stelovsky et al. 2018; Ogawa 2018), several sections, in particular Sect. 9.6 (The Vision), cover new ground.

2 Flip-Flop Concepts

While the Flip-Flop approach was devised primarily with the Flipped Classroom methodology in mind, it can also enhance any MOOC-type instruction, as it adds a "flop" component to any lecture video: learners construct quizzes synchronized with educational screencasts.

In order to create quality quizzes, students must understand the material in the lecture video. Even when constructing a simple multiple-choice task, the quiz author has to think of a question that is relevant for the specific screencast segment, along with answer choices that are neither trivial to guess nor impossible to answer. Finding the correct answer is typically straightforward once the student has decided on a question. Developing the incorrect answers, however, is non-trivial: they must be wrong, but not obviously so.

Even multiple-choice quizzes can offer features that, while being more informative for the quiz-taking student, are more challenging and therefore can bring more educational benefit to the quiz author. For instance, we propose that each answer can be accompanied by feedback that explains why the answer is correct or incorrect. In particular, the feedback for an incorrect answer can indicate the likely misconception that leads to such a choice. Thinking in terms of answer feedbacks and how to formulate them concisely not only benefits the quiz author, but also demonstrates the pros and cons of the author's approach to the peers who take the quiz.

Furthermore, we propose that the author can append a short hint phrase as well as a hint link to any question. Formulating a concise hint is quite challenging as the author should only point the quiz taker in the right direction without giving away the entire correct answer. The hint link also challenges the author to find the most appropriate web page online that explains the topic well enough to answer the corresponding question.

In our version of Flip-Flop, the answer feedbacks, hints, and hint links are optional. However, the quiz templates that are automatically added to authors' channels to assist with their initial editing chores include several dummy answers, each with a default dummy feedback, a default hint, and a dummy hint link. While these dummy elements seem to simplify the student's work, they must be deleted if the author decides to omit them, and thus serve as a reminder that replacing them with substantial content would be more appropriate.

When synchronizing tasks with the video, the author can simply limit the response time to the duration of the corresponding video segment, or choose to pause the video and give the quiz taker a specific time limit to choose an answer. He or she can also choose whether to show the correct answer after an incorrect answer was selected.
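
To make these elements concrete, the following minimal sketch models a quiz task as a data structure. All field names and defaults are our illustrative assumptions, not the actual Flip-Flop schema:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Answer:
        text: str
        correct: bool = False
        feedback: Optional[str] = None      # e.g. the misconception behind a wrong choice

    @dataclass
    class Task:
        question: str
        answers: List[Answer]
        start: float                        # segment start within the video (seconds)
        end: float                          # segment end (seconds)
        pause_video: bool = False           # pause instead of running with the segment
        time_limit: Optional[float] = None  # response window when the video is paused
        show_correct: bool = True           # reveal the right choice after a wrong one
        hint: Optional[str] = None          # optional short hint phrase
        hint_link: Optional[str] = None     # optional link to an explanatory web page

    def template_task(start: float, end: float) -> Task:
        """Dummy task as inserted into an author's channel: every placeholder
        must be replaced with real content or deliberately deleted."""
        return Task(
            question="Dummy question",
            answers=[Answer("Dummy answer 1", correct=True, feedback="Dummy feedback"),
                     Answer("Dummy answer 2", feedback="Dummy feedback")],
            start=start, end=end,
            hint="Dummy hint", hint_link="https://example.com",
        )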

In addition to multiple-choice questions, we support poll tasks, where there is no distinction between correct and incorrect answers, as well as "pinboard tasks", where the author displays text or an image and optionally a link to an external web page, and the quiz taker does not need to take any action except possibly clicking on the link to view such a page. Since the author of the quiz can specify the maximum points per task and whether the number of possible points decreases with time, correctly answered questions increase the total score for the quiz and consequently motivate and reward students for positive performance.
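
As a sketch of such a scoring rule, assuming a simple linear decay (the exact decay curve is not specified in this article):

    def points_awarded(max_points: int, decreases: bool,
                       elapsed: float, window: float) -> float:
        """Points for a correct answer. If the author enabled decreasing
        points, the award shrinks linearly from max_points to zero over
        the response window (segment duration or explicit time limit)."""
        if not decreases:
            return float(max_points)
        remaining = max(0.0, window - elapsed)
        return max_points * remaining / window

    print(points_awarded(10, True, elapsed=6.0, window=12.0))  # -> 5.0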

Since the students also take peer-generated quizzes in conjunction with creating quizzes, each student has multiple opportunities to engage with the material and thus assess themselves whether they understand the content.

Since the tightly structured and systematic approach of Flip-Flop does not fit neatly within other well-documented and researched educational methodologies, we propose the term Constructive Learning for approaches and technologies that require the students to construct teaching materials based on and synchronized with recordings of educational lessons.

3 Flip-Flop Technology

3.1 Taking Quizzes

To take a quiz, a student may either navigate directly to a quiz through a notification email or find the quiz in the list of quizzes that peers have created. A sample Flip-Flop quiz is shown in Fig. 9.1.

Fig. 9.1 Taking a quiz: question and answers

If the quiz taker selects the correct answer, he or she sees feedback indicating that the choice was correct (see Fig. 9.2).

Fig. 9.2 Taking a quiz: selecting a correct answer

If the selected answer is incorrect, the feedback will indicate this as well, ideally identifying the misconception behind the student's choice and suggesting how to correct the mistake (shown in Fig. 9.3).

Fig. 9.3 Taking a quiz: selecting an incorrect answer

The question in Fig. 9.4 also has a hint for the quiz taker to look at.

Fig. 9.4 Taking a quiz: viewing a hint

3.2 Peer Improvement

Once the student has finished taking the quiz, he or she enters an editing suite that pages through every task of the quiz just taken and allows an improvement to be suggested for every task component. For instance, clicking on the question of a task displays an editing field where the student can change the wording or type an entirely new question text. Similarly, the student can change each of the answers and modify the feedback for each answer, the task's hint text, and the hint link. Figure 9.5 shows the editing field where the question has been reworded. Notice that the differences between the original and the edited text of the question are shown in red and green.

Fig. 9.5 Peer improvement: suggested improvements

The editing facilities do not stop at editing texts. Students can suggest that another answer is correct rather than the one the author selected, simply by clicking on its red "x" sign. (A green check mark will then indicate the answer deemed correct.)

Notice that once one or more students take the quiz and suggest improvements, the next student can select any of the previous improvements (as well as the original item) and improve it again. Alternatively, the student can just click on the check mark to indicate that he or she approves of that particular item. Such an approval earns the suggestion a '+'. As Fig. 9.6 shows, the '+' stars appear next to the ID of the student who suggested the improvement.

Fig. 9.6 Peer improvement: approval
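
A minimal sketch of how such improvement chains and approvals could be represented; the names below are hypothetical, not the stored schema:

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Suggestion:
        component: str                          # e.g. "question", "answer 2", "hint"
        text: str                               # the proposed wording
        author_id: str                          # the peer who suggested it
        parent: Optional["Suggestion"] = None   # an improvement of a previous improvement
        plus: int = 0                           # '+' approvals from later quiz takers

    def approve(s: Suggestion) -> None:
        """A quiz taker clicks the check mark on an existing suggestion."""
        s.plus += 1

    def showcase(suggestions: List[Suggestion]) -> List[Suggestion]:
        """Instructor view: improvements ordered by the '+' scores earned."""
        return sorted(suggestions, key=lambda s: s.plus, reverse=True)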

We also automatically feature an evaluation page where the quiz taker clicks on the star rating for each of the essential quiz components as depicted in Fig. 9.7.

Fig. 9.7 Peer evaluation: star ratings

Peer Improvement has numerous advantages. The learner who just took the quiz and has its tasks fresh in memory can immediately change all the items he or she found problematic or even judges to be incorrect. Note also that only one feedback, the one associated with the selected answer, is shown during the quiz-taking session. In contrast, all the feedbacks are shown during the Peer Improvement session. Since we encourage authors to provide additional information within the feedbacks, such as an indication of what misconception might have led to an incorrect answer, the student can learn even more in-depth aspects of the subject. Similarly, the Peer Improvement framework presents all the hints and hint links, which were not shown during the quiz unless the student clicked on the "Hint" button, enabling the student to follow the hint links and learn from these additional resources.

Peer Improvement is also likely to increase the motivation of learners. After all, improving quiz items, and even suggesting additional improvements to already posted ones, can be perceived as a challenging opportunity to showcase mastery of the subject. Collecting '+' marks from peers only adds to the student's motivation.

Last but not least, the instructor naturally benefits from the Peer Improvement methodology, as the improvements can be showcased in the order of most '+' scores earned. This makes it easy to incorporate the Flip-Flop methodology into grading, to build a prioritized database of quiz tasks that can be used in future tests, and to promote the applications of online tasks discussed in the Conclusions section below.

3.3 Creating Quizzes

In order to create a quiz with the Flip-Flop authoring tool, a student must first log into his or her channel, currently hosted on slippah.com, and select the quiz he or she needs to create. The initial quiz authoring page is shown in Fig. 9.8.

Fig. 9.8 Creating a quiz: authoring tool

The left half of the page contains the video and segments for the quiz. The gray anchors below the video indicate the start and end points of the quiz; this particular quiz covers the second quarter of the lecture. Between those anchors, green and yellow rectangles represent individual question segments: the question will be displayed on the screen during that portion of the video. The student may change the quantity, placement, and duration of these segments to ensure that the application displays quiz questions during relevant portions of the video.

The right-hand side of the page shows the currently selected question (highlighted in green below the video), up to five answers with radio buttons to indicate the correct choice, as well as feedback for each of the answers that can be expanded or hidden using the text bubble to the right of each answer.

Clicking the light bulb to the right of the question reveals the hint entry with fields where the author can post the hint link and type in the label for the link as shown in Fig. 9.9. Since we observed that some students in the Algorithms CS course often wanted to formulate quiz content with special math symbols and Greek letters used in the textbook, we have now added a special symbol keyboard pane shown on the right.

Fig. 9.9 Creating a quiz: editing a hint with hint link and label; symbols pane

3.4 Instructor Perspective

Simplifying instructors' chores is one of the main objectives of a learning management system. Given that the Flip-Flop method needs specific scheduling facilities, essential for assigning students quizzes to make and take based on screencasts, a substantial portion of the underlying technology had to be devoted to providing instructors with (1) a tool for easy review of the students' work, and (2) a scheduling tool that subdivides the students into groups and assigns the quiz authors portions of the corresponding screencasts.

A previous iteration of this application had a standalone website displaying all of the quizzes for the current course. This page (shown in Fig. 9.10) divided the students into groups for each quiz and provided links to the lecture video and notes for students to refer to.

Fig. 9.10 Old scheduling tool

The student assigned to a section of the video was highlighted in blue. Hovering the mouse over the name of a student brought up a menu of choices that allowed the quiz authors to send links to their peers, who then took the quizzes through this same interface.

The current version of the scheduling tool helps instructors define the student groups, determine the quiz author for a particular segment of a screencast, decide when to generate the corresponding quiz templates and add them to each student's channel, and remind the author once the template is up as well as when his or her quiz is due (see Fig. 9.11).

Fig. 9.11 Instructor perspective: creating quizzes
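
A minimal sketch of the core assignment step, assuming each group's screencast is divided into equal consecutive portions (the real tool additionally generates the templates and reminder emails):

    from typing import Dict, List, Tuple

    def assign_authors(students: List[str],
                       video_length: float) -> Dict[str, Tuple[float, float]]:
        """Give each quiz author one equal, consecutive portion of the
        screencast, as a (start, end) pair in seconds."""
        portion = video_length / len(students)
        return {name: (i * portion, (i + 1) * portion)
                for i, name in enumerate(students)}

    # e.g. a 50-minute lecture split among a group of four students:
    print(assign_authors(["Ann", "Ben", "Cam", "Dee"], 3000.0))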

After students create quizzes, their instructor can easily look through them for grading. For example, Fig. 9.12 shows the contents of a quiz that Samantha Lewis wrote.

Fig. 9.12 Instructor perspective: viewing quizzes

3.5 Channels

In the previous version of our software technology, students had to download a template of their quiz from the course website, integrate it manually within their own web pages, and, after they finished authoring the quiz based on this template, return to the course web page and choose a menu item that sent an email to the peers who needed to take the new quiz.

In the current version, each student has his or her own channel on the Flip-Flop web site. A template is now inserted automatically for each new quiz to be authored, based on the course schedule. Once the student has created the quiz, he or she simply clicks a button that sends an email to all the peers announcing that the quiz is ready.

4 Initial Assessment of Flip-Flop Conceptual Framework

We conducted an initial assessment of the Flip-Flop conceptual framework to determine its benefits in Computer Science education. We utilized a two-part analysis focusing on (1) task performance compared to traditional homework assignments, and (2) the impact on study methods.

4.1 Initial Assessment of Flip-Flop

To evaluate the effectiveness of Flip-Flop, we focused on its impact on task performance and study approach. The setting for the study was a computer science service course that utilized a hybrid approach combining traditional lectures and online video lectures. Students met in the lecture hall once a week for a 75-min lecture, subsequently watched an approximately 25-min online video developed by a faculty member, and completed a 10-question quiz based on the video content. This structure repeated each week with a range of topics including search sets, logic, financial functions, social computing, security, and information management. Students also met in the laboratory with teaching assistants twice a week for hands-on application of the topics covered in the lectures.

4.1.1 Data Collection Tools and Procedures

To determine the effectiveness of the Flip-Flop approach, we used an experimental design comparing the Flip-Flop and original approaches. Students who had enrolled in at least one upper-division course were invited to participate in the study for extra credit. A total of 14 students participated and were randomly assigned to the control or experimental group. The control group watched the video and completed a twenty-minute, 10-question quiz about the content, while the experimental group was required to watch the video and develop five multiple-choice questions with time stamps. After completing the initial activity, both groups completed a ten-minute, 5-question quiz based on the final examination questions; its score served as the dependent variable.

After completing the experiment, we conducted two focus group interviews with the participants to determine if the Flip-Flop approach changed the students’ study habits. We used a semi-structured interview format to allow the conversation to flow naturally and include follow-up questions based on responses. The following open-ended questions served as the interview guide:

  • How did you take notes during the video?

  • How did you study for the quiz(zes)?

4.1.2 Analysis

A univariate analysis of variance of the final examination question scores was used to determine the difference in task performance and whether the result was statistically significant. The focus group interview data were coded to identify themes among the participants' study habits.

4.2 Improving Task Performance with Flip-Flop

A univariate analysis of variance demonstrated a statistically significant difference (p < 0.05, F = 6.25) in final examination question scores between the control and experimental groups (Table 9.1). The mean for the control group was 80% and for the experimental group 94%. Participants using the Flip-Flop approach to study thus scored on average 14 percentage points higher than those using the traditional approach.

Table 9.1 Univariate analysis of variance
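For illustration, the analysis can be reproduced with a standard one-way ANOVA. The scores below are placeholders chosen only so that the group means match the reported 80% and 94%; they are not the study's raw data:

    from scipy import stats

    # Placeholder scores (percent correct on the 5-question quiz), 7 per group.
    control      = [70, 75, 80, 80, 85, 85, 85]   # mean 80%
    experimental = [90, 90, 95, 95, 95, 95, 98]   # mean 94%

    f, p = stats.f_oneway(control, experimental)
    print(f"F = {f:.2f}, p = {p:.4f}")  # the study reports F = 6.25, p < 0.05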

4.3 Changing Study Approaches

The control group (traditional approach) revealed four themes in study habits. Students focused their notes on facts and definitions, copied bulleted lists from slides, scrubbed the video for slides with text (used the navigation slider to select a section of the video), and multi-tasked while they watched the video. Several participants noted that they had the video playing in one window while they had their email open in another. Those students stopped working on their email messages when they heard or saw an important point to write down as study notes.

The experimental group (Flip-Flop approach) indicated that they also took factual notes. However, they also attempted to understand concepts and developed notes with illustrations to aid in question development. Figure 9.13 includes an example of a student illustration demonstrating how information was transferred. Since they were required to note the time for each question, their notes were more detailed and included timestamps next to facts and concepts rather than content alone. This helped them revisit difficult concepts because they knew where in the video to replay content instead of having to search for it. The Flip-Flop group also stayed focused on the task, because preparing questions required an active approach to note-taking, in contrast to the control group, who multitasked while watching the video. Most of the experimental group had difficulty creating appropriate distractor items for questions and noted that writing distractors took more time than writing the questions. The themes for both groups are summarized in Table 9.2.

Fig. 9.13 Sample illustration for information transfer

Table 9.2 Study habits for control and experimental groups

The initial benefits of the Flip-Flop model are quite promising, as it improved assessment scores and student note-taking approaches. The students' notes also addressed a typical concern about the focus on multiple-choice questions, namely the importance of developing a conceptual understanding of content. The approach led students to study concepts rather than facts, without additional instruction.

4.4 Flip-Flop in CS Courses

Until now, Flip-Flop has only been applied in computer science courses at the University of Hawaii at Manoa. It was primarily used in the inverted Algorithms course, ICS 311. We have constructed a web page that showcases the quizzes students constructed in Fall 2016, when Flip-Flop was in use for the first time (http://honza.epizy.com/slippah/questions.html). Figure 9.14 shows the top of this page and demonstrates that clicking on a question reveals all of its possible answers, their feedbacks, the hint, and the hint link, which when clicked opens the target resource in another tab. (The hint to the question currently displayed showcases that some students did take seriously our suggestion that a little humor does not hurt, even within an algorithms quiz.)

Fig. 9.14 Web page with the quizzes created in an algorithms course—top portion

As the tiny size of the scroll thumb in the bottom right of Fig. 9.15 indicates, the sheer volume of the 5554 student questions is quite impressive in its own right. However, the reader may appreciate not just the quantity but also the quality of the students' contributions, as most of the quiz tasks are well conceived, well formulated, and fit the screencast topics well.

Fig. 9.15 Web page with the quizzes created in an algorithms course—bottom portion

Subsequently, Flip-Flop was also employed in the Algorithms course in Fall 2017, and we are now expanding its use into Data Structures ICS 211, a second-semester programming course with a total of 86 students. As of this writing, the ICS 211 students this semester have created a total of 464 quizzes with 1410 questions, 4166 answers, 2754 feedbacks, and 509 hints. Students were instructed to create their quizzes with a minimum of two questions with two answers each; on average, each quiz had 3.04 questions and each question had 2.95 choices. Although students were provided with templates including feedbacks and hints, they did not actively use these options, averaging 0.661 feedback items per answer and 0.361 hints per question.
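
These per-item averages follow directly from the raw counts:

    quizzes, questions, answers, feedbacks, hints = 464, 1410, 4166, 2754, 509

    print(f"{questions / quizzes:.2f} questions per quiz")    # 3.04
    print(f"{answers / questions:.2f} choices per question")  # 2.95
    print(f"{feedbacks / answers:.3f} feedbacks per answer")  # 0.661
    print(f"{hints / questions:.3f} hints per question")      # 0.361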

Given that the ICS 211 course is a prerequisite for the ICS 311 course, one interesting option will be to reuse some of the quiz tasks created this semester for one or more of the quizzes that the ICS 311 students may take to ascertain that they have the knowledge needed to successfully continue their studies.

5 Conclusions and Future Work

5.1 Data Analysis, e.g. Can It Take Less Time?

Naturally, Flip-Flop can collect very detailed and abundant data, not only while students take a quiz but also while quizzes are created. For instance, our most recent version of the framework stores the time span an author needed to come up with each of the task components: the question, every answer and every feedback, every hint and hint link.

Our analysis of the collected data is still at a preliminary stage. However, we have developed, for instance, a visualization of the duration of the quiz tasks that allows us to compare this timing with the segmentation suggested by the template. As Fig. 9.16 shows, A. Hemmings, R. Black, and C. Carr were the only students who used exactly the number and timing of segments that the template proposed (equally wide rectangles of alternating grays). J. Mackenzie and D. McDonald allocated twice as much time to each of the first two questions as the template suggested, but while Mackenzie added more time to the last question, McDonald decided to add one more task. We can also see that K. Davies and J. Peake generated only two questions. In addition, several students have already become familiar with the paused type of question, indicated by left-pointing triangles.

Fig. 9.16 Visualization of quiz tasks timing
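
A sketch of how such a timing chart can be drawn; the segment data below are invented for illustration, while the real visualization reads them from the stored quizzes:

    import matplotlib.pyplot as plt

    # (start, end) of each task segment in seconds, one row per quiz.
    rows = {
        "Template": [(0, 100), (100, 200), (200, 300)],
        "Author A": [(0, 200), (200, 300)],  # doubled first task, kept the rest
    }

    fig, ax = plt.subplots()
    for y, (name, segments) in enumerate(rows.items()):
        # equally tall bars with alternating grays, as in Fig. 9.16
        ax.broken_barh([(s, e - s) for s, e in segments], (y - 0.4, 0.8),
                       facecolors=["0.55", "0.8"] * len(segments),
                       edgecolor="black")
    ax.set_yticks(range(len(rows)))
    ax.set_yticklabels(list(rows))
    ax.set_xlabel("position in screencast (s)")
    plt.show()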

While the detailed data analysis is still outstanding, we can cite several comments, critiques, and improvement suggestions that students typically voice. In our experience, while students usually see and appreciate the benefits of the Flip-Flop approach, their main objection is that they need to invest more time than in other classes. Indeed, creating a quiz is, at least initially, a challenging task. Since we are now collecting data about how long it takes an author to create a quiz, we will be able to compare the times at the beginning of the semester with those at the end. (Naturally, we expect the latter to be shorter, not only because of increased familiarity with the technology itself, but mainly because of better quiz-making skills.) If the Flip-Flop approach were applied across several consecutive courses, we would also expect fewer objections and more appreciation of the acquired benefits.

Our technology, however, could help shorten the time it takes a student to take a quiz. For instance, students have raised concerns about the time spent waiting between questions on their quizzes; we could provide a button that skips the remaining portion of the screencast segment once an answer was selected and starts playing the video at the beginning of the next task segment. Alternatively, we could add a scale that allows the quiz taker to speed up the screencast. Another feasible approach would be to let the author who creates a quiz from a segment of one screencast take only quizzes built from other screencasts, so that he or she does not need to view any portion of a particular screencast twice. While such speedups may please the students, they may be less appreciated by the instructors; after all, viewing a lecture twice certainly reinforces the understanding of the subject. Therefore we are inclined to make them optional within the scheduling tool and let the instructor decide whether they will be available to the students in his or her course.

We expect that the integration of the Agile Tooltip framework discussed below in Sect. 9.5.4 will lead to a user-centered approach that gives students ample opportunity to voice such concerns and even allows them to suggest further improvements to the entire Flip-Flop methodology and technology. We expect that such feedback will offer valuable data that can be analyzed to ascertain the pros and cons of not just the Flip-Flop methodology but the concept of Constructive Learning as a whole.

5.2 Additional Question Types

Our quizzes are currently limited to multiple-choice questions, polls, and pinboard tasks. Although this has been sufficient thus far, we envision having a wider range of questions for students to choose from.

Since the questions for Flip-Flop quizzes have been predominantly multiple-choice thus far, one next step might be to support the selection of multiple answers. For example, the author of a quiz might instruct his or her peers to select all of the options that are correct or to draw lines between the pairs of matching answers.

Multiple-choice questions limit students to a finite number of answers, and while this makes grading the quiz easier it also makes answering the question a matter of recognizing the correct choice rather than recalling or deducing the correct answer. Allowing quiz authors to write short response or essay questions would compel students to generate their own answers rather than selecting from a list of options. Furthermore, the author of the quiz would have to grade the responses to these types of questions, providing the student with the perspectives of his or her peers and requiring the author to discern whether each written answer is correct or incorrect.

Many domains are more graphically oriented and an answer or feedback purely in text would take too much space on the screen. Our application currently supports images in answers, but only as a thumbnail that is displayed next to the text. Improving support for graphical answers would allow students to click on pictures of answers rather than text, increasing the versatility of the software.

On the other hand, some fields are heavily text-driven or do not have lecture videos readily available for use. Although Flip-Flop quizzes are designed for video quizzes, this system could be adapted to display a document on the left side of the window while questions and answers appear on the right as students scroll through the text. In some cases, the software might only display the questions for the quiz when the instructor does not have any material to synchronize the quiz with.

5.3 Improvements to Peer Improvement

Our software currently allows quiz takers to suggest modifications to the text of questions, answers, feedback, and hints. However, reviewers should also have the option to propose adding or removing quiz components. For example, if the author did not write feedback for an answer, the student who just took the quiz should be able to provide text that the author could eventually use when updating or revising the quiz.

Furthermore, quiz takers should be able to write comments accompanying their suggested modifications. This will allow reviewers to not just propose changes but explain why they would make those changes.

5.4 Agile Tooltip

We plan to integrate into the upcoming version of Flip-Flop another novel technology that we have independently developed to allow users to seek more detailed help with UI widgets as well as provide feedback such as problems encountered and suggestions for improving their user experience. We call this technology "Agile Tooltip", as its roots are in one of the main concepts pioneered by the Agile methodology in software engineering: involving the customer in continuously defining and perfecting the software product rather than trying to specify exact requirements up front. The Agile Tooltip concept goes one step further in involving the end user in this process. It adds two buttons to every tooltip: a "help" button, typically represented by a "?" question mark icon, and a "feedback" button, typically depicted as a "thought bubble" icon. Each button either leads to a page or displays a dialogue; for the help button, the user finds there information directly related to the widget, for instance an explanation of how this particular widget is used within a typical workflow.

In the context of Flip-Flop (see Fig. 9.17), the Agile Tooltip will assist the students while they construct and take quizzes so they can easily seek corresponding help pages as well as provide feedback about the Flip-Flop technology itself, the quizzes they take or make, and the screencasts themselves. For example, the tooltip related to the “Link” entry in the authoring system will lead to the help page that describes how the student should search for resources on the web that are related to a question, how to select and copy the web page link from the address bar of the browser, and how to paste it into the entry field. The advantage of this approach is not just that the help is targeted to the purpose of the widget, but also that the help pages can provide additional semantic information: for instance, suggesting that the linked webpage should not completely answer the question but only help with deriving the correct answer.

Fig. 9.17 Agile tooltip for Flip-Flop

The "feedback" button of the Flip-Flop tooltip will display a suggestion page or dialogue where the student will be able to choose the type of feedback he or she is providing, an entry field for the suggestion text along with a drag-and-drop pane where he or she can submit a screenshot, and an entry field for his or her email. We intend to support at least the following types of feedback (a minimal data sketch follows the list):

  1. Problems, errors, and suggestions how to improve the functionality and appearance of the Flip-Flop user experience,

  2. Problems understanding the help pages and suggestions how to improve them,

  3. Problems with the quiz the student is currently taking, and

  4. Problems understanding the screencast itself and suggestions how it could be improved.
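
A minimal sketch of how such a submission could be represented, with hypothetical names:

    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional

    class FeedbackType(Enum):
        USER_EXPERIENCE = auto()  # (1) Flip-Flop functionality and appearance
        HELP_PAGE = auto()        # (2) the help pages themselves
        QUIZ = auto()             # (3) the quiz currently being taken
        SCREENCAST = auto()       # (4) the screencast being studied

    @dataclass
    class FeedbackSubmission:
        kind: FeedbackType
        text: str                           # the suggestion typed by the student
        email: str                          # so the student can be alerted later
        screenshot: Optional[bytes] = None  # from the drag-and-drop pane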

Once a student submits feedback, we will record and categorize the suggestion, and the student will receive an automatic "thank you" email. Furthermore, the Agile Tooltip system will allow the instructor and/or teaching assistant to view and respond to the specific suggestion, for instance promising to alert the student once the problem is corrected or the suggestion addressed. Such replies may be sent to all the students in the course to demonstrate that the instructors and developers do care, and thus encourage them to suggest more improvements of their own.

The detailed description of the Agile Tooltip methodology, the corresponding software framework and support app, as well as its general applications will be the subject of a future research article we plan to submit to a software engineering journal.

5.5 AI Support for Flip-Flop and Vice Versa

Artificial intelligence can benefit quiz authors, as it can suggest questions based on the video content and transcript, as well as correct answers and documents to be used as hint links. An interesting aspect is that answers with a low confidence level, which are normally discarded by AI frameworks, can be put to good use in multiple-choice tasks; after all, the author must also invent the incorrect answers.

On the other hand, artificial intelligence frameworks can benefit from the experts' knowledge while the experts construct Flip-Flop quizzes. For instance, IBM Watson requires that a client first 'ingest' documents and then 'train' Watson with questions and correct answers. That is exactly what the hint links and quiz tasks provide.

Moreover, an AI framework will be able to learn from both correct and incorrect answers, using feedbacks that explain why an answer is correct or not. In particular, AI will be able to learn from peer improvements—especially if the ‘peer’ is an expert instructor.
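
A sketch of this handover, reusing the Task/Answer model sketched in Sect. 9.2; the triple format is our assumption, not the actual ingestion API of Watson or any other framework:

    from typing import Iterable, List, Tuple

    def training_triples(tasks: List[Task]) -> Iterable[Tuple[str, str, bool]]:
        """Turn Flip-Flop tasks into (question, answer, is_correct) triples.
        Incorrect answers, together with their explanatory feedbacks, become
        labeled negative examples rather than discarded low-confidence noise."""
        for task in tasks:
            for answer in task.answers:
                yield (task.question, answer.text, answer.correct)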

6 The Vision

The Flip-Flop quizzes have several important aspects:

  • They are largely subject-independent,

  • They are also grade-independent,

  • They are language-independent, and

  • They can be easily translated into various other languages.

Most importantly, a vast number of such quizzes can be stored online and associated and linked with other online resources. If Flip-Flop and similar future technologies were widely adopted and enhanced, a vast reservoir of testing resources would become available.

In particular, an abundance of online quizzes can potentially change how we perceive testing per se.

As most educators know, when taking a test the primary motivation for students is to get the minimum number of points sufficient to achieve a desired grade. They expect that their tests will be corrected by a knowledgeable, if not infallible, instructor and that once they have taken the test they will not be bothered with another test requiring the same knowledge, except perhaps a final test that covers an entire course. So for most students, taking a test is mainly about the question "Did I get enough points?" rather than "Did I learn and understand the topic?". Suppose, however, that students can take a quiz before viewing a lesson screencast or reading a textbook chapter to ascertain that they know the prerequisites, and then take a quiz after studying the topics to find out how much they learned or whether they should review the cast or reread the chapter, all without being stressed about how these quizzes impact their grades. Then their attitudes are likely to change. Gradually they may perceive a test not as a threat but as a positive or even entertaining experience that helps them learn a subject and assess and improve their own learning progress, learning speed, and learning skills.

There are other advantages that are likely to result from the availability of online quizzes for an arbitrary subject at an arbitrary time. If there is no time limit on how long a learner can take to complete a quiz, the lack of stress is likely to result in better scores and a more satisfying experience. On the other hand, if a learner can opt for a time limit, the increased challenge might also prove beneficial to the learning effect. Moreover, online quizzes can become a competitive adventure where learners attempt to achieve better and better scores. After all, the abundance of online trivia quizzes (currently 5,350,000 search results on Google) proves that testing can be very entertaining.

Furthermore, online quizzes can point the learner to resources that are appropriate for his or her current level of knowledge and learning style. For instance, if a learner performs miserably on a quiz, he or she can be guided to a learning resource that is less advanced or covers the topic in simpler terms. Moreover, quizzes may be constructed in different ways, such as using pictures rather than text. If a student performs better on the pictorial type of quizzes, he or she can be guided towards the visually rich presentations of a topic. Similarly, students who answer textual questions rapidly may learn better from a more abstract description of the subject.

In an ideal world, we are all learners who should easily be able to find out whether we understand a chapter in a book, an article in a research journal, or even a topic explained on a Wikipedia page. Imagine that each encyclopedia article had an accompanying online quiz a reader could take before starting to read it. The hint links could then point to other articles that explain the prerequisite topics that need to be covered in order to adequately digest the topic at hand. Additional quizzes interspersed within the article itself could let readers test their grasp of each subtopic covered. Finally, a quiz at the end could not only reveal how well the entire article was understood, but even ask questions about related, possibly more advanced, topics, with links pointing to follow-up articles that can now be more easily understood. When quizzes can be taken online without any adverse consequences, such as a low score or a time limit, our attitude towards tests could change dramatically: we could start viewing tests positively, as instruments that help us objectively assess our knowledge and learning and lead us through a network of knowledge, while making it easy to judge whether our learning is effective, efficient, and even entertaining.

An abundance of online quizzes does not benefit only the learners. Making quizzes is one of the most tedious and time-consuming chores of an instructor, and coming up with a new set of quizzes with new tasks every semester is not easy. Selecting from numerous existing quizzes, even ones constructed by students rather than experts, and possibly just improving their wording is bound to substantially shorten the time an instructor needs to create quizzes. Furthermore, since such improvements are stored and classified as employed or improved by an expert, the quality of the quizzes and tasks will undoubtedly improve over time.

Instructors and institutions may administer Flip-Flop quizzes on prerequisite content to verify that students still recall earlier concepts from other courses. The hints for these quizzes may include links to review sites so students may reacquaint themselves with the material if necessary. Much like a traditional quiz, Flip-Flop quizzes may also be used to check whether students have learned new material at the end of a module or chapter of a textbook.

Obviously, the abundance of quizzes needed to achieve the aforementioned goals can only materialize if the Flip-Flop methodologies become widely adopted. We therefore encourage readers interested in trying Flip-Flop in their courses to get in touch with us to discuss how we could best accommodate their needs and integrate them within our technology.

Last but not least, an abundance of online quizzes might solve one of the increasingly pressing educational quandaries: how to address copying and prevent plagiarism at a time when most resources, exercises, problems, and their solutions are accessible online. If our students are willing to take dozens of quizzes to prepare for a test, we can be confident that they have learned the subject well enough. From this perspective, we can argue that the more quiz tasks and problem solutions are uploaded to the internet, the less important it will be to penalize students for copying. Plagiarism may become a historically interesting misdemeanor rather than an educational felony.