1 Introduction to the Technology

The purpose of this chapter is to help instructors who are asked to teach an online course for the first time learn from the experience of others. At the end of the chapter, several recommendations are made for instructors who plan to teach an online statistics course for the first time.

How to deliver content is often the first question instructors must answer for an online course. This chapter discusses SoftChalk, a lesson-building software package that can be used to build lessons with videos, short quizzes, and Flash-based activities. In addition to the instructor-prepared website lessons, the textbook publisher's website, MyStatLab, will also be discussed briefly. Communication between students and the teacher, as well as among students, is critical for a good learning environment. Several communication programs will be discussed, including tools within Sakai (the course management system) and Elluminate by Blackboard. Elluminate is an online software package that includes video chat, text chat, and an interactive whiteboard; Sakai is an open-source course management system that includes a text chat room and a discussion board. A brief review of the hardware used to create the recordings for the online course, including the tablet PC and wireless microphone systems, will also be covered. Finally, how do you handle formal assessment exams in an online environment? How do you test students' understanding of the material when you are not physically in the same location? ProctorU, an online test proctoring service, will be discussed in detail.

2 Setup

This research was conducted with a class of 67 undergraduates at a large research institution in the United States. The course was taught during the summer over a period of 6 weeks. The stipulation made by the university was that the students would not be required to come to campus for any portion of the class; therefore, everything had to be done completely online. There were multiple areas of assessment in the course: exams, daily lesson quizzes, chapter quizzes, small group discussion board assignments, daily homework, and a final project. Each of these assignments utilized technology at some level.

The students were asked, but not required, to complete a 38-question presurvey and a 40-question postsurvey. Of the 67 students enrolled in the course, 28 started the presurvey and 25 completed it; 28 started the postsurvey and 22 completed it. All students who completed the survey were 18 years old or older.

A brief overview of the class demographics from the survey showed that 89.5 % of the students were female and 10.5 % were male, and that 47.4 % were sophomores, 42.1 % were juniors, and 10.5 % were seniors in college. The students were also asked if they had taken a statistics class before: 73.7 % had never taken a statistics class, while 5.3 % had taken a (non-AP) statistics class and 21.1 % had taken an AP-level statistics class in high school. As for plans after earning a bachelor's degree, 21 students intended to pursue a post-bachelor degree in medical school, veterinary school, graduate school, or law school, while one each intended to enter the work force, the military, and the Peace Corps. These demographics show that most students who completed the survey were female, had not taken a previous statistics class, and were planning for a post-bachelor degree.

3 Background

For more than a decade, statistics educators have been researching how to implement statistics courses online and determining if there is a difference between online courses and traditional courses.

Utts, Sommer, Acredolo, Maher, and Matthews (2003) compared a traditional course to a hybrid course (partially online) that met only once a week. They found that "performance of students in the hybrid offering equalled that of the traditional students, but students in the hybrid were slightly less positive in their subjective evaluation of the course" (2003, p. 1).

Tudor (2006) discussed a course for public health students in which she included voice-over PowerPoint slides and quizzes for self-assessment. The quizzes were static quizzes in Word files with answers supplied. She did include discussion board assignments in her course, but concluded that ". . . it appears that the effectiveness of online discussion in a statistics class is still debatable. The biggest factors affecting their success may be the topic of discussion and the quality of the questions" (2006, p. 7).

Everson and Garfield (2008) discussed using discussion boards in their online courses to align the courses with the GAISE (Aliaga et al., 2005) guidelines. They state:

An important goal of the online course described in this chapter was to align them with the GAISE recommendations. Based on our experiences and observations, structuring our online courses in this manner has resulted not only in student learning but in student satisfaction. (2008, pp. 9–10)

The GAISE guidelines are a series of guidelines for educators teaching statistics in the United States. The GAISE guidelines recommend:

that instructors emphasize statistical literacy and develop statistical thinking, use real data, stress conceptual understanding rather than mere knowledge of procedures, foster active learning in the classroom, use technology for developing conceptual understanding and analyzing data, and use assessments to improve and evaluate student learning.

Mills and Raju (2011) summarized and compared 20 articles about online courses in statistics over the past decade. They assert that:

In the middle to latter part of this decade, more importance was and has been placed on: selecting “appropriate” uses of technology for the online statistics environment, improving interaction among students and the instructor, enhancing the overall learning experience for online students, and conducting formative and summative evaluations to carefully monitor the teaching and learning process. (2011, p. 21)

Additionally, the general education literature can tell us about the important components of an online course. The text The Online Teaching Survival Guide: Simple and Practical Pedagogical Tips contains a list of ten best practices for online teaching, including the following three items: "create a supportive online course community," "use a variety of large group, small group, and individual work experiences," and that the instructor should "prepare discussion posts that invite responses, questions, discussions and reflections" (Boettcher & Conrad, 2010). These best practices describe the necessity of building community in an online course as well as the need for a variety of assignments. The discussion board assignments, final project, and chat room office hours described in this chapter were specifically designed to reflect the GAISE guidelines as well as these best practices for the online environment.

4 Technology

Technology plays a vital role in an online course; however, the technology should assist the course, not hinder it. The subject matter of the course, not the technology used to deliver it, should be of primary importance for the students.

4.1 Hardware

For this course, each student needed a copy of the textbook and a copy of the course notes. The shell of the notes, a workbook, is a 120-page document with the examples, exercises, terms, and important concepts for the course; however, the examples and exercises are not completed. The students complete them with the instructor as they watch the tutorials, which allows the students to concentrate on statistical understanding rather than on copying.

In order to understand the course design, it is necessary to understand the capabilities of Microsoft OneNote on a tablet PC. The program allows a user to include handwritten notes in a file by using a stylus, so instead of working out a problem on a chalkboard, the instructor could work out problems on the screen in OneNote, where the color and width of the pen can easily be changed during the lecture. The shell of the notes that the students had was imported into OneNote, allowing the instructor to write directly on the notes while recording video tutorials of the discussion. Microsoft OneNote was chosen over Microsoft PowerPoint because it allowed a less restricted working space. For course creation, the instructor used several pieces of hardware, including a microphone, a webcam, and a Fujitsu T5010 tablet PC with a stylus pen and 4 GB of RAM.

As for sound in the video tutorials, four microphones were tested: the microphone built into the tablet PC, the Azden WLX-PRO VHF wireless microphone, the Samson SWAM2SES N6 Airline Micro wireless ear set, and the Logitech H530 headset microphone. Two things should be considered when evaluating a microphone for online course use: the sound quality of the recording and the ease of use. In the instructor's opinion, all but the Samson microphone had poor sound quality and picked up the instructor's breathing. The Samson microphone did not pick up the instructor's breathing and filtered out surrounding noise, making it the best option.

4.1.1 Content-Building Software

The content for the course was delivered in multiple ways using instructor-built lessons and publisher-supplied materials. For this course, SoftChalk was used to create a course website that covered 24 detailed lessons spanning 143 web pages and included complete topic explanations, 254 quiz questions, and 128 short instructor videos. The lessons also contained 22 activities, including Flash-based dynamic study tools written in SoftChalk, online applets, and exercise problems for the students to solve using StatCrunch. SoftChalk made it very easy to insert graphics, videos, sound clips, or web pages. It also made it possible to test the students on what they had learned and to prepare flashcards or games to emphasize important topics. Grading these activities ensured that the students actually completed them and, hopefully, reached a higher level of engagement.

Each lesson started with a list of about three to eight objectives that the student should master for that day. The lesson then stepped the student through each objective, first explaining why the objective was important and how it related to the other material in the course. After the lesson objectives, a video (or videos) explained the main concepts of those objectives. Following the video(s), there was usually a quiz exercise, a study tool such as flashcards, or a statistical applet that let students see the concept in action and test their knowledge of what they had just learned.

For example, one of the first lessons of the course discusses “Measures of Center, Spread and Position.” On the first page of the lesson, a baseball data example is presented along with a brief discussion about why it would be important to quantify the measures of center, spread and position for this data set. The objectives of the day are then presented.

On the next page of the lesson, the first objective is discussed with a short tutorial which explains the definitions of the mean, median, and mode as well as finding the mean and median of two data sets. The students are then asked to complete a quiz where they have to find the mean and median of a data set in addition to matching the terms mean, median, and mode with their definitions. For the second objective, the students are asked to explore the effect of an outlier on the mean and median by playing with an applet designed by the publisher of the textbook, and they are then quizzed on their findings. For the third objective, the students watch a short video that shows them how to compute the median from a stem-and-leaf plot in Minitab, a statistical software package. For the fourth objective, the measures of variance, standard deviation, and range are discussed in a short video, and the students are then asked to compute these values for a data set as well as to predict the effect of an outlier on these measures. For the fifth objective, the empirical rule is explained in a video and the students are asked to answer a question about the rule. The sixth objective includes videos that show how to compute the quartiles and how to use the quartiles to make, interpret, and read boxplots. The objective finishes with the students answering questions by comparing side-by-side boxplots. The last objective discusses output from Minitab and working with StatCrunch. Students watch a video by Webster West, the creator of StatCrunch, and are asked to use StatCrunch to analyze a data set. The last page of the lesson reviews the important concepts that they have learned.
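The lesson's core computations can be sketched in a few lines of Python using the standard library; the data set below is hypothetical, standing in for the lesson's baseball example:

```python
import statistics

# A small hypothetical data set standing in for the lesson's baseball example.
data = [3, 5, 7, 8, 9, 11, 13, 15, 18, 21]

print("mean:  ", statistics.mean(data))    # 11
print("median:", statistics.median(data))  # 10.0
print("stdev: ", round(statistics.stdev(data), 2))
print("range: ", max(data) - min(data))    # 18

# Quartiles, the basis of the boxplots in the sixth objective
# (statistics.quantiles requires Python 3.8+):
q1, q2, q3 = statistics.quantiles(data, n=4)
print("quartiles:", q1, q2, q3)
```

Appending a large value to `data` would shift the mean noticeably while barely moving the median, which is exactly the outlier behavior the second objective's applet demonstrates.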

In addition to the SoftChalk lessons, the students were required to use MyStatLab for homework problems provided by the publisher for each lesson assignment. The homework problems accompanied the textbook, Statistics: The Art and Science of Learning from Data by Agresti and Franklin (2009). From an instructor's point of view, the assignments were easy to select and assign and provided instant feedback to the students. But what was the experience from the students' point of view? In the postsurvey, the students were asked how much time they spent working on the course per day, including everything related to the course: activities, quizzes, watching lectures, doing homework, and studying. The average amount of time spent on the course per day was 3.05 h, with a standard deviation of 2.76 h. The data did have one outlier: one student said that they spent 15 h per day on the course, which seems doubtful. If this point is removed, the average is 2.47 h with a standard deviation of 0.75 h. The students were also asked how many hours they spent on the course per week. The average time spent on the course was 11.64 h and the standard deviation was 5.36 h. The minimum number of hours per week was 3 and the maximum was 28.
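The effect of dropping that single outlier can be illustrated with a short sketch. The responses below are hypothetical (the raw survey data are not reproduced here); the one 15-hour value plays the role of the doubtful response:

```python
import statistics

# Hypothetical daily-hours responses -- NOT the actual survey data.
# The single 15.0 plays the role of the doubtful outlier described above.
hours = [2.0, 2.5, 3.0, 2.0, 1.5, 3.5, 2.5, 2.0, 3.0, 2.5,
         2.0, 3.0, 2.5, 1.5, 2.0, 3.5, 2.5, 3.0, 2.0, 2.5, 3.0, 15.0]

mean_all, sd_all = statistics.mean(hours), statistics.stdev(hours)
trimmed = [h for h in hours if h != 15.0]          # drop the outlier
mean_trim, sd_trim = statistics.mean(trimmed), statistics.stdev(trimmed)

print(f"with outlier:    mean={mean_all:.2f} h, sd={sd_all:.2f} h")
print(f"without outlier: mean={mean_trim:.2f} h, sd={sd_trim:.2f} h")
```

As in the survey data, removing one extreme value pulls the mean down modestly but shrinks the standard deviation dramatically, since the outlier dominates the sum of squared deviations.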

Additionally, the students were asked in the presurvey how much of the homework they planned to complete. All but one of the responses said 100 % on the presurvey (one response said 80 %). On the postsurvey, the average response for the percent of homework completed was 97.5 % and the standard deviation was 3.628 %. The minimum was 90 % and the maximum was 100 %.

4.2 Computer/Video Screen Capture for Content Creation

Two lecture capture software programs were used during the semester: Camtasia Relay and Jing!. Camtasia Relay was chosen because it was supported by the university and video storage was free, but unlike the full version of Camtasia, it did not allow editing of the video beyond setting start and end times. Additionally, Mac users had to download an extra program called Flip4Mac to watch the videos. Otherwise, Camtasia Relay was very easy to use. At the end of the semester, the students were asked what percentage of the videos they had watched. Based on the students' self-reported usage, the average percentage of videos watched was 91.73 %.

The other video capture program used was Jing!, selected for the students' projects because it was free and easily accessible online. The main limitation of the free version of Jing! is that recordings are limited to 5 min. A few students initially had trouble understanding how the program worked and were resistant to learning a new piece of software. However, after they were pointed to the help tutorials on Jing!'s website, they were quickly able to make the software work. Afterward, several students noted how easy it was to use and how they planned to use it in the future.

4.3 Interactive Communication Programs

During the semester, interaction was encouraged both among the students and between the instructor and the students. Several interactive computer programs were used: email, the chat program in Sakai, the discussion board in Sakai, and the Elluminate software package. The students were sent listserv emails almost every day of the course reminding them of upcoming deadlines or giving additional instructions. The students were also encouraged to email questions about grades directly to the instructor and to post all questions about the content or administration of the course on the Q/A board.

The instructor initially thought that not having a whiteboard in the Sakai chat function would limit the ability to answer questions, but it did not; instead, the instructor made Camtasia Relay videos and posted them for the students. The Elluminate software allows instructors to conduct online office hours using video chat, text chat, or an interactive whiteboard. Additionally, it was possible to poll students about a concept or simply ask them to raise their hands. Although the program had many capabilities, these also made it difficult to operate the program and teach at the same time. The initial plan for online office hours was to use the Sakai chat room and then transition to Elluminate. The transition, however, was not made due to the complexity of the software: it felt as though operating Elluminate, rather than learning the course material, had become the primary focus.

The discussion board in Sakai was used in two ways: as a question-and-answer board and as a small group discussion board. Students were encouraged to post questions about the content of the course and general administration issues on the question-and-answer board. The small group discussion board was used for discussion between randomly selected groups of about eight students. The students were asked to complete five activities during the 6-week semester. The first activity was for the students to introduce themselves to the group and then to reply to at least three other students' introductions. The second activity was for the group to select three articles from the internet that contained information about an experiment and/or survey. The students were then asked to identify various aspects of the study, such as the explanatory and response variables, and to discuss what aspects of the experiment/survey were good and what could be improved. The group then ranked each of the three surveys/experiments in terms of quality and adherence to the good survey/experimental protocol that they had established. The third activity was a lesson style called a Four Corner Debate, in which the students debate a particular concept. The idea for a Four Corner Debate came from a talk by Michelle Everson and Jackie Miller at USCOTS 2011 (for more information on a Four Corner Debate, visit http://www.educationworld.com/a_lesson/03/lp304-04.shtml). The concept for the debate was for students to consider issues about privacy and ethics as they relate to data collection and statistical analysis. Sometimes it is helpful for students to see other sides of an issue by not getting to pick the point of view that they are arguing. So each student was told that in a few days a statement would be posted to the discussion board which they would need to debate.
However, they had to pick their point of view before the statement was posted. The students had to decide if they "strongly agreed," "somewhat agreed," "somewhat disagreed," or "strongly disagreed" with the statement. Several days later the statement "Data can only do good things in today's world" was posted, and the students then had to support their point of view with respect to this statement. The fourth activity asked the students to complete a collaborative quiz of four multi-part questions about the sampling distribution of the sample proportion and the sample mean. The students were first asked to complete the assignment on their own and post their answers, and then to work together as a group to produce a response from the whole group. The idea of using a collaborative online quiz came from a talk by Audbjorg Bjornsdottir and Ellen Gundlach, who also presented at USCOTS 2011. Only the final quiz responses from the entire group were graded, and participation in building the team's response was part of the grade. The last assignment was for the students to critique other students' semester final projects.

The instructor found grading the discussion board for 67 students very time consuming, particularly the second assignment. The students also resisted the group activities because they were uncomfortable coordinating with students who were not in the same town or who did not respond in a timely manner. Additionally, for the fourth assignment there was very little discussion of the quiz answers, since students did not want to point out that another student was wrong. In the future, the instructor plans to have students submit a group contract laying out each student's responsibilities, to help students feel more comfortable with the assignment.

The communication software and email were all used to improve interaction in the online course and to help build a sense of community. The students were asked in the pre- and postsurveys how important these technologies were to them; a few of the results are below (Table 27.1).

Table 27.1 How important was the following form of communication with the instructor to you?

The students were also asked how important interaction was and how frequently they visited the Small Group Discussion Board (SGDB) and the class Q&A Discussion Board (QADB) (Tables 27.2 and 27.3).

Table 27.2 How important was interaction with other classmates on the SGDB and QADB?
Table 27.3 How frequently did you visit the SGDB and QADB?

Although this survey does not represent a random sample of students, it is interesting that the students' preferred form of communication was still email.

4.4 Grading

Determining how to set up high-stakes testing in an online environment can be very difficult. The instructor needs to think about what types of assessments work best at determining how well the students have learned the material, what mechanisms need to be in place to ensure that the students are who they say they are, and how the security and integrity of the exam itself can be protected.

For this course, the instructor determined that the best way to conduct high-stakes testing was with an online proctored multiple-choice test. All students had to begin their exam within 3 h of the first exam being started, and the ordering of the questions and answers was randomized for each student. The students were directly proctored during the exam by ProctorU, an online test proctoring company. Before the exam, the students were encouraged to perform a system check of their computer to make sure it would work with ProctorU's monitoring software. On the night of the exam, the students logged in to the ProctorU software and were greeted by a proctor in a video chat using a webcam. Each student then allowed the proctor to view their computer screen, so that whatever was on the screen was seen by both the student and the proctor. The proctor then asked to see the student's ID and asked a few questions to confirm their identity. The company also took a picture of the student that could be used for later reference if needed.
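The per-student randomization described above was handled by the course management system, but the idea can be sketched as follows. The exam structure, question texts, and seeding scheme below are all hypothetical illustrations:

```python
import random

# A hypothetical exam: each entry is (prompt, list of answer choices).
exam = [
    ("Which measure of center is resistant to outliers?",
     ["mean", "median", "range", "standard deviation"]),
    ("The empirical rule applies to data with what shape?",
     ["skewed left", "bell-shaped", "uniform", "bimodal"]),
]

def randomized_copy(exam, student_seed):
    """Return a per-student version with question and answer order shuffled."""
    rng = random.Random(student_seed)  # deterministic for a given student
    questions = list(exam)
    rng.shuffle(questions)             # randomize the question order
    # rng.sample returns a new shuffled list of the answer choices
    return [(prompt, rng.sample(choices, k=len(choices)))
            for prompt, choices in questions]

student_exam = randomized_copy(exam, student_seed=20110715)
```

Seeding the generator per student makes each ordering reproducible, which helps when a grading dispute requires reconstructing exactly what a student saw.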

For the instructor, setting up an exam time with ProctorU required completing a short Excel spreadsheet that included the start and stop times, exam length, exam date, exam password, and any special accommodations needed. The instructor then set up the exam within the course management system and set a password for the test. The students would only find out the password after communicating with the proctor at ProctorU.
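The scheduling information amounts to a one-row table. The sketch below writes such a row as CSV; the column names are illustrative only and are not ProctorU's actual spreadsheet template:

```python
import csv
import io

# Illustrative column names only -- not ProctorU's actual template.
fields = ["exam_date", "start_time", "stop_time", "length_minutes",
          "exam_password", "special_accommodations"]
row = ["2011-07-15", "18:00", "21:00", "60",
       "s3cretPass", "extended time (1 student)"]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(fields)   # header row
writer.writerow(row)      # one exam sitting
print(buf.getvalue())
```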

5 Conclusions

In this section, a specific assessment of the course is given, along with a set of recommendations for instructors teaching such a course for the first time.

5.1 Specific Course Evaluation

The course will be evaluated in two ways: the overall instructor evaluation and the overall course grades. The overall instructor rating from student course evaluations was 4.42 out of 5. Table 27.4 shows the final grade distribution for the course.

Table 27.4 Grade distribution for the course

Grades are a fairly limited source of assessment because they can be determined somewhat arbitrarily by the instructor. However, they do show that most students were successful in the course. The drop rate was 7.5 %; although this value is similar to that of other, non-online courses taught by the instructor, a smaller drop rate would be preferable.

5.2 Overall Assessment of Technology and Recommendations

SoftChalk was easy to learn for someone without a strong HTML programming background. Additionally, its ability to add quizzes and other activities helped improve the learning experience. The tablet PC was a good tool, allowing quick graphics to be drawn to illustrate statistical concepts. Camtasia Relay recordings were easy to make and allowed quick explanations of material to be presented to students. StatCrunch allowed students to collect and investigate their own data, as encouraged by the GAISE guidelines. Finally, the SoftChalk lessons and MyStatLab provided immediate feedback while students practiced working with statistical concepts.

As for recommendations: do not assume that students will quickly pick up unfamiliar software. Introductions to all software used in the course should be provided to make students more comfortable with the environment. Encourage communication through chat rooms and especially email, since email is still students' most comfortable form of communication. Software should be chosen to enhance a course and should stay in the background behind the course material; Elluminate was cumbersome to use, whereas the other programs worked seamlessly in the background and aided in learning the material. Finally, the instructor should intervene more to stimulate discussion and teamwork: the discussion board assignments did not generate the sense of community that was their primary goal, so students did not work together well to make sure the work they submitted was correct.