Introduction

Online learning has become an important format for course delivery in higher education (Allen and Seaman 2009; Schrum and Hong 2002; Song et al. 2004). According to a survey conducted by Allen and Seaman (2009) at over 2,500 institutions of higher education in the US, 4.6 million students took at least one online course during fall 2008. Enrollment in online courses has grown rapidly over the last decade, and increasing numbers of students are expected to take an online course during the next decade (Allen and Seaman 2009).

Despite the great success of online learning, several concerns remain. One concern is that students new to online learning easily feel lost and socially isolated (McInnerney and Roberts 2004). If they do not recognize that online learning environments differ from conventional ones and cannot figure out how to engage actively in online learning communities, their feelings of isolation and loneliness may be exacerbated (Cho et al. 2010; McInnerney and Roberts 2004). Another concern is that students’ inability to interact with others in online learning environments may result in failure in online interactive activities, such as online collaboration and online discussion (Cho and Jonassen 2009). Interaction with others (e.g., peers and instructors) is important for success in online learning environments; therefore, students need to know how to interact skillfully with others to pursue academic online activities that require it (Cho and Jonassen 2009). Unlike conventional learning environments, in which instructors can provide immediate support to students, online courses require students to actively seek help, participate in activities, and become part of a community (Huang 2002; Yang et al. 2006).

Yet another concern is that new online students encounter many technical issues when taking an online class mediated by a course management system (CMS). When using a CMS, students interact with course content and communicate with others, so students’ ability to interact with a CMS is critical, especially for novice online learners. If the technology employed in an online environment is new to learners, they must gain familiarity with the technology in addition to the course content (Anderton 2006). In a qualitative study of registered nurses’ experiences with web-based learning, Atack (2003) reported that online students were unable to give the course content sufficient attention because they had spent so much time learning how to use the CMS. At the same time, many online instructors complain that they must answer the same technical questions over and over again (McVay 2000), and in many cases even instructors do not know how to answer students’ technical questions. Some online instructors think many online students are unready to take an online course because they lack understanding of how online courses operate (McVay 2000). Although this was the state of online courses around 10 years ago, the trend does not appear to be diminishing (E. Vitz, personal communication, February 14, 2008).

In addition, a higher dropout rate has been observed among students in online courses than in conventional courses (Ali and Leeds 2009; Lee and Choi 2011). In a recent review of online course dropout research, Lee and Choi (2011) found that skills (e.g., time management, computer confidence, and coping strategies), psychological attributes (e.g., motivation, self-efficacy, and satisfaction), and online interactions are related to dropout rates. In addition, Nash (2005) found that students who dropped out of or failed online courses were more likely to believe that online courses are easier than conventional courses. These studies indicate a need to educate online students to develop a solid understanding of the nature of online learning.

As a solution to these problems, many educators and practitioners have suggested implementing an online student orientation (OSO) (Ali and Leeds 2009; Bozarth et al. 2004; Lee and Choi 2011; Palloff and Pratt 2003; Scagnoli 2001; Wojciechowski and Palmer 2005). Wojciechowski and Palmer (2005) found that orientation attendance was an important factor predicting students’ academic achievement in online learning environments. Ali and Leeds (2009) found that online students who attended an orientation performed significantly better than students who did not; an orientation program for online students is therefore very important in helping them develop positive learning experiences in online learning environments (Palloff and Pratt 2003).

However, the process of developing an OSO is rarely shared among institutions. In addition, existing OSO programs typically focus more on administrative and technological issues than on students’ learning in online environments, perhaps because online orientation programs are not systematically designed from the perspective of student learning. Designing an OSO program to include additional aspects, such as student learning processes, should provide further benefit to students. Hence, the purpose of this study was to describe the process of developing an OSO program.

Project overview

The OSO described in this study was developed at a university in the American Midwest for students who were planning to take an online course or who were enrolled in online courses but needed assistance to better understand online learning. As of spring 2009, when the current developmental study was conducted, 88 faculty members representing 29 departments taught 127 online courses. A total of 3,200 students took an online course, and because some took more than one, total enrollment in online courses was 5,592. The total number of credit hours students took through the online learning program (OLP) was 15,442, representing 12.5 % of total campus credit hours.

The OLP is positioned under the Division of Continuing Studies, which provides diverse offerings such as personal enrichment, professional development, off-campus face-to-face courses, online learning, and weekend credit courses. The OLP is a significant part of the Division of Continuing Studies in that it provides resources for online learning, such as the CMS and funding to support new online course development. The OLP works with the Center for the Enhancement of Learning and Teaching (CELT) on quality matters, online course development grants, and online course peer review. The director and the assistant director of the OLP contacted the CELT to request the development of an OSO program. The director of the CELT assigned the development task to the author, an instructional consultant and designer, who therefore assumed the lead role for this project. The clients for this project were the director and assistant director of the OLP of the Division of Continuing Studies.

Program design

Because the purpose of this study was to describe the process of the OSO development in detail, the key components of the instructional systems design (ISD) models used are described here. These models explain the systematic design and development process in four phases: analysis, design, development, and evaluation (Dick and Carey 1996; Hirumi et al. 1994; McKenney et al. 2002). Instructional designers have not used these phases linearly to develop instructional materials; instead, they have used each phase dynamically, considering authentic factors such as learning contexts, time, resources, and client needs (Richey and Klein 2005).

Analysis phase

The analysis phase included needs assessment, task analysis, and context analysis (Rossett 1987), accomplished through interviews, observations, and extant data analysis. A needs assessment was conducted to identify the needs of the OLP and set goals for the project. Interviews with the OLP clients (i.e., the director and assistant director) and analysis of an existing online orientation program revealed that the clients wanted a new OSO that would not only improve understanding of online learning, Blackboard, and technical issues but also check students’ readiness to take an online course.

To define the tasks that students need to learn during the OSO, a task analysis was conducted on online course syllabi and the online learning literature. An analysis of 20 online course syllabi, provided by the assistant director of the OLP, revealed the teaching and learning practices taking place in online courses offered by the OLP at the university. The syllabi covered diverse subject areas, including English, history, political science, communication studies, math, computers, engineering, and science. By reviewing each syllabus, the nature of online learning was identified in terms of the types of online learning tasks required, ways to interact with others, and possible learning resources. In addition, a review of the literature on online learning was conducted to elaborate the content in more detail and to make sure the OSO covered most of the important issues related to online learning. The literature review (Bozarth et al. 2004; Cho 2008; Huang 2002; Ko and Rossen 2004; Palloff and Pratt 2003; Roper 2007) was used to determine the structure and develop the content of the OSO.

In addition, the author located and analyzed 11 existing OSO programs through the Google search engine (keywords: OSO, self-assessment, and online course) between April 22 and April 28, 2008. The existing OSO programs did not present enough information about the real nature of online learning; they simply presented the technology skills and technical requirements necessary to take an online course.

Eight of the 11 online orientation programs offered a total of 16 online readiness surveys or quizzes with which students could self-check whether they were ready to take an online course. The 16 measures were analyzed in terms of question format, number of items, and factors covered in the questions. Regarding question format, many questionnaires (10 of the 16) featured multiple-choice questions with “yes” or “no” and “high,” “average,” or “low” as answers. Although this format simplified students’ answers, it may not have accurately revealed the level of students’ awareness of their readiness to take an online course.

The number of items in each questionnaire was diverse, ranging from 8 to 68, with an average of 14.7 items per questionnaire. These items measured technical requirements (e.g., understanding of word processing, Blackboard, email, discussion boards, graphic images, and Internet speed), learning skills (e.g., time management, self-motivation, perceived reading and writing ability, social interaction, and self-discipline), or both. When the items measured only one factor, it was learning skills; however, the items measuring each factor were not well categorized. Thus arose the need to develop clearer, well-defined factors for readiness to take an online course.

Furthermore, a context analysis was conducted to examine the generic format of online courses. The author planned to design the OSO to simulate generic online courses offered by the university where the current developmental study was conducted; therefore, identifying the nature of the generic online course was important. All the online courses offered at the university were delivered asynchronously via Blackboard. A list of the online courses offered during the spring semester of 2008 was provided by the assistant director of the OLP. The author randomly chose about half the faculty members teaching online courses (N = 43), representing 29 departments, and contacted them via email to request access to observe their online courses. Seventeen faculty members granted permission to observe their 26 online courses. The number of online courses each faculty member taught ranged from one to four, with an average of 1.53. Based on the observations, the online courses were categorized into two types: highly interactive and minimally interactive. Highly interactive courses featured interactive discussion, peer review, and collaboration as important instructional strategies. Minimally interactive courses included little interaction among students, and their instructors relied primarily on exams and individual projects for evaluation. The author therefore decided that the new OSO should simulate both types of courses offered by the university. In addition, no extra cost was expected to develop the OSO program because Blackboard, the existing CMS, was chosen as the main delivery platform. The analysis results suggested four learning objectives for the OSO, as follows:

  • Online students develop an understanding of the nature of online learning;

  • Online students use Blackboard skillfully for their own learning;

  • Online students solve technical issues they may encounter while using Blackboard; and

  • Online students develop self-awareness about learning skills required for online learning.

Design phase

During the design phase, the results of the task analysis were used to design the content for the OSO, which was validated by experts, including online faculty, online administrators, and instructional technology researchers. Four modules were designed; specific descriptions of each module appear below.

Module 1, called “What is the nature of online learning?” was designed to help students understand the nature of the online learning environment. The module consists of four topics: Learning Environment, Assignments, Online Communication, and Learning Resources. Each topic includes 2–5 subtopics (see Fig. 1).

Fig. 1 Module 1: What is the nature of online learning?

The content was validated through expert review, for which online faculty and online administrators were invited. Four faculty members at the university, recommended by the director of the OLP, were chosen based on two criteria: (a) years of online teaching experience at the university and (b) demonstrated excellence in teaching as shown in student evaluations and in online course development. An email invitation was sent to the four faculty members asking them to participate in the expert review; two were able to participate. A focus group meeting was conducted with the two faculty members, during which they and the author discussed the validity of the content of Module 1. In addition, the OLP administrators, including the director, the assistant director, and a student support staff member, checked the content validity. Revisions were based on the results of the expert reviews.

Module 2, entitled “How to learn in Blackboard,” was designed to help students understand how to navigate and learn in their online courses via Blackboard. The module, which lists possible uses of Blackboard functions for online learning, includes five topics: Understanding the Interface of Blackboard, Monitoring Course Activities, Communicating with Others, Interacting with Course Materials, and Evaluating Performance. Each topic has 3–6 subtopics (see Fig. 2).

Fig. 2 Module 2: How to learn in Blackboard

The content was validated through expert reviews by two staff members of Information Technology Services at the university. One staff member worked at the help desk, mainly responding to students’ questions about Blackboard over the phone or online; the other ran workshops about Blackboard for both students and faculty members. To further validate the content of Module 2, another expert review was conducted, in which the OLP administrators, including the director, assistant director, and student support staff member, participated. Changes were made based on their suggestions.

Module 3, entitled “What are the technological requirements to take an online course?” was designed to introduce the technological requirements for taking an online course mediated by Blackboard. Module 3 includes four topics: Access Your Online Course, Technology Requirements for Using Blackboard, Software Requirements for an Online Course, and Technical Problem Solving. Each topic has 1–6 subtopics (see Fig. 3).

Fig. 3 Module 3: What are the technological requirements to take an online course?

Expert reviews were conducted to validate the content. The two staff members of Information Technology Services performed the expert review. In addition, the OLP administrators, including the director, assistant director, and student support staff member, completed the review to validate the content of Module 3. Changes were made based on the results of the expert reviews.

Module 4, entitled “What learning skills and motivations are necessary for online learning?” was designed to provide prospective online students with an opportunity to self-determine their readiness to take an online course. Based on the literature review (Artino 2008; Cho and Jonassen 2009; Cho et al. 2010; Holcomb et al. 2004; Yang et al. 2006) and consideration of the modules in the orientation, six types of self-efficacy were proposed to measure students’ readiness to take an online course. Self-efficacy is a person’s belief about his or her capability to do a particular activity (Bandura 1986), and numerous studies have reported that self-efficacy is a powerful predictor of online students’ academic success and behaviors (Artino 2008; Cho and Jonassen 2009). The six types include self-efficacy for (a) completing an online course, (b) interacting with classmates for academic purposes, (c) interacting with the instructor for academic purposes, (d) self-regulating online learning, (e) handling tools in a CMS such as Blackboard, and (f) interacting socially with other classmates. Although each item measures self-efficacy, the name of the module used the terms learning skills and motivations instead of self-efficacy for online learning because the clients perceived the former as easier to understand and more appealing to online students. The six types of self-efficacy, with definitions, are as follows:

  • Self-efficacy to complete an online course: Belief in personal capabilities to complete an online course;

  • Self-efficacy to interact with classmates for academic purposes: Belief in personal capabilities to interact with classmates for academic purposes;

  • Self-efficacy to interact with the instructor for academic purposes: Belief in personal capabilities to interact with the instructor;

  • Self-efficacy to self-regulate in online learning: Belief in personal capabilities to self-regulate (e.g., planning, monitoring, evaluation, and adjustment);

  • Self-efficacy to handle tools in Blackboard: Belief in personal capabilities to use tools in Blackboard; and

  • Self-efficacy to interact socially with other classmates: Belief in personal capabilities to interact with others (e.g., classmates and instructor) for social purposes.

In this module, students take a survey to self-evaluate whether they are ready to take an online course. Students can take the survey either before or after completing Modules 1–3. By taking the survey, students are expected to learn which types of self-efficacy (in this case, learning skills and motivations) they need to develop before taking an online course. For example, if students realize after taking the survey that they need to learn more about Blackboard, they will pay more attention to Module 2.

Expert reviewers validated the self-efficacy items. Two groups participated in the expert reviews: the OLP administrators (the director, assistant director, and student support staff member) and five doctoral students in instructional technology and educational psychology. The initial pool contained 71 items covering the six factors. Each reviewer was asked to evaluate each item with a choice of keep, delete, or revise. After the evaluation, each factor retained five to seven items, meeting the guideline that at least three items should be used to measure each factor (Pett et al. 2003), and the total number of items was reduced to 35 (see Appendix 1).

Development phase

In consideration of the aesthetics of the OSO program, the author developed webpages in HTML and integrated them into Blackboard after discussions with the clients as well as two staff members of the CELT. Dreamweaver CS3 was used to develop the HTML pages. The prototype was presented to both the clients and the CELT staff members, and the final interface was based on their feedback (see Fig. 4).

Fig. 4 Overview of the OSO

The first three modules were organized with a visual chart (see Fig. 5). For example, Fig. 5 presents an example from Module 1: “What is the nature of online learning?” Clicking a topic on the visual chart leads students to a topic page, each of which includes a description, examples, and guidance, as follows:

Fig. 5 An example of the interface in the OSO

  • Description: Explains the meaning of the topic or introduces it;

  • Examples: Specific cases or events reflecting how the topic applies in real online courses; and

  • Guidance: Suggestions or advice on how to handle the topic in an online course.

To develop Module 2, “How to learn in Blackboard,” and Module 3, “What are the technological requirements to take an online course?” the author simulated the process of working through each topic, capturing screens in Blackboard. To increase the authenticity of Module 2, the author used 64 screenshots from courses he had observed, with the instructors’ permission; when taking the screenshots, images that revealed identifiable course information or student identities were avoided. In addition, 28 screenshots were used for Module 3.

Module 4, “What learning skills and motivations are necessary for online learning?” was developed in a quiz format because the clients wanted students to have quiz experience before taking an online course; the quiz was one of the common evaluation formats used in online courses. Following the development guidelines for self-efficacy scales (Bandura 2006), a 100-point scale was used, where 0 denoted “cannot do at all” and 100 denoted “highly confident can do.” The quiz consisted of 35 questions measuring the six types of self-efficacy.
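To illustrate how such a quiz might be scored, the sketch below averages each factor’s 0–100 item ratings into a factor-level readiness score. The item-to-factor assignment shown here is hypothetical (the actual items appear in Appendix 1); it is a minimal example of the scoring logic, not the orientation’s implementation, which ran inside Blackboard’s quiz tool.

```python
from statistics import mean

# Hypothetical item-to-factor mapping for the 35 quiz items (see Appendix 1
# for the actual items); keys are factor names, values are item numbers.
FACTORS = {
    "complete_course":     [1, 2, 3, 4, 5],
    "academic_peers":      [6, 7, 8, 9, 10, 11],
    "academic_instructor": [12, 13, 14, 15, 16, 17],
    "self_regulation":     [18, 19, 20, 21, 22, 23],
    "blackboard_tools":    [24, 25, 26, 27, 28, 29],
    "social_interaction":  [30, 31, 32, 33, 34, 35],
}

def score_quiz(responses):
    """Average each factor's 0-100 item ratings into a factor score."""
    return {factor: mean(responses[i] for i in items)
            for factor, items in FACTORS.items()}

# Example: a student confident overall but unsure about self-regulation.
responses = {i: 80 for i in range(1, 36)}
responses.update({18: 40, 19: 35, 20: 50, 21: 45, 22: 40, 23: 30})
for factor, score in score_quiz(responses).items():
    print(f"{factor}: {score:.1f}")  # a low score points back to a module
```

A low score on a factor directs the student back to the corresponding module, consistent with the feedback purpose of the survey described above.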

Evaluation phase

Both formative and summative evaluations were conducted, the former after Module 1 was developed. The OLP director recommended four online faculty members, two of whom responded to the author’s request to participate in the formative evaluation. Participating faculty members were asked to think aloud, sharing their ideas and suggestions for improving the OSO. They also answered additional questions about navigation, content, accessibility, design and development, understanding, and satisfaction. In addition, one online student participated in the formative evaluation of Module 1. A recruiting email was sent to students in the four recommended faculty members’ online courses. Only one undergraduate online student volunteered for the face-to-face interview, perhaps because the recruiting email was sent at the end of the semester and, unfortunately, the instructors provided no incentives (e.g., extra points) for participation. During the interview, the student was asked to explore Module 1 and share her opinions about it. Changes in Module 1 were made based on the suggestions from all three respondents.

Unfortunately, formative evaluations for Modules 2 and 3 were not conducted because of time constraints on development and limited access to online faculty and students. The author wanted to recruit at least four or five online faculty members and students to complete formative evaluations of Modules 2 and 3, but doing so at the end of the semester proved difficult. Because these two modules were structured like Module 1, the results of the formative evaluation of Module 1 were used to improve Modules 2 and 3.

A summative evaluation was conducted to examine online students’ experiences with the developed OSO. A convenience sample of 63 students from one online education course and one online nursing course participated in the evaluation; participation was completely voluntary. The purpose of the online education course was to teach pedagogical content knowledge, and it required interaction among students during the entire semester. By contrast, the online nursing course was designed to teach medical terminology, and its instructional strategies relied heavily on self-study with little student interaction. The two courses therefore represented the generic course types (highly interactive vs. minimally interactive) offered at the university. Among the 63 participants, 48 were female. Average age, school year, and number of online courses taken prior to the current one were 28.14 (SD = 9.76), 3.11 (SD = 1.49), and 2.81 (SD = 5.42), respectively. Some may be concerned that the participants were experienced students (they had taken nearly three online courses on average); however, experienced students can in fact evaluate the OSO more accurately than novice online students.

Measurement

A two-part online survey for the online students’ evaluation of the OSO program was developed. The first part included 28 questions with a 5-point Likert scale, where “1” denoted “strongly disagree” and “5” denoted “strongly agree.” The author developed most of the items, but some came from Wang et al. (2007). Two staff members of the CELT with extensive experience in online course design and online teaching checked the content validity of the measurement. Students evaluated six aspects of the OSO program: navigation, content, accessibility, design and development, understanding, and satisfaction; Cronbach’s α for each aspect was 0.94, 0.89, 0.91, 0.89, 0.93, and 0.88, respectively (a sketch of this reliability computation appears after the statements below). The second part of the evaluation consisted of the following two open-ended statements:

  1. If you are either satisfied or dissatisfied with the OSO, please explain your choice.

  2. If you could suggest one thing to improve the OSO, what would it be? Please explain.
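As a side note on the reliability figures reported above, the sketch below shows one way Cronbach’s α can be computed for a subscale. The response data are simulated, since the actual survey responses are not reproduced here; the subscale name and item count are illustrative.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) array of ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: alpha for a hypothetical 5-item 'navigation' subscale,
# 63 respondents rating on a 1-5 Likert scale.
rng = np.random.default_rng(0)
base = rng.integers(3, 6, size=(63, 1))                      # shared component
navigation = np.clip(base + rng.integers(-1, 2, size=(63, 5)), 1, 5)
print(round(cronbach_alpha(navigation), 2))
```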

Procedure

Research permission was acquired from the two online instructors recruited from the four online faculty members recommended by the director of the OLP. Once the two faculty members granted permission, the author sent an email invitation to their online students to participate in the summative evaluation, and the two faculty members also posted the invitation as a class announcement on Blackboard. The OSO program site and the online survey link were included in both the email and the announcement, and students were asked to complete the OSO program first and then the summative evaluation. The study was approved by the IRB and conducted according to ethical guidelines.

Results

The results of the quantitative data analysis showed that students valued the OSO program (see Table 1). They provided positive ratings for navigation (M = 4.21, SD = 0.63), content (M = 4.14, SD = 0.57), accessibility (M = 4.05, SD = 0.70), design and development (M = 4.16, SD = 0.57), understanding (M = 3.89, SD = 0.64), and satisfaction (M = 3.95, SD = 0.60). A repeated measures ANOVA with the Greenhouse–Geisser correction was conducted to assess whether significant differences existed among the average ratings of the six aspects of the OSO: navigation, content, accessibility, design and development, understanding, and satisfaction. Results indicated that participants rated the six aspects differently, F(3.52, 218.37) = 8.18, p < 0.001, R² = 0.12, η² = 0.12. Pairwise comparisons with the Bonferroni post hoc test indicated that students rated navigation, content, and design and development significantly higher than understanding and satisfaction; accessibility, however, did not differ significantly from any other aspect.

Table 1 Summative evaluation with the OSO
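For readers who wish to reproduce this kind of analysis, the sketch below runs a repeated measures ANOVA with the Greenhouse–Geisser correction and Bonferroni-adjusted pairwise comparisons. It assumes the pingouin library and a hypothetical long-format data file; it is not the software actually used in the study.

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format file: one row per (student, aspect) pair with
# columns 'student', 'aspect', and 'rating'.
df = pd.read_csv("oso_ratings_long.csv")

# Repeated measures ANOVA; correction=True applies the Greenhouse-Geisser
# adjustment to the degrees of freedom, as reported in the text.
aov = pg.rm_anova(data=df, dv="rating", within="aspect",
                  subject="student", correction=True)
print(aov)

# Bonferroni-adjusted pairwise comparisons among the six aspects.
posthoc = pg.pairwise_tests(data=df, dv="rating", within="aspect",
                            subject="student", padjust="bonf")
print(posthoc)
```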

Multiple regression was also conducted to examine the best linear combination of navigation, content, accessibility, design and development, and understanding for predicting student satisfaction with the OSO program. The combination of variables significantly predicted student satisfaction with the OSO, F(5, 57) = 39.63, p < 0.001, with two variables contributing significantly to the prediction. The beta weights suggested that understanding (beta weight = 0.67, p < 0.001) contributed most to predicting student satisfaction with the OSO, followed by content (beta weight = 0.28, p < 0.05). The adjusted R² value was 0.76, indicating that 76 % of the variance in student satisfaction with the OSO program was explained by the model.
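A comparable regression could be run as sketched below, assuming statsmodels and the same hypothetical data in wide format; standardized beta weights are obtained by z-scoring all variables before refitting.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical wide-format file: one row per student with columns for the
# six aspect scores (column names are illustrative).
df = pd.read_csv("oso_ratings_wide.csv")
predictors = ["navigation", "content", "accessibility",
              "design_development", "understanding"]
formula = "satisfaction ~ " + " + ".join(predictors)

model = smf.ols(formula, data=df).fit()
print(model.summary())  # reports F, p, and adjusted R-squared

# Standardized (beta) weights: z-score every variable, then refit.
z = (df - df.mean()) / df.std(ddof=1)
print(smf.ols(formula, data=z).fit().params.round(2))
```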

In addition, students’ written answers were qualitatively analyzed with the open coding method. An instructional technology researcher and the author discussed each statement and determined the coding schemes. The unit of analysis was meaning: if a statement conveyed more than one meaning, it was coded with more than one code. For example, “I am happy with the content and information provided and the ease of use” was coded as both P-C (positive-content) and P-E (positive-easy to use). Because the number of comments for each question was small, the two coders decided to discuss and determine the coding schemes together instead of coding blind.
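The multi-code tallying described above can be expressed compactly, as in the sketch below; the coded responses and the P-H code are illustrative, not the actual data.

```python
from collections import Counter

# The unit of analysis is meaning, so one response can carry several codes.
# P-C = positive-content and P-E = positive-easy to use, as described above;
# P-H (positive-helpfulness) is a hypothetical additional code.
coded_responses = [
    ["P-C", "P-E"],  # "happy with the content ... and the ease of use"
    ["P-C"],
    ["P-H"],
]

frequencies = Counter(code for codes in coded_responses for code in codes)
print(frequencies.most_common())  # e.g., [('P-C', 2), ('P-E', 1), ('P-H', 1)]
```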

A total of 24 students responded to the first question: “If you are either satisfied or dissatisfied with the OSO, would you explain why?” All comments were positive. Students’ answers were coded with five coding schemes (see Table 2). Students seemed most satisfied with the content presented in the OSO (11 mentions), the ease of navigating the orientation (six mentions), and its helpfulness (four mentions). Four positive statements provided no specific reasons for satisfaction with the OSO. Interestingly, three statements expressed students’ wishes that they had taken an online orientation when taking their first online course, implying not only the importance of an orientation for beginning online students but also the lack of opportunity to take an OSO program.

Table 2 Comments about the OSO program

A total of 18 students provided comments in response to the second question: “If you could suggest one thing to improve the OSO, what would you suggest?” Suggestions were coded into five themes. Five comments were about design, including color choices, links, and adding a sitemap. With the CELT staff, the author reconsidered the color choices on some pages and made small changes, such as fixing broken links. The author decided not to add a sitemap for two reasons: first, the navigation was designed around modules, and each module was organized hierarchically and provided multiple navigation paths; second, navigation was rated significantly higher than other variables (e.g., understanding and satisfaction). Another suggestion related to the operation of the orientation: four respondents stated that the OSO program should be given to first-time online students, and one student mentioned that it should be mandatory. These suggestions implied an important role for the OSO program, especially for first-time online students. In addition, students suggested the need for more content; however, because the content had already been validated through multiple expert reviews, the author decided not to add more. The final suggestion was to provide the OSO in a different delivery format, such as a video orientation with pictures and animation, but with the limited resources (e.g., funding and time) available, this suggestion was not pursued. Finally, five comments about the orientation in general were positive but provided no specific suggestions (Table 3).

Table 3 Suggestions to improve the OSO program

Discussion

The primary purpose of the study was to share in detail the process of developing the OSO program. Within a framework of instructional systems design (ISD) models, the study thoroughly describes the entire developmental process through the analysis, design, development, and evaluation phases. According to Richey and Klein (2005), this developmental study can be categorized as Type I developmental research, which shows the entirety of the developmental process. The detailed documentation of the developmental process can be helpful for other institutions planning to develop an OSO program.

The author learned several lessons through this study. First, at any institution a thorough analysis is important before developing an OSO program. In this study, the analysis included needs assessment, task analysis, and context analysis. By conducting this series of analyses, the author identified the clients’ needs and the existing resources that could be used for developing the OSO program. In addition, the thorough analysis allowed the author to better understand the university’s online learning contexts, which included the types of online courses (highly vs. minimally interactive), online course interfaces, online tasks and resources, and student support. Diverse techniques should also be considered for the analysis, including interviews, focus groups, extant data analysis, and online course observations; furthermore, knowing how and when to apply these techniques appropriately is important.

Another lesson is that listening to stakeholders’ voices is important for project success. In this study, direct and indirect stakeholders were invited to take part in the project. Direct stakeholders included the director, assistant director, and student support staff of the OLP as well as faculty members and students. Indirect stakeholders included staff members from the CELT and faculty and student support staff members from Information Technology Services. The invited stakeholders validated the content of the OSO program from multiple perspectives, and the university community took ownership of the project. In particular, the clients were actively involved during the entire process of developing the OSO. For example, the direct stakeholders provided information about needs and existing resources (e.g., the existing orientation for online students, syllabi, and course lists) and participated in the expert reviews to validate content. Because of the direct stakeholders’ active participation, important decisions (e.g., the design concept) were made quickly, allowing the author to proceed more effectively.

Yet another lesson is that quality content is significant in determining project success. The author spent considerable time on content development, through initial development and content validation with the stakeholders; approximately half the project time (4 months of 10) was devoted to content development. The content, which focused on learning instead of technical aspects, was well received by students. Students’ evaluation of the content significantly predicted their satisfaction with the OSO program, along with their evaluation of understanding. In addition, the comments gathered in the qualitative analysis revealed that students highly valued the content of the OSO program.

Using the systematic developmental process described in the current study, an OSO program featuring two notable aspects was developed. First, the OSO is a self-paced online orientation that students can access anytime and anywhere as long as they have Internet access, and they can work on any modules they choose. Because the navigation of each module is structured with a visual chart, students can choose the topics they would like to learn. The self-paced nature of the online orientation can benefit many prospective and existing online students.

Second, unlike many existing orientation programs that emphasize technology as the main requirement, the current OSO was developed from the standpoint of learning in online environments. For example, Module 1 explains how online learning differs from traditional learning in terms of environment, assignments, online communication, and resources. Module 2 explains how to use Blackboard for learning; its topics therefore relate to learning and include understanding the interface of Blackboard, monitoring course activities, communicating with others, interacting with course materials, and evaluating performance. Because the OSO emphasized learning, students were expected to better understand the nature of online learning and to use the functions in Blackboard as tools for their learning.

Limitations

The number of participants in the evaluations, both formative and summative, was one of the limitations of this study. Unfortunately, only two faculty members and one student participated in the formative evaluation of Module 1, and the formative evaluation of Modules 2 and 3 could not be conducted. Only 63 students participated in the summative evaluation because of time constraints on the project and limited access to online students and faculty. Considering the total number of students enrolled in the OLP (N = 3,200) when the study was conducted, 63 was a small number for the summative evaluation; however, the students were recruited from the two representative types of courses (highly interactive vs. minimally interactive). With more time and access to online students and faculty, the author would have recruited more students representing different content areas, school years, and genders for the formative and summative evaluations.

Another limitation is that systematic evaluation by direct and indirect stakeholders is missing from the study. An interview with a list of questions, or the same summative evaluation survey items, could have been administered to the stakeholders; however, because they were actively involved in the entire developmental process and provided informal verbal feedback indicating satisfaction with the OSO program, a formal evaluation seemed unnecessary at the time. Formal evaluation should, however, be conducted with direct stakeholders after any OSO program is developed.

Future research directions

Future research on OSO programs can be designed to evaluate the effectiveness of online orientation. One possible study could compare two groups of students, one of which received the online orientation and one of which did not, in terms of students’ struggles, retention rate, achievement, and satisfaction with an online course. The orientation can also be used in the professional development of faculty members who want to teach online but have never done so.

In addition, the items developed to measure student readiness to take an online course can be more thoroughly validated through psychometric research. The items were already validated through expert review, so the next step is to administer them to a large group of online students. Considering the number of items (N = 35) and the guideline of at least five participants per item, more than 175 student participants are needed for exploratory factor analysis (Hatcher 1994), which can reduce the number of items and reveal latent variables. Confirmatory factor analysis can then be used to check whether the structure found in the exploratory factor analysis holds with new samples. The validated measurement can be used for further validation studies; for example, it could be used to predict online students’ achievement and learning satisfaction. Essentially, the measurement can serve both as a self-assessment tool for online students and as a diagnostic tool for online teachers.
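As a sketch of this next step, the code below runs an exploratory factor analysis on hypothetical responses to the 35 items using the factor_analyzer package; the file name and the six-factor, oblique-rotation choices are illustrative assumptions rather than part of the study.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical data: 175+ students' 0-100 ratings on the 35 items,
# one column per item (file name and layout are illustrative).
X = pd.read_csv("readiness_items.csv")

# Extract six factors to match the six proposed self-efficacy types;
# oblique (oblimin) rotation allows the factors to correlate.
efa = FactorAnalyzer(n_factors=6, rotation="oblimin")
efa.fit(X)

loadings = pd.DataFrame(efa.loadings_, index=X.columns)
print(loadings.round(2))          # weakly loading items are candidates to drop
print(efa.get_factor_variance())  # variance explained by each factor
```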