Introduction

According to the National Center for Education Statistics (NCES), college enrollments are expected to reach a new high each year from 2010 through 2018 (Planty et al. 2009). Additionally, recent research indicates that enrollment in online courses is growing at a rate approximately ten times that of traditional classroom-based instruction in higher education. It is estimated that nearly four million students were enrolled in at least one online course in the fall of 2007, representing approximately one in four of all college students in the United States (Allen and Seaman 2008). But these four million online students generally did not enroll in just a single online course; rather, it is estimated that they generated approximately 10–12 million individual course enrollments in higher education (Parsad and Lewis 2008). At present, 96% of public 2-year colleges and 86% of public 4-year colleges offer such online courses (Parsad and Lewis 2008). Clearly, with this level of participation and growth, it is crucial that we continue efforts to understand the many instructional challenges and opportunities arising in online environments. This paper investigates a recent conceptual framework that attempts to support such understanding.

While several early models endeavored to describe and explain traditional distance education (e.g., Peters 1967; Moore 1973; Wedemeyer 1981; Holmberg 1985), more recent theories place greater emphasis on online pedagogical issues per se (e.g., Anderson et al. 2001; Garrison et al. 2000; Garrison and Arbaugh 2007; Hrastinski 2009; McKnight 2000; Stahl 2005). Given that growth in distance higher education today is driven largely by developments in asynchronous online learning (US Department of Education, National Center for Education Statistics 2003; Parsad and Lewis 2008; Allen and Seaman 2008), it is sensible and necessary that we now focus our attention on models that deal explicitly with pedagogical issues in these environments.

Social views of learning that are foundational to the Community of Inquiry (CoI) framework (Garrison et al. 2000) suggest that knowledge construction occurs through the development of groups of learners sharing common goals, values, and language (Lipman 2003; Peirce 1955). In online education, the effectiveness of such communities is hypothesized to be contingent on optimal levels of teaching, social, and cognitive presence (Garrison et al. 2000). The CoI model posits that, in the absence of face-to-face interaction, learners must struggle to recreate the social and epistemic processes that occur through the moment-by-moment negotiation of meaning typical of collaborative classrooms. These dynamics are depicted by the concepts of presence described below.

Reflecting recent understanding of collaborative pedagogy, teaching presence refers to the instructional design and organization, facilitation of productive discourse, and direct instruction developed in online courses, ideally by both instructors and students (Anderson et al. 2001). Research on teaching presence has demonstrated that the construct coheres into reliable factors reflecting the intended latent variables of instructional design and facilitation of discourse (Arbaugh and Hwang 2006; Shea et al. 2006). Other research reveals significant correlations between high student ratings of teaching presence and perceived learning and satisfaction with online courses (Shea et al. 2005, 2003), as well as strong correlations between effective teaching presence and high learner ratings of their own sense of community and the quality of the corresponding learning experience (Shea 2006).

The notion of social presence is grounded in past research (e.g., Short et al. 1976) seeking to articulate how participants in mediated communication project themselves as “real people”, especially in the lean medium of text-based, asynchronous interaction. Also reflecting the concept of teacher immediacy (Mehrabian 1966), the CoI model outlines modes of social presence, including textual demonstration of affect, group cohesion, and open communication, necessary to establish a sense of trust and membership in a community dedicated to joint knowledge construction. Richardson and Swan (2003) analyzed learner perceptions of social presence in online courses and found that ratings of the quality of social presence were strongly correlated with perceived learning and satisfaction with instructors. Picciano (2002) found correlations between perceived social presence, learning, and interactions in course discussions. Shea and Bidjerano (2009), using structural equation modeling with more than 2,000 online learners, concluded that students’ experience of teaching presence has a direct effect on their perceptions of social presence, both of which contribute to the quality of their cognitive presence, described in greater detail below.

To develop such a community, participants in online environments engage in teaching and social presence combined with the third hypothetical construct, cognitive presence. In the CoI model, cognitive presence is seen as developing through a series of four cyclical stages, beginning with a triggering event and then moving (again ideally) to exploration, integration, and resolution (Garrison 2003). The concept of cognitive presence is thus built upon the Deweyan view of practical inquiry (Dewey 1933, 1959).

Research on cognitive presence has been mixed. It has been reported in previous research (Fahy 2005; Garrison et al. 2000, 2001; Garrison and Cleveland-Innes 2005; Kanuka and Anderson 1998; Kanuka et al. 2007; McKlin et al. 2002; Meyer 2003; Rourke and Kanuka 2009; Stein et al. 2007; Vaughan and Garrison 2005) that students in online courses tend not to reach the higher stages of cognitive presence, i.e., integration, application, and resolution, but instead appear to stall at lower levels reflecting introduction to, and surface exploration of, course topics and issues. These results are at odds, however, with other recent research reporting that, relative to classroom learners, online students report higher levels of engagement (NSSE 2008). Specifically, in the most recent version of the National Survey of Student Engagement (2008), among almost 380,000 randomly sampled students attending 722 US baccalaureate-granting institutions, both first-year and senior online learners were more likely than classroom-based learners to participate in course activities that challenged them intellectually and to discuss topics of importance to their major. Additionally, relative to classroom-based learners, both first-year and senior online learners reported more deep approaches to learning in their coursework, including higher-order thinking, reflective learning, and integrative learning (NSSE 2008, p. 16). It seems contradictory, therefore, that online students, who reported deeper, more intellectually challenging approaches to learning than did their classroom counterparts, are failing to reach higher levels of cognitive presence reflective of integration, application, and resolution. Adding to these student perceptions is recent meta-analytic evidence indicating superior learning outcomes in online environments (Means et al. 2009), which makes the contradictions more apparent. We therefore sought to examine this issue through the CoI framework in the current study.

Several studies (e.g., Shea and Bidjerano 2009, 2008; Swan and Shih 2005; Wise et al. 2004) have examined the Community of Inquiry framework in the context of online education by means of a variable-oriented approach. For example, Shea and Bidjerano (2009) found that teaching presence explains variance in student ratings of cognitive presence; however, the relationship between the two constructs is mediated by students’ perceptions of the degree to which the online medium affords opportunities for social interaction and connectedness, i.e., social presence. Similarly, the authors established that age, registration status, and perceptions of the quality of teaching presence and social presence predict multivariate measures of students’ overall assessment of the quality of their cognitive engagement (Shea and Bidjerano 2008). Although such variable-oriented analyses have merit, contributing substantially to our understanding of the factors that play a role in the success of online education, they fail to capture individual differences and the extent to which the constructs interact for specific subgroups of students. The variable-oriented approach ignores the possibility that different subgroups of students may exist as defined by their standing on the constructs of social presence, cognitive presence, and teaching presence.

Given the basic assumption of the CoI model that computer mediation requires learners in asynchronous text-based environments to struggle to recreate the social and epistemic processes requisite for learning to occur, we were especially interested in discovering differences that reflect this hypothetical struggle. The model suggests that students who have opportunities for face-to-face interaction with instructors and classmates may differ in important ways from students who do not. We therefore also sought to examine hypothesized differences between students in fully online, asynchronous, text-based courses and students enrolled in “hybrid” or “blended” courses in which part of the instruction and interaction occurred in classrooms and part occurred online, with a coincident reduction in classroom seat time.

To extend previous research on the CoI model, in the present study we also adopted a person-oriented approach with the aim of identifying distinct groups of students as defined by the patterns of their perceptions of their social presence, cognitive presence, and their instructors’ teaching presence. The study therefore has several purposes. The first objective was to reexamine the robustness of the CoI model with the inclusion of new dimensions such as assessment practices, and to revalidate it with a larger and more diverse sample of students. Second, to address apparent contradictions between the recent National Survey of Student Engagement and past studies of online learner cognitive presence, we sought to examine whether online students believe that they are reaching the more advanced stages of cognitive presence in their courses. Third, we sought to identify patterns of social presence and teaching presence and examine their effects on student perceptions of cognitive presence in relation to the type of course in which they were enrolled, either fully online or “hybrid”. The following research questions guided our inquiry:

  1. What levels of cognitive presence do students report, lower levels or higher levels?

  2. Can the Community of Inquiry model be replicated with a larger and more diverse sample of online students? How does the addition of items about assessment affect the factor structure and reliability?

  3. Are there distinct subgroups of students as defined by their perceptions of teaching presence and social presence? How do these differences explain variation in cognitive presence?

  4. To what extent do patterns of perceived teaching and social presence predict cognitive presence after taking into account the nature of the online experience (hybrid instruction vs. completely online instruction)?

Method

Participants

Participants in the study were students enrolled in either fully online or “hybrid” courses offered by colleges in a multi-institutional online learning network. Of the 5,024 students who attempted the questionnaire, the majority were female (n = 3,796, 76%) and full-time students (n = 3,111, 63%). More than half of the participants (54%) were freshmen and sophomores; about 30% were juniors and seniors; and the remaining 16% were either non-matriculated or graduate students. Fewer than half of the students (41%) reported full-time employment; 36% were employed part time; and 22% were unemployed.

Instrument

The instrument used in this study is based, with several revisions, on previous surveys developed by one of the authors and by others working on the Community of Inquiry model. Because previous research failed to find a coherent factor structure for the teaching presence construct, the items used previously were revised to more clearly define “direct instruction”. Recent research (Shea and Bidjerano 2009, 2008) indicates that these changes do provide a better factor structure reflecting the latent variables of interest. Also missing from previous conceptions of teaching presence was a clearly articulated conception of the instructor’s role in the assessment of learning; therefore, two items were added to the survey to further define this role. The survey consists of 37 items, responses to which are provided on a five-point Likert-type scale anchored by “Strongly Disagree” and “Strongly Agree.” An option, “I choose not to answer this question,” was also included in the scale. All of the items in the survey can be found in Table 3.

Procedure

A random sample of students studying in online and hybrid courses offered through a unified, multi-institutional, state-wide online learning system was asked to complete the survey developed for this study. This sample has several advantages: it is broad, in that it represents dozens of institutions, and it is large, with more than 5,000 responses, a figure appropriate for factor- and cluster-analytic studies such as this one. The sample also represents learners studying in a program with a single learning management system, a single faculty development and training program, a single technology infrastructure provider, and a single student and faculty helpdesk. The sample thus avoids many common issues associated with multi-institutional analysis in which differences in technology, support, or training are themselves uncontrolled variables. Finally, the 30 public institutions in this higher education virtual learning environment are quite diverse, representing community colleges, 4-year liberal arts colleges, and university centers offering degrees from one large, public state university system.

The students were contacted via email, and reminders were sent at 2-week intervals four times, until the end of the fall 2008 academic term. As noted above, 5,024 students responded to the survey. The response rate was approximately 37%.

Results

Levels of cognitive presence

Overall, results indicated a high level of cognitive presence in these online courses. Students generally agreed with statements describing the achievement of more advanced levels of thinking and learning in the CoI model, such as integration and resolution. For example, students generally agreed or strongly agreed with items designed to reflect integration. Agreement or strong agreement with these items ranged from 74 to 75.2%, as indicated in Table 1.

Table 1 Survey results for items reflecting integration in percentages

Additionally, as demonstrated in Table 2, less than 6% of respondents reported that they had failed to achieve the highest level of cognitive presence, i.e. resolution. Levels of agreement or strong agreement on indicators for resolution ranged from 72.1 to 77% as indicated in Table 2.

Table 2 Survey results for items reflecting resolution/application in percentages

Factor analysis

Data preparation

The original data set consisted of 5,024 cases, 718 (14%) of which had missing values on one or more variables. Therefore, for the purposes of factor analysis, listwise deletion of the cases with missing values was carried out. The remaining 4,306 cases were evaluated for both univariate and multivariate outliers, since all of the analyses that follow are sensitive to deviations from normality. Cases with standardized scores in excess of |3.29| and with Mahalanobis distances greater than 82.72 (p < .001) were excluded. Following the exclusion of cases with missing values and outliers, 3,623 cases remained. The demographic make-up of the resulting sample did not differ substantially from the sample of students who responded to the questionnaire but submitted partially completed data. Seventy-six percent were female students and 61% were full-time students.
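
The screening procedure described above can be summarized in a short script. The sketch below, written in Python with pandas and SciPy, is illustrative rather than the authors’ code; the DataFrame name `responses` and the function `screen_cases` are our assumptions, and the chi-square cutoff is computed from the number of variables rather than hard-coded to the 82.72 value reported above.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2

def screen_cases(responses: pd.DataFrame, z_cut: float = 3.29, p_cut: float = 0.001) -> pd.DataFrame:
    """Listwise deletion, then univariate and multivariate outlier removal."""
    # 1. Listwise deletion of cases with missing values on any item.
    complete = responses.dropna()

    # 2. Univariate outliers: drop cases with any standardized score beyond |z_cut|.
    z = (complete - complete.mean()) / complete.std(ddof=1)
    complete = complete[(z.abs() <= z_cut).all(axis=1)]

    # 3. Multivariate outliers: squared Mahalanobis distance compared with a
    #    chi-square critical value (df = number of variables, p < .001).
    centered = complete - complete.mean()
    inv_cov = np.linalg.pinv(np.cov(complete.values, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", centered.values, inv_cov, centered.values)
    return complete[d2 <= chi2.ppf(1 - p_cut, df=complete.shape[1])]
```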

Factor analysis was conducted on the 37 items included in the CoI instrument. Principal axis factoring with oblimin rotation produced three distinct factors, which explained 69.19% of the variance in the correlation matrix. The factors, in order of extraction, were teaching presence, social presence, and cognitive presence, accounting for 58.17, 7.91, and 3.11% of the variance, respectively. With the exception of the item “Online discussions helped students appreciate different perspectives”, all other items behaved as anticipated and loaded on the expected factors. The correlations between the factors were as follows: .61 between teaching presence and social presence; −.78 between teaching presence and cognitive presence; and −.77 between social presence and cognitive presence. The factor loadings are presented in Table 3. The reliabilities (Cronbach’s alpha) of the factors were .97 for teaching presence and cognitive presence and .95 for social presence.
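
For readers who wish to reproduce this type of analysis, a minimal sketch follows using the open-source factor_analyzer package in Python. This is not the authors’ code; the parameter `screened` (the matrix of retained cases) and the helper `cronbach_alpha` are illustrative assumptions, and the attribute names reflect recent versions of the package.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor-analyzer

def fit_coi_factors(screened: pd.DataFrame, n_factors: int = 3) -> FactorAnalyzer:
    """Principal axis factoring with an oblimin (oblique) rotation."""
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="oblimin")
    fa.fit(screened)
    # fa.loadings_           -> pattern matrix (cf. Table 3)
    # fa.get_factor_variance() -> variance, proportion, and cumulative proportion explained
    # fa.phi_                -> inter-factor correlations under the oblique rotation
    return fa

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))
```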

Table 3 Results from principal axis factoring with oblimin rotations

Cluster analysis

We sought to identify homogeneous subgroups of students with similar perceptions of teaching presence and social presence by performing two K-means cluster analyses, one for each construct. The objective of cluster analysis is to identify individuals who are similar to each other yet different from individuals in other groups; based on their scores on an instrument, respondents with comparable patterns of responses can be grouped together and treated as a separate subgroup (Aldenderfer and Blashfield 1984; Lorr 1986). The K-means cluster analyses were conducted on the standardized items to avoid pitfalls associated with large discrepancies in the means and standard deviations among items.
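
As a rough illustration of this step (again not the authors’ code), the standardization and K-means grouping can be expressed with scikit-learn; the function name, the random seed, and the choice of three clusters are assumptions based on the description above. The same routine would be applied once to the 15 teaching presence items and once to the 10 social presence items.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def cluster_presence_items(items: pd.DataFrame, k: int = 3, seed: int = 0):
    """Standardize the presence items, then group respondents with K-means."""
    z_items = StandardScaler().fit_transform(items)            # z-score each item
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(z_items)
    return km.labels_, km.cluster_centers_                     # cf. Tables 4 and 5
```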

Teaching presence clusters

Based on perceptions of teaching presence, three clusters of students were identified; these were labeled “Low Teaching Presence”, “Medium Teaching Presence”, and “High Teaching Presence”. The means of the standardized variables are presented in Table 4. The low teaching presence group (n = 520) had scores approximately one standard deviation below the mean on the 15 items included in the teaching presence scale. In contrast, the high teaching presence group (n = 1,422) had standardized scores above the item means. The medium teaching presence group, consisting of 1,682 students, had average standardized scores at the means of the item distributions.

Table 4 Teaching presence cluster centers

Social presence clusters

With respect to the social presence items, three groups of students emerged: 1,561 students judging their social presence as predominantly positive; 1,314 students perceiving their social presence as neither positive nor negative; and 748 students having negative perceptions of social presence. The three groups were named “High Social Presence”, “Medium Social Presence”, and “Low Social Presence”. Table 5 provides the means for each of the groups on the 10 items of the social presence construct.

Table 5 Social presence cluster centers

Teaching presence cluster membership, social presence cluster membership, age, and type of course as predictors of cognitive presence

A three-way analysis of covariance (3 × 3 × 2) was conducted to establish the extent to which teaching presence cluster membership, social presence cluster membership, and the type of course taken (hybrid vs. online) predict student ratings of cognitive presence, controlling for age. The dependent variable represented the mean score of the 12 items comprising the cognitive presence subscale of the instrument, with scores ranging from 1 to 5. Homogeneity assumptions of ANCOVA were met. The results from the analysis are presented in Table 6. As shown, although age was significantly related to level of cognitive presence (with older students reporting greater cognitive presence), both perceived teaching presence [F(2, 2,988) = 194.72, p < .001, η2 = .12] and perceived social presence [F(2, 2,988) = 194.04, p < .001, η2 = .12] had significant main effects on the dependent variable. It should be noted, however, that type of course (online versus hybrid) moderated the effect of teaching presence on cognitive presence, as indicated by the significant interaction between teaching presence cluster membership and type of course [F(2, 2,988) = 4.15, p = .02, η2 = .00]. The main effects are illustrated in Fig. 1, whereas the interaction between teaching presence and type of course is illustrated in Fig. 2.
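
A sketch of a comparable model specification in Python with statsmodels is given below; it is not the authors’ code, and the column names (`cognitive_presence`, `age`, `tp_cluster`, `sp_cluster`, `course_type`) and the choice of Type II sums of squares are illustrative assumptions.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def run_ancova(df: pd.DataFrame) -> pd.DataFrame:
    """3 x 3 x 2 ANCOVA on mean cognitive presence, with age as the covariate."""
    model = smf.ols(
        "cognitive_presence ~ age + C(tp_cluster) * C(sp_cluster) * C(course_type)",
        data=df,
    ).fit()
    # F tests for the covariate, main effects, and interactions (cf. Table 6)
    return sm.stats.anova_lm(model, typ=2)
```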

Table 6 Results from (3 × 3 × 2) analysis of covariance controlling for age
Fig. 1 Cognitive presence means for levels of teaching presence, social presence, and type of course

Fig. 2 Teaching presence and type of course interaction effect

Discussion

This study set out to verify and extend a theoretical framework designed to explain knowledge construction in college level, text-based, asynchronous learning environments. It was noted that nearly four million college students (Allen and Seaman 2008) are now enrolled in such courses and that gaining theoretical insight into the processes that result in higher order learning in such environments, an explicit goal of the CoI framework, is critical.

Several conclusions warrant additional commentary here. First, once again the constructs represented in the CoI model were clearly identified through a factor analytic approach. The inclusion of items that account for the instructor’s role in assessment results in a good factor structure reflective of the latent constructs implied by teaching, social, and cognitive presence. We feel that the inclusion of additional items that take into account the instructor’s role as a resource for students to gain insight into knowledge construction through the provision of formative feedback is a natural fit with the model. We also believe that the instructor should solicit feedback from students on the quality of the course and that this two-way feedback is essential to ongoing improvement in online education, much as in the classroom. That the inclusion of these items still results in a coherent factor structure suggests that the model is more comprehensive and therefore improved.

Second, it should be noted that, contrary to previous research (Garrison et al. 2000, 2001; Kanuka et al. 2007; Rourke and Kanuka 2009; Stein et al. 2007; Vaughan and Garrison 2005) indicating that online learners appear to stall at lower levels of cognitive presence, the vast majority of the more than 5,000 students in online and blended courses surveyed here reported that they achieved the highest levels of cognitive presence reflected in the instrument designed to measure that construct. Possible explanations for these conflicting findings begin with how cognitive presence is measured. While previous researchers have used a variety of approaches to assess cognitive presence, these efforts generally examine student communicative processes (e.g., threaded discussions and chats) and omit other course artifacts as evidence of higher cognitive presence. This line of research also shares a number of other limiting characteristics: sample sizes tend to be quite small, and reliability measures are low or missing. For example, in Stein et al. (2007) the number of subjects was five. Fahy’s conclusions are based on an N of 13. In McKlin et al. (2002) the researchers’ highest measure of reliability was on an assessment of only 26 messages; analysis of larger numbers of messages in that study resulted in unacceptable measures of reliability. Meyer (2003) does not report measures of inter-rater reliability. Schrire (2004) analyzed discussion transcripts in three doctoral-level forums with an N of 13 and a relatively low measure of inter-rater reliability [Holsti’s (1969) coefficient of reliability with only fair to moderate agreement]. In addition to concerns regarding reliability, small sample sizes, and the resulting limits on generalizability, there are constraints associated with using threaded discussions as the only evidence of learning. Focusing measures of learning on communicative processes alone ignores other artifacts likely to reflect evidence of more sophisticated knowledge construction.

Also warranting additional commentary are results relative to age, pedagogical quality, and measures of significant learning. The three-way analysis of covariance conducted here provides insight into the effects of age on ratings of cognitive presence. We know that older students, with work and family responsibilities, especially appreciate the flexibility afforded by online education. One might therefore assume that age would be a reasonable proxy for engagement with online learning as measured by multivariate constructs such as cognitive presence. In fact, the analysis supports this hypothesis: age does represent a significant predictor of cognitive presence, with older students reporting higher levels than younger students. However, the results also indicated that, holding age constant, student ratings of teaching presence and social presence continue to contribute significantly to the prediction of variance in cognitive presence. Ratings of online pedagogical quality, as reflected in teaching and social presence indicators, matter in the prediction of significant learning (i.e., cognitive presence) regardless of student age.

Student reports of higher cognitive presence revealed in the current study may reflect their recall of work on significant course tasks such as papers, case studies, and term projects that are more likely to reveal their growing ability to integrate and apply knowledge gained in the course. It may also be the case, as suggested in the National Survey of Student Engagement, that professors who teach online courses make more intentional use of deep approaches to learning in their lesson plans (NSSE 2008). In fact, the strong positive correlation between students’ assessments of the effectiveness of their instructors’ teaching presence and their own cognitive presence levels suggests that the teaching presence items may be a significant set of indicators reflecting the “deep approaches to learning” mentioned in the NSSE. It is recommended that future studies continue to inquire into instructional approaches that result in more sophisticated knowledge construction. We also believe that online courses are not the same as threaded discussions. Evidence of the advanced stages in the learning cycle (i.e., integration and resolution) is unlikely to appear in activities like threaded discussions, which are often designed to initiate rather than complete the cycle; such deeper learning should be sought in other course learning activities. Indeed, in the few instances in which researchers have looked beyond threaded discussion, evidence of higher-order thinking has been identified; for example, Kanuka et al. (2007) found that WebQuests were associated with higher-order thinking documented in reflective position papers.

It is also clear from these results that membership within particular teaching presence and social presence clusters is strongly associated with the level of cognitive presence reported by students. Supporting recent research in this area (Shea and Bidjerano 2009, 2008; Garrison and Cleveland-Innes, in press), learners reporting lower levels of teaching and social presence are also far more likely to report lower levels of cognitive presence. Extending this research, however, the current results also suggest that an equilibrium model can be hypothesized. While students who report low social presence are also far more likely to report low cognitive presence, teaching presence appears to play a moderating role. When students who report low social presence report higher teaching presence, their cognitive presence scores show a significant corresponding improvement. This moderating role of teaching presence holds true also for middle and higher levels of social presence, and in hybrid as well as fully online courses.

One exception to this pattern does appear, however. Students in online courses reporting high teaching presence also report higher cognitive presence than do similar students in hybrid courses. Membership in the high teaching presence cluster therefore appears to have a greater impact on the cognitive presence of fully online students than on that of students who had at least some opportunity for face-to-face interaction with their instructors and classmates. It appears that, in the absence of opportunities for the synchronous, moment-by-moment negotiation of meaning that occurs in the classroom, good teaching presence matters to an even greater extent.

In comparing these findings with theoretical work conducted by Anderson (2003), who posited an equivalency model with regard to the value of interaction in online courses, we find similarities and differences. Anderson (2003) asserted that,

Deep and meaningful formal learning is supported as long as one of the three forms of interaction (student–teacher; student–student; student-content) is at a high level. The other two may be offered at minimal levels, or even eliminated, without degrading the educational experience (p. 4).

Results of the present study suggest that variance in student ratings of their cognitive presence levels can be accounted for by variance in their assessments of teaching and social presence in their online courses. For example, in conditions in which reports of social presence are low, higher levels of cognitive presence are evident when teaching presence is rated highly. The corollary is also true: when assessments of teaching presence are low, ratings of cognitive presence improve significantly when social presence is rated highly. We would diverge from the structure of Anderson’s equivalency theory only to the extent that he suggests that one form can be eliminated without degrading the educational experience. Results here suggest that the highest levels of cognitive presence are evident when students rate both teaching and social presence most highly. This holds true in both online and hybrid course settings, though as demonstrated the value of teaching presence appears to be even higher among fully online students. We therefore suggest that teachers and instructional designers need to work to develop both forms of presence. These results suggest that the elimination of either would degrade the quality of the educational experience as reflected in the multivariate measure of learning subsumed under the concept of cognitive presence.

Limitations

All of the limitations inherent in student self-reports of phenomena apply to the current study. Further, representativeness of respondents is not guaranteed in large-scale surveys with limited response rates, though efforts were made to mitigate this through sampling procedures. It should also be noted that systematic course redesign has been well documented to result in better learning outcomes (and lower institutional costs) in many online education settings (National Center for Public Policy and Higher Education 2005; Twigg 2003). Additionally, the courses included in this study were designed and taught with the systematic support of an online education program that has greater maturity in the development of fully online courses than in the emergent pedagogical approaches to hybrid course design. Differences in cognitive presence scores possibly attributable to superior online course design may therefore be of limited duration and would likely disappear as hybrid course designs continue to improve.

Finally, additional research is needed to confirm these results using different methods. Comparative content analysis of courses with high teaching, social, and cognitive presence clusters relative to courses with lower clusters would be a logical follow-up to the current study. Looking for evidence of significant learning that exists outside of threaded discussion or other communicative processes is also recommended. We feel these strategies show potential for revealing additional insights into the processes and dynamics of online learning outlined in the Community of Inquiry framework.