Introduction: Rationale for the Research Project

There is great promise for technology use in educational settings. For example, in the US Department of Education's most recent technology plan (2010), technology is described as offering teachers the means to enact a student-centered, technology-enabled curriculum. Unfortunately, studies have shown that most teachers are not using technology in these student-centered ways (Keengwe et al. 2008; NEA 2008), suggesting that they may be ill-prepared to use technology to effect meaningful learning in their classrooms (Spector 2010). This has led various stakeholders to question how well teacher education programs are preparing their graduates to be effective technology-using teachers (for an overview, see Kay 2006; Tondeur et al. 2012).

Concerns about this lack of technology use have prompted leaders in higher education and at governmental levels to place greater emphasis on providing more opportunities for pre-service teachers to use technology throughout their teacher preparation programs (Pellegrino et al. 2007; U.S. Department of Education 2002). During the past decade, the US Department of Education's Preparing Tomorrow's Teachers to Use Technology (PT3) program provided over $750 million to teacher education programs focusing on new methods for preparing future teachers to effectively integrate technology into their teaching (Pellegrino et al. 2007; U.S. Department of Education 2002). For almost three decades now, US teacher education programs have made efforts to prepare their students to use technology as a teaching and learning tool in their K-12 classrooms (Polly et al. 2010).

Studies have examined what knowledge pre-service teachers need in order to implement technology, as well as what teacher education programs should do to prepare their graduates to effectively use that knowledge to support teaching and learning (e.g., Kay 2006). Unfortunately, many of these studies only examined selected schools that were funded by large grants (Pellegrino et al. 2007) or conducted meta-analyses of already published studies (Kay 2006; Polly et al. 2010; Tondeur et al. 2012). Few studies have conducted large-scale evaluations of teachers' uses of technology. Even fewer have examined the connection between what is learned in teacher education programs and what is required or expected in practice. The study described in this chapter was designed to fill this gap, using a mixed methods approach.

Purpose

The purpose of the study described in this chapter was to address the knowledge gap regarding how teacher education programs should best prepare pre-service teachers to integrate technology into their teaching. We began by examining the technology integration topics currently included in pre-service teacher education programs. We then compared those topics to the ways in which practicing teachers used technology in their classrooms.

The study employed a two-phase mixed methods research design. Phase one focused on gathering data from teacher education programs via a 14-item online questionnaire and follow-up interviews with a selected sample. Phase two focused on examining the technology integration practices of K-12 teachers via an online questionnaire, followed by one-on-one interviews with a smaller sample. Within both phases, we began by examining a larger population through the use of online questionnaires (Schmidt 1997). Utilizing this method, we could identify common themes to investigate in greater depth through the one-on-one interviews. Teaching artifacts were collected from participants in both of the smaller samples to create case studies of both teacher educators and practicing teachers. The results explicate the differences in the perceptions of practicing teachers and teacher educators regarding the relevance of various technology integration skills and knowledge to achieving meaningful technology integration in today's K-12 classrooms.

The Focus of the Research

The study was guided by three research questions: (1) What technology topics are included in pre-service teacher education programs? (2) What technology topics do in-service teachers find relevant and meaningful to their teaching/learning practices? (3) What are the similarities and differences between the technology topics included in teacher education programs and those teachers find relevant to their current teaching/learning practices? Initially, the researchers intended to investigate the types of educational technology experiences that participants considered to be the most influential on their classroom practices (e.g., stand-alone technology courses, integrated field experiences), but it was difficult to assess that information. This is discussed in more detail later.

The Reason for Choosing Mixed Methods

A mixed methods research design enables researchers to combine the advantages of both quantitative and qualitative data (Teddlie and Tashakkori 2009). Using a sequential mixed design, we first used quantitative data collection methods (closed-ended questions) to provide a broad, overarching view of the situation. We also collected qualitative data (from the open-ended questions) to help us elaborate on common ideas through examination of a larger number of responses (Teddlie and Tashakkori). Through this process, we were able to identify participants from whom to gather additional information, which yielded 39 cases. A multiple case study design allowed us to view the patterns that emerged within and across settings (Yin 2003).

Identifying Participants

In large-scale survey studies, it is critical to identify the appropriate population (Bartlett et al. 2001). Because this was a comparison study, there were two critical populations to identify: teacher education programs and technology-using teachers. The selection procedure and rationale for each population are discussed next.

Teacher Education Programs

In 2006, the US Department of Education conducted a similar study (Kleiner et al. 2007) with the intent of collecting information on how teacher education programs across the country incorporated technology into their courses/programs. Kleiner et al. identified 1,439 schools and requested that one contact person fill out the survey. Although the researchers received a 96 % response rate, there was no information about the individuals who completed the survey for each institution. In other words, the individuals completing the survey may not have known all the various ways technology was being used and/or to what extent it was being used across all courses in the program. This, then, potentially compromised the results.

Therefore, the selection of participants in this study was deemed critically important. Using a US Department of Education database called the Postsecondary Education Quick Information System (PEQIS), we identified the institutions that met our specifications. Ideally, we would have selected all institutions that prepared teachers. However, there were certain limitations to using this system. For example, some institutions offered only master's degree programs (e.g., most California programs), which potentially could have included teachers returning for advanced degrees. Our focus, however, was on pre-service teacher education programs at 4-year institutions, specifically those programs that offered a teaching degree for initial licensure. Using the PEQIS, we identified all 4-year institutions in the USA that offered programs in general, elementary, and/or secondary education (n = 1,283).

Once we identified all the institutions, we focused our efforts on selecting the specific individual at each institution who would be asked to complete the questionnaire. This was important because it addressed a notable weakness in the previous study (Kleiner et al. 2007). Specifically, it was critical to identify a representative from each institution who had knowledge of the specific educational technology requirements at that institution. Using each institution's website, we identified a representative with knowledge of the uses of educational technology within the teacher education courses. Identified individuals were contacted by e-mail and asked to complete a 14-item online questionnaire describing the pre-service educational technology requirements at their institutions. Of the 1,283 institutions contacted, 426 individuals completed the questionnaire (response rate of 33 %). We e-mailed reminders three times in order to obtain this response rate. Forty-four percent of responding institutions were public, and the median number of teacher education students graduating from responding institutions was 139. Forty-eight percent of the teacher educators responding for their institutions had over 10 years of experience at their institutions, and 62 % of respondents stated that they had primary responsibility for teaching educational technology courses.

From among the 426 responding institutions, 12 were selected for follow-up analysis. Purposeful sampling was used to maximize the variety of institutions selected; this was important because educational expectations might be expected to differ depending on location, size, and institution type. Therefore, the selection of institutions was based on the location of institution [West (n = 3), Northeast (n = 2), Southeast (n = 3), Midwest (n = 1), Southwest (n = 3)], size of the teacher education programs [Large (n = 8), Small (n = 4)], and institution type [Public (n = 6), Private (n = 6)]. The teacher educator representative who completed the initial questionnaire also participated in the follow-up interview and document collection.

Technology-Using Teachers

For the second population for the study, we sought to investigate how practicing teachers actually used technology in their classrooms. Our intention was to find high-quality users of technology, in hopes of highlighting the intended goals of the new US Department of Education technology plan (2010). Therefore, we sought to recruit participants from the membership of the International Society for Technology in Education (ISTE). ISTE is a professional association dedicated to supporting teachers' uses of information technology in support of K-12 student learning. With over 18,500 individual members and 80 affiliate organizations, ISTE provided our best access to US teachers who were using technology in innovative and unique ways. Technology-using teachers were recruited through a self-nomination procedure. Requests to participate in the study were sent via e-mail to various listservs focused on educational technology (ISTE special interest groups and ISTE state affiliates). The e-mail requested that teachers complete a 23-item online questionnaire focusing on how they used technology in their classrooms. By completing the questionnaire, teachers nominated themselves as technology-using teachers and agreed to participate in the study.

However, not all the responses were used. Responses were selected for the study based on two criteria. First, teachers needed to report that their primary professional responsibilities were directly involved in teaching PreK-12 students; technology coordinators and administrators were not included in the study. This was important because the pre-service teachers graduating from our teacher education programs were most likely to obtain PreK-12 teaching positions. In order to identify the information that needed to be included in teacher preparation programs, we needed to study teachers who were using the technology in their classrooms and (hopefully) using it in ways that aligned with best practices as described by the US Department of Education (2010).

Second, teachers needed to report a high self-assessment of their classroom technology skills. Teachers rated their comfort levels with technology on a 4-point scale: (1) I'm not comfortable using technology in my classroom, (2) I'm somewhat comfortable using technology in my classroom, (3) I'm comfortable using technology in my classroom, and (4) I'm comfortable teaching others to use technology in their classrooms. Only those teachers responding at the upper two levels were included; if teachers reported feeling not comfortable or only somewhat comfortable, we did not include them in the sample.

A total of 457 individuals responded to the questionnaire. Of those respondents, 316 met both of the criteria. Sixty-eight percent of these respondents taught at the secondary level, and 60 % had more than 15 years of teaching experience. This was one of the trade-offs in our research design when selecting participants. Unfortunately, since more than half of our participants had more than 15 years of experience, their pre-service teacher education programs did not include technology experiences. Therefore, we were unable to identify the experiences they perceived as being most influential from their teacher preparation programs. Instead, with this population, we chose to focus on the topic areas they believed were most critical to include now.
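As a rough illustration of this two-criterion screening, the following is a minimal sketch only; the file name, column names, and role labels are hypothetical, since the chapter does not describe the actual questionnaire export.

```python
# A minimal sketch of the screening step described above; the file and column
# names are hypothetical placeholders, not taken from the study's instrument.
import pandas as pd

responses = pd.read_csv("teacher_questionnaire.csv")  # 457 raw responses

# Criterion 1: primary responsibility is teaching PreK-12 students
# (technology coordinators and administrators are excluded).
is_classroom_teacher = responses["primary_role"] == "PreK-12 classroom teacher"

# Criterion 2: self-rated comfort at the upper two levels of the 4-point scale
# (3 = comfortable, 4 = comfortable teaching others to use technology).
is_high_comfort = responses["comfort_level"] >= 3

sample = responses[is_classroom_teacher & is_high_comfort]
print(f"{len(sample)} of {len(responses)} respondents met both criteria")
```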

From among the 316 teachers responding to the questionnaire, 27 teachers were selected for follow-up interviews and additional data collection. Purposeful sampling was used to maximize the variety of teachers selected for follow-up analysis; selection of teachers was based on subject areas and grade levels taught. Because teachers use technology differently in various subject areas and grade levels (e.g., Tondeur et al. 2007), it was important to include representation from the four core subject areas (English language arts, social studies, math, and science) and both elementary and secondary (including middle school and high school) levels in the follow-up phase. The ten elementary teachers taught in classrooms from 1st grade to 5th grade. The secondary teachers included eight middle school teachers and nine high school teachers and varied in the core subject areas taught.

Description of Data Sources

To answer our research question about which technology topics were relevant to practicing teachers and teacher education programs, we distributed a questionnaire with both closed- and open-ended questions to each set of participants. A slightly different questionnaire was distributed to each population. After comparing trends among responses, we conducted follow-up case studies of individual teacher education programs and technology-using teachers.

Questionnaires

Teacher Educator Questionnaire

The teacher educator questionnaire consisted of 14 items separated into three sections. The first section contained four items focusing on demographic information such as institution name, location, and the responsibilities of the individual respondent (teacher educator) with regard to the program. Demographic information was used to inform our selection of institutions for the follow-up interviews. The second section contained seven items focusing on the technology topics included in coursework and/or experiences required in the institution’s teacher education programs. These items were based on several meta-analyses of teacher education programs (Brush et al. 2003; Ottenbreit-Leftwich et al. 2010; Polly et al. 2010). The last section referred specifically to relevant technology topics covered in the program. Respondents were provided with a list of technology topics and asked to select those that were included in all or some of the teacher education programs at their institution.

To create the list of topics for participants to respond to, we initially reviewed resources describing how teachers use technology. These resources included research articles (e.g., Brush et al. 2003; Ottenbreit-Leftwich et al. 2010) and educational policy documents (e.g., US Department of Education technology plan, ISTE National Educational Technology Standards for Teachers, and the UNESCO ICT Competency Standards for Teachers) that described how technology had been used in the past and which also advocated how technology should currently be used.

Using these resources, we created a research-based conditional matrix that identified key topics/practices, along with research studies that provided evidence of the impact of those practices at the teacher knowledge level, teacher application level, and/or student achievement level (see Fig. 1 for example excerpt).

Fig. 1 Research-based conditional matrix of teachers’ uses of technology

A definition was provided for each main topic (see Fig. 2 for an example). Content validity was established by synthesizing the literature that described teachers’ technology uses and/or technology topics covered by teacher education programs (Fink 2003). These categories were then examined, revised, and validated by a team of university faculty, K-12 teachers, and educational evaluation experts selected by the US Department of Education based on experience and expertise in the area of technology integration. Using external expert reviewers, we were able to establish the face validity of the questionnaire, confirming that the measure included all the necessary questions and covered all the necessary constructs (Fink).

Fig. 2 Definitions of topic categories

The final categories/topics included in the questionnaire were: personal productivity, information presentation, administration/classroom management, communication, access/use of electronic resources, analysis of student data, facilitation of specific teaching concepts, documentation of personal/professional growth, support for student learning styles, support of higher-order thinking skills, support for students with special needs, and classroom preparation. Examples for each category were provided so teacher educators would understand what each topic meant (see Fig. 3). Teacher educators were asked to indicate whether all programs, some programs, or none of their programs covered each topic.

Fig. 3 Technology topics included on the questionnaire for teacher educators

We used the responses of the 426 respondents to test the reliability of the questionnaire. Cronbach's alpha, a measure of internal consistency, was 0.86 for this portion of the questionnaire. The second item in this section was open-ended and asked respondents to indicate what technology topics they perceived to be the most important topics incorporated into the curriculum of their programs.
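For readers unfamiliar with this internal-consistency estimate, the sketch below shows one way Cronbach's alpha can be computed from a respondents-by-items matrix. It is an illustration only; the demo data are random placeholders, not the study's responses, and the same calculation applies to the teacher questionnaire reported later.

```python
# A minimal sketch of Cronbach's alpha for a respondents-by-items matrix of
# 0/1 topic selections; the demo data are random placeholders, not study data.
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores)."""
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)
    total_variance = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(0)
demo_scores = rng.integers(0, 2, size=(426, 12))  # 426 respondents, 12 topics
print(round(cronbach_alpha(demo_scores), 2))
```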

Technology-Using Teacher Questionnaire

The teacher questionnaire consisted of 23 items separated into three sections. The first section contained seven items focusing on demographic information such as current teaching position and location, grade level/content area, and years of teaching experience. Similar to the inclusion of demographic information for teacher educators, this information was used to identify potential follow-up case participants (using grade level/content areas).

The second section contained 12 items focusing on ways in which teachers used technology to support their teaching in a typical week, and the types of technology experiences completed in their teacher education programs (e.g., educational technology courses, technology activities in methods courses). Because most teachers indicated that their pre-service technology experiences were not influential, with most having graduated over 10 years earlier, we did not include their responses to these items in this research. Instead, we focused on the technology that teachers used on a regular basis. The final section asked respondents to rate their technology expertise, to rate their pre-service teacher experiences, and to provide additional contact information if they were willing to participate in a follow-up interview.

The question focusing on technology topics utilized the same categories from the teacher educator questionnaire. The only difference was that we asked teachers to select the topic(s) that best matched the ways they used technology to support their teaching during a typical week (see Fig. 4).

Fig. 4 Technology topic question for teachers

We used the responses of the 316 teacher respondents to test the reliability of the questionnaire. Cronbach's alpha, a measure of internal consistency, was 0.70 for this portion of the questionnaire. The next three items in this section were open-ended and asked respondents to indicate other ways they used technology to support their teaching and what they believed were the best ways to use technology to support teaching and learning.

Follow-Up Interviews and Document Collection

Teacher Educator Follow-Up

We identified teacher educators from 12 institutions to participate in follow-up interviews, based on the recommendation of Guest et al. (2006), who indicated that data saturation typically occurs at 8–12 participants. Similar to the questionnaire instrument, items on the semi-structured interview protocol were developed by a team of experts in technology integration and approved by the team of university faculty, K-12 teachers, and educational evaluation experts selected by the US Department of Education. This helped establish the face validity of the instrument (Patton 2002). For example, the interview protocol initially included the question, “What competencies do the students leave with?” The experts recommended adding examples to make this question more concrete and prompted us to ask teacher educators to provide information on the technology topics and/or areas that were included in their teacher education programs (question 3).

The interview protocol was pilot tested with other teacher educators (not identified for the follow-up study). This helped us identify unclear questions or areas of focus. Based on these pilot tests, slight wording modifications were made. In addition, some follow-up probing questions were added to elicit specific examples, while some questions were removed because they duplicated other items. In one question, we asked the pilot test teacher educators to tell us, “How much interaction with technology do you think pre-service teachers get outside of the required courses?” Respondents had a difficult time answering this question and often asked for clarification. Therefore, we modified this question to, “How do pre-service teachers use technology in other contexts (e.g., methods classes, field experiences, etc.…)?” The interview protocol consisted of nine broad questions, focusing on the technology topics/areas included in their teacher education programs, unique aspects of their specific programs with regard to technology integration, and challenges faced when attempting to infuse technology into their programs (see Fig. 5).

Fig. 5 Teacher educator interview protocol

Teacher Educator Supplemental Documents

For each of the 12 institutions selected for follow-up analysis, specific documents were also collected from a variety of sources (e.g., program websites, faculty). These documents included syllabi for various technology courses, overviews/program sheets for the teacher education programs offered at the institutions, sample assignments, course materials, and student work. The supplemental documents were collected to triangulate the data sources and themes emerging from the interviews and questionnaires (Stake 1995). Stake noted that through the use of methodological triangulation, “we are likely to illuminate or nullify some extraneous influences” (p. 114).

Technology-Using Teacher Follow-Up

Twenty-seven teachers agreed to participate in follow-up interviews. We selected individuals from both elementary and secondary schools, as well as the four core subject areas at the secondary level. Similar to the teacher educator interview protocol, items on the semi-structured interview protocol for technology-using teachers were developed by a team of experts in the area of technology integration and approved by the team of university faculty, K-12 teachers, and educational evaluation experts selected by the US Department of Education. To strengthen the validity of the interview protocol, experts made recommendations to improve the focus and suggested follow-up probes for questions.

The interview protocol was also pilot tested with other teachers (not identified for the follow-up study). This helped us identify unclear questions or areas of focus. Based on these pilot tests, small revisions were made, some follow-up probing questions were added, and a few questions were removed. The final interview protocol consisted of ten broad questions, focusing on how and why these teachers use technology for teaching and learning (Fig. 6).

Fig. 6 Technology-using teachers’ interview protocol (English language arts example)

Technology-Using Teacher Supplemental Documents

For each of the participating teachers, specific documents were collected from a variety of sources (e.g., teacher websites, e-mail correspondences). These documents included specific teacher-developed activities, sample assignments, course materials, and student work. It was important to collect these documents to (a) provide additional context for teachers’ uses of technology (data source triangulation) and (b) provide triangulation for the technology uses they described during their interviews (methodological triangulation) (Stake 1995).

Preparing to Obtain the Data

It was critical to establish an online questionnaire that was easy to use and read, as well as one that organized the data and was easily accessible to the researchers. In this study, the questionnaire data were collected through a secure online survey tool. The data were then downloaded into a spreadsheet and organized. The more critical organizational plans related to the follow-ups for both populations, as there were 39 cases whose data (questionnaires, interviews, documents) needed to be organized. Furthermore, we needed to be able to organize and compare the data across those cases. As Stake (2006) noted, organization in multiple case studies is essential. Patton (1980) recommends utilizing a case record which “pulls together and organizes the voluminous case data into a comprehensive primary resource package … information is edited, redundancies are sorted out, parts are fitted together, and the case record is organized for ready access either chronologically [or] topically … complete but manageable” (p. 313). In this study, the data were organized by themes within each case (see Fig. 7).

Fig. 7 Example of case record organization

For each case, documents (websites, syllabi, files, student work, etc.) were all uploaded to a secure server. A large spreadsheet was set up, with each case receiving its own tab. A description of and link to each document were included in that case's tab. This enabled us to view all the documents, interview transcripts, and questionnaire responses for one case within a single location (Merriam 1998).
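As a rough illustration of this case-record organization, a comparable structure could be sketched in code as follows; the field names and example values are hypothetical, not drawn from the study's actual spreadsheet.

```python
# A hypothetical sketch of one case record: questionnaire response, interview
# transcript, and links to supplemental documents grouped under a single case.
from dataclasses import dataclass, field

@dataclass
class CaseRecord:
    case_id: str                 # e.g., "Teacher Educator L" or "Teacher 318"
    population: str              # "teacher educator" or "practicing teacher"
    questionnaire_response: dict
    interview_transcript: str    # path on the secure server
    documents: dict = field(default_factory=dict)  # description -> server link

record = CaseRecord(
    case_id="Teacher Educator L",
    population="teacher educator",
    questionnaire_response={"institution_type": "public", "size": "large"},
    interview_transcript="server/transcripts/teacher_educator_L.txt",
    documents={"technology course syllabus": "server/docs/case_L/syllabus.pdf"},
)
```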

Obtaining the Data

The supplementary documents (e.g., syllabi, course websites, etc.) were obtained from publicly available websites and via e-mail with the individual participants. To locate publicly available documents and websites, the researchers conducted in-depth searches. The documents found in these searches were used to focus the follow-up interview questions for each teacher educator and practicing teacher. Furthermore, all documents were member-checked with the interviewee to ensure the documents were theirs. Finally, each interviewee was asked to supply additional documents illustrating concepts covered during the interview.

Interpreting the Results

To answer our research questions, we began by examining the basic demographics of those responding to the questionnaire. For the teacher educators, this helped identify the context of their programs: public or private, location of the institution, and size. For practicing teachers, this helped identify who was engaged in classroom teaching, as opposed to serving as a technology coach, media specialist, or administrator. Demographic data were analyzed using frequency counts.

Next, we examined the technology topics teacher educators were covering in their programs. This closed-ended questionnaire item asked teacher educators to indicate whether each topic was included in all programs, included in some programs, optional, or not included in their teacher education programs. To compare these topics to those considered important to practicing teachers, the same list of technology topics was provided to our sample of classroom teachers, who were asked to indicate which ones were used during a typical week. Our main goal was to compare the topics being covered by a majority of the teacher education programs with how teachers were using technology. If there were no significant difference between the two groups, we could surmise that the majority of teacher education programs were preparing future teachers to use the kinds of technology current teachers actually use in their classrooms. However, if there were a significant difference, this would suggest that the majority of teacher education programs are either teaching topics that teachers do not utilize in their own classrooms or are not covering the topics teachers do use. Therefore, a chi-square test was used to compare the differences, on each topic, between teacher educators and practicing teachers.

With the chi-square test, we needed to limit the variables in the teacher educator responses. Therefore, instead of looking at whether technology topics were included in all programs, included in some programs, optional, or not included, we decided to look only at the technology topics required by all teacher education programs. If we examined topics that were listed as being “optional” or only required in “some” teacher education programs, it was likely that these topic areas did not target most of the teacher education graduates in these programs. Significant chi-square results prompted our subsequent use of effect sizes to further differentiate between topics. The effect size “characterizes the degree to which sample results diverge from the null hypothesis” (Cohen et al. 2000, p. 610). An effect size above .3 is considered moderate to strong (Cohen et al.). Thus, based on our results, there was a strong difference between samples on the following topics: administrative purposes (.508), communication (.482), access and use electronic resources (.344), analyze student achievement data (.300), teach specific concepts (.335), support a variety of learning styles (.401), and support higher-order thinking (.334).
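As a sketch of how one such per-topic comparison could be computed, the example below treats each topic as a 2 x 2 table (group by topic endorsement) and derives both the chi-square statistic and an effect size. The counts are illustrative placeholders and do not reproduce the values reported above.

```python
# A sketch of one per-topic comparison: rows are teacher educators (n = 426)
# and practicing teachers (n = 316); columns are "topic endorsed" (required in
# all programs / used in a typical week) and "not endorsed". Counts are
# illustrative placeholders, not the study's data.
import numpy as np
from scipy.stats import chi2_contingency

def chi_square_with_effect_size(table: np.ndarray) -> tuple[float, float, float]:
    chi2, p_value, _, _ = chi2_contingency(table)
    cramers_v = np.sqrt(chi2 / (table.sum() * (min(table.shape) - 1)))
    return chi2, p_value, cramers_v

communication = np.array([[250, 176],    # teacher educators
                          [300,  16]])   # practicing teachers
chi2, p, v = chi_square_with_effect_size(communication)
print(f"chi2 = {chi2:.1f}, p = {p:.4g}, Cramer's V = {v:.3f}")
```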

Once these significant differences were identified, it was important to gain a better understanding of why they occurred. Therefore, we examined participants’ responses to the open-ended questions to help illuminate the reasons behind these differences. One of the open-ended items on both questionnaires was designed to garner perceptions of the importance of specific technology uses. Teacher educators were asked to describe the most important technology topic covered in their teacher education programs, while practicing teachers were asked to describe the best ways to use technology for teaching and learning. By examining the results of this open-ended question, we hoped to identify why certain topics were emphasized in teacher education programs but not used by teachers, or vice versa. The results reported were based on the percentage of teacher educators (n = 366) and practicing teachers (n = 312) who responded to the open-ended questionnaire item (see Fig. 8).

Fig. 8 Comparison of percentage of responses by teacher educators and practicing teachers regarding important technology topics/uses

To code the open-ended responses, we used a deductive code list generated from the closed-ended question described above (which was previously constructed from the literature and approved by experts). Using a constant-comparative method, we combined two codes with others due to overlap in teacher responses. The Teaching Specific Concepts code was folded into the Classroom Preparation code, since both emphasized searching for resources and lesson planning to teach specific concepts. We also combined the Learning Styles code with the Special Needs code; when teachers referenced using technology to address learning styles, they typically mentioned that technology could be used to address both the special needs and the learning styles of individual students.
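A small sketch of how these merges could be applied to already-coded responses is shown below; the combined category labels and the helper function are illustrative assumptions rather than part of the study's documented procedure.

```python
# A hypothetical helper that re-labels assigned codes according to the merges
# described above, dropping duplicates while preserving order.
CODE_MERGES = {
    "Teaching Specific Concepts": "Classroom Preparation",
    "Learning Styles": "Learning Styles / Special Needs",
    "Special Needs": "Learning Styles / Special Needs",
}

def merge_codes(assigned_codes: list[str]) -> list[str]:
    merged = [CODE_MERGES.get(code, code) for code in assigned_codes]
    return list(dict.fromkeys(merged))

print(merge_codes(["Communication", "Learning Styles", "Special Needs"]))
# -> ['Communication', 'Learning Styles / Special Needs']
```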

To increase the reliability of the coding, four researchers worked together to establish definitions and representative example responses. The four researchers then used that code list, with descriptions and examples, to code participant responses separately; two researchers reviewed all the teacher educator responses, and two reviewed all the teacher responses. The researchers then examined every coded response on which they disagreed and resolved each disagreement individually.

To compare differences in the frequencies of responses between teachers and teacher educators on the closed-ended and open-ended questionnaire items described above, Pearson’s chi-square analyses were conducted on those item responses. Since multiple tests were conducted on each item, the alpha level was set at a more conservative .05/10 = .005. To further determine the magnitude of the effect for each comparison, Cramer’s V was computed and reported for each chi-square test conducted. Seven of the ten codes were found to be significantly different.
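The sketch below illustrates how the ten per-code tests could be run against the adjusted threshold. The contingency counts are placeholders (rows: teacher educators, n = 366; teachers, n = 312), so the printed values will not match those reported in the chapter.

```python
# A sketch of running each per-code comparison against the Bonferroni-style
# threshold described above (alpha = .05 / 10 = .005); counts are placeholders.
import numpy as np
from scipy.stats import chi2_contingency

ALPHA = 0.05 / 10

code_tables = {
    "Higher-order thinking": np.array([[60, 306], [148, 164]]),
    "Classroom preparation": np.array([[112, 254], [45, 267]]),
    # ... one 2 x 2 table (mentioned / not mentioned) per deductive code
}

for code, table in code_tables.items():
    chi2, p, _, _ = chi2_contingency(table)
    cramers_v = np.sqrt(chi2 / table.sum())  # min(rows, cols) - 1 = 1 for 2 x 2
    flag = "significant" if p < ALPHA else "not significant"
    print(f"{code}: p = {p:.4g} ({flag}), Cramer's V = {cramers_v:.3f}")
```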

Each code was then used to examine the significant differences among topics shown in Table 1. For example, there was a significant difference between teacher educators and teachers regarding how they reported using technology to support higher-order thinking skills. On the closed-ended question, teachers reported using technology to achieve this goal much more frequently than teacher educators reported covering the topic during their classes. In the open-ended question, almost half of the teachers described this as one of the best ways to use technology for teaching and learning. Teachers mentioned that the best uses of technology to facilitate student learning included using the collaborative capabilities of technology (e.g., “The best ways to use tech to support teaching and learning are to take advantage of its collaborative abilities. Connect your students to the world around you.” [Teacher 426]), increasing student engagement (e.g., “Creating interactive lessons with visuals and high-interest activities engages the students.” [Teacher 396]), or facilitating student-centered activities (e.g., “Student-centered technology [is the best use]. The ability to have each student investigate and use technology.” [Teacher 318]). This suggests that teachers may value using technology to support higher-order thinking more than teacher educators do.

Table 1 Comparison of selected technology topics covered by teacher education programs and those used on a weekly basis by teachers

In the second phase of the study (39 case studies of teacher educators and teachers), researchers used multiple case analysis procedures to analyze data obtained from interviews and artifacts. For each case, data were organized topically by the codes established in the first phase, thus developing a case record for each teacher and teacher educator participating in follow-up data collection (Yin 2003). One researcher reviewed each case record and recorded margin notes on emerging themes. The research team then collectively discussed the themes emerging both within and across the cases. Refer to Table 2 for examples of codes and corresponding themes (not all are listed).

Table 2 Emerging theme examples based on topic codes

Presenting the Results

The quantitative results were presented first to provide an overall picture of the similarities and differences between teacher education programs and technology-using teachers. Once this general picture was provided, we examined the nuances that emerged from the qualitative data. The qualitative data also provided an opportunity to explain the differences. Qualitative data were presented using verbs like “described” or “presented” to convey that we were summarizing the participants’ statements. For example, we discussed one teacher educator’s interview response as follows:

…described the importance of having pre-service teachers using Web 2.0 to collaborate: ‘You know, if every kid is making their own PowerPoint, that’s interesting – but if kids are getting together to discuss how to build one PowerPoint and it’s a group of aspect of the PowerPoint, you’ve got much richer and more meaningful use of technology there. And I think by focusing on the collaborative aspect of Web 2.0 technology, you get your foot in the door there, very naturally too’ [Teacher Educator L, lines 94–98]. Analysis of course assignments revealed that many programs incorporated other Web 2.0 tools (e.g. Google Docs, Titanpad) into the activities pre-service teachers completed. (p. 18)

When interpretations were presented, we attempted to use as much of the participant’s own language as possible. In addition, we triangulated any responses with additional data sources to increase the trustworthiness of our reported results.

Closed-Ended Questionnaire Item

Based on our results, there was a strong difference between samples on the importance and/or use of technology for the following: administrative purposes (.508), communication (.482), to access and use electronic resources (.344), analyze student achievement data (.300), teach specific concepts (.335), support a variety of learning styles (.401), and to support higher-order thinking (.334) (see Table 1).

Open-Ended Questionnaire Item

Teacher educators were asked to describe the most important technology topic covered in their teacher education programs, while practicing teachers were asked to describe the best ways to use technology for teaching and learning. Based on the descriptions provided by teacher educators, the most important technology topic was introducing future teachers to how to use technology for classroom preparation and to teach specific concepts (30.6 %). In contrast, when teachers were asked to describe the best ways to use technology for teaching and learning, almost half (47.4 %) described technology uses that supported higher-order thinking. To further examine the magnitude of the effects, Cramer’s V was computed for all comparisons. Based on these data, a moderate to strong effect size was calculated for using technology to support higher-order thinking (.429). This topic showed the widest disparity between teachers and teacher educators in terms of perceptions regarding the importance of specific technology uses (see Table 3).

Table 3 Comparison of perspectives regarding the importance of specific technology uses: teacher education representative perceptions versus teacher perceptions

Multiple Case Records

Analysis of interview and artifact data revealed several emerging themes that highlighted differences between teacher education programs and K-12 teachers with regard to the use of technology to support teaching and learning. These main differences included communication, analyzing student data, documenting professional growth, and supporting higher-order thinking skills.

Communication referred to using e-mail, websites, newsletters, and/or blogs to communicate with parents and students. Almost all of the 27 teachers interviewed discussed using technology for communication purposes in their classrooms. They described using a range of technologies from more traditional newsletters and websites, to blogs and e-mail. In contrast, very few teacher educators interviewed mentioned preparing pre-service teachers to use technology for communication purposes. For the few that did, pre-service teachers created newsletters or static websites to inform “parents” of classroom events.

Analyzing student data included statements about using technology for data-driven decision making, feedback, and assessment, specifically examining student data. For this particular code, three distinct themes emerged. Teachers mentioned using classroom performance systems (clickers) and portfolios for assessment purposes. Teacher educators did not discuss either of these themes, but some described the importance of designing assessments that aligned with objectives.

Using technology to document or engage in professional growth included any informal (e.g., collaboration with other teachers) or formal uses (e.g., e-portfolios). Most teachers responded that technology provided them with a constant source of professional growth. With the amount of resources and information available on the Internet, teachers established their own personal learning networks (PLNs) through a variety of social media sites (e.g., Twitter, blogs, Google bookmarks). One teacher stated that the Internet was “…a floodgate. I’m just constantly bookmarking, dog earing different things here and there.” Among teacher educators, perhaps the strongest theme revealed during the interviews was the use of electronic portfolios. Electronic portfolios tended to be used to encourage pre-service teacher reflection and documentation of technology skills and pedagogical knowledge. Most teacher educators indicated that portfolios required pre-service teachers to document how they addressed the standards.

The code for using technology to support activities that facilitate higher-order thinking skills focused on two themes: using Web 2.0 tools to support student collaboration and using technology to support project-based learning. Both teachers and teacher educators discussed these two themes. Teachers reported using various technology tools to facilitate student collaboration. One teacher described using blogs and the commenting feature: “I allow them to comment on each other’s blogs. We have a lot of discussion.” Teacher educators did not discuss teaching pre-service teachers how to use technology to support K-12 collaborative projects. Instead, teacher educators described how they modeled the use of technology for collaboration by assigning pre-service teachers to group projects and using Web 2.0 technology to facilitate the collaborative activities involved in completing those projects.

Discussion

The results of the study have relevant implications for practice because they help us identify areas of disconnect between preparation programs and actual practice. By investigating this gap, we have accomplished several things. First, we documented several areas in which teacher education programs may not be preparing teachers to be successful in the field. For example, although 99.2 % of teachers reported using technology for communication on a weekly basis, only 59.3 % of teacher education programs reported covering this topic. These results can cue teacher educators as to the importance of this topic area to practicing teachers and thus suggest the need to address this during teacher preparation programs. Another example of an informative result is in the documentation of professional growth.

The research question posed at the beginning of the study focused on examining gaps between the current technology integration topics included in pre-service teacher education programs and the ways in which practicing teachers use technology to support their teaching and learning efforts. Specifically, we asked, “What are the similarities and differences between the technology topics included in pre-service teacher education programs and the technology topics teachers find relevant and meaningful to their teaching/learning practices?” To investigate this, it was important to gather information from both teacher education programs (to see how we prepare teachers for practice) and teachers (to see what teachers are actually doing). Furthermore, it was critical to survey a large sample from both populations to determine what they typically do. The online questionnaire helped us gather this information from a large sample of both populations. After using this information to identify common themes, we needed to follow up in order to gather specific examples of those themes. Therefore, interviews and additional documents were gathered in case study format to understand how these themes were manifested in specific bounded contexts.

This study was commissioned by the US Department of Education to address concerns regarding the preparation of students in teacher education programs for meaningful technology use. Based on our own experiences as teacher educators, we felt compelled to examine this concern across teacher education programs nationwide. As noted in the first author’s previous publications (Ottenbreit-Leftwich et al. 2010), she has made a concerted effort to elevate the voices of teachers to promote technology uses that align with their own values and needs. Results of this study confirmed that there are several technology topics/uses for which teachers and teacher educators differed in terms of the frequency of inclusion in teacher education programs versus the prevalence of use in the classroom. Although teacher educators are addressing a wide variety of topics in their programs, these are not completely aligned with the topics or uses that classroom teachers most value, as indicated by the technology they incorporate into their classrooms on a regular basis. Continued efforts are needed to provide future teachers with the skills and knowledge they need to be effective technology-using teachers.