The challenges and opportunities in authentically and effectively integrating technology into higher education are undeniable. Faculty in higher education struggle with how to effectively embed technology-enhanced learning into the curriculum while also preparing students to survive and succeed in an increasingly digital future (Wang and Hannafin 2005). As new generations of learners enter higher education campuses, we encounter students with varied exposure to advanced technologies and applications in educational settings. This variation in technology exposure and expectations requires well-trained, technologically versatile faculty and administrative staff to support growing demand.

Despite evidence that blended learning and instructional technologies have the potential to improve learning outcomes (Arkorful and Abaidoo 2015; Bester and Brand 2013; Francis and Shannon 2013), educators’ use and adoption of instructional technologies vary widely. Bennett (2014) found that “the overriding driver for [the] uptake of TEL [technology-enhanced learning] practices appeared to be the desire to improve or deliver high quality learning for their students” (p. 9). In other words, educators found pedagogical value in these types of tools. Relatedly, the current research project developed from a cross-disciplinary group of educators’ desire to improve student learning through the effective application of learning technologies. One aspect of the research was the development and administration of surveys to faculty and students to understand the ways in which instructional technologies are used and viewed. At the time of this study, the researchers were members of a professional learning community focused on instructional technologies and digital literacy. As a learning collective, the group members offer a unique perspective and a thick description of the learning experience at one institution. Educators and students may benefit from research that accurately portrays the nature and scope of instructional technology use in teaching and learning, as well as the learning process and results that may occur through a professional learning community. These data provide a point of departure on which future curricula and professional learning programs could be based.

Theoretical Basis for Research Approach and Design

Active learning is important in the educational process (DuFour and DuFour 2010). DuFour and DuFour’s (2010) work, “Learning By Doing,” serves as the theoretical foundation for this learning community and its subsequent learning process. The authors define a professional learning community (PLC) as “an ongoing process in which educators work collaboratively in recurring cycles of collective inquiry and action research to achieve better results for the students they serve” (DuFour and DuFour 2010, p. 11). Working collaboratively in teams is an important feature of the process: “...collaboration represents a systematic process in which teachers work together interdependently in order to impact their classroom practice in ways that will lead to better results for their students, their team, and for their school” (DuFour and DuFour 2010, p. 12). PLCs are committed to continuous improvement and are results oriented. Trust et al. (2017) document post-secondary educators’ desire for communities that support learning and professional growth. Another reason faculty may commit to participating in this type of collaborative process is that it offers a “more effective, more gratifying” way to approach the work of educating (DuFour and DuFour 2010, p. 16). Professional learning communities have been embraced in the scholarship of teaching and learning, as well as in practice (Prenger et al. 2018; Stoll et al. 2006). PLCs help us to understand how students learn and how educators can help them learn more effectively.

Instructional Technology: The How and Why

A thorough discussion of instructional technology is outside the scope of this research, yet it is important to establish a definitional baseline before proceeding. Instructional technology concerns systematic approaches to the design of instruction for a specific audience to fill a specific skill or knowledge gap, making use of appropriate technology solutions (e.g., classroom-based instruction, asynchronous online instruction, or synchronous virtual instruction). For example, instructional technology focuses on tools that help educators and instructors create a program of study, divide it into a series of classes, and divide those classes into modules or lessons (Garrison and Kanuka 2004). These technologies are ever changing and adapting to the newest innovations (Johnson et al. 2016). Instructional technologies are not only a way to convey information; they are also meant to be interactive and collaborative (Zhang et al. 2006).

There are significant challenges that hamper technology adoption and integration across universities. Some scholars believe digital literacy is a solvable challenge, in that institutions of higher education know how to move beyond the acquisition of isolated technological skills and help students by “generating a deeper understanding of digital environments, enabling intuitive adaptation to new contexts and co-creation of content with others” (Becker et al. 2017, p. 2). The same report indicates that institutions are “charged with developing students’ digital citizenship, ensuring mastery of responsible and appropriate technology use, including online communication etiquette and digital rights and responsibilities in blended and online learning settings and beyond” (Becker et al. 2017, p. 22).

Evolving instructional technologies affect the plans and procedures of institutions of higher education. This raises questions about how instructors can effectively embed these tools into curricula and how to best prepare students to survive and succeed in an increasingly digital future. Other challenges relate to integrating technology use into the values and rewards systems of higher education institutions. Finally, significant questions need to be asked about the role and impact of technology in the lives of faculty and students across the spectrum of learning contexts and in society more broadly.

The Case: A Baseline for Instructional Technology Use at One Institution

This research is a product of iterative discussion within a learning community of faculty and administrators about the role of instructional technology in the higher education classroom. Specifically, the research explores the ways in which faculty and students engage with and perceive instructional technologies at a single higher education institution in the Southeastern United States. Framed by existing challenges and opportunities around instructional technology use nationwide, this manuscript offers a case of one institution’s attempt to set a baseline for technology-enhanced learning. The researchers used a design-based research process (Barab and Squire 2004) and collected data in three ways: professional learning community discussion, survey implementation, and presentations and feedback. These three data components unfolded over approximately one year of research.

The Team: A Local Professional Learning Group

The research was formalized as a local professional learning group organized under the faculty technology support department, known as the Teaching and Learning Team (TLT), and facilitated by an instructional technologist. These university-supported groups of educators meet regularly, share expertise, and work collaboratively to improve teaching skills and the academic performance of students. In her discussion of professional learning groups, Hord (2008) effectively captured the ethos of our efforts: “The three words explain the concept: Professionals coming together in a group—a community—to learn” (p. 10). Initially, the group agreed that its primary goal was to improve student learning by examining the relationship between teaching practices and student outcomes. Importantly, in addition to the group members’ quest to improve their own instructional outcomes, the university lacked baseline data on instructional technology use on which to base improvements and design supports. The format of a collective learning group allowed for reflection on, and documentation of, our learning and challenges along the way.

The authors of this initiative, heretofore unknown to one another, are members of varied departments, which brought diverse experiences and knowledge to the work (Wicks et al. 2014). This mix of backgrounds and disparate technological abilities would ultimately prove to parallel the range of technological ability, interest, and use among faculty throughout the university. These distinctions facilitated dynamic inquiry and open discussion within the group. In their quest to examine the relationship between teaching practices and student outcomes, the researchers required a solid understanding of how and why instructional technologies were used by faculty and students within the institution of study.

The Research Process: An Iterative, Mixed-Methods Approach to Data Collection

Characteristic of the professional learning community process, this research can be described as design-based research: an iterative process that gives the researchers an unusual role. “Design-based researchers are not simply observing interactions but are actually ‘causing’ the very same interactions they are making claims about” (Barab and Squire 2004, p. 9). The researchers used a cyclical approach to gathering data through mixed methods that moved from qualitative discussion to quantitative data collection (Skerratt 2005, p. 123). This cyclical approach is based on the understanding that the topic of study is complex and requires a multi-method investigation, as well as opportunities to review the data and their interpretation, including review with participants, the ‘subjects’ of the study (Skerratt 2005).

As a professional learning community, the researchers collaborated not only with one another but also with other stakeholders within the institution. Were the views of the research team representative of the entire institution of inquiry? Continued discussion and feedback from others within the case institution informed the problem identification, processes, and findings. In line with Barab and Squire’s (2004) definition of design research, the ultimate aim of this study was to “potentially impact learning and teaching in naturalistic settings” (p. 2).

Discussion and Reflection

Qualitative inquiry took place through meetings of the professional learning community from September 2017 through August 2018. These conversations were documented and served as a reference for the group’s thematic content and research process. The researchers’ iterative discussions guided and refined the research objectives and survey design.

Terms such as digital learning, instructional technology, learning technology, and e-learning are all employed in the field but are not easily differentiated (Greenhow et al. 2009). As such, the researchers spent much of the early weeks working toward a common understanding of the language employed and discussing how to approach the study in ways that would be most meaningful to the entire campus. Common to the group was an interest in improving students’ academic experience and a need to understand the current use of technology on campus. However, as the group evolved and the survey was developed, it became clear that the authors lacked data about students’ abilities, preferences, and outcomes related to instructional technology. Both students and faculty rejected the use of technology for “technology’s sake” and wanted to see concrete benefits from its use.

Through group discussions, the members discovered that individual faculty, even within the same schools, were using different software in instruction. These experiences led the researchers to agree that multiple technological platforms may create confusion and additional challenges for students. The presence of competing software platforms introduced complexities to the research as well, specifically in the survey design. The group’s discussions also revealed a level of fear surrounding technology use, as well as a lack of time and attention that precluded faculty from adopting new technology.

The researchers soon realized that they did not have enough information about instructional technology use at their institution. They required a baseline and a better understanding of how other colleagues at the institution framed and enacted these tools. Accordingly, the focus shifted to a broader investigation of the habits and patterns of faculty, instructors, and students at the university of study. This research was timely, as the discussion and debate held within the professional learning group mirrored that held within the administration of the institution.

Faculty and Student Surveys

The second component of data collection comprised surveys that addressed the knowledge, skills, and dispositions of faculty and students around instructional technology at the case institution. Both the extant literature and our observations and discussions as faculty members informed the survey objectives and design. The surveys addressed three main sets of research questions:

  1. How do faculty and instructors use learning technologies? How do they perceive learning technologies?

  2. How do students use learning technologies? How do they perceive learning technologies?

  3. What advantages do learning technologies offer? What barriers exist to using learning technologies in the classroom?

The construction of the surveys began with the review and modification of an existing survey that examined the use of instructional technology in settings from elementary through higher education. Next, the team developed a composite instrument that was reviewed and tested within the learning group and by other colleagues within the institution who were knowledgeable about this content. To establish item validity, the researchers undertook a content validation phase with experts familiar with instructional technology use in order to develop definitions for item development (McKenzie et al. 1999). After the validation phase, the researchers adjusted the instrument based on the experts’ feedback. Specifically, survey questions were designed to gauge the topics central to this study: literacy and knowledge; type and frequency of tools; outcomes; and sentiment (Table 1).

Table 1 Faculty and Student Survey Topics

The research population for the faculty survey included all faculty, including adjuncts, library faculty, and all professor ranks. As of 2017, there were 878 full-time and part-time faculty at the college. For the student survey, the population consisted of all undergraduate and graduate students at the college of study. As of 2017, there were 10,863 undergraduate and graduate students at the college. To encourage participation, we emailed an “invitation to participate” to all eligible participants. It was important that the names of the members of our cross-disciplinary, faculty-led research team appear in all recruitment materials.

The final survey instrument comprised 22 questions and required approximately 10 minutes for participants to complete (see example questions in Table 1). It included eight opportunities for open-ended responses (see Table 2). The open-ended questions gave respondents the opportunity to elaborate on instructional technology use specific to their unique needs and circumstances. The survey instrument was administered over the course of the Spring semester for six weeks (March 16–April 6, 2018). The researchers obtained a 20.56% response rate from faculty (924 contacted, 190 responses) and a 5.38% response rate from students (11,114 contacted, 598 responses). All completed answers were accepted, even when surveys were not completed in full; thus, response numbers vary by question. Although the faculty response rate was strong, the researchers recognized the low response rate from students.
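The reported rates follow directly from the contact and response counts above. The following minimal Python sketch is illustrative only and is not part of the original study’s tooling:

```python
# Illustrative only: reproducing the response rates reported above
# from the contact and response counts given in the text.

def response_rate(responses: int, contacted: int) -> float:
    """Return the response rate as a percentage."""
    return 100 * responses / contacted

faculty_rate = response_rate(responses=190, contacted=924)
student_rate = response_rate(responses=598, contacted=11_114)

print(f"Faculty: {faculty_rate:.2f}%")   # 20.56%
print(f"Students: {student_rate:.2f}%")  # 5.38%
```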

Table 2 Total Number of Substantive Qualitative Responses from Faculty and Students

The professional learning group took a joint approach to analyzing these data. First, all members of the group reviewed the qualitative data individually. Next, the group members held a brainstorming session to articulate common responses individually and then determine them collectively; these common responses served as the basis for a qualitative coding scheme. This discussion resulted in a unique coding scheme for each of the open-ended questions. Two members of the research team then coded the faculty and student responses using the agreed-upon scheme. These preliminary results were shared with the entire research team for another round of review and discussion. The qualitative findings are reported in the Survey Findings section of the paper.
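As a concrete illustration of the tallying step only (the team’s actual tools are not described in this paper), note that a response assigned multiple codes contributes to multiple tallies, consistent with the note accompanying Table 2. A minimal sketch, using codes drawn from the faculty categories reported later and invented respondent IDs:

```python
# Hypothetical sketch: tallying an agreed-upon coding scheme for one
# open-ended question. Respondent IDs are invented examples.
from collections import Counter

coded_responses = {
    "f001": ["creation", "assessment"],
    "f002": ["consumption/delivery"],
    "f003": ["creation", "research"],
}

# Each response may carry multiple codes, so total tallies can exceed
# the number of respondents.
code_counts = Counter(
    code for codes in coded_responses.values() for code in codes
)

for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```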

Presentations and Feedback

Collaboration is central to design research and was central to this study as well. Specifically, the researchers sought the expertise of administrators and faculty both in the construction of this research and during data collection. At the time of this writing, the learning community was actively reporting its findings to various college constituents and external academic communities.

Content validation informed our survey design. Both the formal survey results and informal feedback from survey participants contributed to our understanding of instructional technology in this research setting. For example, after the survey was administered, respondents offered comments and suggestions to the principal investigator related to their individual use of technology in the classroom. Some questioned response options, such as the instruction to select “weekly, monthly, or never” in relation to software use. Others questioned the ways in which we had thought about certain groups, such as library faculty and nontraditional students. Although these comments were not captured as formal survey results, the learning group worked through them in discussion, and they are reflected in our research documentation.

One of the group’s first tasks after completing the survey analysis was to share preliminary findings with the university’s information technology officers. This administrative unit supported the group’s research and endorsed the survey through the letter of invitation that was emailed to the sample groups. After sharing with this administrative unit, the researchers validated the survey findings and considered next steps for using the information that had been collected. Relatedly, the researchers were especially interested in the faculty response of “I don’t know” to some basic inquiries on the survey. This common response indicated that although faculty may be employing technology in the classroom, they may be uncertain of the pedagogical rationale and related outcomes. Greater knowledge of the benefits and anticipated outcomes of instructional technology may benefit this faculty population.

Information technology administrators recognized a need to articulate a vision to support instructional technology. What is the potential of instructional technologies, and how can the university support these goals? Guided by this research, the university plans to form a task force and schedule town halls on the topic of digital literacy in order to make strategic improvements in this area. In addition to meeting with information technology administrators, members of the learning community presented preliminary findings at the university’s annual faculty-led teaching and technology conference in Spring 2018.

Results

A significant component of this research was designing and implementing a survey to better understand the ways in which faculty and students engage with instructional technologies. The results of the survey are reported in this section of the manuscript, and observations from all three data components (discussion, survey implementation, and presentations and feedback) are included in the Discussion section. These notable findings are chosen from a much larger set of data and visualizations generated for both faculty and student surveys.

A total of 190 faculty members responded to the Digital Cougars Survey (20.56% of 924 contacted) from different academic schools across campus. The largest group of faculty respondents were Associate Professors (23.16%), and more than half (56.6%) had worked as faculty members in higher education for over 10 years. A total of 598 students responded to the Digital Cougars Survey (5.38% of 11,114 contacted). More upperclassmen (29.43% juniors, 24.75% seniors) took the survey than underclassmen (19.9% sophomores, 17.05% freshmen). Graduate students (7.86%) and respondents unsure of their class standing (1%) made up the smallest groups.

When asked to rate their level of expertise with instructional technologies, faculty were closely split between above average expertise (37.04%) and average expertise (35.98%), followed by significant expertise (15.34%) and below average expertise (7.94%). When asked about their personal technology use, fewer faculty reported above average expertise (31.75%) and more reported average expertise (41.8%) than for instructional technology use. More than half of the students surveyed (54.79%) rated themselves as having average expertise with instructional technologies (Fig. 1), followed by above average (26.72%) and significant expertise (11.76%). However, more students reported above average (32.89%) and significant expertise (17.79%) when asked about personal use outside of teaching and learning (Fig. 2). Below average expertise, no expertise, and rare use of instructional technology were reported at much lower levels for both questions.

Fig. 1 Rate your level of expertise with instructional technologies

Fig. 2 Rate your level of expertise with technologies outside of teaching and learning

The most commonly reported barrier to using instructional technology was faculty members’ lack of time to develop and implement technologies (23.96% of question respondents), more than double the share reporting discomfort with a range of digital tools (11.04%) or that their students had limited technology skills (9.58%). For students, slow internet speed (22.38% of respondents) and spotty internet access (21.26%) were clearly the most prevalent barriers to technology use, followed by a lack of time to implement technologies (8.39%).

Looking at the effects of instructional technology on student outcomes, faculty most often reported some improvement (43.2%), did not know (35%), or saw no improvement (10.9%), leaving a small percentage reporting significant improvement (10.9%). The majority of students reported some improvement in their outcomes from using instructional technology (59%) or saw significant improvement (23.4%) (Fig. 3). Relatively few students were unsure of the effects of instructional technology (9%) or saw no improvement in outcomes (8.6%).

Fig. 3 Which barriers do you face in using technology for teaching or your courses?

Comparing the two survey populations revealed key differences in how faculty and students regarded their use of instructional technology. More faculty reported feeling experienced with technology in academic settings, whereas more students felt adept outside of teaching and learning. The majority of students (82.37%) reported improved outcomes from using instructional technology, but over a third of faculty were unsure (34.97%) or saw no positive effects (10.9%). The researchers were interested in learning more about this difference between faculty and student responses regarding the effects of instructional technology on student outcomes. It was concerning that 35% of faculty did not know whether these tools led to student improvement. Another concerning finding was that a significant number of faculty (30.4%) and students (43.7%) reported that they did not know about new ways of using technology in teaching or learning.
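The comparison can be made explicit by combining the shares reported above; small discrepancies (82.4% here versus the reported 82.37%) reflect rounding in the published percentages. A minimal, illustrative Python sketch, not the authors’ analysis code:

```python
# Illustrative only: combining the reported outcome-perception shares
# (percentages taken from the Results text; labels paraphrased).
faculty = {"significant": 10.9, "some": 43.2, "none": 10.9, "unsure": 35.0}
students = {"significant": 23.4, "some": 59.0, "none": 8.6, "unsure": 9.0}

def improved(shares: dict) -> float:
    """Share reporting any improvement (some + significant)."""
    return shares["some"] + shares["significant"]

print(f"Students reporting improvement: {improved(students):.1f}%")  # ~82.4%
print(f"Faculty reporting improvement: {improved(faculty):.1f}%")    # ~54.1%
print(f"Faculty unsure or seeing none: "
      f"{faculty['unsure'] + faculty['none']:.1f}%")                 # ~45.9%
```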

The survey provided eight opportunities for qualitative responses. Four of these were part of multiple-choice questions where respondents could provide additional text. The remaining four were open-ended questions that asked about challenges and aspirational practices, among other items. Table 2 lists the eight questions and the total number of faculty and student responses. Note that each response could have been given multiple qualitative codes depending on the statement.

When asked about frequency of use (Question 1), respondents used the open-ended section to note the ways in which they used technology. Faculty reported using technology for creation, research, consumption/delivery, and assessment (in order of frequency), and mentioned specific academic software tools such as iClicker and VoiceThread. Students listed consumption/delivery, discussion/communication, and research as their top uses. When asked specifically about our institution’s digital learning management tool (Question 2), students and faculty reported using it most for consumption/delivery of information.

The third qualitative question asked about the impact of technology on academic achievement. Here, the responses were coded for sentiment: positive, negative, unsure, and other. Faculty made ten positive comments and six negative ones. One faculty member remarked,

“I've not done the study to assess this, but at the very least feel that technology helps me to provide feedback to students…and it has helped me to get timely feedback from the students regarding the meaning they are making. I am a bit concerned about over-exposing students to technologies, and out of class technology supported learning. I am concerned that as more faculty incorporate these into their classes, it may overwhelm students outside of class. This may require that we rethink the structure of classes...perhaps the traditional 4-6 classes per semester is not the best approach if faculty are moving to more outside of class, tech supported learning.”

There were few negative comments from students when asked about the impact of technology on their academic achievement. Instead, students remarked on how technology helps them to be more organized and complete assignments with greater flexibility. One student wrote, “It allows me to learn new material on my own time so that I am in a mindset to learn. Improved my understanding of material.” When asked about the impact of technology on non-academic outcomes (Question 4), faculty expressed eight statements of concern about the loss of social interaction and students’ constant reliance on technology.

With the fifth question, the researchers asked about the ways in which respondents would like to use technology in teaching and learning—an aspirational question of sorts. For faculty, data were sorted into three broad categories: 1) specific tools, 2) a lack of resources (time, tools, support), and 3) a desire to do more. Faculty made 10 statements about the desire to do more with technology in the classroom. Perhaps not surprisingly, these were paired with 23 statements about a lack of skills, time, or space as reasons why respondents could not develop their use of technology. Faculty made 42 comments listing specific tools, software, or skills that they would like to use or develop (Zoom, Poll Everywhere, VoiceThread, virtual reality, video creation, etc.). The responses and resulting coding scheme for students were somewhat different. The general code categories were: 1) lack of instructor knowledge, 2) greater/more effective use, 3) grading, 4) calendar, 5) real-world application, 6) specific tools, and 7) other. There were 10 comments expressing a desire for greater and more effective use of technology, and numerous comments listing specific tools (Google Applications, Adobe software, more virtual options, etc.).

In question six, participants were asked to share additional concerns related to technology in teaching and learning. Interestingly, respondents offered 31 statements about the negative effects of technology. For example, one faculty member remarked, “Once students engage through a device, I find they often have trouble speaking up and discussing in class. Students are comfortable responding anonymously, but can’t defend their thoughts, ideas, or positions in an effective way in class.” Another respondent said,

“I wish we could stop viewing technology as some sort of God. It's not. One of the biggest problems this campus has is that students view their smartphones as having all of the answers. People no longer think for themselves. The extraordinary short attention spans are a product of a society drenched in technology. We need to have humans talking to each other as humans. We're moving in the wrong direction very, very quickly.”

Students were less concerned about the effects of technology broadly, but did raise questions about the effectiveness of learning technologies; 52 coded student statements expressed this sentiment. For example, one student stated,

“I am somewhat concerned that there is an increasing over reliance on digital technologies in the classroom. While I think supplementing "normal" classroom instruction with digital technology is beneficial, both of the courses I have taken entirely online have been more challenging and less engaging than in-person courses. I am concerned that appreciation for digital technology could morph into an all-digital educational landscape, which would be detrimental to my learning.”

Question seven asked respondents to share the support systems they use for instructional technology. Faculty reported relying on the college’s Office of Instructional Technology, as well as on colleagues and online videos, when they need help with a technology. Similarly, students rely on peers and online tutorials.

The final qualitative question asked about ideal professional development tools. Faculty mentioned compensation, more examples, one-on-one trainings, online tutorials, workshops, and specific tools/software. Echoing the learning group’s discussion, some faculty mentioned that such training should be mandatory for all faculty. Another interesting comment from a faculty member concerned technology supporting work outside of the classroom: “How [can] technology could support teaching outside the traditional classroom [?]. Whether it’s for a study abroad course, advising a bachelor’s essay, or in a research lab, there must be ways technology can complement the professional development of faculty, more broadly speaking.” Students remarked on similar ideal support structures, but sought more online video tutorials as a training mechanism. One student comment underscores the need for class-specific technology support: “Maybe [an] IT person comes on the first day of your class every semester and does a small demonstration of the things you will be using in said class. Or as part of an FYE [first year experience], you have a week focused on learning how to use technology tools for [REDACTED] tech things.”

This brief summary of the qualitative findings addresses some of the questions that the learning group had raised at the beginning of the research study. Importantly, faculty raise concerns about an ever-increasing reliance on technology, and seek evidence that instructional technologies are effective tools. Overall, students find practical benefits to technology in the classroom, but are challenged when tools are not functioning properly or when faculty are unable to use technology effectively.

There are several limitations to this work. The study was limited to one liberal arts institution in the U.S., and a similar study at a different institution might have yielded different results. One way the researchers addressed this issue was by including multiple data sources, not just an institutional survey of faculty and students. Through this approach, the PLC members were better able to understand the experiences of the faculty and student populations. Other limitations center on survey design and implementation. Although there was a strong response rate from faculty, those already using instructional technologies with regularity may have been more likely to respond, skewing the response population. As such, one cannot assume that the survey responses are representative of the entire faculty. Additionally, the student response rate was small, and thus generalizability is limited.

Discussion

This research began with questions about the ways in which instructional technologies were being used at one institution. After synthesizing the resulting data, the PLC researchers offer several empirically based observations, as well as limitations and avenues for future research. One aim of the design research method was to inform practice. This research was motivated by questions about the state of educational technologies in the field and at the home institution, as well as by the researchers’ individual roles in these contexts. PLC discussions led to a more formal research process, including the construction, validation, and implementation of a campus-wide survey. The design-based research process provided the framework to guide and document our inquiry over time. This cross-disciplinary, collaborative, and reflective practice offers a unique case for examining instructional technology use.

One challenge of the influx of new instructional technologies revolves around the best opportunities to onboard faculty and staff to these new digital technologies and to set expectations for their use. In their discussion of the increased use of blended technology, Wicks et al. (2014) found that “thoughtful professional development is needed to effectively teach faculty how to improve their blended pedagogy” (p. 53). This includes identifying a means to train individuals, setting expectations, and integrating technology use into existing values and rewards systems. In terms of understanding the how of these initiatives, this work underscores the need for instructional technology training for faculty and students, as well as assessments of these skills and practices. If these initiatives are of value to the institution, faculty and students may need more support to build these skills. In addition, these skills and practices should be measured or assessed in a valid and credible way to ensure literacy across identified populations.

This research sheds light on the sentiments and values that faculty and students hold regarding the importance of instructional technologies in higher education. Faculty, in particular, question the significance and value of technology-enhanced learning. One response to these questions may be to develop or define what is meant by technology use in specific institutional settings. This framing and understanding needs to be shared widely with the campus community, along with the “why” behind the initiative. Participants questioned why such technology-based initiatives were a focus. To be sure, there is little doubt among this research collective and most of our participants that some type of technology-enhanced learning is valuable. Yet participants sought more concrete evidence that the educational gains outweigh the effort of embracing new technologies. This aligns with other research reporting faculty members’ primary interest in sound pedagogy that is supported by technology (Kim and Bonk 2006; Wicks et al. 2014).

The last question that arises from this research, and from subsequent examinations of the learning group’s discussions, is whether technology is needed in all spaces and facets of teaching and learning in higher education. Are technology and the ubiquity of digital, social spaces good for all individuals? In open comments, participants raised concerns about too much screen time and about the use of digital devices in classrooms. These questions revolved around whether student achievement was being supported through the integration of technology, or whether digital devices were impeding students’ ability to focus and engage in class.

These uncertainties relate to an impression that it is difficult to gauge the effectiveness of technology on student academic achievement. Possible reasons for this perception include limitations of the assessment instrument and the lack of standards for what constitutes instructional technology, as specific tools and uses vary considerably across disciplines, courses, and instructors. With the widespread adoption of technology in higher education, we need a better understanding of the impacts of variable assessment methods, specific tools, and user experiences in order to work effectively to improve student outcomes.

Conclusion

The near-constant disruptions in teaching, learning, and technology are closely followed by higher education institutions as they adapt to serve students. Even with these movements in the field, there are still questions about the pedagogical and social use of these texts and tools. This research is unique in that it offers the perspectives of cross-disciplinary faculty and staff who were initially interested in embracing technology to improve instruction and student outcomes. After months of discussion and reflection, the group members sought to establish a baseline of instructional technology use in the classrooms of their home institution. The results of an institution-wide survey reveal not only the types and frequency of tools used in the classroom, but also the challenges to instructional technology use. Fundamentally, universities may need to address the larger question of why before working to implement the how through support structures, new tools, and trainings in the pursuit of technology-enhanced learning.

Using this research as a basis, the professional learning group will work with university administration to address gaps in faculty and students’ expectations around the use of instructional technologies. Initial findings reveal an interesting turn in our understanding of technology use in higher education: respondents commented that we, as a society, may use too much technology, and they questioned to what end. This sentiment ties to the uncertainty about outcomes and effectiveness introduced previously. In addition to this analysis, the researchers plan to engage in town halls on campus to create a vision for technology use that is appropriate to the case institution.