Abstract
This paper describes the iterative design of a three-dimensional collaborative virtual learning environment (3D CVLE) called the Museum of Instructional Design (MID), which was developed using learning experience design processes. We outline our three-phase learner experience design process in detail. Findings provide insight into how other instructional designers can use formative learner experience design processes to create highly usable and effective 3D learning environments.
Keywords
- Three-dimensional collaborative virtual learning environment
- Design case
- Learner experience design
- User experience design
Introduction
Instructional design and technology courses provide learners with various opportunities to practice using technologies. Yet traditional instructional design contexts are often limited in promoting and translating innovative products and processes to classroom environments (Karagiorgi & Symeou, 2005). Many instructional designers and educators face difficulties when applying practice-based tasks and using a variety of technology tools because they lack such learning experiences (Pellas et al., 2020). To address some of these concerns, there has been growing interest in the use of virtual reality (VR) and related technologies such as three-dimensional collaborative virtual learning environments (3D CVLEs). A 3D CVLE is a three-dimensional digital space designed to support collaborative, user-centric learning activities (Churchill & Snowdon, 1998). The affordances of these technologies for teaching and learning have long been established (Dalgarno & Lee, 2010; Shin, 2017). However, even though VR equipment has become commercially available and more affordable, this approach remains challenging because few university students have access to VR headsets (Eriksson, 2021). In an attempt to maintain many of the affordances of traditional, immersive VR technologies while reducing barriers to adoption, many are turning to web-based VR.
In this study, a 3D CVLE called the Museum of Instructional Design (MID) was developed as a free web-based VR platform for a doctoral-level instructional design and technology (IDT) course focusing on trends and issues of current and historical significance to the field. In this paper, we describe how a team of instructional designers used learner experience design (LXD) methodologies to formatively design, develop, and evaluate the Museum of Instructional Design (MID) to support the learning needs of instructional design and technology (IDT) doctoral students.
Project Description
The MID was developed to provide online learners with a collaborative space that would also provide opportunities to engage in critical discourse and to gain essential applied design skills within 3D spaces. Students of this course were enrolled in an online doctoral IDT program at an R1 institution. The majority of students in this program worked full-time and attended night classes. The MID was designed to emulate the experience of an in-person museum with various gallery spaces for students to meet, engage in conversation, and share their own exhibits to represent the IDT field. The instructor and students created the exhibits with the intention of developing a museum gallery that would evolve over the course of the semester.
Software Used to Design the MID
The MID was designed and developed in Mozilla Spoke and Mozilla Hubs. Mozilla Spoke is a free web-based 3D worlds editor that does not require external software or 3D modeling experience. Mozilla Spoke provides access to an open-source repository of images, videos, 3D models, and other tools (e.g., frames in which multimedia can be placed). Virtual environments created in Mozilla Spoke can be seamlessly integrated and accessed within Mozilla Hubs, the end-user interface. The lead author on this paper developed the architecture and underlying 3D CVLE infrastructure within a Mozilla Spoke project (see Fig. 5.1).
The Mozilla Spoke project was then published to a private Mozilla Hubs space. Mozilla Hubs is a web-based 3D meeting platform that can be used with VR headsets, desktops, and mobile devices and is compatible with different technology tools (e.g., Discord; Le et al., 2020). In this private Mozilla Hubs environment, students of the class assumed the role of a virtual avatar of their choice controlled through input device configurations (e.g., keyboard and mouse). Students co-created the museum exhibits as they engaged in curriculum activities within this Hubs space (see Fig. 5.2).
Method
This study describes the learning experience design (LXD; Schmidt et al., 2020), development, and evaluation of the MID. LXD uses iterative processes and is “a human-centric, theoretically-grounded, and socio-culturally sensitive approach to learning design, intended to propel learners towards identified learning goals, and informed by user experience design methods” (Schmidt & Huang, 2022, p. 151). The MID was developed in three phases: (1) front-end analysis, (2) design and development, and (3) evaluation. The front-end analysis consisted of empathy interviews, empathy mapping, and persona development. The iterative design and development process made use of rapid prototyping (Desrosier, 2011; Tripp & Bichelmeyer, 1990; Wilson et al., 1993) to revise the MID between versions. An evaluation was conducted through usability and learner experience design methods. Research activities were considered exempt by the PI’s Institutional Review Board. The research focused on the following design questions (DQ):
- DQ1: How can user experience design methods (empathy mapping and persona development) inform design principles for a 3D CVLE?
- DQ2: How can the identified design principles be incorporated into the design framework of a 3D CVLE?
- DQ3: How is the usability of the MID perceived by classroom and expert evaluator participants, and what features promoted or hindered usability?
Successive Approximation Model
To analyze, design, and develop the MID, we used a modified approach to version 2 of Allen’s Successive Approximation Model (SAM; Allen, 2012). SAM was used because it is an agile approach to design and development that is more flexible than traditional ID models. Furthermore, SAM was selected because it is a common rapid prototyping framework used in instructional design contexts (see Schmidt et al., 2020 for a detailed use case of an instructional designer using SAM). As seen in Fig. 5.3, SAM supported the highly iterative process that we used to design the MID. Given the problems with trying to create instructional systems based on assumptions about students (Schmidt et al., 2020), this three-phase approach was couched in learning experience design (LXD), with a particular focus on gathering the requirements of end users to assess their needs (Sleezer et al., 2014). The three phases of this approach (see Fig. 5.3) include preparation (Phase 1), iterative design (Phase 2), and iterative development (Phase 3).
During Phase 1, empathy and persona development methods were used (Cooper, 2004). Empathy interviews were conducted at the beginning of the project to help the designer develop empathy with the targeted students. Empathy maps were created based on an analysis of the empathy interviews to identify users’ behaviors and attitudes, detailing and articulating what the end users might say, think, do, and feel (Siricharoen, 2020). Themes from these empathy maps were then used to create “personas,” or fictional models of expected students of the learning space (Mashapa et al., 2013; McGinn & Kotamraju, 2008; Miaskiewicz & Kozar, 2011). In Phase 2, we used rapid prototyping to incorporate the social, technological, and pedagogical considerations that were revealed by the efforts of Phase 1. This led to a design proof consisting of an initial prototype and underlying system architecture. This initial design proof was the system used during the first week of class. In Phase 3, the MID was iteratively evaluated and developed. Throughout Phase 3, data were collected during regularly scheduled classroom activities using a variety of quantitative and qualitative data sources to help inform the design of the MID. An expert evaluation was also conducted to further elicit feedback on the MID.
Data Sources
A variety of quantitative and qualitative data sources were used throughout the study (see Table 5.1).
Study Procedures
Learner Evaluations
In Phase 3, ongoing evaluations were conducted with 15 students (n = 15; male = 6, female = 9) in a 15-week, doctoral-level IDT course. All participation in research activities was voluntary and anonymous. Research activities took place during regularly scheduled class activities and were typically presented as exit tickets or surveys at the end of class. All survey and exit ticket data were collected through an anonymous Google Form. Classroom observations were also documented by the instructor of the class. These data were used to iteratively design and develop the MID throughout the semester.
During Week 1, students of the class were introduced to Mozilla Hubs through a training environment designed to familiarize new students with the features and interface of the platform (Advanced Learning Technologies Studio, 2022). Pilot data were collected at the end of the class period from students in the class (n = 12) using the Computer System Usability Questionnaire (CSUQ; Lewis, 2018). This survey was administered at the end of the class session to measure students’ evaluation of the system. During Week 2, students (n = 11) provided their insights through an informal exit ticket that asked them to rate their confidence level with the technology: “On a scale from 1–5 (strongly disagree to strongly agree), I am feeling confident using Mozilla Hubs.” During Week 4, students provided their insights through another informal exit ticket addressing the following questions: What was the most challenging part of designing multimedia for 3D spaces? What did you learn about designing in 3D spaces? What resources, tools, etc., helped you as you designed your ID leaders exhibit? During Week 7, the System Usability Scale (SUS; Brooke, 1996) was administered to the students (n = 11). During Week 8, students completed the Adjectival Ease of Use scale, a single-item questionnaire that measures user friendliness (Bangor et al., 2008). The item states, “Overall, I would rate the user-friendliness of this product as: Worst Imaginable, Awful, Poor, Ok, Good, Excellent, Best Imaginable.” In addition, students completed an exit ticket about features and changes that hindered or promoted usability.
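To illustrate how single-item adjectival data of this kind can be summarized, the sketch below codes the seven adjectives 1–7 and reports the mean. This numeric coding is a common convention assumed here for illustration, not the study’s published scoring procedure:

```python
# Hypothetical scoring sketch for the single-item Adjectival Ease of Use scale
# (Bangor et al., 2008). The 1-7 coding below is an assumed convention.
ADJECTIVES = [
    "Worst Imaginable", "Awful", "Poor", "Ok",
    "Good", "Excellent", "Best Imaginable",
]
# Map each adjective to a numeric score: "Worst Imaginable" -> 1 ... "Best Imaginable" -> 7
SCORES = {adj: i + 1 for i, adj in enumerate(ADJECTIVES)}

def adjectival_mean(responses):
    """Mean numeric rating for a list of adjective responses."""
    return sum(SCORES[r] for r in responses) / len(responses)

# Illustrative data (not the study's responses):
print(adjectival_mean(["Good", "Good", "Excellent"]))  # ~5.33
```

A class-level mean such as the 5.2 reported later in this chapter would fall between “Good” (5) and “Excellent” (6) under this coding.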
Expert Evaluations
As suggested in Tessmer’s (1993) work on formative evaluation in instructional design, an expert review was also conducted (Phase 3). Three (n = 3) expert reviewers were recruited to provide an evaluation of the MID (see Table 5.2). These participants were purposively recruited based on their background and expertise. Participants were required to have a background relevant to the design and development of digital worlds and/or a background in deploying educational technologies. They were also required to be at least 18 years of age. Informed consent was obtained from all expert reviewers prior to their participation in the study.
Expert reviewers were tasked with completing a series of activities structured to mirror those that students enrolled in the class would go through during any given week. Expert evaluators began the session by completing the same training activity that students in the class completed in the first week of class. They then explored the MID to complete tasks that included engaging in a lecture, providing responses to prompts, creating museum exhibits, and placing their artifacts within the environment. Expert review participants were asked to think aloud (Nielsen, 1993) while completing these tasks. These sessions were also screen- and audio-recorded for later analysis. A trained researcher took field notes during these evaluations. Upon the completion of the activities within the MID, expert reviewers were asked to complete the CSUQ.
Analysis
Empathy Maps
Empathy maps were created using information from empathy interviews conducted during Phase 1. These focused on four areas: say, think, do, feel. From the six empathy maps, four user personas were developed.
Quantitative Analysis
Quantitative usability data were calculated using the methods specified for each instrument. SUS results were calculated using the method outlined by Brooke (1996): the score for each SUS item is converted to a new number, the converted scores are summed, and the sum is multiplied by 2.5 to yield a value between 0 and 100. Though these scores range from 0 to 100, they are not meant to be interpreted as percentages and instead should be considered only in terms of their percentile ranking (Brooke, 1996). A score of 68 represents average usability, so scores above 68 are considered above average. CSUQ scores were obtained using a formula outlined by Lewis (2018), which converts the results to a 100-point scale to match the SUS. Data from the Adjectival Ease of Use scale and exit tickets were input into a spreadsheet to calculate descriptive statistics.
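The conversions above can be sketched in Python. The SUS scoring follows Brooke’s (1996) published procedure; the CSUQ rescaling shown is an assumed linear mapping from a 1–7 mean item score (lower = better) to a 0–100 scale, in the spirit of the conversion Lewis (2018) describes, and should be checked against the original formula before reuse:

```python
def sus_score(item_responses):
    """SUS score (0-100) from ten 1-5 Likert responses (Brooke, 1996).

    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response). The summed contributions are multiplied by 2.5.
    """
    if len(item_responses) != 10:
        raise ValueError("The SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(item_responses, start=1))
    return total * 2.5

def csuq_to_100(mean_item_score):
    """Rescale a mean CSUQ item score (1-7, lower = better) to 0-100.

    This linear mapping is an illustrative assumption; consult Lewis (2018)
    for the exact conversion used in the study.
    """
    return (7 - mean_item_score) / 6 * 100

# All-neutral SUS responses (3s) fall at the midpoint of the 0-100 scale:
print(sus_score([3] * 10))  # 50.0
print(csuq_to_100(4.0))     # 50.0
```

Note that a midpoint response pattern yields 50, below the 68 benchmark, which is why a "neutral" SUS result still indicates below-average usability in percentile terms.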
Qualitative Analysis
Qualitative analysis focused on identifying characteristics of the 3D CVLE that promoted or hindered the MID’s ease-of-use. A deductive approach to qualitative analysis was conducted using usability heuristics and guidelines established in the field (Nielsen, 1994) that have also been adapted and applied to 3D environments (Joyce, 2021).
Results
This study formatively evaluated a 3D CVLE called the MID. The following results articulate how learner experience design methods might inform design principles; how design principles might be incorporated into an operable design framework; how participants perceived usability; and what might be improved. The following sections will detail the results from Phase 1 (Preparation), Phase 2 (Iterative Design), and Phase 3 (Iterative Development).
Results of Phase 1: Preparation (DQ1)
Empathy Maps
We created six empathy maps based on the empathy interviews conducted in Phase 1. These empathy maps were then iteratively refined. Each empathy map includes say, think, do, and feel statements as well as a list of potential pains and gains (see examples in Fig. 5.4). The empathy maps we present below represent a stark contrast in student backgrounds, desires, and feelings as they entered the MID. One student type has little interest in learning a new technology so late in their doctoral program and sees their comprehensive examination as the only thing that matters. The other represents a student type who is highly interested in using the MID and further exploring Mozilla Hubs; this student learns the system quickly, frequently asks for more features, and wants to push the limits of the system.
Personas
Four data-driven user personas were created (see example in Fig. 5.5). We constantly referenced these personas during the design and development phases to articulate and refine the social, technological, and pedagogical considerations (see Table 5.3).
Results of Phase 2: Iterative Design (DQ2)
In Phase 2, we made various design decisions based on the results from Phase 1, such as selecting Mozilla Hubs over other web-based VR platforms, including a training environment for students to familiarize themselves with the features and controls, and integrating instructional strategies appropriate to the learner. From these findings, three focus areas for subsequent design emerged: (1) social considerations, (2) pedagogical considerations, and (3) technological considerations (see Table 5.3).
Results of Phase 3: Iterative Development (DQ3)
Quantitative results from student responses to the various usability measures indicate that the MID’s usability was above average. Aggregate SUS (69.1) and CSUQ (69.5) values were rated as acceptable, and the result from the Adjectival Ease of Use scale was “Good” (5.2). Expert evaluators rated the MID’s usability higher (81.3). Qualitative results were used to determine features that promoted or hindered usability. These usability issues were iteratively addressed throughout the course of the semester, leading to a refined system (see Table 5.4).
Discussion
In this paper, a prototype 3D CVLE called the Museum of Instructional Design was presented. The goal of the MID was to provide doctoral-level students with a flexible 3D space to participate in class activities and to engage in authentic design activities. In the current research, we sought to address three goals. First, we sought to articulate how user experience design methods could inform design principles for a 3D CVLE. Second, we sought to explore how identified design principles could be incorporated into the design framework of a 3D CVLE. Last, we sought to explore how participants rated and perceived the usability of the MID.
In addressing these questions, we sought to reveal key design considerations and to provide precedent for how emerging web-based VR can be designed. Much of the existing research in this area focuses on outcomes rather than on documenting design decisions (e.g., Glaser & Schmidt, 2021), which can be critical in how designers go about their design process (Gray & Boling, 2016). VR and related technologies are becoming more prominent in education (Kimmons & Rosenberg, 2022). It is imperative that the field provide design cases to supply precedent (Lawson, 2004, 2019) for addressing the complexities of designing for 3D spaces (Huang & Lee, 2019). Regarding the usability of the MID, findings show that the mean usability scores are above the standard threshold for a system to be considered usable (Brooke, 1996; Sauro, 2011). In addition, all students were able to complete the entirety of the semester’s activities within the 3D CVLE. While some students encountered usability issues, the majority of these issues were remedied through the reflexive iterative design and evaluation process.
Design Implications
There are several implications for using a 3D CVLE. It is more feasible than traditional VR, which requires a headset, and it provides opportunities for students to engage and collaborate. In addition, because using a 3D CVLE does not require software development skills, students and instructors have opportunities to design in a 3D space. However, there are still challenges with the technology. Constant modifications may be needed to improve the learner experience, and there is no one-size-fits-all template for all instructors to use. In addition, while web-based VR technologies certainly broaden access and potential use cases for instruction and learning, logistical barriers still hinder adoption for some (e.g., rural students with poor Internet connectivity). Given that many students may have little to no experience navigating a 3D environment, instructors may need to provide more guidance and support.
Limitations
The findings presented in this chapter should be considered in light of the limitations detailed in the following sections.
Nature of UX Research
Due to the nature of user experience design research, findings from this work cannot be generalized beyond the current context. The purpose was to design, develop, evaluate, and refine a feasible and acceptable 3D CVLE for adult students enrolled in a PhD-level class. Therefore, this research used small sample sizes and focused specifically on understanding the nature of students’ experiences as they used the MID. Rather than seeking to create generalizable knowledge, the findings from this work reveal insights into design decisions that can address the social, technological, and pedagogical needs of students.
Same Participants and Initial Impressions
Quantitative results from the students’ responses to usability measures during Phase 3 (iterative development) indicate that their perceived usability did not change throughout the course of the semester (69.1–69.5). However, this finding might be limited by the first impression bias phenomenon (Fiske & Neuberg, 1990; Lim et al., 2000). In this case, with the students from the class acting as research participants, it is possible that their initial impressions of the system led to a reluctance to change their responses to usability measures over the course of the semester. In contrast to these results, qualitative findings indicate that students appreciated the revisions made to the MID and that its usability was improved. Further, expert evaluators, who tested the system closer to the end of the MID’s development (after most of the improvements had been made), rated the system higher.
References
Advanced Learning Technologies Studio. (2022). Project PHoENIX virtual reality software. Copyright 2022 University of Florida.
Allen, M. W. (2012). Leaving ADDIE for SAM: An agile model for developing the best learning experiences. Association for Talent Development.
Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human-Computer Interaction, 24(6), 574–594. https://doi.org/10.1080/10447310802205776
Brooke, J. (1996). SUS: A “quick and dirty” usability scale. In P. W. Jordan, B. Thomas, B. A. Weerdmeester, & I. L. McClelland (Eds.), Usability evaluation in industry (pp. 189–194). Taylor & Francis.
Churchill, E. F., & Snowdon, D. (1998). Collaborative virtual environments: An introductory review of issues and systems. Virtual Reality, 3(1), 3–15. https://doi.org/10.1007/BF01409793
Cooper, A. (2004). The inmates are running the asylum: Why high-tech products drive us crazy and how to restore the sanity. SAMS Publishing.
Dalgarno, B., & Lee, M. J. (2010). What are the learning affordances of 3-D virtual environments? British Journal of Educational Technology, 41(1), 10–32. https://doi.org/10.1111/j.1467-8535.2009.01038.x
Desrosier, J. (2011). Rapid prototyping reconsidered. The Journal of Continuing Higher Education, 59, 134–145. https://doi.org/10.1080/07377363.2011.614881
Eriksson, T. (2021). Failure and success in using Mozilla hubs for online teaching in a movie production course. In 7th international conference of the immersive learning research network (iLRN) (pp. 1–8). IEEE. https://doi.org/10.23919/iLRN52045.2021.9459321
Fiske, S. T., & Neuberg, S. L. (1990). A continuum of impression formation, from category-based to individuating processes: Influences of information and motivation on attention and interpretation. Advances in Experimental Social Psychology, 23, 1–74. https://doi.org/10.1016/S0065-2601(08)60317-2
Glaser, N., & Schmidt, M. (2021). Systematic literature review of virtual reality intervention design patterns for individuals with Autism Spectrum Disorders. International Journal of Human–Computer Interaction, 38(8), 753–788. https://doi.org/10.1080/10447318.2021.1970433
Gray, C., & Boling, E. (2016). Inscribing ethics and values in design for learning: A problematic. Educational Technology Research and Development, 64(1), 969–1001. https://edtechbooks.org/-jcS
Huang, H., & Lee, C. F. (2019). Factors affecting usability of 3D model learning in a virtual reality environment. Interactive Learning Environments, 30, 1–14. https://doi.org/10.1080/10494820.2019.1691605
Joyce, A. (2021). 10 usability heuristics applied to virtual reality. Nielsen Norman Group. https://www.nngroup.com/articles/usability-heuristics-virtual-reality/
Karagiorgi, Y., & Symeou, L. (2005). Translating constructivism into instructional design: Potential and limitations. Journal of Educational Technology & Society, 8(1), 17–27.
Kimmons, R., & Rosenberg, J. M. (2022). Trends and topics in educational technology, 2022 edition. Tech Trends, 66, 134–140. https://doi.org/10.1007/s11528-022-00713-0
Lawson, B. (2004). Schemata, gambits and precedent: Some factors in design expertise. Design Studies, 25(5), 443–457.
Lawson, B. (2019). The design student’s journey: Understanding how designers think. Routledge.
Le, D. A., MacIntyre, B., & Outlaw, J. (2020). Enhancing the experience of virtual conferences in social virtual environments. In 2020 IEEE conference on virtual reality and 3D user interfaces abstracts and workshops (VRW) (pp. 485–494). IEEE. https://doi.org/10.1109/VRW50115.2020.00101
Lewis, J. R. (2018). Measuring perceived usability: The CSUQ, SUS, and UMUX. International Journal of Human-Computer Interaction, 34(12), 1148–1156. https://doi.org/10.1080/10447318.2017.1418805
Lim, K. H., Benbasat, I., & Ward, L. M. (2000). The role of multimedia in changing first impression bias. Information Systems Research, 11(2), 115–136. https://doi.org/10.1287/isre.11.2.115.11776
Mashapa, J., Chelule, E., Van Greunen, D., & Veldsman, A. (2013). Managing user experience – Managing change. In P. Kotzé, G. Marsden, G. Lindgaard, J. Wesson, & M. Winckler (Eds.), Lecture notes in computer science: Vol. 8118. INTERACT 2013 (pp. 660–677). Springer. https://doi.org/10.1007/978-3-642-40480-1_46
McGinn, J., & Kotamraju, N. (2008). Data-driven persona development. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 1521–1524). https://doi.org/10.1145/1357054.1357292
Miaskiewicz, T., & Kozar, K. A. (2011). Personas and user-centered design: How can personas benefit product design processes? Design Studies, 32(5), 417–430. https://doi.org/10.1016/j.destud.2011.03.003
Nielsen, J. (1993). Usability engineering. Morgan Kaufmann.
Nielsen, J. (1994). Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 152–158). https://doi.org/10.1145/191666.191729
Pellas, N., Dengel, A., & Christopoulos, A. (2020). A scoping review of immersive virtual reality in STEM education. IEEE Transactions on Learning Technologies, 13(4), 748–761. https://doi.org/10.1109/TLT.2020.3019405
Sauro, J. (2011). A practical guide to the system usability scale: Background, benchmarks, & best practices. Measuring Usability LLC.
Schmidt, M., & Huang, R. (2022). Defining learning experience design: Voices from the field of learning design & technology. TechTrends, 66(2), 141–158. https://doi.org/10.1007/s11528-021-00656-y
Schmidt, M., Earnshaw, Y., Tawfik, A. A., & Jahnke, I. (2020). Methods of user centered design and evaluation for learning designers. In M. Schmidt, A. A. Tawfik, I. Jahnke, & Y. Earnshaw (Eds.), Learner and user experience research: An introduction for the field of learning design & technology. EdTech Books. https://edtechbooks.org/ux/ucd_methods_for_lx
Shin, D. H. (2017). The role of affordance in the experience of virtual reality learning: Technological and affective affordances in virtual reality. Telematics and Informatics, 34(8), 1826–1836. https://doi.org/10.1016/j.tele.2017.05.013
Siricharoen, W. V. (2020). Using empathy mapping in design thinking process for personas discovering. In P. C. Vinh & A. Rakib (Eds.), Lecture notes of the institute for computer sciences, social informatics and telecommunications engineering: Vol. 343. Context-aware systems and applications, and nature of computation and communication (pp. 182–191). Springer. https://doi.org/10.1007/978-3-030-67101-3_15
Sleezer, C. M., Russ-Eft, D. F., & Gupta, K. (2014). A practical guide to needs assessment (3rd ed.).
Tessmer, M. (1993). Planning and conducting formative evaluations: Improving the quality of education and training. Routledge.
Tripp, S. D., & Bichelmeyer, B. (1990). Rapid prototyping: An alternative instructional design strategy. Educational Technology Research and Development, 38(1), 31–44. https://doi.org/10.1007/BF02298246
Wilson, B. G., Jonassen, D. H., & Cole, P. (1993). Cognitive approaches to instructional design. In G. M. Piskurich (Ed.), The ASTD handbook of instructional technology (pp. 11.1–21.22). McGraw-Hill.
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Glaser, N., Earnshaw, Y., AlZoubi, D., Yang, M., Shaffer, E.L. (2023). Designing the Museum of Instructional Design, a 3D Learning Environment: A Learning Experience Design Case. In: Hokanson, B., Schmidt, M., Exter, M.E., Tawfik, A.A., Earnshaw, Y. (eds) Formative Design in Learning. Educational Communications and Technology: Issues and Innovations. Springer, Cham. https://doi.org/10.1007/978-3-031-41950-8_5
Print ISBN: 978-3-031-41949-2
Online ISBN: 978-3-031-41950-8