1 Introduction

The development of immersive technologies has generated educational projects that facilitate and improve the teaching/learning process [1,2,3]. These technologies combine virtual and augmented reality to replicate the real, physical world through a digital experience [4, 5]. The experience engages sensory stimuli that allow users to interact with the digital environment as if they were in physical reality [6, 7]. Such educational projects are aimed at students from basic education to university level [2, 8, 9], and they have the potential to generate extremely attractive virtual class environments [10]. In the university sector, they have had a great impact on engineering and science areas such as biotechnology, physics, and chemistry [11], because they provide the opportunity to deal with complex topics in a didactic and easy-to-understand way [12]. In addition, they allow users to carry out experimental practices and obtain experiences similar to those of a physical laboratory [13]. They also help to overcome restrictions on access to appropriate laboratory facilities or equipment, whether these restrictions are due to safety conditions or economic factors [14, 15]. They have even succeeded in replacing experiments that used to require scarce and expensive scientific equipment.

Serious Games (SG) have also gained relevance in the fields of education and training [16,17,18], because students learn best by playing in context: the knowledge and physical actions involved should situate the learning process in attractive, everyday scenarios [19]. With the support of the latest simulation and visualization technologies, it is possible to contextualize the student/player experience in entertaining, challenging, and realistic environments and situations [3, 8, 20, 21]. This enhances students' learning and their ability to grasp topics that were difficult to teach with traditional methods [22,23,24,25].

In this context, this paper presents the development and implementation of the first phase of a Serious Game intended for teaching advanced physics through immersive technologies. The objective is to determine the impact these technologies have on teaching students in their first years of university. A case study focused on the implementation of an application for teaching radioactivity through immersive technologies is presented. The rest of the article is organized as follows: Sect. 2 reviews the state of the art on the use of immersive technologies for teaching in the areas of engineering and science and gives a general description of Serious Games (SG) and the aspects to consider in their design. The development of the proposed application for teaching radioactivity is described in Sect. 3. The methodology for testing and validating the application, including how data was gathered from the test subjects before and after exposure to the application, is covered in Sect. 4. The validation and the results obtained by the participants are described and discussed in Sect. 5, and conclusions are drawn in Sect. 6.

2 Immersive Technology and Serious Games

2.1 Use of Immersive Technologies for the Teaching of Engineering and Sciences

Immersion is defined as the capacity of a system to make the user feel completely present in another (artificial) world. Virtual Reality (VR) is based on the creation of a simulated environment of the real world through computational means, while Augmented Reality (AR) is based on the addition of virtual content to the scene (environment) of the real world [4]. Mixed Reality (MR) is not a technology by itself; rather, it is a combination of VR and AR technologies [26]. It is difficult to adequately convey the experience and perception each user has when using these immersive technologies, since their minds cannot be examined directly. However, the extensive literature reported in recent years on the teaching benefits that these new technologies provide supports their incorporation in the teaching of all kinds of subjects [27].

In the present study, works that have implemented immersive technologies in the teaching of topics related to engineering and science have been considered. The first work seeks to extend the perception of temperature, thermal conductivity, and heat. It is based on thermodynamics experiments used in a first-year laboratory course in physics or other engineering science subjects. Immediate data collection allows students to make comparisons with theoretical predictions, and these comparisons help strengthen the links between theory and experiments [11, 24, 28]. A similar study, known as Holo.lab, was carried out using Mixed Reality (MR) technologies. It is an application that, through immersive technologies, allows experiments on the thermal conduction of metals to be carried out in a virtual laboratory. The objective is to improve the understanding of thermal phenomena and statistical mechanics, since the sensation of heat is qualitative and cannot be directly appreciated by human visual perception [29]. Another of the studies analyzed was VALUE @ Amrita Virtual Labs, a set of laboratories developed to overcome restrictions on access to appropriate laboratory facilities or equipment in economically disadvantaged environments in Asia and Africa. This project has managed to universalize education through access to virtual laboratories with intuitive interfaces for teaching physical sciences, chemical sciences, and biotechnology. This has motivated students to reinforce their knowledge and repeat the lessons in their free time, which has increased their curiosity and triggered their imagination [14]. A study focused on the Special Theory of Relativity in a physics course was also analyzed. This study used a learning experience design structured around an immersive virtual reality simulation, and it detected both positive and negative attitude profiles related to science and technology in the students [30, 31]. Finally, another of the analyzed studies focuses on the interpretation of bodily representations manifested during experimentation in an immersive and interactive mixed reality (MR) environment. This experimentation was based on the understanding of force and movement in space. The study reports how MR technologies are tools that allow people to identify physical intuitions. The added value of this study is the proposal to use immersive environments not only for instruction but also for comprehension assessments that encompass users' bodily representations [32].

2.2 Serious Games: Taxonomy and Generalities

In the development of an SG, the platforms used for its execution, its purpose and scope, the users for whom it is intended, and how they will access it must all be considered. Together, these characteristics make up the taxonomy of each SG [33]. Other aspects should also be considered: the theories, mechanics, and pedagogical models involved in the design and execution of the SG [10]. In parallel with this whole process, the evaluation, feedback, and learning analysis should be planned [18].

For the development of these learning tools, it is necessary to have a multidisciplinary team and advanced technologies. This team must cover at least the areas of pedagogy, psychology, design, and programming. It must plan the storytelling, determine the human-computer interaction (HCI), and establish the objectives, rules, and levels. In addition, it must define the type of display technology, determine the characteristics of the graphics, and establish the programming that will be necessary [34].

The success of the results will depend on the correct fusion of professional design and solid pedagogical strategies. The latter focus on the learning objectives and on the characteristics of the game: level of difficulty, duration, aesthetics, and interaction modalities. In addition, a methodology that continually balances the student's skills and knowledge against the challenges of the game is essential [10]. The expected result of an SG is to challenge and support the players as they explore and overcome the problems presented in the posed scenario.

In the field of physics, several works have combined new technologies and SG for the teaching-learning process [15, 25, 28, 35,36,37]. For this purpose, simulation environments and games related to the fundamental learning of physics have been used. They have also been subjected to evaluations to determine the knowledge acquired, and as a result, a better construction of knowledge is obtained.

3 Application Development

3.1 VR Equipment and Development Software

Due to the limited time, budget, and talent available for developing the testing application for this research, many aspects had to be compromised. Physics students from Yachay Tech University and other volunteers with limited programming experience collaborated with a Physics Professor to design the VR application using free and partially free programs such as Unity and Blender, chosen for the freedom of development they offer.

Unity is an ideal option for developing applications for educational and research purposes, as it has licensing options that allow the distribution of products that do not exceed a certain amount of revenue. It uses C#, an object-oriented programming language, as the main means of writing scripts. Combined with its support for virtual reality devices, a large and constantly growing community, and the many resources offered by the company itself, such as free assets, this makes it an ideal environment for game development.

Blender is a powerful open-source tool for 3D modeling that also allows for animation. It was used to model certain elements of the application, such as the main character, HalDron.

FL Studio DAW (Digital Audio Workstation) was employed as a tool to create and design the audio.

With the aim of facilitating interactions and improving the immersive user experience, we chose to develop the application for the Oculus Quest 2 VR head-mounted display (HMD), which has a wide variety of tools and plugins that facilitate the development environment in Unity, one of the game engines with the most resources and support for virtual reality [38].

Since the gaming experience has been prioritized to analyze its effects on learning, the problem of the computing requirements to run the application has been solved by running the game on a computer through Unity's editor and the play mode it provides. This way, better graphics and lighting effects can be used, since they are not limited by the processor incorporated in the Oculus headset, while allowing real-time monitoring of players through Unity. Thanks to this close tracking technique, it is possible to directly support the user when problems arise by altering their virtual environment, as well as to change values such as movement speed, lighting, and volume when necessary.
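As an illustration, a minimal Unity C# sketch of such a live-adjustment component is shown below; the class and field names are hypothetical and not taken from the project's code. Handlers edit the serialized fields in the Inspector while the game runs in Play mode, and the changes take effect immediately.

using UnityEngine;

// Hypothetical sketch of a component exposing the run-time parameters the
// handlers can adjust from the Unity editor while the player is in the headset.
public class HandlerOverrides : MonoBehaviour
{
    [Header("Live-adjustable parameters (edited in the Inspector during Play mode)")]
    [Range(0.5f, 3f)] public float movementSpeed = 1.5f;   // read by the locomotion script
    [Range(0f, 1f)]   public float masterVolume = 0.8f;    // global audio volume
    [Range(0f, 2f)]   public float lightIntensity = 1.0f;  // scene light multiplier

    public Light mainLight;   // assigned in the Inspector

    void Update()
    {
        // Apply the current Inspector values every frame so that edits made by
        // the handler take effect immediately during the session.
        AudioListener.volume = masterVolume;
        if (mainLight != null) mainLight.intensity = lightIntensity;
    }
}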

3.2 Scenario Design

The scenario is inspired by a space laboratory; planet Earth can be observed through one of the windows. The aim is to create an experience that brings the user into a futuristic environment consistent with the idea of being guided by an intelligent robot (Virtual Learning Companion, VLC) that invites the user to perform fictional activities such as shooting alpha particles with a machine that forces semi-decay periods.

The color palette avoids warm colors in order to give a lifeless environment and enhance the sensation of really being in a simulation, facilitating its discernment from reality.

3.3 VLC Design

HalDron is the main character and guide, responsible for leading users on a linear tour through the laboratory. Its function is to transmit knowledge through comments that run parallel to the events the test subject experiences while progressing through the different rooms, facilitating comprehension [39].

Fig. 1. Original HalDron (VLC) design in Blender.

To prevent volunteers from getting lost in the corridors, HalDron moves slowly toward the key points that trigger new events once the player-controlled character reaches them, constantly asking to be followed for further instructions. This gives a new Oculus user enough time to gradually get used to the device without delaying those who have more experience with virtual reality devices.
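A possible Unity C# implementation of this guide-and-trigger behavior is sketched below under stated assumptions: the key point carries a trigger collider, the player rig is tagged "Player", and all identifiers are illustrative rather than taken from the project.

using UnityEngine;

// Illustrative sketch (not the project's actual script): HalDron drifts slowly
// toward the next key point, and a trigger volume at that point starts the next
// scripted event once the player catches up.
public class GuideKeyPoint : MonoBehaviour
{
    public Transform guide;            // HalDron's transform
    public float guideSpeed = 0.8f;    // slow, so newcomers can keep up (m/s)
    public AudioSource narration;      // explanation played when the player arrives

    bool triggered = false;

    void Update()
    {
        // Move the guide gently toward this key point each frame.
        if (guide != null)
            guide.position = Vector3.MoveTowards(
                guide.position, transform.position, guideSpeed * Time.deltaTime);
    }

    void OnTriggerEnter(Collider other)
    {
        // Fire the event once, when the player-controlled rig enters the trigger.
        if (triggered || !other.CompareTag("Player")) return;
        triggered = true;
        narration.Play();
    }
}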

Finally, it was decided to make HalDron a purely fictional entity without any humanoid traits, to strengthen the sense of objectivity of the guidance it provides and to allow the user to concentrate only on what is expected to be learned. To achieve this, a 3D model of a spherical robot with a circular light source that simulates an eye was created in Blender (Fig. 1). The simple geometry of the robot (Fig. 2) allows for the optimization of graphic resources such as animations and rendering, which is important when running the application on the Oculus.

Fig. 2. HalDron in wireframe view.

3.4 Gameplay Methodology

Gameplay is a very important factor to consider when developing an application focused on education, as in any game in general. Considering the lack of experience with virtual reality devices in the studied population, a linear game style was chosen, in which volunteers go through the experiences in the order they are intended to be shown (linear development). However, in order not to stress the participants with restricted routes, the scenario offers the possibility of walking around and exploring the laboratory to a certain extent. To maintain the linearity of events without disturbing the players' decision-making, interactions that do not contribute to the development of the experience in the expected order have been restricted. This gives a false feeling of freedom that provides a better gaming experience while keeping everything under control. Linear development also allowed the inexperienced development team to significantly reduce the time required to complete the application in due time for the experimental phase of the research [40].

Similarly, the use of controllers has been minimized to simulate more natural character control. The only necessary interactions are pressing virtual buttons, which are activated by placing the virtualized hand on them and releasing them as if it were a real interaction. With this game style, controller input is reduced to the left-hand joystick for movement, while rotations can be done with the headset or the right-hand joystick. In addition, a cube placed in the Lobby can be grabbed by those who dare to take it, although it is irrelevant to the concepts HalDron explains to the user. Handlers gave very specific verbal prompts with the required actions to assist those test subjects who had not yet developed the logic of moving and interacting in the virtual world through head-mounted displays (HMDs), helping them advance and complete the gameplay while retaining focus [38].
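A minimal Unity C# sketch of such a hand-pressed virtual button is shown below; the hand tag, event wiring, and class name are assumptions for illustration, not the project's actual code. The button is a trigger volume that fires its action when the tracked hand enters it, and it can stay locked until HalDron finishes the corresponding explanation.

using UnityEngine;
using UnityEngine.Events;

// Hypothetical virtual button: reacts when the virtualized hand enters its
// trigger volume, but only after the narration script has unlocked it.
public class VirtualButton : MonoBehaviour
{
    public UnityEvent onPressed;   // e.g. start the forced semi-decay animation
    public bool unlocked = false;  // set to true when HalDron finishes its explanation

    void OnTriggerEnter(Collider other)
    {
        // Only react to the player's hands, and only once unlocked.
        if (!unlocked || !other.CompareTag("PlayerHand")) return;
        onPressed.Invoke();
        unlocked = false;          // prevent premature or repeated presses
    }
}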

It was decided to limit the gameplay time to approximately 5 min, since it was expected that most test subjects in this study would have little or no previous exposure to VR and/or head-mounted displays. The aim was to prioritize immersion and completion of the gameplay, so that every test subject would listen to the complete lecture from the VLC and observe every virtual experiment.

3.5 Simulation Level Design

The game consists of three scenes: the Lobby, the Natural Decay and Alpha Radiation Laboratory, and the Nucleosynthesis and Beta Radiation Laboratory.

Lobby.

It is the main room where participants spawn when the simulation starts. Here, HalDron can first be observed resting on its charging station (Fig. 3) while it welcomes the user and acknowledges that its own existence and the events to be experienced are exclusively part of a simulation.

Fig. 3. HalDron's charging station in the starting room.

The Lobby is divided into two sections: the starting room and a hallway leading to the two labs. These sections are separated by a transition room with automatic doors that open as the user leaves the starting area to venture into the experiences prepared by HalDron (see Fig. 4).

Fig. 4. Orthogonal view of the Lobby with its different sections.

Natural Decay and Alpha Radiation Laboratory.

The first lab presents a similar environment but with darker lighting to highlight the monitors, the buttons, and HalDron's eye illumination. This makes the tour easier to follow, helps users focus their attention on the most important elements, and creates a better playing experience. This resource also distinguishes the scenes from one another and prevents players from being distracted by non-relevant elements of the environment. It should be noted that these techniques are used only in the laboratories, since the Lobby is intended to work as a 'safe place' in which players can rest from the mental exercise that the laboratories may require and freely explore or interact with the cube placed there as a distraction.

Fig. 5. HalDron explains while the monitor shows the natural decay.

This laboratory offers two activities: the first is observing the natural decay of a group of atoms on a monitor while HalDron explains the events shown (Fig. 5); the second is pressing a button on a machine that forces semi-decay periods and shoots alpha particles at a sheet of aluminum, while a projection shows the state of the radioactive material inside the machine so that the player can evaluate it (Fig. 6).
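For reference, the natural decay visualized on the monitor follows the standard radioactive decay law; the specific nuclide and decay constant used in the simulation are not detailed here:

N(t) = N_0 \, e^{-\lambda t}, \qquad T_{1/2} = \frac{\ln 2}{\lambda},

where N_0 is the initial number of atoms, \lambda is the decay constant, and T_{1/2} is the half-life (the "semi-decay period" forced by the machine).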

Beta Radiation Laboratory.

To proceed to the second scene or laboratory, HalDron places itself over the door that works as the entrance. As before, the user interacts with a monitor and buttons, but this time the events shown on the monitor can be affected by the user's interaction: one button dopes a group of atoms, and a second button accelerates time, making it possible to watch what would take thousands of years in real life (hyperlapse).

Fig. 6. Projector and alpha-particle shooter machine.

Advancing and accelerating time is one of the advantages of virtual worlds compared to real-world lab experiences. As in the first laboratory, HalDron asks the user to press certain buttons; this time, however, interactions can only be performed after each explanation is finished, preventing users from triggering actions prematurely. In this scene the phenomena are also shown quantitatively, so the test subject can connect the verbal explanation with the observed phenomena and the visual data provided by the monitor (Fig. 7).
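A minimal Unity C# sketch of how such an accelerated ("hyperlapse") decay read-out could be driven is given below; the identifiers, numerical values, and UI component are assumptions for illustration, not the project's actual implementation.

using UnityEngine;

// Hypothetical accelerated decay simulation: simulated time advances by a
// time-scale factor so that processes spanning thousands of years fit into
// seconds of gameplay, and the lab monitor shows the remaining-atom count.
public class DecaySimulation : MonoBehaviour
{
    public double initialAtoms = 1.0e6;      // N0, atoms shown at the start
    public double decayConstant = 2.2e-11;   // lambda in 1/s (~1000-year half-life)
    public double timeScale = 1.0;           // simulated seconds per real second
    public UnityEngine.UI.Text monitorText;  // quantitative read-out on the monitor

    double simulatedTime = 0.0;              // t, in simulated seconds

    public void OnHyperlapseButton()         // wired to the second virtual button
    {
        timeScale = 1.0e10;                  // jump forward thousands of years in a few seconds
    }

    void Update()
    {
        simulatedTime += Time.deltaTime * timeScale;
        double remaining = initialAtoms * System.Math.Exp(-decayConstant * simulatedTime);
        if (monitorText != null)
            monitorText.text = $"Remaining atoms: {remaining:0}";
    }
}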

Fig. 7. Monitor showing the initial state of the group of atoms that will be doped.

3.6 Audio Design

Since the VLC is a robot, it was necessary to create a voice that captures both the robotic essence and the human-like qualities of a lecturer.

During the initial phase, the dialogues to be delivered by the VLC were recorded. These recordings were captured using an AKG Perception P420 microphone paired with a Focusrite Scarlett Solo (3rd Gen) audio interface.

The sound design process followed a systematic approach. First, the voice actor performed the script, and the audio was recorded within FL Studio using the designated microphone and interface. Second, audio cleaning was performed: a limiter was employed to eliminate extraneous sound artifacts while preserving the primary recording, which yielded pristine audio quality devoid of ambient noise.

After obtaining the clean audio, a plugin chain was constructed to achieve the desired robotic humanized sound. The chain commenced with a multiband compressor, which was employed to compress specific frequency bands selectively. Following this, a vocoder plugin was utilized to manipulate the sound’s spectral characteristics, resulting in a robot-like timbre. To further enhance the sonic output and imbue a sense of humanization, a parametric equalizer was employed to attenuate frequencies not typically found in human voices. Finally, a stereo enhancement plugin was utilized to transform the audio from mono to stereo, providing a wider and more immersive auditory experience.

Once the sound processing was completed, the recordings were transmitted to the coding department for integration into the program. Following this method, it would be simple to create a separate compilation of the APK with audio in a different language.

4 Methodology for the Study

To validate whether the VR experience helps students learn a subject, an email with fifteen questions about radioactivity was sent out, and 143 students answered the questionnaire. Each question had five possible answers, one of them being "I don't know".

Based on these results, the students were invited to try the virtual experience, and the next day a new questionnaire was emailed to them, asking about their impressions of the experience and their previous experience with video games, and repeating the same radioactivity test with the order of the questions altered.

5 Validation and Results

Fig. 8. Grades on the radioactivity questionnaire for all the students who answered it. The results do not follow a normal distribution.

As described in Sect. 4, 143 students answered the fifteen-question radioactivity questionnaire sent by email, obtaining grades from 1 to 15 points, as shown in Fig. 8.

The high dispersion in the results is explained by the fact that, at our university, some students major in Physics or Chemistry and therefore study radioactivity in the upper years. The results show that 80% of the students with more than 9 points on the test belong to these two programs. The total distribution is therefore modeled as the superposition of two populations, as shown in Fig. 9.

Fig. 9. Distribution of the two populations. The lower population, students who did not know about radioactivity, had an average of 4.9 points with a standard deviation of 2.4 points; the students in the upper years of the Chemistry and Physics programs had an average of 11.6 points with a standard deviation of 1.6 points.
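A minimal way to formalize this two-population description is a mixture model built from the reported values, assuming normal components; the mixing fraction p (the proportion of students without prior exposure) is not reported here and would be fitted from the data:

f(x) = p\,\mathcal{N}(x \mid \mu_1 = 4.9,\ \sigma_1 = 2.4) + (1-p)\,\mathcal{N}(x \mid \mu_2 = 11.6,\ \sigma_2 = 1.6).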

The questionnaire also asked whether the student wanted to continue with our research; 81.6% of the students answered that they would help and provided a way to contact them. It was decided to try the VR experience with the students who scored 8 points or less.

We also let some students with grades of 9 points or above try the VR experience; they represent 10% of all the subjects.

5.1 Oculus Experience

During the test, all the students who tried the Oculus finished the experience successfully. In the post-experience questionnaire, 37.9% of the students reported no dizziness, 34.5% felt dizzy only at the start, 24.1% after some time inside the experience, and 3.4% suffered nausea but still finished the whole experience.

It was observed during the application's testing that most subjects learned the basic movements to interact with the virtual world in 30–120 s, averaging 40 s. They could choose how to rotate the camera: with the joystick under their thumb, by moving their heads with the head-mounted display, or by rotating their bodies in place. This provided an advantage, since they could focus on the VLC earlier. For 73.4% of the students, it was their first time using an Oculus.

Test subjects who started to experience some imbalance (similar to trying to stand inside a moving bus without holding any handlebars) regained proprioception just by having a hand placed against their backs.

Subjects who began exploring the virtual space right after the start of the experience expressed an interest in continuing after it had concluded. It was observed that they would not wait for prompts for action from the VLC but would show interest in how it behaved.

After the immersive experience, when asked about their impressions of the application and what they would improve, approximately half of the participating undergraduate students expressed difficulty in understanding the VLC's robotic voice: they were able to understand it, but they had to pay closer attention to what was being said and would prefer a natural, human-like voice.

5.2 Radioactivity Test After the VR Experience

The points obtained by the students on the day after the VR experience can be seen in Fig. 10; the 10% of students with more than 9 points in the pre-test are removed, leaving only the points obtained by the students who scored 8 or less. Of these, 40% doubled or tripled their points, and only 16% did not increase their grade. Among the students with more than 9 points, 50% increased their score by one point and the other 50% obtained the same grade.

Fig. 10. Points obtained on the next day by the students who used the Oculus, with an average value of 9.8 points and a standard deviation of 2.2 points.

Previous gaming experience was not related to the increase in test points. The designed simulation works more like a museum tour, requiring only a few actions from the 'player' to follow the whole experience.

Comparing these results with those of the students who had already taken a radioactivity class in Chemistry or Physics (the red distribution in Fig. 9), the students in the VR experience have a lower average; however, the former studied the topic over a whole 14-week semester, whereas this experience lasts only 8–15 min. It is therefore a good way to introduce the topic, since 79% of the students wanted to learn more about it.

6 Conclusions

The present paper shows how a small team from Yachay Tech University designed and developed a Virtual Reality application to teach a subject related to advanced physics to undergraduate science students in Ecuador. The objective was to observe and gather data to better understand how immersive technologies can aid the study of advanced topics in higher education, with the intent of establishing a path for future research while considering the difficulties of acquiring this kind of technology in different countries.

The results show not only that the application helps students better understand the concepts through immersion and visualization, but also that there is a high predisposition to quickly learning the logic of how to use this technology.

Almost all the students with grades of 8 points or less increased their scores by 50–300%. We will continue working to extend the application to other studies.