1 Introduction

The continuous emergence of new and more powerful media systems is allowing today’s users to experience stories in 360º immersive environments away from their desktops. Head-Mounted Displays (HMDs) such as the Oculus Rift and Google Cardboard are becoming mainstream and offer a different way of experiencing narratives. In Immersive Virtual Reality (IVR), the audience is in the middle of the action, and everything happens all around them. Traditional filmmakers are now tasked with adapting tightly controlled narratives to this new medium, which defies a single viewpoint and strengthens immersion by offering the freedom to look around, but which also presents challenges, such as the loss of control over the narrative viewing sequence and the risk of the audience missing important steps in the story. For this reason, in 360º IVR it is important to understand what attracts the audience’s attention to, or distracts it from, the story.

In this paper, we describe the development of IVRUX, a 360º VR analytics tool, and its application in the analysis of a VR narrative scene. Our aim is to further advance the study of user experience in 360º IVR by understanding how story design can be enhanced through the analysis of users’ perception of the experience in conjunction with their intentions during the visualization of the story.

2 Related Work

In their summary on future entertainment media, Klimmt et al. [6] defend the argument that the field of interactive narrative is still in flux and its research is varied. IVR is currently being explored in several technologies and formats [3, 10, 13]. One of the common links between these experiences is the freedom in the field of view. Directing a user’s gaze is essential if they are to follow a scripted experience, trigger an event in a virtual environment, or maintain focus during a narrative. Currently, developers compensate for the free movement of the user’s gaze by employing automatic reorientations and audio cues, as in Vosmeer et al.’s work [13], at the risk of affecting the user’s presence and immersion in the narrative. Such experiments demonstrate the need for a better understanding of user experience in VR, which can be advanced by capturing qualitative information about the user’s experience that can be easily visualized and communicated. Nowadays, eye-tracking is used to analyze visual attention in several fields of research. Blascheck et al. [1] highlighted several methods for the visualization of gaze data for traditional video, such as attention maps [4, 8] and scan paths [9]. However, these do not carry over directly to 360º IVR, where participants have the freedom to look around. Efforts to develop data visualizations that allow users to inspect static 3D scenes in an interactive virtual environment are currently being made [11, 12], but the results are incompatible with dynamic content (video, 3D animation). Lowe et al. [7] research the storytelling capability of immersive video by mapping visual attention on stimuli from a 3D virtual environment, recording the gaze direction and head orientation of participants watching immersive videos. Moreover, several companies, such as Retinad, CognitiveVR and Ghostline, are investigating this topic by providing analytics platforms for VR experiences. However, little information is available about them, as they are all in the early stages of development.

3 “The Old Pharmacy”

“The Old Pharmacy” is a 360º immersive narrative scene, part of a wider transmedia story called “Fragments of Laura”, designed with the intention of informing users about the local natural capital of Madeira Island and the medicinal properties of its unique plants. The storyline of the overall experience revolves around Laura, an orphan girl who learns the medicinal powers of the local endemic forest. In the “The Old Pharmacy” scene, Laura is working on a healing infusion when a local gentleman, Adam, interrupts her with an urgent request. The experience ends on a cliffhanger as a landslide falls upon the characters. For a summary of the story, see Fig. 1.

Fig. 1. IVRUX data mapping the plot points of the scene, coded alphabetically from A to S. Story timeline: (A) Laura enters the scene, opens and closes door 1; (B) Laura looks for ingredients; (C) Door 2 opens; (D) Thunder sound; (E) Laura reacts to Adam’s presence; (F) Adam enters the room; (G) Door 2 closes; (H) Dialogue between Laura and Adam; (I) Laura preparing medicine; (J) Laura points at table; (K) Adam moves to table; (L) Dialogue between Laura and Adam; (M) Laura points at door 3; (N) Adam leaves the room, opens door 3; (O) Door 3 closes; (P) Landslide; (Q) Characters screaming for help; (R) Laura leaves the room; (S) End of scene.

The IVR mobile application was implemented using the Unity 5 game engine. In this scene, users are presented with a 360º virtual environment of a 19th-century pharmacy. Camera rotation in the 360º virtual environment is provided by the Google VR plugin. All multimedia content is stored on the device and no data connection is needed. The information needed for the analysis of VR behavior is stored locally in an Extensible Markup Language (XML) file.
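As an illustration of the logging side, a minimal Unity component could sample the orientation of the HMD-driven camera at a fixed rate and serialize it to a local XML file. This is a sketch under our own assumptions: the component name, sampling rate and XML schema are hypothetical, as the paper does not specify them.

```csharp
using System.Xml.Linq;
using UnityEngine;

// Hypothetical sketch of a head-tracking logger; the actual IVRUX
// log schema is not described in the paper.
public class HeadTrackingLogger : MonoBehaviour
{
    public float sampleRate = 10f;   // samples per second (assumed value)
    private XElement log;            // root element of the XML log
    private float nextSample;

    void Start()
    {
        log = new XElement("session",
            new XAttribute("device", SystemInfo.deviceUniqueIdentifier));
    }

    void Update()
    {
        if (Time.time < nextSample) return;
        nextSample = Time.time + 1f / sampleRate;

        // Head orientation is the rotation of the HMD-driven camera.
        Vector3 euler = Camera.main.transform.rotation.eulerAngles;
        log.Add(new XElement("sample",
            new XAttribute("t", Time.time),
            new XAttribute("yaw", euler.y),
            new XAttribute("pitch", euler.x)));
    }

    void OnApplicationQuit()
    {
        // Stored locally; no data connection is required.
        log.Save(Application.persistentDataPath + "/tracking.xml");
    }
}
```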

4 IVRUX - VR Analytics Tool

In order to supply authors with useful insight and help them design more engaging 360º narratives, we developed a VR analytics prototype (IVRUX) to visualize the user experience during 360º IVR narratives. IVRUX was also developed using Unity 5. Using the XML files extracted from the mobile device, the prototype organizes the analytics information into a scrubbable timeline on which we are able to monitor key events of five types: story events, character animation, character position (according to predefined waypoints in the scene), character dialogue and environment audio. The prototype allows the researcher to switch between three observation modes: a single-camera mode, a 360º panorama mode (see C in Fig. 2) and a mode simulating HMD VR. The prototype replicates the story’s 3D environment and visually represents each participant’s head tracking (field of view) as a semi-transparent circle labeled with the participant’s identification number. Moreover, a line connecting past and present head-tracking data from each participant allows us to understand the participant’s head motion over time. Semi-transparent colored spheres are also shown: one represents the points of interest (PI) in the story, simulating the “Director’s cut”, and the others represent the locations of the two characters.
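To give a concrete sense of how such a replay might decide whether a participant’s field of view covers a target sphere, the sketch below thresholds the angle between the logged view direction and the direction to the sphere. This is a minimal sketch under our own assumptions (including the 30º half-angle and all names); the paper does not describe the actual test used by IVRUX.

```csharp
using UnityEngine;

// Hypothetical gaze-target test for the replay: a target sphere counts
// as "looked at" when it falls within a fixed angular threshold of the
// participant's logged view direction. The threshold is an assumption.
public static class GazeTarget
{
    public const float ThresholdDegrees = 30f;  // rough half-angle of attention

    public static bool IsLookingAt(Vector3 headPosition,
                                   Quaternion headRotation,
                                   Vector3 targetPosition)
    {
        Vector3 gazeDir = headRotation * Vector3.forward;  // view direction
        Vector3 toTarget = targetPosition - headPosition;  // direction to sphere
        return Vector3.Angle(gazeDir, toTarget) <= ThresholdDegrees;
    }
}
```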

Fig. 2. IVRUX interface. (A) Pie charts representing intervals of time where a participant is looking at target spheres; (B) User selection scroll view; (C) 360º panorama; (D) Intervals of time where a participant is looking at target spheres; (E) Story events; (F) Environment audio; (G) Character audio; (H) Character movement; (I) Character animation; (J) Scrubbable timeline.

The scrubbable story timeline (see J in Fig. 2) presents the logged events and audio events. A scrollable panel (see B in Fig. 2) allows the user to choose which participant session to analyze; upon selection, three pie charts (see A in Fig. 2) indicate the proportion of time the participant spent looking at each of the target spheres. Additionally, the timeline is updated to show the intervals of time during which the participant was looking at each target (see D in Fig. 2).
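Building on the per-sample gaze test sketched above, the pie-chart ratios and timeline intervals could be derived by merging consecutive positive gaze samples into intervals and summing their durations. Again, this is an illustrative sketch with hypothetical types and names, not the documented IVRUX implementation.

```csharp
using System.Collections.Generic;

// Hypothetical aggregation step: merge consecutive "looking at target"
// samples into intervals for the timeline bars, and sum them for the
// pie charts.
public struct GazeSample { public float Time; public bool Hit; }
public struct Interval  { public float Start, End; }

public static class DwellStats
{
    // Merge consecutive positive samples (time-ordered) into intervals.
    public static List<Interval> ToIntervals(IList<GazeSample> samples)
    {
        var intervals = new List<Interval>();
        float? start = null;
        foreach (var s in samples)
        {
            if (s.Hit && start == null)
                start = s.Time;                               // interval opens
            else if (!s.Hit && start != null)
            {
                intervals.Add(new Interval { Start = start.Value, End = s.Time });
                start = null;                                 // interval closes
            }
        }
        if (start != null && samples.Count > 0)               // still looking at the end
            intervals.Add(new Interval { Start = start.Value,
                                         End = samples[samples.Count - 1].Time });
        return intervals;
    }

    // Fraction of the session spent looking at the target (pie-chart value).
    public static float DwellRatio(List<Interval> intervals, float sessionLength)
    {
        float total = 0f;
        foreach (var iv in intervals)
            total += iv.End - iv.Start;
        return sessionLength > 0f ? total / sessionLength : 0f;
    }
}
```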

5 Methodology

To test IVRUX, we conducted a study with a total of 32 users (16 female), all non-native English speakers, of whom 31.3 % were under the age of 25, 62.5 % were within the 25–34 age range and 6.2 % were over the age of 35. To assess participant homogeneity in terms of the tendency to get caught up in fictional stories, we employed the fantasy scale [2], with satisfactory internal reliability (α = 0.53). Two researchers dispensed the equipment (a Google Cardboard with a Samsung Galaxy S4) running the narrative and took notes while supervising the evaluation. Participants were asked to view the 3-min IVR narrative. Subsequently, the participants completed a post-experience questionnaire and were interviewed by the researcher (for the questions, see Table 1); this process took 15 to 20 min on average. After collecting all the interview data, two researchers analyzed questions IQ1–4, 7 and 9 and open-coded the answers. For IQ1 and IQ3, answers were classified into three levels of knowledge (low, medium and high). For IQ2 and IQ7, answers were classified as positive or negative. Finally, for IQ9, answers were classified according to engagement with the story plot, environment exploration or both. We used the Narrative Transportation Scale (NTS) [5] to assess participants’ ability to be transported into the application’s narrative (α = 0.603).
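For reference, the internal-reliability values reported here are presumably Cronbach’s alpha; for a scale of k items it is given by

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the participants’ total scores.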

Table 1. Semi-structured interview table

6 Findings

6.1 Findings from Questionnaires and Interviews

The NTS, which evaluates immersion aspects such as emotional involvement, cognitive attention, feelings of suspense, lack of awareness of surroundings and mental imagery, yielded a mean value of 4.45 (SD = 0.76).

From the analysis of the semi-structured interviews (see Fig. 3), most participants understood the story at a medium level (IQ1), while with regard to knowledge of the virtual environment (IQ3) participants generally demonstrated medium to high levels of recall. More than half of the participants had a high awareness of character movement (IQ4), and most participants did not have difficulties following the story (IQ2) and were not averse to it (IQ7).

Fig. 3. Clustered column charts of participant scores for the semi-structured interview questions IQ1, IQ2, IQ3, IQ4, IQ7 and IQ9.

Participants took advantage of quiet moments to explore; participant A26, for example, said “I took the opportunity to explore while the characters weren’t doing anything”. According to participants, the most interesting elements of the experience (IQ5) were the 360º environment, the surprise effects (doors opening, character entry, thunder, etc.) and the immersiveness of the environment. For example, participant A3 stated “the thunder seemed very real (…) I liked the freedom of choosing where to look”, and participant A6 mentioned “I was surprised when the door opened and I had to look for Adam”. When asked whether they would prefer to explore the environment or focus on the story, the answers were inconclusive; a portion of users felt that the combination made the experience engaging. For example, participant B9 said “I enjoyed both and the story complements the environment and vice-versa”, and participant B1 stated “At the beginning I was more engaged with the environment but afterwards with the story.”

6.2 Findings from the IVRUX

Through the analysis of the data captured by IVRUX, we noted that participants were looking at the “Director’s cut” 48 % of the time (M = 85.31 s, SD = 14.82 s). Participants spent 51.16 % of the time (M = 90.93 s, SD = 21.25 s) looking at the female character and 15.37 % (M = 27.32 s, SD = 8.98 s) looking at the male character. All users started by looking at the “Director’s cut”, but after a couple of seconds around 10 users drifted into exploring the environment. Of those 10 users, 8 chose to explore the left side rather than the right side of the pharmacy, where the table with the lit candle was situated. Once Laura, the protagonist, started talking (B in Fig. 1), the 10 who were exploring shifted their focus back to her and the story (the “Director’s cut”). Around 9 users looked around the pharmacy as if searching for something (mimicking the protagonist’s action of looking for ingredients). When Laura stopped talking and started preparing the infusion (end of B in Fig. 1), around 12 users started exploring the pharmacy, while the rest kept their focus on Laura. At the sound and action of the door opening (C, D in Fig. 1), 13 of the users shifted their attention immediately to the door. When Adam walked in (F in Fig. 1), we observed the remaining users redirecting their attention to the door. As Laura started talking to Adam, 22 users refocused on Laura; however, we noticed some delay between the beginning of the dialogue and the refocusing. When the characters were in conversation, 20 users shifted focus between Adam and Laura. During the preparation of the medicine (I in Fig. 1), 25 users kept their focus on Laura, while around 7 users started exploring. While all users followed the characters and their trajectories, they did not follow indications to look at specific places (J, M in Fig. 1). When Adam left the scene (N in Fig. 1), all users redirected their focus to Laura. After the landslide, when Adam screamed (Q in Fig. 1), none of the users were looking at the door from which the action sounds emanated.

7 Discussion

As we were developing IVRUX, continuous user testing showed that the initial orientation of the virtual camera strongly influenced whether the user followed the “Director’s cut”; therefore, the placement and orientation of the camera should be carefully considered. When characters perform the same actions for long periods of time or are silent, users tend to explore the environment. These “empty moments”, when users are exploring and the plot is not developing, are best suited to directing the user’s attention to branched narratives or advertising. During the characters’ dialogue, some participants shifted focus between the two as they would in a real-life conversation. A subset of our sample explored the environment during the dialogue; in the interviews, these users explained that once they knew where the characters were, it was enough for them to rely on the audio to follow the story. This is a clear illustration of the freedom of choice in IVR that filmmakers have to embrace.

Lighting design emerged as crucial in drawing participants’ attention to specific elements in the narrative or environment: users directed themselves towards areas that were better illuminated. Similarly, audio can be used to attract attention, for example when a door opens (C, G in Fig. 1) or when characters speak, as participants were seen to focus on the area where the sound originated. In the interviews, participants recalled the characters’ movements easily (IQ4); this was also observed in IVRUX, as participants’ head tracking accompanied the characters’ movements. However, participants did not pay attention to where characters were pointing (J, M in Fig. 1). When concurrent events happen (S in Fig. 1), it is difficult for participants to be aware of all elements, accentuating the need for buffer time to allow awareness of and reaction to the events. In VR, we need to adjust the pacing of the story, as has been suggested by the Oculus Story Studio [14].

The NTS values for “The Old Pharmacy” are average; this could be explained by two conditions: the average fantasy-scale scores of the participants and the short duration of the experience, as mentioned by participants in IQ9 (e.g. participant B8: “If the story was longer, I would have been more focused on it.”). Contrary to our expectations, we did not find significant correlations between NTS scores and the amount of time spent looking at the “Director’s cut”. This could be explained by participants who defied the “Director’s cut” intentionally (IQ7) or unintentionally (participants who relied on the audio rather than looking at the characters). Authors must account for such defiance when designing story narratives for 360º environments. In the interviews, participants highlighted the technology and the nature of the medium as interesting (IQ5): “It really felt like I was there” (participant B8).

8 Conclusions and Future Work

In this paper, we have described the development, testing and results of IVRUX, a 360º VR analytics tool, and its application in the analysis of the IVR narrative “The Old Pharmacy”. The results of our study highlight the potential of VR analytics as a tool to support the iteration and improvement of 360º IVR narratives by revealing where users are looking and how their focus shifts, enabling creators to make informed decisions on how to improve their work. We were able to identify shortcomings of “The Old Pharmacy” narrative, such as its camera orientation, story pacing and lighting design. We hope that this encourages the further development of 360º IVR analytics tools that empower creators to test narrative design assumptions and create experiences that are immersive and engaging. Furthermore, we envisage integrating biometric sensing into IVRUX to enable visualization of users’ bodily reactions to the narrative, superimposed on the visualization already discussed. From the point of view of interactive storytellers, testing the tool with further IVR narratives, such as narratives with multiple story threads or non-linear stories, is crucial for gathering guidelines and understanding user preferences.