Abstract
Mixed reality systems immerse users in environments where reality is bridged with virtual worlds. The proliferation of augmented-reality-capable devices offers a practical means of overcoming the limitations of individual applications. The research presented in this paper focuses on enhancing mixed reality environments with mobile applications: altering the virtual parts of mixed reality environments, enriching application functionality, promoting social interaction, and facilitating the authoring and narration of user-generated stories. The ongoing work presented here builds upon a green-screen mixed reality application that can be used in combination with one or more augmented reality application instances, showcasing the benefits of employing mobile augmented reality applications to complement MR systems.
1 Introduction
Mixed reality (MR), sometimes referred to as hybrid reality, refers to the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time. The reality-virtuality continuum presented by Milgram et al. [13] allows the classification of applications in terms of the users’ feeling of presence and illustrates the distribution of systems in accordance with their fundamental characteristics, ranging from completely virtual environments to reality-based systems. Between these two extremes lie systems in which the physical and the virtual world are combined [2], blending reality with computer-generated imagery and encompassing both augmented reality and augmented virtuality. Mixed reality is applied in various contexts, including games, and in particular tabletop games [16], in order to preserve the physical artifacts of the game. In the domain of cultural heritage, Grammenos et al. [11] use pieces of paper that host additional information upon placement over areas of interest. Another interesting approach is presented by Ridel et al. [15], who employ pointing to reveal virtual cultural heritage exhibits in a metaphor similar to a flashlight.
Mixed reality encompasses immersion, which is based on physically addressing a person’s senses, namely vision, spatialized sound and haptic feedback [4], so as to engage users and bridge reality with a virtual environment generated by interactive systems. Immersion is strongly related to the interaction process: in addition to perceiving an MR application with the human senses, the interaction modality employed is a decisive factor in the feeling of immersion and the overall user experience.
Augmented Reality (AR) is defined by Carmigniani and Furht [7] as a real-time direct or indirect view of a physical real-world environment that has been enhanced/augmented by adding computer-generated information to it. According to [10], AR refers to computer displays that add virtual information to a user’s sensory perceptions, enhancing the user’s perception of and interaction with the real world by superimposing virtual objects and cues upon the real world in real time. Digital storytelling can be accomplished using AR technologies, providing an immersive means of narrative presentation across a variety of domains, ranging from scientific information [3] to cultural heritage [1].
Mobile devices are widely employed for creating and deploying AR systems [14]. The improved sensing and processing capabilities of modern mobile devices facilitate the creation of a variety of applications, including AR applications, which can communicate and interoperate with existing systems. The proliferation of handheld devices such as smartphones provides a pool of potential control devices for interactive systems with which users are already familiar. These control devices can be straightforwardly connected to interactive installations through a typical mobile application.
2 Design Decisions and Rationale
Installations in public spaces typically turn into multiple-user applications, even if they are designed for a single user. It is common that passers-by are inquisitive regarding interactive installations and approach them. This phenomenon is described by Brignull and Rogers as the “honey pot effect” [5]: the more people approach a display, the more passers-by are attracted to view the exhibit. This paper presents ongoing work regarding the use of mobile applications (e.g. AR applications) to enrich interactive mixed reality systems.
Applications deployed on mobile devices have the potential to provide various perspectives on the reality viewed by users during interaction in virtual environments. Thus, spectators are able to examine additional aspects of either the user or the computer-generated imagery, allowing the exploration of different stories. An indicative example is capitalizing on interactive systems deployed in Ambient Intelligence environments, which hold rich information on users’ aims and the context of use.
Mobile applications can act as second-screen displays, augmenting the systems connected to the framework with additional information that the main display omits for the sake of an uncluttered user experience in MR environments. Moreover, certain applications are natively incapable of presenting certain content types: a 2D application presenting photographs can be complemented with 3D models or 360-degree panoramic videos.
Narratives are another aspect that can be unfolded on mobile devices, through visual storytelling presented with state-of-the-art computer-generated imagery (CGI). Stories and narratives can be described, in a broad sense, as “unique sequences of events, mental states, or happenings involving human beings as characters or actors” [6]. The implicit and explicit bindings between individual events are what add value to story presentation and differentiate stories from simple event sequences, especially when the story is presented in an interactive manner [8]. Story narrations can be either loosely defined, such as a multimedia selection chosen by an end user, or well defined, such as an event sequence unveiling a historical period. Moreover, real-time assistance, key application features or even thorough tutorials can be presented using the mechanisms of storytelling, in a manner with which users are familiar.
In addition to enhancing individual interactive systems, the work presented in this paper aims to facilitate social interaction and encourage user-to-user communication. Social interaction is carried out both through face-to-face verbal communication and through messages sent via the AR applications. Furthermore, the users are able to affect the MR installation itself. In the case of an MR application similar to a green screen, AR users are able to alter the displayed backgrounds, initiating social interaction through the MR application and adding to the overall playfulness of the experience.
3 Implementation
In order to elaborate on the application of AR to MR systems, an individual component has been designed and implemented to facilitate the communication between them. The developed component, the Mixed Reality Server (MRS) in Fig. 1, retrieves data from the MR system describing users’ locations in a coordinate space defined by the MR system’s position. The MRS exposes two types of information: user locational data and MR application context.
User locational data comprise each user’s unique id and the 3D transformations (positions and rotations) of the skeletal joints. The exchange of positional data between the MRS and AR client applications is accomplished via web sockets, in order to support real-time bidirectional information flow.
The MR application context comprises data exposed in real time regarding the application’s state, allowing interested client applications to be aware of the displayed content in an agreed coordinate system. The provided information is not limited to the rendered multimedia, but also includes semantic information, content metadata and points of interest within the displayed elements.
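The two message types exposed by the MRS can be sketched as follows. The paper does not specify the wire format, so the field names and JSON shapes below are illustrative assumptions (Python is used here only for brevity); the actual protocol may differ.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class JointTransform:
    """One skeletal joint, in the coordinate space defined by the MR system."""
    joint: str       # e.g. "head", "left_hand" (hypothetical joint names)
    position: tuple  # (x, y, z)
    rotation: tuple  # quaternion (x, y, z, w)

def locational_message(user_id, joints):
    """Serialize one user's skeletal data for the web-socket channel."""
    return json.dumps({
        "type": "user_location",
        "user_id": user_id,
        "joints": [asdict(j) for j in joints],
    })

def context_message(scene_id, media, points_of_interest):
    """Serialize the MR application's current state: rendered multimedia,
    plus metadata and points of interest within the displayed elements."""
    return json.dumps({
        "type": "app_context",
        "scene_id": scene_id,
        "media": media,
        "points_of_interest": points_of_interest,
    })
```

Both messages would be pushed over the same bidirectional web-socket connection, so AR clients receive positional updates and context changes as they happen.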
Two individual prototypes were designed and implemented to illustrate the potential of AR for adding value to interactive mixed reality systems. Both prototypes support full-body skeletal tracking, using the Kinect One depth sensor [12]. Skeletal joint transformations (position and rotation) are mapped from the sensor’s space to the application’s display space. The MRS then provides each skeletal joint transformation in real-world coordinates, both at the points where people are physically located and at the points where they are rendered on the MR display; thus, each AR application is able to superimpose information either in front of the users or on top of the application.
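The sensor-to-display mapping described above can be sketched as a simple scale-and-offset transform. The concrete scale and offset values below are illustrative assumptions (a real deployment would calibrate them against the sensor and screen geometry), sketched in Python for brevity:

```python
def sensor_to_display(p, scale=(100.0, -100.0), offset=(960.0, 540.0)):
    """Map a sensor-space (x, y) position in metres onto pixel coordinates
    of a 1920x1080 display, assuming a hypothetical calibration of
    100 px per metre; the y axis is flipped because screen y grows downwards."""
    x, y = p[0], p[1]
    return (x * scale[0] + offset[0], y * scale[1] + offset[1])
```

With such a mapping in both directions, an AR client can convert the joint coordinates it receives from the MRS into positions on its own camera view and overlay content at the matching spot.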
3.1 BeThereThen
The first prototype, BeThereThen, is an extension of BeThereNow [9], a mixed reality application that immerses users in landscapes in a manner similar to green screens. The prototype aims to assist social interaction in the cultural heritage domain, facilitating storytelling and user-generated content.
The developed prototype offers the ability to exhibit historical aspects of the displayed landscapes through suggested photographs or videos. Users can choose specific elements from a multimedia collection related to the currently shown landscape using an AR mobile application, interactively unveiling on their own private display aspects of the CGI shown on the public MR display. The background landscapes in which the application users are virtually standing can be filled in with user-generated content, such as views of the same landscapes at different time periods through historical photographs or graphic representations. This is accomplished either by completely replacing the background or by brushing over the preferred areas of the MR environment via touching the AR display. Upon completion, the users are able to instantly take a real-time photograph of themselves in front of the MR display, immersed in their personal background. An indicative view of the MR display is shown in Fig. 2, where a user is standing in front of the Venetian fortress of Castello a Mare (Koules) in Heraklion. A large section of the fortress is fused with a photograph of the early 20th century, creating a unique mixture of the initial background with a historical representation.
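The brushing mechanism can be thought of as painting a reveal mask: every touched area of the AR display marks cells of a mask, and masked cells take their colour from the historical photograph instead of the current background. The sketch below is an illustrative assumption of how such a mask could work, using plain nested lists in Python; the actual prototype would operate on GPU textures.

```python
def brush(mask, cx, cy, radius):
    """Mark all mask cells within `radius` of the touch point (cx, cy)."""
    for y in range(len(mask)):
        for x in range(len(mask[0])):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                mask[y][x] = True

def composite(current, historical, mask):
    """Take the historical pixel wherever the mask has been brushed,
    and keep the current background everywhere else."""
    return [[historical[y][x] if mask[y][x] else current[y][x]
             for x in range(len(mask[0]))]
            for y in range(len(mask))]
```

Completely replacing the background is then just the degenerate case of a fully brushed mask.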
Furthermore, the resulting background can then be saved into a collection of altered landscape backgrounds. As a further step, the users are able to define sequences of sceneries, creating their own personal stories, which can be made public and shown in the main MR display on demand.
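A user-authored story of this kind reduces to an ordered sequence of saved backgrounds plus a visibility flag. The minimal model below is an illustrative assumption of that structure (class and method names are hypothetical), not the prototype's actual data model:

```python
class Story:
    """A user-defined sequence of altered scenery backgrounds."""

    def __init__(self, author):
        self.author = author
        self.scenes = []     # ordered identifiers of saved backgrounds
        self.public = False  # whether the main MR display may show it

    def add_scene(self, background_id):
        """Append a saved background to the story's playback order."""
        self.scenes.append(background_id)

    def publish(self):
        """Make the story available to the main MR display on demand."""
        self.public = True
        return list(self.scenes)
```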
3.2 HelloThereNow
The second prototype extending BeThereNow [9], HelloThereNow, was created aiming at allowing users to superimpose information on the MR display. The overlaid information can be either placed over sections of the background or follow a specific user, if visible on the display.
Firstly, information laid over background artefacts can annotate aspects of the illustrated elements. For instance, in the scenery of a market with traditional items, users can annotate the products sold with a message sharing their personal opinion. Such an example is shown in Fig. 3, where a woman has shared her personal experience of the traditional showcased products.
Additionally, users are able to create personalized messages and share them with other users. These messages can be sent to other users of the mobile AR application or be shared with the public MR display: users are able to choose people visible on the MR display and make comments, create annotations or even add thought clouds with messages (Fig. 3). This communication can be either named or anonymous: named comments can be persistent, i.e. shown for a prolonged period, and facilitate discussion with other users, whereas anonymous comments are displayed only for a short duration of five seconds and are meant to create a mini-game among the users around the public display, who try to find out the author of the message.
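The named/anonymous policy described above amounts to a simple visibility rule: named comments persist, anonymous ones expire after five seconds. The five-second figure comes from the text; the class itself is an illustrative Python sketch, not the prototype's implementation.

```python
ANONYMOUS_TTL = 5.0  # seconds an anonymous comment stays on the MR display

class Comment:
    """A message attached to the public MR display."""

    def __init__(self, text, author=None, created_at=0.0):
        self.text = text
        self.author = author          # None means the comment is anonymous
        self.created_at = created_at  # timestamp in seconds

    def visible(self, now):
        """Named comments are persistent; anonymous ones expire."""
        if self.author is not None:
            return True
        return now - self.created_at < ANONYMOUS_TTL
```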
Gamification constitutes another aspect to which AR can contribute. Apart from messages, the prototype is able to assign elements to the users interacting with the MR system. These elements are suggested in accordance with the currently visible background and include everyday objects, such as accessories or clothing, that can be mapped to a specified body area. The AR application user drags an object onto a tracked person’s body; the application automatically selects the nearest skeletal joint and assigns the element to it, allowing objects to follow either the person’s torso or a specific body part, such as the hands or feet (Fig. 4).
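Snapping a dragged object to the nearest joint is a nearest-neighbour lookup over the tracked skeleton. The sketch below assumes the joints have already been mapped to 2D screen coordinates; joint names are hypothetical, and Python is used only for illustration.

```python
import math

def nearest_joint(drop_xy, joints):
    """Return the name of the skeletal joint closest to the drop position.

    `drop_xy` is the (x, y) point where the object was released;
    `joints` maps joint names to their (x, y) screen positions."""
    return min(joints, key=lambda name: math.dist(drop_xy, joints[name]))
```

Once the joint is chosen, the object is parented to it and simply follows that joint's transform in subsequent frames.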
4 Conclusions and Future Directions
This paper reported on ongoing work regarding the potential of employing augmented reality technology via mobile devices to enhance mixed reality systems. The presented approach aims to enrich the functionality and improve the user experience of mixed reality systems located in public spaces through AR mobile applications. Two prototypes were created to enhance a green-screen-like application, facilitating social interaction, user-generated content and storytelling, while also acting as second-screen displays. The next steps involve evaluating the prototypes, firstly in-vitro using informal methods and secondly qualitatively in-vivo, so as to measure the proposed system’s likeability and receive users’ comments, suggestions and recommendations. Future work includes extending the system to be integrated in Ambient Intelligence environments, exploiting the potential of functionalities such as user (re-)identification, knowledge of the context of use and user profile information, the latter including data regarding user interests, preferences, semantic knowledge and interaction metadata.
References
Angelopoulou, A., Economou, D., Bouki, V., Psarrou, A., Jin, L., Pritchard, C., Kolyda, F.: Mobile augmented reality for cultural heritage. In: Venkatasubramanian, N., Getov, V. Steglich, S. (eds.) MOBILWARE 2011. LNICST, vol. 93, pp. 15–22. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-30607-5_2
Billinghurst, M., Kato, H., Poupyrev, I.: The magicbook-moving seamlessly between reality and virtuality. IEEE Comput. Graph. Appl. 21(3), 6–8 (2001)
Bimber, O., Encarnação, L.M., Schmalstieg, D.: The virtual showcase as a new platform for augmented reality digital storytelling. In: Proceedings of the Workshop on Virtual Environments, pp. 87–95. ACM, May 2002
Bowman, D.A., McMahan, R.P.: Virtual reality: how much immersion is enough? Computer 40(7), 36–43 (2007)
Brignull, H., Rogers, Y.: Enticing people to interact with large public displays in public spaces. In: Proceedings of Interact, vol. 3, pp. 17–24, September 2003
Bruner, J.S.: Acts of Meaning, vol. 3. Harvard University Press, Cambridge (1990)
Carmigniani, J., Furht, B.: Augmented reality: an overview. In: Furht, B. (ed.) Handbook of Augmented Reality, pp. 3–46. Springer, New York (2011). https://doi.org/10.1007/978-1-4614-0064-6_1
Crawford, C.: Chris Crawford on Interactive Storytelling. New Riders, Indianapolis (2012)
Drossis, G., Ntelidakis, A., Grammenos, D., Zabulis, X., Stephanidis, C.: Immersing users in landscapes using large scale displays in public spaces. In: Streitz, N., Markopoulos, P. (eds.) DAPI 2015. LNCS, vol. 9189, pp. 152–162. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-20804-6_14
Feiner, S.K.: Augmented reality: a new way of seeing. Sci. Am. 286(4), 48–55 (2002)
Grammenos, D., et al.: Macedonia from fragments to pixels: a permanent exhibition of interactive systems at the archaeological museum of Thessaloniki. In: Ioannides, M., Fritsch, D., Leissner, J., Davies, R., Remondino, F. (eds.) EuroMed 2012. LNCS, vol. 7616, pp. 602–609. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-34234-9_62
Kinect for Xbox One. https://www.xbox.com/en-GB/xbox-one/accessories/kinect
Milgram, P., Takemura, H., Utsumi, A., Kishino, F.: Augmented reality: a class of displays on the reality-virtuality continuum. In: Telemanipulator and telepresence technologies, vol. 2351, pp. 282–293. International Society for Optics and Photonics, December 1995
Papagiannakis, G., Singh, G., Magnenat-Thalmann, N.: A survey of mobile and wireless technologies for augmented reality systems. Comput. Anim. Virtual Worlds 19(1), 3–22 (2008)
Ridel, B., Reuter, P., Laviole, J., Mellado, N., Couture, N., Granier, X.: The revealing flashlight: interactive spatial augmented reality for detail exploration of cultural heritage artifacts. J. Comput. Cult. Herit. (JOCCH) 7(2), 6 (2014)
Zidianakis, E., Antona, M., Paparoulis, G., Stephanidis, C.: An augmented interactive table supporting preschool children development through playing. In: Proceedings of the AHFE International, pp. 21–25 (2012)
Acknowledgements
The work reported in this paper has been conducted in the context of the AmI Programme of the Institute of Computer Science of the Foundation for Research and Technology-Hellas (FORTH).
Drossis, G., Stephanidis, C. (2018). Enriching Mixed Reality Systems with Mobile Applications. In: Stephanidis, C. (eds) HCI International 2018 – Posters' Extended Abstracts. HCI 2018. Communications in Computer and Information Science, vol 851. Springer, Cham. https://doi.org/10.1007/978-3-319-92279-9_32