
1 Introduction

A Naval Innovative Science and Engineering (NISE) project titled User Interface Prototyping Toolkit (UIPT) and its follow-on project Information Warfare Commander (IWC) continue to explore the rapid production of high-fidelity user interface (UI) prototypes to be utilized by end users, researchers, designers, and engineers working together to develop and validate new concepts of operations (CONOPS) and emerging technologies. The IWC project targeted the exploration and integration of technologies supporting a future vision of Naval Command and Control spaces, with emphasis on human-computer interfaces and interaction.

The objective is to examine the ability to rapidly develop a specific human-machine interface prototype for future Navy information systems that addresses emerging operational and tactical threats; the effort currently targets advanced UI designs for a large multitouch display device as well as the exploration of other forms of UIs. UIPT/IWC is a software development effort to explore rapid prototyping of information systems while standardizing forward-looking UI interactions, such as multitouch and Augmented Reality (AR). One goal of the project was to examine the combined use of the touch table interface with an AR interface appearing virtually above the same virtual environment. The full benefits of AR with regard to user experience are still unknown.

1.1 Motivation

The US Navy is focused on maintaining a decisive advantage over adversarial forces. The Warfighter, as an individual end user and as part of a collective of end users, plays a critical role in maintaining this advantage. New technologies such as Artificial Intelligence, Machine Learning, and Autonomous Systems grow in importance and applicability. How the user interfaces with these technologies becomes critical, especially in a fast-paced environment where situational awareness and decision making are outpacing the human's ability not only to comprehend but to understand, formulate, and then act in the most optimal way possible. The UI between the end user and advanced capabilities such as ambient intelligence, autonomous systems, and the overwhelming influx of data from intelligence sources and sensors is positioned as a key element that gives end users awareness, understanding, and control of these complex systems. How end users will be able to utilize such interfaces and maximize man-machine performance and the human-machine relationship is still largely unknown.

The lack of User Experience (UX) design in software development in the past has resulted in software user interfaces which are non-intuitive, difficult to use, non-task oriented, and generally lacking the form and function required for knowledge and decision support. The goal for the UI in the UIPT/IWC project was to overcome this trend and provide an easy-to-use and valid interface for Warfighters. UI prototypes which focus on innovative concepts continue to progress through processes such as User-Centered Design (UCD) [1]. For the domain of interest, it becomes more than just UX: it is the timely and well-designed placement and interaction of information on the UI that can better support the critical functions of situational awareness, decision making, and control of the supporting systems. The exploration of AR must follow a similar path for it to become an interface of the future.

The application of AR to the UIPT/IWC project is still in its infancy, even more so when it comes to actual military end users making use of it in operational and tactical situations. The Navy, like many other organizations, has expressed interest in AR as a technology for end use; however, the practical application remains difficult. The current size, weight, fitment, and calibration of AR headgear is an operational barrier, though these obstacles will become less of an issue as the technology matures. Current motivation in the evaluation of AR is directed more toward the actual end use for visualization, situational awareness, and scene understanding, especially within a collaborative environment shared by multiple end users. AR allows individually tailored perspectives of the interface, along with the option to provide remote collaboration through virtual avatars. This provides some unique features not present in the current common flat-display user interface.

1.2 Technical Background

The UIPT/IWC projects are based on a technical and design-centered approach built on years of experience developing user interfaces for the Navy. Previous examples include the Multi-Modal Watch Station (MMWS) [2, 3] (see Fig. 1) from the 1990s, a touch-screen application which was user-task centric and included a task management system in support of the end users. MMWS introduced the concept of manpower reduction through a 3-to-1 increase in tasking efficiency. From then to today, a 55″ multi-touch workstation [4,5,6] (see Fig. 2) for the NIWC Pacific Command Center of the Future VIP demonstration room has demonstrated a technological increase in the design and implementation of UIs.

Fig. 1. Multi-Modal Watch Station (MMWS), 1990s

Fig. 2. Advanced user interface prototype design, 2015

NIWC Pacific programs involving mission planning, task-managed designs, decision support aids, and visualization have provided a wealth of experience in the design and development of user interfaces. This included many user engagements and knowledge elicitation sessions to validate UI/UX concepts and requirements, all of which provided the foundational basis for the UIPT/IWC interface. This knowledge, experience, and these examples provide sound principles which can be applied to an AR-based interface.

UI design and development has received greater emphasis over the past decade and more [1, 2] with changes to UIs by companies such as Apple, Google, and a variety of newcomers. While current AR technologies do not have the display quality and capability found in a 55″ touch table display [4], AR visualization quality is well on its way to producing sharp and vibrant displays in its own right. Some of the current research in AR [7,8,9,10] has incorporated technologies such as ray tracing that allow AR-based virtual objects to be almost indistinguishable from real objects within the AR view. While this is not the intention of AR technology for Warfighter use, since augmented objects must be easily distinguishable, it does demonstrate that these displays will at some point provide high-quality graphics which can then be appropriately integrated for a wide variety of uses.

This effort grew out of a mockup demonstration of a future Navy information system as depicted in a NIWC Pacific Vision video produced in 2015. NIWC Pacific contracted a multi-touch display manufacturer to work with NIWC designers and subject matter experts to design and manufacture the touch table hardware and code the mockup application. This mockup (see Fig. 3) was specifically developed for demonstrations to DoD VIP visitors striving to understand the future direction of C4ISR for the U.S. Navy. The initial and current concept for the AR interface is to have it overlaid on top of the UIPT/IWC interface. This concept in and of itself raises interesting human factors questions about overlaying AR visualization on top of a more conventional interface while having the two interplay in unison.

Fig. 3. Touch table used by UIPT

1.3 User Interface Display Hardware

Early exploration of AR prior to and during development of the UIPT/IWC project was based on the Microsoft HoloLens device. For this AR interface, connections were made to Command and Control (C2) systems with the intent to display data from the C2 system in a virtual sand table, allowing users not only to view this information but also to send commands back to the C2 system through UI elements. This AR interface was additionally implemented for collaboration while providing control of classified information between users of the system. Users at different classification levels could collaborate and interact in the same virtual space, each visualizing the information pertaining to their classification level while having it filtered from the view of others who did not share that level. This provided an initial foundation for the promise of AR for both visualization and collaborative efforts.

This work will not address the comparison or relative utility of particular AR headsets. Such comparisons have been subjective, relying on a variety of poorly defined or unscientific experiments. The general assumption is that AR devices will improve over time as the technology matures, and that better understanding and data from user experience with AR equipment will pave the path going forward.

1.4 Interfacing to Information Systems

For the UIPT/IWC project the emphasis is placed on the user interface rather than the backend components; thus, as a working prototype, the data is often supplied in the form of hardcoded entries in spreadsheets or as comma-separated values (CSV) files. These files are read in to supply the information to be presented as well as timed events to illustrate actively changing scenarios. This ability has been improved through the integration and connection to the Next Generation Threat System (NGTS) [11]. NGTS is a synthetic environment generator that provides models of threat and friendly vehicles such as aircraft, ground vehicles, ships, submarines, weapon systems, sensors, and other elements commonly found in military systems. Both the simulator and UIPT/IWC act together as the source for the AR system. The AR system provides a one-to-one matching of the information sets integrated with UIPT/IWC and NGTS.
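As a rough illustration of this data path, the sketch below loads hardcoded scenario tracks from a CSV file into simple records that a Unity scene could consume. The column layout (name, affiliation, position, altitude, event time) and all type names are assumptions for illustration, not the actual UIPT/IWC file format.

```csharp
// Minimal sketch of loading hardcoded scenario tracks from a CSV file.
// The column layout (name, affiliation, lat, lon, altitude, event time) and
// all type names are assumed for illustration, not the actual UIPT/IWC format.
using System.Collections.Generic;
using System.Globalization;
using System.IO;

public struct ScenarioTrack
{
    public string Name;
    public string Affiliation;      // e.g. friend, foe, or neutral
    public double Latitude;
    public double Longitude;
    public double AltitudeMeters;
    public float EventTimeSeconds;  // when the track appears or updates
}

public static class ScenarioLoader
{
    public static List<ScenarioTrack> LoadCsv(string path)
    {
        var tracks = new List<ScenarioTrack>();
        foreach (var line in File.ReadAllLines(path))
        {
            // Skip blank lines and comment lines.
            if (string.IsNullOrWhiteSpace(line) || line.StartsWith("#"))
                continue;

            var cols = line.Split(',');
            tracks.Add(new ScenarioTrack
            {
                Name = cols[0].Trim(),
                Affiliation = cols[1].Trim(),
                Latitude = double.Parse(cols[2], CultureInfo.InvariantCulture),
                Longitude = double.Parse(cols[3], CultureInfo.InvariantCulture),
                AltitudeMeters = double.Parse(cols[4], CultureInfo.InvariantCulture),
                EventTimeSeconds = float.Parse(cols[5], CultureInfo.InvariantCulture)
            });
        }
        return tracks;
    }
}
```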

2 Objectives

By allowing users to explore the impact of new technology on emerging operational concepts before large program investments are made, the UIPT/IWC application can significantly reduce the risks associated with pursuing revolutionary technology. The Navy is currently investing in new technologies such as autonomous vehicles, artificial intelligence, distributed sensing, and integrated fires, but the way in which Fleet users will work with and “team” with these new capabilities is largely unknown. Included in this is the use of new human-computer interface devices and methods such as AR.

The objective of adding an AR component to the UIPT/IWC project is to provide users, researchers, designers, and developers with the means to explore, analyze, develop, and test new CONOPS and emerging technology that maximize human interaction efficiency, especially in the realm of collaboration. Currently, there are few, if any, AR tools dedicated to examining these specific futuristic CONOPS and technologies and their effects on C4ISR decision making through the user interface. The AR component will enable the acceleration, experimentation, and evaluation of these concepts through implementation, testing, and end use. Prototyping and testing before major investment will save money, as Fleet users will already have been informed of, and have validated, the concepts of AR applied to future C2 systems.

2.1 Development of the Augmented Reality Interface Overlay

The effort to integrate AR into the UIPT/IWC software application was a collaborative effort among several technology groups at NIWC Pacific. The primary effort was a collaboration between the UIPT/IWC and Battlespace Exploration of Mixed Reality (BEMR) teams. This permitted the quick development of UIs to collaboratively explore and validate new CONOPS in AR and their impact on the end-user decision-making process. Initial development involved the use of the Magic Leap One AR device. The data shared between UIPT/IWC and the AR device included touch table view orientation (position, rotation, and camera zoom level) in addition to the position and heading of NGTS air, surface, and subsurface tracks.

Critical for this particular development effort, and its primary objective, was to determine whether an AR view could be overlaid on top of an existing touch table view. The AR view would be offset from the touch table view but would align in position and stay aligned with movement of display elements in the touch table display. Size, position, orientation, and other factors had to be synchronized correctly and efficiently across two independent and different but shared views. This primary effort then allows a human-computer interface study to determine whether this particular technique can be validated for effectiveness under operational use. This turned out to be the main emphasis of the development effort.

2.2 Employment of the Toolkit

The AR component of the UIPT/IWC project is still an initial prototype and requires additional development effort before human evaluation experiments can be run. Preliminary judgements can be made as to whether further development and experimentation are warranted. The preliminary conclusion is that continuing down this path is valuable and shows promise as a collaborative environment for improved situational awareness and decision making for the Naval Warfighter end user.

3 Approach – Design and Development of an AR Overlay

The technical approach to the AR component of the UIPT/IWC project has two main goals: first, the examination of a novel user interface in the exploration of future Navy human-machine interface capabilities, and second, the development of an AR prototype application which can be used collaboratively with non-AR users of a touch table display showing the equivalent information set. Both goals are required to meet the objective of the UIPT/IWC project. Designers and software developers work hand in hand in an agile development process with weekly evaluations of the AR component for both form and functionality.

3.1 User Interface Design

The AR component of the UIPT/IWC project utilized a simple user interface design by creating an overlay of the same air, surface, and subsurface track data directly above the touch table interface. The elevations of air tracks and subsurface tracks were exaggerated to provide more awareness of the depth of the battlespace. Each air and subsurface AR track renders a light pillar that anchors it to the paired position on the UIPT/IWC touch table interface. AR headset images from the current prototype have been captured that illustrate the concept. In Fig. 4 the real-world view through the AR device shows both the surrounding area and the outer edge of the touch table device and its display, along with several AR tracks overlaid on top of the real-world view.
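A minimal sketch of the light-pillar anchoring is given below: a LineRenderer is drawn from the elevated AR track down to its paired point on the table plane. The use of a LineRenderer and the field names are illustrative assumptions, not the project's actual implementation.

```csharp
// Sketch of the "light pillar" that anchors an elevated AR track to its
// paired position on the touch table surface. Field names and the use of a
// LineRenderer are illustrative assumptions, not the project's code.
using UnityEngine;

[RequireComponent(typeof(LineRenderer))]
public class TrackPillar : MonoBehaviour
{
    public Transform tableSurfacePoint;   // paired 2D position on the table plane
    public float widthMeters = 0.002f;

    private LineRenderer line;

    void Awake()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
        line.startWidth = widthMeters;
        line.endWidth = widthMeters;
        line.useWorldSpace = true;
    }

    void LateUpdate()
    {
        // Draw from the (exaggerated-altitude) AR track down to the table.
        line.SetPosition(0, transform.position);
        line.SetPosition(1, tableSurfacePoint.position);
    }
}
```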

Fig. 4. AR objects overlaid atop touch table application

Figure 5 illustrates the use of billboard-based data tags, where AR objects are labeled with the names of the entities. Entity colors also indicate whether an entity is presented as friend, foe, or neutral. Note: the alignment of AR content appears shifted from the UIPT/IWC interface only in the image capture, due to the relative position of the recording camera on the device.

Fig. 5. AR objects with data tags indicating type of object

Figure 6 demonstrates the correlation and synchronization of the UIPT/IWC display objects with those of the AR prototype. Leader lines are drawn down from AR entities to the corresponding touch table entities. This illustrates the position, orientation, and correlation between the two user interfaces while providing an AR-based visualization of the same dataset. The AR view is synchronized to the touch table view's pan, rotation, and zoom actions.

Fig. 6. AR objects with leader lines corresponding to touch table object locations

Figure 7 shows the AR virtual terrain occluding the touch table UIPT/IWC side menus. This depicts one of the potential challenges with the AR overlay concept: the table-top menu must be viewed through the virtual AR terrain.

Fig. 7. AR image occluding touch table menu

3.2 Experimentation Plans and Metrics

The AR component of UIPT/IWC is still in the early stages of development and is essentially a prototype application. No formal or rigorous experimentation efforts have been performed. Only internal evaluations have been conducted, and these indicate that further development is warranted, which would allow UX sessions with Naval Warfighter feedback. Common technical parameters to be examined in the future will be divided into prototype usability and target concept utility [11]. For prototype usability, a combination of the following metrics will be collected during testing, as seen in Table 1.

Table 1. Usability metrics

For target concept utility, we will focus on Situational Awareness during collaboration. The final metrics have not been determined at this time; however, a modified Situation Awareness Global Assessment Technique (SAGAT) [12] could be used to collect metrics during the course of testing, as seen in Table 2.

Table 2. Utility metrics

3.3 The AR Component Network Development

The UIPT/IWC project is built and developed using Unity [12], a cross-platform game engine produced by Unity Technologies which is generally used for 2D and 3D game development and simulations. The use of Unity for advanced display system development was successfully demonstrated as a means to build C2 system UIs. The ability to integrate AR was also supported by Unity and by third-party tools for Unity. One such third-party tool was the Photon Unity Networking 2 (PUN 2) plugin. This permitted the creation of a broadcast server that could broadcast the UIPT/IWC and NGTS data across a Local Area Network (LAN) instead of having to go through the public Photon servers. This was useful because networked data could be received offline and easily integrated with Unity and the Magic Leap One device. It additionally allowed the project to be developed in a standalone unclassified environment and easily integrated into a classified environment.
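As a sketch of what connecting to such a self-hosted server might look like, the snippet below points PUN 2 directly at a LAN master server rather than the Photon cloud. The address, port, and application ID are placeholders; the project's actual server configuration is not documented here.

```csharp
// Sketch of connecting PUN 2 to a self-hosted Photon server on the LAN
// instead of the public Photon cloud. The address, port, and app ID below
// are placeholders, not the project's actual configuration.
using Photon.Pun;
using UnityEngine;

public class LanConnection : MonoBehaviour
{
    [SerializeField] private string masterServerAddress = "192.168.1.10"; // placeholder
    [SerializeField] private int masterServerPort = 5055;                  // default Photon master port
    [SerializeField] private string appId = "uipt-iwc-lan";                // placeholder

    void Start()
    {
        // Bypass the Photon name server and talk directly to the LAN master server.
        PhotonNetwork.ConnectToMaster(masterServerAddress, masterServerPort, appId);
    }
}
```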

Photon is a high-level set of libraries and functions that allows developers to easily convert Unity game objects and prefabrications (prefabs) into objects which can send data over the network to another instance of a Unity build. This is the manner in which key pieces of data were transferred to the companion Magic Leap version of the software, allowing the content to be rendered correctly in AR. Color coding was utilized to differentiate objects in the scene, commonly referred to as tracks, so that they are displayed according to type along with a data tag indicating the track name.

Photon uses the lobby and room convention found in many online games; however, this was modified so that the connection would be both immediate and automatic, allowing any Magic Leap device to connect and join. In this networked online game concept, the first user to enter the lobby room becomes the host for subsequent users that join, providing a frame of reference for subsequent AR users. Built in is the capability that, should the host leave or be disconnected, a randomly chosen current user is selected as the new host. This preserves continuity of the experience and removes reliance on just the first device to join the lobby room.
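A minimal sketch of this automatic join behavior, assuming PUN 2's standard callbacks, is shown below. The room name and capacity are illustrative; the master-client hand-off itself is handled by Photon.

```csharp
// Sketch of the modified lobby behavior: connect and join a shared room
// automatically, with PUN 2's built-in master-client hand-off when the
// current host drops. Room name and capacity are illustrative.
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

public class AutoJoin : MonoBehaviourPunCallbacks
{
    private const string RoomName = "UIPT_IWC_Shared"; // placeholder

    public override void OnConnectedToMaster()
    {
        // Join (or create) the shared room immediately; the first client to
        // arrive becomes the host and defines the frame of reference.
        PhotonNetwork.JoinOrCreateRoom(RoomName,
            new RoomOptions { MaxPlayers = 8 }, TypedLobby.Default);
    }

    public override void OnMasterClientSwitched(Player newMasterClient)
    {
        // Photon promotes another connected client automatically if the
        // host disconnects, preserving continuity of the shared session.
        Debug.Log($"New host: {newMasterClient.NickName}");
    }
}
```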

The main component used to send data over the network is an object called PhotonView, which can be used to send the data of game objects over the network. This is done by attaching the scripts that contain the data to the list of elements observed by the PhotonView, or by taking the data directly from components such as the Transform of a game object, which holds its position, rotation, and scale. Additionally, method calls can be made over the network using Remote Procedure Calls (RPCs), which can be used for sending chat messages or updating values observed by the game object's PhotonView. RPCs, however, are more commonly used for communicating infrequent events, whereas the OnPhotonSerializeView method is more appropriate for continuous updates. OnPhotonSerializeView opens a predefined data stream through which serialized data is passed from host to client and vice versa. High-level Unity C# values are encoded into a binary payload which can then be easily parsed and sent over the network using the PhotonStream class from PUN 2. On the receiving side this binary payload is deserialized, read, and reconstructed back into C# values.
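The sketch below illustrates this pattern: an observed component implements IPunObservable and writes or reads its state through the PhotonStream, depending on which side of the connection it is on. The specific fields (position, rotation, and a track-type code) are assumptions for illustration.

```csharp
// Sketch of streaming per-track state with OnPhotonSerializeView. The exact
// fields sent by UIPT/IWC are not shown; position, rotation, and a track-type
// code are assumed here for illustration. Requires a PhotonView on the same
// GameObject with this component in its Observed list.
using Photon.Pun;
using UnityEngine;

public class TrackSync : MonoBehaviourPun, IPunObservable
{
    public int trackType;   // e.g. friend / foe / neutral code (assumed)

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            // Host side: write this track's state into the stream.
            stream.SendNext(transform.position);
            stream.SendNext(transform.rotation);
            stream.SendNext(trackType);
        }
        else
        {
            // Client side (e.g. the Magic Leap build): read the state back
            // out in the same order it was written.
            transform.position = (Vector3)stream.ReceiveNext();
            transform.rotation = (Quaternion)stream.ReceiveNext();
            trackType = (int)stream.ReceiveNext();
        }
    }
}
```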

3.4 The AR Component Camera Synchronization Development

Track data synchronization was the first task; it was followed by the more difficult problem of synchronizing the camera between the Magic Leap One device view and the camera view of the UIPT/IWC software on the touch table device. The way the camera works in these two components is very different. With the Magic Leap One device mounted on an end user's head, the camera view is essentially aligned with the view of the end user. For the UIPT/IWC application the camera is an independent game object which can be panned, which equates to movement across the two-dimensional plane in which tracks are laid out, rotated about an axis orthogonal to that plane, and zoomed in and out, which equates to moving along a line orthogonal to the camera plane. Furthermore, Unity cameras have two modes: perspective mode, which mimics human vision in that objects further away decrease in size and objects closer to the camera increase in size, and orthographic mode, in which object size does not change with distance. UIPT/IWC application content is displayed in camera perspective view.

In order to emulate the camera behavior in Magic Leap One, the size of the virtual world needed to be scaled so that the elements in the AR view would have the same position, orientation, and scale and correspond exactly to the UIPT touch table elements upon which they are superimposed. This was accomplished using a scaling factor calculated to convert between the scale of 1 km per Unity unit of measure in the UIPT/IWC version and 1 m per Unity unit of measure in the Magic Leap AR version; a simple scaling of the map by a factor of 1000 accomplished this. This, however, was not the only problem. The Magic Leap One view required the UIPT touch table view to be in orthographic camera mode to prevent the depth distortion of a perspective view. To match the views after scaling and switching from perspective to orthographic projection, a trial-and-error process was used to find the best scaling factor. This was set as 0.613771151640216 * OrthographicSize - 1.0032187899731, where OrthographicSize is the current size of the orthographic camera in Unity. The scaling equation is dependent on the real-world size of the touch table display. In future work, an in-application adjustment feature would be helpful to assist with scale calibration.
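A small sketch of applying this empirically derived linear fit is shown below; the constants are those reported above, while the component and field names are illustrative.

```csharp
// Sketch of applying the empirically derived scale factor that maps the
// touch table's orthographic camera size to the scale of the AR overlay.
// The constants are those reported in the text; the component and field
// names are illustrative assumptions.
using UnityEngine;

public class OverlayScaler : MonoBehaviour
{
    public Transform arOverlayRoot;     // root of the AR track content
    public Camera touchTableCamera;     // synchronized orthographic camera

    void LateUpdate()
    {
        // Linear fit found by trial and error for this particular table size.
        float scale = 0.613771151640216f * touchTableCamera.orthographicSize
                      - 1.0032187899731f;
        arOverlayRoot.localScale = Vector3.one * scale;
    }
}
```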

3.5 The AR Component Host Management Development

With the track data synchronized, the next challenge was to ensure that only one touch table acts as the host for the AR Magic Leap One devices, with any other touch table considered just a standalone host that does not share the synchronized camera data. The problem of interest occurs when the touch table is initially the host and then gets disconnected for any reason. The Magic Leap One device then becomes the new host, and when the touch table comes back online everything would have to be reset. The solution to this problem was to create an additional widget, placed in the top right corner of the visor view just out of sight. The widget monitors the host connection such that, when the device becomes the host because the touch table has disconnected, it disconnects the Magic Leap One device and waits, periodically checking whether the touch table has reconnected and is serving as the host. It then reconnects as a client.
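The watchdog behavior might be sketched as follows: if this AR client is ever promoted to master, it disconnects and periodically retries the connection. The timing values and the verification that the touch table is actually hosting again are simplified assumptions.

```csharp
// Sketch of the hidden watchdog widget: the AR device should never be the
// Photon host, so if it is ever promoted it backs off and retries until the
// touch table is serving as host again. Timing and the host check are
// simplified for illustration.
using System.Collections;
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

public class HostWatchdog : MonoBehaviourPunCallbacks
{
    [SerializeField] private float retryIntervalSeconds = 5f;

    public override void OnMasterClientSwitched(Player newMasterClient)
    {
        if (PhotonNetwork.IsMasterClient)
        {
            // The touch table host dropped; do not assume its role.
            PhotonNetwork.Disconnect();
            StartCoroutine(RetryConnection());
        }
    }

    private IEnumerator RetryConnection()
    {
        // Periodically attempt to reconnect; the room-join logic elsewhere
        // would verify that the touch table is hosting before rejoining.
        while (!PhotonNetwork.InRoom)
        {
            yield return new WaitForSeconds(retryIntervalSeconds);
            if (!PhotonNetwork.IsConnected)
                PhotonNetwork.ConnectUsingSettings();
        }
    }
}
```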

3.6 The AR Component Utilizing Simulation Data

More realistic real-world data was simulated using NGTS to generate scenarios reflecting the desired events. NGTS, a modeling and simulation enterprise tool for Warfighters, is used to create scenarios in which tracks can be positioned anywhere in the world, using the Distributed Interactive Simulation (DIS) version 6 standard protocol to create Protocol Data Units (PDUs) exchanged over the network. NGTS also has the option to record data and store it in a Chapter 10 file, which allows playback of the same scenario and is useful for repeated demonstrations. The receipt of DIS PDUs in UIPT/IWC was achieved via a third-party tool called the Architecture Development Integration Environment (AMIE). This provided a middleware solution that opens a connection to NGTS, captures DIS PDU data, and generates from it a data structure usable in Unity to create the actual entities.
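The final hand-off into Unity can be sketched roughly as below, where entity records (already extracted from DIS PDUs by the middleware) are mapped to track game objects keyed by entity ID. The DisEntityState struct and all field names are hypothetical; AMIE's actual API is not shown.

```csharp
// Sketch of turning entity records (as delivered by middleware from NGTS
// DIS PDUs) into Unity track objects. The DisEntityState struct and its
// field names are hypothetical; AMIE's actual API is not shown here.
using System.Collections.Generic;
using UnityEngine;

public struct DisEntityState            // hypothetical middleware record
{
    public int EntityId;
    public Vector3 Position;            // already converted to scene coordinates
    public float HeadingDegrees;
    public int ForceCode;               // friend / foe / neutral
}

public class TrackSpawner : MonoBehaviour
{
    public GameObject trackPrefab;
    private readonly Dictionary<int, GameObject> tracks = new Dictionary<int, GameObject>();

    // Called for each incoming entity record: create the track on first
    // sight, then update its pose on every subsequent record.
    public void OnEntityState(DisEntityState state)
    {
        if (!tracks.TryGetValue(state.EntityId, out var go))
        {
            go = Instantiate(trackPrefab);
            tracks[state.EntityId] = go;
        }
        go.transform.position = state.Position;
        go.transform.rotation = Quaternion.Euler(0f, state.HeadingDegrees, 0f);
    }
}
```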

3.7 The AR Component Graphics and Computational Load Reduction

Since AR devices have limitations in both graphics and computation, the 3D model set used in UIPT/IWC was too high in polygon count; it would needlessly consume too many of the Magic Leap One device's resources and thus hinder the rendered result. The solution was to obtain a set of reduced-resolution models and determine new methods for shading operations. A mesh decimation algorithm using the Euler operations of vertex clustering, vertex deletion, edge collapsing, and half-edge collapsing was utilized. This provided adequate 3D model representation while keeping the Magic Leap One from being burdened by expensive rendering.
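As a rough illustration of the vertex-clustering step only, the sketch below snaps vertices to a uniform grid, keeps one representative per cell, and drops triangles that collapse. It is a simplified stand-in, not the project's actual reduction pipeline, which also used vertex deletion and edge-collapse operations.

```csharp
// Simplified vertex-clustering decimation sketch for reducing polygon count
// on an AR device: uniform grid, no attribute preservation. Not the project's
// actual reduction pipeline.
using System.Collections.Generic;
using UnityEngine;

public static class VertexClusterDecimator
{
    public static Mesh Decimate(Mesh source, float cellSize)
    {
        Vector3[] verts = source.vertices;
        int[] tris = source.triangles;

        var cellToNewIndex = new Dictionary<Vector3Int, int>();
        var newVerts = new List<Vector3>();
        var remap = new int[verts.Length];

        // Cluster: every vertex in a grid cell maps to one representative vertex.
        for (int i = 0; i < verts.Length; i++)
        {
            var cell = new Vector3Int(
                Mathf.FloorToInt(verts[i].x / cellSize),
                Mathf.FloorToInt(verts[i].y / cellSize),
                Mathf.FloorToInt(verts[i].z / cellSize));

            if (!cellToNewIndex.TryGetValue(cell, out int idx))
            {
                idx = newVerts.Count;
                newVerts.Add(verts[i]);          // first vertex in the cell represents it
                cellToNewIndex[cell] = idx;
            }
            remap[i] = idx;
        }

        // Rebuild triangles, dropping any that collapsed to a point or an edge.
        var newTris = new List<int>();
        for (int t = 0; t < tris.Length; t += 3)
        {
            int a = remap[tris[t]], b = remap[tris[t + 1]], c = remap[tris[t + 2]];
            if (a != b && b != c && a != c)
            {
                newTris.Add(a); newTris.Add(b); newTris.Add(c);
            }
        }

        var result = new Mesh();
        result.SetVertices(newVerts);
        result.SetTriangles(newTris, 0);
        result.RecalculateNormals();
        result.RecalculateBounds();
        return result;
    }
}
```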

3.8 The AR Component Entity Data Tags

An additional display factor was considered for the AR view: data tags for each entity were required. A data tag is a simple text object presented next to the object of interest. The text data is obtained over the network and is typically the call sign or name of the entity, but it can include amplifying information. The data tag needs to remain placed next to the entity it belongs to while facing the view of the end user of the AR device. With AR the user is free to walk around the sand table display and view it from different angles. In the UIPT/IWC case the AR user can walk around the physical touch table, viewing the scene from different angles, while the data tags stay aligned to the user's current view.
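A minimal billboard sketch of this behavior is given below: the tag follows its entity and rotates toward the viewer's camera each frame. The offset value, the label rendering choice, and the component name are illustrative assumptions.

```csharp
// Sketch of a view-facing data tag: the label stays next to its entity and
// turns toward the AR user's head (the main camera on the headset build).
// Field names and the offset are illustrative assumptions.
using UnityEngine;

public class DataTagBillboard : MonoBehaviour
{
    public Transform anchor;                        // the track this tag belongs to
    public Vector3 offset = new Vector3(0f, 0.03f, 0f);

    void LateUpdate()
    {
        var cam = Camera.main;
        if (cam == null || anchor == null) return;

        // Keep the tag next to its entity.
        transform.position = anchor.position + offset;

        // Face the viewer; keep the text upright by rotating around world up.
        var toCamera = transform.position - cam.transform.position;
        toCamera.y = 0f;
        if (toCamera.sqrMagnitude > 1e-6f)
            transform.rotation = Quaternion.LookRotation(toCamera, Vector3.up);
    }
}
```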

4 Conclusions and Future Work

The UIPT/IWC project has been in development for about two years, while the effort to explore the use of AR for Naval Warfighter applications has just begun. Internal evaluations of the prototype to date provide good indicators that further development, followed by human-computer evaluation with Navy Warfighters, is warranted. The UIPT/IWC project continues to find applicable use in multiple domains. The question remains whether AR technology poses significant advantages in future human-computer interfaces by enabling a unique collaboration environment for improved situational awareness and decision making. This question needs to be addressed at least to the point of determining the technology's value as well as its proper end use. Future work should continue down the development path until an adequate application prototype exists that permits the desired end-user feedback and thus validates whether or not to move forward with AR technologies for future Navy human-computer interfaces.