1 Introduction

To bridge the gap between computer games and Multi-Domain Operations (MDO) wargaming, a concerted programmatic effort is needed to develop artificial intelligence and machine learning (AI/ML) research, grounded in brain and decision science principles and embedded in an immersive advanced visualization environment. The AI/ML research will discover the foundational theories needed to develop universal tools for conceptual understanding that can recombine and interpret different sources of information to aid complex MDO decision-making. Modeling and visualizing these decision-making strategies at all levels of operations requires new algorithms applied to dynamic environments characterized by changing rules, cognitive states, uncertainty, and individual biases and heuristics [1]. The capability to rapidly respond to and mitigate unexpected hostile capabilities, as well as to exploit new opportunities and friendly technological capabilities, is critical for decision-making overmatch in an MDO environment.

For complex MDO situations with imperfect knowledge and uncertainty, an AI that provides a landscape of near-optimal solutions may be more helpful than one that provides a single ‘optimal’ solution [2]. How AIs can convey these solution alternatives in a transparent manner to command and operational staff is a research gap that needs to be addressed [3]. Experimenting with conditions such as near-optimality and uncertainty [2] using new warfighter machine interfaces (WMIs) and advanced visualization technologies can lead to new AI/ML algorithms, universal tools, and principles that better synergize the human + AI exploration of complex decision-making [4]. In this paper, we begin to explore the ties between AI/ML-generated solutions, WMIs, and advanced visualization technology by utilizing the Augmented REality Sandtable (ARES) [5] battlespace visualization platform.

The ARES platform provides mixed reality visualization capabilities for our investigation of Friendly vs Hostile Decision Dynamics for complex MDO decision-making. ARES is a battlespace visualization research and development testbed that uses commercial off-the-shelf products to create a low-cost method of geospatial terrain visualization with a tangible user interface. Currently, ARES is primarily used for simulation and training, with the goal of offering battlespace visualization capabilities and a user-defined common operating picture of the battlespace environment. The ARES platform supports multiple visualization modalities: a traditional military sand-filled table enhanced with commercial off-the-shelf components and projection technology used in combination with a depth sensor; browser-based 2D and 3D interfaces; head-mounted and hand-held devices; and mixed reality integration. Figure 1 shows ARES running in a multi-user visualization view.

In the next section, we overview two research projects related to our Friendly vs Hostile Decision Dynamics research conducted on the ARES visualization platform. Section 3 describes the Phase 1 implementation details of visualizing the Friendly vs Hostile Decision Dynamics research on the ARES platform. Section 4 discusses the challenges of algorithm development and of Phase 1 testing of the wargame using a text-based visualization instead of a graphical user interface (GUI). Section 5 describes future plans for the Phase 2 integration with other data sources and the SyncVis platform. Section 6 concludes the paper.

Fig. 1. Augmented REality Sandtable (ARES) running on mixed reality and head-mounted display devices; the larger viewport shows the complete immersive view from a head-mounted display, and the smaller viewport shows the mixed reality view from the mixed reality display device.

2 Related Work

ARES is a research and development (R&D) testbed for experimentation in battlespace visualization and decision-making [6]. Typically, ARES R&D focuses on human factors-related research in the areas of information visualization, multi-modal interaction, and human performance assessment. Garneau et al. [6], in a DEVCOM Army Research Laboratory technical report, discussed completed, ongoing, and planned research, framing an overall strategy for future work with ARES as an R&D testbed. According to Garneau et al. [6], the predominant research question underlying all ARES research activities is “What improvements in battlespace visualization and decision-making aid in providing a common operating picture at the point of need and best meet user requirements?”. Outcomes from our Friendly vs Hostile Decision Dynamics experiment will support and contribute significantly to Garneau’s vision of ARES as an R&D testbed.

ARES software provides geospatial terrain information and map images, allowing users to build or edit tactical mission plans and courses of action. On the ARES version of a traditional military sand table, a depth sensor captures the sand topography as the user physically reshapes the sand in real time. ARES software also acts as a data server, distributing data to client applications that present it to users via one or more of the supported visualization modalities, including head-mounted display devices, web-based interfaces, Android tablet devices, and mixed reality devices. Figure 2 shows the various modalities supported by the ARES battlespace visualization platform.

Fig. 2. ARES platform concept showing multiple visualization modalities.

Boyce et al. [7] investigated military tactics comprehension using the ARES platform. Their experiment assessed how displaying information on different surfaces (flat vs. raised) influenced the performance, workload, and engagement of cadets answering questions on military tactics. Participant engagement was measured using a modified User Engagement Scale and the System Usability Scale; workload was measured with the NASA-TLX. The findings of the experiment indicated that a raised terrain surface led to reduced workload and increased engagement and time on task compared to a flat terrain surface. ARES support for alternative display methods, in this case, demonstrated increased engagement by augmenting the instruction of military tactical tasks. The conclusions of Boyce et al. [7], based on empirical data, affirm the suitability of ARES as a testbed for our Friendly vs Hostile Decision Dynamics decision-making experimentation.

3 Implementation Details

The Friendly vs Hostile Decision Dynamics research investigates AI/ML approaches to decision-making in an MDO context. In Phase 1 of our research, we are investigating new computational and human decision-making principles, such as decision-making under uncertainty with many near-optimal solutions in a complex decision space. As part of this research initiative, we designed a research platform with a new WMI to facilitate the understanding of how humans overcome complex decision-making challenges. The WMI will have a limited simulation capability inspired by Mission Command, a variation on the game Battleship [8]. Unlike previous AI games, our rules will be dynamic (e.g., unexpected new classes of L- and T-shaped battleships), and both friendly and hostile players must choose from a complex decision landscape with uncertainty.
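To make the dynamic-rules idea concrete, the minimal Python sketch below represents unit footprints as sets of grid-cell offsets so that new classes, such as the L- and T-shaped battleships mentioned above, can be injected mid-game. The names and shapes are illustrative assumptions, not our engine’s actual rule representation.

    # Illustrative sketch only: unit classes as coordinate-offset footprints,
    # so new classes can be introduced mid-game. Names and shapes are
    # assumptions, not the project's actual rule engine.
    FOOTPRINTS = {
        "destroyer": {(0, 0), (0, 1)},                  # classic 1 x 2 ship
        "carrier":   {(0, 0), (0, 1), (0, 2), (0, 3)},  # classic 1 x 4 ship
    }

    def introduce_unit_class(name, offsets):
        """Add a new unit class mid-game, e.g. an unexpected ship shape."""
        FOOTPRINTS[name] = set(offsets)

    # Mid-game rule change: unexpected L- and T-shaped classes appear.
    introduce_unit_class("L-ship", {(0, 0), (1, 0), (2, 0), (2, 1)})
    introduce_unit_class("T-ship", {(0, 0), (0, 1), (0, 2), (1, 1)})

    def cells_occupied(name, anchor):
        """Board cells a unit of class `name` covers when anchored at (row, col)."""
        row0, col0 = anchor
        return {(row0 + dr, col0 + dc) for dr, dc in FOOTPRINTS[name]}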

In the decision dynamics component of the project, uncertainty may be titrated by limiting player visibility of the game scenario and the effectiveness of player actions. Dynamics can also be increased by asking opposing players to switch among several predefined strategies mid-game. This requires creating new algorithms to rapidly adapt to new strategies by forming flexible abstract representations tied to reward probabilities and by having mechanisms for both fast and slow learning. Our approach to developing these algorithmic capabilities is inspired by brain mechanisms for flexible memory and behavior-reward associations. We will leverage the ARES battlespace visualization platform as part of the research activities mentioned above including the development of the WMIs and the game scenarios.

3.1 Warfighter Machine Interfaces Design Using Augmented REality Sandtable Battlespace Visualization Platform

As described in the previous section, the ARES battlespace visualization platform allows users to build and edit tactical scenarios using one or more visualization modalities. In the Friendly vs Hostile MDO environment, the decision dynamics are visualized as a team-based wargaming scenario on a real-world terrain.

The ARES Table Manager allows users to specify a map of an area of operations based on available data from Google Maps, OpenStreetMap, OpenTopoMap, and USGS, or from a user-uploaded map image. In the example scenario described below, Cesium terrain tiles generate a high-resolution terrain of the area surrounding the Fort Irwin National Training Center in northern San Bernardino County, California. The Mojave Desert’s hills and the surrounding mountains provide an isolated area of over 1,000 square miles with capacity for maneuver and ranges for conducting realistic joint and combined arms training [9].

After the terrain map is imported and loaded in the Table Manager, the terrain can be shared and displayed across 2D and 3D devices. Any number of scenarios can be created and overlaid on a terrain, then saved, edited, shared, and exported. Here, the ARES Web Tactical Planner application was used to build a simple tactical scenario using MIL-STD-2525C symbols [10] associated with the Friendly vs Hostile wargame (see Figs. 4 and 5).

The Web Tactical Planner provides the ability to run a web-based version of ARES in a browser on a desktop or laptop. The Web Tactical Planner has a 2D top-down view used for mission planning and a 3D viewer to provide a more immersive perspective of the terrain. Tactical symbols are added to the terrain using the Tactical Symbol Selection menu. There is also the capability to add the corresponding 3D model for the symbol using the Model Chooser panel with hierarchical filtering by Space, Air, Ground, Sea, and different types of Operations.

A visual prototype of the Friendly vs Hostile wargaming environment is overlaid on the terrain as a 10 by 10 grid, as shown in Fig. 3. Currently, the wargame has two teams, Friendly and Hostile, and can be played with any combination of four human or AI (agent) players. There is a Land grid and a parallel, elevated Air grid. The grids, which are not visible to the players, assist in implementing game-related constraints.
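To illustrate how such hidden grids can enforce constraints, the sketch below maintains parallel Land and Air occupancy maps and rejects out-of-bounds or conflicting placements. The class and method names are assumptions for illustration, not our actual implementation.

    # Minimal sketch, assuming a 10 x 10 board per layer: parallel Land and Air
    # grids enforce bounds and occupancy constraints without being rendered to
    # the players. Class and method names are illustrative.
    GRID_SIZE = 10

    class LayeredBoard:
        def __init__(self):
            # One occupancy map per layer; keys are (row, col), values are unit ids.
            self.layers = {"land": {}, "air": {}}

        def in_bounds(self, cell):
            row, col = cell
            return 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE

        def place(self, layer, cell, unit_id):
            """Place a unit if the cell is on the board and unoccupied."""
            occupied = self.layers[layer]
            if not self.in_bounds(cell) or cell in occupied:
                return False  # constraint violated; placement rejected
            occupied[cell] = unit_id
            return True

    board = LayeredBoard()
    board.place("air", (2, 3), "friendly-AEW")    # AEWs occupy the elevated Air grid
    board.place("land", (2, 3), "friendly-tank")  # the same cell on Land is independent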

Figure 4 shows a simple Friendly vs Hostile 2-player scenario created on the Fort Irwin terrain using the Web Tactical Planner’s 2D view. Each team has five units: a fixed-wing airborne early warning (AEW) aircraft, a tank, a cross-country truck, a platoon of soldiers, and a mobile command post. The blue-colored symbols represent the Friendly team and the red-colored symbols represent the Hostile team. In Fig. 4, the Hostile team has taken control of a small fuel depot (right side) and situated their command post behind a fuel tank and a small outbuilding. The Friendly team’s position offers protection for a small isolated electric power plant (left side) located outside the main Fort Irwin area. Each team’s AEW (symbols marked with “W”) is scanning the opposing team’s area of operations in an effort to locate its command post.

Fig. 3. Friendly vs Hostile wargaming 10 by 10 Land grid on the ARES Fort Irwin terrain; the grid implements 2-player game-related constraints but is not visible to players. The blue-colored section represents the Friendly team’s initial unit placement area; the red-colored section represents the Hostile team’s initial unit placement area.

Fig. 4. ARES Web Tactical Planner 2D view of the Friendly vs Hostile 2-player scenario created on the Fort Irwin terrain. Blue symbols (Friendly) protect a small electric power plant (left side); red symbols (Hostile) control a small fuel depot (right side).

Figure 5 shows the Web Tactical Planner’s 3D view, which offers a more realistic wargaming decision-making perspective. The 3D view visualizes spatial information about both the terrain and man-made features, as well as the position of the units in relation to terrain features. This geospatial perspective enhances the ability of decision makers to make more informed decisions based on better perception of the terrain and a more realistic presence of the units.

Fig. 5. ARES Web Tactical Planner 3D view of the Friendly vs Hostile 2-player scenario created on the Fort Irwin terrain. The 3D view offers a more realistic decision-making perspective, for example, showing the elevation of the AEWs positioned over the opposing player’s entities.

Figure 6 shows the mixed reality 3D view, which offers the most realistic wargaming decision-making perspective. The mixed reality 3D view also visualizes spatial information about both the terrain and man-made features and the position of the units in relation to those features. However, the mixed reality visualization platform offers the player a more natural interaction with the 3D environment. A player has the option to rotate, pan, and zoom into an area of interest on the terrain to investigate the wargaming decision-making alternatives in greater detail. The increased interactivity with the 3D environment, together with the geospatial perspective, further enhances the ability of decision makers to make more informed decisions compared to the other visualization modalities.

Fig. 6. ARES mixed reality 3D view of the Friendly vs Hostile 2-player scenario created on the Fort Irwin terrain. The mixed reality 3D view offers a more realistic decision-making perspective and a more immersive user interaction experience with the 3D environment.

3.2 Hardware and Software Environment

On the computational backend of the Friendly vs Hostile Decision Dynamics research, we are setting up the ARES battlespace visualization platform to run on the Persistence Services Framework deployed on high performance computing resources. Running on a Windows virtual machine, the ARES server pushes the scenario data to the connecting visualization modalities. The Web Tactical Planner, running on the Secured Remote Desktop [11] on a high performance computing visualization node, allows users to connect to the server running on the Persistence Services Framework.

The Persistence Services Framework is a recently available distributed virtualization solution that can be leveraged to access or create non-traditional high performance computing services or workloads that utilize a rich web front-end. Traditional high performance computing nodes are allocated to users in batch mode for a requested time period and then released back to the resource management system; persistent services, in contrast, provide continuous access to data, databases, containerized toolsets, and frameworks, which remain available to authorized users and processes as defined by the originator. The ARES battlespace visualization platform can take advantage of this type of resource provisioning by making our Friendly vs Hostile tactical scenario data available to any visualization modality connected to the data server, for multiple users, thus supporting collaborative visualization. For example, users with a head-mounted display will be able to collaborate on the same tactical scenario with users of mixed reality display devices.

3.3 Visualization of Decision Dynamics

To support simulation capabilities, we are implementing a simulation engine that will keep track of all states in a session of a Friendly vs Hostile Decision Dynamics wargame. Recording all decisions made by both human and AI players will allow us to analyze the decision-making behavior occurring during the game and then use the identified decision-making patterns and strategies to develop better AI players for tactical game play.
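A minimal sketch of the kind of turn-by-turn logging this implies is shown below; the class names and record fields are illustrative assumptions, not our engine’s actual schema.

    # Hedged sketch: record every turn's action and surrounding game state so
    # decision-making patterns can be analyzed offline. Field names are
    # illustrative assumptions, not the simulation engine's actual schema.
    import json
    import time
    from dataclasses import asdict, dataclass, field

    @dataclass
    class TurnRecord:
        turn: int
        player: str          # e.g. "friendly-1" or "hostile-ai-2"
        action: dict         # e.g. {"type": "move", "unit": "tank", "to": [4, 7]}
        state_before: dict   # serialized game-state snapshots
        state_after: dict
        timestamp: float = field(default_factory=time.time)

    class GameRecorder:
        def __init__(self):
            self.records = []

        def log(self, record: TurnRecord):
            self.records.append(record)

        def save(self, path):
            """Persist the whole session for offline pattern analysis."""
            with open(path, "w") as f:
                json.dump([asdict(r) for r in self.records], f, indent=2)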

A key component of the ARES battlespace visualization platform is its messaging protocol, which we can use to update the WMI described in Sect. 3.1 with scenario state information from the simulation engine, producing a graphical representation of the scenario as illustrated in Figs. 4, 5 and 6.

One of the messaging protocols supported by ARES is the Distributed Interactive Simulation (DIS) protocol (IEEE 1278.1). The data server can be configured to listen for DIS traffic and broadcast updates to the visualization based on the data received. To implement the DIS protocol in our simulation engine, we used an open-source implementation of the protocol, OpenDIS [12]. Since our simulation engine is implemented in Python, we were able to use the OpenDIS Python library to send scenario states to the ARES data server.
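For illustration, the following minimal sketch, adapted from the publicly available OpenDIS Python examples, broadcasts a single Entity State PDU over UDP; the port, entity identifiers, and coordinates are placeholders rather than our actual configuration.

    # Sketch adapted from the open-source OpenDIS Python examples; the port,
    # entity identifiers, and location below are illustrative placeholders,
    # not the values used by the ARES data server.
    import socket
    from io import BytesIO

    from opendis.dis7 import EntityStatePdu
    from opendis.DataOutputStream import DataOutputStream

    UDP_PORT = 3001                    # assumed DIS port for this sketch
    DESTINATION = "255.255.255.255"    # local network broadcast

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

    # One Entity State PDU per unit update pushed from the simulation engine.
    pdu = EntityStatePdu()
    pdu.entityID.siteID = 17
    pdu.entityID.applicationID = 23
    pdu.entityID.entityID = 42
    pdu.entityLocation.x = -2430601.0  # geocentric (ECEF) coordinates in meters
    pdu.entityLocation.y = -4702442.0
    pdu.entityLocation.z = 3546587.0

    # Serialize the PDU to bytes and broadcast it for the listening data server.
    buffer = BytesIO()
    pdu.serialize(DataOutputStream(buffer))
    sock.sendto(buffer.getvalue(), (DESTINATION, UDP_PORT))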

Additionally, the ARES platform has a native interface that can be used to send scenario updates to the data server. For our Phase 2 implementation, we plan to leverage this native interface, which offers more features and is supported by the ARES visualization platform development team.

4 Discussion

The ARES-based visualization component of the Friendly vs Hostile decision dynamics research using the Fort Irwin terrain is currently under development. The AI/ML algorithm development component of the research uses a text-based user interface in a console window instead of a GUI or mixed reality interface to the battlespace. Command-line interaction using a keyboard and text-based display is common when algorithm development is conducted on a high performance computing architecture, as is the case here. There are advantages and disadvantages to different types of user interaction. However, the goals of the visualization component of the Friendly vs Hostile decision dynamics research are to improve battlespace visualization, provide a user-defined common operating picture, and increase decision-making capabilities, all of which are especially critical for success in a complex joint forces operation.

During our AI/ML algorithm testing with human players matched against one or more AI players, the lack of understanding, the difficulty of interpretation, and the general sustained confusion caused by the text-based display of the wargame state space became distinctly noticeable. At times, reading the unit position information from the text-based formatted table caused poor decision-making, resulting in same-team AEW collisions, friendly-fire unit destruction, and friendly fire destroying a team’s own projectiles. These erroneous decision-making situations reaffirmed the critical need for embedding the human + AI game play within an advanced visualization platform such as ARES.
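For comparison, the short snippet below (illustrative only, not our project’s actual console output) renders unit positions as the kind of fixed-width text table described above; scanning such rows for co-located or opposing units is error-prone in exactly the way we observed.

    # Illustrative only: a console table of unit positions similar in spirit to,
    # but not identical with, the text display used during algorithm testing.
    units = [
        ("Friendly", "AEW",  "Air",  2, 3),
        ("Friendly", "Tank", "Land", 4, 7),
        ("Hostile",  "AEW",  "Air",  2, 4),  # one cell from a same-layer collision
    ]

    print(f"{'TEAM':<10}{'UNIT':<8}{'LAYER':<8}{'ROW':>4}{'COL':>4}")
    for team, unit, layer, row, col in units:
        print(f"{team:<10}{unit:<8}{layer:<8}{row:>4}{col:>4}")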

A text-based table or report, the more traditional form of distributing operational information, can be compared with Figs. 4, 5 and 6, where the Fort Irwin unit information is displayed geospatially using the ARES Web Tactical Planner and a HoloLens device. These advanced visualization modalities allow players to interact with the visualization system using graphical elements such as windows, icons, menus, and images. For example, one click of the “view” icon in the Web Tactical Planner toggles between a 2D and 3D view of the terrain. Mouse buttons and the scroll wheel are used to zoom, pan, and rotate the terrain view. Players can select the unit type, unit symbol, and corresponding 3D model from a series of menus, then place the unit on the terrain and drag it to new locations using the mouse. This type of interactive wargaming is visually intuitive and easy to understand, accommodating a broader audience of user expertise. Additionally, the multimodal flexibility of the ARES platform enables user interaction with a wide range of electronic devices, fulfilling specific user-defined requirements at the user’s situated point of need. The mixed reality visualization modality of the ARES platform is especially apt at providing game players an accurate 3D visualization of the current state of the battlespace, with capabilities for gesture-based interaction with the units and terrain features in an immersive environment.

5 Future Work

In Phase 2 of our Friendly vs Hostile Decision Dynamics research, we will begin expanding the effort to MDO-inspired problem spaces where human + AI teaming will support collaborative complex decision-making. We will examine how teams of humans + AI coordinate strategies and counter hostile strategies, e.g., AI aiding in strategy discovery and deception. In addition, we will investigate the effectiveness of making certain actions contingent on other actions (e.g., key bricks in a Jenga tower) to better mimic the complex interdependencies in MDO decision-making. Phase 2 research will also include integration with MDO modeling and simulation tools to expand the breadth and MDO-relevance of the decision-making issues being investigated. This will include a tighter integration of the algorithm development and the ARES visualization platform, as well as assimilation of other information sources such as after-action reports and physiological signals, while strengthening the focus on the foundational science of decision-making.
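As a simple illustration of such contingencies, the sketch below models each action’s availability as a function of which prerequisite assets still stand, so that removing a single “key brick” disables every dependent action; the action and asset names are hypothetical, not from our Phase 2 design.

    # Hedged sketch of contingent actions ("key bricks"): an action stays
    # available only while all of its prerequisite assets still stand.
    # Action and asset names are hypothetical, not from our Phase 2 design.
    DEPENDS_ON = {
        "airstrike":   {"fuel_depot", "command_post"},  # needs fuel and C2 intact
        "resupply":    {"fuel_depot"},
        "counterfire": {"command_post"},
    }

    def available_actions(standing_assets):
        """Return the actions whose prerequisite assets are all still standing."""
        return [a for a, needs in DEPENDS_ON.items() if needs <= set(standing_assets)]

    print(available_actions({"fuel_depot", "command_post"}))  # all three actions
    print(available_actions({"command_post"}))  # losing the depot "key brick"
                                                # disables airstrike and resupply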

The integration effort with the ARES battlespace visualization platform will be expanded to leverage SyncVis [13], a hybrid 2D and 3D data visualization platform, to investigate how problem solving occurs across an integrated, or “synchronized,” 2D and 3D problem space. Figure 7 shows SyncVis visualizing the 2D connectivity status of network devices on a high-resolution display while visualizing the network connectivity between those devices in a 3D environment on a fully immersive head-mounted display. In the context of the Friendly vs Hostile Decision Dynamics wargame, the 2D high-resolution display system will visualize the unfolding of the complex AI neural network during game play while the resulting MDO effects are displayed on a 3D mixed reality device.

Fig. 7. SyncVis, a hybrid 2D and 3D visualization platform supporting multiple synchronized views of the same problem space.

6 Conclusion

We overviewed the Phase 1 and planned Phase 2 details of the Friendly vs Hostile Decision Dynamics investigation, aimed at taking initial steps toward bridging the gap between computer games and MDO wargaming. Currently, our AI/ML algorithm development and testing are conducted on high performance computing resources, with game play displayed in a text-based format in a console window. The lack of understanding caused by the text-based display of game states during testing resulted in instances of poor decision making, reaffirming the need for an advanced visualization platform such as ARES. ARES multimodal capabilities are discussed in reference to a prototype battlespace terrain located at the Fort Irwin National Training Center and an accompanying 2-team scenario. We plan to leverage the ARES battlespace visualization platform as an important component of the Friendly vs Hostile Decision Dynamics investigation.