1 Introduction

Most existing civil infrastructure in North America was built in the post-World War II era and is approaching the end of its useful life. The 2021 American infrastructure report card [2] indicated that one-third of America's infrastructure is at risk of rapid deterioration. Canada's infrastructure has likewise been assessed as "at risk" and requires improved structural maintenance. Traditional structural health monitoring (SHM) and infrastructure management techniques face significant challenges under the adverse weather and busy operational conditions common in North America. These challenges include prolonged closures of structures, dense instrumentation, the need for numerous images of damage from inaccessible areas, and an asynchronous mode of data collection. Given the abundance of aging infrastructure (i.e., bridges, culverts, highways, and pavements) in North America, such asynchronous collection of big data and the associated decision-making become laborious and time-intensive, costing governments and infrastructure owners billions of dollars in annual maintenance. This paper proposes the use of augmented reality devices to enhance the real-time visualization of structural inspections of large-scale infrastructure.

Built assets can be continuously monitored by SHM systems through a network of sensors. Structural performance and damage can be assessed using a wide range of sensors that measure various parameters, generally related to structural loads and the corresponding responses. Visual inspection has been the conventional method for detecting surface damage such as spalling, corrosion, cracks, and partially failed members. It is limited in detecting embedded and minor damage, particularly delamination, and the manual approach is labor-intensive, time-consuming, and subjective. The reduced cost of sensing technologies has allowed the widespread installation of SHM systems in numerous construction projects, and SHM has emerged as a powerful solution for inspectors and operators thanks to recent advancements in transducers, data acquisition, and signal processing techniques. Infrastructure owners and decision-makers have recognized the need to improve SHM and have developed technologies, including contact [13] and non-contact sensing [1, 22], to replace outdated visual inspection practices. Recent hardware used in SHM includes cameras, LiDAR, drones, ground-based robots, and smartphones. This hardware has been combined with new software technologies to enable Internet-of-Things (IoT) devices and digital twin (DT) systems for improved communication and visualization of SHM data. The large amounts of data collected from these technologies are often referred to as "Big Data" and can be analyzed using artificial intelligence (AI), computer vision, and data science to further inform decisions [9]. Various tools have also been developed to display big data across disciplines [6], including augmented reality (AR), virtual reality (VR), and mixed reality (MR), as illustrated in Fig. 1.

Fig. 1 Illustration of improved visualization of SHM data and decision-making

2 Recent Developments of Mixed Reality in SHM

Interest in MR-enabled SHM has grown over the past few years, partly due to the increased accessibility and decreased cost of MR hardware and software such as the Microsoft HoloLens, which was first released for development use in 2016. A variety of head-mounted displays (HMDs) are shown in Fig. 2. According to Milgram's taxonomy of MR visual displays, AR and VR are subsets of MR. VR merges real and virtual worlds by replacing the user's entire field of view with a virtual environment displayed on a screen, whereas AR displays the real environment on a screen and augments it with virtual objects. MR does not replace the user's view with a screen; instead, the user sees the real environment directly, with virtual objects superimposed on it [16].

Fig. 2 Typical AR HMDs: a Google Glass Explorer 1, b Epson Moverio BT-300 Smart Glass, c Microsoft HoloLens (1st generation), d Magic Leap 1, e Microsoft HoloLens 2, f Google Glass Enterprise Edition, g Everysight Raptor

Proper use of MR technologies can yield significant SHM insights and benefit inspection and surveying operations in structural and construction engineering. For example, Kamat and El-Tawil [11] proposed an AR- and GPS-based post-earthquake damage assessment for buildings, in which the distance between building stories after an earthquake was compared to pre-earthquake measurements to determine a damage index. Shin and Dunston [20] analyzed the potential of AR for a variety of construction tasks, including structural inspection. Shin and Dunston [21] measured the tilt of steel columns in a laboratory using AR; this AR approach delivered faster results than traditional manual inspection. Similar to Kamat and El-Tawil [11], Dong et al. [8] proposed an AR-based post-earthquake damage assessment for buildings based on inter-story drift, with an autonomous system that allowed inspectors to visualize the drift. Bernasconi et al. [5] used AR to measure crack lengths in structures, allowing crack length and strain data to be compared simultaneously. Glisic et al. [9] reviewed nearly 150 articles on AR/VR-based SHM visualization techniques, covering current uses of AR/VR in SHM and future research directions. Behzadan et al. [4] reviewed existing challenges in AR and listed spatial registration, visual occlusion, and integration with open-access software as challenges preventing wider adoption of AR in the AEC industries. More recently, Moreu et al. [17] proposed an AR-based structural inspection tool in which a railway bridge was displayed and tested in an AR environment.

Napolitano et al. [18] proposed a workflow for organizing and integrating SHM data with VR technologies; in related work, the same authors [18] used MR to visualize building models superimposed on the real world. Attard et al. [3] proposed a mobile robot capable of creating 3D models of underground infrastructure, which were then displayed in VR. Palmarini et al. [19] reviewed multiple articles on the use of AR to support maintenance operations and found that interoperability of hardware and software components was a primary inhibitor of wider adoption. Dang and Shim [7] developed a bridge maintenance system using AR and 3D bridge models, in which the HoloLens was used to edit and view the 3D model for over a year. Mascarenas et al. [15] proposed a quantitative visual inspection tool using AR to capture 3D measurements of critical infrastructure; for example, the tool was used to measure the geometry of irregularly shaped sidewalks. Maharjan et al. [14] proposed an infrastructure interface for structural inspection and monitoring, with the HoloLens displaying real-time strain data from sensors. Wang et al. [23] proposed an AI algorithm that automated damage detection, analysis, and visualization using the Epson BT-300 glasses. Kilic and Caner [12] used non-destructive testing to determine the positions of rebar and defects in a bridge and visualized them with AR. Xu and Moreu [24] reviewed recent articles on AR use across civil engineering fields and ranked 16 HMDs by their applicability, comparing features such as camera resolution, RAM, memory, battery life, working temperature and humidity ranges, safe dropping distance, weight, field of view, and whether the HMD offered head tracking, eye tracking, and depth perception.

3 Description of the AR Device

The HoloLens is an MR HMD developed by Microsoft. When used for the first time, the user powers on the device and places it on their head with the transparent visor in front of their eyes. The user is then guided through a few user-friendly calibration steps and a tour of how to use the device via holograms projected by the HoloLens onto the transparent visor. The user interacts with these holograms by "touching" them with a finger; the HoloLens maps this touch to the hologram and responds. The HoloLens also offers eye tracking that can be used to issue commands, and it accepts voice commands. The device runs a modified version of Windows 10 called Windows Holographic OS and has four head-tracking cameras, two eye-tracking cameras, a camera for taking videos or photos, Wi-Fi and Bluetooth connectivity, and an adjustable head strap. An important feature of any HMD is the field of view (FOV), the angular extent over which holograms are visible; holograms outside this FOV are not rendered in the user's view. The HoloLens 2 has a FOV of 52°, which is larger than that of many other existing HMDs. A comparison of the first-generation HoloLens' FOV with other popular HMDs is presented in Fig. 3. New applications for the HoloLens continue to be developed and discussed by Microsoft and the HoloLens community.
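To give a concrete sense of what a quoted FOV means for an inspector, the extent of the holographic canvas at a given working distance can be approximated as 2·d·tan(FOV/2). The sketch below is a back-of-the-envelope illustration only, not a specification from this paper; it treats the quoted FOV as a single planar angle, and the first-generation HoloLens value of roughly 34° is an assumed approximation.

```python
import math

def hologram_extent(fov_deg: float, distance_m: float) -> float:
    """Approximate hologram extent at a given viewing distance,
    treating the quoted FOV as a single planar angle."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# Rough comparison at a 2 m working distance (assumed FOV values):
for name, fov in [("HoloLens (1st gen), ~34 deg", 34.0),
                  ("HoloLens 2, 52 deg", 52.0)]:
    print(f"{name}: ~{hologram_extent(fov, 2.0):.2f} m across")
```

At a 2 m working distance this works out to roughly 1.2 m versus 2.0 m of visible hologram extent, which is why a larger FOV noticeably reduces how often an inspector must turn their head to keep holograms in view.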

Fig. 3 Field of view of the HoloLens (first generation) and other AR/VR headsets [10]

4 Demonstration of the AR-Based Structural Inspection

4.1 Details of the Preliminary Setup

To explore AR applications related to SHM, the HoloLens 2 was combined with other hardware and software components and tested. In addition to the HoloLens 2, an MXChip AZ3166 IoT DevKit was purchased. The MXChip includes a microphone, two clickable physical buttons, a magnetometer, and pressure, motion, humidity, and temperature sensors. It can connect to Wi-Fi or to a computer through a micro-USB port and can display a variety of status and sensor information through six light sources (one infrared emitter and five LEDs) and a non-touch OLED screen. The board comes preconfigured for connection to Microsoft's Azure cloud services. A consumer-grade laptop, an Acer Aspire E5-574 running Microsoft Windows 10, was also used to interact with the HoloLens and the MXChip through multiple software applications, as shown in Fig. 4.

Fig. 4 Typical configuration of hardware and software used for AR applications

The HoloLens 2, the MXChip, and the Windows 10 laptop constituted the hardware used to explore AR-SHM applications. Multiple software applications were used to further explore AR-based SHM: Unity, a game engine and design tool with emerging enterprise applications; Azure IoT Central and Azure IoT Hub, cloud services for connecting to Internet-connected devices; Visual Studio Code, a code editor; Google Draw and diagrams.net, diagram illustration tools; Trimble Connect, a construction app for the HoloLens; and Git/GitHub, version control software.

One functionality of Trimble Connect is distance measurement. In "measure" mode, the user is presented with a white dot in their FOV. The user moves the white dot with their hand and sets it in place by tapping their index finger and thumb together. Once the first white dot is set, a second white dot appears in the user's FOV and is positioned and set in the same way. After the second white dot is in place, the distance between the two dots is calculated and displayed.
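Under the hood, the reported length is simply the Euclidean distance between the two placed dots in the headset's world coordinate frame. The snippet below is not Trimble Connect's implementation; it is a minimal sketch of that geometry, assuming the two pinch gestures have already produced world-space coordinates in metres.

```python
import math
from typing import Tuple

Point3D = Tuple[float, float, float]

def measured_distance(p1: Point3D, p2: Point3D) -> float:
    """Euclidean distance between two points set in the world frame."""
    return math.dist(p1, p2)

# Hypothetical coordinates returned after the two pinch gestures (metres):
first_dot = (0.10, 1.25, 2.40)
second_dot = (0.46, 1.25, 2.41)
print(f"Measured length: {measured_distance(first_dot, second_dot):.3f} m")
```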

4.2 Case Study for IoT Sensor

Accurately displaying data from SHM devices is an emerging problem as the amount of data increases. In this case study, a solution for displaying data using the HoloLens and the MXChip is presented. Initially, Unity and Azure IoT Hub were used to create a custom display for temperature data from the MXChip; however, after further exploration of the MXChip's integrations, Azure IoT Central was selected for further development because it provides many ready-to-use functionalities that would otherwise have to be built from scratch with Unity and Azure IoT Hub. In Fig. 5, the Azure IoT Central interface is displayed in the HoloLens' Edge browser. The displayed data are also accessible on mobile and desktop devices with modern browsers.

Fig. 5 Azure IoT Central interface accessed through the HoloLens' Edge browser

To display these data, the MXChip was connected to both Wi-Fi and the laptop; the laptop served only as a power source, and all data were transmitted over the MXChip's own Wi-Fi connection. The data were sent to an Azure IoT Central API endpoint, which displayed them. Azure IoT Central has a customizable interface that can present data in a variety of formats. Displaying data on the HoloLens is potentially useful because a worker can place the window in a comfortable position for reference, or pin it so that it moves with their view, while keeping their hands free for other tasks.
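The MXChip itself runs Arduino-style C++ firmware, so the following Python sketch is not the firmware used in this study; it only illustrates, assuming a device connection string is available (e.g., from IoT Hub or from IoT Central's device provisioning), how a temperature reading can be pushed to Azure as a JSON telemetry message with the azure-iot-device SDK.

```python
import json
import time

from azure.iot.device import IoTHubDeviceClient, Message

# Hypothetical connection string; in practice it is issued by IoT Hub /
# IoT Central device provisioning and should not be hard-coded.
CONNECTION_STRING = "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>"

def send_temperature(client: IoTHubDeviceClient, temperature_c: float) -> None:
    """Package a single reading as JSON telemetry and send it to Azure."""
    msg = Message(json.dumps({"temperature": temperature_c}))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)

if __name__ == "__main__":
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    client.connect()
    try:
        for reading in (21.8, 22.1, 22.4):  # placeholder sensor values
            send_temperature(client, reading)
            time.sleep(1.0)
    finally:
        client.shutdown()
```

Once telemetry of this form reaches the cloud, IoT Central's dashboard widgets render it without any custom display code, which is the convenience that motivated the switch away from a bespoke Unity interface.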

4.3 Case Study for Crack Inspection in Lab Models

This section demonstrates the linear measurement capabilities of the HoloLens device. Figure 6 shows the measurements for a scaled concrete beam model in the laboratory. The dimensions of the beam are 14″ × 3″ × 3″. The length of the beam is measured twice in Fig. 6a with varying degrees of accuracy, and the depth of the beam (i.e., the crack depth) is measured in Fig. 6b. Figure 7 shows the measurements of a concrete cylinder specimen taken with the HoloLens. The dimensions of the concrete cylinder are 4″ × 8″. Figures 7a and b show the height and diameter of the concrete cylinder measured with the HoloLens device, and the crack length is measured in Fig. 7c. This approach is further developed and applied to a full-scale structure in the next section.
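Because the true specimen dimensions are known (14″ × 3″ × 3″ for the beam and 4″ × 8″ for the cylinder), the accuracy of each HoloLens reading can be quantified as a simple percentage error. The values below are hypothetical placeholders rather than the readings shown in Figs. 6 and 7; the snippet only shows how such a comparison would be tabulated.

```python
def percent_error(measured_in: float, true_in: float) -> float:
    """Relative error of a HoloLens reading against the known dimension (%)."""
    return abs(measured_in - true_in) / true_in * 100.0

# Hypothetical readings (inches) against the known specimen dimensions:
checks = [
    ("beam length", 13.8, 14.0),
    ("cylinder height", 8.2, 8.0),
    ("cylinder diameter", 4.1, 4.0),
]
for label, measured, true in checks:
    print(f"{label}: {measured:.1f} in vs {true:.1f} in "
          f"-> {percent_error(measured, true):.1f}% error")
```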

Fig. 6 Measurements of a concrete beam; a beam length, and b crack depth

Fig. 7 Measurements of a concrete cylinder; a cylinder height, b cylinder diameter, and c cylinder crack length

4.4 Case Study for Crack Inspection in Full-Scale Structure

Measuring crack lengths in existing structures is a common inspection strategy for SHM; however, the exercise can be difficult for two reasons: different inspectors may measure the same crack differently, and some cracks cannot be accessed easily (e.g., a crack on the underside of a bridge). In this case study, a solution for measuring crack length using the HoloLens and Trimble Connect is demonstrated. An example of two measurements is shown in Fig. 8.

Fig. 8 Top: measurements on a retaining wall; bottom: exterior view of the measurement

Pictures of a measurement can be taken easily using the HoloLens' built-in camera. Photographing each measurement addresses the first problem, because inspectors can directly compare images and crack-length measurements. The HoloLens' ability to superimpose measurements at a distance addresses the second problem. As shown in Fig. 9, two measurements are made on the underside of a pedestrian overpass that would otherwise have been physically awkward to make.

Fig. 9 Top: two measurements on the underside of a pedestrian bridge; bottom: exterior view of the measurement

5 Conclusions

This paper proposed the use of an AR device, the HoloLens, for structural inspections. Multiple software applications were used to explore AR-based SHM. In addition to visualizing data from an IoT sensor, the device was used to measure cracks in both a laboratory environment and a full-scale structure. Both tasks explored in this study can enhance the decision-making capabilities of stakeholders by providing them with real-time SHM information. With the huge amounts of data generated by SHM systems, visualizing such continuous data streams can be challenging; this issue was addressed by using Azure IoT Central as a user interface to interact with and visualize the collected data, which can also be accessed through mobile or desktop devices with modern browsers. Second, the HoloLens was used for damage assessment by measuring cracks in a full-scale structure. The measurement capabilities of the device were first tested in the laboratory on concrete beam and cylinder specimens, and crack measurements were then taken on a full-scale pedestrian bridge. Pictures of the crack measurements, superimposed over the structure, can be captured with the device and shared instantaneously with remote participants. This device has the potential to significantly change conventional structural inspections, and its application can be further enhanced through more sophisticated software development, which will be part of a future study.