
1 Introduction

Monitoring vibrations is important for ensuring the safety of the nearby environment. Researchers need to track vibration levels to prevent damage to sensitive machines, but current technology does not allow a researcher to work freely without checking a computer monitor or using their hands [1]. The development of machine learning and deep learning algorithms in conjunction with smart sensors enables automated predictive maintenance [2]. This is enabled by IoT technology, which underpins wireless sensor networks for environmental sensing and healthcare monitoring [3]. However, no framework has yet been developed for augmenting such data directly into the user's vision. Researchers have also examined the time delay associated with network communication by analyzing Internet communication characteristics, proposing a control structure to overcome the transmission delay [4]. Fu demonstrates the reliability of a proposed time-delay estimation method through Internet time-delay estimation experiments [5]. This research investigates the delay on a local network and on a mobile hotspot for use in the field.

Researchers depend on sensors to inform them of critical events, especially in structural health monitoring (SHM). Smart-infrastructure wireless sensors are valued for their reliability, low cost, low power consumption, and fast deployment [6]. Wireless sensor networks are used for monitoring and assessing vibration risk in historical buildings and cultural sites [7]. Forming a network of wireless sensors supports data gathering and decision-making before, during, and after a crisis event. A wireless sensor network deployed in Torre Aquila proved to be an effective tool for assessing the tower's stability, delivering data with loss ratios below 0.01% and an estimated lifetime of over one year [8]. Researchers developed an ad hoc wireless sensor deployment for indoor environmental quality monitoring in office buildings, consisting of 19 sensor devices continuously measuring vibration and other levels [9]. Others reported installations of wireless sensor networks in a suspension bridge, a slab bridge, a rail tunnel, and a water supply pipeline, managing infrastructure safely and efficiently by providing data about short- and long-term performance [10]. Wireless sensors are thus well suited for efficient and reliable data feedback.

Augmented reality (AR) is highly useful for conveying real-time data to researchers. The concept has been explored in several areas with the goal of projecting data between the image-based environment and the AR environment. Researchers at the University of New Mexico augmented displacement data, but those values were first recorded and stored in a database before being graphed in AR [11]. Researchers at Princeton developed a human-machine interface that organizes metadata and provides actionable information by visualizing data about the built environment both on- and off-site using AR [12]. Ballor et al. investigated using AR in infrastructure inspections; their framework uses the headset's sensors to capture a high-resolution 3D measurement of the infrastructure, which can be used to analyze the state of the structure over time and track damage progression [13]. Researchers at Los Alamos National Laboratory used AR for structural health monitoring, including detecting heat emitted from electronic equipment [14]. This wide range of uses makes AR a valuable tool for SHM.

This new interface has been explored in the domain of structural design because it is now possible to link structural responses with holograms and other models, permitting the engineer or technician to quantify structural dynamics in the augmented interface. The interface includes a LEWIS5 (Low-cost Efficient Wireless Intelligent Sensor), an Arduino Metro M4 microcontroller equipped with an accelerometer to measure vibrations wirelessly. The sensor requires a power source, connected via micro-USB. Data is sent over WiFi using a TCP connection to the Microsoft HoloLens AR headset, where acceleration values are plotted in real time in the user's field of view. A comprehensive menu is included in the user interface. This work is innovative in the area of human-structure interfaces, enabling close-to-real-time control of structures and sensing of dynamics. This research validates the application as a useful tool in the field by comparing the time delay of the graph on a local area network versus a mobile WiFi hotspot.

2 Background

Augmented reality blends interactive digital elements with a real-world environment. A computer-generated environment is superimposed onto the real-world environment, allowing the user to interact with the merged environment. The platform used in this research is an optical see-through display, the Microsoft HoloLens. The head-mounted display (HMD) allows contact-free operation through hand gestures and voice commands. The headset collects data via its image sensors. It also contains an inertial measurement unit, four environment-understanding cameras, an ambient light sensor, a depth camera, a 2.4-megapixel photographic video camera, and four microphones [15]. The HoloLens' four light-sensitive grayscale cameras, labeled in Fig. 1, track visual features.

Fig. 1
figure 1

AR headset—Microsoft HoloLens

The sensing platform is developed to read acceleration data in a triaxial coordinate system as a wireless structural health monitoring (SHM) system. This is done with a Low-cost Efficient Wireless Intelligent Sensor, abbreviated LEWIS5. The sensor connects via WiFi but requires a power source supplied via micro-USB. The physical components of the fully assembled sensor are labeled in Fig. 2.

Fig. 2
figure 2

LEWIS5 sensor full assembly; (a) Metro M4 express; (b) MMA8451 accelerometer; (c) Airlift WiFi shield

The Metro M4 Express is a 32-bit microcontroller built around the ATSAMD51 microchip [16]. Its Cortex-M4 core runs at 120 MHz with floating-point support, and the board is powered via micro-USB or a barrel-jack connection. The Airlift WiFi Shield adds an ESP32 chip as a WiFi co-processor [17]. The Metro M4 microcontroller does not have WiFi built in, so the addition of the shield permits WiFi network connection and data transfer from websites as well as the sending of socket-based commands. The triple-axis accelerometer used for this project is the high-precision MMA8451 with a 14-bit analog-to-digital converter [18]. Digital accelerometers are used to detect motion, tilt, and basic orientation; for this project, the accelerometer is used to detect motion and vibrations. Its measurement range is selectable from ±2g up to ±8g, which is ideal for this application.
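To make the accelerometer's resolution concrete, the following sketch converts raw 14-bit MMA8451 counts to units of g. This is an illustration of the datasheet scaling, not the authors' firmware; in the ±2g mode the device resolves 4096 counts per g.

```python
# Illustrative sketch: converting raw signed 14-bit MMA8451 counts to g.
# Counts-per-g values follow from a 14-bit ADC spanning each full range.
COUNTS_PER_G = {2: 4096, 4: 2048, 8: 1024}

def raw_to_g(raw_count: int, g_range: int = 2) -> float:
    """Convert a signed 14-bit accelerometer count to acceleration in g."""
    if g_range not in COUNTS_PER_G:
        raise ValueError("range must be 2, 4, or 8 g")
    return raw_count / COUNTS_PER_G[g_range]

# A reading of 4096 counts at the +/-2g setting equals exactly 1.0 g
print(raw_to_g(4096, 2))   # 1.0
print(raw_to_g(-2048, 2))  # -0.5
```

The finer resolution at ±2g (about 0.00024 g per count) is why the lower range suits small structural vibrations.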

Programming and development of the AR application is done in Unity 2018.4.19f1 using the Mixed Reality Toolkit (MRTK) from Microsoft. The application targets the Universal Windows Platform (UWP), which allows deployment to the HoloLens. The programming platform is Visual Studio 2019, and the code is written in C#. The live-data graph is implemented as a scatter plot, chosen as the most effective and efficient solution, and is based on a tutorial from Catlike Coding [19]. Points at each coordinate are generated from Unity's default cube game object based on the incoming parsed accelerometer data. Each data point is graphed as a small 3D cube for visual feedback, and the cubes are connected with a Unity LineRenderer so that the graph appears as a line chart. The graph updates with each frame, so the cubes are repositioned as time progresses, as defined by the function f(x, t). The points are positioned in an Update method using a for loop. A custom matte shader was made for each cube using Unity's material assets: red designates the x acceleration, yellow the y acceleration, and cyan the z acceleration.
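The per-frame repositioning logic described above can be sketched in a few lines. The actual implementation is Unity C# operating on cube game objects; the Python below is only an illustration of the rolling-window idea, and all names (`ScrollingGraph`, `positions`) are hypothetical.

```python
# Illustrative sketch (not the Unity C# code): a fixed-capacity buffer of
# samples is remapped to window coordinates every frame, analogous to
# repositioning each cube inside Update().
from collections import deque

class ScrollingGraph:
    def __init__(self, max_points: int = 100):
        self.points = deque(maxlen=max_points)  # oldest samples drop off

    def add_sample(self, t: float, accel: float) -> None:
        self.points.append((t, accel))

    def positions(self, width: float = 1.0):
        """Map (time, accel) samples into a sliding window of the given
        duration, with x normalized to [0, 1] for on-screen placement."""
        if not self.points:
            return []
        t0 = self.points[-1][0] - width
        return [((t - t0) / width, a) for t, a in self.points if t >= t0]

g = ScrollingGraph(max_points=5)
for i in range(8):
    g.add_sample(i * 0.1, 0.01 * i)
print(len(g.points))  # the deque keeps only the 5 newest samples
```

A bounded deque keeps memory constant however long the stream runs, which matters on a headset; the Unity version achieves the same effect by reusing a fixed pool of cubes.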

The application interface consists of six buttons with specific functionality. From top to bottom: Client Start connects the client to the server via Transmission Control Protocol (TCP). In the context of the application, the computer running the Arduino program acts as the server and the device running the AR application is the client. The Unity code requires the IP address of the Arduino board, and the Unity code and Arduino code are set to the same port. Client Stop closes the client connection to the server. The View Start button initiates the Unity function ContinueInput. Incoming data from the server is parsed into x, y, and z components, converted to units of gravitational acceleration (g), and graphed on the left side of the interface. The x and y data are offset so that the x line does not overlap and hide the y line; the graph axis is therefore labeled "Z Acc" for the purposes of the experiment and for simplicity. Below the graph is a horizontal line indicating the time of the data in seconds. View Stop zeros out the three data lines but does not disconnect the client; the view may be resumed by selecting View Start again. The slider at the bottom serves as a visual aid: when the data breaches the chosen threshold value, the line flashes pink as a warning to the user (Fig. 3).
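The parsing and threshold behavior described above can be sketched as follows. This is a hedged illustration, not the Unity C# implementation: the exact wire format of the LEWIS5 stream is an assumption (three whitespace-separated values per line), and the threshold value is a hypothetical slider setting.

```python
# Illustrative sketch of the parse-and-warn logic (assumed line format:
# three whitespace-separated acceleration values already in g).
THRESHOLD_G = 0.5  # hypothetical value chosen on the slider

def parse_sample(line: str):
    """Split one streamed line into an (x, y, z) acceleration tuple."""
    x, y, z = (float(v) for v in line.split())
    return x, y, z

def breaches_threshold(sample, threshold=THRESHOLD_G) -> bool:
    """True when any axis exceeds the slider threshold, which is when
    the application flashes the line pink as a warning."""
    return any(abs(v) > threshold for v in sample)

print(parse_sample("0.01 -0.02 0.98"))
print(breaches_threshold((0.01, -0.02, 0.98)))  # True: |z| exceeds 0.5
```

Checking the absolute value on every axis means the warning fires for excursions in either direction, matching a vibration-alarm use case.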

Fig. 3
figure 3

Application interface as seen in the user’s view

The objective of the experiment was to validate the real-time acceleration graph and to investigate the delay in sending, receiving, and plotting the data over WiFi on a local network and a mobile hotspot, compared to real time. The researchers used the Microsoft HoloLens 1 AR headset and a Model K2007E01 "SmartShaker," an electrodynamic exciter from The Modal Shop. The AR headset was used to augment live acceleration data sent directly from the LEWIS5 sensor, with the input controlled by a set frequency applied to the shaker; the AR graph could therefore be validated against the known input. Additionally, video capture allowed investigation of the graph delay through simple frame-by-frame analysis, which returned an approximate value for the delay in the graph for both WiFi connections. The experiment was conducted in a researcher's home office due to restrictions imposed by the global pandemic. The LEWIS5 sensor is positioned horizontal to the ground with the accelerometer's z axis aligned with gravitational acceleration. The sensor is attached to the top of the electrodynamic exciter for controlled input and is connected with a micro-USB cable to a laptop computer as a power source (Fig. 4).

Fig. 4
figure 4

Experimental setup—sensor-shaker configuration

The experiment was conducted following this procedure:

  1. Open the Arduino IDE and connect the LEWIS5 to a power source.

  2. Attach the LEWIS5 sensor and stinger pairing to the mounting insert on the SmartShaker.

  3. Press reset on the sensor once to load the program.

  4. Open the Serial Monitor and wait for the server to connect to WiFi.

  5. Initiate the SmartShaker by selecting a 10 Hz input in a smartphone frequency generator application, which is connected to the shaker via a 3.5 mm jack.

  6. Once the message "Listening for clients on port" appears, select the Client Start button in the AR application.

  7. When the client connects, the Serial Monitor will begin auto-scrolling and displaying the three columns of acceleration data, at which time View Start is selected.

  8. The three colored lines are plotted on the graph in the user's view.

  9. View Stop and Client Stop are selected to close the graph and shut down the client's connection to the server.

  10. The visual feedback from the graph is recorded in the HoloLens as a video to validate the results and analyze the delay.

  11. The time-delay data and Arduino data are plotted in MATLAB.

The time delay in the program was investigated using video analysis. By counting the frames between the initial sensor acceleration and the recorded response, the time delay can be approximated. With the known value of the video framerate in FPS, the time delay of the application can be calculated using the following formula.

$$ \mathrm{Time}\ \mathrm{Delay}=\left({\mathrm{Frame}}_1-{\mathrm{Frame}}_0\right)\ast \frac{1}{\mathrm{Video}\ \mathrm{FPS}} $$
(1)
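Eq. (1) translates directly into code. The frame numbers and framerate below are hypothetical values used only to demonstrate the arithmetic.

```python
# Direct translation of Eq. (1): delay in seconds from a frame-count
# difference in the recorded video.
def time_delay(frame_1: int, frame_0: int, video_fps: float) -> float:
    """(Frame1 - Frame0) * 1/FPS, in seconds."""
    return (frame_1 - frame_0) / video_fps

# e.g., a 15-frame gap in a hypothetical 30 FPS recording is a 0.5 s delay
print(time_delay(115, 100, 30.0))  # 0.5
```

The resolution of this estimate is one frame period, so at 30 FPS the delay is only known to within about 33 ms.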

3 Test Results and Analyses

The test was run with a total of three trials with four excitations for each WiFi connection. Between each trial the server-client connection was shut down and the AR application was restarted. The graph therefore displayed a total of 12 positive peaks which could be analyzed to determine the time delay. Frame0 was designated as the time of initial upward acceleration by the sensor, and Frame1 as the corresponding response plotted on the graph in the user's view. The difference in frames was recorded and the time delay was calculated with Eq. (1). The results were plotted in MATLAB for comparison in Fig. 5, and the individual values are reported in Table 1.

Fig. 5
figure 5

Delay on local network vs. mobile hotspot

Table 1 Experimental results

The results of the experiment validate the AR application for use in the field. As the graph clearly shows, the time delay on the local network was inconsistent and comparatively slow, while the delay in the graph response on the mobile WiFi network was consistently under 0.5 s, demonstrating the reliability of the application for field work. The average time delay on the mobile network is much closer to real time at 0.263 s, versus 0.6875 s on the local network. Additionally, the delay on the local network peaks at one full second, while the delay on the mobile network peaks at less than 0.5 s.

4 Conclusions

This paper developed and tested an augmented reality application for live sensor feedback. An analysis of the delay in the acceleration graph shown in AR was conducted, and the graph was verified against a known vibration frequency input to the sensor. The results demonstrate the reliability of the AR application in graphing acceleration data close to real time and underscore the importance of accurate measurements and transmission consistency. The variance in time delay on the local network makes it a less reliable mode of data transmission for the AR application, whereas the mobile hotspot connection relays and graphs vibration data close to real time. This research improves understanding of the reliability of different WiFi network options for close-to-real-time acceleration data transmission while developing an AR tool for data feedback and perception in the field.