
1 Introduction

Among the core activities performed by engineers are the design, modelling, analysis, and prediction of the behavior of real-world systems using numerical simulation tools and techniques [1]. Finite element analysis (FEA) has been used in a variety of engineering fields, including structural mechanics, electromagnetics, heat transfer, and fluid flow. Computers and professional software, such as ANSYS [2], are widely utilized for the majority of practical FEA studies. Well-developed computational techniques, such as automatic mesh generation and scientific visualization, allow engineers to handle complex modelling and result interpretation efficiently. FEA models can be used to assess structural conditions after being validated with test results from on-site measurement or inspection [3].

The current implementations of FEA software rely solely on computer-generated graphics in a virtual desktop environment (VE). This VE does not facilitate the validation of finite element (FE) models or the interpretation of FEA results. Furthermore, several properties of the physical structures, in particular scale, orientation, and material, are usually incomplete or unrealistically represented in a VE. In most cases, the VE does not provide access to the actual physical context. Examining FE models and interpreting FEA results without a realistic sense of the physical context is neither intuitive nor efficient. Most traditional FEA tools have complex, non-intuitive user interfaces that require time-consuming data entry, 3D model development, and manipulation. Furthermore, users have to repeat standard steps, such as pre-processing, solving, and post-processing, in order to investigate changes to FE models. Consequently, such time-consuming and complex procedures frustrate engineers and hinder the utilization of FEA in various engineering applications [4].

Another significant challenge is the lack of interactivity. Pre-processing, solving, and post-processing are the three cornerstone steps involved in the modelling and simulation of a traditional FEA model. A 3D (three-dimensional) mesh model is created in the pre-processing stage, and the relevant parameters, such as loads, material properties, constraints, and fixtures, are specified, usually by the user. The FEA system then constructs and solves the equations. Afterwards, the data are post-processed and displayed in accordance with the user’s preferences. The computation time largely depends on the number of nodes and the performance of the processor. As a result, engineers always face a trade-off between computational power/solution time and the accuracy of the results. Therefore, traditional FEA systems suffer from poor interactivity, which could be improved through technologies and techniques developed under the framework of Industry 4.0, such as Augmented Reality (AR). FEA systems are expected to update results efficiently in response to parameter variations in a wide range of situations, such as geometry and material changes in product design, or changes to fixtures and weldments/bolted joints in structural elements [5]. In some special applications, such as interactive entertainment and surgery simulation [6], real-time performance is required. This demand drives the development of real-time interactive FEA technologies.

The development of AR has proven effective for both data visualization and task assistance. The use of AR to visualize FEA results in a real-world setting improves perception and comprehension of datasets. Additionally, AR supplements reality by superimposing real-time computer-generated content, such as graphics, text, and audio, over the real-world environment. AR enables the use of physical objects with properties that cannot be adequately embodied in a virtual FEA environment, such as scale, orientation, and so on. Superimposing FEA results on the corresponding real objects aids the exploration and interpretation of the results. The on-site environment also allows for the direct acquisition of loads and boundary conditions (BCs) using the appropriate sensors and Information and Communication Technologies (ICT) infrastructure. Furthermore, tangible interfaces can be created for intuitive user interaction [7].

As a result, taking into consideration the nine (9) pillar technologies of Industry 4.0, simulation and Extended Reality (XR) are among the top candidates for advancing the existing techniques for the design and development of new products and components [8]. The aim of this paper is the design and development of a cloud-based framework for the integration of structural simulation tools with Mixed Reality (MR), in order to enable engineers to set up simulations and visualize the results in a more robust and intuitive manner. The contributions of the present research work can be summarized as follows:

  1. Design of a framework utilizing Cloud technologies and Augmented Reality for FEA result visualization.

  2. Development of a smart device application (Android-based) for the setup and visualization of FEA experiments.

  3. Use of a Cloud repository as an information exchange portal between engineers.

  4. Distribution of simulation tools as cloud services, following the Everything as a Service (XaaS) model, offering increased functionality to Smart Devices (e.g., remote experiment execution on the go).

The rest of the manuscript is structured as follows. In Sect. 2 the most pertinent literature is investigated. In Sect. 3 a detailed presentation and discussion of the proposed framework is provided, and in Sect. 4 the implementation guidelines are presented. Finally, in Sect. 5 conclusions are drawn and future research directions are discussed.

2 State of the Art

Several researchers have used Virtual Reality (VR) techniques to achieve immersive environments in an attempt to overcome the shortcomings discussed in the previous paragraphs regarding user interaction in traditional FEA environments, and a number of visualization techniques and interactive applications have been created. In order to improve human perception and comprehension of complex datasets, Scherer and Wabner [9] proposed a method to visualize FEA results and constraints using stereoscopic visualization methods and glyph-based display. Hafner et al. [10] presented a method for post-processing electromagnetic solutions in VR. Interactive cutting and manipulation of the solution data is possible with the Visualization Toolkit (VTK) [11]. Next, Ingrassia and Cappello [12] developed VirDe, a system that allows designers to perform design processes, such as CAD modeling and FEA simulation, in a VR environment, with the goal of integrating VR technologies into the entire product design process rather than just post-processing CAD models and FEA results. However, the visualization of various datasets against real-world backgrounds is the most distinct advantage that AR provides to numerical simulation when compared to VR. Weidlich et al. [13] used a mobile AR system to visualize the FEA results of machines. The reported system is built on a client–server model, with the server reading the data, generating the mesh, and calculating the result, while the client renders the results and performs some interactive visualization functions. Heuveline et al. [14] presented a system for outdoor applications based on the advantages of mobile AR systems, in which a computational fluid dynamics (CFD) simulation of urban buildings was run on a server and a hybrid tracking technique enabled the visualization of the results in AR through smartphones.

Similarly, Extended Reality (XR) is a cluster of cutting-edge digital technologies for the visualization and registration of digital content/information in the user’s Field of View (FoV). In particular, XR includes Augmented Reality (AR), Mixed Reality (MR), and Virtual Reality (VR) [15]. In recent research works, such as the review of Kent et al. in 2021 [16], MR has been investigated as a medium to promote virtual prototyping of new products. A prototype is “any representation of a design idea, regardless of medium” and is not constrained to be realized in a specific form (physical or virtual). Virtual prototypes are easier to create and iterate on, and they facilitate a better understanding of the design space. Additionally, it is not evident how to decide which prototyping method to use when the produced knowledge is expressed as a means of knowledge generation across ten (10) dimensions [17]. An interesting approach has been presented by the authors in [7], where a WSN (Wireless Sensor Network) is utilized for data acquisition and AR-based FEA. The simulation runs in the background, and in the AR environment the users can visualize the loads on the structure. Similarly, the framework presented in [4] supports the modification of the physical prototype with the addition of virtual elements, in order to improve the structural rigidity of the revised physical prototype. In the context of FEA and engineering education, the research work of Turkan et al. in 2017 [18] proposed a framework based on the utilization of marker-based AR for the visualization of simple trusses and their FEA results. However, the pilot case results of this research work revealed that the complex nature of the subject and the Graphical User Interfaces (GUIs) implemented required further elaboration.

To sum up, in the literature investigated there are limited research works regarding the integration of XR technologies into FEA. However, there is fertile ground for further research, since XR technologies have improved in recent years under the framework of Industry 4.0. Furthermore, given the recent advances in ICT, large volumes of data can be wirelessly transmitted with minimal latency (e.g., 5G networks will be capable of 1 ms latency) [19]. By virtue of the integration of sensing systems on equipment, it is feasible to acquire several types of data regarding the operation of a mechanical mechanism; therefore, with the development and implementation of proper software, the visualization of these data becomes easier than ever. With the integration of XR functionalities, engineers are capable of monitoring equipment wirelessly as well as inspecting enclosed equipment, eliminating the need for disassembly and manual inspection.

As such, the contribution of this paper can be summarized as the proposal of a novel system that integrates FEA simulation, scientific visualization, and load measurement into an AR-based environment, with the goal of utilizing AR technologies to assist structural analysis.

3 Proposed Framework

The contribution of this research work lies in the design of a Cloud platform for the provision of services and applications towards the integration of MR in structural analysis simulations. Unlike traditional FEA systems, the integrated system will be capable of collecting load data directly from real-world environments and simulating structural deformations and stresses in real time.

To acquire spatially distributed loads, a wireless sensor network (WSN) is established. To create clear illustrations, the real-time FEA results will be superimposed on the physical structures, wherever this is possible, based on the scale of the structure investigated. The following paragraphs of this Section are dedicated to the presentation and discussion of the framework, the key modules comprising it and the information flow. A high-level representation of the framework is presented in Fig. 1.

Fig. 1. General system architecture
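Before presenting the individual modules, the following minimal Python sketch illustrates how a single load sample from the WSN mentioned above could be packaged before being pushed to the Cloud platform. The field names, units, and serialization format are assumptions made purely for illustration and are not part of the implemented system.

```python
import json
import time

# Purely illustrative sketch of a load sample from a WSN node; the schema
# (field names, units, transport) is an assumption, not the authors' design.
def build_load_sample(node_id: str, location_mm: tuple, force_N: float) -> str:
    """Serialize one spatially referenced load measurement as JSON."""
    sample = {
        "node_id": node_id,                 # identifier of the WSN node
        "location_mm": list(location_mm),   # sensor position on the structure
        "force_N": force_N,                 # measured load magnitude
        "timestamp": time.time(),           # acquisition time (UNIX epoch)
    }
    return json.dumps(sample)

print(build_load_sample("node_07", (120.0, 45.0, 0.0), 350.5))
```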

3.1 Simulation Service

The simulation service, as the name implies, is the main service for the setup and execution of the FEA simulations. This service is stored in and recalled from the Cloud repository. In order to set up the FEA experiments, the users have to input the simulation data regarding the 3D structure to be used, its mechanical and geometrical properties, and the material properties. These data can be input from the GUIs of the MR application. Furthermore, in an attempt to facilitate the process of data input, material libraries have been created and uploaded to the Cloud database, which can be recalled by the users and applied to each experiment. A NoSQL database will be used for the storage of the materials and their corresponding properties. In order to improve the user experience, a GUI is provided for the creation of new materials as well as for updating the existing entries in the materials database. Concretely, the communication between the application and the materials database is achieved via a PHP (Hypertext Preprocessor) script, which sends requests to the NoSQL database and returns the results (i.e., the physical properties of the materials).

As per the modules displayed in Fig. 1, the execution of the simulation experiments is performed by the simulation suite (ANSYS Structural). However, for the preparation and representation of the results in the MR environment, the utilization of middleware software is required, along with the development of a script for the efficient communication between the two software suites (please refer to Sect. 4 for more technical details on this issue). The engineers can download and import into the FEA simulator the data required for the execution of the experiments, as well as the 3D geometry of the structure to be included in the simulation domain. Afterwards, the simulation results are stored in a matrix format on the Cloud platform, so that the AR module can incorporate them within the AR scene and create the visualization of the results in the user’s FoV.
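As an illustration of the material look-up step, the following minimal Python sketch queries a hypothetical endpoint exposed by the PHP script and returns the material properties as JSON. The endpoint URL, parameter names, and response fields are assumptions for illustration only and do not correspond to the actual deployment; the third-party requests library is used for the HTTP call.

```python
import requests

# Hypothetical endpoint of the PHP script that queries the NoSQL materials
# database; URL, parameter names, and response fields are illustrative only.
MATERIALS_ENDPOINT = "https://cloud.example.org/materials.php"

def fetch_material(name: str) -> dict:
    """Request the physical properties of a material from the cloud library."""
    response = requests.get(MATERIALS_ENDPOINT, params={"name": name}, timeout=10)
    response.raise_for_status()
    # Assumed response shape, e.g. {"name": "S235", "E": 210e9, "nu": 0.3, "rho": 7850}
    return response.json()

if __name__ == "__main__":
    steel = fetch_material("structural_steel")
    print(steel["E"], steel["nu"], steel["rho"])
```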

3.2 Cloud Database

The Cloud database can be considered the backbone of the method, as it supports the operation of the entire platform. In its essence, the Cloud platform resembles an online workspace for engineers, on which a series of services are stored. Concretely, these services are related to the storage of 3D CAD libraries. The file repository is supported by an FTP (File Transfer Protocol) server, on which registered users can upload and download files, e.g., CAD (Computer Aided Design) files to be used in conjunction with the simulation service and for the MR visualization.
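A minimal sketch of this file exchange, using Python's standard ftplib module, is given below; the host name and credentials are placeholders, and the actual repository configuration may differ.

```python
from ftplib import FTP

# Placeholder host for the platform's FTP file repository; not the real deployment.
FTP_HOST = "ftp.example.org"

def upload_cad(local_path: str, remote_name: str, user: str, password: str) -> None:
    """Upload a CAD file to the cloud file repository over FTP."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "rb") as fh:
            ftp.storbinary(f"STOR {remote_name}", fh)

def download_cad(remote_name: str, local_path: str, user: str, password: str) -> None:
    """Download a CAD file from the cloud file repository over FTP."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "wb") as fh:
            ftp.retrbinary(f"RETR {remote_name}", fh.write)
```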

3.3 Mixed Reality Application

The MR application has been developed as a multi-platform application for use on smart mobile devices (e.g., smartphones and tablets) and HMDs (Head Mounted Displays). The application is realized as a series of intuitive and interactive GUIs for the creation of the virtual experimental setups. Moreover, through the AR application the user is capable of creating a file containing the experimental parameters, such as the boundary conditions of the control volume, the physical properties of the materials involved, etc. Afterwards, the FEA results are augmented on the physical prototype of the structure. However, if there is no physical prototype, or the prototype is of large scale, then a scaled virtual representation is used. Another functionality of the MR application is the discretization of the 3D models via voxels. This functionality enables the user to edit the part geometry in real time, update the file repository on the Cloud database, and re-execute the FEA experiment.
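To make the structure of such an experiment file concrete, the short Python sketch below assembles a hypothetical parameter set and serializes it as JSON; the schema, field names, and units are illustrative assumptions rather than the application's actual file format.

```python
import json

# Hypothetical experiment-parameter file that the MR application could generate;
# field names and units are assumptions for illustration only.
experiment = {
    "geometry": "bracket_v2.stl",            # CAD file stored in the cloud repository
    "material": "structural_steel",           # key into the cloud material library
    "fixtures": [{"face_id": 3, "type": "fixed_support"}],
    "loads": [{"face_id": 7, "type": "force", "vector_N": [0.0, 0.0, -1500.0]}],
    "mesh": {"element_size_mm": 4.0},
}

with open("experiment_setup.json", "w") as fh:
    json.dump(experiment, fh, indent=2)
```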

4 Implementation Guidelines

For the development of the Cloud platform presented in the previous paragraphs, several software and hardware tools have been utilized.

Fig. 2. a) GUI of AR FEA app main menu, b) material library database menu, c) total deformation contours, d) equivalent stress contours

For the MR functionalities as well as for the development of the GUIs (Fig. 2a and b), the Unity 3D game engine was used, in conjunction with the MRTK (Mixed Reality Toolkit) and the Vuforia SDK (Software Development Kit). The scripts were developed in C# and Java using the Microsoft Visual Studio IDE (Integrated Development Environment). For the acquisition, preparation, and communication of the simulation results in the MR environment (Fig. 2c and d), the use of the ParaView data analysis and visualization application is required. However, in order to properly pass the data from ANSYS to ParaView, the generation of an input data file (.inp) and the conversion of the .inp data file to .vtk (Visualization Toolkit legacy format) are necessary. Therefore, a script will be developed in Python to act as an external/custom extension to ANSYS. With regard to the hardware, a PC equipped with an Intel Core i7 CPU, 16 GB of RAM, and an Nvidia 1060 GPU has been used.
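As an indication of how this conversion step could look, the following Python sketch uses the third-party meshio library to read an exported mesh file and write a legacy VTK file for ParaView. It assumes the exported .inp file is in a format meshio can parse (e.g., Abaqus-style); the authors' custom ANSYS extension script may differ.

```python
import meshio

# Minimal conversion sketch using the third-party meshio library; assumes the
# exported .inp file is in a mesh format meshio can read. Illustrative only.
def inp_to_vtk(inp_path: str, vtk_path: str) -> None:
    """Read the exported mesh/results and write a legacy VTK file for ParaView."""
    mesh = meshio.read(inp_path)   # parse nodes, elements, and any attached data fields
    meshio.write(vtk_path, mesh)   # output format inferred from the .vtk extension

if __name__ == "__main__":
    inp_to_vtk("fea_results.inp", "fea_results.vtk")
```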

The current developments include the design of the GUIs and the interface between the mobile application and the Cloud platform. As regards the AR module, the current development is based on the utilization of marker-based visualizations. However, the visualization of FEA results in AR is still under development. More specifically, the authors have focused on automating the interface between the FEA simulator and the AR app in order to process the resulting data and create the 3D visualization.

5 Concluding Remarks and Outlook

In this research work, a framework for the integration of MR into structural simulation has been presented and discussed. Based on the literature investigated, the proposed MR-based framework could be utilized for the more efficient design of new products and components (virtual prototyping). Further to that, the MR application could be utilized for training and teaching young engineers, since the spatial visualization of the simulation results can further help them understand the concepts of mechanics involved in the design and operation of mechanisms and components. Future research work will be focused on the development of a VR (Virtual Reality) module in order to enable the creation of a fully virtual environment and promote remote collaboration between engineers. Moreover, in order to improve the impact of the proposed framework, a topology optimization algorithm will be implemented. The aim of this module is to further automate the process of geometry optimization, taking advantage of the recent developments in Artificial Intelligence (AI) and Machine Learning (ML). Finally, in an extended version of the presented method, the integration of a 3D scanning algorithm can also be investigated, in order to scan physical structures on the fly, transform them into 3D objects, examine their rigidity, and, if needed, improve the initial geometry.