1 Introduction

Engineering and production are experiencing a drift towards the circular economy and towards batch size one, driven by customer and governmental requirements. This has several side effects, such as the non-applicability of existing processes, which were designed for high-quantity production and for disposal of the products after their service life. Business model innovation is needed to adapt to these changing boundary conditions. A sustainable circular economy with highly customised production close to batch size one means more effort in terms of Maintenance, Repair and Overhaul (MRO), as employees have to master a wider range of MRO scenarios. Using Mixed Reality (MR), such scenarios can be developed, trained and refreshed with reasonable effort. MR training scenarios can be automatically derived from conventional assembly or maintenance instructions.

Another way of generating such trainings is to track a skilled employee while executing an assembly and to convert this data into a repeatable training session. This method allows the employee's implicit knowledge about the process, which is not included in the instructions, to be conveyed to other colleagues. It also bears the chance of process optimisation (e.g. Six Sigma) via implicit knowledge in manufacturing, especially for manual assembly.

Virtual and Rapid Prototyping help speed up engineering and development while minimising errors at the same time. Especially in the context of industrial digitisation, the amount, size and complexity of data in research and development (R & D) has increased drastically (Pajarola, 2012). This results mainly from growing computing performance, which has been outperforming Moore’s Law (Moore, 2006) for the past decades (Koomey, 2010) and continues to grow in a “Post-Moore Era” (Vetter et al., 2017). The tendency towards bigger and more complex data will therefore not plateau soon, if ever. This huge and constantly growing pile of information is almost impossible to process close to real time. This development leads to tasks being split into smaller and more refined sequences, requiring more interdisciplinary cooperation and information transfer, e.g. in the form of data visualisation. Due to its ability to represent complex, spatial data in a comprehensible way and across locations, MR is excellently suited for this task. MR is a hybrid of Virtual Reality (VR) and Augmented Reality (AR) according to the reality-virtuality spectrum of Milgram et al. (1994, 1995) and takes place neither solely in the physical nor solely in the virtual world. In MR, physical and virtual objects coexist and interact in real time, allowing the creation of completely new visualisations. According to a study by Millais et al. (2018), the use of VR in data analytics also yields deeper and more precise findings in the data and a higher level of satisfaction and success for the user when exploring data through VR. Several other studies support these findings based on human vision and perception (Moloney et al., 2018; Bach et al., 2018; Donalek et al., 2014).

The majority of data generated by Computer Aided Engineering (CAE) today is 3D (e.g. Computer Aided Design (CAD), Computational Fluid Dynamics (CFD), Finite Element Method (FEM)), but analysis and post-processing of this data is mainly performed on two-dimensional (non-immersive) user interfaces (e.g. desktop), although respective MR hard- and software is available on the market. This contradiction leads to a lower information yield from the data than would otherwise be possible. According to Donalek et al. (2014), it is beneficial to visualise as many dimensions as possible in order to have a higher chance of finding additional information, such as correlations, patterns or outliers, in the data.

MR can not only help to better understand complex 3D data, but also to translate real-world challenges into the virtual world in order to find solutions in a more effective and robust manner. Another promising field of application for MR is the training of personnel, be it in the medical or the technical field.

Off-the-shelf MR software solutions are costly and therefore find little application among small and medium-sized enterprises (SMEs). However, since the problems and opportunities of digitisation affect most enterprises, from big players to SMEs, this contribution presents a cost-effective, modular and highly customisable engineering pipeline to interactively visualise industrial 3D data, collaborate and conduct virtual trainings in MR.

This contribution is structured as follows. Section 2 describes the functioning of the modular engineering pipeline and briefly discusses the main challenges. Section 3 explains a realisation of the pipeline for the use case of immersive and interactive post-processing of unsteady simulation data, such as CFD. Section 4 gives additional examples for the usage of the modular engineering pipeline in different fields. Section 5 concludes this contribution by evaluating the opportunities and possibilities enabled by the Modular Engineering Pipeline (MEP).

2 Modular Engineering Pipeline

The following section illustrates the modular and customisable engineering pipeline for collaborative visualisation of industrial 3D data in MR without using costly commercial visualisation software. Figure 1 shows the basic structure of the engineering pipeline, whose modularity lies in its loosely coupled sub-modules, which can be exchanged easily depending on the desired use case or whenever a more powerful solution for a specific use case arises.

Fig. 1
Modular mixed reality visualisation pipeline (Data Source, Data Reduction, Interchange File Format, Visualisation Engine, MR-Bridge, Visualisation Devices)

The visualisation pipeline allows the user to immerse into data from a Data Source. Data from various sources can be processed, be it stationary geometrical data from CAD or unsteady simulation data from CFD, FEM or a process simulation. Basically, all representations or physical simulations from a digital shadow or digital twin (DT) can be integrated.

The data goes through several pre-processing steps before being visualised and is prepared for MR in the Data-Reduction pre-processing step. The underlying meshes and textures of the 3D data are reduced or decimated using mesh manipulation algorithms. Because of increasing computing power (Moore, 2006; Koomey, 2010; Vetter et al., 2017), finer temporal and spatial resolution (discretisation) of data is possible, resulting in huge and complex data sets (Pajarola, 2012). This data is becoming more difficult to handle for 2D, but especially for MR post-processing, leading to a drop in visualisation performance. In the MR context, ensuring visualisation performance also reduces so-called cybersickness: poor visualisation performance translates into low frame rates and asynchronous behaviour of the virtual scene and the users’ input, and if the physical movements of the user no longer correlate with the virtual movement, VR sickness or cybersickness occurs (Rebenitsch & Owen, 2016). After pre-processing, the data is transferred to the Visualisation Engine via an Interchange File Format (IFF) and imported into the virtual scene. With several different modules for the Visualisation Engine, various use cases are covered. Virtual training purposes, for example, require an additional module that allows a sequence of assembly or maintenance steps to be set up manually or automatically. Tracking and recording a skilled employee, as mentioned before (see Sect. 1), depends on yet another module capable of processing this respective input and converting it into a sequence. The Visualisation Engine takes over the rendering on the various Visualisation Devices via an MR-Bridge. This module is mostly supplied by the manufacturer of the respective Visualisation Device; in exceptional cases, it can also be obtained elsewhere. An example of such an exceptional case is the use of a Cave Automatic Virtual Environment (CAVE) as Visualisation Device, since CAVEs are not very widespread and mostly unique. Due to the flexibility of the Visualisation Engine, different MR Visualisation Devices can be operated. Common head-mounted displays (HMDs) for VR (e.g. Oculus Quest/Rift, Windows Mixed Reality headsets) and AR (e.g. Microsoft HoloLens 1/2, tablets), as well as the aforementioned CAVEs, are supported by the Visualisation Engine.

A client-server connection allows several Visualisation Devices to participate simultaneously in the same virtual scene and work on the same data set. The Visualisation Engine uses the tracking data of the Visualisation Devices, received via the MR-Bridge, to locate the users in the virtual scene. Other user inputs, such as commands on the Graphical User Interface (GUI), are also processed via this channel. The interaction with the data can be adapted from case to case and has to be implemented specifically for the Visualisation Devices used.

A closed loop, that allows manipulation of the Data Source from the virtual scene, secures a seamless and fully immersive workflow, so that there is no need to leave the virtual scene during an MR session.

3 Demonstrator for the Modular Engineering Pipeline

In accordance with the previously discussed engineering pipeline (see Sect. 2), a modular process was set up as a demonstrator. It allows for immersive and interactive post-processing of both steady and unsteady simulation data in MR on different Visualisation Devices (see Fig. 2). It can therefore be assigned to the field of Scientific Visualisation (SciVis) or, more precisely, to the field of Immersive Analytics.

Fig. 2
Software modules within a modular mixed reality visualisation pipeline

3.1 MEP for Immersive Analytics

The process uses the widespread simulation post-processing tool ParaView as Data Source. ParaView is capable of reading, interpreting and post-processing numerical simulations of various types and IFFs and is based on the Visualization Toolkit (VTK). The VTK features a range of filters for the post-processing of simulations as well as other data (e.g. point clouds). To ensure good visualisation performance, one of those filters, the “vtkDecimate” filter, is used for Data-Reduction. An adaptive mesh decimation algorithm (Schroeder et al., 1992) is used to reduce the number of triangles in the meshes of the simulation data. Those meshes can be reduced either before export or at runtime in the Visualisation Engine, whereby Data-Reduction in advance of export is recommended to keep input-output times brief. Reduced simulation data and underlying geometry can be exported from ParaView in efficient, VTK-native IFFs (VTM, VTP, VTU).
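As a rough illustration of this Data-Reduction step, the following minimal sketch decimates a surface mesh with VTK's Python bindings and writes it to a VTP file. It uses vtkDecimatePro, the class that implements this decimation approach in current VTK releases; the demonstrator itself drives the filter from ParaView, so the standalone Python usage, the file names and the reduction target shown here are illustrative assumptions.

```python
# Minimal sketch of the Data-Reduction step using VTK's Python bindings.
# File names and the reduction target are placeholders (assumptions).
import vtk

# Read an extracted simulation surface (e.g. exported from ParaView as PolyData)
reader = vtk.vtkXMLPolyDataReader()
reader.SetFileName("cfd_surface_full.vtp")

# vtkDecimatePro expects a triangle mesh, so triangulate first
tri = vtk.vtkTriangleFilter()
tri.SetInputConnection(reader.GetOutputPort())

# Decimate the mesh; 0.8 requests removal of roughly 80 % of the triangles
decimate = vtk.vtkDecimatePro()
decimate.SetInputConnection(tri.GetOutputPort())
decimate.SetTargetReduction(0.8)
decimate.PreserveTopologyOn()          # avoid holes in the reduced mesh

# Write the reduced mesh to a VTK-native interchange file format (VTP)
writer = vtk.vtkXMLPolyDataWriter()
writer.SetInputConnection(decimate.GetOutputPort())
writer.SetFileName("cfd_surface_reduced.vtp")
writer.Write()
```

Reducing the data in this way before export keeps the files small and the import into the Visualisation Engine fast, which is exactly the motivation stated above.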

Using a game engine for visualisation has turned out to be a cost-effective and flexible solution compared to other visualisation software. For this reason, and because of the availability of source code from a similar project (D’Eri, 2017) at the time of implementing the pipeline, Unity® has been implemented as the Visualisation Engine in this pipeline. In order to process the data from the Data Source, a plug-in that renders a VTK scene into the Unity® rendering pipeline has been integrated as a module of the engineering pipeline. This plug-in has been developed by Kitware Inc., which also maintains the VTK and ParaView, as they recognised increasing demand for an integration of the VTK within Unity® for SciVis. The plug-in executes a VTK instance parallel to Unity®, which allows the vast majority of VTK readers, filters and writers to be used from inside Unity®. The aforementioned closed loop back to the Data Source is no longer needed, since the “VTKUnity-ActiViz” plug-in is capable of manipulating the data to the same extent as the Data Source. Based on this functionality, two workflows were developed (a minimal sketch of workflow (ii) follows the list below):

  (i) Visualisation of prepared post-processings.

  (ii) Interactive post-processing of raw simulation volume data.
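To give an idea of workflow (ii), the following sketch applies a typical post-processing filter, an iso-contour, to a raw unstructured simulation volume. In the demonstrator the same kind of VTK filter is driven from within Unity® via the plug-in; here it is shown standalone in Python, and the file name, field name and iso-value are placeholder assumptions.

```python
# Sketch of interactive post-processing on raw volume data: load an unstructured
# grid, extract an iso-surface of a scalar field and obtain render-ready geometry.
# File name, array name and iso-value are placeholders (assumptions).
import vtk

reader = vtk.vtkXMLUnstructuredGridReader()
reader.SetFileName("cfd_volume.vtu")
reader.Update()

contour = vtk.vtkContourFilter()
contour.SetInputConnection(reader.GetOutputPort())
contour.SetInputArrayToProcess(0, 0, 0,
                               vtk.vtkDataObject.FIELD_ASSOCIATION_POINTS,
                               "pressure")          # scalar field to contour
contour.SetValue(0, 101325.0)                       # iso-value to extract
contour.Update()

surface = contour.GetOutput()                       # PolyData the engine can render
print(surface.GetNumberOfPoints(), "points on the iso-surface")
```

In the interactive case, the user would change the iso-value or choose a different filter from inside the virtual scene, and the resulting geometry is re-rendered immediately.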

Several software components are then used to bridge the gap between the virtual scene (Visualisation Engine) and MR (Visualisation Devices):

  • SteamVR supports a large number of HMDs including the Oculus series and all Windows Mixed Reality HMDs.

  • Remote rendering is realised through the Oculus Quest HMD and Virtual Desktop.

  • AR devices, such as the Microsoft HoloLens 2, are covered by Vuforia and the Mixed Reality Toolkit (MRTK). The latter also supports remote rendering to the Microsoft HoloLens 2.

  • Rendering into CAVEs is enabled through UniCAVE (Tredinnick et al., 2017).

The unsteady data can then be explored and interacted with immersively.

3.2 Discussion of the Demonstrator

The demonstrator for immersive and interactive post-processing of unsteady simulation data described above makes it possible to dive into and better understand relationships in the data. This is realised by an uninterrupted, immersive workflow that leads to a stronger focus on the data.

The main challenge for this use case is the management of the large amounts of data that are generated in the context of numerical simulations and increasing computing power. The optimisation of the visualisation data is currently based on arbitrary mesh reduction mechanisms. In order to improve the visual quality of the reduced meshes and thus enable further reduction, the reduction of the data should be parameter-oriented. Thus, a further step is to develop and implement a reduction mechanism based on a Machine Learning (ML) or Artificial Intelligence (AI) approach.

However, the immersive and interactive post-processing of simulation data in MR is a promising approach that, with some effort and optimisation, could significantly increase the information yield in data mining.

4 Application Examples Within the Modular Engineering Pipeline

The following section briefly presents further examples for the MEP to demonstrate its flexibility in terms of use cases and applications. The pipeline combines aspects of Engineering and Immersive Analytics, Production and Assembly, as well as Virtual Collaboration, thus creating a seamless and fully immersive workflow.

4.1 Production Planning and Control (PPC)

The engineering pipeline is used for aspects of PPC, such as layout planning, visualisation of layout changes or process simulation with a respective backend. An example application has been implemented that allows for visualisation of and interaction with a periodically updated production layout. Within this application it is possible to switch between different time stamps of the production digital shadow and to move through it at original scale. It is intended to extend this application to interact with DTs as well, which will then allow manipulation and correction of processes directly from the engineering pipeline.
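As a simple illustration of switching between time stamps of the production digital shadow, the following hypothetical sketch selects the layout snapshot closest to a requested point in time. The storage of snapshots as individual files and the naming scheme are assumptions made purely for illustration.

```python
# Hypothetical sketch: pick the production-layout snapshot closest to a
# requested time stamp. File naming scheme "layout_YYYYMMDD_HHMM.vtp" is assumed.
from datetime import datetime
from pathlib import Path

def closest_snapshot(folder: str, requested: datetime) -> Path:
    """Return the layout file whose embedded time stamp is nearest to 'requested'.
    Assumes at least one snapshot exists in the folder."""
    candidates = []
    for file in Path(folder).glob("layout_*.vtp"):
        stamp = datetime.strptime(file.stem.removeprefix("layout_"), "%Y%m%d_%H%M")
        candidates.append((abs((stamp - requested).total_seconds()), file))
    return min(candidates)[1]

# Example: load the layout as it was around noon on 1 March 2022
print(closest_snapshot("shadow_snapshots", datetime(2022, 3, 1, 12, 0)))
```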

4.2 Immersive Training and Process Optimisation

Following on from the PPC example, the application of manual assembly in production cannot be neglected. With the addition of the externally provided module “Innoactive” to the engineering pipeline, manual assembly processes can be digitised or developed and afterwards used for process optimisation or training purposes. A process was set up that automatically translates assembly or maintenance instructions into virtual assembly sequences via Natural Language Processing (NLP). The process allows training sessions and assembly simulations for (new) parts to be executed on the Digital Mock-Up (DMU) instead of the physical product, thus saving a great amount of time, space and effort.
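The following minimal sketch illustrates the underlying idea of deriving a structured sequence of assembly steps from written instructions via NLP. It uses spaCy purely for illustration and extracts, per sentence, the action verb and the objects it refers to; it is a simplified, hypothetical stand-in for the actual NLP module, and the example instructions are invented.

```python
# Hypothetical sketch: derive a structured assembly sequence from written
# instructions via NLP (spaCy). The productive module is more elaborate.
import spacy

nlp = spacy.load("en_core_web_sm")   # small English model, assumed to be installed

instructions = (
    "Insert the bearing into the housing. "
    "Tighten the four screws with a torque wrench. "
    "Attach the cover to the housing."
)

sequence = []
for i, sentence in enumerate(nlp(instructions).sents, start=1):
    verb = next((t.lemma_ for t in sentence if t.pos_ == "VERB"), None)
    objects = [t.text for t in sentence if t.dep_ in ("dobj", "pobj")]
    sequence.append({"step": i, "action": verb, "objects": objects})

for step in sequence:
    print(step)
# e.g. {'step': 1, 'action': 'insert', 'objects': ['bearing', 'housing']}
```

Each extracted step can then be mapped to the corresponding DMU parts and turned into an interactive training step in the virtual scene.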

4.3 Virtual Collaboration

Motivated by the pandemic and the resulting decentralised way of working, the engineering pipeline also includes a virtual collaboration module. Which Visualisation Devices are used should not matter; CAVEs, AR or VR HMDs and tablets, for example, are all possible. Since development data contains very sensitive information, it is important to ensure a secure communication channel. The data must be distributed to all Visualisation Devices in such a way that all participants in the collaborative session have access to identical information. To ensure seamless communication, all model changes must be synchronised for all users.
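The synchronisation requirement can be pictured with the following minimal sketch of a relay server that forwards every JSON-encoded model change (e.g. a transform update or a filter setting) to all other connected Visualisation Devices. Transport, port and message format are assumptions for illustration only, and a productive module would additionally have to encrypt and authenticate the channel, as noted above.

```python
# Minimal sketch of a collaboration relay: every model change sent by one client
# is forwarded to all other clients so that everyone works on identical state.
# Port and message format are assumptions; no encryption/authentication shown.
import asyncio
import json

clients = set()  # writers of all connected Visualisation Devices

async def handle_client(reader, writer):
    clients.add(writer)
    try:
        while True:
            line = await reader.readline()          # one JSON message per line
            if not line:
                break                               # client disconnected
            json.loads(line)                        # reject malformed payloads early
            for other in list(clients):
                if other is not writer:             # relay to all other participants
                    other.write(line)
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main():
    server = await asyncio.start_server(handle_client, "0.0.0.0", 9000)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```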

The virtual collaboration module can be used with all other modules and is currently in a demonstrator state.

5 Conclusion

Broad application possibilities for the modular engineering pipeline arise across many use cases for SMEs, from engineering via production to assembly. The pipeline offers several advantages compared to other solutions, since it is based on a game engine as visualisation software. Most importantly, the very flexible way of working, made possible by the open architecture of the underlying framework, opens up many possibilities. The large community and the professional support, which is optional, help with urgent questions and solutions. The community even provides free, high-quality tutorials and template projects that make the game engines easy to use even for users with little programming experience. Common CAE IFFs (e.g. OBJ, FBX, ...) are supported by readers or importers. If an IFF is not natively supported, plug-ins or so-called assets exist for this purpose in most cases (e.g. JT, VTK). Another benefit is that common MR Visualisation Devices are natively supported. Even rendering into CAVEs and multi-display setups is possible, with a respective asset for Unity® and through native support in the Unreal Engine. Since the Unreal Engine and Unity® share many features, visualisation concepts and ideas can be translated between the game engines with moderate effort. Last but not least, the reasonable pricing is a great benefit of using game engines.

These benefits and the resulting opportunities of the pipeline were illustrated with several practical examples above (see Sect. 4). There are several ways of implementing such a pipeline for a custom task in an SME. Hiring a game studio or software developer is one possibility, but as employees know the processes and tasks best, it is recommendable to have them participate in the development. If an employee is at hand who enjoys programming and trying things out, the aforementioned advantages make it possible to create a custom application in-house with a modest amount of time and effort. If such an employee has limited time resources, it is also an option to contact a university or research institution to initiate a joint research and development project in which the company's needs and wishes can be addressed in detail.

All in all, the Modular Engineering Pipeline can be described as a digitisation tool that is almost infinitely flexible in terms of use cases.