1 Introduction

Digitalization is one of the pillars of Industry 4.0 (I4.0); indeed, digital connectivity enables decentrally controlled machines that take “autonomous” decisions, as well as self-optimizing production systems [1]. In this context, machines are becoming cognitive and “intelligent,” and more and more robots are being added to the workforce. Human operators, however, remain fundamental and play a critical role, which has required re-organizing their training and tasks and demands further skills. A recent study of the global manufacturing industry shows that, over the last three years, most employers have planned to increase or maintain headcount as a result of automation [2]. In fact, I4.0 is not going to reduce the number of people in industry, but it will surely change their tasks and the skills required to cope with new intelligent machines and environments. Therefore, the new challenge is to integrate humans with advanced technological systems.

In recent years, attention to human factors (HF) in manufacturing has continuously increased: today, process design focuses not only on task planning and layout design, but also on how people work, considering both physical and psychological comfort, which have been proven to be strongly connected with manufacturing performance and, in turn, with factory productivity. Despite the increasing level of factory automation, humans are becoming even more important, since, when properly integrated into manufacturing processes, they increase automation performance, process quality, and overall company competitiveness, thanks to their inherent capacity to adapt to new operations without disrupting the production environment [3]. This is a key feature of flexible and advanced manufacturing and becomes mandatory in sectors with a high level of customization and production volumes that vary over relatively short intervals, which are characterized by a significant amount of manual human work. Consequently, the assessment of overall human performance in industrial systems is a fundamental engineering concern for successful process design. HF analysis makes it possible to consider the impact of different process features: task typology, environmental conditions, repetitiveness of tasks, handling of heavy loads, static and awkward postures, and every factor that can expose workers to ergonomic risks that adversely affect their performance.

Simulation is the means to obtain a comprehensive overview of future working scenarios during the design stages. In particular, VR makes it possible to create realistic and interactive virtual environments that immerse the user in a virtual world and to simulate his/her actions and reactions in order to test the user experience [4]. Nowadays, the gradual reduction of costs and the growing technological maturity and stability are extending VR applications to industrial contexts as well. Immersive virtual simulations can directly involve users as workers in a virtual factory in order to evaluate task feasibility and quality of interaction (in terms of usability, visibility, reachability, and perceived comfort). They can be used to improve both workstation design and task planning, reducing inefficiencies [5, 6]. This specific application of VR is usually known as virtual manufacturing (VM). Moreover, simulated human actions can be recorded to collect information and identify ergonomic issues. In contrast to traditional desktop-based digital simulations, the use of VR opens new scenarios, including the assessment of cognitive aspects such as human reliability, human error, and decision-making, up to the quality of the operational performance [7].

This paper describes a VR-based application to design assembly lines, considering both process efficiency and factory ergonomics in an immersive environment. It proposes a methodology to create a VM environment, where operators can be involved and assembly tasks simulated using a combination of VR tools. The aim is to replicate, or better, anticipate what will happen on the shop floor in order to optimize both product and process design. The VM system architecture includes the following hardware and software tools:

  • Unity 3D as the main VR engine, for generation of the virtual factory layout and interaction features;

  • HTC VIVE as head-mounted display (HMD), to immerse the user in the virtual world and replicate assembly tasks in a realistic way;

  • Xsens as tracking system, to track the user within the virtual scene during the simulation and to update the scene coherently;

  • Leap Motion for gesture recognition, to make the user work with their bare hands.

The research provides two main contributions to the field: it defines guidelines for the practical integration of HF analysis in industry through VM applications, and it analyzes the expected benefits by quantitatively comparing VM simulation with more traditional desktop-based digital simulations for design purposes, highlighting strengths and weaknesses after a 1-year validation on a set of industrial cases replicating different assembly tasks of complex machinery. The main novelty lies in the proposed simulation approach and the validation results. The study is aimed at both industrial practitioners and academic researchers.

2 Related works

Current practice in human performance analysis in manufacturing is based on engineering simulations for preventive analysis, including virtual layout visualization, task simulation by digital mock-ups, or the replication of human actions by digital human models (DHMs) [8]. These tools provide a quick, virtual representation of humans in a simulated working environment to identify ergonomic problems, and support the analysis of the biomechanical attributes of specific postures, as well as head view and reach envelopes for a reference population selected from dedicated libraries. Commercial software toolkits implement ergonomic assessment methods such as the NIOSH lifting equation, Rapid Upper Limb Assessment (RULA), or the Ovako Working posture Analysis System (OWAS). However, the results depend heavily on the expert’s assumptions and personal judgment, as well as on his/her knowledge of the tool, while the simulation setup is complex and time consuming. As a consequence, results obtained by digital simulations risk being inconsistent with the real users’ response.
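To give a concrete idea of the kind of index implemented by such toolkits, the following Python sketch reproduces a simplified version of the revised NIOSH lifting equation; the frequency and coupling multipliers, normally taken from the NIOSH tables, are collapsed to illustrative defaults, and all names are ours rather than those of any commercial software.

def niosh_rwl(h_cm, v_cm, d_cm, asym_deg, freq_mult=1.0, coup_mult=1.0):
    """Simplified revised NIOSH lifting equation.

    Returns the Recommended Weight Limit (kg) for a single lift.
    h_cm: horizontal distance of the load from the ankles (cm)
    v_cm: vertical height of the hands at the start of the lift (cm)
    d_cm: vertical travel distance of the load (cm)
    asym_deg: asymmetry (trunk twisting) angle in degrees
    freq_mult, coup_mult: frequency and coupling multipliers,
        normally read from the NIOSH tables (defaulted to 1.0 here).
    """
    LC = 23.0                                        # load constant (kg)
    HM = min(1.0, 25.0 / max(h_cm, 25.0))            # horizontal multiplier
    VM = 1.0 - 0.003 * abs(v_cm - 75.0)              # vertical multiplier
    DM = min(1.0, 0.82 + 4.5 / max(d_cm, 25.0))      # distance multiplier
    AM = 1.0 - 0.0032 * asym_deg                     # asymmetry multiplier
    return LC * HM * VM * DM * AM * freq_mult * coup_mult

def lifting_index(load_kg, rwl_kg):
    """LI > 1 indicates an increased risk of lifting-related injury."""
    return load_kg / rwl_kg

# Example: a 12 kg part lifted from 40 cm to 100 cm, held 35 cm from the body
rwl = niosh_rwl(h_cm=35, v_cm=40, d_cm=60, asym_deg=20)
print(f"RWL = {rwl:.1f} kg, LI = {lifting_index(12, rwl):.2f}")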

In order to overcome these limits, VR has emerged as a popular technology for human-computer interfaces, overcoming the shortcomings of 3D CAD models and providing more powerful and immersive simulations. The use of VR to enhance natural environments and to analyze human performance has been an active research topic for years. Over the last decade, VR has also been used for modelling specific types of manufacturing environments and creating computer-based simulations of the real manufacturing system to predict system-related operations. Today, the so-called “virtual factory” approach is generally accepted as a useful tool for facilitating the user’s access to and understanding of manufacturing processes and operations [9]. In this context, VR gives engineers and/or trainers the opportunity to play a pro-active role in simulating manufacturing processes, identifying the most critical aspects and defining the most appropriate optimizing actions. In addition, simulation environments are a cost-effective means of improving manufacturing productivity: they make it easy to define and validate the most appropriate system architecture for any given manufacturing system without actually implementing it in current manufacturing operations. VR simulations have been proven to greatly benefit human-centered design thanks to enhanced assessment of visibility, postural comfort, space, reachability, and use of tools [10]. Virtual environments can also be successfully adopted to support workstation design for different scopes [11]. As far as manufacturing processes are concerned, the term Virtual Manufacturing (VM) is now widely used to refer to a computer-based technology approach that uses virtualization to simulate manufacturing processes early in the design stage, in order to address manufacturing-related issues and optimize the critical operations and entities in a factory plant. In some cases, VR is a tool that offers visualization for VM. In particular, design-centered VM uses virtual manufacturing-based simulations to provide manufacturing information to the designer during the design phase, with the aim of optimizing both products and processes for a specific manufacturing goal (e.g., assemblability, quality, flexibility) and evaluating many production scenarios at many levels of fidelity to support decision-making [12].

The application of VM in industry has two main fields: system design and manufacturing ergonomics. Regarding system design, the design and development of automatic and flexible manufacturing systems require careful decision-making. In this context, VM can be used to match physical information and data model domains in order to define the most appropriate information system requirements for effective implementation. A set of rules and a knowledge base can be appended to the VM space to remove any inconsistency that could arise between material and information flows during the requirement analysis [13]. Moreover, HF can be integrated to focus on users’ knowledge acquisition and motivation, as proposed by Stadnicka et al. [14], to prepare operators for the changes brought by I4.0.

As far as factory ergonomics is concerned, VM aims at promoting ergonomics in manufacturing by reducing both the mental and the physical workload of workers [15]. Simulation allows predicting the workers’ behavior and optimizing their actions and results, preventing work-related musculoskeletal disorders (WRMSDs) and guaranteeing the ideal performance of the overall operational sequence. For instance, Caputo et al. [16] recently proposed a preventive ergonomic approach based on immersive VR, where real users are tracked and inserted into the virtual simulation to design the workspace. VR also helps avoid ergonomically poor processes, which result in low product quality and high costs for companies due to loss of productivity. However, the reliability of VM simulations is a key factor in this context, and assessment methods are needed to judge the quality of the simulation. A recent study [17] proposed a VR assembly assessment method (called VR2A) to evaluate the overall VR system performance, considering the impact of different factors such as visualization limitations, quality of rendering, and precision of tracking.

In this field, many applications have been developed to support manufacturing system design. Many of them focus on assembly and maintenance, demonstrating the competitive advantages of VR in these two areas. Abdulrahman et al. [18] described how to develop a VM assembly simulation system, from modeling the physical behaviors of the objects, to inserting them into the virtual prototype, detecting collisions, and enabling interaction with the user. Tang et al. [19] showed how providing virtual instructions can considerably reduce errors. Along the same lines, it has been demonstrated that task completion times were longer when 2D drawings were used to train operators to assemble water pumps before assembling the real product, in comparison with AR and VR training [20]. Ŝtefánik et al. [5] showed that VR tools could reduce the bottlenecks of the assembly process and improve the initial stages of product development. Additional advantages of VR in the design of assembly tasks were presented in several studies: reduction of data retrieval times and improvement of ergonomic behaviors in assembly [21], more comprehensive skill transfer and reduction of training time [22]. More recently, Simoes et al. [23] developed an infrastructure integrating VR and the Internet of Things (IoT) to support assembly tasks in hybrid human-machine manufacturing lines, reducing costs and improving skill transfer. Numfu et al. [24] focused on the application of VR in the design and evaluation of training processes related to maintenance activities, with the final aim of identifying the appropriate VR tools to create proper work instructions. In addition, Loiuson et al. [25] applied VR to assess accessibility in assembly scenarios: vibrotactile feedback is adopted to improve physical presence and interaction in the virtual environment in order to validate task feasibility and the quality of the operator’s postures. De Giorgio et al. [26] showed the potential of VR in human-robot collaboration, using game engines as simulation software to control AR/VR hardware and industrial machines.

The analysis of recent literature showed that nowadays VM has gained a good level of maturity, enabling its use for numerous industrial sectors and becoming a standard tool in the design of different production processes. Digital technologies are widely used and accepted to save costs and time, and obtain more accurate simulations.

The main advantages coming from the adoption of VR for VM are the following:

  • shortened time to market,

  • reduced number of physical prototypes,

  • better knowledge of the manufacturing process,

  • minimization of tooling and manufacturing cost,

  • optimization of product and production quality, and

  • improved ergonomics thanks to the creation of a “friendly” and safe context, where the operator can test the specific working sequence and identify possible critical points in terms of accessibility, reachability, visibility, or perceived comfort.

Such advantages, if properly exploited, can make the manufacturing industry more competitive at a global level, supporting the shift from mass production to flexible and responsive production. Despite these aspects, there are some limitations to the full integration of VM in industry, mainly: assuring data integrity, providing proper system training, and integrating with other company systems.

The most interesting papers for the research domain are classified according to the design goal, the type of simulation allowed, and the simulation target, as reported in Table 1.

Table 1 Cases of application of VR for VM in industry

3 Virtual manufacturing methodology

3.1 VM approach

With the introduction of digital human modelling tools, larger companies, such as those in the automotive and aerospace industries, began to integrate such tools into their design procedures, but they were often used only for verification and validation at the end of the design process, when the product or process was already completed and a modification would lead to an increase in production costs. Nowadays, HF analysis is carried out only by a minority of large companies. The standard approach for ergonomic analysis is based on the use of desktop DHM simulation software such as Siemens Jack, Dassault Systèmes Delmia, or Ramsis. Such an assessment is based on a static analysis of the task sequence, divided into sub-tasks, using a virtual mannequin, and on the consequent extrapolation of critical postures to be evaluated through internationally accepted and certified ergonomic checklists. Visibility and reachability assessments are also usually carried out. The current approach allows the definition of a limited number of critical postures deemed relevant by the expert performing the analysis, who poses the mannequin to replay the different postures in the virtual scene. Mannequin positioning is the most crucial phase because it depends on the expert’s knowledge of the process, is influenced by the operator’s experience, and can be affected by human error. Moreover, the traditional approach considers only “snapshots” of the most critical phases of the process and does not take the process dynamics into account, providing a partial evaluation.

The innovative VM approach presented in this paper was defined to overcome the critical aspects of the desktop-based analysis, as reported by Högberg et al. [30]. Indeed, the virtual environment proposed in [30] is desktop-based and involves virtual mannequins, while the new approach creates an immersive VR simulation that allows real-time analysis of the entire process with the involvement of real users in the virtual scenario. As a consequence, designers and engineers can carry out design validation either by themselves, acting as final users, or by directly involving sample users to focus on problems linked to users’ expertise and process knowledge. The sense of immersion created by the VM scenario makes it possible to analyze both product design (e.g., visibility and accessibility of specific parts, time for assembly or disassembly) and process features (e.g., task sequence, factory layout, material handling, and information availability), and it guarantees more precise simulations than DHM alone. In addition, tests can be performed by operators with any level of experience, involving different roles within a company, and the results can be analyzed together with how they are affected by the single user’s opinion and personal experience.

The new approach consists of creating the virtual scenarios by importing CAD models into Unity 3D and recreating the factory layout. Motion capture by Xsens allows tracking and recording the user’s movements, while gesture recognition by Leap Motion provides bare-hand input for an intuitive and natural interaction. The user can thus replicate (as accurately as possible) the assembly or maintenance procedure to be analyzed. Simultaneously, streaming into the DHM software can be activated to export the most critical postures for further post-processing.

3.2 Metrics for ergonomic assessment

The postures exported by the DHM software are evaluated through internationally recognized ergonomic indices to assess the risk of developing WRMSDs, according to the specific analysis to be performed.

The metrics chosen for users’ comfort assessment are as follows:

  1. OWAS (Ovako Working posture Analyzing System): a synthetic system for postural assessment suitable for different types of task execution. It considers the sequence of postures assumed by the worker and evaluates each posture according to the position of the back, arms, and legs, and to the weight lifted. Each posture is designated by a 4-digit code that classifies the current posture with respect to predefined levels of risk [31];

  2. REBA (Rapid Entire Body Assessment): it evaluates the comfort of a working posture in more detail, analyzing the position of the trunk, neck, legs, wrists, and upper and lower limbs, as well as the presence of loads. It uses a checklist in which each body part is carefully evaluated against a set of predefined limit positions. A score is assigned to each posture, indicating the urgency of changing the workstation layout in order to reduce the risk of potential damage to the operator [32];

  3. EAWS (European Assembly Work-Sheet): an ergonomic tool for measuring the workload of an assembly workstation under a specific work organization (including working times and breaks). The evaluation of working postures with the EAWS index considers many aspects, such as the geometry of the workload and workspace, the sequence of the operator’s movements, the working tools (if any), the loads, and the postures of the activity. It is a holistic system (full coverage of all risk areas) and provides detailed results in four sections: body postures, action forces, manual materials handling, and upper limbs [33].

Such methods are widely recognized, enable an objective identification and measurement of incongruous postures, and are directly linked to the workers’ well-being and, consequently, to the overall system performance and company productivity.
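As an example of how such indices operate on posture data, the Python sketch below derives an OWAS-style 4-digit posture code from a handful of joint angles. The angle thresholds and the leg classification are simplified illustrations of the method’s logic, not the official OWAS definitions.

from dataclasses import dataclass

@dataclass
class PostureFrame:
    """Joint angles (degrees) and load for one sampled posture."""
    trunk_flexion: float      # forward bending of the back
    trunk_rotation: float     # twisting of the back
    left_arm_elev: float      # upper-arm elevation, left
    right_arm_elev: float     # upper-arm elevation, right
    knee_flexion: float       # maximum knee flexion of the two legs
    sitting: bool
    walking: bool
    load_kg: float

def owas_code(p: PostureFrame) -> str:
    """Return a 4-digit OWAS-style code: back, arms, legs, load.

    Thresholds are illustrative; a production tool would follow the
    published OWAS classification exactly.
    """
    bent = p.trunk_flexion > 20
    twisted = abs(p.trunk_rotation) > 20
    back = 4 if (bent and twisted) else 3 if twisted else 2 if bent else 1

    above = [p.left_arm_elev > 90, p.right_arm_elev > 90]   # above shoulder level
    arms = 3 if all(above) else 2 if any(above) else 1

    if p.sitting:
        legs = 1
    elif p.walking:
        legs = 7
    elif p.knee_flexion > 60:
        legs = 4          # standing/squatting with both knees bent (simplified)
    else:
        legs = 2          # standing with straight legs

    load = 1 if p.load_kg < 10 else 2 if p.load_kg <= 20 else 3
    return f"{back}{arms}{legs}{load}"

# Example: bent and twisted trunk, one arm raised, light load
frame = PostureFrame(35, 25, 95, 40, 10, sitting=False, walking=False, load_kg=4)
print(owas_code(frame))   # prints "4221"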

3.3 The proposed methodology

The purpose of the new methodology is to create VM simulations to carry out reliable and effective factory ergonomics analyses during the design stages. It uses several VR technologies to create immersive virtual simulations, integrating motion capture and gesture recognition to replicate a faithful user experience within a realistic factory layout. This allows reliable human simulations and detailed process analysis at the early stage of product development to prevent design errors.

The VM simulation involves the use of the following tools:

  • Unity3D: a complete development environment for the creation of interactive 3D content; it provides everything needed for virtual scene design, including the game engine and a development interface;

  • Siemens Tecnomatix® Jack: a DHM tool that allows replicating the operators’ activities through virtual mannequins. Activities focused on human operators can be analyzed with realistic and scalable human models (according to various characteristics of the population) and different ergonomic indices;

  • Xsens®: a full-body human motion capture system that uses inertial MEMS (Micro Electro-Mechanical Systems) sensors, highly accurate in detecting orientation, with magnetic-distortion rejection filters that make them particularly accurate and robust;

  • Leap Motion: a Human-Computer Interaction (HCI) system focused on hand tracking (including finger positions), based on the recognition of human gestures through mathematical algorithms;

  • HTC Vive Pro Eye: a wearable HMD equipped with two Fresnel lenses, not completely circular, to regulate the IPD (Interpupillary Distance). The HMD includes 32 infrared sensors for 360-degree tracking, a gyroscope, an accelerometer, and a laser position sensor, making it a tracker with 6 DOF. It is also equipped with an integrated eye tracking system for eye data collection. Steam VR is used for HTC Vive control.

The immersive simulation set-up relies on the following software architecture: Unity 3D, Jack, Xsens MVN Analyze, Leap Motion Controller, and Steam VR are installed on the same workstation. During the simulation, the Xsens software streams into Unity 3D in order to virtually carry out the procedure to be analyzed, while, at the same time, streaming into Jack takes place for posture export and the consequent evaluation of the ergonomic indices. From the hardware viewpoint, the MVN Awinda version of Xsens uses a set of 17 inertial sensors for full-body human tracking, worn by the operator on specific body segments through adjustable straps. The Leap Motion sensor is placed on the center of the HTC Vive with a specific support. In order to delimit a physical perimeter that is transformed into a virtual space in which the user can move freely, the Vive kit is completed with two base stations and two controllers, in addition to the headset. The two base stations are positioned diametrically opposite each other, in order to guarantee 360-degree tracking of the HMD and controllers.
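To illustrate the data flow of this architecture (and not the proprietary MVN or SteamVR protocols), the following Python sketch shows how a recorder process could buffer time-stamped joint-angle frames arriving over UDP. It assumes the motion-capture stream has been converted to simple JSON datagrams by an intermediate relay; the port number and field names are hypothetical.

import json
import socket
import time

# Hypothetical relay settings: the actual Xsens MVN network streamer uses a
# binary protocol; here we assume an intermediate script re-publishes frames
# as JSON datagrams of the form {"t": ..., "joints": {"trunk_flexion": ...}}.
HOST, PORT = "127.0.0.1", 9763

def record_session(duration_s: float = 10.0):
    """Collect motion-capture frames for later ergonomic post-processing."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))
    sock.settimeout(0.5)

    frames = []
    t_end = time.time() + duration_s
    while time.time() < t_end:
        try:
            data, _ = sock.recvfrom(65535)
        except socket.timeout:
            continue                      # no frame in this interval, keep waiting
        frame = json.loads(data.decode("utf-8"))
        frames.append(frame)              # keep the full time series, not snapshots
    sock.close()
    return frames

if __name__ == "__main__":
    session = record_session(duration_s=5.0)
    print(f"Recorded {len(session)} frames")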

The process to create the VM simulation requires several phases, listed below:

  • Phase 1. Creation of the virtual scene in Unity3D: import of the 3D CAD models to recreate the workstation layout, identification of “movable” objects and “interaction” paths to be used in the simulation, and assignment of Components to these movable objects in order to define the behavior of each GameObject in the scene;

  • Phase 2. Setup of the Leap Motion controller: import of the preset Leap Motion packages so that the real user can interact with the virtual scene using bare hands;

  • Phase 3. Setup of Xsens motion capture: the user wears the Xsens sensors and the preset Xsens packages are imported in order to choose the most suitable user avatar for full-body interaction, followed by system calibration;

  • Phase 4. Creation of specific scripts to merge the two points of view;

  • Phase 5. Simulation: the user wears the HMD and performs the task sequence immersed in the virtual scene. At the same time, the motion capture system records the user’s movements while streaming into Unity3D;

  • Phase 6. Export to Jack: after the simulation, the most critical operations and postures are isolated and the corresponding posture files are exported by streaming the Xsens data into the Jack software;

  • Phase 7. Ergonomic evaluation: the posture files from Jack are post-processed in order to obtain the abovementioned metrics for ergonomic analysis.

Figure 1 shows the workflow to create the VM simulation, with relation to the abovementioned phases. Figure 2 represents how the selected tools are used in practice.

Fig. 1 The VM simulation workflow according to the proposed approach

Fig. 2 The VM hardware set-up
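As a rough illustration of Phases 6 and 7, the Python sketch below scores each recorded frame with a simple posture-risk function and keeps only the most critical frames for export and detailed ergonomic evaluation. The risk weights and the selection criterion are illustrative assumptions, not the logic embedded in Jack or Xsens.

from typing import Callable, Dict, List

Frame = Dict[str, float]   # joint angles (degrees) keyed by joint name

def simple_posture_risk(frame: Frame) -> float:
    """Illustrative scalar risk score: weighted sum of trunk, neck, and arm
    deviations from neutral. The weights are hypothetical, not an official index."""
    return (1.5 * abs(frame.get("trunk_flexion", 0.0))
            + 1.0 * abs(frame.get("neck_flexion", 0.0))
            + 0.5 * max(frame.get("left_arm_elev", 0.0),
                        frame.get("right_arm_elev", 0.0)))

def select_critical_frames(frames: List[Frame],
                           risk_fn: Callable[[Frame], float],
                           top_k: int = 20) -> List[Frame]:
    """Keep the top_k highest-risk frames, preserved in chronological order,
    so that only these postures are exported for detailed evaluation."""
    ranked = sorted(range(len(frames)), key=lambda i: risk_fn(frames[i]), reverse=True)
    keep = sorted(ranked[:top_k])                  # restore chronological order
    return [frames[i] for i in keep]

# Example with three synthetic frames
frames = [
    {"trunk_flexion": 5, "neck_flexion": 5, "left_arm_elev": 30, "right_arm_elev": 20},
    {"trunk_flexion": 40, "neck_flexion": 25, "left_arm_elev": 95, "right_arm_elev": 80},
    {"trunk_flexion": 15, "neck_flexion": 10, "left_arm_elev": 50, "right_arm_elev": 45},
]
critical = select_critical_frames(frames, simple_posture_risk, top_k=1)
print(critical)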

4 Validation on industrial cases

The industrial case studies have been developed in collaboration with CNH Industrial, a global leader in the design and manufacturing of agricultural machines, buses, and trucks that employs more than 64,000 people in 66 manufacturing plants and 54 R&D centers across 180 countries. In particular, the collaboration was developed within the San Matteo plant, located in Modena, Italy. It is the most relevant R&D unit in the field of tractors in Europe, using the most advanced technologies for design and engineering purposes. Thanks to their support, a variety of case studies have been analyzed; the results of the simulations are presented in the next section.

4.1 Use cases

The proposed VM methodology has been validated on ten use cases. In particular, this paper focuses on the two most relevant cases, in order to provide an example of the adoption of the proposed method and to synthetically summarize how the results obtained have been used. Describing all ten cases would risk being repetitive and overly long; conversely, two cases can better demonstrate the method’s flexibility across different application scenarios. The two selected cases concern different assembly activities, well representing the variety of assembly tasks:

  • Use case no. 1: assembly of the SCR (Selective Catalytic Reduction) on medium-size tractors;

  • Use case no. 2: the mounting of electric cables and brake pipes on small-size tractors with a ROPS (Roll Over Protection System) arc.

The first use case refers to the assembly of the SCR (Selective Catalytic Reduction) system (Fig. 3). The task sequence of this assembly consists of 20 steps, detailed below:

  • Step 1: Insert the alignment gauge in order to lock the SCR orientation with respect to the SCR support;

    Fig. 3 SCR system location in medium-size tractors (left) and brake pipes and electric cables assembly on small-size tractors (right)

  • Step 2–3: Rotate the left and right wheels to lock the SCR position;

  • Step 4–9: Insert the three bushings into the holes of the main support bracket (LH side), tighten the bushings with an Allen key, and torque them to 0.5 kg·m with a torque wrench;

  • Step 10: Pick the SCR mounting bracket and fix it with two washers and bolts;

  • Step 11: Assemble the SCR pipe support bracket to the main support bracket with bolts and washers and tighten the bolts with a socket or pneumatic tool;

  • Step 12: Assemble the LH side panel bracket assembly to the hinge mounting bracket with two bolts, then tighten the bolts with a socket or pneumatic gun;

  • Step 13–14: Assemble the hoses top support bracket assembly to the hinge mounting bracket with two flange bolts, then tighten the bolts with a socket or pneumatic gun;

  • Step 15–16: Assemble the ATS sensor bracket assembly to the hinge mounting bracket with two flange bolts, then tighten the bolts with a socket or pneumatic gun and torque them to 27–36.5 Nm;

  • Step 17: Assemble the hoses support bracket assembly to the hinge mounting bracket with two bolts and washers;

  • Step 18–19: Assemble the support bracket (gas strut holder) to the main support bracket with bolts and washers, then tighten the bolts with a socket or pneumatic gun;

  • Step 20: Pick the ECU mounting isolator rubber and the isolator rubber, insert the rubber into the ECU isolator, then tighten both nuts with a socket or battery gun and torque them to 1.1–1.5 kg·m with a torque wrench.

According to the selected ergonomic metrics, the analysis highlighted that the operator is constantly bent forward, in an awkward and risky position, twisted and leaning over the SCR support throughout the assembly phase (Fig. 4). Figure 5 shows an example of a critical task. Therefore, the need emerged to design a higher SCR assembly support and to re-design some tasks of the assembly sequence.

Fig. 4 Use case no.1 (SCR assembly): task no.1 (top) and task no.20 (bottom)

Fig. 5 Example of critical task in use case no.1 (SCR assembly)

The second use case deals with the mounting of electric cables and brake pipes on tractors with a ROPS arc; it is a particularly difficult operation because the operator must constantly work under the cabin, with his arms in uncomfortable postures and his neck bent. The task sequence of this assembly consists of 21 steps, detailed below:

  • Step 1: Screw the ring on the frame using a screwdriver;

  • Step 2–3: Insert four cables in the frame hatch and the front cable on the firewall hatch;

  • Step 4–5: Fix the steering column and the fuse box;

  • Step 6: Fix the hatch frame cover;

  • Step 7: Fix the cables on the dashboard with cable ties;

  • Step 8: Assemble the LOM lever and Bowden cables;

  • Step 9: Lift the cabin frame;

  • Step 10: Insert the Bowden cable into the frame hatch;

  • Step 11–12: Assemble the gas pedal (manual screwing of the bolts and final tightening with a screwdriver);

  • Step 13–14: Insert and fix two brake pipes in the frame hatch;

  • Step 15–16: Mount the electric cables;

  • Step 17: Mount the LOM Bowden cable;

  • Step 18–20: Mount the cables on the right and left sides of the ROPS arc;

  • Step 21: Fix the electric transmission cable with connectors.

In this use case, two different assembly sequences were compared using two alternative cabin supports. The current assembly procedure is carried out using a cabin lifter, and the operator is forced to work under the cabin on a sliding chair. The new procedure has been optimized by using a cabin tilter, able to rotate the cabin by 90°, allowing the operator to work in a standing position with his arms in a more comfortable position (below shoulder height), unlike the current procedure. Figure 6 shows the current and the optimized activity.

Fig. 6 The real physical operator assembling cables under the cabin using a cabin lifter (left) and the virtual optimized assembly supported by the cabin tilter (right) for use case no.2

Table 2 presents the comparison between the two configurations, using the cabin lifter (current) and the tilter (optimized), respectively. The results showed that the VM analysis can faithfully replicate the real users’ actions and highlight how the use of the tilter or the lifter impacts user comfort. As a result, the use of the tilter assures a more comfortable assembly of the electric cables and brake pipes on the line. In detail, the analysis showed that the position of the arms is less risky when the cabin is rotated; the postures assumed by the user are less stressful, reducing the risk of musculoskeletal disorders. This result is clearly summarized by the final EAWS index.

Table 2 Comparison between the two support tools for use case no.2 (cabin lifter vs cabin tilter). Colors refer to the ergonomic risk related to each task (green = no risk, yellow = low risk, red = high risk)

4.2 Comparison with digital manufacturing methods

The two methodologies (desktop-based and VM) were compared according to a set of Performance Indicators (PIs), measured during simulation creation and user testing, namely:

  • Time for virtual scenario creation (hours);

  • Time for task sequence preparation (hours);

  • Time for task simulation (with virtual mannequins or real users) (hours);

  • Time for data post-processing (hours);

  • Time to perform the ergonomic analysis by the selected metrics (hours);

  • Time to adjust the postures to achieve the necessary accuracy (hours);

  • Accuracy of the ergonomic analysis as compared with on-field assessment, according to a set of metrics related to the most suitable methods for postural ergonomic assessment in assembly tasks. Selected metrics are:

    • OWAS, for a synthetic assessment of the postures assumed by the worker [31];

    • REBA, for a more detailed analysis of the postures assumed by the worker [32];

    • EAWS, for a global assessment of the work cycle considering the specific work organization, times, and breaks [33].
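The paper does not prescribe a formula for the last indicator; as one simple way to express how close a simulation comes to the on-field assessment, the Python sketch below computes, for each metric, an accuracy value as 100% minus the relative deviation of the simulated score from the on-field score. The index values in the usage example are hypothetical placeholders, not the study’s results.

from typing import Dict

def accuracy_vs_field(sim_scores: Dict[str, float],
                      field_scores: Dict[str, float]) -> Dict[str, float]:
    """Illustrative accuracy measure per metric: 100% minus the relative
    deviation of the simulated score from the on-field reference score."""
    accuracy = {}
    for metric, field_value in field_scores.items():
        sim_value = sim_scores[metric]
        deviation = abs(sim_value - field_value) / field_value
        accuracy[metric] = 100.0 * (1.0 - deviation)
    return accuracy

# Hypothetical index values, used only to show how the function is called
field = {"OWAS": 3.0, "REBA": 9.0, "EAWS": 42.0}
vm_sim = {"OWAS": 3.0, "REBA": 8.5, "EAWS": 40.0}
print(accuracy_vs_field(vm_sim, field))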

For use case no.1 (SCR assembly), the creation of the virtual environment with the VM approach provided an ergonomic evaluation of the entire task sequence with the REBA, OWAS, and EAWS metrics on 20 postures extracted as critical from the entire assembly sequence, by streaming Xsens into the Jack environment. On the other hand, these 20 postures were manually recreated in Jack in order to compare the VM ergonomic analysis with the desktop-based DHM ergonomic analysis. Table 3 synthesizes the comparison between the desktop-based and VM methodologies for the selected PIs. In conclusion, the VM approach requires a longer preparation phase (+30% on average), but it drastically reduces the time necessary for task simulation, data post-processing, and performing the ergonomic analysis. The preparation effort is therefore reasonably justified for the simulation of complex sequences (more than 10 steps), while for simpler task sequences (4–5 steps) DHM is the better option. However, the process knowledge of the expert executing the digital simulation is crucial to obtain reliable results.

Table 3 Comparison between the two methodologies (Desktop-based and VM) for use case no.1

The on-field assessment was carried out by replicating the assembly workstation in the laboratory and involving students in the simulation after ad-hoc training. Five students were involved after a 30-min training session on the assembly procedure. The ergonomic analyses were carried out by an HF expert in the traditional way, observing users at work, video-recording them, and measuring the risks by adopting the checklists and interviewing the participants. Table 4 compares the results obtained with the two simulation approaches and the on-field assessment, considering the selected metrics.

Table 4 Ergonomic analysis accuracy between VM and desktop-based approaches for use case no.1

The experimental results highlighted that the data obtained in the VM simulations are closer to the on-field assessment, providing a more realistic simulation. The variation between the desktop-based and VM methods is mainly due to the limited process knowledge of the expert performing the desktop-based analysis and the lack of specific rules for positioning the mannequins during task simulation, which can produce unrealistic postures. The VM approach is able to overcome these issues by streaming the real position of the user. Finally, it is worth considering that such a comparison is limited to the selected metrics. However, the combination of the three metrics, each with a different ergonomic scope, can guarantee a well-balanced evaluation.

5 Comparison with the mainstream research and discussion

The proposed approach has been compared with the mainstream research in the field of VR-based simulation for human-centric manufacturing. The comparison included the following items:

  • XR dimension, distinguishing between VR and AR (Augmented Reality);

  • Technologies used, considering hardware (HW) and software (SW) tools;

  • Applications proposed;

  • Level of interaction achieved by the simulation, considering the different senses involved (e.g., vision, touch, motion) and the depth of the analysis (e.g., type of immersion);

  • Level of realism, evaluated by the resemblance to reality, according to the classification proposed by Numfu et al. [24] based on 4 levels (1 = not perceived, 2 = a bit, 3 = close to reality, 4 = equal to reality);

  • Type of analyzed data, synthesizing the type of data that can be analyzed from the simulation.

Results of the comparison are described in Table 5.

Table 5 Comparison between different VM-based approaches

From the comparison, it can be stated that the presented work adopts a similar approach with respect to the mainstream literature in the field of VR for manufacturing. It uses a widely tested software toolkit as VR engine (i.e., Unity3D), also adopted by other studies. The level of realism and interaction generated is high, as achieved only by a few recent studies. However, it contains some interesting novelties:

  • The variety of applications, from workstation design to assembly assessment, ergonomics analysis, and also training. This aspect highlights the flexibility of the proposed approach and the possibility of re-using scenarios and environments for multiple scopes, which is an important aspect for companies;

  • The holistic perspective and the richness of the interaction provided, stimulating the senses of vision, hearing, motion, and touch in a synchronized way. In particular, it provides a total immersion of the user in the virtual world, unlike other proposed approaches, which are limited to fewer aspects or consider them separately, relying mainly on only one or two coordinated VR devices;

  • The variety of the analyzed data, merging multi-parameter ergonomics analysis based on several indices, performance indicators, and also eye data useful for the analysis of the users’ workload;

  • The integration between the Tecnomatix Jack software and Xsens motion capture in order to overcome the limitations of static ergonomics evaluation, as highlighted by Högberg et al. [30], and to carry out a complex and dynamic assessment during task execution;

  • The higher precision of the human-related data with respect to the majority of the other approaches, thanks to the adoption of the Xsens motion capture system and the subsequent data post-processing. The proposed approach allowed detecting ergonomic issues very precisely and investigating interaction problems more accurately. This higher level of accuracy surpasses that of the traditional desktop-based solution, at least for the complex cases that are usually analyzed through digital simulation;

  • The ease of integration with company systems, since the VR simulation is synchronized with a commercial digital human simulation tool (Tecnomatix Jack) that is widespread within companies. This simplifies the introduction within company departments and the integration with existing tools. This aspect does not emerge from Table 5 but is of vital importance in judging the quality of the approach;

  • High flexibility, since the proposed ergonomic analysis can be adapted to several company requests, matching different standard evaluation metrics.

The proposed approach also has some limitations, related to the complexity of the system set-up, the consequent integration effort needed to properly synchronize the different devices, and the related costs. Moreover, the set-up could be enhanced with tactile feedback to improve the level of interaction and the sense of realism; however, this would further increase the complexity. As for the applications, the examined case study validated the adoption of the proposed approach to support workstation design, but it did not explore its application to operator training, which would promote “learning by doing” and the build-up of skills thanks to the introduction of additional contents in the virtual scenes, also in AR modality.

6 Conclusions

The paper investigated the application of VR to manufacturing to support the design of human-centric workstations, considering both process efficiency and factory ergonomics. In particular, it proposed a methodology to create an immersive and interactive VM environment, where operators can be directly involved, using a proper combination of VR tools. The aim is to replicate and anticipate the real conditions during the design stages in order to optimize both product and process design. A VM procedure has been defined to create an immersive virtual simulation using Unity 3D as the main VR engine, an HMD to immerse the user in the virtual world, a motion capture system to track the user within the scene, and a gesture recognition system to let the user work with bare hands. The VM procedure was validated on industrial cases focusing on the design of assembly lines; the results are synthesized by presenting the two most representative cases about tractor assembly, of average complexity (20–21 tasks). The VM procedure was then compared with desktop-based digital simulations and with the more traditional ergonomic assessment based on direct expert observation. The comparison between VM simulation and traditional desktop-based digital simulations highlighted their strengths and weaknesses after a 1-year validation on the two cases. The results showed that the new VM-based methodology is more powerful in predicting process criticalities, thanks to the direct user feedback while simulating the specific tasks, and allows a more precise ergonomic analysis, thanks to the real-time detection of human body joint angles. Such results supported the optimization of the workstation layout design; at the same time, the approach is less sensitive to errors in the ergonomic assessment related to the expert’s subjectivity during the analysis. On the other hand, the creation of the Unity3D simulation requires a bigger effort, which pays off for complex and long sequence simulations, but not for simple use cases with few postures to analyze. Vice versa, the application of the desktop-based digital methodology with mannequins is extremely time-consuming for long cases, with a great number of postures to replicate.

Finally, the novelties of the proposed approach have been highlighted in comparison with the mainstream research in VR-based simulation for manufacturing system design. Future works will focus on the addition of multi-sensory stimulation and on validating the proposed approach on other application cases, concerning the training of operators and the support to maintenance service.