
1 Introduction

This research proposes a conceptual framework of a human-robot collaboration (HRC) system for a semi-automated electric motor production process. The process consists of automated machines, a handling robot, and a human worker. The automated machines conduct processing operations such as cut-off, drilling, and facing, while the robot assists the human worker in material handling. The robot picks up parts from each automated machine, places them on pallets, and then transfers the partially processed parts to the assembly section, where assembly is performed by a human worker. In current practice, there is no exchange of part information between the robot and the human worker. The human worker does not receive information about the type of parts picked up by the robot in advance, i.e., before the parts arrive at the assembly section, which is especially problematic during changeovers of the part model. Hence, delays occur because the human worker must measure and identify each part.

In this paper, machine vision technology is applied to identify parts in the collaboration process. Once image processing through machine vision is completed, accurate work order and part information corresponding to the captured image must be defined. We define the work order and part information using Part-flow based Manufacturing Process Modeling (PMPM), which can express the process flow and the part flow at the same time [1]. To transmit the tracked data to the worker immediately, an augmented reality (AR) device is used to deliver work order information and product specification information. This study proposes a conceptual framework of an HRC system using a process model and an AR interface.

2 Literature Review

2.1 Human-Robot Collaboration in Assembly Operations

Human-robot collaboration is one of the new trends in the field of industrial robots and a part of the Industry 4.0 strategy. Full automation is limited in meeting production demand because of high process complexity and the equipment flexibility required for rapid response. Hence, HRC, whose advantage is maximizing flexibility in production, has become one of the important trends in current industrial applications. Human-robot collaboration is a system in which a robot and a human work on the same task, combining the benefits of both in a joint task [2]. An HRC system plays an important role in increasing flexibility to handle a wide variety of products and the frequent equipment changeovers caused by fluctuating demand [3]. In an HRC workstation, the collaborative robot must be able to sense the presence of the human, i.e., it must have an appropriate safety mechanism that allows it to collaborate with humans without a safety cage.

Machine Vision for Assembly.

A machine vision system consists of an image capture device, lighting, and a computer with image processing software [4]. Machine vision systems enable flexibility in automation by giving machines the capability to “see” and think. In the existing research, the main application of machine vision systems is quality monitoring in a wide range of industries, such as electromechanical parts [5], textile manufacturing [6], pharmaceuticals [7], food packaging [8], and steelmaking [9]. Teck et al. [10] integrated a machine vision system into automation systems to provide product data that supports the decision making of the production system. Gao et al. [11] developed a machine vision system to track and calibrate the coordinates of fast-moving battery lids in an automated sealing ring assembly system. Applying machine vision together with robots in assembly systems increased efficiency and the robots' adaptability to environmental changes, and enabled real-time adjustment of robot motion.

AR for Assembly.

AR is a technology derived from the field of virtual reality (VR); it refers to a computer graphics technique that superimposes virtual 3D content on the user's view of the real world [12]. Since AR models and matches only the required virtual objects on top of real-time images of the actual manufacturing environment, the cost and time of constructing the 3D models needed to implement a new manufacturing system can be drastically reduced.

Kollatsch et al. [13] discussed an AR-based application for mobile devices that realizes a user-friendly, problem-oriented visualization of information. However, that work is limited to proposing an AR-based concept for visualizing process values on mobile devices. Wang et al. [14] presented a novel human-cognition-based interactive AR assembly guidance system and investigated how AR can provide various modalities of guidance to assembly operators during the different phases of the user's cognitive process in assembly tasks. In addition, Danielsson et al. [15] implemented an environment in which an untrained worker uses AR in a human-robot collaboration setting. In that study, however, various assembly errors occurred because the test persons misunderstood the instructions.

Although many studies have been conducted to exploit the strengths of AR technology in manufacturing system implementation, AR systems that can be properly utilized in manufacturing are still lacking.

2.2 Part-Flow Based Manufacturing Process Modeling: PMPM

PMPM is a modeling tool for defining and visualizing the man (worker), machine, material (parts), and method (activity) of a collaborative manufacturing process. It can clearly define the order of the manufacturing operations that make up the process and the objects that execute them. The method consists of 6 notations for expressing activities and 10 notations for expressing parts. Using the part notations, it can record part history and manage part status, and it can define the collaborative activities in each process. The flow of parts and the flow of work can be defined identically or independently. Changes in part characteristics, such as changes in the number of management units or the merging of parts, can also be defined, and the manufacturing facilities used can be specified unambiguously. PMPM is a manufacturing-process-oriented modeling methodology. In this paper, PMPM is used to store process information and track part information.
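
As a rough illustration only (the field names below are our own and are not part of the PMPM notation set, which comprises 6 activity notations and 10 part notations), a PMPM-style activity and part record could be sketched in MATLAB as follows:

% Minimal sketch of PMPM-style records (hypothetical field names).
activity.id        = 'A3';
activity.kind      = 'collaborative';        % automated / robot / collaborative
activity.performer = {'robot','worker'};     % objects executing the activity
activity.inputs    = struct('part',{'cap','case'},'qty',{1,1});
activity.outputs   = struct('part',{'subassembly'},'qty',{1});

part.id      = 'CAP-0042';                   % hypothetical part identifier
part.model   = 'cap_typeA';
part.status  = 'in_transit';                 % managed part status
part.history = {'forming','machining'};      % recorded operation history, in order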

3 Process Model Based HRC System

3.1 System Framework

Figure 1 shows the data flow diagram of the proposed process-model-based HRC system. The electric motor manufacturing process consists of four automated workstations performing coil forming and cap machining, followed by assembly of the electric motor by a human worker. Two material handling methods are used in the process: (1) conveyors transfer parts between the automated workstations; (2) a robot collects the partially completed components from these workstations on a pallet and transfers them to the assembly workstation. The PMPM methodology is used to build the digital model of this process; the model stores information about the work orders, the part flows, and the objects performing each task. When a part arrives at the assembly process, the operator recognizes it through the AR device. In this paper, machine vision is integrated into the AR interface to identify the parts in the assembly model. After a part is identified, the corresponding work order, such as the assembly sequence, the parts required, and the production quantity, is extracted from the PMPM model and transmitted to the AR interface. The PMPM model also simulates the start time of each workstation so as to minimize the production cycle time. Coordinating the processing times of the workstations with the robot's material handling times, i.e., the pick-up time at each automated workstation and the delivery time to the human assembly workstation, is important to ensure smooth flow and to avoid part stacking and idle time at every station, especially the human assembly workstation. Hence, the real-time production status of the human assembly workstation is sent to the robot via the AR interface to “call” for parts or to “stop” the input of parts. With the updated finished-part quantities, this method can also reduce overproduction at the automated workstations and part shortages.

Fig. 1. Data flow of the system framework.
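
The paper does not specify how the “call” and “stop” signals are generated; one minimal sketch, assuming a simple buffer threshold at the assembly workstation, is:

% Hypothetical 'call'/'stop' signaling from the assembly workstation to the
% robot; the buffer-threshold logic and values are assumed, not from the paper.
function signal = assemblySignal(partsInBuffer, maxBuffer)
    if partsInBuffer == 0
        signal = 'call';      % station about to starve: request parts
    elseif partsInBuffer >= maxBuffer
        signal = 'stop';      % stop the input of parts to avoid stacking
    else
        signal = 'none';      % flow is balanced; no action needed
    end
end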

3.2 Implementation

PMPM Modeling.

The first step is to model the electric motor assembly process using PMPM. Figure 2 shows the resulting model of the target process. Rectangles are tasks performed by automated machines; rounded rectangles and octagons are collaborative tasks of the robot and the worker. As shown in Fig. 2, the types and quantities of the input and output parts are defined in the model. The model therefore serves as the main data engine of the HRC system and is used to track part information and to verify image processing results. Through the PMPM model, the human worker who performs the assembly is updated in real time, via the AR device, with the part model and quantity produced at each automated workstation. Hence, the PMPM model acts as the communication interface between the automated workstations and the human worker who performs the assembly.

Fig. 2. PMPM model for electric motors.
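
Continuing the hypothetical struct representation sketched in Sect. 2.2 (again, not the authors' implementation), the retrieval of a work order for an identified part could look like:

% Hypothetical lookup: given an identified part model, return the matching
% work order (assembly step, required co-parts, quantity) from the model.
function order = findWorkOrder(activities, partModel)
    order = [];
    for k = 1:numel(activities)
        inputs = activities(k).inputs;
        if any(strcmp({inputs.part}, partModel))
            order.activityId = activities(k).id;
            order.sequence   = k;                 % position in the process flow
            order.required   = {inputs.part};     % parts needed at this step
            order.quantity   = activities(k).outputs(1).qty;
            return;                               % first matching step
        end
    end
end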

Image Processing.

The electric motor models are differentiated according to the size of the case and the size of the cap. The dimensions of the cap and case are shown in Table 1. Since the outer shapes of the parts are similar, there is a possibility that the human worker assembles the wrong combination. Measuring with a Vernier caliper before every assembly operation reduces the productivity of the operation. Hence, part measurement using a machine vision system is proposed to increase the accuracy of part identification and the productivity of the assembly process. To achieve this, the vision system must be capable of acquiring images for part detection that are as close as possible to human vision [16]. The image processing steps involved in identifying the cap and measuring its dimensions when it arrives at the assembly section are shown in Fig. 3. First, the image is captured by the AR interface and sent to MATLAB, where processing is performed with the Image Processing Toolbox. The processing starts by converting the acquired image to grayscale. The image is then converted to a binary image for edge detection using the Canny method. Since the cap is round, the 'imfindcircles' function is used to identify the shape within a defined radius range. Finally, the identified cap is marked and its radius is measured in pixels (see Fig. 4). The algorithm needs further improvement to remove shadows from the captured image and to provide measurements in centimeters.

Table 1. Types of raw materials and their measurements for processing.

Fig. 3. Image processing to identify and measure the cap.

Fig. 4. Visualization of the image processing steps.
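
A minimal MATLAB sketch of this pipeline is given below; the input file name and the radius range are placeholders, since the paper does not report the parameter values used:

% Sketch of the cap identification steps (Image Processing Toolbox).
img   = imread('cap_frame.png');       % frame captured via the AR interface
gray  = rgb2gray(img);                 % 1) convert to grayscale
bw    = imbinarize(gray);              % 2) convert to a binary image
edges = edge(bw, 'canny');             % 3) Canny edge detection

% 4) find the round cap within an assumed radius range (in pixels)
[centers, radii] = imfindcircles(edges, [40 120]);

% 5) mark the identified cap and report its radius in pixels
imshow(img); hold on;
viscircles(centers, radii, 'EdgeColor', 'r');
if ~isempty(radii)
    fprintf('Cap radius: %.1f px\n', radii(1));
end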

AR Interface Development.

In this study, an app development environment is built. Based on Android, an app is developed using Vuforia and Unity, with an image target and a virtual button that are linked to a tracker through camera rendering. The developed application visualizes actual factory data and pre-analyzed simulation data to the manager, and presents them to the human worker in real time through the AR device. Managers can make manufacturability decisions and verify production plans through pre-simulation. In addition, an environment for real-time log data preprocessing and data exchange can be constructed by linking log data from the manufacturing facilities with the MES.

To build the AR interface, we used the Vuforia module within the Unity engine. Unity is a virtual/augmented reality production tool with the advantage of creating mobile AR programs for both iOS and Android. This is expected to be effective in manufacturing settings where various kinds of AR equipment are combined. The user can monitor real-time information about the identified part; this information is predefined and can be freely configured according to the characteristics of the operator and of each part.

Figure 5 shows the pilot program screen of the AR interface. Based on the part information verified by image processing, it provides the operator with the assembly recipes and process information stored in the database. The developed AR application allows process values to be stored and analyzed so that process-related information can be shown to the operator or administrator. Thus, the user does not need extensive knowledge of the machine and its control devices, and the production process is not disturbed. The collected data can be displayed to the user in real time or evaluated later. Process data are stored in the database and can be retrieved by other applications. A traditional monitoring system gives operators only simple signals on a designated board, whereas the AR-based interface provides the operator with the real-time status of the plant and makes the work more efficient. The operator can virtually overlay plant parameters and process states at the actual locations of sensors, equipment, and actuators, and can visualize information about areas that are normally inaccessible or dangerous.

Fig. 5. AR interface pilot program.
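
The message format between the process model and the AR application is not specified in the paper; as one hedged illustration, the verified part information and extracted work order could be serialized as JSON before being sent to the AR device:

% Hypothetical payload for the AR application (format assumed, not from the
% paper); jsonencode turns the struct into a JSON string for transmission.
msg.partId    = 'CAP-0042';
msg.model     = 'cap_typeA';
msg.radius_px = 87.3;                        % measured by image processing
msg.workOrder = struct('sequence', 3, ...
                       'required', {{'cap','case'}}, ...
                       'quantity', 50);
msg.station   = 'call';                      % current 'call'/'stop' signal
payload = jsonencode(msg);                   % string sent to the AR interface
disp(payload);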

4 Conclusion

In this paper, we proposed a process-model-based HRC system using an AR interface. We defined the work order, part flow, and collaboration information of the electric motor assembly process using part-flow based manufacturing process modeling (PMPM). The PMPM model acts as an interface between the human worker, the automated processing machines, and the material handling robot through the identification of parts and their corresponding work sequences. The integration of AR devices and image processing is needed to capture part information efficiently: image processing extracts the features of a component from the image captured by the AR device and feeds the part identification result back to the human worker through the AR device. The proposed system is expected to reduce human error in part model identification and work sequencing. Furthermore, the production flexibility needed to cope with frequent model changeovers can be achieved. The main contributions of this framework are data management using a process model and the use of image processing to improve part recognition in AR. However, since this study is at an early stage, the proposed framework still needs to be verified, and practical problems need to be identified through application.