
1 Introduction

A digital twin [1, 2] is a digital counterpart of a physical entity: its foundation is infrastructure and data, and at its core are algorithms, models, application software, and services. Together with robotics, this technology has grown rapidly in recent years and is regarded by both research and industry as a main driving force of Industry 4.0. Driven by advances in artificial intelligence (AI) [3,4,5,6,7], robotics [8,9,10,11] has made breakthroughs in robot capabilities such as performing complex and hazardous jobs, providing healthcare, and cooperating with people. Combining digital twins and robots with new technologies and algorithms will be the foundation for building and developing smart manufacturing. In particular, the trend of using AI to analyze data and make predictions is receiving special attention.

The Robot Operating System (ROS) is a collection of software tools and libraries that help build robot applications from source code [12]. Control algorithms increasingly apply machine learning to improve performance and optimize processes. In this study, ROS is used as middleware connecting the virtual and real parts, which makes it possible to apply frameworks such as motion planning [13,14,15,16] and autonomous mobile robotics [17, 18].

A digital twin provides features such as visualization, simulation with real data, and performance monitoring. To support these, the collected data must be stored. Firebase is a cloud-based database service backed by Google's powerful server infrastructure. It provides capabilities such as analytics, databases, activity reports, and error reporting for easy development, along with versatile services and trusted security. Therefore, the data in this project, namely the joint values and the execution times, are all stored using Firebase.

The main contribution of this paper is cloud data storage, which is a limitation of recent research papers [19,20,21,22]. In addition, we build the virtual environment, calculate the kinematics, ensure real-time two-way data transmission between the virtual and real parts, and analyze and evaluate the obtained data.

This paper is organized around five main tasks. We first build a virtual model and a virtual environment for the robot. Second, we build the real-part system for the robot. Third, we connect and transfer data between the real and virtual models. Fourth, we calculate the kinematics of the robot. Finally, the data is transmitted in both directions to control the robot, evaluate errors, and store the data both locally and in the cloud.

2 Related Works

2.1 Proposed Framework

We have built a digital twin architecture for a robot. It is divided into two components: the virtual and real parts. The virtual part is software built on the Unity platform. For the real part, we use two servomotors with built-in encoders to represent two joints of the robot; these motors are controlled by a PID controller. In addition, we use a Jetson Nano running ROS as the central processor. The virtual and real parts are connected to each other through the TCP socket protocol suite.

Jetson Nano is a small but powerful computer that can run multiple neural networks in parallel for applications such as image classification, object detection, segmentation, and speech processing. It is easy to use as a single platform and consumes less than 5 W. The Jetson Nano delivers 472 GFLOPS for running modern AI algorithms, with a 64-bit quad-core ARM CPU, an onboard 128-core NVIDIA GPU, and 4 GB of LPDDR4 memory, so it can run multiple neural networks in parallel and handle several high-resolution sensors simultaneously. Here, users can control the real model from the virtual model and vice versa. First, we present the control process from the virtual to the real model. When the user changes a joint position of the robot via the Unity controller, the corresponding joint-angle data is sent through the central processor, which computes the forward kinematics to obtain the position of the manipulator's end point. Next, the joint angles drive the virtual robot model, and the data is also shown on the user interface, which displays the joint-angle values and the coordinates of the end point. After that, the data is sent to the physical robot through the TCP socket protocol suite to control the real model.
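To make the forward-kinematics step concrete, below is a minimal sketch for a planar two-joint arm matching the two servomotor joints of the test setup. It is a simplification of the full ABB IRB 120 kinematics, and the link lengths l1 and l2 are placeholder values, not the robot's actual dimensions.

```python
import math

def forward_kinematics(theta1_deg, theta2_deg, l1=0.29, l2=0.27):
    """Planar 2R forward kinematics: joint angles (degrees) -> end point (x, y).

    l1 and l2 are illustrative link lengths in meters, not the real robot's.
    """
    t1 = math.radians(theta1_deg)
    t2 = math.radians(theta2_deg)
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

# Example: both joints at 45 degrees
print(forward_kinematics(45.0, 45.0))
```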

In this study, we use two servomotors to represent two joints of the robot. The commanded data is sent to the central processor and then to the controller, which drives the servomotors. The controller then reads the servomotor positions and sends them back to the central processor. The central processor processes the data with the formulas presented below and computes the error between the joint angles of the real and virtual models. The data is then stored in the cloud and sent back to Unity, which calculates the delay and saves the data, including the error, the joint positions of the real and virtual models, and the delay, to local storage. The control process from the real to the virtual model reverses the above process, starting from the controller on the ROS side. In this case, the Unity-side controller is bypassed, and the robot's joint-angle values are fed directly into the virtual robot model through Unity's central processor. Figure 1 shows the architecture of the system.

Fig. 1

System overview. The system consists of two parts: real and virtual models. On the left is a virtual element system built on the Unity platform. On the right is the real part system

2.2 Virtual Environment

We use the Unity platform to build and develop the virtual environment. Unity specializes in games, which makes building a virtual environment straightforward. Unity also supports virtual reality and augmented reality for future development of the system. To build the virtual environment in Unity, we used the available tools and other graphics software such as Blender. After building the objects of the virtual environment, we set up features such as data communication, kinematics, and a rotation controller for the robot. To build the robot model and put it into the virtual environment, we first downloaded the parts of the ABB IRB 120 robot model from the ABB homepage. Second, we edited and assembled the parts into an ABB IRB 120 robot. Third, we set the coordinate axes for each joint of the model. Fourth, we set the parameters of the model. Fifth, we exported the model as a URDF file and stored it in the project directory. Finally, we imported that model into the Unity virtual environment.

In the virtual environment, alongside the virtual robot model, the central processor processes the data, stores it in local memory, and transmits it to the real model, while the controller lets the user change the joint angles. This virtual environment supports data visualization, monitoring, and control. Figure 2 shows the virtual environment we built.

Fig. 2

Virtual environment in Unity includes robot model, robot status panel, and control panel

3 Approach: Digital Twin of Joint Angle of Robot Arm

3.1 Comparing Inverse Kinematics Between the Constructed Algorithm and Robot Studio

After building the inverse kinematics algorithm, we compared its results with Unity, MATLAB, and Robot Studio. We found that the coordinate axes of Unity, our algorithm, and Robot Studio were interchanged. The joint values have very low errors, caused by rounding in the algorithm's calculations. The results show that our inverse kinematics calculation is valid within very acceptable error limits. Table 1 lists the values obtained after testing.

Table 1 Evaluation of inverse kinematics
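For reference, the corresponding inverse step for the same simplified planar two-joint arm can be sketched as follows; again the link lengths are placeholders, and this is not the full IRB 120 solution used in the comparison.

```python
import math

def inverse_kinematics(x, y, l1=0.29, l2=0.27, elbow_up=True):
    """Planar 2R inverse kinematics: end point (x, y) -> joint angles (degrees).

    Returns None when the target is out of reach. Link lengths are
    illustrative placeholders, not the real robot's dimensions.
    """
    d = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(d) > 1.0:
        return None  # target outside the workspace
    t2 = math.acos(d)
    if elbow_up:
        t2 = -t2
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return math.degrees(t1), math.degrees(t2)

# Joint angles that place the end point at (0.3, 0.4)
print(inverse_kinematics(0.3, 0.4))
```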

3.2 Build a PID Controller

A PID controller is built to control the motor rotation angle. We used a GP36 planetary-gear DC servomotor with a 1:27 reduction ratio and 145 rpm, with an integrated 500 CPR (counts per revolution) two-channel (A-B) optical encoder. From this, we obtain the formula to convert a pulse count to an angle, where α(p) is the current angle, p is the pulse count, c is the number of rising or falling edge pulses per motor revolution, and r is the reduction ratio of the motor gearbox:

$$ \alpha \left( p \right) = \frac{p}{c \times r} \times 360 $$
(1)
$$ = \frac{p}{500 \times 27} \times 360 = \frac{p}{37.5} $$
(2)
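A minimal sketch of this conversion, assuming the pulse count is read elsewhere (for example, in an encoder interrupt routine):

```python
ENCODER_CPR = 500   # rising/falling edges counted per motor revolution (c)
GEAR_RATIO = 27     # planetary gearbox reduction ratio (r)

def pulses_to_angle(pulse_count):
    """Convert an encoder pulse count to an output-shaft angle in degrees, per Eq. (2)."""
    return pulse_count * 360.0 / (ENCODER_CPR * GEAR_RATIO)  # = pulse_count / 37.5

print(pulses_to_angle(13500))  # one full output revolution -> 360.0
```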

The PID controller for the motor has three parameters: Kp, Ki, and Kd. These parameters were determined by manual tuning through experiments as Kp = 0.01, Ki = 0.002, and Kd = 0.001. The PID controller regulates the rotational position of the motor, with the encoder pulse count as the feedback signal. The PID algorithm determines the voltage U (V) to be supplied to the motor (the desired voltage value), given by the formula:

$$ U = K_{p} \times e + K_{i} \times \left( e_{\text{pre}} + e \times \left( t - t_{\text{pre}} \right) \right) + \frac{K_{d} \times \left( e - e_{\text{pre}} \right)}{t - t_{\text{pre}}} $$
(3)

Here, Kp, Ki, and Kd are the PID coefficients given above; t is the current sampling time and tpre is the previous sampling time, both in seconds (s); e is the error between the desired motor position p and the current motor position pcur, and epre is the error at the previous sampling time.

$$ e = p - p_{{{\text{cur}}}} $$
(4)

Since the motor driver's power supply (UPow) is a 12 V source and the microcontroller's PWM is 8-bit, the formula for converting the voltage to a PWM control value (Pwm) is

$$ {\text{Pwm}} = \frac{U \times 255}{{U_{{{\text{Pow}}}} }} = \frac{U \times 255}{{12}} = U \times 21.25 $$
(5)
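The control loop implied by Eqs. (3)-(5) can be sketched as follows. It is written in Python for readability, although the actual controller runs on the Arduino, and it treats the e_pre term inside the K_i bracket as a running error accumulator, which is how we read Eq. (3).

```python
import time

# Manually tuned gains and supply voltage from Sect. 3.2
KP, KI, KD = 0.01, 0.002, 0.001
U_POW = 12.0  # motor driver supply voltage (V)

class PidPosition:
    """Position PID on encoder pulse counts, following Eqs. (3)-(5)."""

    def __init__(self):
        self.e_pre = 0.0          # error at the previous sampling time
        self.integral = 0.0       # accumulated error * dt
        self.t_pre = time.time()  # previous sampling time (s)

    def update(self, p_desired, p_current):
        """Return a PWM value in [-255, 255] for the given pulse counts."""
        t = time.time()
        dt = t - self.t_pre
        if dt <= 0.0:
            return 0.0            # guard against a zero sampling interval
        e = p_desired - p_current                                     # Eq. (4)
        self.integral += e * dt
        u = KP * e + KI * self.integral + KD * (e - self.e_pre) / dt  # Eq. (3)
        self.e_pre, self.t_pre = e, t
        return max(-255.0, min(255.0, u * 255.0 / U_POW))             # Eq. (5)
```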

3.3 Two-Way Data Transfer Between Real and Virtual Models

Here, we install ROS on a Jetson Nano and set up nodes and topics to transmit and receive data between Arduino and ROS and between ROS and Unity. ROS acts as an intermediary that supports sending and receiving data between Arduino and Unity. We use the TCP socket protocol for two-way data transmission between ROS and Unity, and UART for two-way data transmission between ROS and Arduino.

A TCP endpoint running as a ROS node facilitates message passing between Unity and ROS. Messages passed between Unity and ROS are expected to be serialized exactly as ROS serializes them internally.
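As an illustration, a minimal relay node on the Jetson Nano could look like the sketch below. The topic names are illustrative rather than the ones used in our system, and the Unity-facing TCP endpoint itself is assumed to be handled by a separate ROS node as described above.

```python
#!/usr/bin/env python
# Minimal relay sketch: forwards joint angles arriving from the Unity side
# to the motor-controller side. Topic names are placeholders.
import rospy
from std_msgs.msg import Float32MultiArray

def main():
    rospy.init_node('joint_relay')
    pub = rospy.Publisher('/physical/joint_targets', Float32MultiArray, queue_size=10)

    def on_unity_joints(msg):
        # msg.data holds the two joint angles (degrees) sent by Unity
        pub.publish(msg)

    rospy.Subscriber('/unity/joint_angles', Float32MultiArray, on_unity_joints)
    rospy.spin()

if __name__ == '__main__':
    main()
```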

4 Experiments and Results

4.1 Comparing the Forward Kinematics Between the Constructed Algorithm and Robot Studio

After building the forward kinematics algorithm, we compared its results with Unity and Robot Studio. We found that the coordinate axes of Unity, our algorithm, and Robot Studio were interchanged: Unity's X-axis corresponds to our algorithm's Y-axis and Robot Studio's X-axis; Unity's Y-axis corresponds to our algorithm's Z-axis and Robot Studio's Z-axis; and Unity's Z-axis corresponds to our algorithm's X-axis and Robot Studio's Y-axis. The obtained values have very low errors, caused by rounding in the algorithm's calculations, which shows that our algorithm's accuracy is acceptable. Table 2 lists the values obtained after testing.

Table 2 Evaluation of forward kinematics

4.2 Error and Latency

The error between the virtual and real models is calculated by the central processing unit of the physical robot as the virtual model's joint angles minus the real model's joint angles. Here, Em is the joint value of the virtual model and En is the joint value of the physical model, where m and n run from 1 to the last data record stored in the system. From there, we calculate the system error ΔE(t) using the formula:

$$ \Delta E\left( t \right) = E_{m} - E_{n} $$
(6)

To calculate the delay, we take t1 as the time the data is sent from the virtual robot to the physical robot, and t2 as the time the data is sent back from the physical robot to the virtual robot. From there, we calculate the system latency Δt(s) using the formula:

$$ {\Delta }t\left( s \right) = t_{2} - t_{1} $$
(7)
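A minimal sketch of these two computations on the logged data, assuming each record carries the joint angles of both models and the two send timestamps (the field layout here is illustrative):

```python
def joint_error(virtual_deg, real_deg):
    """Eq. (6): per-joint error between virtual and real models, in degrees."""
    return [v - r for v, r in zip(virtual_deg, real_deg)]

def latency_s(t1, t2):
    """Eq. (7): latency as the gap between the two send timestamps, in seconds."""
    return t2 - t1

print(joint_error([45.0, 30.0], [44.2, 30.5]))  # approx. [0.8, -0.5]
print(latency_s(10.000, 10.030))                # approx. 0.03 s (about 30 ms)
```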

4.3 Data Storage

The data is stored locally, in the Firebase Cloud, and in Google Sheets. Cloud Firestore's data transmission and reception rate is 10 Hz, with a limit of 50,000 data updates and edits per day for the free version. We also tried storing data on Google Sheets, which also receives and sends data at 10 Hz and is limited to 100 fetches and updates per minute. The data is organized into three levels: the collection, the directory, and the file. The collection, which we named data, contains directories; the name of each directory is the time at which the data inside it was stored in the cloud, and the files sit inside the directories. The data collected includes the time, the joint positions of the virtual model, the rotational positions of the servomotors, and the error between the joint positions of the virtual model and the rotational positions of the servomotors at that time.
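A minimal sketch of the cloud write using the firebase_admin Python SDK, assuming a service-account credential file; the collection layout follows the description above, but the field names and the key path are placeholders:

```python
import datetime
import firebase_admin
from firebase_admin import credentials, firestore

# Initialize the Firebase app with a service-account key (path is a placeholder)
cred = credentials.Certificate('serviceAccountKey.json')
firebase_admin.initialize_app(cred)
db = firestore.client()

def store_sample(joints_unity, joints_servo):
    """Store one sample; the document name is the storage time, as in our layout."""
    now = datetime.datetime.now().isoformat()
    error = [v - r for v, r in zip(joints_unity, joints_servo)]
    db.collection('data').document(now).set({
        'time': now,
        'joints_unity': joints_unity,
        'joints_servo': joints_servo,
        'error': error,
    })

store_sample([45.0, 30.0], [44.2, 30.5])
```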

We also use local data storage. With local storage, the frequency is around 165 Hz, limited only by the capacity of the local storage; this is not a concern because local storage can be expanded and it would take a long time to fill. In the local log, the STT column is the ordinal number of the data, sorted in ascending order by storage time; the Time column stores the time at which the data was stored; the Error column holds the error between the joint positions of the virtual model and the real one; the Joints Unity column holds the joint positions of the virtual model; and the last column, Joints Servo, holds the rotational positions of the servomotors.
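A minimal sketch of the local log writer with the five columns above, assuming a CSV file whose name is a placeholder:

```python
import csv
import datetime

def append_local_log(stt, joints_unity, joints_servo, path='dt_log.csv'):
    """Append one row: STT, Time, Error, Joints Unity, Joints Servo."""
    error = [v - r for v, r in zip(joints_unity, joints_servo)]
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow([
            stt,
            datetime.datetime.now().isoformat(),
            error,
            joints_unity,
            joints_servo,
        ])

append_local_log(1, [45.0, 30.0], [44.2, 30.5])
```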

4.4 Results

The analyzed data show that the system takes about 2 s to stabilize after booting. Here, the sampling frequency is about 165 Hz, as shown in Fig. 3, where the vertical axis is the number of samples taken in one second and the horizontal axis is the time at which the samples were taken. The system latency is about 30 ms with an error of ±2 degrees, as shown in Fig. 4, where the vertical axis is the error between the joint angles of the virtual and real models (degrees) and the horizontal axis is time (ms).

Fig. 3

Number of samples received in 1 s

Fig. 4

Error between real and virtual model

Here, we simulated a servomotor power failure. Figure 5 shows how the data differs between normal operation and when the failure occurs. When 3000 data records had been stored, we disconnected the servomotor and powered it up again at 3150 records; we then disconnected the servomotor again at 4665 records until the system stopped completely. The error shown is the joint error between the real and virtual models when the system is operating normally.

Fig. 5

Error between real and virtual model when there is a problem

5 Conclusion

In this paper, we have successfully built a digital twin model of the ABB IRB 120 robot. The digital twin communicates between the real and virtual parts through the TCP socket protocol suite. We also calculated the forward kinematics of the robot and compared it with the forward kinematics of Robot Studio. In addition, we built a PID controller for the motor and stored the data in the cloud and in local storage for later machine learning development.

The system operates stably with a maximum frequency of 165 Hz when not storing data and 10 Hz when pushing data to the cloud. The kinematics calculations give results with very small errors due to rounding of values in the calculation. The PID controller is stable and controls the motor position as desired. Data is transmitted bidirectionally with very small errors due to rounding during conversion in the calculations.

In the future, we will increase the frequency of sending data to the cloud and use machine learning to analyze the collected data to help monitor and evaluate the system. We then plan to build digital twins of typical robots, such as mobile manipulator platforms and rehabilitation robots, in the near future.