
1 Introduction

Vision-based robotic grasping plays a central role in various industrial automation operations such as pick-and-place, packaging, and part assembly [1, 2]. During these operations, automatic inspection, which aims to detect defects in the grasped objects, is also imperative to meet the increasingly high standards of factory automation production lines. One efficient way to perform non-contact inspection is to use image data from cameras to detect both accidental failures and quality defects. However, inspection using images alone can only detect surface defects [3]. In general, defects such as stiffness changes or texture damage are difficult to notice from the exterior alone, especially for soft and delicate grasped objects (e.g., in fresh food transportation). If the stiffness (i.e., impedance) of these objects is known in advance, appropriate grasping forces can be generated simply through impedance control [4]. However, because grasped objects may be damaged or may change stiffness at any stage of the process, real-time stiffness characterization [5] is important for maintaining a high yield rate and performing accurate fault detection and diagnosis.

This study aims to develop an on-line method for measuring an object's stiffness using readily-available and low-cost components in vision-based robotic grasping systems. Instead of adding extra displacement sensors and involving complicated manufacturing processes and system integration, this work directly applies a robotic gripper equipped with a piezoresistive force sensor to obtain the relationship between the grasping force and the compression of the grasped object. The goal is to monitor stiffness changes during the vision-guided autonomous grasping process and use this information to conduct automation tasks more efficiently. For simplicity, the object compression values are acquired from the encoder feedback of the servo motor installed in the gripper. However, due to the limited encoder resolution, the position measurement is susceptible to quantization errors, and subtle compression may be hardly detectable. Another issue is that unexpected deformation of such a low-cost flexible gripper may occur when grasping high-stiffness objects. Since image feedback is readily available in vision-based robotic grasping systems, this sensing information is fused with the gripper encoder feedback to estimate object compression. Because image processing is vulnerable to environmental noise, an extended Kalman filter is applied to compensate for the imperfect measurements from these two sensing sources and obtain a more robust stiffness estimate. In the presented system, a robot manipulator is first commanded to approach a static grasping target using visual servo control. After contacting the targeted object, the system starts to estimate the object's stiffness during the grasping process. A least-squares method is simultaneously applied to fit a time-varying stiffness equation for on-line evaluation. The experimental results demonstrate the feasibility of the proposed stiffness estimation method.

2 Experimental Setup

Figure 1 shows a photograph of the system setup used in this research, which is composed of three webcams, a robot manipulator, a force sensor, and an Arduino embedded board. Two downward-looking cameras are used to build stereopsis to determine the grasping target's position and implement visual tracking control. A third camera (all three are Logitech C170 models) is installed in front of the manipulator to obtain the displacement information of the gripper. The robot manipulator used in the experiments has four degrees of freedom (DOF), and the servo motor installed in each joint is a Robotis model AX-18A. The servo used for the gripper is an MX-28T with a 4096 ppr encoder. A FlexiForce sensor is mounted at the end of the gripper to measure the grasping forces. A low-pass RC filter is applied to eliminate undesired high-frequency electrical noise, and the measured force values are sent to a PC through the Arduino communication interface. Figure 2 illustrates the schematic diagram of the proposed stiffness estimation system. The images of the target are first captured by the stereo cameras. After appropriate image processing, the targeted object's center position is used to drive the robot manipulator toward the target by sending the corresponding motor joint commands through visual servo control. During the entire grasping process, the encoder values of the gripper servo motor and the continuous images captured by the forward-looking camera are fused to estimate object compression. The extended Kalman filter algorithm is implemented to reduce the influence of the error sources and noise in these two sensing channels. Given the measured grasping force feedback, a curve describing the relationship between force and compression is continuously updated and fitted using a least-squares method.

Fig. 1. Photograph of the experimental setup: a. front view; b. side view

Fig. 2. Schematic diagram of the proposed stiffness estimation system

3 Visual Servoing

This study implements a position-based visual servoing (PBVS) control system [6] for the grasping and stiffness estimation experiments. The images captured from the cameras are first used to calculate the error between the manipulator end-effector and the grasping target. Motor position commands derived from classical resolved motion rate control are then computed to reduce this tracking error. Figure 3 depicts the applied PBVS control block diagram, where g represents the estimated goal position, h represents the end-effector position computed from the forward kinematic model K, and J is the robot Jacobian matrix. \( \boldsymbol{\theta} \) and \( \dot{\boldsymbol{\theta}} \) denote the joint angles and angular velocities of the robot manipulator, respectively. The control law for this visual servo system is given as

Fig. 3. Block diagram of position-based visual servoing

$$ \dot{\boldsymbol{\theta}} = \lambda \mathbf{J}^{-1}\left( \mathbf{g} - \mathbf{h} \right) $$
(1)

Note that \( \lambda \) is a proportional gain for tuning.
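As a concrete illustration, the following is a minimal sketch of one resolved-rate update step implementing Eq. (1); the function names, gain, and time step are illustrative assumptions rather than the exact implementation used in this study.

```python
import numpy as np

def pbvs_step(goal_pos, theta, jacobian, forward_kinematics, gain=0.5, dt=0.02):
    """One resolved-rate PBVS update: theta_dot = gain * J^{-1} (g - h).

    goal_pos           : target position g estimated from the stereo cameras.
    theta              : current joint angles.
    jacobian,
    forward_kinematics : robot-specific callables (assumed available).
    """
    h = forward_kinematics(theta)        # end-effector position from model K
    error = goal_pos - h                 # task-space tracking error (g - h)
    J = jacobian(theta)
    # A pseudo-inverse stands in for J^{-1} so the sketch also handles
    # non-square Jacobians (e.g., 3 task-space DOF driven by 4 joints).
    theta_dot = gain * np.linalg.pinv(J) @ error
    return theta + theta_dot * dt        # integrate to a joint position command
```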

4 Object Compression Estimation by Sensor Fusion

In this study, two system parameters must be measured or estimated in order to obtain the stiffness curve of the grasped object: the grasping force and the compression of the grasped object. The objective is to find the relationship between these two physical properties and a fitting curve. The grasping force is measured by the piezoresistive force sensor at the end of the gripper. To compensate for the imperfect measurements of the low-cost hardware without adding an extra displacement sensor, the amount of object compression is estimated by applying an extended Kalman filter [7] to fuse the image feedback and the gripper encoder feedback.

As mentioned in the introduction, the measurement error sources when using a low-cost, low-rigidity gripper to estimate the grasped object's compression include quantization errors and slight gripper deformation when grasping high-stiffness objects. These errors and uncertainties, along with image noise, are modeled as part of the stochastic noise to satisfy the Kalman filter problem assumptions. In this section, the measurement models associated with the image feedback and the encoder feedback are presented to facilitate the Kalman filter problem setup.

For the purpose of visual recognition and tracking control, a red marker attached to the gripper end is treated as an image feature point. After the image processing procedures mentioned in Sect. 3, the image coordinates of the gripper are obtained and transformed to a world coordinate representation using a camera model. The gripper movement is used to represent the object compression. The schematic diagram of the applied camera model is illustrated in Fig. 4, and the transformation formula can be represented as

$$ p = \frac{f}{d}x + p_{0} $$
(2)

where p denotes the gripper compression expressed in the image plane in units of pixels, d is the distance between the camera and the gripper, f is the focal length of the camera, x is the actual gripper compression in world coordinates, and \( p_{0} \) is the translation between the pixel coordinate system and the image coordinate system. For simplicity, in this study the compression value is expressed in the camera coordinate system, since this value is the same as the one expressed in the world coordinate system.

Fig. 4. Illustration of camera model associated with gripper compression
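As a small worked example of Eq. (2), the conversion between pixel and world coordinates can be written as below; the focal length, camera distance, and pixel offset values are placeholders, not the calibrated parameters of this setup.

```python
# Minimal sketch of the pinhole relation in Eq. (2); the numeric values
# for focal length, camera distance, and pixel offset are assumptions.
F_PIXELS = 800.0   # focal length f expressed in pixels (assumed)
D_CM = 30.0        # camera-to-gripper distance d in cm (assumed)
P0 = 320.0         # pixel offset p0 (assumed image-center column)

def world_to_pixel(x_cm: float) -> float:
    """Project gripper compression x (cm) onto the image plane: p = (f/d)x + p0."""
    return (F_PIXELS / D_CM) * x_cm + P0

def pixel_to_world(p: float) -> float:
    """Invert Eq. (2) to recover compression in world coordinates."""
    return (p - P0) * D_CM / F_PIXELS
```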

In addition to visual feedback, the encoder feedback available from the robotic gripper system is simultaneously applied to estimate the object compression. As shown in the schematic diagram of the gripper kinematic model (Fig. 5), the geometric relationship between the gripper movement x and the motor's rotation angle \( \theta \) can be derived as

$$ \theta = \cos^{-1}\left(\frac{x - L}{a}\right) - \alpha $$
(3)

where \( L \) is half of the horizontal distance between the axis centers of the two motors, a is the vertical distance from the axis of rotation to the front end of the motor, and \( \alpha \) is an offset angle of \( \theta \).

Fig. 5. Schematic diagram of the applied gripper for compression estimation
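The encoder measurement model of Eq. (3) and its inverse can be sketched as follows; the geometry constants L, a, and α are placeholder values, since the actual gripper dimensions are not reported numerically in the text.

```python
import math

# Placeholder gripper geometry (cm / rad); the true values depend on the
# gripper dimensions, which are assumptions here.
L_CM = 5.0
A_CM = 6.0
ALPHA = 0.2

def theta_from_compression(x_cm: float) -> float:
    """Eq. (3): theta = arccos((x - L) / a) - alpha."""
    return math.acos((x_cm - L_CM) / A_CM) - ALPHA

def compression_from_theta(theta: float) -> float:
    """Inverse of Eq. (3), as one would apply to raw encoder readings."""
    return L_CM + A_CM * math.cos(theta + ALPHA)
```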

It is obvious from Eq. (3) that a nonlinear term \( \cos^{-1} \) appears in the encoder measurement equation. This is the primary reason why an extended Kalman filter is applied to fuse the vision and encoder feedback for compression estimation. Combining the above two measurement models with a static equation of motion, using the fact that the gripper movement is slow, the system model and measurement model used in this study can be summarized as

$$ x_{t} = x_{t - 1} + u_{t} + w_{t} $$
(4)
$$ \left[ \begin{array}{c} \theta_{t} \\ p_{t} \end{array} \right] = \left[ \begin{array}{c} \cos^{-1}\left(\frac{x_{t} - L}{a}\right) - \alpha \\ \frac{f}{d}x_{t} + p_{0} \end{array} \right] + \left[ \begin{array}{c} v_{1,t} \\ v_{2,t} \end{array} \right] $$
(5)

where \( x_{t} \), \( x_{t-1} \), \( w_{t} \), and \( v_{t} \) share the same notation as in the previous subsections, and \( u_{t} \) represents the displacement produced by the motor command. Note that the constants \( \alpha \) and \( p_{0} \) can simply be offset to satisfy linearity. To complete the problem setup, the linearized measurement matrix H is derived as

$$ \mathbf{H} = \left[ \begin{array}{c} \frac{-1}{a\sqrt{1 - \left(\frac{x_{t} - L}{a}\right)^{2}}} \\ \frac{f}{d} \end{array} \right] $$
(6)
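Putting Eqs. (4)-(6) together, a minimal sketch of one EKF predict/update cycle for this scalar-state, two-measurement fusion problem might look as follows; the noise covariances and the geometry/camera constants are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

# Assumed geometry and camera constants (placeholders, as above).
L_CM, A_CM, ALPHA = 5.0, 6.0, 0.2     # gripper kinematics, Eq. (3)
F_OVER_D, P0 = 800.0 / 30.0, 320.0    # camera model, Eq. (2)

Q = 1e-4                               # process noise variance (assumed)
R = np.diag([1e-3, 4.0])               # encoder / image noise variances (assumed)

def ekf_step(x_est, P, u_t, z_t):
    """One EKF predict/update cycle for the model of Eqs. (4)-(6).

    x_est, P : prior compression estimate (cm) and its variance.
    u_t      : displacement produced by the motor command.
    z_t      : measurement vector [theta_t (rad), p_t (pixels)].
    """
    # Prediction with the static motion model, Eq. (4)
    x_pred = x_est + u_t
    P_pred = P + Q

    # Nonlinear measurement prediction, Eq. (5)
    h = np.array([np.arccos((x_pred - L_CM) / A_CM) - ALPHA,
                  F_OVER_D * x_pred + P0])

    # Linearized measurement matrix, Eq. (6)
    H = np.array([[-1.0 / (A_CM * np.sqrt(1.0 - ((x_pred - L_CM) / A_CM) ** 2))],
                  [F_OVER_D]])

    # Standard EKF update with the fused two-channel measurement
    S = (H * P_pred) @ H.T + R              # innovation covariance (2x2)
    K = (P_pred * H.T) @ np.linalg.inv(S)   # Kalman gain (1x2)
    x_new = x_pred + (K @ (z_t - h)).item()
    P_new = ((1.0 - K @ H) * P_pred).item()
    return x_new, P_new
```

Calling `ekf_step` once per control cycle with the latest encoder angle and marker pixel position yields the fused compression estimate that feeds the stiffness curve.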

5 Experimental Results and Discussion

In order to evaluate the feasibility and performance of the proposed stiffness estimation method, a 4-DOF robot manipulator was adopted to conduct grasping experiments, and the estimated stiffness parameters were updated on-line through a visual interface programmed with OpenGL libraries. This section presents the object compression estimation with fused sensor data, the stiffness measurement, and the curve-fitting results for on-line stiffness demonstration.

5.1 Grasping Object Compression Estimation

The purpose of this experiment was to verify that the accuracy of object compression estimation can be improved by applying the extended Kalman filter to integrate the vision feedback and gripper encoder feedback obtained in the grasping process. Both simulations and experiments were conducted by sending motor commands to move the gripper to a desired position and comparing the differences between the estimated and actual gripper positions. In the simulation, the servo motors were commanded to rotate sequentially from 0° to 45°, sampled at one-degree intervals, to simulate the compression process when grasping an object. Gaussian noise was added to simulate the uncertainties and imperfect measurements of the hardware applied in this study. Figure 6 shows the estimation errors obtained using three different sensing configurations: image feedback only, gripper encoder feedback only, and fused sensor data. The X-axis represents the motor command and the Y-axis represents the estimation error. The image noise is assumed to have a larger influence than the noise in the gripper system. The results using fused sensor data (blue solid line in the plot) confirm the improved accuracy of the proposed method.

Fig. 6. Effectiveness of using different sensing data: simulation results (Color figure online)

In the experiment, the gripper was commanded to move from 10 cm to 0 cm, and the data were recorded every 0.5 cm of movement. Figure 7 shows the results of comparing the three sensing configurations with the values measured by a laser rangefinder with 1 mm resolution. In Fig. 7, the X-axis stands for the gripper movement, which is also treated as the amount of gripper compression. Evidently, the estimation errors in the proposed experimental system were dominated by image noise with an offset value. The regular triangular-like estimation errors when using encoder feedback only were primarily due to the quantization effects of the limited encoder resolution. The error distribution observed in the experiments suggests that the normal distribution assumption may not be applicable when using either encoder feedback or image feedback alone. However, the effective error offset compensation and further reduction of the error magnitude still verify the effectiveness of fusing these two feedback sources.

Fig. 7. Effectiveness of using different sensing data: experimental results

5.2 Measuring Object Stiffness

In this experiment, a sponge block was adopted for the grasping tests and stiffness estimation. In order to establish a basis for comparison, the stiffness of the block was first measured off-line using an electronic scale with 0.1 g accuracy. The block was placed on the scale, and a drill bit installed on a small desktop drilling machine was fed gradually into the block to provide different amounts of compression. The displacement scale on the drilling machine has a resolution of 1 mm. The force and compression data were recorded and plotted as the red solid line shown in Fig. 8. The blue dashed line represents the stiffness relationship estimated by the proposed method, where the compression forces were measured by the piezoresistive sensor mounted on the end of the gripper. As can be seen, the two curves closely match for small amounts of compression, and a deviation appears after the compression exceeds 0.2 cm. Nevertheless, the overall trend obtained by the proposed method is still similar to that of the off-line measurement.

Fig. 8. Stiffness measurement using the proposed method and offline validation: grasping a soft block (Color figure online)

5.3 On-Line Stiffness Curve Fitting

To quantify the stiffness estimation results obtained in the previous subsection, a cubic polynomial was adopted as the stiffness equation, and its parameters were identified using a standard least-squares method. The fitted parameter values were updated on-line as measurement data were continuously added. Figure 9 shows the curve-fitting result for the whole compression process, and the final fitted equation is

Fig. 9. Stiffness curve fitting results using least-square method: grasping a soft block

$$ y = 21.7263x^{3} + 19.2486x^{2} + 5.7681x + 0.1519 $$
(7)

The coefficient of determination (R-squared) is 0.9857, which is very close to 1. Therefore, the fitting result is sufficiently accurate for stiffness quantification and evaluation.
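As a reference for how such a fit can be computed, the following is a minimal sketch using NumPy's least-squares polynomial fit; the sample arrays are synthetic placeholders, not the recorded sensor data.

```python
import numpy as np

def fit_stiffness_curve(compression, force):
    """Fit a cubic stiffness polynomial y = c3 x^3 + c2 x^2 + c1 x + c0
    by least squares and report the coefficient of determination.

    Re-running this function as each new (compression, force) sample
    arrives gives the on-line parameter update described in Sect. 5.3.
    """
    coeffs = np.polyfit(compression, force, deg=3)   # least-squares fit
    predicted = np.polyval(coeffs, compression)
    ss_res = np.sum((force - predicted) ** 2)
    ss_tot = np.sum((force - np.mean(force)) ** 2)
    r_squared = 1.0 - ss_res / ss_tot
    return coeffs, r_squared

# Illustrative call with synthetic data (cm, N) built around Eq. (7).
x = np.linspace(0.0, 0.6, 13)
y = 21.73 * x**3 + 19.25 * x**2 + 5.77 * x + 0.15 \
    + np.random.normal(0.0, 0.05, x.size)
coeffs, r2 = fit_stiffness_curve(x, y)
print(coeffs, r2)
```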

6 Conclusions

This paper presents an intelligent on-line stiffness estimation method that identifies the time-varying stiffness of objects grasped by vision-based robotic manipulators. A key feature of the proposed system is that no extra displacement sensor is required for compression measurement, since the image feedback and gripper encoder feedback are integrated during the grasping process. The experimental results confirm the improved compression estimation with fused sensor data, and the curve-fitting results demonstrate the feasibility of the proposed method. This technique may be applicable to many applications that require automated manipulation, monitoring, and stiffness inspection. Another application is to integrate the estimated stiffness with virtual reality techniques to develop haptic interfaces for games, surgical training, and military use. This study assumes that the measurement noise and uncertainties are Gaussian and applies this assumption to facilitate the Kalman filter problem formulation. To better reflect real conditions and obtain improved estimates, this strong assumption may be relaxed by applying advanced algorithms that accommodate non-Gaussian noise distributions, such as particle filters [8].