Abstract

One of the challenges with industrial robots is accurately locating the pose of the end-effector. Over the years, many solutions have been studied, including static calibration and dynamic positioning. This paper presents a novel approach for pose estimation of a Hexa parallel robot. The vision system uses three simple color feature points fixed on the surface of the end-effector to measure the pose of the robot. The Intel RealSense Camera D435i is used for 3D measurement of the feature points, which offers a cheap solution with high positioning accuracy. Based on the constraint of the three color feature points, the pose of the end-effector, including position and orientation, is determined. A dynamic hybrid filter is designed to correct the vision-based pose measurement. The complementary filter is used to eliminate the noise of image processing due to environmental light source interference. The unscented Kalman filter is designed to smooth out the pose estimation of the vision system based on the robot’s kinematic parameters. The combination of the two filters in the same control scheme contributes to increased stability and improved accuracy of the robot’s positioning. The simulation, experiment, and comparison demonstrate the effectiveness and feasibility of the proposed method.

1. Introduction

Robots are widely used in production today to replace human labor because of their ability to work quickly, continuously, and accurately, among many other advantages. Serial robots, which have an open-loop kinematic chain, are the most common in industry. Their advantage is a large workspace, but their drawbacks are accumulated errors, low stiffness, relatively low load capacity due to the open kinematic structure, and the large weight of the actuators. Unlike serial robots, parallel robots, which have a closed-loop kinematic chain, offer high stiffness, high speed and acceleration, and high accuracy and carrying capability, making them a good solution for many manufacturing applications [1–3]. In this paper, a six-degree-of-freedom parallel manipulator, the Hexa parallel robot [4–7], is studied to apply the proposed control methods.

One of the challenges with industrial robots is accurately locating the pose of the end-effector. Integrating absolute encoders in the joints is efficient, but expensive and difficult to install. Over the years, many other solutions have been studied, including static calibration and dynamic positioning [8–10]. The authors in [11] used an inertial measurement unit (IMU) and ultrasonic triangulation sensors for self-calibration of the robot pose. Its advantage is that it is not complex. However, when using an IMU, one of the difficulties is handling the cumulative error that arises when integrating the measurements, due to the large noise of the IMU. Li et al. [12] used a sensor frame and reflectors fixed on the end-effector of the robot to estimate the relative pose of the robot. This method can be applied to both parallel and serial robots, but it requires a complex setup and calculation. In [13], the authors used a 3D visual sensor to correct the pose of the end-effector of the robot. The sensor is attached to the end-effector, which is not easy to install for some applications. FarzanehKaloorazi et al. [14] optimized the trajectory planning of a robotic work cell by using particle swarm optimization (PSO). However, the PSO algorithm needs complex computation and can cause delays in real-time execution.

In robot control, many sources of error can affect the positioning of the end-effector, such as model uncertainties, system nonlinearities, backlash in the mechanical system, and unknown noise. Addressing every source of error is neither feasible nor effective. Therefore, visual measurement-based solutions [15] are of interest because they directly measure and correct the pose of the end-effector. Liang et al. [16] used a vision system and a deep convolutional network human pose estimation algorithm to estimate the 2D and 3D poses of robots on construction sites. This method relies on the vision system and deep convolutional network algorithms and does not require additional sensors or markers. The author in [17] proposed a real-time method based on deep learning to estimate the pose of a robot arm using a vision system. This method does not need calibration.

In this paper, the pose of a Hexa parallel robot is estimated dynamically by using a 3D vision system. The method uses three simple color feature points fixed on the surface of the end-effector. The Intel RealSense Camera D435i is used for 3D measurement of the feature points, which offers a cheap solution with high positioning accuracy. Based on the constraint of the three color feature points, the pose of the end-effector, including the position and orientation, is determined.

As with other vision-based positioning methods, this study needs to address noise in image processing, noise caused by system vibrations while moving, and other unknown noises. In previous studies, some authors used an extended Kalman filter (EKF) or adaptive Kalman filter (AKF) to stabilize the pose estimation. Wilson et al. [18] applied an EKF to the pose estimation based on visual servo control of a 5-DOF robot arm. In [19], the authors used an AKF to improve pose estimates for visual servoing. In [20], a dynamic path tracking scheme for industrial robots using an AKF is proposed. González et al. [21] used an EKF for the estimation of the camera space manipulation parameters and applied it to a parallel robot. The authors in [22] proposed an adaptive robust Kalman filter to correct the vision-based pose detection of industrial robots. Compared to the EKF and AKF, a newer extension of the Kalman filter to nonlinear systems [23], named the unscented Kalman filter (UKF), uses a set of sigma points to approximate the nonlinear model, so it may be more accurate. In [24], the authors proposed a two-stage Kalman filter-based algorithm for processing the data from an inertial measurement unit in order to obtain accurate position estimation over a short period of time. In [25], the authors proposed a dynamic filtered path tracking control scheme for a 3-RRR planar parallel robot, where the unscented Kalman filter is designed to reduce the errors of the pose estimation.

Because the pose estimation error can be caused by multiple sources of noise, this paper proposes a dynamic hybrid filter to smooth out the pose measurement of the vision system. A complementary filter is used to eliminate the noise of the image processing due to environmental light source interference. A UKF is designed to correct the pose estimation of the vision system based on the robot’s kinematic parameters. The combination of the two filters in the same control scheme contributes to increased stability and improved accuracy of the robot’s positioning.

Based on the above discussion and literature review, the contributions of this study can be summarized as follows:
(1) The pose of a Hexa parallel robot is estimated dynamically by using a 3D vision system. The method uses three simple color feature points fixed on the surface of the end-effector. The Intel RealSense Camera D435i is used for 3D measurement of the feature points, which offers a cheap solution with high positioning accuracy. Based on the constraint of the three color feature points, the pose of the end-effector, including the position and orientation, is determined.
(2) A dynamic hybrid filter is proposed to smooth out the pose measurement of the vision system. The combination of the two filters in the same control scheme contributes to increased stability and improved accuracy of the robot’s positioning. The complementary filter is used to eliminate the noise of the image processing due to environmental light source interference. The UKF is designed to correct the pose estimation of the vision system based on the robot’s kinematic parameters.
(3) Although there are many applications of Kalman filters to industrial robots in the literature, there are no specific studies on the application of the UKF to the positioning of parallel robots such as the Hexa robot. This study details how to define the kinematic expressions of the Hexa robot and how they are applied in the UKF, and presents the establishment of the UKF parameters and the key expressions for practical application. Simulation, experiment, and comparison are conducted to verify the effectiveness and feasibility of the proposed approach.

The remainder of this paper is organized as follows: Section 2 introduces the kinematics of the Hexa parallel robot. Section 3 presents the system description, the vision-based pose estimation, and the problem statement. Section 4 presents the dynamic hybrid filter. Section 5 shows the simulation, experiment, and comparison. Section 6 concludes the paper.

2. Kinematic Model of the Hexa Parallel Robot

Figure 1 presents the kinematic structure of the Hexa parallel robot. The moving plate is connected to the fixed platform by six identical pairs of arms and rods, which are divided into three groups spaced 120 degrees around the center of the fixed platform. The six arms are driven by six DC motors. In this study, the index $i$ ($i = 1, \dots, 6$) indicates the pair of arm and rod, as presented in Figure 1. $A_i$ is the rotation center of the revolute joint of the arm. $B_i$ is the center of the universal joint that connects the arm to the rod. $C_i$ is the center of the spherical joint that connects the rod to the moving plate. $R$ is the radius of the fixed platform calculated from its center to the point $A_i$. $r$ is the radius of the moving plate calculated from its center to the point $C_i$.

$\alpha$ is the angle formed by the three points $A_i$, $A_{i+1}$, and the center of the fixed platform. $\beta$ is the angle formed by the three points $C_i$, $C_{i+1}$, and the center of the moving plate. The fixed coordinate frame $\{O\}$ is located at the center of the fixed platform. The $x$-axis points toward the midpoint of the edge $A_1A_2$. The $z$-axis is perpendicular to the fixed platform in the upward direction. The $y$-axis is determined by the right-hand rule. The moving coordinate frame $\{P\}$ is centrally located in the moving plate and has axes oriented the same as those of the frame $\{O\}$. The length of the arm is $L_1$. The length of the rod is $L_2$. $q_i$ is the rotation angle of the arm. The rotation state of the arms is specified by the vector $\mathbf{q} = [q_1\;q_2\;\cdots\;q_6]^{T}$. The pose (position and orientation) of the moving plate is specified by the vector $\mathbf{X} = [x_P\;y_P\;z_P\;\phi\;\theta\;\psi]^{T}$ with respect to the frame $\{O\}$, where $(x_P, y_P, z_P)$ is the position of the moving plate and $(\phi, \theta, \psi)$ are the Euler angles around the $x$-, $y$-, and $z$-axes, respectively.

2.1. Inverse Kinematics

The coordinate of the joint $C_i$ with respect to the frame $\{O\}$ is calculated as follows:
$$ {}^{O}\mathbf{C}_i = {}^{O}T_{P}\;{}^{P}\mathbf{C}_i, \qquad (1) $$
where ${}^{O}T_{P}$ is the homogeneous transformation matrix from $\{P\}$ to $\{O\}$, and ${}^{P}\mathbf{C}_i$ is the fixed coordinate of the joint $C_i$ with respect to the frame $\{P\}$, determined as follows:
$$ {}^{P}\mathbf{C}_i = \begin{bmatrix} r\cos\lambda_i & r\sin\lambda_i & 0 & 1 \end{bmatrix}^{T}, $$
with $\lambda_i$ the angular position of the joint $C_i$ on the moving plate, fixed by the 120° spacing of the groups and the angle $\beta$.

The homogeneous transformation matrix is as follows:
$$ {}^{O}T_{P} = \begin{bmatrix} R(\phi, \theta, \psi) & \mathbf{p}_P \\ \mathbf{0}^{T} & 1 \end{bmatrix}, $$
where $\mathbf{p}_P = [x_P\;y_P\;z_P]^{T}$ is the position of the moving plate and $R(\phi, \theta, \psi)$ is the rotation matrix built from the Euler angles, whose elements are written with $c$ and $s$ denoting cosine and sine.

The coordinate of the joint $B_i$ with respect to the frame $\{O\}$ is calculated as follows:
$$ {}^{O}\mathbf{B}_i = {}^{O}\mathbf{A}_i + {}^{A_i}\mathbf{B}_i, \qquad (5) $$
where ${}^{O}\mathbf{A}_i$ is the fixed coordinate of the joint $A_i$ with respect to the frame $\{O\}$ and ${}^{A_i}\mathbf{B}_i$ is the coordinate of the joint $B_i$ with respect to the joint $A_i$. ${}^{O}\mathbf{A}_i$ is determined by the radius $R$, the 120° spacing of the groups, and the angle $\alpha$; ${}^{A_i}\mathbf{B}_i$ is determined by the arm length $L_1$ and the rotation angle $q_i$ of the arm about its revolute-joint axis.

The relationship between the joint $B_i$ and the joint $C_i$ is expressed by the constant length of the rod as follows:
$$ \left\| {}^{O}\mathbf{C}_i - {}^{O}\mathbf{B}_i \right\|^{2} = L_2^{2}. \qquad (8) $$

By substituting Equations (1) and (5) into Equation (8), a new equation is obtained in the following format:
$$ E_i\sin q_i + F_i\cos q_i + G_i = 0, \qquad (9) $$
with $E_i$, $F_i$, and $G_i$ coefficients that depend only on the pose $\mathbf{X}$ of the moving plate and the geometric parameters of the robot.

By solving Equation (9), the rotation angles of the arms can be obtained as follows:
$$ q_i = 2\arctan\!\left( \frac{-E_i \pm \sqrt{E_i^{2} + F_i^{2} - G_i^{2}}}{G_i - F_i} \right), $$
where the sign is selected according to the assembly configuration of the arm.
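To make the closed-form solution concrete, the following is a minimal Python sketch of this inverse kinematics step, assuming the reconstructed form of Equation (9) above. The geometric inputs (the points $A_i$, $C_i$ and the unit vectors spanning each arm's rotation plane) and the sign conventions are placeholder assumptions, not the authors' C++/MATLAB implementation.

```python
import numpy as np

def arm_angle(A, C, u, w, L1, L2, elbow=+1):
    """Solve E*sin(q) + F*cos(q) + G = 0 for one arm.

    A, C   : 3D positions of the joints A_i and C_i in the fixed frame {O}.
    u, w   : orthonormal unit vectors spanning the rotation plane of the arm,
             so that B(q) = A + L1*(cos(q)*u + sin(q)*w) (assumed convention).
    L1, L2 : arm and rod lengths.
    elbow  : +1 or -1 selects the assembly configuration.
    """
    d = C - A
    E = -2.0 * L1 * np.dot(d, w)
    F = -2.0 * L1 * np.dot(d, u)
    G = np.dot(d, d) + L1**2 - L2**2
    disc = E**2 + F**2 - G**2
    if disc < 0.0:                       # pose outside the reachable workspace
        raise ValueError("unreachable pose for this arm")
    # half-angle (tan(q/2)) solution of Equation (9)
    return 2.0 * np.arctan2(-E + elbow * np.sqrt(disc), G - F)

def inverse_kinematics(A_list, C_list, u_list, w_list, L1, L2):
    """Rotation vector q = [q_1 ... q_6] for all six arms."""
    return np.array([arm_angle(A, C, u, w, L1, L2)
                     for A, C, u, w in zip(A_list, C_list, u_list, w_list)])
```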

2.2. Kinematic Jacobian Matrix and Singularity Analysis

To establish the relationship between the velocity of the moving plate and the angular velocity of the arms, the Jacobian matrix needs to be determined. Equation (8) can be rewritten as follows:
$$ \left( {}^{O}\mathbf{C}_i - {}^{O}\mathbf{B}_i \right)^{T}\left( {}^{O}\mathbf{C}_i - {}^{O}\mathbf{B}_i \right) = L_2^{2}. \qquad (12) $$

Substituting Equations (1) and (5) into Equation (12) and differentiating the resulting equation with respect to time yields
$$ \left( {}^{O}\mathbf{C}_i - {}^{O}\mathbf{B}_i \right)^{T}\left( {}^{O}\dot{\mathbf{C}}_i - {}^{O}\dot{\mathbf{B}}_i \right) = 0, \qquad (13) $$
where the derivative ${}^{O}\dot{\mathbf{C}}_i$ is a function of the velocity vector $\dot{\mathbf{X}} = [\dot{\mathbf{p}}_P^{T}\;\;\boldsymbol{\omega}_P^{T}]^{T}$ of the moving plate.

The time derivative of the homogeneous transformation matrix can be calculated by the skew-symmetric operator [26]:
$$ {}^{O}\dot{T}_{P} = \begin{bmatrix} S(\boldsymbol{\omega}_P)\,R & \dot{\mathbf{p}}_P \\ \mathbf{0}^{T} & 0 \end{bmatrix}. $$

The time derivative of ${}^{O}\mathbf{B}_i$ is as follows:
$$ {}^{O}\dot{\mathbf{B}}_i = \frac{\partial\,{}^{O}\mathbf{B}_i}{\partial q_i}\,\dot{q}_i. $$

Equation (13) can be rewritten as follows:
$$ \left( {}^{O}\mathbf{C}_i - {}^{O}\mathbf{B}_i \right)^{T} \left( \dot{\mathbf{p}}_P + S(\boldsymbol{\omega}_P)\left({}^{O}\mathbf{C}_i - \mathbf{p}_P\right) - \frac{\partial\,{}^{O}\mathbf{B}_i}{\partial q_i}\,\dot{q}_i \right) = 0, \qquad (16) $$
where $\boldsymbol{\omega}_P$ is the angular velocity vector of the moving plate and $S(\mathbf{a})$ is the skew-symmetric operator of a vector $\mathbf{a}$. Suppose $\mathbf{a} = [a_1\;a_2\;a_3]^{T}$; the skew-symmetric operator results in
$$ S(\mathbf{a}) = \begin{bmatrix} 0 & -a_3 & a_2 \\ a_3 & 0 & -a_1 \\ -a_2 & a_1 & 0 \end{bmatrix}. $$

Equation (16) can be rewritten in matrix form as follows:
$$ J_x\,\dot{\mathbf{X}} = J_q\,\dot{\mathbf{q}}, \qquad (18) $$
where
$$ J_x = \begin{bmatrix} \mathbf{s}_1^{T} & \left(\left({}^{O}\mathbf{C}_1 - \mathbf{p}_P\right)\times\mathbf{s}_1\right)^{T} \\ \vdots & \vdots \\ \mathbf{s}_6^{T} & \left(\left({}^{O}\mathbf{C}_6 - \mathbf{p}_P\right)\times\mathbf{s}_6\right)^{T} \end{bmatrix}, \qquad J_q = \operatorname{diag}\!\left( \mathbf{s}_i^{T}\,\frac{\partial\,{}^{O}\mathbf{B}_i}{\partial q_i} \right)_{i=1,\dots,6}, \qquad (19) $$
with $\mathbf{s}_i = {}^{O}\mathbf{C}_i - {}^{O}\mathbf{B}_i$, $\dot{\mathbf{X}} = \left[\dot{\mathbf{p}}_P^{T}\;\;\boldsymbol{\omega}_P^{T}\right]^{T}$, and $\dot{\mathbf{q}} = \left[\dot{q}_1\;\cdots\;\dot{q}_6\right]^{T}$.

$J_x$ and $J_q$ are the forward and inverse Jacobian matrices, respectively. Equation (18) is used to find the system equation for the UKF in Section 4.2.1.

3. System Description, Vision-Based Pose Estimation, and Problem Statement

3.1. System Description

Figure 2 shows the block diagram of the hybrid filter for the pose estimation of the Hexa parallel robot. Block A is the motion control, which includes the path planning and motion control of the robot. In this paper, motion control is not covered in detail because it is outside the scope of the study. It is assumed that the desired trajectory is given and that the motion controller drives the motors through all points of the trajectory. Block B is the image filter, which includes the image processing, the feature detection, and a complementary filter to stabilize the feature points. Details are presented in Sections 3.2 and 4.1. Block C is the pose estimation, which includes the 3D reconstruction used to calculate the 3D coordinates of the feature points, from which the pose of the moving plate is calculated. Details are presented in Section 3.2. Block D is the dynamic filter, which includes a UKF to smooth out the pose estimation. Details are presented in Section 4.

$\mathbf{X}_d$ is the desired pose of the moving plate at one point of the trajectory. $\mathbf{q}_d$ is the rotation vector of the arms corresponding to $\mathbf{X}_d$. The relationship between $\mathbf{X}_d$ and $\mathbf{q}_d$ is described as follows:
$$ \mathbf{q}_d = f_{\mathrm{IK}}\!\left(\mathbf{X}_d\right), \qquad (20) $$
where $f_{\mathrm{IK}}$ is the apparent function of the inverse kinematics problem presented in Section 2.1.

$\mathbf{q}_r$ is the real rotation vector of the arms. $\dot{\mathbf{q}}_d$ is the desired angular velocity of the arms. $\dot{\mathbf{q}}_r$ is the real angular velocity of the arms. $\mathbf{q}_r$ and $\dot{\mathbf{q}}_r$ are obtained by reading the encoders of the motors. $\mathbf{X}_r$ is the real pose of the moving plate after the motion control corresponding to $\mathbf{X}_d$. There is always an error between $\mathbf{X}_d$ and $\mathbf{X}_r$ due to many reasons such as model uncertainties, system nonlinearities, the backlash of the motor gearboxes, and unknown disturbances. In the experiment, $\mathbf{X}_r$ is measured by a highly accurate instrument after each moving point of the trajectory and used as the ground truth to evaluate the results.

$\mathbf{u}_j$ is the coordinate of the feature point $j$ in the image plane. $\mathbf{X}_m$ is the pose of the moving plate obtained from the vision-based pose estimation. $\hat{\mathbf{X}}$ is the pose of the moving plate after using the UKF to smooth out the pose estimation. The filtered value $\hat{\mathbf{X}}$ after the dynamic filter is used to calculate the angles of the arms through the inverse kinematics problem and compared with the desired value to compensate for the errors.

3.2. Vision-Based Pose Estimation

Figure 3 presents the vision system of the Hexa parallel robot used in this study. The Intel RealSense Camera D435i is located below the fixed platform, pointing towards the surface of the moving plate to capture the image of the feature points. $\{C\}$ is the camera coordinate frame. The feature points are detected by color, shape, and area using OpenCV 4.3.0, which is publicly available. After the image processing, the coordinate $\mathbf{u}_j = [u_j\;v_j]^{T}$ of each feature point $j$ ($j = 1, 2, 3$) in the image plane is obtained.
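As an illustration of this detection step, the following is a minimal OpenCV sketch, assuming circular color markers segmented in HSV space; the threshold values, area limits, and function names are illustrative assumptions rather than the authors' exact implementation.

```python
import cv2
import numpy as np

def detect_feature_point(frame_bgr, hsv_low, hsv_high, min_area=50.0, max_area=5000.0):
    """Return the subpixel image coordinate (u, v) of one color marker, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for cnt in sorted(contours, key=cv2.contourArea, reverse=True):
        area = cv2.contourArea(cnt)
        if not (min_area <= area <= max_area):
            continue
        m = cv2.moments(cnt)
        if m["m00"] > 0.0:
            # centroid kept as a real number (subpixel), as discussed in Section 4.1
            return (m["m10"] / m["m00"], m["m01"] / m["m00"])
    return None

# example: three markers with different HSV ranges (placeholder values)
# u1 = detect_feature_point(frame, (0, 120, 70), (10, 255, 255))      # red-ish
# u2 = detect_feature_point(frame, (40, 80, 70), (80, 255, 255))      # green-ish
# u3 = detect_feature_point(frame, (100, 120, 70), (130, 255, 255))   # blue-ish
```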

The 3D coordinate of the feature point $j$ with respect to the frame $\{C\}$ is calculated as follows:
$$ {}^{C}\mathbf{P}_j = f_{3D}\!\left(u_j, v_j, d_j\right), \qquad (21) $$
where $f_{3D}$ is the apparent function of the 3D reconstruction using the Application Programming Interfaces (APIs) for the Intel RealSense Camera, which are supplied with the device or can be found on the manufacturer’s website, and $d_j$ is the depth measured at the pixel $(u_j, v_j)$.
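The deprojection of Equation (21) can be performed with the RealSense SDK. The sketch below uses the pyrealsense2 Python wrapper for illustration only (the paper's software is written in C++); the stream settings and variable names are assumptions.

```python
import pyrealsense2 as rs

# start the camera with depth and color streams aligned to the color image
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)
align = rs.align(rs.stream.color)

def deproject(pixel_uv):
    """3D point (in meters) in the camera frame {C} for one image-plane coordinate."""
    frames = align.process(pipeline.wait_for_frames())
    depth_frame = frames.get_depth_frame()
    intrinsics = depth_frame.profile.as_video_stream_profile().intrinsics
    u, v = int(round(pixel_uv[0])), int(round(pixel_uv[1]))
    d = depth_frame.get_distance(u, v)          # measured depth d_j at the pixel
    # deprojection API call corresponding to the function f_3D of Equation (21)
    return rs.rs2_deproject_pixel_to_point(intrinsics, [float(u), float(v)], d)
```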

The 3D coordinate of the feature point $j$ with respect to the fixed frame $\{O\}$ is calculated as follows:
$$ {}^{O}\mathbf{P}_j = {}^{O}T_{C}\;{}^{C}\mathbf{P}_j, $$
where ${}^{O}T_{C}$ is the homogeneous transformation matrix from $\{C\}$ to $\{O\}$.

The position of the moving plate with respect to the fixed frame $\{O\}$ is calculated as the centroid of the three feature points:
$$ \begin{bmatrix} x_P & y_P & z_P \end{bmatrix}^{T} = \frac{1}{3}\sum_{j=1}^{3} {}^{O}\mathbf{P}_j. $$

To calculate the orientation of the moving plate, the plane passing through the three feature points is defined as follows:
$$ a x + b y + c z + d = 0, $$
with
$$ \begin{bmatrix} a & b & c \end{bmatrix}^{T} = \left({}^{O}\mathbf{P}_2 - {}^{O}\mathbf{P}_1\right) \times \left({}^{O}\mathbf{P}_3 - {}^{O}\mathbf{P}_1\right), \qquad d = -\begin{bmatrix} a & b & c \end{bmatrix} {}^{O}\mathbf{P}_1.
$$

The orientation of the moving plate is calculated as follows:
(i) The Euler angle $\phi$ is the angle between the plane of the feature points and the $xz$-plane, where the $xz$-plane has the equation $y = 0$.
(ii) The Euler angle $\theta$ is the angle between the plane of the feature points and the $yz$-plane, where the $yz$-plane has the equation $x = 0$.
(iii) The Euler angle $\psi$ is the angle between the $x$-axis and the line passing through the feature point 1 and the center of the plate.

Finally, the pose of the moving plate measured by the vision system is determined as follows:
$$ \mathbf{X}_m = \begin{bmatrix} x_P & y_P & z_P & \phi & \theta & \psi \end{bmatrix}^{T}. $$
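For illustration, the following sketch computes a pose vector from the three reconstructed points. The Euler-angle extraction uses one common roll-pitch-yaw convention (tilt of the plane normal plus the heading of the line toward feature point 1); the authors' exact angle definitions may differ, so this should be read as an assumption.

```python
import numpy as np

def pose_from_three_points(P1, P2, P3):
    """Pose [x, y, z, phi, theta, psi] of the plate from three 3D points in {O}."""
    P1, P2, P3 = map(np.asarray, (P1, P2, P3))
    center = (P1 + P2 + P3) / 3.0                   # position of the plate

    n = np.cross(P2 - P1, P3 - P1)                  # normal of the plane through the points
    n = n / np.linalg.norm(n)
    if n[2] < 0.0:                                  # keep the normal pointing upward
        n = -n

    # roll/pitch from the tilt of the normal (one possible convention)
    theta = np.arcsin(np.clip(n[0], -1.0, 1.0))     # rotation about the y-axis
    phi = np.arctan2(-n[1], n[2])                   # rotation about the x-axis

    # yaw from the line between the plate center and feature point 1
    v = P1 - center
    psi = np.arctan2(v[1], v[0])                    # angle to the x-axis

    return np.concatenate([center, [phi, theta, psi]])
```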

3.3. Problem Statement

Two factors may influence the outcome of the pose estimation in Figure 2:
(i) The feature detection in block B is noisy and has a certain margin of error in the image processing, mainly due to the influence of ambient light sources, as demonstrated in Figure 4. This noise affects the results of the depth measurement and the calculation of the 3D coordinates of the feature points by Equation (21).
(ii) The pose calculation in block C has errors due to many reasons such as model uncertainties, system nonlinearities, the backlash of the motor gearboxes, and unknown disturbances. The trajectory and speed of the robot also affect the error.

The objective of this study is to design a dynamic hybrid filter, the combination of the complementary filter in block B and the UKF in block D, to smooth out and correct the pose estimation of the moving plate. The complementary filter reduces the noise and stabilizes the measurement of the image processing. The UKF smooths out the pose calculation based on the robot’s kinematics. The corrected value is used as the feedback of the motion control to increase the accuracy.

4. The Dynamic Hybrid Filter

4.1. The Image Filter Using a Complementary Filter

In image processing, the coordinates of the feature points in the image plane may have errors due to noise, mainly light source noise and vibration when the robot moves. A 1-pixel error can lead to a position error of several millimeters after the 3D reconstruction of Equation (21). To increase accuracy, the pixel positions in this study are calculated as real numbers instead of integers. In this study, a complementary filter is used to remove the noise and stabilize the coordinates of the feature points in the image plane. Figure 4 shows the role of the complementary filter in the proposed hybrid filter; it reduces the noise significantly.

The formula of the filter is as follows:
$$ \hat{\mathbf{u}}_j(k) = \alpha_f\,\hat{\mathbf{u}}_j(k-1) + \left(1 - \alpha_f\right)\mathbf{u}_j(k), $$
where $\alpha_f \in (0, 1)$ is the filter gain, which is chosen experimentally to balance smoothness and lag. $k-1$ and $k$ are the previous and current instants, respectively. $j$ is the index of the feature point. $\mathbf{u}_j(k)$ is the coordinate of the feature point on the image plane calculated by image processing at the current instant. $\hat{\mathbf{u}}_j(k)$ is the coordinate of the feature point on the image plane after filtering.
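A direct transcription of this filter is shown below as a Python sketch; the gain value used here is only illustrative.

```python
class ComplementaryFilter:
    """Per-feature-point smoothing of the image-plane coordinates."""

    def __init__(self, gain=0.8, n_points=3):
        self.gain = gain                    # filter gain alpha_f, tuned experimentally
        self.state = [None] * n_points      # filtered coordinate of each feature point

    def update(self, j, u_measured):
        """u_measured = (u_j, v_j) from image processing at the current instant."""
        u, v = float(u_measured[0]), float(u_measured[1])
        if self.state[j] is None:           # first sample: no previous estimate yet
            self.state[j] = (u, v)
        else:
            pu, pv = self.state[j]
            self.state[j] = (self.gain * pu + (1.0 - self.gain) * u,
                             self.gain * pv + (1.0 - self.gain) * v)
        return self.state[j]
```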

4.2. The Dynamic Filter Using an Unscented Kalman Filter

The UKF [23] is used to smooth out the pose calculation based on the robot’s kinematics. The corrected value is used as the feedback of the motion control to increase the accuracy. This process is demonstrated in Figure 5.

4.2.1. The System Equation

Equation (18) can be rewritten as follows:
$$ \dot{\mathbf{X}} = J_x^{-1} J_q\,\dot{\mathbf{q}}, \qquad (28) $$
where $J_x$ is the forward kinematics Jacobian matrix of size $6\times 6$ and $J_q$ is the inverse kinematics Jacobian matrix of size $6\times 6$. $J_x$ and $J_q$ are calculated by Equation (19). $\dot{\mathbf{q}}$ is the angular velocity vector of the arms of the robot obtained by reading the encoders of the motors.

Equation (28) is expressed in discrete form as follows:
$$ \mathbf{X}_k = \mathbf{X}_{k-1} + J_x^{-1}\!\left(\mathbf{X}_{k-1}\right) J_q\!\left(\mathbf{X}_{k-1}\right)\dot{\mathbf{q}}_k\,\Delta t, \qquad (29) $$
where $k-1$ and $k$ are the previous and current instants, respectively. $\Delta t$ is the discrete time interval. $\mathbf{X}$ is the pose vector of the moving plate.
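The discrete model of Equation (29) maps directly to the UKF process function. The sketch below assumes the Jacobians $J_x$ and $J_q$ of Equation (19) are available through a user-supplied routine `jacobians(X)` (a hypothetical helper, not part of the paper).

```python
import numpy as np

def process_model(X_prev, qdot_k, dt, jacobians):
    """One step of Equation (29): X_k = X_{k-1} + Jx^-1 Jq qdot_k * dt.

    X_prev    : pose vector of the moving plate at instant k-1 (length 6)
    qdot_k    : angular velocities of the six arms at instant k (length 6)
    dt        : discrete time interval
    jacobians : callable returning (Jx, Jq), both 6x6, evaluated at X_prev
    """
    Jx, Jq = jacobians(X_prev)
    Xdot = np.linalg.solve(Jx, Jq @ np.asarray(qdot_k))   # Jx^-1 Jq qdot
    return np.asarray(X_prev) + Xdot * dt
```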

4.2.2. Step 1: Design Sigma Points

The UKF uses a set of sigma points to linearize the nonlinear model. The state variable $\mathbf{X}$ (the pose vector of the moving plate) has a dimensionality of $n = 6$, so the number of sigma points is $2n + 1 = 13$. To start a new instant, the sigma points are calculated as follows:
$$ \mathcal{X}_0 = \bar{\mathbf{X}}, \qquad \mathcal{X}_l = \bar{\mathbf{X}} + \left(\sqrt{(n+\lambda)P}\right)_l, \qquad \mathcal{X}_{l+n} = \bar{\mathbf{X}} - \left(\sqrt{(n+\lambda)P}\right)_l, \quad l = 1, \dots, n, \qquad (30) $$
where the sigma point matrix $\mathcal{X}$ has the size of $(2n+1)\times n$. $\bar{\mathbf{X}}$ is the mean value of $\mathbf{X}$. The covariance matrix $P$ has the size of $n\times n$. The square root can be calculated by using the Cholesky decomposition, which is widely documented. The index $l$ denotes the column of the matrix. The initial value of $P$ is set from the estimated standard deviations of the elements of $\mathbf{X}$. The initial value of $\bar{\mathbf{X}}$ is set equal to the desired pose of the moving plate at the current moving point. For the other instants, $\bar{\mathbf{X}}$ and $P$ are the estimates from the update step of the previous instant. The term $\left(\sqrt{(n+\lambda)P}\right)_l$ determines the spread of the sigma points around $\bar{\mathbf{X}}$. $\lambda = \alpha^{2}(n+\kappa) - n$ is the scaling coefficient. In this paper, $\alpha$ and $\kappa$ are set to the values given in Section 5.2.

The weights are calculated as follows:
$$ W_0^{m} = \frac{\lambda}{n+\lambda}, \qquad W_0^{c} = \frac{\lambda}{n+\lambda} + 1 - \alpha^{2} + \beta_u, \qquad W_l^{m} = W_l^{c} = \frac{1}{2(n+\lambda)}, \quad l = 1, \dots, 2n, \qquad (31) $$
where $W_l^{m}$ is the weight used to calculate the mean associated with the $l$-th sigma point and $W_l^{c}$ is the weight used to calculate the covariance associated with the $l$-th sigma point. $W^{m}$ and $W^{c}$ have the size of $2n+1$. $\beta_u$ is the optimal coefficient (for Gaussian distributions, $\beta_u = 2$ is optimal).
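A sketch of the scaled sigma-point and weight construction of Equations (30) and (31), following the standard formulation of [23]; the default coefficient values below are placeholders to be tuned as described in Section 5.2.

```python
import numpy as np

def sigma_points_and_weights(x_mean, P, alpha=1e-3, beta_u=2.0, kappa=0.0):
    """Return (sigma, Wm, Wc): the 2n+1 sigma points and their mean/covariance weights."""
    n = len(x_mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)          # square root of (n + lambda) P

    sigma = np.zeros((2 * n + 1, n))
    sigma[0] = x_mean
    for l in range(n):
        sigma[l + 1] = x_mean + S[:, l]
        sigma[n + l + 1] = x_mean - S[:, l]

    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + 1.0 - alpha**2 + beta_u
    return sigma, Wm, Wc
```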

4.2.3. Step 2: Prediction

The predicted mean and covariance of the UKF are determined as follows:
$$ \mathcal{Y}_l = f\!\left(\mathcal{X}_l, \dot{\mathbf{q}}_k\right), \qquad \bar{\mathbf{X}}^{-} = \sum_{l=0}^{2n} W_l^{m}\,\mathcal{Y}_l, \qquad P^{-} = \sum_{l=0}^{2n} W_l^{c}\left(\mathcal{Y}_l - \bar{\mathbf{X}}^{-}\right)\left(\mathcal{Y}_l - \bar{\mathbf{X}}^{-}\right)^{T} + Q, \qquad (32) $$
where the index $l$ denotes the column of the matrix. $Q$ is the process noise covariance matrix, which has the size of $n \times n$. $\mathcal{Y}$ is the sigma point matrix mapped through the process model, associated with the sigma points $\mathcal{X}$. $f$ is the nonlinear function, which is estimated based on the system kinematics as shown in Equation (29).
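A sketch of the prediction step of Equation (32), propagating each sigma point through the process model; `process_model`, `jacobians`, and the variable names follow the earlier sketches and are assumptions.

```python
import numpy as np

def ukf_predict(sigma, Wm, Wc, qdot_k, dt, jacobians, Q):
    """Predicted mean and covariance from the propagated sigma points."""
    Y = np.array([process_model(s, qdot_k, dt, jacobians) for s in sigma])
    x_pred = Wm @ Y                                    # predicted mean
    dY = Y - x_pred
    P_pred = dY.T @ np.diag(Wc) @ dY + Q               # predicted covariance
    return Y, x_pred, P_pred
```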

4.2.4. Step 3: Update

The Intel RealSense camera measures the pose of the moving plate and returns a measured pose vector $\mathbf{X}_m$. This measurement contains a lot of noise and is not constant between instants, even at the same moving point. The difference vector between $\mathbf{X}_m$ and the predicted mean $\bar{\mathbf{X}}^{-}$ is calculated as follows:
$$ \Delta\mathbf{X} = \mathbf{X}_m - \bar{\mathbf{X}}^{-}. $$

After that, the sigma points are mapped into the measurement space based on the difference $\Delta\mathbf{X}$ by using a measurement function $h$ as follows:
$$ \mathcal{Z}_l = h\!\left(\mathcal{Y}_l, \Delta\mathbf{X}\right) + \mathbf{v}, $$
where $\mathcal{Z}$ is the mapped sigma point matrix, which has the size of $(2n+1)\times n$ and is associated with $\mathcal{Y}$. $\mathbf{v}$ is the measurement noise signal, which can be assumed to be a zero-mean Gaussian white noise sequence.

In practice, each column of $\mathcal{Z}$ is calculated from the corresponding column of $\mathcal{Y}$ and the difference $\Delta\mathbf{X}$.

The measured mean and covariance of the UKF are determined as follows:
$$ \bar{\mathbf{z}} = \sum_{l=0}^{2n} W_l^{m}\,\mathcal{Z}_l, \qquad P_{zz} = \sum_{l=0}^{2n} W_l^{c}\left(\mathcal{Z}_l - \bar{\mathbf{z}}\right)\left(\mathcal{Z}_l - \bar{\mathbf{z}}\right)^{T} + R, \qquad (36) $$
where the index $l$ denotes the column of the matrix. $R$ is the covariance matrix of the measurement noise sequence, which has the size of $n\times n$.

The cross covariance is determined as follows:
$$ P_{xz} = \sum_{l=0}^{2n} W_l^{c}\left(\mathcal{Y}_l - \bar{\mathbf{X}}^{-}\right)\left(\mathcal{Z}_l - \bar{\mathbf{z}}\right)^{T}. $$

The Kalman gain is calculated as follows:
$$ K = P_{xz}\,P_{zz}^{-1}. $$

Finally, the new state variable and covariance matrix are updated as follows:
$$ \hat{\mathbf{X}} = \bar{\mathbf{X}}^{-} + K\left(\mathbf{X}_m - \bar{\mathbf{z}}\right), \qquad P = P^{-} - K\,P_{zz}\,K^{T}. $$

The new state variable and covariance matrix are used for the next instant from step 1 in Section 4.2.2.
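A sketch of the measurement update is given below, assuming that the vision system measures the full pose directly so that the measurement mapping reduces to the identity (the paper's mapping based on the difference vector may differ in detail); it completes the `ukf_predict` sketch above.

```python
import numpy as np

def ukf_update(Y, x_pred, P_pred, Wm, Wc, z_meas, R):
    """Correct the predicted pose with the vision measurement z_meas (length 6)."""
    Z = Y                                       # identity measurement mapping (assumed)
    z_mean = Wm @ Z                             # measurement mean
    dZ = Z - z_mean
    dY = Y - x_pred
    P_zz = dZ.T @ np.diag(Wc) @ dZ + R          # innovation covariance
    P_xz = dY.T @ np.diag(Wc) @ dZ              # cross covariance
    K = P_xz @ np.linalg.inv(P_zz)              # Kalman gain
    x_new = x_pred + K @ (np.asarray(z_meas) - z_mean)
    P_new = P_pred - K @ P_zz @ K.T
    return x_new, P_new

# one complete UKF cycle (illustrative):
# sigma, Wm, Wc = sigma_points_and_weights(x_est, P_est)
# Y, x_pred, P_pred = ukf_predict(sigma, Wm, Wc, qdot_k, dt, jacobians, Q)
# x_est, P_est = ukf_update(Y, x_pred, P_pred, Wm, Wc, X_m, R)
```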

4.3. The Stability of the Unscented Kalman Filter

The stability analysis of the UKF is complicated, especially since the measurement of the update step is provided by a vision system. The study in [27] proved that, by choosing suitable initial conditions of the filter, the estimation error is bounded and the filter is stable. In this paper, the stability analysis is not repeated; stability is achieved experimentally by tuning the initial filter parameters.

5. Simulation, Experiment, and Comparison

5.1. Hardware

Figure 3 shows the real Hexa parallel robot used in this study. The Intel RealSense Camera D435i is used as the 3D vision system. The simulation is conducted in MATLAB. The experimental robot control software is written in Visual C++ by the authors. Table 1 shows the parameters of the 3D vision system. The real frame rate of the vision system is 20 fps because the processing interval of the program is 50 ms. Figure 6 is the block diagram of the simulation in MATLAB. To verify the performance of the hybrid filter, in both the simulation and the experiment, the Hexa robot is moved along the helix path shown in Figure 7. The angular velocity of the arms is set to a constant value in rad/s.

5.2. Initial Parameters of UKF

In Equations (30) and (31), the coefficients $\alpha$, $\beta_u$, and $\kappa$ are tuned experimentally. In Equation (32), the process noise covariance is constructed as $Q = \operatorname{diag}\!\left(\boldsymbol{\sigma}_Q \circ \boldsymbol{\sigma}_Q\right)$, where $\circ$ is the Hadamard product and $\boldsymbol{\sigma}_Q$ is the vector of process standard deviations. In Equation (36), the measurement noise covariance is constructed as $R = \operatorname{diag}\!\left(\boldsymbol{\sigma}_R \circ \boldsymbol{\sigma}_R\right)$, where $\boldsymbol{\sigma}_R$ is the vector of measurement standard deviations. In Equation (30), the initial covariance matrix $P$ is set temporarily to a diagonal matrix, and the initial mean vector $\bar{\mathbf{X}}$ is set equal to the desired pose of the moving plate at the start point of the UKF process. These values are tuned in the simulation for the best performance and then applied to the experiment.
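For clarity, the construction of the noise covariance matrices described above can be written as in the sketch below; the standard-deviation values are placeholders, since the paper's tuned values are not reproduced here.

```python
import numpy as np

# per-axis standard deviations of the pose [x, y, z, phi, theta, psi]
sigma_Q = np.array([0.5, 0.5, 0.5, 0.01, 0.01, 0.01])   # process (placeholder values)
sigma_R = np.array([2.0, 2.0, 2.0, 0.03, 0.03, 0.03])   # measurement (placeholder values)

Q = np.diag(sigma_Q * sigma_Q)      # Hadamard product sigma_Q ∘ sigma_Q on the diagonal
R = np.diag(sigma_R * sigma_R)

P0 = np.diag(np.full(6, 1.0))       # temporary initial covariance (placeholder)
# x0 = X_d_start                    # initial mean = desired pose at the start point
```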

5.3. Simulation

The simulation creates the feedback angular velocity $\dot{\mathbf{q}}_r$ by adding random noise to the desired angular velocity $\dot{\mathbf{q}}_d$. The random noise is bounded within a range derived from the hardware. Considering that the real motor has a maximum error of 2 degrees per revolution, the random noise of the angular velocity is bounded by the corresponding limit in rad/s. The feedback rotation $\mathbf{q}_r$ is simulated by adding random noise to the desired rotation $\mathbf{q}_d$. Considering that the real motor encoder has a maximum error of 3 pulses, the random noise of the rotation is bounded by the corresponding limit in degrees. The simulation creates the measurement for the UKF by adding random noise to the desired pose at the corresponding instant. Based on the accuracy of the vision system shown in Table 1, the random noise of the measurement is bounded in millimeters for the translation and in radians for the Euler angles. The tracking data is recorded at each moving point.
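A sketch of how such bounded random noise can be injected in a simulation is shown below; the bounds and the dummy desired signals are placeholders inspired by the error figures quoted above, not the paper's exact values, and uniform noise is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_uniform_noise(x, bound):
    """Add zero-mean uniform random noise bounded by +/- bound (elementwise)."""
    x = np.asarray(x, dtype=float)
    return x + rng.uniform(-np.asarray(bound), np.asarray(bound), size=x.shape)

# dummy desired signals for one instant (placeholders)
q_d = np.zeros(6)                                   # desired arm rotations [rad]
qdot_d = np.full(6, 0.5)                            # desired angular velocities [rad/s]
X_d = np.array([0.0, 0.0, 500.0, 0.0, 0.0, 0.0])    # desired pose [mm, mm, mm, rad, rad, rad]

qdot_r = add_uniform_noise(qdot_d, 0.01)            # bound reflecting the motor error (placeholder)
q_r = add_uniform_noise(q_d, np.deg2rad(0.3))       # bound reflecting the encoder error (placeholder)
X_m = add_uniform_noise(X_d, [2.0, 2.0, 2.0, 0.03, 0.03, 0.03])  # simulated vision measurement
```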

Figures 8–13 show the tracking data of the first 50 points of the helix path in the simulation. The dotted blue line indicates the pose estimation of the vision system; its simulated noise is close to that of the real system. The dashed green line is the real pose of the plate. The solid red line is the smoothed estimation after using the UKF. For easy comparison, the mean squared error (MSE) and mean absolute error (MAE) of the simulation are summarized in Table 2. After using the UKF, the vision-based pose estimation is smoothed out, and the errors are reduced significantly.

5.4. The Experiment

As in the simulation, the Hexa robot is controlled to follow the helix trajectory through 101 points, as shown in Figure 7. The motion controller board measures the angular velocity and rotation of the arms, which are used in the UKF process to calculate the nonlinear function in Equation (32). The 3D Intel RealSense Camera D435i estimates the real pose of the plate, which is used in the update step of the UKF. The pose estimation after smoothing by the UKF is used as the feedback of the motion control to correct the pose of the plate. While the plate moves from point to point, the UKF is calculated continuously with a cycle of 50 ms, as presented in Section 5.1. After each movement, the robot is stopped for a moment to measure the real pose manually with an instrument, which is used as the ground truth of the experiment.

Figures 14–19 show the tracking data of the first 50 points of the helix path in the experiment. The dotted blue line indicates the pose estimation of the vision system. The dashed green line is the real pose of the plate, obtained by manual measurement. The solid red line is the smoothed estimation after using the UKF. As shown in the figures, the hybrid filter corrects the pose measurement to be very close to the real pose, although the measurement is very noisy due to the vibration of the real Hexa robot when moving. If the smoothed estimation is highly accurate, it will help the closed-loop motion control to be highly accurate. As stated in Section 3.3, motion control is outside the scope of this paper, so its experimental data is not shown here. For easy comparison, the mean squared error (MSE) and mean absolute error (MAE) of the experiment are summarized in Table 3. After using the UKF, the vision-based pose estimation is smoothed out, and the errors are also reduced significantly.

The highly accurate result of the vision-based pose estimation using the proposed hybrid filter could be used as feedback in dynamic path tracking control in conjunction with a position controller. This will be covered in a separate work owing to the length limitation of this article.

5.5. The Comparison

Table 4 compares the mean absolute error of different pose estimation methods with that of the proposed approach. The dynamic hybrid filter achieves encouraging results compared to most other methods. Its pose estimation accuracy is inferior to the method in [20]. However, the method in [20] is applied to an industrial robot arm and uses a high-quality vision system, whereas the proposed method uses a simple and cheap vision system that is easy to install and use. It is also applied to a laboratory model of a Hexa parallel robot, which has low mechanical quality. If applied to a professional industrial robot with good hardware, the proposed approach may achieve better performance.

6. Conclusions

This paper proposes a novel approach for the pose estimation of a Hexa parallel robot. The vision system, using simple color feature points fixed on the surface of the end-effector and the Intel RealSense Camera D435i, offers a cheap solution with high accuracy in pose estimation. The dynamic hybrid filter is designed to correct the vision-based pose measurement. The complementary filter is used to eliminate the noise of the image processing due to environmental light source interference. The UKF is designed to smooth out the pose estimation of the vision system based on the robot’s kinematic parameters. The combination of the two filters in the same control scheme contributes to increased stability and improved accuracy of the robot’s positioning. The simulation, experiment, and comparison results are encouraging: the measurement error of the vision system is reduced to 9–12 percent of its original value in the simulation and to 14–21 percent in the experiment. Although the result does not match that of methods using high-end hardware, it is a feasible approach for low-cost robot systems. As a further improvement, the EKF, the AKF, and the proposed hybrid filter could be deployed on the same Hexa parallel robot for a more accurate comparison and evaluation of efficiency. The vision-based pose estimation using the proposed hybrid filter could be used as feedback in dynamic path tracking control in conjunction with a position controller to analyze the performance in a real application. In this paper, the parameters of the UKF are determined by experimental tuning. In future works, an adaptive learning controller could be designed to determine those parameters automatically.

Data Availability

The data of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This study was supported in part by the Ministry of Science and Technology, Taiwan, under grants MOST 109-2221-E-011-068 and MOST 109-2811-E-011-516.