Abstract
A new type of distributed wireless sensor network, which combines a laser range finder with a binocular vision sensor, is developed to improve measurement accuracy along the optical-axis direction. After the binocular vision sensor obtains the coordinates of the target, the laser range finder, which is mounted on a two-axis rotary table, measures the distance between the target and the turntable at its current position. An adaptive weighted multi-sensor information fusion algorithm is then proposed to use the multi-sensor information more efficiently and make the results more accurate. Finally, the parameters of the system are calibrated, and simulations and experiments show that the system is feasible and effective.
1 Introduction
Target tracking technology originated during the Second World War. With the subsequent development of various sensors, such as radar, infrared, and sonar, it has advanced rapidly. The combination of data association and Kalman filtering in target tracking by Bar-Shalom in 1975 marked the beginning of a new stage [1–5].
Nowadays, most noncontact positioning measurement research focuses on long distances beyond 1000 m. There is also some work on short distances below 1 cm or 1 mm that takes advantage of high-resolution photonic devices. However, few measurement systems for medium distances between 1 and 100 m have been designed, due to technology gaps [6–8]. With the increasing requirement of coordination among vehicles in aerospace tasks, it is important and necessary to develop accurate and efficient medium-distance positioning measurement systems [9–13], and the development of wireless sensor networks [14–17] makes this possible.
Cameras have been extensively used in positioning measurement due to advances in machine vision and image processing technologies [18–20]. A binocular vision sensor system can obtain target position information from target images taken with two parallel cameras. However, the shortcoming of this method is its low accuracy in the optical-axis direction [21–23].
As technology develops and targets become more numerous and faster, measurements from a single sensor can no longer meet the requirements. To use measurement data with maximum efficiency, combinations of multiple sensors have been explored; this is the focus of multi-sensor information fusion and target tracking [24–27].
In multi-sensor information fusion systems, the main methods of aerial target tracking include GPS navigation, passive vision sensors, and active vision sensors. In [28], a combined system of a millimeter-wave radar and a CCD camera is proposed. In [29], a multiple-camera system with overlapping or non-overlapping views is developed to reconstruct the 3D trajectories of objects. In [30], a multiple-camera tracking method for surveillance systems is proposed to reconstruct the 3D object position from 2D images of the object. These methods are suitable for long-distance tracking, but at medium distances the airfoils and tail of a vehicle can block GPS signals; and although vision sensors can cover this range, the influence of weather and clouds makes the errors along the optical-axis direction large. This paper uses a laser range finder to correct the binocular vision sensor's errors along the optical axis, and then applies an information fusion algorithm to make the target tracking more accurate.
In the early 1990s, Kamen and Sastry presented a novel approach to multiple target tracking based on symmetric measurement equations in [31] and [32], which laid a foundation for information fusion. The Extended Kalman Filter (EKF) was then used to handle nonlinear systems in [33, 34], and the Unscented Kalman Filter (UKF), introduced by Julier and Uhlmann in [35, 36], was also applied. Numerous information fusion algorithms have since been proposed. Although these algorithms require complex computation, the system in this paper contains only two sensor subsystems, so a simple fusion algorithm that produces results quickly is sufficient. Based on this system, a multi-sensor adaptive weighted fusion algorithm is proposed.
In this paper, a noncontact position measurement system is designed by using a heterogeneous wireless sensor network for improving the accuracy of the medium distance position measurement. The heterogeneous sensor network is composed of three nodes which are two camera sensors and a laser range-finder and the information obtained by the sensors can be exchanged from one to another in the network. By using data fusion approaches, the accuracy of the target position measurement can be greatly improved. Experiment results have verified the performance of the designed position measurement system.
The rest of this paper is organized as follows. In Section 2, the coordinate system is established. Section 3 presents the registration of the systems. Section 4 presents the fusion algorithm. In Sections 5 and 6, the simulation and experimental results are provided respectively, followed by the conclusion in Section 7.
2 Establishment of the coordinate system
As shown in Figs. 1 and 2, the heterogeneous distributed sensor network system is composed of three sensors: a laser range finder and two vision cameras. The laser range finder is fixed on a two-axis rotary table so that the laser aiming direction can be changed. Each sensor has its own processor to run the algorithms. A wireless communication system based on WiFi is employed, and the sensor nodes can communicate with each other through the wireless network, which is assumed to be secure, since an insecure wireless network may result in system failure [37]. Targets with different colors or shapes are pasted on the surface of the object to be measured by the sensors. If no fewer than three targets are captured by the sensors, the position and attitude of the object can be calculated.
Figure 1 shows the principle of the measure method with one target.
When the system is working, target images are taken by the two vision cameras, and the 2D coordinates of the target, (x1, y1) and (x2, y2), in each image are calculated by the camera sensors' processors. The 2D coordinates obtained by each camera are then exchanged among the three sensor nodes. The 3D coordinates of the target (xc, yc, zc) are obtained by the space rendezvous algorithm in each sensor node's processor using the following formula.
where dx and dy are the pixel sizes along the x-axis and y-axis of the camera, u0 and v0 are the optical center coordinates of the camera, and ax and ay are the normalized focal lengths along the x-axis and y-axis.
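Since the triangulation formula itself does not survive in this excerpt, the following is a minimal sketch of parallel-axis stereo triangulation using the parameters defined above (u0, v0, ax, ay). The function name, the rectified-camera assumption, and the baseline parameter are our illustration, not necessarily the paper's exact formulation.

```python
def triangulate(u1, v1, u2, v2, ax, ay, u0, v0, baseline):
    """Recover 3D target coordinates from a rectified stereo pair.

    (u1, v1), (u2, v2): pixel coordinates of the target in the two images
    ax, ay            : normalized focal lengths (pixels)
    u0, v0            : principal point (pixels)
    baseline          : distance between camera centers O1 and O2 (m)
    """
    # Horizontal pixel disparity between the two views
    disparity = (u1 - u0) - (u2 - u0)
    if disparity == 0:
        raise ValueError("zero disparity: target at infinity")
    zc = ax * baseline / disparity   # depth along the optical axis
    xc = (u1 - u0) * zc / ax         # lateral position
    yc = (v1 - v0) * zc / ay         # vertical position
    return xc, yc, zc
```

Depth accuracy degrades quadratically with distance because disparity shrinks, which is exactly the optical-axis weakness the laser range finder is introduced to correct.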
After obtaining the 3D coordinates of the target (xc, yc, zc), the processor of the laser range finder drives the two-axis rotary table to aim at the target and measures the distance to it via the laser range finder. The measured distance is sent to each sensor node of the wireless network, and the accurate target coordinates are calculated using a data fusion approach.
The structure of the heterogeneous wireless sensor network is shown in Fig. 2.
As shown in Fig. 3, the coordinate systems are established. O1 and O2 represent the two cameras of the binocular vision sensor, A is the exit point of the laser range finder, which is mounted on the two-axis rotary table, and C represents the target. Two coordinate systems are set up: O1XYZ and AX′Y′Z′.
Coordinate system O1XYZ: camera O1 is the origin, the direction from O1 toward the other camera O2 is the X-axis, the axis perpendicular to O1O2 pointing upward is the Y-axis, and the right-hand rule determines the direction of the Z-axis.
Coordinate system AX′Y′Z′: point A is the origin, and the X′, Y′, Z′ axes are parallel to those of O1XYZ. The coordinates of point C in AX′Y′Z′ are (x′c, y′c, z′c), the coordinates of point A in O1XYZ are (xA, yA, zA), and the projection of point C onto the X′AY′ plane is C0. The angle between AC0 and the Y′ axis is α, and the angle between AC and the plane X′AY′ is β.
3 The registration of systems
The registration of systems includes two parts: space and time.
When multi-sensor data are fused, the sensor data must first be spatially registered. Data from different measurement coordinate systems must be converted to the same coordinate system; that is, the different measurements are transformed into the common coordinate system of the information processing and fusion center. Information fusion in this common coordinate system compensates for the combined error of each sensor and takes full advantage of the multi-source information.
Meanwhile, the measurements are obtained asynchronously; if the time-domain calibration is inaccurate, the quality of the information fusion will decline. In this system, the measurements of the binocular vision sensor can be regarded as real-time data. Due to its limited accuracy and range, the update frequency of the laser range finder is lower than that of the binocular vision sensor; a further restriction is the mechanical response of the two-axis rotary table, which cannot satisfy the real-time demand. Since the frequency of the laser range finder is far lower than that of the binocular vision sensor, it is set as the fiducial frequency.
Coordinate system O1XYZ is the fiducial coordinate system. The target information from the binocular vision sensor is expressed in O1XYZ, so it must be converted to coordinate system AX′Y′Z′, from which the pitch and yaw angles of the rotary table are obtained. The basic conversion formula is as follows:
where Rot(x, m) is the rotation matrix around the x-axis by angle m, and similarly for the other axes. [xcr, ycr, zcr] is the target coordinate after the rotation of coordinate system AX′Y′Z′.
The yaw angle α and pitch angle β are obtained from the geometric vectors:
The procedure is summarized in Algorithm 1.
According to the yaw angle α and pitch angle β, the two-axis rotary table is adjusted so that the laser is aimed at target C, and l (the distance AC) is obtained.
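The aiming computation above can be sketched as follows. The angle convention (α measured from the Y′ axis in the X′AY′ plane, β measured from that plane), the assumption that AX′Y′Z′ is a pure translation of O1XYZ, and the function name are ours; the paper's exact formulas are not reproduced in this excerpt.

```python
import math

def aim_angles(xc, yc, zc, xA, yA, zA):
    """Yaw alpha and pitch beta needed to aim the laser at target C.

    (xc, yc, zc): target coordinates in O1XYZ from the stereo pair
    (xA, yA, zA): laser exit point A in O1XYZ (calibrated offset)
    Returns (alpha, beta) in degrees.
    """
    # Express the target in the rotary-table frame AX'Y'Z'
    # (axes assumed parallel to O1XYZ, so this is a translation only).
    xp, yp, zp = xc - xA, yc - yA, zc - zA
    alpha = math.atan2(xp, yp)                  # yaw, measured from the Y' axis
    beta = math.atan2(zp, math.hypot(xp, yp))   # pitch, from the X'AY' plane
    return math.degrees(alpha), math.degrees(beta)
```

Once the table reaches (α, β), the laser reading l gives the range AC used in the fusion step.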
The z-coordinate in coordinate system O1XYZ, corrected by the laser range finder, can be obtained from Eqs. (3.1) and (3.3).
4 Fusion Algorithm
In this work, the multi-sensor adaptive weighted fusion algorithm is adopted to combine the information from the binocular vision sensor and the laser range finder. The model is shown in Fig. 4. The measurements of the binocular vision sensor and the laser range finder are X1 and X2, respectively, and the data are then weighted. The overall idea is to find, under the minimum-variance condition, the optimal weighting factors adaptively from the sensor data so that the fused estimate \( \widehat{\mathrm{X}} \) is optimal.
4.1 The derivation of fusion algorithm
The variances of the n sensors are σ1², σ2², …, σn², respectively, and the true value is X. The measured values of the n sensors are X1, X2, …, Xn, respectively, and the corresponding weighting factors are W1, W2, …, Wn. The fused unbiased estimate \( \hat{\mathrm{X}} \) satisfies the following relations [30]:
The total variance σ² is:
Because X1, X2, …, Xn are independent and the estimate of X is unbiased, we have
Formula (4.2) shows that the total variance is a quadratic function of the weighting factors and therefore has a minimum value:
According to the extreme value theory of multivariate functions, when the total variance reaches its minimum, the corresponding optimal weighting factors are obtained:
\( \sigma_{\min}^{2} = 1 \Big/ \sum_{i=1}^{n} \frac{1}{\sigma_{i}^{2}} \)
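The optimal weights and the resulting minimum variance can be computed directly from the sensor variances. The sketch below implements the standard inverse-variance weighting implied by the derivation above; the example variance values are illustrative, not from the paper.

```python
def optimal_weights(variances):
    """Minimum-variance weights for independent unbiased sensors.

    W*_i = (1 / sigma_i^2) / sum_j (1 / sigma_j^2), and the fused variance
    is sigma_min^2 = 1 / sum_j (1 / sigma_j^2).
    """
    inv = [1.0 / v for v in variances]
    total = sum(inv)
    weights = [x / total for x in inv]
    sigma_min2 = 1.0 / total
    return weights, sigma_min2

# Illustrative example: an accurate sensor (variance 2.25) fused with a
# less accurate one (variance 144). The accurate sensor gets the larger
# weight, and the fused variance never exceeds the best single variance.
w, s2 = optimal_weights([2.25, 144.0])
```

This is why, in the system at hand, the laser range finder dominates the fused Z-coordinate while the vision sensor still contributes a small correction.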
Like most studies based on adaptive techniques [38–44], the proposed algorithm is adaptive in the sense that it can be applied to other similar systems without repeating a complex design procedure. The algorithm is summarized in Algorithm 2.
4.2 Calculation of the weighting factors
Based on the proposed adaptive weighted algorithm, the multi-sensor adaptive weighted fusion algorithm of our study is defined as follows:
In formula (4.5), \( \overline{\mathrm{X}} \) is the estimate at moment k, namely the final fused value; W*p is the optimal weighting factor of the laser range finder (hereinafter referred to as sensor p); W*q is the optimal weighting factor of the binocular vision sensor (hereinafter referred to as sensor q); \( {\overline{\mathrm{X}}}_{\mathrm{p}}\left(\mathrm{k}\right) \) and \( {\overline{\mathrm{X}}}_{\mathrm{q}}\left(\mathrm{k}\right) \) are the mean values of sensors p and q at moment k, respectively.
Formula (4.6) gives the optimal weighting factors of the algorithm, i.e., the weighting factors corresponding to the minimum mean square error; σp² is the variance of sensor p at moment k.
The above equations are estimated from the measured value of the laser range finder at a single moment. If the true value \( \overline{\mathrm{X}} \) is a constant, its estimate is obtained from the historical measurements. Formula (4.7) gives the average at time k; the estimates for the binocular vision sensor and the laser range finder use the same formula. The optimal weighting factor of the laser range finder is calculated using the following formulas:
Formulas (4.8) to (4.11) give the variance σp² of the laser range sensor; the binocular vision sensor is handled in the same way. Vp is the error between the true value and the measured value, Rpp(k) is the auto-covariance of sensor p, Rpq(k) is the cross-covariance of sensors p and q, \( {\overline{R}}_{pq}(k) \) is the covariance mean value and the estimate of Rpq(k), and Xp(i) is the ith measurement of sensor p. With multiple groups of measured values, the optimal weighting factor of each sensor can be calculated according to the above definitions.
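One common batch realization of this variance estimation, shown below as a hedged sketch since the exact recursions (4.8)–(4.11) are not reproduced in this excerpt: with independent zero-mean measurement noises, σp² ≈ Rpp − Rpq, where Rpp is the time-averaged auto-covariance of sensor p and Rpq the cross-covariance of sensors p and q.

```python
def estimate_variances(xp, xq):
    """Estimate sensor noise variances from two parallel measurement streams.

    xp, xq: lists of simultaneous measurements from sensors p and q of the
    same quantity. Assumes independent zero-mean noises, so the shared
    signal cancels: sigma_p^2 ~ Rpp - Rpq and sigma_q^2 ~ Rqq - Rpq.
    """
    k = len(xp)
    mp = sum(xp) / k
    mq = sum(xq) / k
    rpp = sum((a - mp) ** 2 for a in xp) / k           # auto-covariance of p
    rqq = sum((b - mq) ** 2 for b in xq) / k           # auto-covariance of q
    rpq = sum((a - mp) * (b - mq) for a, b in zip(xp, xq)) / k  # cross-cov.
    return rpp - rpq, rqq - rpq
```

The estimated variances feed directly into the inverse-variance weights of formula (4.6), so the weights adapt online as more measurement pairs arrive.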
4.3 The computation of the fusion algorithm
Because the vision sensor's errors in the X and Y directions are very small, XC(k) and YC(k) take the mean values in the X and Y directions, and the Z-coordinate of the target at time k is obtained as follows:
where XC(k), YC(k), ZC(k) are the target space coordinates at moment k; W*pz is the optimal weighting factor of the laser range finder in the Z-direction; Zp(k) is the coordinate value measured by the laser range finder at moment k; W*qz is the optimal weighting factor of the binocular vision sensor in the Z-direction; Xq(k), Yq(k), Zq(k) are the coordinate values measured by the binocular vision sensor at moment k; lk is the distance measured by the laser range finder at moment k; αk and βk are the yaw and pitch angle values obtained from the binocular vision sensor at moment k; and zA is the Z-coordinate of the laser range finder in the binocular vision sensor coordinate system.
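Putting the pieces together, the fused Z-coordinate can be sketched as below. The paper's expression for Zp(k) is not reproduced in this excerpt, so the projection of lk onto the Z axis is a geometric assumption consistent with the angle definitions of Section 2 (β measured from the X′AY′ plane); the function name is ours.

```python
import math

def fused_z(l_k, alpha_k, beta_k, z_A, z_q, w_pz, w_qz):
    """Fused target Z-coordinate at moment k.

    l_k            : laser-measured distance from A to C
    alpha_k, beta_k: yaw and pitch (radians) from the vision sensor
    z_A            : Z offset of the laser exit point A in O1XYZ
    z_q            : Z-coordinate measured by the binocular vision sensor
    w_pz, w_qz     : optimal weights for laser and vision (sum to 1)
    """
    # Laser-derived depth: project l_k onto the Z axis and add the
    # calibrated offset of the laser exit point (geometric assumption).
    z_p = l_k * math.sin(beta_k) + z_A
    # Adaptive weighted fusion of the two Z estimates.
    return w_pz * z_p + w_qz * z_q
```

With the laser's much smaller variance, w_pz is close to 1, so the fused depth inherits the laser's accuracy while the vision term smooths residual aiming error.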
5 The simulation
In order to verify the accuracy and effectiveness of the fusion algorithm, we designed and built a simulation platform in a PC environment and conducted many experiments on it. The experiments do not consider the effect of environmental factors on the equipment. The experimental parameters are as follows:
1) By calibrating the system many times, the deviation matrix of the origin from the laser range finder coordinate system to the binocular vision coordinate system is obtained.

2) Different locations are measured by the binocular vision sensor and the laser range finder, and a least-squares fit yields the rotation matrix between the two coordinate systems:

3) The X-direction errors of the vision sensor are 0.15, 0.3, and 0.6 mm at 15, 17, and 20 m respectively; the Y-direction errors are likewise 0.15, 0.3, and 0.6 mm; and the Z-direction errors are 12, 20, and 31 mm.

4) The measurement error of the laser range finder is 1.5 mm.

5) The pitch and yaw angular accuracy of the two-axis rotary table is 0.02 degree.
The distances of 15, 17, and 20 m were simulated using the fusion algorithm. Figure 5 shows the RMSE (Root Mean Square Error) at 15, 17, and 20 m, respectively. From the RMSE we can clearly see that after using the fusion algorithm the error of the Z-coordinate is further reduced, the mean square error is smaller than that of any single sensor, and the measurement accuracy is greatly improved.
Because the better accuracy of the laser range finder leads to its larger weighting factor, the trend of the RMSE in Fig. 5 conforms to the expected result of the information fusion algorithm. The RMSE values increase gradually at 15, 17, and 20 m, which means that the fusion error grows with the target distance.
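For reference, the RMSE metric plotted in the simulation can be computed as follows (a minimal sketch; the function name is ours):

```python
import math

def rmse(estimates, truths):
    """Root mean square error of fused position estimates vs. ground truth."""
    n = len(estimates)
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truths)) / n)
```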
6 Experimental study
To evaluate the effectiveness of the position measurement system, the experimental setup shown in Fig. 6 is built. It consists of a flight vehicle model, two cameras, a rotary table, a laser-range sensor, and a 2-axis slider table. The 2-axis slider table is used to simulate the flight mode of the flight vehicle.
The parameters of experimental system are shown in Table 1.
In this experiment, the slider table moves according to a sinusoidal reference signal along the x (perpendicular to the optical axis) and y (along the optical axis) directions, respectively.
It can be seen from the experimental results in Fig. 7 that the results obtained by data fusion closely match the camera output as the slider table moves along the X direction. However, there are time lags in the fused results, because the movement of the turntable and the laser-range sensor takes time.
As shown in Fig. 8, when the slider table moves along the Y direction, the results obtained by the proposed position measurement system are much better than those obtained from the cameras alone, because the fusion of the laser distance with the camera outputs overcomes the error along the optical axis and thus improves the measurement accuracy.
7 Conclusions
A noncontact position measurement system using heterogeneous distributed sensor networks is proposed, which can be applied in complex spaces. An adaptive weighted multi-sensor information fusion algorithm is used to improve the measurement accuracy and make the process more efficient. The improvement in accuracy is verified by the experimental results.
One direction for future work is to reduce the serious time delay by replacing the two-axis rotary table with a scanning mirror, which has much better dynamic characteristics. Another is to design advanced algorithms that improve the system's accuracy and robustness while considering the impact of cyber attacks.
The results of this work provide useful guidance and reference for subsequent research.
References
Li XR, Jilkov VP (2003) A survey of maneuvering target tracking-part i: dynamic models[J]. IEEE Trans Aerosp Electron Syst 39(4):1333–1364
Li XR, Jilkov VP (2001) A survey of maneuvering target tracking-part iii: measurement models[C]. In: Proc 2001 SPIE Conf. Signal and Data Processing of Small Targets, San Diego, CA, pp 423–446
Fitzgerald RJ (2000) Analytic track solutions with range-dependent errors[J]. IEEE Trans Aerosp Electron Syst 36(1):343–348
Li XR, Zhu Y, Wang J et al (2003) Optimal linear estimation fusion. I. Unified fusion rules[J]. IEEE Trans Inf Theory 49(9):2192–2208
Yang W (2004) Multi-sensor data fusion and application[M].Xi’an: Xidian University Publishing House
Liu H, Zhang F (2013) Design and implementation of wireless sensor network management systems based on WEBGIS[J]. J Theor Appl Inf Technol 49(2):792–797
Chen D (2009) Calibration and data fusion between 3D laser scanner and monocular vision[D]. Dalian University of Technology 22–31
Wu Y, Hu D, Wu M et al. (2005) Unscented Kalman filtering for additive noise case: augmented Vs. non-augmented[C]//American Control Conference, 2005. Proceedings of the 2005. IEEE 4051–4055
Särkkä S (2010) Continuous-time and continuous–discrete-time unscented Rauch–Tung–Striebel Smoothers[J]. Signal Process 90(1):225–235
Ito K, Xiong K (2000) Gaussian filters for nonlinear filtering problems[J]. IEEE Trans Autom Control 45(5):910–927
Arasaratnam I, Haykin S, Elliott RJ (2007) Discrete-time nonlinear filtering algorithms using Gauss–Hermite Quadrature[J]. Proc IEEE 95(5):953–977
Arasaratnam I, Haykin S (2009) Cubature Kalman Filters[J]. IEEE Trans Autom Control 54(6):1254–1269
Gordon NJ, Salmond DJ, Smith AFM (1993) Novel approach to nonlinear/non-gaussian bayesian state estimation[C]//IEE proceedings F (radar and signal processing). IET Digit Libr 140(2):107–113
Chen J, Xu W, He S et al (2010) Utility-based asynchronous flow control algorithm for wireless sensor networks[J]. IEEE J Sel Areas Commun 28(7):1116–1126
Zhang Y, He S, Chen J (2016) Data gathering optimization by dynamic sensing and routing in rechargeable sensor networks[J]. IEEE/ACM Trans Networking 24(3):1632–1646
Meng W, Wang X, Liu S (2016) Distributed load sharing of an inverter-based microgrid with reduced communication[J]. IEEE Trans Smart Grid. doi:10.1109/TSG.2016.2587685
He S, Chen J, Li X, Shen X, Sun Y (2014) Mobility and intruder prior information improving the barrier coverage of sparse sensor networks. IEEE Trans Mob Comput 13(6):1268–1282
Arulampalam MS, Maskell S, Gordon N et al (2000) A tutorial on particle filters for online Nonlinear/Non-Gaussian Bayesian Tracking[J]. IEEE Trans Signal Process 50(2):174–188
Cappé O, Godsill SJ, Moulines E (2007) An overview of existing methods and recent advances in sequential Monte Carlo[J]. Proc IEEE 95(5):899–924
Van Der Merwe R, Doucet A, De Freitas N et al. (2000) The unscented particle filter[C]//NIPS 584–590
Pitt MK, Shephard N (1999) Filtering via simulation: auxiliary particle filters[J]. J Am Stat Assoc 94(446):590–599
Schon T, Gustafsson F, Nordlund PJ (2005) Marginalized particle filters for mixed linear/nonlinear state-space models[J]. IEEE Trans Signal Process 53(7):2279–2289
Kotecha JH, Djuric PM (2003) Gaussian particle filtering[J]. IEEE Trans Signal Process 51(10):2592–2601
Kotecha JH, Djuric PM (2003) Gaussian sum particle filtering[J]. IEEE Trans Signal Process 51(10):2602–2612
Li N, Sun SL, Ma J (2014) Multi-sensor distributed fusion filtering for networked systems with different delay and loss rates. Digital Signal Process 34:29–38
Wang XX, Liang Y, Pan Q, Zhao CH, Yang F (2014) Design and implementation of Gaussian filter for nonlinear system with randomly delayed measurements and correlated noises. Appl Math Comput 232:1011–1024
Dunne D, Kirubarajan T (2013) Multiple model multi-Bernoulli filters for manoeu-vering targets. IEEE Trans Aerosp Electron Syst 49(4):2679–2692
Sugimoto S, Tateda H, Takahashi H et al (2004) Obstacle detection using millimeter-wave radar and its visualization on image sequence[C]//Pattern Recognition, ICPR 2004. Proceedings of the 17th International Conference on. IEEE 3: 342–345
Li Y, Wu B, Nevatia R (2008) Human detection by searching in 3D space using camera and scene knowledge[C]//Pattern Recognition, 2008. ICPR 19th International Conference on. IEEE 2008: 1–5
Bücher T, Curio C, Edelbrunner J et al (2003) Image processing and behavior planning for intelligent vehicles[J]. IEEE Trans Ind Electron 50(1):62–75
Kamen EW, Sastry CR (1993) Multiple target tracking using products of position measurements[J]. IEEE Trans Aerosp Electron Syst 29(2):476–493
Lee YJ, Kamen EW (1993) SME filter approach to multiple target tracking with false and missing measurements[C]//Optical engineering and photonics in aerospace sensing. international society for optics and photonics 574–586
Sastry CR, Kamen EW (1993) SME filter approach to multiple target tracking with radar measurements[C]. Radar Sig Proces IEE Proc F IET 140(4):251–260
Julier SJ, Uhlmann JK (1997) New extension of the Kalman filter to nonlinear systems[C]//AeroSense'97. International Society for Optics and Photonics 182–193
Julier SJ, Uhlmann JK (2004) Unscented filtering and nonlinear estimation[J]. Proc IEEE 92(3):401–422
Li Y, Zhang L (2008) Multisensor adaptive weighted fusion algorithm and its application research [J]. Autom Instrum 136(2):10–13
Zhang H, Cheng P, Shi L et al (2016) Optimal DoS attack scheduling in wireless networked control system[J]. IEEE Trans Control Syst Technol 24(3):843–852
Meng W, Yang Q, Sun Y (2016) Guaranteed Performance Control of DFIG Variable-Speed Wind Turbines[J]. IEEE Trans Control Syst Technol. doi:10.1109/TCST.2016.2524531
Liu S, Liu XP, Saddik AE (2014) Modeling and distributed gain scheduling strategy for load frequency control in smart grids with communication topology changes. ISA Trans 52(2):454–461
Wang HQ, Liu XP, Liu KF (2016) Robust adaptive neural tracking control for a class of stochastic nonlinear interconnected systems. IEEE Trans Neural Netw Learn Syst 27(3):510–523
Liu S, Wang X, Liu PX (2015) Impact of communication delays on secondary frequency control in an islanded microgrid. IEEE Trans Ind Electron 62(4):2021–2031
Wang HQ, Liu XP, Liu KF (2015) Adaptive fuzzy tracking control for a class of pure-feedback stochastic nonlinear systems with non-lower triangular structure. Fuzzy Sets Syst. doi:10.1016/J.FSS.2015.10.001
You HE, Wang G, Peng Y (2000) Multi-sensor information fusion and application[M]. Beijing: Electronic Industry Press
Chao Z, Fu S, Jiang G, Yu Q (2011) Mono camera and laser range finding sensor position-pose measurement system [J]. Acta Opt Sin 31(3):1–6
Acknowledgments
The work is supported by the National Natural Science Foundation of China (51405013), the Fundamental Research Funds for the Central Universities (2014JBZ016), and the National Key Scientific Instrument and Equipment Development Project (2013YQ350747).
Shen, H., Zhang, K. & Nejati, A. A noncontact positioning measuring system based on distributed wireless networks. Peer-to-Peer Netw. Appl. 10, 823–832 (2017). https://doi.org/10.1007/s12083-016-0525-5