Abstract
The combination of photogrammetric aerial and terrestrial recording methods opens new opportunities for photogrammetric applications. A UAV (Unmanned Aerial Vehicle), in our case a helicopter system, can cover both aerial and quasi-terrestrial image acquisition. A UAV can be equipped with an on-board high-resolution camera and a priori knowledge of the operating area in which photogrammetric tasks are to be performed. In this general scenario, this paper proposes vision-based techniques for localizing a UAV. Only natural landmarks provided by a feature-tracking algorithm are considered, without the help of visual beacons or landmarks with known positions. The novel idea is to perform global localization, position tracking and recovery from localization failure (kidnapping) based only on visual matching between the current view and available georeferenced satellite images. The matching is based on SIFT features, and the system estimates the position and altitude of the UAV with respect to the reference image. The vision system replaces the GPS signal by combining position information from visual odometry with georeferenced imagery. Georeferenced satellite or aerial images must be available on board beforehand or downloaded during the flight. The growing availability of high-resolution satellite images (e.g., those provided by Google Earth or other local sources) makes this topic very interesting and timely. Experiments with both synthetic images (i.e., taken from satellites or datasets and pre-processed) and real-world images were performed to test the accuracy and robustness of the method. Results show performance comparable to that of common GPS systems, and altitude estimation also performs well, although for altitude only preliminary results are available.
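The core step described above — recovering the UAV's ground position and altitude from feature matches between the on-board view and a georeferenced reference image — can be illustrated with a minimal sketch. This is not the authors' implementation: the correspondences are assumed to come from SIFT matching with outliers already rejected (e.g., RANSAC-style filtering), rotation is omitted, terrain is assumed flat with a nadir-pointing camera, and all names (`estimate_pose` and its parameters) are hypothetical.

```python
def estimate_pose(matches, principal_point, ref_origin, ref_resolution, focal_px):
    """Estimate UAV ground position and altitude from feature matches.

    Illustrative parameters (not from the paper):
      matches         -- list of ((u, v), (u_ref, v_ref)) pixel correspondences
                         between the on-board image and the reference image,
                         e.g., produced by SIFT matching after outlier removal
      principal_point -- (cx, cy) image centre of the on-board camera, pixels
      ref_origin      -- (east, north), in metres, of reference pixel (0, 0);
                         reference axes assumed aligned with east/north
      ref_resolution  -- ground resolution of the reference image, m/pixel
      focal_px        -- camera focal length, pixels

    Returns (east, north, altitude) in metres.
    """
    n = len(matches)
    # Centroids of the two point sets.
    mi = [sum(m[0][k] for m in matches) / n for k in (0, 1)]
    mr = [sum(m[1][k] for m in matches) / n for k in (0, 1)]
    # Least-squares scale of the model  ref ~= s * img + t  (no rotation term).
    num = sum((m[1][0] - mr[0]) * (m[0][0] - mi[0]) +
              (m[1][1] - mr[1]) * (m[0][1] - mi[1]) for m in matches)
    den = sum((m[0][0] - mi[0]) ** 2 + (m[0][1] - mi[1]) ** 2 for m in matches)
    s = num / den                       # reference pixels per image pixel
    tx, ty = mr[0] - s * mi[0], mr[1] - s * mi[1]
    # Project the on-board image centre into the reference image, then into
    # world coordinates: this is the point on the ground below the camera.
    u_ref = s * principal_point[0] + tx
    v_ref = s * principal_point[1] + ty
    east = ref_origin[0] + u_ref * ref_resolution
    north = ref_origin[1] + v_ref * ref_resolution
    # The on-board image's ground sampling distance is s * ref_resolution
    # (m/pixel); under a pinhole model, altitude = focal length (px) * GSD.
    altitude = focal_px * s * ref_resolution
    return east, north, altitude
```

On synthetic correspondences generated by a pure scale-and-shift mapping the recovery is exact; with real SIFT matches the fit is only as good as the outlier rejection that precedes it.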
Cesetti, A., Frontoni, E., Mancini, A. et al. A Visual Global Positioning System for Unmanned Aerial Vehicles Used in Photogrammetric Applications. J Intell Robot Syst 61, 157–168 (2011). https://doi.org/10.1007/s10846-010-9489-5