Abstract
2.5D vision-based techniques, also known as hybrid vision-based techniques, provide flexible ways to estimate the range or velocity of moving objects. Information from both the image space (2D) and the Cartesian space (3D) is used simultaneously to construct the system state, which overcomes the disadvantages of the traditional visual servoing schemes. The approach has been widely adopted in motion-from-structure, structure-from-motion, and structure-and-motion problems.
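To make the hybrid idea concrete, the sketch below shows one common way a 2.5D task vector is assembled (following the 2-1/2D visual servoing formulation of Malis and Chaumette): the 2D part is the normalized image-coordinate error, while the 3D part combines the log of a depth ratio ρ = Z/Z* and an axis-angle rotation error θu, both typically recovered from a homography decomposition. The function names and the assumption that ρ and R are already available are illustrative, not taken from the entry itself.

```python
import numpy as np

def rotation_to_theta_u(R):
    """Convert a 3x3 rotation matrix to its axis-angle vector theta*u."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return np.zeros(3)
    # Axis from the skew-symmetric part of R.
    u = (1.0 / (2.0 * np.sin(theta))) * np.array([
        R[2, 1] - R[1, 2],
        R[0, 2] - R[2, 0],
        R[1, 0] - R[0, 1],
    ])
    return theta * u

def hybrid_error(p, p_star, rho, R):
    """2.5D (hybrid) task error.

    p, p_star : current / desired normalized image coordinates (2D part)
    rho       : depth ratio Z/Z*, e.g. from homography decomposition (3D part)
    R         : rotation between current and desired camera frames (3D part)
    """
    e_2d = np.asarray(p, dtype=float) - np.asarray(p_star, dtype=float)
    e_depth = np.log(rho)              # Cartesian range information
    e_rot = rotation_to_theta_u(R)     # Cartesian orientation information
    return np.concatenate([e_2d, [e_depth], e_rot])
```

Driving this six-dimensional error to zero regulates translation partly in the image (keeping the feature visible) and rotation in Cartesian space, which is the decoupling that distinguishes 2.5D schemes from purely image-based or position-based ones.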
© 2021 Springer Nature Switzerland AG
Cite this entry
Chen, J. (2021). 2.5D Vision-Based Estimation. In: Baillieul, J., Samad, T. (eds) Encyclopedia of Systems and Control. Springer, Cham. https://doi.org/10.1007/978-3-030-44184-5_100148
Print ISBN: 978-3-030-44183-8
Online ISBN: 978-3-030-44184-5