Abstract
In this paper, a robust monocular visual-inertial SLAM method based on nonlinear optimization is proposed. In our method, visual feature points are assigned different information matrices according to the image pyramid layer at which each feature is extracted. An IMU pre-integration strategy is adopted to avoid repeated IMU integration when initial states change during optimization. Meanwhile, we adopt sliding-window and marginalization strategies to achieve higher-precision state estimation while restricting computational complexity. Experiments comparing our algorithm with MSCKF and VINS on the EuRoC dataset show that our method can effectively estimate the motion and a sparse map.
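The pyramid-level weighting described above can be sketched as follows. Features detected at coarser pyramid levels are localized less precisely, so their measurement noise grows with the level and the information matrix (inverse covariance) of the reprojection residual shrinks accordingly. This is a minimal illustrative sketch, not the paper's implementation: the function name, the base noise of one pixel, and the scale factor of 1.2 per level are assumptions chosen for illustration.

```python
import numpy as np

def feature_information_matrix(pyramid_level, base_sigma=1.0, scale_factor=1.2):
    """Information matrix for a 2-D reprojection residual.

    A feature extracted at pyramid level ``l`` is assumed to have
    measurement noise ``base_sigma * scale_factor**l`` pixels, so its
    information matrix is the inverse of the (isotropic) covariance.
    ``base_sigma`` and ``scale_factor`` are illustrative values, not
    the paper's actual parameters.
    """
    sigma = base_sigma * scale_factor ** pyramid_level
    return np.eye(2) / sigma ** 2
```

In a nonlinear least-squares backend, this matrix would weight the squared reprojection error of each feature, so observations from finer pyramid levels contribute more strongly to the optimized states.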
References
Davison, A.J., Reid, I.D., Molton, N.D., et al.: MonoSLAM: real-time single camera SLAM. IEEE Trans. Pattern Anal. Mach. Intell. 29(6), 1052–1067 (2007)
Mur-Artal, R., Montiel, J.M.M., Tardos, J.D.: ORB-SLAM: a versatile and accurate monocular SLAM system. IEEE Trans. Robot. 31(5), 1147–1163 (2015)
Engel, J., Schöps, T., Cremers, D.: LSD-SLAM: large-scale direct monocular SLAM. In: 13th European Conference on Computer Vision, ECCV 2014, Zurich, Switzerland, 6–12 September 2014 (2014)
Li, M., Mourikis, A.I.: High-precision, consistent EKF-based visual-inertial odometry. Int. J. Robot. Res. 32(6), 690–711 (2013)
Bloesch, M., Omari, S., Hutter, M., et al.: Robust visual inertial odometry using a direct EKF-based approach. In: The 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2015, Hamburg, Germany, 28 September–03 October 2015 (2015)
Shen, S., Michael, N., Kumar, V.: Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. In: The 2015 IEEE International Conference on Robotics and Automation, ICRA 2015, Seattle, WA, USA, 26–30 May 2015 (2015)
Yang, Z., Shen, S.: Monocular visual-inertial state estimation with online initialization and camera-IMU extrinsic calibration. IEEE Trans. Autom. Sci. Eng. 14(1), 1–13 (2016)
Strasdat, H., Montiel, J.M.M., Davison, A.J.: Visual SLAM: why filter? Image Vis. Comput. 30(2), 65–77 (2012)
Qin, T., Li, P., Shen, S.: VINS-Mono: a robust and versatile monocular visual-inertial state estimator. IEEE Trans. Robot. 34(4), 1004–1020 (2018)
Forster, C., Carlone, L., Dellaert, F., et al.: On-manifold preintegration for real-time visual-inertial odometry. IEEE Trans. Robot. 33(1), 1–21 (2017)
Acknowledgements
This work is supported by the National Natural Science Foundation of China (Grant No. 41874034), the National Science and Technology Major Project of the National Key R&D Program of China (Grant No. 2016YFB0502102), the Beijing Natural Science Foundation (Grant No. 4202041), the Aeronautical Science Foundation of China.
Copyright information
© 2021 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Duo, J., Ji, L., Zhao, L. (2021). Robust Monocular Visual-Inertial SLAM Using Nonlinear Optimization. In: Jia, Y., Zhang, W., Fu, Y. (eds) Proceedings of 2020 Chinese Intelligent Systems Conference. CISC 2020. Lecture Notes in Electrical Engineering, vol 705. Springer, Singapore. https://doi.org/10.1007/978-981-15-8450-3_59
DOI: https://doi.org/10.1007/978-981-15-8450-3_59
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-8449-7
Online ISBN: 978-981-15-8450-3
eBook Packages: Intelligent Technologies and Robotics (R0)