Abstract
Special Session: Hybrid and Convertible Unmanned Aerial Vehicles—Environments with structures spanning multiple domains pose a great challenge for inspection and monitoring tasks, often requiring several robots to operate in them. Hybrid vehicles are an alternative to deploying multiple devices. Industrial marine ecosystems in particular involve a drastic change of medium that challenges these hybrid vehicles in both sensing and locomotion. The use of infrared and acoustic sensors is discouraged for this vehicle type because each operates in only one medium: infrared works only in air and acoustic sensors only underwater, so carrying both imposes weight constraints on the vehicle's structural design. Furthermore, integrating both sensors adds complexity to the implementation of autonomy. This work presents a benchmark to evaluate the performance of visual sensors in both air and water, showing their advantages and limitations.
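The paper itself reports the benchmark results; purely as an illustration of the idea, the sketch below shows one way an air-versus-water depth-accuracy comparison might be scripted. This is not the authors' code: the CSV layout, the column names (medium, true_m, measured_m), the file name depth_log.csv, and the choice of MAE/RMSE as metrics are all assumptions made for this example.

```python
"""Hypothetical sketch of an air-vs-water depth-sensor benchmark.

Assumes a CSV log with one row per reading:
    medium,true_m,measured_m
    air,1.00,1.02
    water,1.00,0.76
Column names and metrics (MAE/RMSE) are illustrative, not from the paper.
"""
import csv
import math
from collections import defaultdict


def load_readings(path):
    """Group (true, measured) distance pairs by medium ('air' or 'water')."""
    readings = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            readings[row["medium"]].append(
                (float(row["true_m"]), float(row["measured_m"]))
            )
    return readings


def benchmark(pairs):
    """Return MAE and RMSE of measured vs. ground-truth distance."""
    errors = [measured - true for true, measured in pairs]
    mae = sum(abs(e) for e in errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return mae, rmse


if __name__ == "__main__":
    for medium, pairs in load_readings("depth_log.csv").items():
        mae, rmse = benchmark(pairs)
        print(f"{medium}: MAE={mae:.3f} m, RMSE={rmse:.3f} m")
```

One reason per-medium statistics are worth separating: refraction at the camera housing makes underwater scenes appear closer than they are (by roughly the refractive index of water, about 1.33), so a benchmark of this kind would be expected to surface a systematic depth bias in water rather than only increased random noise.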
Acknowledgements
This study was financed by the Human Resource Program of the Brazilian National Agency for Petroleum, Natural Gas, and Biofuels (PRH-ANP), supported by resources from oil companies under contract clause no. 50/2015 on R, D&I of the ANP, and by CAPES and CNPq. We would also like to thank the Technological University of Uruguay (UTEC).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Mateus, M.G. et al. (2024). Visual Sensors Benchmark for Development of an Autonomous Navigation Setup for a Hybrid Unmanned Aerial Underwater Vehicle. In: Youssef, E.S.E., Tokhi, M.O., Silva, M.F., Rincon, L.M. (eds) Synergetic Cooperation Between Robots and Humans. CLAWAR 2023. Lecture Notes in Networks and Systems, vol 810. Springer, Cham. https://doi.org/10.1007/978-3-031-47269-5_20
DOI: https://doi.org/10.1007/978-3-031-47269-5_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-47268-8
Online ISBN: 978-3-031-47269-5
eBook Packages: Intelligent Technologies and Robotics (R0)