Abstract
Reacting to unforeseen obstacles is a major issue in the field of autonomous navigation. In the context of Unmanned Aerial Vehicles, an “obstacle” is any object that stands between the UAV and its desired position (waypoint). Therefore, obstacle detection can be reduced to the problem of assessing the visibility of the waypoint from the point of view of the drone. In this work, data acquired from an onboard depth camera are used to describe the visibility of the target waypoint in a qualitative framework, and to plan a new route when obstacles are detected.
1 Introduction
Unmanned Aerial Vehicles (UAVs) have a wide range of applications. Multicopters, in particular, have found great popularity due to their maneuverability and relatively low cost (Cai et al. 2014). However, the state of the art only provides solutions for point-to-point navigation in free space; their application in cluttered environments therefore still depends heavily on remote control by a human operator.
In most cases, UAV missions consist of a sequence of movements interleaved with operations that involve the payload, such as taking photographs or dropping a package. To increase the degree of automation in such missions, systems are needed that can adapt the flight plan to newly-discovered information about the environment.
Many strategies found in the literature (see Goerzen et al. 2010 for a broad classification; Kendoul 2012 for a more recent and comprehensive review) rely on the creation and periodic update of a detailed model of the drone surroundings. For concrete examples of this approach, see Nieuwenhuisen and Behnke (2014), Hrabar (2011).
Qualitative approaches, on the other hand, could mimic the adaptability of human operators without storing such a great amount of information.
To this effect, we propose an algorithm that can guide a multicopter UAV towards a destination waypoint, adapting the route as new obstacles are detected. Both obstacle detection and path replanning are based on the evaluation of GPS information together with a depth map, i.e. raster data acquired from an onboard depth camera. The algorithm is reactive: no model of the environment is stored in memory, and decisions rely only on the currently available sensor data.
2 Assessing Visibility Relationships
We can use our knowledge of both the UAV’s and the target’s GPS coordinates to point the depth camera towards the target and then “project” the latter onto a pixel p of the depth map.
We can then compute the waypoint-UAV visibility relation according to the qualitative framework presented in Fogliaroni and Clementini (2014). This is achieved by evaluating a neighborhood of p whose size depends on the distance between the UAV and the target.
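As a concrete illustration, the projection and visibility check might be sketched as follows. The pinhole intrinsics (FX, FY, CX, CY), the tolerance margin, and the fixed neighborhood radius are all hypothetical values not given in the paper, which instead scales the neighborhood with the UAV–target distance:

```python
import numpy as np

# Hypothetical pinhole intrinsics for a 320x240 depth camera
# (not specified in the paper).
FX = FY = 300.0
CX, CY = 160.0, 120.0

def project_target(p_cam):
    """Project a 3-D point (camera frame, z pointing forward) onto the
    depth image, returning integer pixel coordinates."""
    x, y, z = p_cam
    return int(round(FX * x / z + CX)), int(round(FY * y / z + CY))

def target_visible(depth_map, p_cam, radius=2, margin=0.5):
    """The target counts as visible when no depth sample in a small
    neighborhood of its projection is closer than the target itself
    (minus a tolerance `margin`, in metres)."""
    u, v = project_target(p_cam)
    dist = float(np.linalg.norm(p_cam))
    h, w = depth_map.shape
    patch = depth_map[max(0, v - radius):min(h, v + radius + 1),
                      max(0, u - radius):min(w, u + radius + 1)]
    return bool(np.all(patch > dist - margin))
```

The binary visible/occluded outcome here is a simplification; the qualitative framework of Fogliaroni and Clementini (2014) distinguishes finer-grained visibility relations.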
3 Wayfinding
When an obstacle occludes the target, we need to find an escape waypoint. This is an intermediate position that is already visible from the current position of the UAV, and from which the original target should be visible unless more obstacles are detected along the way.
Wayfinding is a two-step process: first we have to rule out points that are visible but too close to obstacles. This can be done by applying a “depth-aware” dilation filter to the depth map.
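One way to realize such a filter is a minimum filter over the depth map, which grows (dilates) obstacle regions so that free pixels lying too close to an obstacle inherit its depth and are rejected by a subsequent visibility test. The sketch below uses a fixed window radius for brevity; a truly “depth-aware” variant would scale the window with the measured depth:

```python
import numpy as np

def depth_aware_dilation(depth, radius=2):
    """Replace each pixel with the minimum depth in its
    (2*radius+1)^2 window.  Obstacle (small-depth) regions grow, so
    free pixels adjacent to an obstacle take on the obstacle's depth
    and are ruled out as escape-waypoint candidates."""
    h, w = depth.shape
    out = np.empty_like(depth)
    for v in range(h):
        for u in range(w):
            out[v, u] = depth[max(0, v - radius):min(h, v + radius + 1),
                              max(0, u - radius):min(w, u + radius + 1)].min()
    return out
```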
We then choose the candidate that minimizes the overall distance from the target, and apply a transformation to obtain the GPS coordinates of the escape waypoint.
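A minimal sketch of the candidate selection, under the same hypothetical intrinsics as above: each candidate pixel is back-projected into camera coordinates, and the one minimizing the total detour length UAV → candidate → target is kept. The final transformation from camera coordinates to GPS coordinates requires the UAV pose and a geodetic conversion and is omitted here:

```python
import numpy as np

# Same hypothetical intrinsics as in the visibility sketch.
FX = FY = 300.0
CX, CY = 160.0, 120.0

def pixel_to_camera(u, v, d):
    """Back-project a depth pixel into 3-D camera coordinates
    (z forward, depth d in metres)."""
    return np.array([(u - CX) * d / FX, (v - CY) * d / FY, d])

def choose_escape(candidates, target_cam):
    """Among candidate pixels (u, v, depth), pick the one minimizing
    UAV -> candidate -> target (the UAV sits at the camera origin)."""
    def detour(c):
        p = pixel_to_camera(*c)
        return float(np.linalg.norm(p) + np.linalg.norm(target_cam - p))
    return min(candidates, key=detour)
```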
4 Simulation
Currently, the algorithm is implemented as a set of Python 3 scripts that communicate with a simulated quadcopter inside the V-REP robotics simulation platform by Coppelia Robotics (Rohmer et al. 2013).
The tests show good performance even at lower depth map resolutions: the effective path flown by the UAV stays well below 1.1 times the straight-line distance between the start position and the target waypoint.
References
Cai G, Dias J, Seneviratne L (2014) A survey of small-scale unmanned aerial vehicles: recent advances and future development trends. Unmanned Syst 2(2):175–199. doi:10.1142/S2301385014300017
Fogliaroni P, Clementini E (2014) Modeling visibility in 3D space: a qualitative frame of reference. In: 9th international conference 3D GeoInfo 2014, Lecture Notes in Geoinformation and Cartography, pp 1–19. doi:10.1007/978-3-319-12181-9_15
Goerzen C, Kong Z, Mettler B (2010) A survey of motion planning algorithms from the perspective of autonomous UAV guidance. J Intell Robot Syst, vol 57. doi:10.1007/s10846-009-9383-1
Hrabar S (2011) Reactive obstacle avoidance for rotorcraft UAVs. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), pp 4967–4974. doi:10.1109/IROS.2011.6048312
Kendoul F (2012) Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems. J Field Robot 29(2):315–378. doi:10.1002/rob.20414
Nieuwenhuisen M, Behnke S (2014) Hierarchical planning with 3D local multiresolution obstacle avoidance for micro aerial vehicles. In: Joint 45th international symposium on robotics (ISR) and 8th German conference on robotics (ROBOTIK), University of Bonn, Germany
Rohmer E, Singh SPN, Freese M (2013) V-REP: a versatile and scalable robot simulation framework. In: Proceedings of the international conference on intelligent robots and systems (IROS)
© 2018 Springer International Publishing AG

Di Stefano, L., Clementini, E., Stagnini, E. (2018). Reactive Obstacle Avoidance for Multicopter UAVs via Evaluation of Depth Maps. In: Fogliaroni, P., Ballatore, A., Clementini, E. (eds) Proceedings of Workshops and Posters at the 13th International Conference on Spatial Information Theory (COSIT 2017). Lecture Notes in Geoinformation and Cartography. Springer, Cham. https://doi.org/10.1007/978-3-319-63946-8_10

Print ISBN: 978-3-319-63945-1. Online ISBN: 978-3-319-63946-8.