The inspection of large structures is increasingly carried out with the help of Unmanned Aerial Vehicles (UAVs). When navigating relative to a structure, multiple data sources can be used to determine the UAV's position, for example tracking data from an installed camera and readings from the UAV's orientation sensors. This paper addresses the fusion of these data and their use for navigation along the structure. A sensor fusion concept based on a Kalman filter is developed and evaluated in a simulated prototype. The estimated position is then fed into a vector flight controller, which uses the potential field method to dynamically plan and fly a trajectory along the component while accounting for obstacles detected by the UAV's onboard sensors. The resulting concept is implemented with the Robot Operating System (ROS) and evaluated in simulation.
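To make the two methods named in the abstract concrete, the following minimal sketches illustrate how they are commonly realized; they are not taken from the paper. The first is a linear Kalman filter that fuses camera-based position fixes with acceleration derived from the UAV's inertial/orientation sensors. The state layout, noise matrices, time step, and the assumption that both sources are expressed in a common frame are illustrative choices, not the authors' parameters.

```python
import numpy as np

# State: [x, y, vx, vy]. The camera track provides position fixes (z_cam),
# the inertial/orientation sensors provide an acceleration input (accel_imu).
# All values below are illustrative assumptions, not the paper's tuning.
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
B = np.array([[0.5 * dt**2, 0],
              [0, 0.5 * dt**2],
              [dt, 0],
              [0, dt]])
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])          # camera measures position only
Q = 0.05 * np.eye(4)                  # process noise (would be tuned in simulation)
R = 0.5 * np.eye(2)                   # camera measurement noise

def kf_step(x, P, accel_imu, z_cam):
    """One predict/update cycle: IMU acceleration drives the prediction,
    the camera position fix corrects it."""
    # Predict
    x = F @ x + B @ accel_imu
    P = F @ P @ F.T + Q
    # Update with the camera position measurement
    y = z_cam - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), np.eye(4)
x, P = kf_step(x, P, accel_imu=np.array([0.1, 0.0]), z_cam=np.array([0.5, 0.02]))
```

The second sketch shows the general form of the potential field method used for trajectory generation: the goal exerts an attractive force, detected obstacles exert repulsive forces within an influence radius, and the vehicle steps along the resulting force direction. Gains, radii, and step size are again placeholder values.

```python
import numpy as np

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5, d0=2.0, step=0.1):
    """Move one step along the combined attractive/repulsive force:
    the goal pulls the UAV, obstacles within range d0 push it away."""
    force = k_att * (goal - pos)                      # attractive term toward the goal
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < d0:                             # repulsion only inside the influence radius
            force += k_rep * (1.0 / d - 1.0 / d0) / d**2 * (diff / d)
    return pos + step * force / max(np.linalg.norm(force), 1e-6)

pos = np.array([0.0, 0.0])
goal = np.array([10.0, 0.0])
obstacles = [np.array([5.0, 0.3])]                    # e.g. points reported by onboard sensors
for _ in range(200):
    pos = potential_field_step(pos, goal, obstacles)
```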
Schörner, M., Bettendorf, M., Wanninger, C., Hoffmann, A., & Reif, W. (2021). UAV inspection of large components: Indoor navigation relative to structures. In Proceedings of the 18th International Conference on Informatics in Control, Automation and Robotics, ICINCO 2021 (pp. 179–186). SciTePress. https://doi.org/10.5220/0010556301790186