A versatile real-time vision-led runway localisation system for enhanced autonomy

Front Robot AI. 2024 Dec 6;11:1490812. doi: 10.3389/frobt.2024.1490812. eCollection 2024.

Abstract

This paper proposes a solution to the challenging task of autonomously landing Unmanned Aerial Vehicles (UAVs). An onboard computer vision module integrates the vision system with the ground control communication and video server connection. The vision platform performs feature extraction using Speeded Up Robust Features (SURF), followed by fast Structured Forests edge detection and then Kalman filter smoothing for accurate prediction of the runway sidelines. A thorough evaluation of accuracy and processing time is performed in both real-world and simulation environments, in comparison with state-of-the-art edge detection approaches. The vision system is validated on videos captured in clear and adverse weather, including fog, varying lighting conditions and crosswind landings. The experiments use data from the X-Plane 11 flight simulator and real flight data from the Uncrewed Low-cost TRAnsport (ULTRA) self-flying cargo UAV. The vision-led system localises the runway sidelines with the Structured Forests approach with an accuracy of approximately 84.4%, outperforming the state-of-the-art approaches while delivering real-time performance. The main contribution of this work is the developed vision-led runway detection system for aiding autonomous landing of UAVs using electro-optical cameras. Although implemented on the ULTRA UAV, the vision-led system is applicable to any other UAV.

Keywords: aerial systems: perception and autonomy; autonomous landing; autonomous vehicle navigation; computer vision for automation; vision-based navigation.
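The abstract describes a pipeline in which per-frame runway sideline detections are smoothed with a Kalman filter. The paper's implementation is not given here, so the following is only a minimal sketch of that final smoothing stage, under the assumption that each sideline is parameterised as a (slope, intercept) pair and evolves slowly between frames (identity motion model); the class name, noise values, and line parameters are all illustrative.

```python
import numpy as np

class SidelineKalman:
    """Hypothetical 2-state Kalman filter smoothing one runway sideline
    given noisy per-frame (slope, intercept) measurements from an edge
    detector. Constant-position model: prediction is the identity."""

    def __init__(self, q=1e-3, r=1e-1):
        self.x = None                  # state estimate: [slope, intercept]
        self.P = np.eye(2)             # state covariance
        self.Q = q * np.eye(2)         # process noise (assumed value)
        self.R = r * np.eye(2)         # measurement noise (assumed value)

    def update(self, z):
        z = np.asarray(z, dtype=float)
        if self.x is None:             # initialise from the first detection
            self.x = z.copy()
            return self.x
        # Predict: identity motion model, covariance grows by Q
        self.P = self.P + self.Q
        # Update with the new edge-detection measurement z
        K = self.P @ np.linalg.inv(self.P + self.R)   # Kalman gain
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(2) - K) @ self.P
        return self.x

# Usage: feed noisy per-frame measurements of a fixed "true" sideline.
rng = np.random.default_rng(0)
true_line = np.array([0.5, 120.0])     # illustrative slope, intercept
kf = SidelineKalman()
for _ in range(50):
    z = true_line + rng.normal(0.0, [0.05, 5.0])  # detector noise
    est = kf.update(z)
```

With these settings the steady-state gain is small (roughly 0.1), so the filter effectively averages over about ten recent frames, which is what keeps the predicted sidelines stable under the noisy detections the paper mentions for fog and varying lighting.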

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. The research and development was funded by an Innovate UK grant awarded to the Protecting Environments with Swarms of UAVs project in the Future Flights Strand 3 series (grant number 10023377), and by the EPSRC through Project NSF-EPSRC: ShiRAS. Towards Safe and Reliable Autonomy in Sensor Driven Systems, under Grant EP/T013265/1. ShiRAS was also supported by the USA National Science Foundation under Grant NSF ECCS 1903466.