Autonomous Landing Scene Recognition Based on Transfer Learning for Drones
Abstract
This paper presents an approach to autonomous landing scene recognition for drones, addressing the challenges posed by visually similar scenes and by the varying appearance of scenes at different altitudes. Leveraging deep learning techniques, in particular a hybrid ensemble method, our study significantly improves the accuracy and robustness of the recognition system. Building on a base model that achieves 95% accuracy, we incorporate a novel ensemble, CNN + LSTM + BiLSTM, reaching 99% accuracy. Our model applies transfer learning on the LandingScenes-7 dataset, integrating ResNeXt50, ResNet50, and recurrent neural networks (RNNs) to analyze and identify suitable landing spots in real time. In addition, a novelty detection module with thresholding provides adaptability to unforeseen scenarios and a confidence assessment for each classification. The implications of this research extend to industries that rely on drone technology, particularly emergency response, surveillance, and logistics. By enhancing drone autonomy and safety during landing, our approach contributes to the broader goal of advancing drone intelligence and ensuring safer operations in dynamic environments.
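The abstract does not specify the ensemble's internal layout, so the following is only a minimal sketch of what a CNN + LSTM + BiLSTM ensemble with confidence thresholding could look like. All layer sizes, the small stand-in CNN backbone (a pretrained ResNet50/ResNeXt50 would replace it for transfer learning), the logit-averaging fusion rule, and the 0.5 novelty threshold are assumptions, not details from the paper.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 7  # LandingScenes-7 has seven scene categories


class CNNBranch(nn.Module):
    """Stand-in convolutional backbone.

    In the paper's setting this would be a pretrained ResNet50 or
    ResNeXt50 with frozen early layers (transfer learning); a tiny
    CNN is used here only to keep the sketch self-contained.
    """

    def __init__(self, feat_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, x):  # x: (batch, time, 3, H, W)
        b, t = x.shape[:2]
        f = self.features(x.flatten(0, 1)).flatten(1)  # (b*t, 32)
        return self.proj(f).view(b, t, -1)             # (b, t, feat_dim)


class LandingSceneEnsemble(nn.Module):
    """Hypothetical CNN + LSTM + BiLSTM ensemble over frame sequences."""

    def __init__(self, feat_dim=64, hidden=32, n_classes=NUM_CLASSES):
        super().__init__()
        self.cnn = CNNBranch(feat_dim)
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.bilstm = nn.LSTM(feat_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.head_cnn = nn.Linear(feat_dim, n_classes)
        self.head_lstm = nn.Linear(hidden, n_classes)
        self.head_bi = nn.Linear(2 * hidden, n_classes)

    def forward(self, frames):  # frames: (batch, time, 3, H, W)
        seq = self.cnn(frames)                       # per-frame features
        logits_cnn = self.head_cnn(seq.mean(dim=1))  # CNN branch
        out_l, _ = self.lstm(seq)
        logits_l = self.head_lstm(out_l[:, -1])      # LSTM branch
        out_b, _ = self.bilstm(seq)
        logits_b = self.head_bi(out_b[:, -1])        # BiLSTM branch
        # Assumed fusion rule: average the three branch logits.
        return (logits_cnn + logits_l + logits_b) / 3


model = LandingSceneEnsemble()
frames = torch.randn(2, 4, 3, 32, 32)  # 2 clips of 4 frames each
probs = torch.softmax(model(frames), dim=1)

# Novelty detection via confidence thresholding: if the top softmax
# score is below an assumed threshold, the scene is flagged as unknown
# rather than forced into one of the seven classes.
conf, pred = probs.max(dim=1)
NOVELTY_THRESHOLD = 0.5  # assumed value; the paper does not state one
is_novel = conf < NOVELTY_THRESHOLD
```

A trained version of such a model would be fine-tuned end to end on LandingScenes-7 after loading pretrained backbone weights; the threshold would then be calibrated on held-out data.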