Document Type

Article

Source of Publication

Displays

Publication Date

9-1-2025

Abstract

Wildfires are increasing in frequency and severity, presenting critical challenges for timely detection and response, particularly in remote or resource-limited environments. This study introduces the Inception-ResNet Transformer with Quantization (IRTQ), a novel hybrid deep learning (DL) framework that integrates multi-scale feature extraction with global attention and advanced quantization. The proposed model is specifically optimized for edge deployment on platforms such as unmanned aerial vehicles (UAVs), offering a unique combination of high accuracy, low latency, and compact memory footprint. The IRTQ model achieves 98.9% accuracy across diverse datasets and shows strong generalization through cross-dataset validation. Quantization significantly reduces the parameter count to 0.09M and memory usage to 0.13 MB, enabling real-time inference in 3 ms. Interpretability is further enhanced through Grad-CAM visualizations, supporting transparent decision-making. While achieving state-of-the-art performance, the model encounters challenges in visually ambiguous fire-like regions. To address these, future work will explore multi-modal inputs and extend the model towards multi-class classification. IRTQ represents a technically grounded, interpretable, and deployable solution for AI-driven wildfire detection and disaster response.
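The abstract does not give implementation details, so purely as an illustrative sketch the snippet below shows the general pattern it describes: an Inception-style multi-scale convolutional feature extractor feeding a Transformer encoder for global attention, followed by post-training dynamic quantization to shrink the model for edge deployment. All module names, layer sizes, and the choice of dynamic int8 quantization are assumptions for illustration; this is not the authors' IRTQ implementation.

```python
# Illustrative sketch only (assumed names/sizes, not the authors' IRTQ code):
# multi-scale (Inception-style) features -> Transformer encoder -> fire/no-fire head,
# then post-training dynamic quantization of the linear layers.
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    """Parallel 1x1 / 3x3 / 5x5 convolutions, concatenated channel-wise."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, out_ch, kernel_size=1)
        self.b3 = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.b5 = nn.Conv2d(in_ch, out_ch, kernel_size=5, padding=2)

    def forward(self, x):
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)

class FireClassifier(nn.Module):
    """Multi-scale CNN features, global attention over spatial tokens, binary head."""
    def __init__(self, embed_dim=96, num_classes=2):
        super().__init__()
        self.features = InceptionBlock(3, embed_dim // 3)   # 3 branches -> embed_dim channels
        self.pool = nn.AdaptiveAvgPool2d(8)                 # 8x8 spatial grid of tokens
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        f = self.pool(self.features(x))                     # (B, C, 8, 8)
        tokens = f.flatten(2).transpose(1, 2)               # (B, 64, C) token sequence
        return self.head(self.encoder(tokens).mean(dim=1))  # pooled tokens -> logits

model = FireClassifier().eval()
# One common post-training quantization approach; the paper's scheme may differ.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
```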

ISSN

0141-9382

Volume

89

Disciplines

Computer Sciences

Keywords

Bushfire detection, Inception-ResNet, Quantization, Smart city applications, Transformer models, Unmanned aerial vehicles (UAV)

Scopus ID

05004742851

Creative Commons License

Creative Commons Attribution 4.0 International License

Indexed in Scopus

yes

Open Access

yes

Open Access Type

Hybrid: This publication is openly available in a subscription-based journal/series
