BiLSTM-LN-SA: A Novel Integrated Model with Self-Attention for Multi-Sensor Fire Detection

Abstract

Multi-sensor fire detection technology has been widely adopted in practical applications; however, existing methods still suffer from high false alarm rates and inadequate adaptability in complex environments due to their limited capacity to capture deep time-series dependencies in sensor data. To enhance the robustness and accuracy of fire detection, this paper proposes a fire detection model based on a Bidirectional Long Short-Term Memory network with Layer Normalization and Self-Attention (BiLSTM-LN-SA). The model employs a Bidirectional LSTM (BiLSTM) to autonomously extract intricate time-series features and long-term dependencies from multi-sensor data. Furthermore, Layer Normalization (LN) is introduced to effectively mitigate feature distribution shifts across different environments, thereby improving the model's adaptability to cross-scenario data distributions and its generalization capability. Coupled with a self-attention mechanism that dynamically evaluates the importance of features at different time steps, the model adaptively enhances fire-critical information and achieves deeper, dynamic-process-aware feature fusion. Experimental results on a real-world fire dataset demonstrate that the BiLSTM-LN-SA model effectively identifies fire events and exhibits superior detection performance.
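
The following is a minimal PyTorch sketch of the BiLSTM-LN-SA pipeline as described in the abstract (BiLSTM feature extraction, layer normalization, self-attention over time steps, then classification). The layer sizes, number of sensor channels, pooling strategy, and the use of nn.MultiheadAttention for the self-attention step are illustrative assumptions, not the paper's exact configuration.

```python
# Hypothetical sketch of the BiLSTM-LN-SA architecture; hyperparameters
# and the attention implementation are assumptions for illustration only.
import torch
import torch.nn as nn


class BiLSTM_LN_SA(nn.Module):
    def __init__(self, num_sensors=5, hidden_size=64, num_classes=2, num_heads=4):
        super().__init__()
        # Bidirectional LSTM extracts temporal features from multi-sensor sequences
        self.bilstm = nn.LSTM(
            input_size=num_sensors,
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        # Layer normalization over the concatenated forward/backward features
        self.ln = nn.LayerNorm(2 * hidden_size)
        # Self-attention re-weights the importance of each time step
        self.attn = nn.MultiheadAttention(
            embed_dim=2 * hidden_size, num_heads=num_heads, batch_first=True
        )
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, time_steps, num_sensors), e.g. temperature, smoke, CO readings
        h, _ = self.bilstm(x)             # (batch, time_steps, 2*hidden_size)
        h = self.ln(h)                    # mitigate feature distribution shift
        attn_out, _ = self.attn(h, h, h)  # self-attention across time steps
        pooled = attn_out.mean(dim=1)     # aggregate the sequence representation
        return self.classifier(pooled)    # fire / no-fire logits


# Example: a batch of 8 sequences, 30 time steps, 5 sensor channels
model = BiLSTM_LN_SA()
logits = model(torch.randn(8, 30, 5))
print(logits.shape)  # torch.Size([8, 2])
```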
