Gas Detection and Classification Using Multimodal Data Based on Federated Learning

Sensors (Basel). 2024 Sep 11;24(18):5904. doi: 10.3390/s24185904.

Abstract

The identification of gas leakages is a significant consideration in industries such as coal mining and chemical processing, as well as in residential settings. Early detection and identification of the gas type are necessary to reduce damage to the environment and to human lives. The main focus of this paper is a multimodal gas dataset obtained simultaneously from multiple gas-detection sensors and a thermal imaging camera. Because low-cost sensors offer limited reliability and sensitivity, they are not suitable on their own for gas detection over long distances. To overcome the drawbacks of relying solely on sensors to identify gases, a thermal camera capable of detecting temperature changes is also used in the collection of the present multimodal dataset. The dataset comprises 6400 samples covering smoke, perfume, a combination of both, and neutral environments. In this paper, convolutional neural networks (CNNs) are trained on the thermal image data, while long short-term memory (LSTM) variants, namely bidirectional LSTM (Bi-LSTM) and dense LSTM, classify the comma-separated value (CSV) data from the gas sensors, and a fusion of both modalities is also evaluated. The dataset can serve as a valuable resource for researchers and system developers seeking to improve their artificial intelligence (AI) models for gas leakage detection. Furthermore, to ensure the privacy of clients' data, this paper explores federated learning for privacy-protected gas leakage classification, demonstrating accuracy comparable to traditional deep learning approaches.
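
To make the modelling pipeline described in the abstract concrete, the following is a minimal sketch, not taken from the paper, of one way such a two-branch multimodal classifier and a federated averaging (FedAvg) round could be written in PyTorch. The layer sizes, the assumed input shapes (single-channel 64x64 thermal images and seven gas-sensor channels per time step), the unweighted FedAvg aggregation, and the helper names are all illustrative assumptions; only the CNN branch for thermal images, the Bi-LSTM branch for the sensor CSV data, the fusion over four classes (smoke, perfume, mixture, neutral), and the use of federated learning come from the abstract.

# Sketch only: architecture details and FedAvg settings are assumptions, not the paper's.
import copy
import torch
import torch.nn as nn

class ThermalCNN(nn.Module):
    """CNN feature extractor for thermal images (assumed shape 1 x 64 x 64)."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, feat_dim), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class SensorBiLSTM(nn.Module):
    """Bi-LSTM feature extractor for gas-sensor sequences (assumed T x 7 channels)."""
    def __init__(self, n_sensors=7, hidden=32, feat_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(n_sensors, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, feat_dim)
    def forward(self, x):
        out, _ = self.lstm(x)                      # (B, T, 2*hidden)
        return torch.relu(self.proj(out[:, -1]))   # embedding from last time step

class FusionClassifier(nn.Module):
    """Concatenates both modality embeddings and predicts one of four classes."""
    def __init__(self, n_classes=4):
        super().__init__()
        self.thermal = ThermalCNN()
        self.sensors = SensorBiLSTM()
        self.head = nn.Linear(64 + 64, n_classes)
    def forward(self, img, seq):
        return self.head(torch.cat([self.thermal(img), self.sensors(seq)], dim=1))

def fedavg_round(global_model, client_loaders, epochs=1, lr=1e-3):
    """One FedAvg round: each client trains locally, the server averages the weights."""
    client_states = []
    for loader in client_loaders:               # each loader yields (img, seq, label)
        local = copy.deepcopy(global_model)
        opt = torch.optim.Adam(local.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for img, seq, y in loader:
                opt.zero_grad()
                loss_fn(local(img, seq), y).backward()
                opt.step()
        client_states.append(local.state_dict())
    # Unweighted parameter average (equal-sized client datasets assumed).
    avg = {k: torch.stack([s[k].float() for s in client_states]).mean(0)
           for k in client_states[0]}
    global_model.load_state_dict(avg)
    return global_model

In this sketch, raw data never leave the clients; only model weights are sent to the server for averaging, which is the property the abstract invokes when it describes privacy-protected classification with accuracy comparable to centralized training.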

Keywords: dataset; gas leakage; image enhancement; low-cost sensors; multimodal dataset; thermal camera.

Grants and funding

This research received no external funding.