Towards microscope-video-based fire-detection

Cited by: 0
Authors
Schultze, T [1 ]
Willms, I [1 ]
Affiliation
[1] Univ Duisburg Essen, Dept Commun Syst, NTS, D-47057 Duisburg, Germany
Keywords
DOI
10.1109/CCST.2005.1594868
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Video-based fire detection is now a fairly common application, as the number of installed surveillance video systems is growing and the related processing units are becoming more powerful. The fire-detection capability is usually only an add-on feature of 'intelligent' surveillance video systems. The detection criteria are therefore based on macroscopic characteristics observable in the surveillance video, such as particular smoke dynamics, flame flickering or loss of image contrast due to obscuration by smoke. In a new approach, not the macroscopic but the microscopic characteristics of aerosols are analysed, with a view to a more reliable discrimination between fire and non-fire aerosols. By monitoring an illuminated sheet of air a few centimetres below the ceiling of a room, it is possible to obtain information about a limited range of the particle size distribution, the density and the flow characteristics of the suspended aerosol. A prototype of the scanning system has been developed, and a series of test fires (according to EN 54) and non-fire tests has been carried out in the Duisburg Fire Detection Laboratory. The comparison between the test fires and the non-fire scenarios shows that a discrimination between fire and non-fire aerosols is feasible. The analyses of the aerosol characteristics are based on pattern-recognition techniques adapted from image processing. Some differences between aerosol types are even visible to the naked eye. This paper presents the developed prototype and specifies its important features. The most interesting results are then shown and commented on. Finally, the possibilities and limitations of an automatic fire-detection system based on microscope-video analysis are discussed.
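The abstract describes the processing idea only at a high level: particles suspended in the illuminated air sheet are segmented in the video frames, and per-frame descriptors such as a partial particle size distribution and the particle density are derived for discrimination. The following Python sketch is purely illustrative of that kind of per-frame analysis; it is not the authors' implementation, and the use of OpenCV, the Otsu threshold, the minimum-area filter and the size bins are assumptions made for the example.

import cv2
import numpy as np

def particle_statistics(frame, min_area=2):
    # Hypothetical per-frame aerosol descriptors from an 8-bit grayscale image
    # of the illuminated air sheet, where particles scatter light as bright blobs.
    # Otsu thresholding separates the scattered light from the dark background.
    _, binary = cv2.threshold(frame, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Connected-component labelling yields one candidate region per particle.
    n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    areas = stats[1:, cv2.CC_STAT_AREA]        # label 0 is the background
    areas = areas[areas >= min_area]           # suppress single-pixel noise
    density = areas.size / frame.size          # particles per image pixel
    # Coarse histogram over pixel areas as a stand-in for the (limited range of the)
    # particle size distribution mentioned in the abstract.
    size_hist, _ = np.histogram(areas, bins=[2, 4, 8, 16, 32, 64, 128])
    return areas.size, density, size_hist

Estimating flow characteristics would additionally require tracking the detected particles across consecutive frames, for example by matching their centroids, which is beyond this sketch.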
Pages: 23-25
Page count: 3
Related Papers
50 items in total
  • [41] Research on Unsupervised Fire Detection Method based on UAV Infrared Video
    Ye, Yang
    2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024, : 268 - 272
  • [42] A Video-Based Fire Detection Using Deep Learning Models
    Kim, Byoungjun
    Lee, Joonwhoan
    APPLIED SCIENCES-BASEL, 2019, 9 (14):
  • [43] Target-Tracking Based Early Fire Smoke Detection in Video
    Wei, Zheng
    Wang, Xingang
    An, Wenchuan
    Che, Jianfeng
    PROCEEDINGS OF THE FIFTH INTERNATIONAL CONFERENCE ON IMAGE AND GRAPHICS (ICIG 2009), 2009, : 172 - 176
  • [44] Light Condition Estimation Based on Video Fire Detection in Spacious Buildings
    Jia, Yang
    Lin, Gaohua
    Wang, Jinjun
    Fang, Jun
    Zhang, Yongming
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2016, 41 (03) : 1031 - 1041
  • [45] Video-Based Fire Detection by Transforming to Optimal Color Space
    Manickam, M. Thanga
    Yogesh, M.
    Sridhar, P.
    Thangavel, Senthil Kumar
    Parameswaran, Latha
    COMPUTATIONAL VISION AND BIO-INSPIRED COMPUTING, 2020, 1108 : 1256 - 1264
  • [46] Covariance matrix-based fire and flame detection method in video
    Habiboglu, Yusuf Hakan
    Gunay, Osman
    Cetin, A. Enis
    MACHINE VISION AND APPLICATIONS, 2012, 23 (06) : 1103 - 1113
  • [47] Fire Detection and Recognition Optimization Based on Virtual Reality Video Image
    Huang, Xinchu
    Du, Lin
    IEEE ACCESS, 2020, 8 : 77951 - 77961
  • [48] AUTOMATIC FIRE DETECTION SYSTEM BASED ON CONTOUR ANALYSIS VIDEO IMAGES
    Egoshina, Irina
    Titov, Dmitry
    Stuchkov, Anton
    VII SCIENTIFIC CONFERENCE WITH INTERNATIONAL PARTICIPATION INFORMATION-MEASURING EQUIPMENT AND TECHNOLOGIES (IME&T 2016), 2016, 79
  • [49] Covariance matrix-based fire and flame detection method in video
    Yusuf Hakan Habiboğlu
    Osman Günay
    A. Enis Çetin
    Machine Vision and Applications, 2012, 23 : 1103 - 1113
  • [50] Forest Fire Detection Based on Video Multi-Feature Fusion
    Jie, Li
    Jiang, Xiao
    2009 2ND IEEE INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND INFORMATION TECHNOLOGY, VOL 2, 2009, : 19 - 22