The foreground detection algorithm combined the temporal-spatial information and adaptive visual background extraction

Cited: 7
Authors
Qu, Z. [1 ,2 ,3 ]
Huang, X. -L. [1 ]
Affiliations
[1] Chongqing Univ Posts & Telecommun, Coll Comp Sci & Technol, Chongqing 400065, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Sch Software Engn, Chongqing 400065, Peoples R China
[3] Chongqing Engn Res Ctr Software Qual Assurance Te, Chongqing 400065, Peoples R China
Source
IMAGING SCIENCE JOURNAL | 2017, Vol. 65, No. 1
Keywords
Visual background extraction; Background model; Ghost; Illumination change; REAL; VIBE
DOI
10.1080/13682199.2016.1258509
CLC number
TB8 [Photographic technology]
Subject classification code
0804
Abstract
The visual background extraction (ViBe) algorithm uses a global threshold for foreground segmentation and therefore adapts poorly to illumination changes. It also tends to select the wrong pixels when initialising the background model, producing a ghost at the start of detection. To address these problems, this article proposes an improved algorithm that initialises the background model from each pixel's temporal-spatial information. First, the pixels of the video image sequence and their neighbourhood pixels are used to initialise the background model over the first five frames. Second, the segmentation threshold is obtained adaptively from the background complexity computed over the spatial neighbourhood. Finally, the background model of the neighbourhood pixels is updated at a dynamic rate derived from the Euclidean distance between pixels. Experimental results and a comparative study show that the improved method not only increases the accuracy of target detection by effectively reducing the impact of illumination change but also eliminates the ghost quickly.
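The three steps summarised in the abstract — neighbourhood-based initialisation, a complexity-driven adaptive threshold, and match-count segmentation — can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the sample count, the 8-neighbour sampling scheme, and the use of the per-pixel sample standard deviation as a stand-in for the paper's "background complexity" measure are all assumptions.

```python
import numpy as np

def init_background_model(frames, n_samples=20, rng=None):
    """Build a per-pixel sample model from the first few frames.
    Each sample is drawn from a random frame and from the pixel itself
    or a random 8-neighbour (a sketch of temporal-spatial initialisation)."""
    rng = rng or np.random.default_rng(0)
    h, w = frames[0].shape
    model = np.empty((n_samples, h, w), dtype=np.float32)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 0),
               (0, 1), (1, -1), (1, 0), (1, 1)]
    for k in range(n_samples):
        frame = frames[rng.integers(len(frames))]   # random early frame
        dy, dx = offsets[rng.integers(len(offsets))]  # random neighbour shift
        model[k] = np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
    return model

def segment(frame, model, base_radius=20.0, min_matches=2):
    """Mark a pixel foreground when fewer than `min_matches` samples lie
    within an adaptive radius.  The radius grows with local background
    complexity, approximated here by the sample standard deviation."""
    complexity = model.std(axis=0)            # per-pixel background spread
    radius = base_radius + 0.5 * complexity   # adaptive threshold
    dist = np.abs(model - frame[None].astype(np.float32))
    matches = (dist < radius[None]).sum(axis=0)
    return matches < min_matches              # True = foreground
```

In a ViBe-style pipeline the returned mask would then drive the conservative model update; the dynamic update rate from the paper (based on inter-pixel Euclidean distance) is omitted here.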
Pages: 49 - 61
Number of pages: 13
Related papers
50 records
  • [32] Intelligent object extraction algorithm based on foreground/background classification
    Wang, JF
    Hsu, HJ
    Li, JS
    EMBEDDED AND UBIQUITOUS COMPUTING - EUC 2005 WORKSHOPS, PROCEEDINGS, 2005, 3823 : 101 - 110
  • [33] Integrating a statistical background-foreground extraction algorithm and SVM classifier for pedestrian detection and tracking
    Li, Dawei
    Xu, Lihong
    Goodman, Erik D.
    Xu, Yuan
    Wu, Yang
    INTEGRATED COMPUTER-AIDED ENGINEERING, 2013, 20 (03) : 201 - 216
  • [34] Siamese Network for Visual Tracking with Temporal-spatial Property
    Jiang, Shan
    Di, Xiaoqiang
    Han, Cheng
    Binggong Xuebao/Acta Armamentarii, 2021, 42 (09): : 1940 - 1950
  • [35] Temporal-spatial information mining and aggregation for video matting
    Ma, Zhiwei
    Yao, Guilin
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (10) : 29221 - 29237
  • [36] The moving target detection algorithm based on the improved visual background extraction
    Huang, Wei
    Liu, Lei
    Yue, Chao
    Li, He
    INFRARED PHYSICS & TECHNOLOGY, 2015, 71 : 518 - 525
  • [37] Extraction of the Foreground Regions by Means of the Adaptive Background Modelling Based on Various Colour Components for a Visual Surveillance System
    Frejlichowski, Dariusz
    Gosciewska, Katarzyna
    Forczmanski, Pawel
    Nowosielski, Adam
    Hofman, Radoslaw
    PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON COMPUTER RECOGNITION SYSTEMS CORES 2013, 2013, 226 : 351 - 360
  • [39] Temporal-spatial memory: retrieval of spatial information does not reduce recency
    Farrand, P
    Parmentier, FBR
    Jones, DM
    ACTA PSYCHOLOGICA, 2001, 106 (03) : 285 - 301
  • [40] Text line extraction in graphical documents using background and foreground information
    Pratim Roy, Partha
    Pal, Umapada
    Llados, Josep
    INTERNATIONAL JOURNAL ON DOCUMENT ANALYSIS AND RECOGNITION, 2012, 15 (03) : 227 - 241