Combining texture and stereo disparity cues for real-time face detection

Cited by: 5
Authors
Jiang, Feijun
Fischer, Mika [1]
Ekenel, Hazim Kemal [1,2]
Shi, Bertram E. [3,4]
Affiliations
[1] Karlsruhe Inst Technol, Inst Anthropomat, D-76021 Karlsruhe, Germany
[2] Istanbul Tech Univ, Fac Comp & Informat, Istanbul, Turkey
[3] Hong Kong Univ Sci & Technol, Dept ECE, Hong Kong, Hong Kong, Peoples R China
[4] Hong Kong Univ Sci & Technol, Div Biomed Engn, Hong Kong, Hong Kong, Peoples R China
Keywords
Multi-view face detection; Stereo vision; Disparity energy model; Gabor filter; CORTEX;
DOI
10.1016/j.image.2013.07.006
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline classification codes
0808 ; 0809 ;
Abstract
Intuitively, integrating information from multiple visual cues, such as texture, stereo disparity, and image motion, should improve performance on perceptual tasks such as object detection. On the other hand, the additional effort required to extract and represent information from additional cues may increase computational complexity. In this work, we show that using a biologically inspired, integrated representation of texture and stereo disparity information for a multi-view face detection task leads not only to improved detection performance but also to reduced computational complexity. Disparity information enables us to filter out 90% of image locations as being less likely to contain faces. Performance improves because this filtering rejects 32% of the false detections made by a comparable monocular detector at the same recall rate. Despite the additional computation required to compute disparity information, our binocular detector takes only 42 ms to process a pair of 640 x 480 images, 35% of the time required by the monocular detector. We also show that this integrated detector is computationally more efficient than a detector with similar performance in which texture and stereo information are processed separately. (c) 2013 Elsevier B.V. All rights reserved.
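The abstract's central idea, using disparity to discard image locations that are unlikely to contain a face before running the texture-based detector, can be sketched as follows. This is a hypothetical illustration, not the paper's disparity energy model: the function name `disparity_gate`, the window size, the stride, and the depth band are all assumptions made for the example.

```python
import numpy as np

def disparity_gate(disparity, d_min, d_max, win=24, keep_frac=0.5):
    """Keep only candidate windows whose disparity is plausible for a face.

    Hypothetical sketch: instead of the paper's disparity energy model,
    a window survives if at least `keep_frac` of its pixels fall inside
    the disparity band [d_min, d_max] expected for a face at a useful
    working distance.  Returns the (x, y) corners of surviving windows.
    """
    h, w = disparity.shape
    kept = []
    # Slide a win x win window with 50% overlap over the disparity map.
    for y in range(0, h - win, win // 2):
        for x in range(0, w - win, win // 2):
            patch = disparity[y:y + win, x:x + win]
            in_band = np.logical_and(patch >= d_min, patch <= d_max)
            if in_band.mean() >= keep_frac:
                kept.append((x, y))
    return kept
```

In this scheme the expensive texture classifier (e.g. a Gabor-feature cascade) runs only on the windows that pass the gate, which is how rejecting ~90% of locations can make the binocular detector faster than the monocular one despite the extra cost of computing disparity.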
Pages: 1100 - 1113
Page count: 14
Related papers
50 records in total
  • [21] Comparing and combining depth and texture cues for face recognition
    Benabdelkader, C
    Griffin, PA
    IMAGE AND VISION COMPUTING, 2005, 23 (03) : 339 - 352
  • [22] Combining retrieval and classification for real-time face recognition
    Fusco, Giovanni
    Noceti, Nicoletta
    Odone, Francesca
    2012 IEEE NINTH INTERNATIONAL CONFERENCE ON ADVANCED VIDEO AND SIGNAL-BASED SURVEILLANCE (AVSS), 2012, : 276 - 281
  • [23] Real-time moustache detection by combining image decolorization and texture detection with applications to facial gender recognition
    Wang, Jian-Gang
    Yau, Wei-Yun
    MACHINE VISION AND APPLICATIONS, 2014, 25 (04) : 1089 - 1099
  • [24] Real-time Beard Detection by Combining Image Decolorization and Texture Detection with Applications to Facial Gender Recognition
    Wang, Jian-Gang
    Yau, Wei-Yun
    PROCEEDINGS OF THE IEEE SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE IN BIOMETRICS AND IDENTITY MANAGEMENT (CIBIM), 2013, : 58 - 65
  • [26] Texture boundary detection for real-time tracking
    Shahrokni, A
    Drummond, T
    Fua, P
    COMPUTER VISION - ECCV 2004, PT 2, 2004, 3022 : 566 - 577
  • [27] Real-time disparity map extraction in a dual head stereo vision system
    Calin, G.
    Roda, V. O.
    LATIN AMERICAN APPLIED RESEARCH, 2007, 37 (01) : 21 - 24
  • [28] Feature-Based Resource Allocation for Real-Time Stereo Disparity Estimation
    Hunsberger, Eric
    Osorio, Victor Reyes
    Orchard, Jeff
    Tripp, Bryan P.
    IEEE ACCESS, 2017, 5 : 11645 - 11657
  • [29] Real-time local stereo via edge-aware disparity propagation
    Sun, Xun
    Mei, Xing
    Jiao, Shaohui
    Zhou, Mingcai
    Liu, Zhihua
    Wang, Haitao
    PATTERN RECOGNITION LETTERS, 2014, 49 : 201 - 206
  • [30] A real-time large disparity range stereo-system using FPGAs
    Masrani, DK
    MacLean, WJ
    COMPUTER VISION - ACCV 2006, PT II, 2006, 3852 : 42 - 51