Gender and gaze gesture recognition for human-computer interaction

Cited: 35
Authors
Zhang, Wenhao [1 ]
Smith, Melvyn L. [1 ]
Smith, Lyndon N. [1 ]
Farooq, Abdul [1 ]
Affiliations
[1] Univ W England, Bristol Robot Lab, Ctr Machine Vis, T Block, Frenchay Campus, Coldharbour Lane, Bristol BS16 1QY, Avon, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Assistive HCI; Gender recognition; Eye centre localisation; Gaze analysis; Directed advertising; CLASSIFICATION; IMAGES; SHAPE;
DOI
10.1016/j.cviu.2016.03.014
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The identification of visual cues in facial images has been widely explored in the broad area of computer vision. However, theoretical analyses are often not transformed into widespread assistive Human-Computer Interaction (HCI) systems, due to factors such as inconsistent robustness, low efficiency, large computational expense or strong dependence on complex hardware. We present a novel gender recognition algorithm, a modular eye centre localisation approach and a gaze gesture recognition method, aiming to increase the intelligence, adaptability and interactivity of HCI systems by combining demographic data (gender) and behavioural data (gaze), thereby enabling the development of a range of real-world assistive-technology applications. The gender recognition algorithm uses Fisher Vectors as facial features, encoded from low-level local features in facial images. We experimented with four types of low-level features: greyscale values, Local Binary Patterns (LBP), LBP histograms and Scale Invariant Feature Transform (SIFT) descriptors. The corresponding Fisher Vectors were classified using a linear Support Vector Machine. The algorithm has been tested on the FERET, LFW and FRGCv2 databases, yielding 97.7%, 92.5% and 96.7% accuracy respectively. The eye centre localisation algorithm takes a modular approach, following a coarse-to-fine, global-to-regional scheme and utilising isophote and gradient features. A Selective Oriented Gradient filter has been specifically designed to detect and remove strong gradients from eyebrows, eye corners and self-shadows (which sabotage most eye centre localisation methods). The trajectories of the eye centres are then defined as gaze gestures for active HCI. The eye centre localisation algorithm has been compared with 10 other state-of-the-art algorithms with similar functionality and outperformed them in accuracy while maintaining excellent real-time performance. These methods have been employed to develop a data recovery system that can support the implementation of advanced assistive-technology tools. The high accuracy, reliability and real-time performance achieved for attention monitoring, gaze gesture control and recovery of demographic data can enable the advanced human-robot interaction needed for systems that provide assistance with everyday actions, thereby improving the quality of life for the elderly and/or disabled. (C) 2016 Elsevier Inc. All rights reserved.
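The gender-recognition pipeline summarised above (low-level local descriptors, Fisher Vector encoding, linear SVM) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes dense greyscale patches as the local descriptor (the simplest of the four feature types listed), uses scikit-learn's GaussianMixture and LinearSVC, and the function names (dense_patches, fisher_vector, train_and_predict) and parameter values (patch size, number of GMM components) are illustrative assumptions.

```python
# Sketch only: local descriptors -> Fisher Vector encoding (diagonal GMM) -> linear SVM.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC


def dense_patches(image, patch=8, step=4):
    """Extract flattened greyscale patches on a dense grid (illustrative descriptor)."""
    h, w = image.shape
    feats = [image[y:y + patch, x:x + patch].ravel()
             for y in range(0, h - patch + 1, step)
             for x in range(0, w - patch + 1, step)]
    return np.asarray(feats, dtype=np.float64)


def fisher_vector(descriptors, gmm):
    """Encode a set of local descriptors as a Fisher Vector w.r.t. a diagonal-covariance GMM."""
    T = descriptors.shape[0]
    gamma = gmm.predict_proba(descriptors)               # posteriors, shape (T, K)
    mu, var, w = gmm.means_, gmm.covariances_, gmm.weights_
    sigma = np.sqrt(var)
    diff = (descriptors[:, None, :] - mu) / sigma        # shape (T, K, D)
    # Gradients of the log-likelihood w.r.t. the means and standard deviations.
    g_mu = (gamma[:, :, None] * diff).sum(axis=0) / (T * np.sqrt(w)[:, None])
    g_sigma = (gamma[:, :, None] * (diff ** 2 - 1)).sum(axis=0) / (T * np.sqrt(2 * w)[:, None])
    fv = np.hstack([g_mu.ravel(), g_sigma.ravel()])
    # Power ("signed square root") and L2 normalisation, as is standard for Fisher Vectors.
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)


def train_and_predict(train_images, train_labels, test_images, n_components=64):
    """Assumes pre-cropped greyscale face arrays and binary gender labels (hypothetical inputs)."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    gmm.fit(np.vstack([dense_patches(im) for im in train_images]))
    X_train = np.array([fisher_vector(dense_patches(im), gmm) for im in train_images])
    X_test = np.array([fisher_vector(dense_patches(im), gmm) for im in test_images])
    clf = LinearSVC().fit(X_train, train_labels)
    return clf.predict(X_test)
```

Similarly, the eye centre localisation stage is described as gradient-based and coarse-to-fine; the Selective Oriented Gradient filter itself is specific to the paper and is not reproduced here. The sketch below shows only a generic gradient-agreement objective of the kind such methods build on: the candidate centre that maximises the mean squared dot product between normalised displacement vectors and normalised image gradients. The function name eye_centre and the gradient-magnitude threshold are illustrative assumptions.

```python
# Sketch of a generic gradient-agreement objective for eye centre localisation.
import numpy as np


def eye_centre(eye_region):
    """Return (row, col) of the candidate centre that maximises gradient agreement."""
    gy, gx = np.gradient(eye_region.astype(np.float64))
    mag = np.hypot(gx, gy)
    mask = mag > mag.mean()                       # keep only the stronger gradients
    ngx, ngy = gx[mask] / mag[mask], gy[mask] / mag[mask]
    ys, xs = np.nonzero(mask)

    h, w = eye_region.shape
    best, best_score = (0, 0), -np.inf
    # Exhaustive search over candidate centres; a real coarse-to-fine scheme
    # would evaluate a sparse grid first and refine around the best response.
    for cy in range(h):
        for cx in range(w):
            dy, dx = ys - cy, xs - cx
            norm = np.hypot(dx, dy) + 1e-12
            # Mean squared dot product between displacement and gradient directions.
            score = np.mean(((dx / norm) * ngx + (dy / norm) * ngy) ** 2)
            if score > best_score:
                best, best_score = (cy, cx), score
    return best
```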
Pages: 32-50
Number of pages: 19
Related papers
50 records in total
  • [1] Eye center localization and gaze gesture recognition for human-computer interaction
    Zhang, Wenhao
    Smith, Melvyn L.
    Smith, Lyndon N.
    Farooq, Abdul
    JOURNAL OF THE OPTICAL SOCIETY OF AMERICA A-OPTICS IMAGE SCIENCE AND VISION, 2016, 33 (03) : 314 - 325
  • [2] Recognition of hand gesture to human-computer interaction
    Lee, LK
    Kim, S
    Choi, YK
    Lee, MH
    IECON 2000: 26TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, VOLS 1-4: 21ST CENTURY TECHNOLOGIES AND INDUSTRIAL OPPORTUNITIES, 2000, : 2117 - 2122
  • [3] Dynamic gesture recognition and human-computer interaction
    Zhang, Jiali
    Liu, Guixi
    PROCEEDINGS OF THE 2015 INTERNATIONAL INDUSTRIAL INFORMATICS AND COMPUTER ENGINEERING CONFERENCE, 2015, : 1836 - 1839
  • [4] A hand gesture recognition technique for human-computer interaction
    Kiliboz, Nurettin Cagri
    Gudukbay, Ugur
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2015, 28 : 97 - 104
  • [5] Human-computer interaction system based on gesture recognition
    Li, Wei
    Zhang, Honglei
    Zhang, Zhilong
    Li, Chuwei
    SECOND INTERNATIONAL CONFERENCE ON OPTICS AND IMAGE PROCESSING (ICOIP 2022), 2022, 12328
  • [6] Face and hand gesture recognition for human-computer interaction
    Hongo, H
    Ohya, M
    Yasumoto, M
    Yamamoto, K
    15TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 2, PROCEEDINGS: PATTERN RECOGNITION AND NEURAL NETWORKS, 2000, : 921 - 924
  • [7] Design of hand gesture recognition system for human-computer interaction
    Tsai, Tsung-Han
    Huang, Chih-Chi
    Zhang, Kung-Long
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (9-10) : 5989 - 6007
  • [8] A visual system for hand gesture recognition in human-computer interaction
    Okkonen, Matti-Antero
    Kellokumpu, Vili
    Pietikäinen, Matti
    Heikkilä, Janne
    IMAGE ANALYSIS, PROCEEDINGS, 2007, 4522 : 709 - +
  • [9] Gesture recognition for human-computer interaction using neural networks
    Zhang, YX
    Yuan, JG
    8TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING, VOLS 1-3, PROCEEDING, 2001, : 735 - 740
  • [10] THE METHOD FOR HUMAN-COMPUTER INTERACTION BASED ON HAND GESTURE RECOGNITION
    Raudonis, Vidas
    Jonaitis, Domas
    PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON ELECTRICAL AND CONTROL TECHNOLOGIES, 2013, : 45 - 49