Gender and gaze gesture recognition for human-computer interaction

Cited by: 35
Authors
Zhang, Wenhao [1 ]
Smith, Melvyn L. [1 ]
Smith, Lyndon N. [1 ]
Farooq, Abdul [1 ]
Affiliations
[1] Univ W England, Bristol Robot Lab, Ctr Machine Vis, T Block,Frenchay Campus,Coldharbour Lane, Bristol BS16 1QY, Avon, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Assistive HCI; Gender recognition; Eye centre localisation; Gaze analysis; Directed advertising; CLASSIFICATION; IMAGES; SHAPE;
DOI
10.1016/j.cviu.2016.03.014
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The identification of visual cues in facial images has been widely explored in the broad area of computer vision. However, theoretical analyses are often not transformed into widespread assistive Human-Computer Interaction (HCI) systems, due to factors such as inconsistent robustness, low efficiency, large computational expense or strong dependence on complex hardware. We present a novel gender recognition algorithm, a modular eye centre localisation approach and a gaze gesture recognition method, aiming to escalate the intelligence, adaptability and interactivity of HCI systems by combining demographic data (gender) and behavioural data (gaze) to enable development of a range of real-world assistive-technology applications. The gender recognition algorithm utilises Fisher Vectors as facial features, which are encoded from low-level local features in facial images. We experimented with four types of low-level features: greyscale values, Local Binary Patterns (LBP), LBP histograms and Scale Invariant Feature Transform (SIFT). The corresponding Fisher Vectors were classified using a linear Support Vector Machine. The algorithm has been tested on the FERET database, the LFW database and the FRGCv2 database, yielding 97.7%, 92.5% and 96.7% accuracy respectively. The eye centre localisation algorithm has a modular approach, following a coarse-to-fine, global-to-regional scheme and utilising isophote and gradient features. A Selective Oriented Gradient filter has been specifically designed to detect and remove strong gradients from eyebrows, eye corners and self-shadows (which sabotage most eye centre localisation methods). The trajectories of the eye centres are then defined as gaze gestures for active HCI. The eye centre localisation algorithm has been compared with 10 other state-of-the-art algorithms with similar functionality and has outperformed them in terms of accuracy while maintaining excellent real-time performance.
The above methods have been employed to develop a data recovery system that can be used to implement advanced assistive-technology tools. The high accuracy, reliability and real-time performance achieved for attention monitoring, gaze gesture control and recovery of demographic data can enable the advanced human-robot interaction needed for developing systems that provide assistance with everyday actions, thereby improving the quality of life for the elderly and/or disabled. (C) 2016 Elsevier Inc. All rights reserved.
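The abstract's gender-recognition pipeline (low-level local features encoded into Fisher Vectors, then classified with a linear SVM) can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: it computes only the Fisher Vector gradients with respect to the means of an assumed pre-trained diagonal GMM, whereas the full formulation also includes gradients with respect to the mixture weights and variances, and it omits the SVM stage.

```python
import numpy as np

def fisher_vector(descriptors, weights, means, sigmas):
    """Encode a set of local descriptors into a (simplified) Fisher Vector.

    descriptors : (N, D) low-level features (e.g. SIFT or LBP patches).
    weights, means, sigmas : parameters of a K-component diagonal GMM,
        shapes (K,), (K, D), (K, D); assumed fitted on training descriptors.
    Returns a (K*D,) power- and L2-normalised vector (means-gradient only).
    """
    N, _ = descriptors.shape
    # Posterior responsibilities gamma[n, k] under the diagonal GMM.
    diff = descriptors[:, None, :] - means[None, :, :]            # (N, K, D)
    log_p = -0.5 * np.sum((diff / sigmas) ** 2
                          + np.log(2 * np.pi * sigmas ** 2), axis=2)
    log_p += np.log(weights)
    log_p -= log_p.max(axis=1, keepdims=True)                     # stability
    gamma = np.exp(log_p)
    gamma /= gamma.sum(axis=1, keepdims=True)
    # Gradient with respect to the means, normalised by component weight.
    fv = np.sum(gamma[:, :, None] * diff / sigmas, axis=0)        # (K, D)
    fv /= N * np.sqrt(weights)[:, None]
    fv = fv.ravel()
    # Power and L2 normalisation, as is standard for Fisher Vectors.
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)
```

The resulting fixed-length vector would then be fed to a linear SVM for the male/female decision; the fixed-length property is what lets a variable number of local descriptors per face map to a single classifier input.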
Pages: 32-50
Number of pages: 19
Related papers
50 records in total
  • [41] Dynamic hand gesture recognition using vision-based approach for human-computer interaction
    Singha, Joyeeta
    Roy, Amarjit
    Laskar, Rabul Hussain
    NEURAL COMPUTING & APPLICATIONS, 2018, 29 (04) : 1129 - 1141
  • [42] Research on Vision-Based Multi-user Gesture Recognition Human-Computer Interaction
    Zhang, Guofeng
    Zhang, Dongming
    7TH INTERNATIONAL CONFERENCE ON SYSTEM SIMULATION AND SCIENTIFIC COMPUTING ASIA SIMULATION CONFERENCE 2008, VOLS 1-3, 2008, : 1455 - 1458
  • [43] Hand Gesture Recognition based on Motion History Images for a Simple Human-Computer Interaction System
    Timotius, Ivanna K.
    Setyawan, Iwan
    INTERNATIONAL CONFERENCE ON GRAPHIC AND IMAGE PROCESSING (ICGIP 2012), 2013, 8768
  • [44] Gesture Recognition for Enhancing Human Computer Interaction
    Chakravarthi, Sangapu Sreenivasa
    Rao, B. Narendra Kumar
    Challa, Nagendra Panini
    Ranjana, R.
    Rai, Ankush
    JOURNAL OF SCIENTIFIC & INDUSTRIAL RESEARCH, 2023, 82 (04) : 438 - 443
  • [45] Hand Gesture Recognition for Human Computer Interaction
    Haria, Aashni
    Subramanian, Archanasri
    Asokkumar, Nivedhitha
    Poddar, Shristi
    Nayak, Jyothi S.
    7TH INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING & COMMUNICATIONS (ICACC-2017), 2017, 115 : 367 - 374
  • [46] ADAPTIVE GESTURE RECOGNITION IN HUMAN COMPUTER INTERACTION
    Caridakis, George
    Karpouzis, Kostas
    Drosopoulos, Nasos
    Kollias, Stefanos
    2009 10TH INTERNATIONAL WORKSHOP ON IMAGE ANALYSIS FOR MULTIMEDIA INTERACTIVE SERVICES, 2009, : 270 - 274
  • [47] Hand Gesture Control for Human-Computer Interaction with Deep Learning
    Chua, S. N. David
    Chin, K. Y. Richard
    Lim, S. F.
    Jain, Pushpdant
    JOURNAL OF ELECTRICAL ENGINEERING & TECHNOLOGY, 2022, 17 (03) : 1961 - 1970
  • [48] Perception, language, and gesture: towards a natural human-computer interaction
    de Angeli, A.
    Gerbino, W.
    Romary, L.
    Wolff, F.
    PERCEPTION, 1999, 28 : 146 - 147
  • [49] Hand Shape Recognition for Human-Computer Interaction
    Marnik, Joanna
    MAN-MACHINE INTERACTIONS, 2009, 59 : 95 - 102
  • [50] Implicit Human-Computer Interaction by Posture Recognition
    Maier, Enrico
    DIGITAL HUMAN MODELING, 2011, 6777 : 143 - 150