Emotional facial expression classification for multimodal user interfaces

Cited by: 0
Authors
Cerezo, Eva [1 ]
Hupont, Isabelle [1 ]
Affiliations
[1] Univ Zaragoza, Dept Informat & Ingn Sistemas, Zaragoza 50018, Spain
Keywords
facial expression; multimodal interface;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We present a simple and computationally feasible method for automatic emotional classification of facial expressions. We propose the use of 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information: essentially five distances, the presence of wrinkles, and mouth shape. The method defines and detects the six basic emotions (plus the neutral one) in terms of this information and has been fine-tuned with a database of 399 images. At present the method is applied to static images; application to image sequences is under development. Extracting such information about the user is of great interest for the development of new multimodal user interfaces.
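The abstract outlines a rule-based pipeline: measure a few distances between facial feature points, check for wrinkles and mouth shape, then map those measurements to one of the six basic emotions or neutral. The paper does not list its actual feature-point indices, distances, or decision rules, so the sketch below is purely illustrative: every distance name, threshold, and rule is an assumption, not the authors' method.

```python
# Illustrative sketch of a distance-based emotion classifier in the
# spirit of the abstract. All names, thresholds, and rules below are
# hypothetical; the paper's actual rules are not reproduced here.
from dataclasses import dataclass

@dataclass
class FacialDistances:
    """Five distances between MPEG-4-style feature points (illustrative),
    expressed as ratios against a neutral-face reference (1.0 = neutral)."""
    eyebrow_to_eye: float
    eye_opening: float
    mouth_opening: float
    mouth_width: float
    lip_corner_to_eye: float

def classify(d: FacialDistances, wrinkles_present: bool) -> str:
    """Toy rule table mapping measurements to the six basic emotions
    plus neutral; rules are checked in a fixed priority order."""
    if d.mouth_width > 1.2 and d.lip_corner_to_eye < 0.9:
        return "joy"           # widened mouth, raised lip corners
    if d.eyebrow_to_eye > 1.3 and d.eye_opening > 1.2 and d.mouth_opening > 1.3:
        return "surprise"      # raised brows, wide eyes, open mouth
    if d.eyebrow_to_eye < 0.8 and wrinkles_present:
        return "anger"         # lowered brows plus frown wrinkles
    if d.eye_opening > 1.2 and d.mouth_opening > 1.1:
        return "fear"          # widened eyes with slightly open mouth
    if d.lip_corner_to_eye > 1.1 and d.eyebrow_to_eye > 1.1:
        return "sadness"       # drooping lip corners, raised inner brows
    if wrinkles_present and d.mouth_opening < 0.9:
        return "disgust"       # nose wrinkles with compressed mouth
    return "neutral"           # no rule fired
```

For example, a face with all ratios at 1.0 and no wrinkles falls through every rule and is labeled neutral, while a widened mouth with raised corners triggers the joy rule first.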
Pages: 405-413
Page count: 9
Related papers
50 records in total
  • [21] Multimodal user interfaces in the Open Agent Architecture
    Moran, DB
    Cheyer, AJ
    Julia, LE
    Martin, DL
    Park, S
    KNOWLEDGE-BASED SYSTEMS, 1998, 10 (05) : 295 - 303
  • [22] Transformer-Based Multimodal Emotional Perception for Dynamic Facial Expression Recognition in the Wild
    Zhang, Xiaoqin
    Li, Min
    Lin, Sheng
    Xu, Hang
    Xiao, Guobao
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (05) : 3192 - 3203
  • [23] Nonverbally smart User Interfaces: Postural and facial expression data in human computer interaction
    Bahr, G. Susanne
    Balaban, Carey
    Milanova, Mariofanna
    Choe, Howard
    UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION: AMBIENT INTERACTION, PT 2, PROCEEDINGS, 2007, 4555 : 740 - +
  • [24] Multimodal User Interaction in Smart Environments: Delivering Distributed User Interfaces
    Blumendorf, Marco
    Feuerstack, Sebastian
    Albayrak, Sahin
    CONSTRUCTING AMBIENT INTELLIGENCE, 2008, 11 : 113 - 120
  • [25] Using automatic facial expression classification for contents indexing based on the emotional component
    Kowalik, Uwe
    Aoki, Terumasa
    Yasuda, Hiroshi
    EMBEDDED AND UBIQUITOUS COMPUTING, PROCEEDINGS, 2006, 4096 : 519 - 528
  • [26] THE ASSESSMENT OF FACIAL EMOTIONAL EXPRESSION
    BOROD, JC
    KOFF, E
    JOURNAL OF CLINICAL AND EXPERIMENTAL NEUROPSYCHOLOGY, 1987, 9 (01) : 58 - 58
  • [27] Development of Multimodal User Interfaces to Internet for Common People
    Samanta, Debasis
    Ghosh, Soumalya
    Dey, Somnath
    Sarcar, Sayan
    Sharma, Manoj Kumar
    Saha, Pradipta Kumar
    Maiti, Santa
4TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN COMPUTER INTERACTION (IHCI 2012), 2012
  • [28] A language perspective on the development of plastic multimodal user interfaces
    Jean-Sébastien Sottet
    Gaëlle Calvary
    Joëlle Coutaz
    Jean-Marie Favre
    Jean Vanderdonckt
    Adrian Stanciulescu
    Sophie Lepreux
    Journal on Multimodal User Interfaces, 2007, 1 : 1 - 12
  • [29] Cross-disciplinary approaches to multimodal user interfaces
    Gualtiero Volpe
    Antonio Camurri
    Thierry Dutoit
    Maurizio Mancini
    Journal on Multimodal User Interfaces, 2010, 4 : 1 - 2
  • [30] User-centered Modeling and evaluation of multimodal interfaces
    Oviatt, S
    PROCEEDINGS OF THE IEEE, 2003, 91 (09) : 1457 - 1468