A system for feature classification of emotions based on Speech Analysis; Applications to Human-Robot Interaction

Cited by: 0
Authors
Rabiei, Mohammad [1 ]
Gasparetto, Alessandro [1 ]
Affiliation
[1] Univ Udine, Dept Elect Engn Mech Engn & Management, Via Sci 206, I-33100 Udine, Italy
Keywords
formant; pitch; speech analysis; speech rate; spectral features; recognition
DOI
Not available
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
A system for the recognition of emotions based on speech analysis can have interesting applications in human-robot interaction. A robot should properly couple sound recognition and perception in order to create the desired emotional interaction with humans. Advanced research in this field will be based on sound analysis and the recognition of emotions in spontaneous dialogue. In this paper, we report the results of an exploratory study on a methodology to automatically recognize and classify basic emotional states. The study investigates the appropriateness of using acoustic and phonetic properties of emotive speech with minimal use of signal processing algorithms. The efficiency of the methodology was evaluated through experimental tests on adult European speakers. The speakers had to repeat six simple sentences in English in order to emphasize features of the pitch (peak, value and range), the intensity of the speech, the formants and the speech rate. The proposed methodology uses the freeware program PRAAT and consists of generating and analyzing graphs of the pitch, formants and intensity of the speech signal in order to classify the basic emotions. The proposed model provided successful recognition of the basic emotions in most of the cases.
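
As an illustration of the kind of acoustic analysis described in the abstract, the sketch below extracts the pitch statistics (peak, mean value and range), the mean intensity and the first two formants from a recorded utterance. This is a minimal sketch, not the authors' implementation: it assumes the praat-parselmouth Python bindings to PRAAT, and the file name utterance.wav, the mid-utterance formant sampling point and the voiced-fraction proxy for speech rate are illustrative choices that do not come from the paper.

import numpy as np
import parselmouth

def extract_features(wav_path):
    # Load the utterance through the PRAAT bindings.
    snd = parselmouth.Sound(wav_path)

    # Pitch contour (F0); unvoiced frames are reported as 0 Hz and dropped.
    pitch = snd.to_pitch()
    f0_all = pitch.selected_array['frequency']
    f0 = f0_all[f0_all > 0]

    # Intensity contour of the utterance (dB).
    intensity = snd.to_intensity()

    # First and second formants, sampled at the temporal midpoint
    # of the utterance (an illustrative choice, not the paper's).
    formant = snd.to_formant_burg()
    t_mid = snd.duration / 2.0

    # Speech rate is normally measured in syllables per second; the
    # voiced-frame fraction below is only a crude proxy for this sketch.
    voiced_fraction = len(f0) / max(len(f0_all), 1)

    return {
        'pitch_peak': float(np.max(f0)),
        'pitch_mean': float(np.mean(f0)),
        'pitch_range': float(np.max(f0) - np.min(f0)),
        'intensity_mean': float(np.mean(intensity.values)),
        'formant_1': formant.get_value_at_time(1, t_mid),
        'formant_2': formant.get_value_at_time(2, t_mid),
        'voiced_fraction': voiced_fraction,
    }

print(extract_features('utterance.wav'))

In the paper, comparable feature values are read from the PRAAT pitch, formant and intensity graphs across the six test sentences; any mapping from such features to the basic emotions would have to follow the classification rules described in the paper itself.
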
Pages: 795 - 800
Page count: 6