A system for feature classification of emotions based on Speech Analysis; Applications to Human-Robot Interaction

Cited: 0
Authors
Rabiei, Mohammad [1 ]
Gasparetto, Alessandro [1 ]
Affiliations
[1] Univ Udine, Dept Elect Engn Mech Engn & Management, Via Sci 206, I-33100 Udine, Italy
Keywords
formant; pitch; speech analysis; speech rate; SPECTRAL FEATURES; RECOGNITION
DOI
Not available
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
A system for recognition of emotions based on speech analysis can have interesting applications in human-robot interaction. A robot should establish proper mutual communication between sound recognition and perception in order to create the desired emotional interaction with humans. Advanced research in this field will be based on sound analysis and on the recognition of emotions in spontaneous dialog. In this paper, we report the results of an exploratory study on a methodology to automatically recognize and classify basic emotional states. The study investigated the appropriateness of using acoustic and phonetic properties of emotive speech with minimal use of signal processing algorithms. The efficiency of the methodology was evaluated by experimental tests on adult European speakers. The speakers had to repeat six simple sentences in English in order to emphasize features of the pitch (peak, value and range), the intensity of the speech, the formants and the speech rate. The proposed methodology uses the freeware program PRAAT and consists of generating and analyzing graphs of the pitch, formants and intensity of the speech signals in order to classify the basic emotions. The proposed model provided successful recognition of the basic emotions in most cases.
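As an illustration of the feature-extraction step described in the abstract, the following is a minimal sketch in Python, assuming a mono .wav recording and the third-party praat-parselmouth package (a Python wrapper around the Praat engine the paper relies on). The function name, the mid-utterance formant sampling and the voiced-fraction speech-rate proxy are illustrative assumptions, not the authors' implementation.

    # Hedged sketch: extracting the acoustic features named in the abstract
    # (pitch peak/value/range, intensity, formants, a speech-rate proxy)
    # via praat-parselmouth. Illustrative only; the paper itself inspects
    # PRAAT's graphical pitch/formant/intensity analyses.
    import parselmouth

    def extract_emotive_features(wav_path):
        snd = parselmouth.Sound(wav_path)

        # Pitch contour (F0, Hz); Praat reports unvoiced frames as 0 Hz.
        pitch = snd.to_pitch()
        f0 = pitch.selected_array["frequency"]
        voiced = f0[f0 > 0]

        # Intensity contour (dB).
        db = snd.to_intensity().values.flatten()

        # First two formants, sampled at the utterance midpoint (an
        # assumption; the paper analyzes full formant graphs).
        formants = snd.to_formant_burg()
        t_mid = snd.duration / 2.0
        f1 = formants.get_value_at_time(1, t_mid)
        f2 = formants.get_value_at_time(2, t_mid)

        return {
            "pitch_peak": float(voiced.max()) if voiced.size else 0.0,
            "pitch_mean": float(voiced.mean()) if voiced.size else 0.0,
            "pitch_range": float(voiced.max() - voiced.min()) if voiced.size else 0.0,
            "intensity_mean_db": float(db.mean()),
            "formant_f1_hz": f1,
            "formant_f2_hz": f2,
            # Crude speech-rate proxy: fraction of voiced pitch frames.
            "voiced_fraction": float(voiced.size) / float(f0.size),
        }

Thresholding these scalar features (for instance, a raised pitch peak and a wider pitch range for high-arousal emotions such as anger or joy) would be the simplest rule-based classifier consistent with the graph-based classification the abstract describes.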
Pages: 795-800
Page count: 6
Related Papers
50 records in total
  • [21] Classification of human-robot team interaction paradigms
    Music, Selma
    Hirche, Sandra
    IFAC PAPERSONLINE, 2016, 49 (32): 42-47
  • [22] Enhancing Human Emotion Classification in Human-Robot Interaction
    Elsayed, HossamEldin
    Tawfik, Noha Seddik
    Shalash, Omar
    Ismail, Ossama
    2024 INTERNATIONAL CONFERENCE ON MACHINE INTELLIGENCE AND SMART INNOVATION, ICMISI 2024, 2024: 19-24
  • [23] An FPGA Based Architecture for Concurrent System Design Applied to Human-robot Interaction Applications
    Zhang, Lin
    Slaets, Peter
    Bruyninckx, Herman
    MOVING INTEGRATED PRODUCT DEVELOPMENT TO SERVICE CLOUDS IN THE GLOBAL ECONOMY, 2014, 1: 555-563
  • [24] Intention Based Comparative Analysis of Human-Robot Interaction
    Awais, Muhammad
    Saeed, Muhammad Yahya
    Malik, Muhammad Sheraz Arshad
    Younas, Muhammad
    Rao Iqbal Asif, Sohail
    IEEE ACCESS, 2020, 8: 205821-205835
  • [25] Learning fusion feature representation for garbage image classification model in human-robot interaction
    Li, Xi
    Li, Tian
    Li, Shaoyi
    Tian, Bin
    Ju, Jianping
    Liu, Tingting
    Liu, Hai
    INFRARED PHYSICS & TECHNOLOGY, 2023, 128
  • [26] Motivational system for human-robot interaction
    Huang, X
    Weng, JY
    COMPUTER VISION IN HUMAN-COMPUTER INTERACTION, PROCEEDINGS, 2004, 3058: 17-27
  • [27] Object recognition through human-robot interaction by speech
    Kurnia, R
    Hossain, A
    Nakamura, A
    Kuno, Y
    RO-MAN 2004: 13TH IEEE INTERNATIONAL WORKSHOP ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, PROCEEDINGS, 2004: 619-624
  • [28] CNN Based Motor Imagery EEG Classification and Human-robot Interaction
    Cheng S.-W.
    Zhou T.-C.
    Tang Z.-C.
    Fan J.
    Sun L.-Y.
    Zhu A.-J.
    Ruan Jian Xue Bao/Journal of Software, 2019, 30 (10): 3005-3016
  • [29] Paralinguistic Cues in Speech to Adapt Robot Behavior in Human-Robot Interaction
    Ashok, Ashita
    Pawlak, Jakub
    Paplu, Sarwar
    Zafar, Zuhair
    Berns, Karsten
    2022 9TH IEEE RAS/EMBS INTERNATIONAL CONFERENCE ON BIOMEDICAL ROBOTICS AND BIOMECHATRONICS (BIOROB 2022), 2022
  • [30] Are Discrete Emotions Useful in Human-Robot Interaction? Feedback from Motion Capture Analysis
    Lewis, Matthew
    Canamero, Lola
    2013 HUMAINE ASSOCIATION CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2013: 97-102