Multi-label maximum entropy model for social emotion classification over short text

Cited by: 36
|
Authors
Li, Jun [1]
Rao, Yanghui [1]
Jin, Fengmei [1]
Chen, Huijun [1]
Xiang, Xiyun [1]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Multi-label maximum entropy model; Social emotion classification; Short text analysis; Co-training algorithm; SENTIMENT ANALYSIS; DICTIONARY;
DOI
10.1016/j.neucom.2016.03.088
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Social media provides an opportunity for many individuals to express their emotions online. Automatically classifying user emotions can help us understand the preferences of the general public, which has a number of useful applications, including sentiment retrieval and opinion summarization. Short text is prevalent on the Web, especially in tweets, questions, and news headlines. Most of the existing social emotion classification models focus on the detection of user emotions conveyed by long documents. In this paper, we introduce a multi-label maximum entropy (MME) model for user emotion classification over short text. MME generates rich features by modeling multiple emotion labels and valence scored by numerous users jointly. To improve the robustness of the method on varied-scale corpora, we further develop a co-training algorithm for MME and use the L-BFGS algorithm for the generalized MME model. Experiments on real-world short text collections validate the effectiveness of these methods on social emotion classification over sparse features. We also demonstrate the application of generated lexicons in identifying entities and behaviors that convey different social emotions. (C) 2016 Elsevier B.V. All rights reserved.
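For orientation, the sketch below shows a plain maximum entropy classifier (multinomial logistic regression) over sparse bag-of-words features of short texts, fitted with an L-BFGS solver. It is an illustrative baseline built on assumed toy data, not the authors' MME implementation: the headlines and emotion labels are invented, and each text carries a single hard label, whereas MME jointly models the full distribution of emotion votes and valence contributed by many readers and adds a co-training step over unlabeled text.

# Illustrative maximum entropy baseline for emotion classification of short text.
# NOT the paper's MME model: toy data, one label per document, no co-training.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical headlines, each paired with the emotion most readers voted for.
headlines = [
    "rescue team saves trapped miners after three days",
    "stock market plunges amid fears of recession",
    "city celebrates festival with record turnout",
    "flood destroys hundreds of homes overnight",
]
emotions = ["touched", "worried", "happy", "sad"]

# Sparse unigram counts: the kind of sparse feature space typical of short text.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(headlines)

# scikit-learn's "lbfgs" solver plays the role of the L-BFGS optimizer
# that the paper uses to fit its generalized MME model.
clf = LogisticRegression(solver="lbfgs", max_iter=1000)
clf.fit(X, emotions)

print(clf.predict(vectorizer.transform(["earthquake leaves thousands homeless"])))

Replacing the single hard label per document with the normalized reader-vote distribution, and adding the co-training loop over unlabeled short texts, are the parts specific to MME that this sketch deliberately omits.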
Pages: 247-256
Number of pages: 10
Related papers
50 records in total
  • [1] A Corpus-based Multi-label Emotion Classification using Maximum Entropy
    Wu, Ye
    Ren, Fuji
    NATURAL LANGUAGE PROCESSING AND COGNITIVE SCIENCE, PROCEEDINGS, 2009, : 103 - 110
  • [2] Social emotion classification of short text via topic-level maximum entropy model
    Rao, Yanghui
    Xie, Haoran
    Li, Jun
    Jin, Fengmei
    Wang, Fu Lee
    Li, Qing
    INFORMATION & MANAGEMENT, 2016, 53 (08) : 978 - 986
  • [3] Text prediction method based on multi-label attributes and improved maximum entropy model
    Yin, Yi
    Feng, Dan
    Li, Yue
    Yin, Shuifang
    Shi, Zhan
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2018, 34 (02) : 1097 - 1109
  • [4] Hierarchical Multi-Label Classification of Social Text Streams
    Ren, Zhaochun
    Peetz, Maria-Hendrike
    Liang, Shangsong
    van Dolen, Willemijn
    de Rijke, Maarten
    SIGIR'14: PROCEEDINGS OF THE 37TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2014, : 213 - 222
  • [5] A Multi-Label Text Classification Model with Enhanced Label Information
    Wang, Min
    Gao, Yan
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 329 - 334
  • [6] A Label Information Aware Model for Multi-label Text Classification
    Tian, Xiaoyu
    Qin, Yongbin
    Huang, Ruizhang
    Chen, Yanping
    NEURAL PROCESSING LETTERS, 2024, 56 (05)
  • [7] Emotion classification for short texts: an improved multi-label method
    Liu, Xuan
    Shi, Tianyi
    Zhou, Guohui
    Liu, Mingzhe
    Yin, Zhengtong
    Yin, Lirong
    Zheng, Wenfeng
    HUMANITIES & SOCIAL SCIENCES COMMUNICATIONS, 2023, 10 (01):
  • [8] Label prompt for multi-label text classification
    Song, Rui
    Liu, Zelong
    Chen, Xingbing
    An, Haining
    Zhang, Zhiqi
    Wang, Xiaoguang
    Xu, Hao
    APPLIED INTELLIGENCE, 2023, 53 (08) : 8761 - 8775