A Prior Knowledge Based Neural Attention Model for Opioid Topic Identification

Cited by: 0
Authors
Yao, Riheng [1 ,2 ,3 ]
Li, Qiudan [1 ,3 ]
Lo-Ciganic, Wei-Hsuan [4 ]
Zeng, Daniel Dajun [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, State Key Lab Management & Control Complex Syst, Inst Automat, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Shenzhen Artificial Intelligence & Data Sci Inst, Shenzhen, Peoples R China
[4] Univ Florida, Dept Pharmaceut Outcomes & Policy, Gainesville, FL USA
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China
Keywords
prior knowledge; attention; opioid; topic;
DOI
10.1109/isi.2019.8823280
CLC number (Chinese Library Classification)
TP [Automation Technology; Computer Technology]
Discipline code
0812
Abstract
The opioid epidemic has become a serious public health crisis in the United States. Social media sources such as Reddit, which contain user-generated content, may be a valuable safety surveillance platform for evaluating discussions concerning opioid use. This paper proposes a prior knowledge based neural attention model for opioid topic identification, which incorporates prior knowledge into the attention mechanism. Experimental results on a real-world dataset show that our model can extract coherent topics, and that the identified less-discussed but important topics provide more comprehensive information for opioid safety surveillance.
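For illustration only, the sketch below shows one plausible way prior knowledge could bias attention weights toward domain-relevant words when forming a sentence representation for topic identification. The toy lexicon, bilinear scoring form, bias weight gamma, and random embeddings are all assumptions made for this example; this is not the authors' implementation.

```python
# Minimal sketch (assumed design, not the paper's method): attention over word
# embeddings where a prior-knowledge lexicon score biases the attention logits
# before the softmax, so opioid-related words receive larger weights.
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabulary and embeddings (in practice: pretrained word embeddings).
vocab = ["withdrawal", "tapering", "naloxone", "movie", "weather", "dose"]
emb_dim = 8
E = rng.normal(size=(len(vocab), emb_dim))

# Prior knowledge: hypothetical lexicon of opioid-related terms with relevance weights.
prior_lexicon = {"withdrawal": 1.0, "tapering": 0.8, "naloxone": 1.0, "dose": 0.6}

def prior_score(word):
    """Relevance of a word under the prior-knowledge lexicon (0 if unknown)."""
    return prior_lexicon.get(word, 0.0)

def attend(words, W, gamma=2.0):
    """Sentence embedding via prior-knowledge-biased attention.

    logits_i = x_i^T W x_avg + gamma * prior(word_i)
    a = softmax(logits); sentence = sum_i a_i * x_i
    """
    X = np.stack([E[vocab.index(w)] for w in words])   # (n, d) word vectors
    x_avg = X.mean(axis=0)                             # global sentence context
    logits = X @ W @ x_avg + gamma * np.array([prior_score(w) for w in words])
    a = np.exp(logits - logits.max())                  # numerically stable softmax
    a /= a.sum()
    return a, a @ X                                    # attention weights, (d,) embedding

W = rng.normal(scale=0.1, size=(emb_dim, emb_dim))
words = ["movie", "withdrawal", "dose", "weather"]
weights, sent_vec = attend(words, W)
print(dict(zip(words, weights.round(3))))  # lexicon words dominate the weights
```

In a full topic model, the resulting sentence embedding would typically be reconstructed from a learned topic-embedding matrix and trained with a reconstruction objective; that part is omitted here.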
Pages: 215-217
Number of pages: 3