Long Short-Term Attention

Cited by: 3
Authors
Zhong, Guoqiang [1 ]
Lin, Xin [1 ]
Chen, Kang [1 ]
Li, Qingyang [1 ]
Huang, Kaizhu [2 ]
Affiliations
[1] Ocean Univ China, Dept Comp Sci & Technol, Qingdao 266100, Peoples R China
[2] Xian Jiaotong Liverpool Univ, Dept Elect & Elect Engn, SIP, Suzhou 215123, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China
Keywords
Machine learning; Sequence learning; Attention mechanism; Long short-term memory; Long short-term attention; BIDIRECTIONAL LSTM; SALIENCY DETECTION; BOTTOM-UP; FRAMEWORK;
DOI
10.1007/978-3-030-39431-8_5
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Attention is an important cognitive process in humans that helps them concentrate on critical information during perception and learning. However, although many machine learning models can remember information from data, they lack an attention mechanism. For example, the long short-term memory (LSTM) network can remember sequential information, but it cannot pay special attention to parts of the sequences. In this paper, we present a novel model called long short-term attention (LSTA), which seamlessly integrates the attention mechanism into the inner cell of LSTM. Beyond modeling long and short-term dependencies, LSTA can focus on important information in the sequences through the attention mechanism. Extensive experiments demonstrate that LSTA outperforms LSTM and related models on sequence learning tasks.
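
Editorial note: the abstract describes folding attention into the LSTM cell itself but does not give the equations here. The PyTorch snippet below is a minimal sketch of one way an attention gate could sit inside a recurrent cell; the class name LSTACellSketch, the gate wiring, and the softmax over cell-state dimensions are illustrative assumptions, not the authors' exact LSTA formulation from the paper.

import torch
import torch.nn as nn

class LSTACellSketch(nn.Module):
    # Illustrative LSTM cell with an added attention gate; NOT the paper's exact LSTA equations.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # One linear map produces the four standard LSTM pre-activations (input, forget, output, candidate).
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Assumed attention gate: scores each state dimension from [x_t, h_{t-1}].
        self.attn = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x_t, state):
        h_prev, c_prev = state
        z = torch.cat([x_t, h_prev], dim=-1)
        i, f, o, g = self.gates(z).chunk(4, dim=-1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        g = torch.tanh(g)
        # Soft attention over the cell-state dimensions (an assumption of this sketch).
        a = torch.softmax(self.attn(z), dim=-1)
        c_t = f * c_prev + i * g
        # The attention weights modulate what the cell exposes through the hidden state.
        h_t = o * torch.tanh(a * c_t)
        return h_t, (h_t, c_t)

# Example: one recurrent step for a batch of 8 sequences, 16-dim inputs, 32-dim state.
cell = LSTACellSketch(16, 32)
h0 = torch.zeros(8, 32)
c0 = torch.zeros(8, 32)
out, (h1, c1) = cell(torch.randn(8, 16), (h0, c0))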
Pages: 45-54
Number of pages: 10
Related Papers
50 records in total
  • [31] Intrusion Detection Based on Bidirectional Long Short-Term Memory with Attention Mechanism
    Yang, Yongjie
    Tu, Shanshan
    Ali, Raja Hashim
    Alasmary, Hisham
    Waqas, Muhammad
    Amjad, Muhammad Nouman
    CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 74 (01) : 801 - 815
  • [32] The late ND attention waveform is sensitive to short-term memory, but not long-term memory
    Singhal, A
    Fowler, B
    INTERNATIONAL JOURNAL OF PSYCHOPHYSIOLOGY, 2002, 45 (1-2) : 130 - 130
  • [33] Sequence-Aware Recommendation with Long-Term and Short-Term Attention Memory Networks
    Chen, Daochang
    Zhang, Rui
    Qi, Jianzhong
    Yuan, Bo
    2019 20TH INTERNATIONAL CONFERENCE ON MOBILE DATA MANAGEMENT (MDM 2019), 2019 : 437 - 442
  • [34] Long short-term memory
    Hochreiter, S
    Schmidhuber, J
    NEURAL COMPUTATION, 1997, 9 (08) : 1735 - 1780
  • [35] Forecasting Short-Term Passenger Flow of Subway Stations Based on the Temporal Pattern Attention Mechanism and the Long Short-Term Memory Network
    Wei, Lingxiang
    Guo, Dongjun
    Chen, Zhilong
    Yang, Jincheng
    Feng, Tianliu
    ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2023, 12 (01)
  • [36] Bi-directional long short-term memory method based on attention mechanism and rolling update for short-term load forecasting
    Wang, Shouxiang
    Wang, Xuan
    Wang, Shaomin
    Wang, Dan
    INTERNATIONAL JOURNAL OF ELECTRICAL POWER & ENERGY SYSTEMS, 2019, 109 : 470 - 479
  • [37] Attention problems, phonological short-term memory, and visuospatial short-term memory: Differential effects on near- and long-term scholastic achievement
    Sarver, Dustin E.
    Rapport, Mark D.
    Kofler, Michael J.
    Scanlan, Sean W.
    Raiker, Joseph S.
    Altro, Thomas A.
    Bolden, Jennifer
    LEARNING AND INDIVIDUAL DIFFERENCES, 2012, 22 (01) : 8 - 19
  • [38] Short-term visual attention in adults with attention deficit disorder
    Hollingsworth, DE
    Knowlton, BJ
    JOURNAL OF COGNITIVE NEUROSCIENCE, 1998, 10 : 66 - 66
  • [39] A short-term prediction model of global ionospheric VTEC based on the combination of long short-term memory and convolutional long short-term memory
    Peng Chen
    Rong Wang
    Yibin Yao
    Hao Chen
    Zhihao Wang
    Zhiyuan An
    Journal of Geodesy, 2023, 97
  • [40] A short-term prediction model of global ionospheric VTEC based on the combination of long short-term memory and convolutional long short-term memory
    Chen, Peng
    Wang, Rong
    Yao, Yibin
    Chen, Hao
    Wang, Zhihao
    An, Zhiyuan
    JOURNAL OF GEODESY, 2023, 97 (05)