Attention-based CNN-LSTM for high-frequency multiple cryptocurrency trend prediction

Cited by: 13
Authors
Peng, Peng [1 ,4 ]
Chen, Yuehong [2 ]
Lin, Weiwei [3 ,4 ]
Wang, James Z. [5 ]
Affiliations
[1] South China Univ Technol, Sch Future Technol, Guangzhou 510000, Peoples R China
[2] Guangdong Polytech Normal Univ, Sch Math & Syst Sci, Guangzhou 510665, Peoples R China
[3] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510000, Peoples R China
[4] Peng Cheng Lab, Shenzhen 518000, Peoples R China
[5] Clemson Univ, Sch Comp, Clemson, SC USA
Funding
US National Science Foundation; National Natural Science Foundation of China; US National Institutes of Health
Keywords
Time-series forecasting; Cryptocurrency; Labeling method; Attention mechanism; TIME-SERIES; MODEL;
DOI
10.1016/j.eswa.2023.121520
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
With the prices of Bitcoin, Ethereum, and many other cryptocurrencies climbing, the cryptocurrency market has become one of the most popular investment areas in recent years. Unlike more stable financial derivatives, the cryptocurrency market is highly volatile, which calls for a high-frequency prediction model for quantitative trading. However, excessive trading becomes a critical issue because prediction results are unstable and error rates are high. To mitigate this problem, based on observations of high-frequency data, we replace the original series with a local minimum series and propose a more stable triple trend labeling method that reduces the number of trades by influencing the training of the model. In addition, a new attention-based CNN-LSTM model for multiple cryptocurrencies (ACLMC) is proposed to improve model performance by exploiting correlations across frequencies and currencies, and to smooth out the investment risk associated with prediction errors by supporting simultaneous multi-currency predictions. Experiments show that our labeling method combined with ACLMC achieves much better financial metrics and far fewer transactions than traditional baselines.
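The abstract does not spell out the labeling procedure; the sketch below illustrates the general idea of labeling trends on a local-minimum-filtered price series with a symmetric return threshold. The sliding-window filter, the `horizon` lookahead, and the threshold `tau` are all illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def local_minimum_series(prices, window=3):
    # Replace each point with the minimum over a sliding window -
    # a simple stand-in for the paper's local-minimum series.
    n = len(prices)
    out = np.empty(n)
    half = window // 2
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out[i] = prices[lo:hi].min()
    return out

def triple_trend_labels(prices, horizon=5, tau=0.002):
    # Three-class trend labels on the smoothed series:
    # +1 (up) if the forward return exceeds tau,
    # -1 (down) if it is below -tau, else 0 (flat).
    smoothed = local_minimum_series(np.asarray(prices, dtype=float))
    labels = np.zeros(len(prices), dtype=int)
    for i in range(len(prices) - horizon):
        ret = (smoothed[i + horizon] - smoothed[i]) / smoothed[i]
        if ret > tau:
            labels[i] = 1
        elif ret < -tau:
            labels[i] = -1
    return labels
```

Filtering before labeling suppresses single-tick noise, so labels flip less often and a strategy trading on them generates fewer transactions.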
Pages: 12
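The abstract describes ACLMC only at a high level (convolution, LSTM, attention, joint multi-currency output). A minimal sketch of that architecture family follows, assuming stacked currency features on the channel axis, temporal attention over LSTM states, and three trend classes per currency; all layer sizes and names are hypothetical.

```python
import torch
import torch.nn as nn

class ACLMCSketch(nn.Module):
    """Hypothetical attention-based CNN-LSTM for joint
    multi-currency trend classification (not the paper's exact model)."""

    def __init__(self, n_currencies=3, n_features=5, hidden=32):
        super().__init__()
        self.n_currencies = n_currencies
        # 1-D convolution over time, mixing all currency features.
        self.conv = nn.Conv1d(n_currencies * n_features, hidden,
                              kernel_size=3, padding=1)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)          # scores each time step
        self.head = nn.Linear(hidden, n_currencies * 3)  # 3 classes each

    def forward(self, x):
        # x: (batch, time, n_currencies * n_features)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        out, _ = self.lstm(h)                      # (batch, time, hidden)
        w = torch.softmax(self.attn(out), dim=1)   # attention over time
        ctx = (w * out).sum(dim=1)                 # weighted context vector
        return self.head(ctx).view(-1, self.n_currencies, 3)
```

A single forward pass on a batch of windows yields per-currency logits, so one model serves all currencies at once, which is what lets prediction errors be diversified across currencies.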
Related papers
50 records total
  • [1] An Attention-Based CNN-LSTM Method for Effluent Wastewater Quality Prediction
    Li, Yue
    Kong, Bin
    Yu, Weiwei
    Zhu, Xingliang
    APPLIED SCIENCES-BASEL, 2023, 13 (12):
  • [2] An Attention-Based CNN-LSTM Model with Limb Synergy for Joint Angles Prediction
    Zhu, Chang
    Liu, Quan
    Meng, Wei
    Ai, Qingsong
    Xie, Sheng Q.
    2021 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2021, : 747 - 752
  • [3] Intrusion Detection Using Attention-Based CNN-LSTM Model
    Al-Omar, Ban
    Trabelsi, Zouheir
    ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, AIAI 2023, PT I, 2023, 675 : 515 - 526
  • [4] Urban PM2.5 Concentration Prediction via Attention-Based CNN-LSTM
    Li, Songzhou
    Xie, Gang
    Ren, Jinchang
    Guo, Lei
    Yang, Yunyun
    Xu, Xinying
    APPLIED SCIENCES-BASEL, 2020, 10 (06):
  • [5] MALICIOUS URL RECOGNITION AND DETECTION USING ATTENTION-BASED CNN-LSTM
    Peng, Yongfang
    Tian, Shengwei
    Yu, Long
    Lv, Yalong
    Wang, Ruijin
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2019, 13 (11) : 5580 - 5593
  • [6] KianNet: A Violence Detection Model Using an Attention-Based CNN-LSTM Structure
    Vosta, Soheil
    Yow, Kin-Choong
    IEEE ACCESS, 2024, 12 : 2198 - 2209
  • [7] An attention-based CNN-LSTM model for subjectivity detection in opinion-mining
    Sagnika, Santwana
    Mishra, Bhabani Shankar Prasad
    Meher, Saroj K.
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (24): : 17425 - 17438
  • [8] DNACoder: a CNN-LSTM attention-based network for genomic sequence data compression
    Sheena, K. S.
    Nair, Madhu S.
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (29) : 18363 - 18376
  • [9] Machine Fault Detection Using a Hybrid CNN-LSTM Attention-Based Model
    Borre, Andressa
    Seman, Laio Oriel
    Camponogara, Eduardo
    Stefenon, Stefano Frizzo
    Mariani, Viviana Cocco
    Coelho, Leandro dos Santos
    SENSORS, 2023, 23 (09)