Improving stock trend prediction with pretrain multi-granularity denoising contrastive learning

Cited: 0
Authors
Wang, Mingjie [2 ]
Wang, Siyuan [3 ]
Guo, Jianxiong [1 ,2 ]
Jia, Weijia [1 ,2 ]
Affiliations
[1] Beijing Normal Univ, Adv Inst Nat Sci, Zhuhai 519087, Peoples R China
[2] BNU, HKBU United Int Coll, Dept Comp Sci, Guangdong Key Lab AI & Multimodal Data Proc, Zhuhai 519087, Peoples R China
[3] BNU, Fac Sci & Technol, Dept Math Sci, HKBU United Int Coll, Zhuhai 519087, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Contrastive learning; Multi-granularity data; Memory; Denoising; Stock trend prediction; Pre-training; TIME-SERIES; NOISY DATA; CLASSIFICATION; INDEX; MODEL;
DOI
10.1007/s10115-023-02006-1
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Stock trend prediction (STP) aims to predict price fluctuations, which is critical in financial trading. Existing STP approaches use only market data of a single granularity (e.g., daily market data). In actual financial investment, however, finer-grained data (e.g., high-frequency data) contain a large number of more detailed investment signals. This motivates us to investigate how to leverage multi-granularity market data to capture more useful information and improve accuracy on the STP task. However, the effective utilization of multi-granularity data presents major challenges. First, as multi-granularity data iterate over time, they accumulate more complex noise, which makes signals difficult to extract. Second, differences in granularity may lead to opposite target trends within the same time interval. Third, the target trends of stocks with similar features can differ considerably, and differing granularities aggravate this gap. To address these challenges, we present a self-supervised framework of multi-granularity denoising contrastive learning (MDC). Specifically, we construct a dynamic memory dictionary that obtains clear and unified representations by filtering noise and aligning multi-granularity data. Moreover, we design two contrastive learning modules in the fine-tuning stage that resolve trend differences by constructing additional self-supervised signals. In addition, in the pre-training stage, we design the granularity domain adaptation (GDA) module to address the temporal inconsistency and data imbalance associated with different granularities in financial data, alongside the memory self-distillation (MSD) module to tackle the challenge posed by a low signal-to-noise ratio. The GDA alleviates these complications by replacing a portion of the coarse-grained data with the preceding time step's fine-grained data, while the MSD filters out intrinsic noise by aligning the fine-grained representations with the distribution of the coarse-grained representations through a self-distillation mechanism. Extensive experiments on the CSI 300 and CSI 100 datasets show that our framework outperforms existing state-of-the-art systems and achieves excellent profitability in realistic investment scenarios.
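The abstract describes two pre-training components concretely enough to sketch: the GDA module replaces a portion of the coarse-grained input with the preceding time step's fine-grained data, and the MSD module aligns fine-grained representations with the distribution of the coarse-grained representations via self-distillation. Below is a minimal sketch of these two ideas in PyTorch; the function names, tensor shapes, mixing ratio, and temperature are illustrative assumptions and not the authors' actual implementation or hyperparameters.

import torch
import torch.nn.functional as F

def granularity_domain_adaptation(coarse, fine, replace_ratio=0.3):
    # coarse, fine: (batch, seq_len, feat). Replace a random subset of
    # coarse-grained time steps with the fine-grained features of the
    # preceding time step (a hypothetical reading of the GDA module).
    prev_fine = torch.roll(fine, shifts=1, dims=1)  # preceding step's fine-grained data
    mask = torch.rand(coarse.shape[:2], device=coarse.device) < replace_ratio
    return torch.where(mask.unsqueeze(-1), prev_fine, coarse)

def memory_self_distillation_loss(fine_repr, coarse_repr, temperature=0.1):
    # Align the fine-grained representation distribution with the detached
    # coarse-grained one (the teacher carries no gradient), standing in for
    # the MSD module's self-distillation objective.
    student = F.log_softmax(fine_repr / temperature, dim=-1)
    teacher = F.softmax(coarse_repr.detach() / temperature, dim=-1)
    return F.kl_div(student, teacher, reduction="batchmean")

# Usage with random tensors standing in for multi-granularity market features.
coarse = torch.randn(8, 16, 32)  # e.g., daily features
fine = torch.randn(8, 16, 32)    # e.g., aggregated high-frequency features
mixed_input = granularity_domain_adaptation(coarse, fine)
distill_loss = memory_self_distillation_loss(fine.mean(dim=1), coarse.mean(dim=1))

In this reading, the stop-gradient on the coarse-grained branch makes it act as the teacher, so the noisier fine-grained branch is pulled toward the smoother coarse-grained distribution.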
Pages: 2439 - 2466
Number of pages: 28
Related Papers
50 records in total
  • [1] Improving stock trend prediction with pretrain multi-granularity denoising contrastive learning
    Mingjie Wang
    Siyuan Wang
    Jianxiong Guo
    Weijia Jia
    Knowledge and Information Systems, 2024, 66 : 2439 - 2466
  • [2] Improving Stock Trend Prediction with Multi-granularity Denoising Contrastive Learning
    Wang, Mingjie
    Chen, Feng
    Guo, Jianxiong
    Jia, Weijia
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [3] Stock Trend Prediction with Multi-Granularity Data: A Contrastive Learning Approach with Adaptive Fusion
    Hou, Min
    Xu, Chang
    Liu, Yang
    Liu, Weiqing
    Bian, Jiang
    Wu, Le
    Li, Zhi
    Chen, Enhong
    Liu, Tie-Yan
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 700 - 709
  • [4] Multi-granularity Contrastive Learning For Tourism Recommendation
    Wang, Xiaojun
    Journal of Applied Science and Engineering, 2025, 28 (04): 901 - 909
  • [5] Multi-Granularity Spatio-Temporal Correlation Networks for Stock Trend Prediction
    Chen, Jiahao
    Xie, Liang
    Lin, Wenjing
    Wu, Yuchen
    Xu, Haijiao
    IEEE ACCESS, 2024, 12 : 67219 - 67232
  • [6] A novel complex network prediction method based on multi-granularity contrastive learning
    Sui, Shanshan
    Han, Qilong
    Lu, Dan
    Wu, Shiqing
    Xu, Guandong
    CCF TRANSACTIONS ON PERVASIVE COMPUTING AND INTERACTION, 2024,
  • [7] Multi-Granularity Contrastive Learning for Graph with Hierarchical Pooling
    Liu, Peishuo
    Zhou, Cangqi
    Liu, Xiao
    Zhang, Jing
    Li, Qianmu
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT IV, 2023, 14257 : 499 - 511
  • [8] MINING: Multi-Granularity Network Alignment Based on Contrastive Learning
    Zhang, Zhongbao
    Gao, Shuai
    Su, Sen
    Sun, Li
    Chen, Ruiyang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (12) : 12785 - 12798
  • [9] Multi-granularity contrastive learning model for next POI recommendation
    Zhu, Yunfeng
    Yao, Shuchun
    Sun, Xun
    FRONTIERS IN NEUROROBOTICS, 2024, 18
  • [10] MCL: Multi-Granularity Contrastive Learning Framework for Chinese NER
    Zhao, Shan
    Wang, ChengYu
    Hu, Minghao
    Yan, Tianwei
    Wang, Meng
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 11, 2023, : 14011 - 14019