Learning fused lasso parameters in portfolio selection via neural networks

Cited by: 0
Authors
Corsaro S. [1 ]
De Simone V. [2 ]
Marino Z. [1 ]
Scognamiglio S. [1 ]
Affiliations
[1] Department of Management and Quantitative Studies, Parthenope University of Naples, Via Generale Parisi 13, Napoli
[2] Department of Mathematics and Physics, University of Campania “Luigi Vanvitelli”, Viale Lincoln 5, Caserta
Keywords
Fused lasso; Long short-term memory; Neural network; Portfolio selection; Regularization parameters;
DOI
10.1007/s11135-024-01858-1
Abstract
In recent years, fused lasso models have become popular in several fields, such as computer vision, classification and finance. In portfolio selection, they can be used to penalize active positions and portfolio turnover. Although efficient algorithms and software for solving non-smooth optimization problems have been developed, the amount of regularization to apply is a critical issue, especially when a given financial aim must be achieved. We propose a data-driven approach for learning the regularization parameters in a fused lasso formulation of the multi-period portfolio selection problem, one that is able to realize a given financial target. We design a neural network architecture based on recurrent networks for learning the functional dependence between the regularization parameters and the input data. In particular, Long Short-Term Memory networks are adopted for their ability to process sequential data, such as the time series of asset returns. Numerical experiments performed on market data show the effectiveness of our approach. © The Author(s) 2024.
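As an illustration of the penalty structure the abstract refers to (a sketch, not code from the paper): in a multi-period fused lasso portfolio formulation, an l1 term on the weights penalizes active positions, while an l1 term on the differences of the weights between consecutive rebalancing dates penalizes turnover. The parameter names `tau1` and `tau2` below are hypothetical stand-ins for the two regularization parameters the network is trained to learn.

```python
def fused_lasso_penalty(weights, tau1, tau2):
    """Fused lasso penalty for a multi-period portfolio.

    weights : list of per-period weight vectors (one list per rebalancing date)
    tau1    : hypothetical l1 parameter (penalizes active positions)
    tau2    : hypothetical fusion parameter (penalizes turnover)
    """
    # l1 norm of all weights: discourages many active positions
    l1 = sum(abs(w) for period in weights for w in period)
    # l1 norm of the period-to-period differences: discourages turnover
    turnover = sum(
        abs(curr[i] - prev[i])
        for prev, curr in zip(weights, weights[1:])
        for i in range(len(curr))
    )
    return tau1 * l1 + tau2 * turnover


# Two periods, two assets: rebalancing moves 10% of wealth between assets.
W = [[0.5, 0.5], [0.6, 0.4]]
penalty = fused_lasso_penalty(W, 1.0, 1.0)  # l1 = 2.0, turnover = 0.2
```

In the paper's setting this penalty is added to the portfolio objective, and the point is precisely that choosing `tau1` and `tau2` by hand is hard; the proposed LSTM maps the return time series to suitable values.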
Pages: 4281 - 4299
Page count: 18
Related papers
50 records in total
  • [1] Fused Lasso approach in portfolio selection
    Corsaro, Stefania
    De Simone, Valentina
    Marino, Zelda
    ANNALS OF OPERATIONS RESEARCH, 2021, 299 (1-2) : 47 - 59
  • [2] Portfolio selection using neural networks
    Fernandez, Alberto
    Gomez, Sergio
    COMPUTERS & OPERATIONS RESEARCH, 2007, 34 (04) : 1177 - 1191
  • [3] Feature Selection for Neural Networks Using Group Lasso Regularization
    Zhang, Huaqing
    Wang, Jian
    Sun, Zhanquan
    Zurada, Jacek M.
    Pal, Nikhil R.
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2020, 32 (04) : 659 - 673
  • [4] Learning Multiple Granger Graphical Models via Group Fused Lasso
    Songsiri, Jitkomut
    2015 10TH ASIAN CONTROL CONFERENCE (ASCC), 2015
  • [5] Neural lasso: a unifying approach of lasso and neural networks
    Curbelo, Ernesto
    Delgado-Gomez, David
    Carreras, Danae
    INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2024
  • [6] Sparsity and smoothness via the fused lasso
    Tibshirani, R.
    Saunders, M.
    Rosset, S.
    Zhu, J.
    Knight, K.
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2005, 67 : 91 - 108
  • [7] Feature Selection for Fuzzy Neural Networks using Group Lasso Regularization
    Gao, Tao
    Bai, Xiao
    Zhang, Liang
    Wang, Jian
    2021 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2021), 2021
  • [8] Learning regularization parameters of inverse problems via deep neural networks
    Afkham, Babak Maboudi
    Chung, Julianne
    Chung, Matthias
    INVERSE PROBLEMS, 2021, 37 (10)
  • [9] Fused graphical lasso for brain networks with symmetries
    Ranciati, Saverio
    Roverato, Alberto
    Luati, Alessandra
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES C-APPLIED STATISTICS, 2021, 70 (05) : 1299 - 1322