NONCONVEX L1/2 REGULARIZATION FOR SPARSE PORTFOLIO SELECTION

Cited: 0
Authors
Xu, Fengmin [1 ]
Wang, Guan [1 ]
Gao, Yuelin [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Beifang Univ Nationalities, Inst Informat & Syst Sci, Yinchuan 750021, Peoples R China
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2014, Vol. 10, No. 1
Keywords
L1/2 regularization; sparse portfolio selection; half thresholding algorithm
DOI
Not available
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
Two sparse optimal portfolio selection models, with and without short-selling constraints, are proposed by introducing L1/2 regularization on the portfolio weights in the traditional mean-variance (M-V) portfolio selection model. A fast and efficient penalty half thresholding algorithm is presented for solving the proposed sparse portfolio selection models; it extends the half thresholding algorithm for L1/2 regularization problems. A strategy for adjusting the regularization parameter is derived for the case in which the sparsity of the optimal portfolio is specified in advance, and incorporating this strategy into the modified algorithm improves its efficiency. Empirical analyses of the proposed models and the proposed algorithm are carried out to test the out-of-sample performance of the optimal portfolios they generate, measured by the Sharpe ratio. The results show that these optimal portfolios achieve better out-of-sample performance than the optimal portfolios generated by the traditional M-V portfolio selection model.
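The computational core the abstract refers to is the half thresholding operator for the L1/2 penalty, which has a closed form: every component whose magnitude falls below (54^(1/3)/4)·λ^(2/3) is set to zero and the remaining components are shrunk. The sketch below is a minimal illustration under assumptions, not the paper's algorithm: it applies that operator inside a simple penalty-plus-gradient loop for an L1/2-regularized mean-variance objective. The quadratic penalties for the budget and target-return constraints, the step-size rule, the parameters rho, lam, n_iter, and the names half_threshold and sparse_mv_portfolio are all choices made for illustration only.

```python
import numpy as np

def half_threshold(z, lam):
    """Component-wise half thresholding operator for the L1/2 penalty
    (closed form from the half thresholding theory the paper extends).
    Components with |z_i| <= (54**(1/3)/4) * lam**(2/3) are set to zero;
    the rest are shrunk."""
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(z)
    keep = np.abs(z) > thresh
    phi = np.arccos((lam / 8.0) * (np.abs(z[keep]) / 3.0) ** (-1.5))
    out[keep] = (2.0 / 3.0) * z[keep] * (
        1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def sparse_mv_portfolio(Sigma, mean_ret, target, lam, rho=10.0, n_iter=500):
    """Toy penalty + half-thresholding loop (illustration only, not the
    paper's algorithm).  Approximately minimizes
        w' Sigma w
        + rho * [ (1'w - 1)^2 + max(target - mean_ret'w, 0)^2 ]
        + lam * ||w||_{1/2}^{1/2}
    by a gradient step on the smooth part followed by half thresholding.
    Short selling is not explicitly forbidden in this sketch."""
    n = len(mean_ret)
    w = np.ones(n) / n
    # crude step size from a bound on the smooth part's Lipschitz constant
    step = 1.0 / (2.0 * np.linalg.norm(Sigma, 2) + 4.0 * rho * n)
    for _ in range(n_iter):
        grad = 2.0 * Sigma @ w
        grad += 2.0 * rho * (np.sum(w) - 1.0) * np.ones(n)
        shortfall = max(target - mean_ret @ w, 0.0)
        grad += -2.0 * rho * shortfall * mean_ret
        # iterative half thresholding: the threshold level is step * lam
        w = half_threshold(w - step * grad, step * lam)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    R = rng.normal(0.001, 0.02, size=(250, 8))      # synthetic daily returns
    Sigma, mean_ret = np.cov(R.T), R.mean(axis=0)
    w = sparse_mv_portfolio(Sigma, mean_ret, target=0.0005, lam=0.5)
    print("weights:", np.round(w, 4), "| nonzeros:", np.count_nonzero(w))
```

In this toy setting, larger values of lam zero out more weights; tuning lam until a prescribed number of nonzero weights remains is, in spirit, what the regularization-parameter adjustment strategy described in the abstract automates.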
Pages: 163-176
Number of pages: 14