NONCONVEX L1/2 REGULARIZATION FOR SPARSE PORTFOLIO SELECTION

Times cited: 0
Authors
Xu, Fengmin [1 ]
Wang, Guan [1 ]
Gao, Yuelin [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Beifang Univ Nationalities, Inst Informat & Syst Sci, Yinchuan 750021, Peoples R China
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2014, Vol. 10, No. 1
Keywords
L-1/2 regularization; sparse portfolio selection; half thresholding algorithm
DOI
Not available
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
Two sparse optimal portfolio selection models, with and without short-selling constraints, are proposed by introducing L-1/2 regularization on the portfolio weights in the traditional mean-variance (M-V) portfolio selection model. A fast and efficient penalty half thresholding algorithm, extending the half thresholding algorithm for L-1/2 regularization problems, is presented for solving the proposed models. A strategy for adjusting the regularization parameter is derived for the case where the sparsity of the optimal portfolio is specified in advance; incorporating this strategy into the modified algorithm improves its efficiency. Empirical analyses test the out-of-sample performance, measured by the Sharpe ratio, of the optimal portfolios generated by the proposed models. The results show that these portfolios outperform, out of sample, those generated by M-V portfolio selection models with L-1 regularization on the portfolio weights.
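For context, a plausible form of the L-1/2-regularized mean-variance model described in the abstract (a sketch based on standard formulations in this literature; the paper's exact objective and constraints may differ) is

\[
\min_{w \in \mathbb{R}^{n}} \; w^{\top} \Sigma w + \lambda \|w\|_{1/2}^{1/2}
\quad \text{s.t.} \quad \mathbf{1}^{\top} w = 1, \;\; \mu^{\top} w \geq \rho,
\qquad \|w\|_{1/2}^{1/2} = \sum_{i=1}^{n} |w_i|^{1/2},
\]

where \(\Sigma\) is the covariance matrix of asset returns, \(\mu\) the vector of expected returns, \(\rho\) a target return level, and \(\lambda > 0\) the regularization parameter controlling the sparsity of the weight vector \(w\); the no-short-selling variant adds the constraint \(w \geq 0\).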
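The half thresholding operator underlying the algorithm has a known closed form (Xu, Chang, Xu and Zhang, 2012). Below is a minimal Python sketch of that operator and of the basic iterative scheme for the unconstrained problem min ||Ax - b||^2 + lam * ||x||_{1/2}^{1/2}; the names half_threshold and iht_half are illustrative, and the portfolio constraints and penalty handling of the paper's algorithm are omitted.

import numpy as np

def half_threshold(t, lam):
    # Componentwise half thresholding operator h_{lam,1/2} (Xu et al., 2012):
    # entries with |t_i| <= (54^(1/3)/4) * lam^(2/3) are set to zero; the
    # remaining entries are shrunk via the closed-form cosine expression.
    t = np.asarray(t, dtype=float)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(t)
    mask = np.abs(t) > thresh
    phi = np.arccos((lam / 8.0) * (np.abs(t[mask]) / 3.0) ** (-1.5))
    out[mask] = (2.0 / 3.0) * t[mask] * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def iht_half(A, b, lam, mu=None, iters=500):
    # Iterative half thresholding: x <- h_{lam*mu,1/2}(x + mu * A^T (b - A x)).
    # A step size mu < 1/||A||_2^2 keeps the gradient step well behaved.
    if mu is None:
        mu = 0.99 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = half_threshold(x + mu * A.T @ (b - A @ x), lam * mu)
    return x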
Pages: 163-176
Page count: 14
Related papers (50 records in total; entries [21]-[30] shown)
  • [21] Duan, WenLei; Li, Feng; Liu, Zhe. Sparse Channel Estimation Based on L1/2 Regularization in OFDM Systems. 2014 INTERNATIONAL CONFERENCE ON INFORMATION AND COMMUNICATION TECHNOLOGY CONVERGENCE (ICTC), 2014: 442-445.
  • [22] Guo, Hongbo; Yu, Jingjing; He, Xiaowei; Hou, Yuqing; Dong, Fang; Zhang, Shuling. Improved sparse reconstruction for fluorescence molecular tomography with L1/2 regularization. BIOMEDICAL OPTICS EXPRESS, 2015, 6(5): 1648-1664.
  • [23] Gao, Hongmin; Yang, Yichen; Zhang, Bingyin; Li, Long; Zhang, Huaqing; Wu, Shujun. Feature Selection Using Smooth Gradient L1/2 Regularization. NEURAL INFORMATION PROCESSING (ICONIP 2017), PT IV, 2017, 10637: 160-170.
  • [24] Liu, Cheng; Liang, Yong; Luan, Xin-Ze; Leung, Kwong-Sak; Chan, Tak-Ming; Xu, Zong-Ben; Zhang, Hai. The L1/2 regularization method for variable selection in the Cox model. APPLIED SOFT COMPUTING, 2014, 14: 498-503.
  • [25] Ahmad, Rizwan; Schniter, Philip. Iteratively Reweighted l1 Approaches to Sparse Composite Regularization. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2015, 1(4): 220-235.
  • [26] Zhang, Li; Lu, Yaping; Zhang, Zhao; Wang, Bangjun; Li, Fanzhang. Sparse Auto-encoder with Smoothed l1 Regularization. NEURAL INFORMATION PROCESSING, ICONIP 2016, PT III, 2016, 9949: 555-563.
  • [27] Ma, Litao; Bian, Wei. A Simple Neural Network for Sparse Optimization With l1 Regularization. IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2021, 8(4): 3430-3442.
  • [28] He, Juncai; Jia, Xiaodong; Xu, Jinchao; Zhang, Lian; Zhao, Liang. Make l1 regularization effective in training sparse CNN. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2020, 77(1): 163-182.
  • [29] Repetti, Audrey; Mai Quyen Pham; Duval, Laurent; Chouzenoux, Emilie; Pesquet, Jean-Christophe. Euclid in a Taxicab: Sparse Blind Deconvolution with Smoothed l1/l2 Regularization. IEEE SIGNAL PROCESSING LETTERS, 2015, 22(5): 539-543.
  • [30] Bobrowski, Leon. Feature Selection with L1 Regularization in Formal Neurons. ENGINEERING APPLICATIONS OF NEURAL NETWORKS, EANN 2024, 2024, 2141: 343-353.