NONCONVEX L1/2 REGULARIZATION FOR SPARSE PORTFOLIO SELECTION

Times Cited: 0
Authors
Xu, Fengmin [1 ]
Wang, Guan [1 ]
Gao, Yuelin [2 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Math & Stat, Xian 710049, Peoples R China
[2] Beifang Univ Nationalities, Inst Informat & Syst Sci, Yinchuan 750021, Peoples R China
Source
PACIFIC JOURNAL OF OPTIMIZATION | 2014, Vol. 10, No. 1
Keywords
L-1/2 regularization; sparse portfolio selection; half thresholding algorithm
DOI
Not available
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Discipline Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
Two sparse optimal portfolio selection models, with and without short-selling constraints, are proposed by introducing L-1/2 regularization on the portfolio weights in the traditional mean-variance (M-V) portfolio selection model. A fast and efficient penalty half thresholding algorithm is presented for solving the proposed sparse portfolio selection models; it extends the half thresholding algorithm for L-1/2 regularization problems. A strategy for adjusting the regularization parameter in the proposed models is derived for the case where the sparsity of the optimal portfolio is specified, and incorporating this strategy into the modified algorithm improves its efficiency. Empirical analyses of the proposed models and the proposed algorithm are conducted to test the out-of-sample performance of the optimal portfolios generated by the proposed models, measured by the Sharpe ratio. The empirical results show that this out-of-sample performance is better than that of the optimal portfolios generated by M-V portfolio selection models without L-1/2 regularization on the portfolio weights.
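The key ingredient of the algorithm described above is the half thresholding operator for L-1/2 regularization (Xu et al., 2012) applied inside a gradient iteration. The sketch below is an illustrative simplification under stated assumptions, not the authors' penalty algorithm: the budget constraint 1'w = 1 is handled by crude renormalization rather than a penalty term, the short-selling constraint is omitted, and the function names and parameter values are invented for the example.

```python
import numpy as np

def half_threshold(t, lam):
    """Half thresholding operator for L-1/2 regularization (Xu et al., 2012).

    Components with |t_i| <= 54**(1/3)/4 * lam**(2/3) are set to zero;
    the remaining components are shrunk nonlinearly.
    """
    t = np.asarray(t, dtype=float)
    thresh = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    out = np.zeros_like(t)
    big = np.abs(t) > thresh
    # phi_lam(t) = arccos((lam/8) * (|t|/3)^(-3/2)); argument lies in (0, 1]
    # whenever |t| exceeds the threshold above, so arccos is well defined.
    phi = np.arccos((lam / 8.0) * (np.abs(t[big]) / 3.0) ** (-1.5))
    out[big] = (2.0 / 3.0) * t[big] * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def sparse_portfolio(mu, Sigma, lam, tau=1.0, iters=500):
    """Proximal-gradient sketch of sparse M-V selection:
    minimize w'Sigma w - tau * mu'w + lam * ||w||_{1/2}^{1/2}.

    The budget constraint 1'w = 1 is enforced by renormalizing after each
    step, a simplification of the paper's penalty treatment.
    """
    n = len(mu)
    step = 1.0 / (2.0 * np.linalg.norm(Sigma, 2))  # 1 / Lipschitz constant of the gradient
    w = np.full(n, 1.0 / n)                        # start from the equal-weight portfolio
    for _ in range(iters):
        grad = 2.0 * Sigma @ w - tau * mu          # gradient of the smooth part
        w = half_threshold(w - step * grad, step * lam)
        s = w.sum()
        if s != 0.0:
            w = w / s                              # crude projection onto 1'w = 1
    return w
```

Because the thresholding step zeroes out small components at every iteration, the returned portfolio concentrates on a few assets, with larger `lam` giving sparser portfolios; this is the mechanism by which the regularization parameter controls the number of selected assets.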
Pages: 163 - 176 (14 pages)