Marginal Screening for Partial Least Squares Regression

Cited: 8
Authors
Zhao, Naifei [1]
Xu, Qingsong [1]
Wang, Hong [1]
Affiliations
[1] Cent S Univ, Sch Math & Stat, Changsha 410083, Hunan, Peoples R China
Source
IEEE ACCESS | 2017, Vol. 5
Funding
National Natural Science Foundation of China;
Keywords
Marginal screening; partial least squares; variable selection; VARIABLE SELECTION METHODS; NONCONCAVE PENALIZED LIKELIHOOD; NEAR-INFRARED SPECTRA; UVE-PLS METHOD; MULTIVARIATE CALIBRATION; DIMENSION REDUCTION; LINEAR-MODELS; ELIMINATION; PREDICTION; LASSO;
DOI
10.1109/ACCESS.2017.2728532
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Partial least squares (PLS) regression is a versatile modeling approach for high-dimensional data analysis. Recently, PLS-based variable selection has attracted great attention because it reduces high-throughput data and improves model interpretability. In this paper, a class of variable selection methods for PLS that employs marginal screening approaches to select relevant variables is proposed. The proposed methods select variables in two steps: first, a solution path over all predictors is generated by sorting the marginal correlations between each predictor and the response; second, variable selection is carried out by screening the solution path with PLS. We provide three marginal screening methods for PLS, namely, sure independence screening (SIS), profiled independence screening (PIS), and high-dimensional ordinary least-squares projection (HOLP). The promising performance of our methods is illustrated on three near-infrared (NIR) spectral data sets. Compared with SIS and PIS, HOLP for PLS is more suitable for selecting important wavelengths and achieves higher prediction accuracy on the NIR spectral data.
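As a rough illustration of the two-step procedure summarized in the abstract, and not the authors' implementation, the following Python sketch ranks predictors by absolute marginal correlation (an SIS-style solution path) and then screens nested subsets along that path with cross-validated PLS fits. The function name sis_pls, the candidate subset sizes, and the synthetic data standing in for NIR spectra are illustrative assumptions.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

def sis_pls(X, y, subset_sizes, n_components=2, cv=5):
    # Step 1: rank predictors by absolute marginal correlation with the
    # response; the ranked indices form the solution path (SIS-style).
    corr = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
    path = np.argsort(-np.abs(corr))
    # Step 2: walk the path, fit PLS on each nested candidate set, and keep
    # the set with the smallest cross-validated mean squared error.
    best_mse, best_subset = np.inf, None
    for k in subset_sizes:
        subset = path[:k]
        pls = PLSRegression(n_components=min(n_components, k))
        mse = -cross_val_score(pls, X[:, subset], y, cv=cv,
                               scoring="neg_mean_squared_error").mean()
        if mse < best_mse:
            best_mse, best_subset = mse, subset
    return best_subset, best_mse

# Toy example: synthetic data standing in for an NIR spectral matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 200))
y = X[:, :5] @ rng.standard_normal(5) + 0.1 * rng.standard_normal(60)
selected, cv_mse = sis_pls(X, y, subset_sizes=[5, 10, 20, 50])
print(selected, cv_mse)

Under this reading, PIS and HOLP would differ only in how the ranking in the first step is produced (profiled correlations and the HOLP projection of the response, respectively); the PLS screening step along the path stays the same.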
Pages: 14047 - 14055
Number of pages: 9
Related Papers
50 records in total
  • [41] Deep partial least squares for instrumental variable regression
    Nareklishvili, Maria
    Polson, Nicholas
    Sokolov, Vadim
    APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, 2023, 39 (06) : 734 - 754
  • [42] A Novel Extension of Kernel Partial Least Squares Regression
    Jia, Jin-Ming
    Zhong, Wei-Jun
    Journal of Donghua University (English Edition), 2009, 26 (04) : 438 - 442
  • [43] A novel extension of kernel partial least squares regression
    Jia, Jin-Ming
    Zhong, Wei-Jun
    Journal of Donghua University (English Edition), 2009, 26 (04) : 438 - 442
  • [44] A REFORMULATION OF THE PARTIAL LEAST-SQUARES REGRESSION ALGORITHM
    YOUNG, PJ
    SIAM JOURNAL ON SCIENTIFIC COMPUTING, 1994, 15 (01) : 225 - 230
  • [45] Tide modeling using partial least squares regression
    Okwuashi, Onuwa
    Ndehedehe, Christopher
    Attai, Hosanna
    OCEAN DYNAMICS, 2020, 70 (08) : 1089 - 1101
  • [46] Voice Conversion Using Partial Least Squares Regression
    Helander, Elina
    Virtanen, Tuomas
    Nurminen, Jani
    Gabbouj, Moncef
    IEEE TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2010, 18 (05) : 912 - 921
  • [47] An overview on the shrinkage properties of partial least squares regression
    Krämer, Nicole
    Computational Statistics, 2007, 22 : 249 - 273
  • [48] PARTIAL LEAST-SQUARES AND CLASSIFICATION AND REGRESSION TREES
    YEH, CH
    SPIEGELMAN, CH
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 1994, 22 (01) : 17 - 23
  • [49] Significance regression: A statistical approach to partial least squares
    Control and Dynamical Systems 210-41, California Institute of Technology, Pasadena, CA 91125, United States
    Journal of Chemometrics, 11 (04) : 283 - 309
  • [50] Study of partial least squares and ridge regression methods
    Firinguetti, Luis
    Kibria, Golam
    Araya, Rodrigo
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2017, 46 (08) : 6631 - 6644