Supervised dimension reduction for ordinal predictors

Cited by: 5
Authors
Forzani, Liliana [1 ]
Garcia Arancibia, Rodrigo [2 ,3 ]
Llop, Pamela [1 ]
Tomassi, Diego [1 ]
Affiliations
[1] Univ Nacl Litoral, Fac Ingn Quim, Researchers CONICET, Buenos Aires, DF, Argentina
[2] Inst Econ Aplicada Litoral FCE UNL, Buenos Aires, DF, Argentina
[3] Consejo Nacl Invest Cient & Tecn, Buenos Aires, DF, Argentina
Keywords
Expectation-maximization (EM); Latent variables; Reduction subspace; SES index construction; Supervised classification; Variable selection; Sliced inverse regression; Socioeconomic status; Central subspace; Visualization; Components
DOI
10.1016/j.csda.2018.03.018
Chinese Library Classification
TP39 [Computer applications];
Discipline classification codes
081203; 0835;
Abstract
In applications involving ordinal predictors, common approaches to reducing dimensionality are either extensions of unsupervised techniques such as principal component analysis, or variable selection procedures that rely on modeling the regression function. A supervised dimension reduction method tailored to ordered categorical predictors is introduced. It uses a model-based dimension reduction approach, inspired by extending sufficient dimension reduction to the context of latent Gaussian variables. The reduction is chosen without modeling the response as a function of the predictors and imposes no distributional assumption on the response or on the response given the predictors. A likelihood-based estimator of the reduction is derived, and an iterative expectation-maximization (EM) type algorithm is proposed to alleviate the computational load and thus make the method more practical. A regularized estimator, which simultaneously achieves variable selection and dimension reduction, is also presented. Performance of the proposed method is evaluated through simulations and a real data example on socioeconomic index construction, where it compares favorably with widely used techniques. (C) 2018 Elsevier B.V. All rights reserved.
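The abstract builds on sufficient dimension reduction for supervised settings. As a point of reference, here is a minimal, illustrative sketch of one classical technique named in the keywords, sliced inverse regression (SIR), applied to integer-coded ordinal predictors. This is an assumption-laden toy, not the authors' likelihood-based EM estimator; the function name `sir` and its parameters are hypothetical choices for this sketch.

```python
# Illustrative sketch only: classical SIR on integer-coded ordinal
# predictors, NOT the paper's latent-Gaussian EM estimator.
import numpy as np

def sir(X, y, n_slices=10, d=1):
    """Return d supervised reduction directions for predictors X given response y."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    # Whitening matrix Sigma^{-1/2} via the eigendecomposition of Sigma.
    evals, evecs = np.linalg.eigh(Sigma)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ W
    # Slice observations by the response and average the whitened
    # predictors within each slice; the slice means span (an estimate of)
    # the reduction subspace.
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original coordinates.
    _, vecs = np.linalg.eigh(M)
    B = W @ vecs[:, -d:]
    return B / np.linalg.norm(B, axis=0)
```

In a simulation where ordinal levels are obtained by thresholding latent Gaussians and only the first two latent coordinates drive the response, the leading SIR direction concentrates on those two coordinates, which mirrors the kind of recovery the paper's simulations assess for its own estimator.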
Pages: 136-155
Number of pages: 20
Related papers
50 records in total
  • [1] Dimension reduction for supervised ordering
    Kamishima, Toshihiro
    Akaho, Shotaro
    ICDM 2006: SIXTH INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2006, : 330 - +
  • [3] ONLINE LEARNING FOR SUPERVISED DIMENSION REDUCTION
    Zhang, Ning
    Wu, Qiang
    MATHEMATICAL FOUNDATIONS OF COMPUTING, 2019, 2 (02): : 95 - 106
  • [4] An effective framework for supervised dimension reduction
    Khoat Than
    Tu Bao Ho
    Duy Khuong Nguyen
    NEUROCOMPUTING, 2014, 139 : 397 - 407
  • [5] On forward sufficient dimension reduction for categorical and ordinal responses
    Quach, Harris
    Li, Bing
    ELECTRONIC JOURNAL OF STATISTICS, 2023, 17 (01): : 980 - 1006
  • [6] On the Dimension Reduction of Radio Maps with a Supervised Approach
    Jia, Bing
    Huang, Baoqi
    Gao, Hepeng
    Li, Wuyungerile
    2017 IEEE 42ND CONFERENCE ON LOCAL COMPUTER NETWORKS (LCN), 2017, : 199 - 202
  • [7] Reducing Class Overlapping in Supervised Dimension Reduction
    Nguyen Trong Tung
    Vu Hoang Dieu
    Khoat Than
    Ngo Van Linh
    PROCEEDINGS OF THE NINTH INTERNATIONAL SYMPOSIUM ON INFORMATION AND COMMUNICATION TECHNOLOGY (SOICT 2018), 2018, : 8 - 15
  • [8] Signed Laplacian Embedding for Supervised Dimension Reduction
    Gong, Chen
    Tao, Dacheng
    Yang, Jie
    Fu, Keren
    PROCEEDINGS OF THE TWENTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2014, : 1847 - 1853
  • [9] Supervised dimension reduction for functional time series
    Wang, Guochang
    Wen, Zengyao
    Jia, Shanming
    Liang, Shanshan
    STATISTICAL PAPERS, 2024, 65 (07) : 4057 - 4077
  • [10] Deep Dimension Reduction for Supervised Representation Learning
    Huang, Jian
    Jiao, Yuling
    Liao, Xu
    Liu, Jin
    Yu, Zhou
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (05) : 3583 - 3598