ONLINE LEARNING FOR SUPERVISED DIMENSION REDUCTION

Cited by: 4
Authors
Zhang, Ning [1 ]
Wu, Qiang [2 ]
Affiliations
[1] Middle Tennessee State Univ, Computat Sci PhD Program, 1301 E Main St, Murfreesboro, TN 37132 USA
[2] Middle Tennessee State Univ, Dept Math Sci, 1301 E Main St, Murfreesboro, TN 37132 USA
Source
Mathematical Foundations of Computing
Keywords
Dimension reduction; supervised learning; sliced inverse regression; online learning; overlapping; SLICED INVERSE REGRESSION; LINEAR DISCRIMINANT ANALYSIS; CLASSIFICATION; STRENGTH; SLUMP
DOI
10.3934/mfc.2019008
Chinese Library Classification
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
Online learning has attracted great attention due to the increasing demand for systems that can learn and evolve. When the data to be processed are also high dimensional and dimension reduction is necessary for visualization or improved prediction, online dimension reduction plays an essential role. The purpose of this paper is to propose a new online learning approach for supervised dimension reduction. Our algorithm is motivated by adapting sliced inverse regression (SIR), a pioneering and effective algorithm for supervised dimension reduction, so that it can be implemented incrementally. The new algorithm, called incremental sliced inverse regression (ISIR), updates the subspace of significant factors with intrinsic lower dimensionality quickly and efficiently as new observations arrive. We also refine the algorithm with an overlapping technique and develop an incremental overlapping sliced inverse regression (IOSIR) algorithm. We verify the effectiveness and efficiency of both algorithms through simulations and real data applications.
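For orientation only, the sketch below shows classical batch SIR, the algorithm the abstract builds on: standardize the predictors, slice the sorted response into groups, average the standardized predictors within each slice, and take the leading eigenvectors of the weighted covariance of those slice means. This is not the authors' ISIR/IOSIR code; the function name sir_directions, its parameters, and the NumPy-based design are assumptions made for illustration.

    import numpy as np

    def sir_directions(X, y, n_slices=10, n_components=2):
        """Minimal batch sliced inverse regression (SIR) sketch.

        Standardizes X, slices the sorted response into roughly equal
        groups, and eigendecomposes the weighted covariance of the
        per-slice means of the standardized predictors.
        """
        n, p = X.shape
        mean = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)

        # Whitening transform Sigma^{-1/2} via symmetric eigendecomposition
        # (a small floor keeps the inverse square root numerically stable).
        w, V = np.linalg.eigh(cov)
        w = np.maximum(w, 1e-12)
        inv_sqrt = V @ np.diag(1.0 / np.sqrt(w)) @ V.T
        Z = (X - mean) @ inv_sqrt

        # Weighted covariance of the slice means of the standardized data
        order = np.argsort(y)
        M = np.zeros((p, p))
        for idx in np.array_split(order, n_slices):
            m_h = Z[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m_h, m_h)

        # Leading eigenvectors of M span the estimated effective dimension
        # reduction space in the standardized scale; map them back.
        _, eigvecs = np.linalg.eigh(M)
        top = eigvecs[:, ::-1][:, :n_components]
        return inv_sqrt @ top

An incremental variant in the spirit described above would maintain running statistics (overall mean and covariance, per-slice counts and means), update them as each new observation arrives, and refresh the eigendecomposition from those statistics rather than recomputing from the full data; the exact update rules of ISIR and the overlapping refinement of IOSIR are given in the paper itself.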
Pages: 95-106
Page count: 12
Related Papers
50 records in total
  • [31] Supervised dimension reduction by local neighborhood optimization for image processing. Zhao L., Wang H., Wang J. Recent Patents on Engineering, 2019, 13(4): 334-337.
  • [32] Bayesian inverse regression for supervised dimension reduction with small datasets. Cai, Xin; Lin, Guang; Li, Jinglai. Journal of Statistical Computation and Simulation, 2021, 91(14): 2817-2832.
  • [33] Maximizing adjusted covariance: new supervised dimension reduction for classification. Park, Hyejoon; Kim, Hyunjoong; Lee, Yung-Seop. Computational Statistics, 2025, 40(1): 573-599.
  • [34] Supervised ML Algorithms in the High Dimensional Applications for Dimension Reduction. Tabassum, Hina; Iqbal, Muhammad Mutahir; Shehzad, Muhammad Ahmed; Asghar, Nabeel; Yusuf, Mohammed; Kilai, Mutua; Aldallal, Ramay. Mathematical Problems in Engineering, 2022, 2022.
  • [35] A review of online learning in supervised neural networks. Jain, Lakhmi C.; Seera, Manjeevan; Lim, Chee Peng; Balasubramaniam, P. Neural Computing and Applications, 2014, 25(3-4): 491-509.
  • [37] Online Semi-supervised Pairwise Learning. Khalid, Majdi. 2023 International Joint Conference on Neural Networks (IJCNN), 2023.
  • [38] An Incremental Neural Network for Online Supervised Learning and Topology Learning. Kamiya, Youki; Furao, Shen; Hasegawa, Osamu. Journal of Advanced Computational Intelligence and Intelligent Informatics, 2007, 11(1): 87-95.
  • [39] Online semi-supervised learning with learning vector quantization. Shen, Yuan-Yuan; Zhang, Yan-Ming; Zhang, Xu-Yao; Liu, Cheng-Lin. Neurocomputing, 2020, 399: 467-478.
  • [40] Recursive Dimension Reduction for semisupervised learning. Ye, Qiaolin; Yin, Tongming; Gao, Shangbing; Jing, Jiajia; Zhang, Yu; Sun, Cuiping. Neurocomputing, 2016, 171: 1629-1636.