High-Dimensional Analysis for Generalized Nonlinear Regression: From Asymptotics to Algorithm

Times Cited: 0
Authors
Li, Jian [1 ]
Liu, Yong [2 ]
Wang, Weiping [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, Beijing, Peoples R China
[2] Renmin Univ China, Gaoling Sch Artificial Intelligence, Beijing, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
RANDOMIZED SKETCHES; RATES;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Overparameterization often leads to benign overfitting, where deep neural networks can be trained to fit the training data perfectly and yet still generalize well on unseen data. However, a generalized asymptotic framework for nonlinear regression, along with its connections to conventional complexity notions, is still lacking. In this paper, we propose a generalized high-dimensional analysis for nonlinear regression models that covers various nonlinear feature mappings as well as subsampling. Specifically, we first derive an implicit regularization parameter and asymptotic equivalents related to a classical complexity notion, namely the effective dimension. We then present a high-dimensional analysis of nonlinear ridge regression and extend it to ridgeless regression in the under-parameterized and over-parameterized regimes, respectively. We find that the limiting risks decrease with the effective dimension. Motivated by these theoretical findings, we propose an algorithm, RFRed, to improve generalization ability. Finally, we validate our theoretical findings and the proposed algorithm through several experiments.
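The central quantity in the abstract is the effective dimension of the nonlinear feature covariance. The sketch below is only illustrative and is not the paper's analysis or its RFRed algorithm: it assumes a random Fourier feature map as one possible nonlinear feature mapping and the standard definition d_eff(λ) = tr(Σ(Σ + λI)^{-1}), then fits feature-space ridge regression for a few regularization strengths.

```python
import numpy as np

def random_fourier_features(X, n_features=200, gamma=1.0, seed=0):
    """Random Fourier features approximating the RBF kernel exp(-gamma * ||x - x'||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def effective_dimension(Phi, lam):
    """d_eff(lam) = tr(Sigma (Sigma + lam I)^{-1}) for the empirical feature covariance Sigma."""
    n = Phi.shape[0]
    Sigma = Phi.T @ Phi / n
    eigvals = np.linalg.eigvalsh(Sigma)
    return float(np.sum(eigvals / (eigvals + lam)))

def ridge_fit(Phi, y, lam):
    """Closed-form ridge estimator w = (Phi^T Phi + n * lam * I)^{-1} Phi^T y."""
    n, p = Phi.shape
    return np.linalg.solve(Phi.T @ Phi + n * lam * np.eye(p), Phi.T @ y)

# Toy usage: effective dimension and a training-error proxy across regularization strengths.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=300)
Phi = random_fourier_features(X, n_features=200, gamma=0.5, seed=1)
for lam in (1e-3, 1e-1, 1.0):
    w = ridge_fit(Phi, y, lam)
    mse = np.mean((Phi @ w - y) ** 2)
    print(f"lam={lam:g}  d_eff={effective_dimension(Phi, lam):.1f}  train MSE={mse:.4f}")
```

Here the feature map, λ grid, and d_eff computation are illustrative stand-ins; the paper's implicit regularization parameter and asymptotic equivalents are derived analytically rather than estimated numerically as above.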
Pages: 13500-13508
Number of pages: 9