Generalization Guarantees for Sparse Kernel Approximation with Entropic Optimal Features

Cited by: 0
Authors: Ding, Liang [1]; Tuo, Rui [1]; Shahrampour, Shahin [1]
Affiliation: [1] Texas A&M Univ, Wm Michael Barnes '64 Dept Ind & Syst Engn, College Station, TX 77843 USA
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP [automation technology; computer technology]
Discipline code: 0812
Abstract
Despite their success, kernel methods suffer from a massive computational cost in practice. In this paper, in lieu of the commonly used kernel expansion with respect to N inputs, we develop a novel optimal design that maximizes the entropy among kernel features. This procedure yields a kernel expansion with respect to entropic optimal features (EOF), dramatically improving the data representation due to feature dissimilarity. Under mild technical assumptions, our generalization bound shows that with only O(N^{1/4}) features (disregarding logarithmic factors), we can achieve the optimal statistical accuracy, i.e., O(1/√N). The salient feature of our design is its sparsity, which significantly reduces the time and space costs. Our numerical experiments on benchmark datasets verify the superiority of EOF over the state of the art in kernel approximation.
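The idea of an entropy-maximizing design can be illustrated with a generic greedy sketch: under a Gaussian-process view, the entropy of a feature subset grows with the log-determinant of its kernel submatrix, so greedily maximizing the log-determinant selects maximally dissimilar features. This is a minimal illustrative sketch of that general principle, not the authors' EOF algorithm; the kernel choice, the greedy selection rule, and all function names here are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def greedy_entropy_features(X, m, gamma=1.0, jitter=1e-8):
    """Greedily pick m rows of X whose kernel submatrix has maximal
    log-determinant -- a proxy for the entropy of a Gaussian with that
    covariance -- so the selected features are maximally dissimilar.
    (Illustrative only; not the EOF procedure from the paper.)"""
    n = X.shape[0]
    selected, remaining = [], list(range(n))
    for _ in range(m):
        best_i, best_val = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            # Kernel submatrix of the candidate set, with jitter for stability.
            K = gaussian_kernel(X[idx], X[idx], gamma) + jitter * np.eye(len(idx))
            sign, logdet = np.linalg.slogdet(K)
            if sign > 0 and logdet > best_val:
                best_val, best_i = logdet, i
        selected.append(best_i)
        remaining.remove(best_i)
    return np.array(selected)
```

With m of order N^{1/4}, as in the paper's bound, such a design would keep the expansion far smaller than the full N-term kernel expansion.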
Pages: 11