Kernel subspace pursuit for sparse regression

Cited by: 0
Authors
Kabbara, Jad [1]
Psaromiligkos, Ioannis N. [1]
Affiliations
[1] McGill Univ, Dept Elect & Comp Engn, Montreal, PQ H3A 0E9, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Kernel methods; Sparse function approximation; Regression; Subspace pursuit;
DOI
10.1016/j.patrec.2015.09.018
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, results from sparse approximation theory have been considered as a means to improve the generalization performance of kernel-based machine learning algorithms. In this paper, we present Kernel Subspace Pursuit (KSP), a new method for sparse non-linear regression. KSP is a low-complexity method that iteratively approximates target functions in the least-squares sense as a linear combination of a limited number of elements selected from a kernel-based dictionary. Unlike other kernel methods, by virtue of KSP's algorithmic design, the number of KSP iterations needed to reach the final solution depends neither on the number of basis functions used nor on the number of elements in the dictionary. We experimentally show that, in many scenarios involving learning from synthetic and real data, KSP is computationally less complex than, and outperforms, other kernel methods that solve the same problem, namely Kernel Matching Pursuit and Kernel Basis Pursuit. (C) 2015 Elsevier B.V. All rights reserved.
Pages: 56-61
Page count: 6
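
The abstract describes KSP as an iterative least-squares selection of a small number of atoms from a kernel dictionary. Below is a minimal sketch of that general idea, assuming a Gaussian-kernel dictionary built from the training inputs and the standard Subspace Pursuit expand/prune steps; the exact KSP update rules proposed by Kabbara and Psaromiligkos may differ, and the function names and parameters here (gaussian_gram, kernel_subspace_pursuit, sigma, max_iter) are illustrative, not from the paper.

```python
# Sketch only: subspace-pursuit-style sparse kernel regression (assumptions noted above).
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Gram matrix G[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_subspace_pursuit(X, y, K, sigma=1.0, max_iter=20):
    """Select K kernel dictionary atoms and their least-squares weights."""
    G = gaussian_gram(X, sigma)                      # kernel dictionary; columns are atoms k(., x_i)
    # Initialization: the K atoms most correlated with the target.
    support = np.argsort(-np.abs(G.T @ y))[:K]
    coef, *_ = np.linalg.lstsq(G[:, support], y, rcond=None)
    residual = y - G[:, support] @ coef
    prev_err = np.inf
    for _ in range(max_iter):
        # Expand: add the K atoms most correlated with the current residual.
        extra = np.argsort(-np.abs(G.T @ residual))[:K]
        candidates = np.union1d(support, extra)
        c, *_ = np.linalg.lstsq(G[:, candidates], y, rcond=None)
        # Prune back to the K atoms with the largest least-squares coefficients.
        support = candidates[np.argsort(-np.abs(c))[:K]]
        coef, *_ = np.linalg.lstsq(G[:, support], y, rcond=None)
        residual = y - G[:, support] @ coef
        err = np.linalg.norm(residual)
        if err >= prev_err:                          # stop once the fit no longer improves
            break
        prev_err = err
    return support, coef
```

With this sketch, the learned regressor is f(x) = sum over the selected indices i of coef_i * k(x, x_i), i.e. a linear combination of K kernel atoms; only the sparsity level K bounds the size of each candidate set, which is the property the abstract highlights.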
Related papers
50 items in total (showing [41]-[50])
  • [41] Locality Preserving Robust Regression for Jointly Sparse Subspace Learning
    Liu, Ning
    Lai, Zhihui
    Li, Xuechen
    Chen, Yudong
    Mo, Dongmei
    Kong, Heng
    Shen, Linlin
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2021, 31 (06) : 2274 - 2287
  • [42] The evidence framework applied to sparse kernel logistic regression
    Cawley, GC
    Talbot, NLC
    NEUROCOMPUTING, 2005, 64 (64) : 119 - 135
  • [43] Sparse kernel ridge regression using backward deletion
    Wang, Ling
    Bo, Liefeng
    Jiao, Licheng
    PRICAI 2006: TRENDS IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2006, 4099 : 365 - 374
  • [44] SLiKER: Sparse loss induced kernel ensemble regression
    Shen, Xiang-Jun
    Ni, ChengGong
    Wang, Liangjun
    Zha, Zheng-Jun
    PATTERN RECOGNITION, 2021, 109
  • [45] Sparse nonparametric regression with regularized tensor product kernel
    Yu, Hang
    Wang, Yuanjia
    Zeng, Donglin
    STAT, 2020, 9 (01):
  • [46] Approximation bounds for some sparse kernel regression algorithms
    Zhang, T
    NEURAL COMPUTATION, 2002, 14 (12) : 3013 - 3042
  • [47] Two-stage Orthogonal Subspace Matching Pursuit for Joint Sparse Recovery
    Kim, Kyung Su
    Chung, Sae-Young
    2016 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, 2016, : 2374 - 2378
  • [48] ON THE DETECTION PROBABILITY OF SPARSE SIGNALS WITH SENSOR NETWORKS BASED ON DISTRIBUTED SUBSPACE PURSUIT
    Zhao, Wenqiang
    Li, Gang
    2015 IEEE CHINA SUMMIT & INTERNATIONAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING, 2015, : 324 - 328
  • [49] Sparse Subspace Clustering with One-way Selective Orthogonal Matching Pursuit
    Song, Jinren
    Zhu, Yuesheng
    Mo, Zhaoguo
    Zhong, Li
    TWELFTH INTERNATIONAL CONFERENCE ON DIGITAL IMAGE PROCESSING (ICDIP 2020), 2020, 11519
  • [50] Nonparametric Basis Pursuit via Sparse Kernel-Based Learning
    Bazerque, Juan Andres
    Giannakis, Georgios B.
    IEEE SIGNAL PROCESSING MAGAZINE, 2013, 30 (04) : 112 - 125