LARGE-SCALE RANDOM FEATURES FOR KERNEL REGRESSION

Cited by: 0
|
Authors
Laparra, Valero [1 ]
Gonzalez, Diego Marcos [2 ]
Tuia, Devis [2 ]
Camps-Valls, Gustau [1 ]
Affiliations
[1] Univ Valencia, IPL, E-46003 Valencia, Spain
[2] Univ Zurich, CH-8006 Zurich, Switzerland
Keywords
DOI: Not available
Chinese Library Classification
TM (Electrical Engineering); TN (Electronics and Communication Technology);
Discipline Classification Codes
0808; 0809;
Abstract
Kernel methods constitute a family of powerful machine learning algorithms that have found wide use in remote sensing and the geosciences. However, kernel methods are still not widely adopted because of their high computational cost on large-scale problems, such as the inversion of radiative transfer models. This paper introduces the method of random kitchen sinks (RKS) for fast statistical retrieval of bio-geo-physical parameters. RKS approximates a kernel matrix with a set of random bases sampled from the Fourier domain. We extend the approach to other bases, such as wavelets, stumps, and Walsh expansions. We show that kernel regression becomes feasible for datasets with millions of examples and high dimensionality. Examples on atmospheric parameter retrieval from infrared sounders and on biophysical parameter retrieval by inverting PROSAIL radiative transfer models with simulated Sentinel-2 data show the effectiveness of the technique.
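The core idea of RKS can be illustrated with random Fourier features: a kernel evaluation k(x, y) is approximated by an inner product z(x)ᵀz(y) of low-dimensional random features, so kernel regression reduces to linear ridge regression in the feature space. The sketch below is a minimal illustration of this idea for an RBF kernel, not the authors' implementation; the parameter values and the synthetic data are assumptions chosen only for demonstration.

```python
import numpy as np

def random_fourier_features(n_dims, n_features=500, gamma=1.0, seed=0):
    """Return a random feature map z(.) whose inner products approximate
    the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    # Frequencies sampled from the Fourier transform of the RBF kernel.
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(n_dims, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

    def z(X):
        return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

    return z

# Toy usage on a synthetic 1-D regression problem.
rng = np.random.default_rng(1)
X = rng.uniform(-3.0, 3.0, size=(100_000, 1))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.normal(size=X.shape[0])

z = random_fourier_features(n_dims=1, n_features=300, gamma=2.0, seed=0)
Z = z(X)

# Ridge regression in the random feature space: cost scales with the number
# of features, not with the cube of the number of samples.
lam = 1e-3
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)

X_test = np.linspace(-3.0, 3.0, 5).reshape(-1, 1)
print(z(X_test) @ w)  # approximate kernel ridge predictions
```

Because the model is fitted entirely in the feature space, the same recipe applies when the Fourier bases are replaced by other random expansions (e.g., wavelets, stumps, or Walsh functions), as discussed in the abstract.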
Pages: 17-20
Page count: 4
Related Papers
50 records in total
  • [1] Large-Scale Nonlinear Variable Selection via Kernel Random Features
    Gregorova, Magda
    Ramapuram, Jason
    Kalousis, Alexandros
    Marchand-Maillet, Stephane
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2018, PT II, 2019, 11052 : 177 - 192
  • [2] Online Adaptive Kernel Learning with Random Features for Large-scale Nonlinear Classification
    Chen, Yingying
    Yang, Xiaowei
    PATTERN RECOGNITION, 2022, 131
  • [3] Data-dependent compression of random features for large-scale kernel approximation
    Agrawal, Raj
    Campbell, Trevor
    Huggins, Jonathan
    Broderick, Tamara
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [4] KERNEL COMPUTATIONS FROM LARGE-SCALE RANDOM FEATURES OBTAINED BY OPTICAL PROCESSING UNITS
    Ohana, Ruben
    Wacker, Jonas
    Dong, Jonathan
    Marmin, Sebastien
    Krzakala, Florent
    Filippone, Maurizio
    Daudet, Laurent
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 9294 - 9298
  • [5] Kernel Logistic Regression Algorithm for Large-Scale Data Classification
    Elbashir, Murtada
    Wang, Jianxin
    INTERNATIONAL ARAB JOURNAL OF INFORMATION TECHNOLOGY, 2015, 12 (05) : 465 - 472
  • [6] Large-Scale Expectile Regression With Covariates Missing at Random
    Pan, Yingli
    Liu, Zhan
    Cai, Wen
    IEEE ACCESS, 2020, 8 : 36502 - 36513
  • [7] Large-scale Online Kernel Learning with Random Feature Reparameterization
    Tu Dinh Nguyen
    Le, Trung
    Bui, Hung
    Phung, Dinh
    PROCEEDINGS OF THE TWENTY-SIXTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 2543 - 2549
  • [8] Large-Scale Gaussian Process Regression Based on Random Fourier Features and Local Approximation with Tsallis Entropy
    Zhang, Hongli
    Liu, Jinglei
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2023, E106D (10) : 1747 - 1751
  • [9] Random forest versus logistic regression: a large-scale benchmark experiment
    Couronne, Raphael
    Probst, Philipp
    Boulesteix, Anne-Laure
    BMC BIOINFORMATICS, 2018, 19