From Kernel Methods to Neural Networks: A Unifying Variational Formulation

Cited by: 1
Authors
Unser, Michael [1]
Affiliation
[1] École Polytechnique Fédérale de Lausanne (EPFL), Biomedical Imaging Group, Station 17, CH-1015 Lausanne, Switzerland
Keywords
Machine learning; Convex optimization; Regularization; Representer theorem; Kernel methods; Neural networks; Banach space; APPROXIMATION; SPLINES; INTERPOLATION; REGRESSION; TRANSFORM;
DOI
10.1007/s10208-023-09624-9
Chinese Library Classification
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
The minimization of a data-fidelity term and an additive regularization functional gives rise to a powerful framework for supervised learning. In this paper, we present a unifying regularization functional that depends on an operator $\mathrm{L}$ and on a generic Radon-domain norm. We establish the existence of a minimizer and give the parametric form of the solution(s) under very mild assumptions. When the norm is Hilbertian, the proposed formulation yields a solution that involves radial-basis functions and is compatible with the classical methods of machine learning. By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator. In particular, we retrieve the popular ReLU networks by letting the operator be the Laplacian. We also characterize the solution for the intermediate regularization norms $\Vert\cdot\Vert=\Vert\cdot\Vert_{L_p}$ with $p\in(1,2]$. Our framework offers guarantees of universal approximation for a broad family of regularization operators or, equivalently, for a wide variety of shallow neural networks, including the cases (such as ReLU) where the activation function is increasing polynomially. It also explains the favorable role of bias and skip connections in neural architectures.
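To make the formulation concrete, the following is a schematic LaTeX rendering of the type of problem the abstract describes; the data-fidelity term $E$, the regularization weight $\lambda$, the number of neurons $K$, and the parameters $(v_k, \boldsymbol{w}_k, b_k)$ together with the affine term $c_0 + \boldsymbol{c}^{\mathsf T}\boldsymbol{x}$ are illustrative placeholders rather than the paper's exact notation.

$$ \min_{f}\ \sum_{m=1}^{M} E\bigl(y_m, f(\boldsymbol{x}_m)\bigr) \;+\; \lambda\,\bigl\Vert \mathrm{L}\{f\}\bigr\Vert \qquad \text{(data fidelity plus regularization, the norm being taken in the Radon domain)} $$

$$ f(\boldsymbol{x}) \;=\; \sum_{k=1}^{K} v_k\,\sigma\bigl(\boldsymbol{w}_k^{\mathsf T}\boldsymbol{x} - b_k\bigr) \;+\; c_0 + \boldsymbol{c}^{\mathsf T}\boldsymbol{x} \qquad \text{(two-layer parametric form for the total-variation norm)} $$

In the second display, the activation $\sigma$ is determined by the operator $\mathrm{L}$ (the Laplacian yields the ReLU), and the trailing affine part corresponds to the bias and skip connections whose favorable role the abstract mentions.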
Pages: 1779-1818
Number of pages: 40
Related Papers
50 items in total
  • [41] GRAPH CONVOLUTIONAL NETWORKS FROM THE PERSPECTIVE OF SHEAVES AND THE NEURAL TANGENT KERNEL
    Gebhart, Thomas
    TOPOLOGICAL, ALGEBRAIC AND GEOMETRIC LEARNING WORKSHOPS 2022, VOL 196, 2022
  • [42] Trainable Calibration Measures For Neural Networks From Kernel Mean Embeddings
    Kumar, Aviral
    Sarawagi, Sunita
    Jain, Ujjwal
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018
  • [43] A New Formulation for Feedforward Neural Networks
    Razavi, Saman
    Tolson, Bryan A.
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2011, 22 (10): 1588-1598
  • [44] High-order neural networks and kernel methods for peptide-MHC binding prediction
    Kuksa, Pavel P.
    Min, Martin Renqiang
    Dugar, Rishabh
    Gerstein, Mark
    BIOINFORMATICS, 2015, 31 (22): 3600-3607
  • [45] Wide coverage natural language processing using kernel methods and neural networks for structured data
    Menchetti, S
    Costa, F
    Frasconi, P
    Pontil, M
    PATTERN RECOGNITION LETTERS, 2005, 26 (12): 1896-1906
  • [46] On the Neural Tangent Kernel Analysis of Randomly Pruned Neural Networks
    Yang, Hongru
    Wang, Zhangyang
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 206, 2023
  • [47] Neural Tangent Kernel Analysis of Deep Narrow Neural Networks
    Lee, Jongmin
    Choi, Joo Young
    Ryu, Ernest K.
    No, Albert
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [48] The Kernel Dynamics of Convolutional Neural Networks in Manifolds
    WU Wei
    JING Xiaoyuan
    DU Wencai
    Chinese Journal of Electronics, 2020, 29 (06): 1185-1192
  • [49] A Novel Adaptive Kernel for the RBF Neural Networks
    Shujaat Khan
    Imran Naseem
    Roberto Togneri
    Mohammed Bennamoun
    Circuits, Systems, and Signal Processing, 2017, 36: 1639-1653
  • [50] On the regularization of convolutional kernel tensors in neural networks
    Guo, Pei-Chang
    Ye, Qiang
    LINEAR & MULTILINEAR ALGEBRA, 2022, 70 (12): 2318-2330