From Kernel Methods to Neural Networks: A Unifying Variational Formulation

Cited by: 1
Authors
Unser, Michael [1]
Institutions
[1] École Polytechnique Fédérale de Lausanne (EPFL), Biomedical Imaging Group, Station 17, CH-1015 Lausanne, Switzerland
Keywords
Machine learning; Convex optimization; Regularization; Representer theorem; Kernel methods; Neural networks; Banach space; Approximation; Splines; Interpolation; Regression; Transform
DOI
10.1007/s10208-023-09624-9
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Classification Code
081202;
Abstract
The minimization of a data-fidelity term and an additive regularization functional gives rise to a powerful framework for supervised learning. In this paper, we present a unifying regularization functional that depends on an operator $\mathrm{L}$ and on a generic Radon-domain norm. We establish the existence of a minimizer and give the parametric form of the solution(s) under very mild assumptions. When the norm is Hilbertian, the proposed formulation yields a solution that involves radial-basis functions and is compatible with the classical methods of machine learning. By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator. In particular, we retrieve the popular ReLU networks by letting the operator be the Laplacian. We also characterize the solution for the intermediate regularization norms $\Vert \cdot \Vert = \Vert \cdot \Vert_{L_p}$ with $p \in (1,2]$. Our framework offers guarantees of universal approximation for a broad family of regularization operators or, equivalently, for a wide variety of shallow neural networks, including the cases (such as ReLU) where the activation function is increasing polynomially. It also explains the favorable role of bias and skip connections in neural architectures.
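For orientation, the LaTeX sketch below renders the kind of variational problem and parametric solution that the abstract describes. The symbols E, lambda, (x_m, y_m), K, and sigma are illustrative placeholders rather than the paper's exact notation, and the displayed solution form is a schematic reading of the abstract's claims, not the paper's theorem statement.

% Schematic rendering of the variational problem described in the abstract.
% Placeholder notation (not the paper's): E = convex loss, lambda = tradeoff
% parameter, (x_m, y_m) = training pairs, K = network width, sigma = activation.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
The generic problem minimizes a data-fidelity term plus a regularization
functional built from an operator $\mathrm{L}$ and a Radon-domain norm:
\begin{equation*}
  \min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(\boldsymbol{x}_m)\bigr)
  + \lambda \, \bigl\Vert \mathrm{L}\{f\} \bigr\Vert ,
\end{equation*}
where $E$ is a convex loss and $\lambda > 0$ a tradeoff parameter.
For the total-variation norm, the abstract states that a solution is a
two-layer network together with a bias/skip-connection component:
\begin{equation*}
  f(\boldsymbol{x}) = \sum_{k=1}^{K} a_k \,
  \sigma\bigl(\langle \boldsymbol{w}_k, \boldsymbol{x} \rangle - b_k\bigr)
  + (\text{low-degree polynomial term}),
\end{equation*}
with an activation $\sigma$ determined by $\mathrm{L}$; per the abstract,
$\sigma$ is the ReLU when $\mathrm{L}$ is the Laplacian.
\end{document}

The Hilbertian case of the same functional instead yields radial-basis-function expansions, which is how the formulation recovers classical kernel methods within one framework.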
Pages: 1779-1818
Page count: 40