From Kernel Methods to Neural Networks: A Unifying Variational Formulation

Cited: 1
Authors
Unser, Michael [1]
Affiliations
[1] Ecole Polytech Fed Lausanne EPFL, Biomed Imaging Grp, Stn 17, CH-1015 Lausanne, Switzerland
Keywords
Machine learning; Convex optimization; Regularization; Representer theorem; Kernel methods; Neural networks; Banach space; APPROXIMATION; SPLINES; INTERPOLATION; REGRESSION; TRANSFORM;
DOI
10.1007/s10208-023-09624-9
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
The minimization of a data-fidelity term and an additive regularization functional gives rise to a powerful framework for supervised learning. In this paper, we present a unifying regularization functional that depends on an operator $\mathrm{L}$ and on a generic Radon-domain norm. We establish the existence of a minimizer and give the parametric form of the solution(s) under very mild assumptions. When the norm is Hilbertian, the proposed formulation yields a solution that involves radial-basis functions and is compatible with the classical methods of machine learning. By contrast, for the total-variation norm, the solution takes the form of a two-layer neural network with an activation function that is determined by the regularization operator. In particular, we retrieve the popular ReLU networks by letting the operator be the Laplacian. We also characterize the solution for the intermediate regularization norms $\Vert \cdot \Vert = \Vert \cdot \Vert_{L_p}$ with $p \in (1,2]$. Our framework offers guarantees of universal approximation for a broad family of regularization operators or, equivalently, for a wide variety of shallow neural networks, including the cases (such as ReLU) where the activation function is increasing polynomially. It also explains the favorable role of bias and skip connections in neural architectures.
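To make the structure of the formulation concrete, the LaTeX sketch below writes out the generic variational problem and the two solution forms described in the abstract. It is a schematic reading of the abstract only: the symbols (data pairs $(\boldsymbol{x}_m, y_m)$, loss $E$, regularization weight $\lambda$, kernel $\rho$, activation $\sigma$, and null-space component $p_0$) are illustrative placeholders and need not match the paper's exact notation or assumptions.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Schematic only: the notation (x_m, y_m, E, lambda, rho, sigma, p_0) is
% illustrative and inferred from the abstract, not the paper's own.

% Generic supervised-learning problem: data-fidelity term plus a
% regularization functional built from an operator L and a norm.
\[
  \min_{f} \; \sum_{m=1}^{M} E\bigl(y_m, f(\boldsymbol{x}_m)\bigr)
  \;+\; \lambda \,\bigl\| \mathrm{L} f \bigr\|
\]

% Hilbertian norm: the minimizer is a kernel/radial-basis-function
% expansion over the data points, plus a null-space component p_0.
\[
  f(\boldsymbol{x}) \;=\; \sum_{m=1}^{M} a_m \,\rho(\boldsymbol{x} - \boldsymbol{x}_m)
  \;+\; p_0(\boldsymbol{x})
\]

% Total-variation norm: the minimizer is a two-layer network whose
% activation is determined by L (e.g., ReLU when L is tied to the
% Laplacian); the extra term accounts for bias/skip connections.
\[
  f(\boldsymbol{x}) \;=\; \sum_{k=1}^{K} v_k \,\sigma(\boldsymbol{w}_k^{\top}\boldsymbol{x} - b_k)
  \;+\; p_0(\boldsymbol{x})
\]
\end{document}
```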
Pages: 1779-1818
Page count: 40
Related Papers
50 records in total (items [31]-[40] shown)
  • [31] Variational Graph Recurrent Neural Networks
    Hajiramezanali, Ehsan
    Hasanzadeh, Arman
    Duffield, Nick
    Narayanan, Krishna
    Zhou, Mingyuan
    Qian, Xiaoning
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [32] Neural Tangent Kernel: Convergence and Generalization in Neural Networks
    Jacot, Arthur
    Gabriel, Franck
    Hongler, Clement
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [33] A Variational Algorithm for Quantum Neural Networks
    Macaluso, Antonio
    Clissa, Luca
    Lodi, Stefano
    Sartori, Claudio
    COMPUTATIONAL SCIENCE - ICCS 2020, PT VI, 2020, 12142 : 591 - 604
  • [34] Unifying Graph Neural Networks with a Generalized Optimization Framework
    Shi, Chuan
    Zhu, Meiqi
    Yu, Yue
    Wang, Xiao
    Du, Junping
    ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2024, 42 (06)
  • [35] DropMessage: Unifying Random Dropping for Graph Neural Networks
    Fang, Taoran
    Xiao, Zhiqing
    Wang, Chunping
    Xu, Jiarong
    Yang, Xuan
    Yang, Yang
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 4, 2023, : 4267 - 4275
  • [36] Unifying Syntactic and Semantic Abstractions for Deep Neural Networks
    Siddiqui, Sanaa
    Mukhopadhyay, Diganta
    Afzal, Mohammad
    Karmarkar, Hrishikesh
    Madhukar, Kumar
    FORMAL METHODS FOR INDUSTRIAL CRITICAL SYSTEMS, FMICS 2024, 2024, 14952 : 201 - 219
  • [37] Interpreting and Unifying Graph Neural Networks with An Optimization Framework
    Zhu, Meiqi
    Wang, Xiao
    Shi, Chuan
    Ji, Houye
    Cui, Peng
    PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021), 2021, : 1215 - 1226
  • [38] Spectra of the Conjugate Kernel and Neural Tangent Kernel for Linear-Width Neural Networks
    Fan, Zhou
    Wang, Zhichao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [39] Variational formulation and monolithic solution of computational homogenization methods
    Hesch, Christian
    Schmidt, Felix
    Schuss, Stefan
    INTERNATIONAL JOURNAL FOR NUMERICAL METHODS IN ENGINEERING, 2024, 125 (20)
  • [40] A comparison of boundary methods based on inverse variational formulation
    Branski, A.
    Borkowski, M.
    Borkowska, D.
    ENGINEERING ANALYSIS WITH BOUNDARY ELEMENTS, 2012, 36 (04) : 505 - 510