Tight bounds on ℓ1 approximation and learning of self-bounding functions

Cited by: 1
Authors
Feldman, Vitaly [1 ]
Kothari, Pravesh [2 ]
Vondrak, Jan [3 ]
Affiliations
[1] Google Brain, Mountain View, CA USA
[2] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[3] Stanford Univ, Stanford, CA 94305 USA
Keywords
PAC learning; Submodular function; XOS function; Fourier analysis; Noise stability; Polynomial approximation; Inequality
DOI
10.1016/j.tcs.2019.11.013
CLC number
TP301 [Theory and Methods]
Discipline code
081202
Abstract
We study the complexity of learning and approximation of self-bounding functions over the uniform distribution on the Boolean hypercube $\{0,1\}^n$. Informally, a function $f : \{0,1\}^n \to \mathbb{R}$ is self-bounding if for every $x \in \{0,1\}^n$, $f(x)$ upper bounds the sum of all $n$ marginal decreases in the value of the function at $x$. Self-bounding functions include such well-known classes of functions as submodular and fractionally subadditive (XOS) functions. They were introduced by Boucheron et al. (2010) in the context of concentration-of-measure inequalities. Our main result is a nearly tight $\ell_1$-approximation of self-bounding functions by low-degree juntas. Specifically, every self-bounding function can be $\epsilon$-approximated in $\ell_1$ by a polynomial of degree $\tilde{O}(1/\epsilon)$ over $2^{\tilde{O}(1/\epsilon)}$ variables. We show that both the degree and the junta size are optimal up to logarithmic factors. Previous techniques considered the stronger $\ell_2$ approximation and proved nearly tight bounds of $\Theta(1/\epsilon^2)$ on the degree and $2^{\Theta(1/\epsilon^2)}$ on the number of variables. Our bounds rely on an analysis of the noise stability of self-bounding functions together with a stronger connection between noise stability and $\ell_1$ approximation by low-degree polynomials. This technique can also be used to obtain tighter bounds on $\ell_1$ approximation by low-degree polynomials and a faster learning algorithm for halfspaces. These results lead to improved, and in several cases almost tight, bounds for PAC and agnostic learning of self-bounding functions relative to the uniform distribution. In particular, assuming hardness of learning juntas, we show that PAC and agnostic learning of self-bounding functions have complexity $n^{\tilde{\Theta}(1/\epsilon)}$. © 2019 Published by Elsevier B.V.
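For intuition, the self-bounding condition above can be checked directly by brute force on small hypercubes. The sketch below is illustrative and not from the paper; the helper name is_self_bounding is our own, and it implements only the informal sum-of-marginal-decreases condition stated in the abstract (the formal definition additionally requires each marginal decrease to lie in [0, 1]).

```python
from itertools import product

def is_self_bounding(f, n):
    """Brute-force check on {0,1}^n of the condition
    f(x) >= sum_i max(0, f(x) - f(x with coordinate i flipped)).
    Exponential in n; intended only for small n."""
    for x in product((0, 1), repeat=n):
        total_decrease = 0.0
        for i in range(n):
            y = list(x)
            y[i] = 1 - y[i]  # flip coordinate i
            total_decrease += max(0.0, f(x) - f(tuple(y)))
        if f(x) < total_decrease - 1e-12:  # tolerance for float round-off
            return False
    return True

n = 4
# OR-like function max(x): an XOS function, hence self-bounding.
print(is_self_bounding(lambda x: max(x), n))      # True
# Normalized Hamming weight |x|/n: the marginal decreases sum to exactly f(x).
print(is_self_bounding(lambda x: sum(x) / n, n))  # True
# AND-like function min(x): at the all-ones point every flip costs 1, so the
# decreases sum to n > 1 = f(x); not self-bounding for n >= 2.
print(is_self_bounding(lambda x: min(x), n))      # False
```

As the examples suggest, self-bounding is a per-point budget on how much the function can drop under single-coordinate changes, which is the kind of structure the paper's noise-stability analysis exploits.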
Pages: 86-98
Page count: 13
Related papers
50 items in total
  • [1] Boucheron, Stephane; Lugosi, Gabor; Massart, Pascal. On concentration of self-bounding functions. Electronic Journal of Probability, 2009, 14: 1884-1899.
  • [2] McDiarmid, Colin; Reed, Bruce. Concentration for self-bounding functions and an inequality of Talagrand. Random Structures & Algorithms, 2006, 29(4): 549-557.
  • [3] Goldstein, Larry. L1 bounds in normal approximation. Annals of Probability, 2007, 35(5): 1888-1930.
  • [4] Wang, Ruosong; Woodruff, David P. Tight bounds for l1 oblivious subspace embeddings. ACM Transactions on Algorithms, 2022, 18(1).
  • [5] Carroll, M.P. L1 approximation of discontinuous functions. Notices of the American Mathematical Society, 1974, 21(1): A159.
  • [6] Viallard, Paul; Germain, Pascal; Habrard, Amaury; Morvant, Emilie. Self-bounding majority vote learning algorithms by the direct minimization of a tight PAC-Bayesian C-bound. Machine Learning and Knowledge Discovery in Databases, ECML PKDD 2021: Research Track, Part II, 2021, 12976: 167-183.
  • [7] Wolfe, J.M. Nonlinear L1 approximation of smooth functions. Journal of Approximation Theory, 1976, 17(2): 166-176.
  • [8] Watson, G.A. Discrete L1 approximation by rational functions. IMA Journal of Numerical Analysis, 1984, 4(3): 275-288.
  • [9] Goldstein, M.; Haussmann, W.; Jetter, K. Best harmonic L1 approximation to subharmonic functions. Journal of the London Mathematical Society (Second Series), 1984, 30: 257-264.
  • [10] Carroll, M.P. L1 approximation of discontinuous functions with interpolation constraints. Journal of Mathematical Analysis and Applications, 1974, 46(1): 132-142.