Near-Optimal Statistical Query Lower Bounds for Agnostically Learning Intersections of Halfspaces with Gaussian Marginals

Cited by: 0
Authors
Hsu, Daniel [1]
Sanford, Clayton [1]
Servedio, Rocco A. [1]
Vlatakis-Gkaragkounis, Emmanouil-Vasileios [1]
Affiliation
[1] Columbia Univ, New York, NY 10027 USA
Funding
U.S. National Science Foundation
Keywords
Statistical Query learning; agnostic learning; intersections of halfspaces; polynomial-time algorithm
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We consider the well-studied problem of learning intersections of halfspaces under the Gaussian distribution in the challenging agnostic learning model. Recent work of Diakonikolas et al. (2021b) shows that any Statistical Query (SQ) algorithm for agnostically learning the class of intersections of $k$ halfspaces over $\mathbb{R}^n$ to constant excess error either must make queries of tolerance at most $n^{-\tilde{\Omega}(\sqrt{\log k})}$ or must make $2^{n^{\Omega(1)}}$ queries. We strengthen this result by improving the tolerance requirement to $n^{-\tilde{\Omega}(\log k)}$. This lower bound is essentially best possible, since an SQ algorithm of Klivans et al. (2008) agnostically learns this class to any constant excess error using $n^{O(\log k)}$ queries of tolerance $n^{-O(\log k)}$. We prove two variants of our lower bound, each of which combines ingredients from Diakonikolas et al. (2021b) with (an extension of) a different earlier approach to agnostic SQ lower bounds in the Boolean setting due to Dachman-Soled et al. (2014). Our approach also yields lower bounds for agnostically SQ learning the class of "convex subspace juntas" (studied by Vempala, 2010a) and the class of sets with bounded Gaussian surface area; all of these lower bounds are nearly optimal, since they essentially match known upper bounds from Klivans et al. (2008).
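As a quick illustration of the objects discussed in the abstract (this sketch is not from the paper; the instance parameters W, b, the query, and the tolerance value below are arbitrary choices made here for illustration), the following Python snippet simulates an SQ oracle for an intersection-of-halfspaces target under Gaussian marginals: a statistical query is a bounded function of an example and its label, and the oracle may return any value within the stated tolerance of the query's true expectation.

```python
import numpy as np

rng = np.random.default_rng(0)

def intersection_of_halfspaces(X, W, b):
    """Label +1 iff x lies in the intersection of the k halfspaces
    {x : w_i . x <= b_i}, and -1 otherwise."""
    return np.where(np.all(X @ W.T <= b, axis=1), 1.0, -1.0)

def sq_oracle(query, W, b, tau, n_samples=200_000):
    """Simulated SQ oracle: estimates E[query(x, f(x))] over Gaussian
    marginals by Monte Carlo, then perturbs the estimate by up to tau.
    Any answer within +/- tau of the true expectation is legal."""
    n = W.shape[1]
    X = rng.standard_normal((n_samples, n))   # Gaussian marginals
    y = intersection_of_halfspaces(X, W, b)
    estimate = np.mean(query(X, y))
    # Model the oracle's slack by a tau-sized perturbation in a random direction.
    return estimate + tau * np.sign(rng.standard_normal())

# Hypothetical toy instance: k = 3 halfspaces in R^5 (illustration only).
n, k = 5, 3
W = rng.standard_normal((k, n))
b = np.ones(k)
tau = 0.01  # query tolerance

# A correlation query chi(x, y) = y * sign(x_1), a typical SQ building block.
answer = sq_oracle(lambda X, y: y * np.sign(X[:, 0]), W, b, tau)
print(f"SQ answer (tolerance {tau}): {answer:.4f}")
```

The lower bound in the abstract concerns exactly this trade-off: how small tau must be (here, $n^{-\tilde{\Omega}(\log k)}$), or how many such queries are needed, for any SQ learner to reach constant excess error on this class.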
Pages: 283-312
Page count: 30