The FAST Algorithm for Submodular Maximization

Cited by: 0
Authors
Breuer, Adam [1]
Balkanski, Eric [1]
Singer, Yaron [1]
Affiliation
[1] Harvard Univ, Cambridge, MA 02138 USA
Keywords
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this paper we describe a new parallel algorithm called Fast Adaptive Sequencing Technique (FAST) for maximizing a monotone submodular function under a cardinality constraint k. This algorithm achieves the optimal 1 - 1/e approximation guarantee and is orders of magnitude faster than the state of the art across a variety of experiments on real-world data sets. Following recent work by Balkanski & Singer (2018a), there has been a great deal of research on algorithms whose theoretical parallel runtime is exponentially faster than that of the algorithms used for submodular maximization over the past 40 years. However, while these new algorithms are fast in terms of asymptotic worst-case guarantees, they are computationally infeasible in practice even on small data sets, because the number of rounds and queries they require depends on large constants and high-degree polynomials in the precision and confidence parameters. The design principles behind the FAST algorithm we present here are a significant departure from those of recent theoretically fast algorithms. Rather than optimizing for asymptotic theoretical guarantees, the design of FAST introduces several new techniques that achieve strong practical and theoretical parallel runtimes. The approximation guarantee obtained by FAST is arbitrarily close to 1 - 1/e, and its asymptotic parallel runtime (adaptivity) is O(log(n) log^2(log k)) using O(n log log k) total queries. Through experiments on large data sets, we show that FAST is orders of magnitude faster than any algorithm for submodular maximization we are aware of, including hyper-optimized parallel versions of state-of-the-art serial algorithms.
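For context on the problem setting only, the sketch below shows monotone submodular maximization under a cardinality constraint using the classical sequential greedy on a coverage function. This is not the FAST algorithm: the greedy attains the same 1 - 1/e guarantee but needs k adaptive rounds of queries, whereas FAST's contribution is matching the guarantee (arbitrarily closely) with exponentially lower adaptivity. The function name and toy data are hypothetical and introduced here purely for illustration.

# Minimal sketch of the problem setting (not the FAST algorithm): maximize a
# monotone submodular function f under a cardinality constraint k. The
# classical sequential greedy below attains the 1 - 1/e guarantee but needs
# k adaptive rounds; FAST matches the guarantee in far fewer rounds.

def greedy_coverage(sets, k):
    """Greedy maximization of the coverage function f(S) = |union of chosen
    sets| subject to |S| <= k; coverage is monotone and submodular."""
    covered = set()
    selected = []
    for _ in range(k):
        # One adaptive round: query the marginal gain of every remaining set.
        best_idx, best_gain = None, 0
        for i, s in enumerate(sets):
            if i in selected:
                continue
            gain = len(s - covered)
            if gain > best_gain:
                best_idx, best_gain = i, gain
        if best_idx is None:  # no remaining set has positive marginal gain
            break
        selected.append(best_idx)
        covered |= sets[best_idx]
    return selected, covered

# Hypothetical toy data for illustration only.
toy_sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy_coverage(toy_sets, k=2))  # -> ([0, 2], {1, 2, 3, 4, 5, 6})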
Pages: 10