50 references in total
- [41] Scalable inductive learning on partitioned data. Foundations of Intelligent Systems, Proceedings, 2005, 3488: 391-403
- [42] LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31
- [43] Communication-Efficient Distributed Deep Metric Learning with Hybrid Synchronization. CIKM'18: Proceedings of the 27th ACM International Conference on Information and Knowledge Management, 2018: 1463-1472
- [44] UbiNN: A Communication Efficient Framework for Distributed Machine Learning in Edge Computing. IEEE Transactions on Network Science and Engineering, 2023, 10(06): 3368-3383
- [45] Poseidon: An Efficient Communication Architecture for Distributed Deep Learning on GPU Clusters. 2017 USENIX Annual Technical Conference (USENIX ATC '17), 2017: 181-193
- [46] Communication-Efficient Coded Distributed Multi-Task Learning. 2021 IEEE Global Communications Conference (GLOBECOM), 2021
- [48] Communication-Efficient Gradient Coding for Straggler Mitigation in Distributed Learning. 2020 IEEE International Symposium on Information Theory (ISIT), 2020: 2634-2639
- [49] CE-SGD: Communication-Efficient Distributed Machine Learning. 2021 IEEE Global Communications Conference (GLOBECOM), 2021