5 records in total
- [1] Cost-Efficient Serverless Inference Serving with Joint Batching and Multi-Processing. Proceedings of the 14th ACM SIGOPS Asia-Pacific Workshop on Systems (APSys 2023), 2023, pp. 43-49.
- [2] Accelerating DNN Inference with Heterogeneous Multi-DPU Engines. 2023 60th ACM/IEEE Design Automation Conference (DAC), 2023.
- [3] Multi-exit DNN inference acceleration for intelligent terminal with heterogeneous processors. Sustainable Computing: Informatics and Systems, 2023, vol. 40.
- [4] EdgeSP: Scalable Multi-device Parallel DNN Inference on Heterogeneous Edge Clusters. Algorithms and Architectures for Parallel Processing (ICA3PP 2021), Part II, 2022, vol. 13156, pp. 317-333.