50 records total
- [2] A Survey on Collaborative DNN Inference for Edge Intelligence. Machine Intelligence Research, 2023, 20: 370-395
- [3] Minimizing Latency for Multi-DNN Inference on Resource-Limited CPU-Only Edge Devices. IEEE INFOCOM 2024 - IEEE Conference on Computer Communications, 2024: 2239-2248
- [5] Accelerating DNN Inference by Edge-Cloud Collaboration. 2021 IEEE International Performance, Computing, and Communications Conference (IPCCC), 2021
- [6] Operating Latency Sensitive Applications on Public Serverless Edge Cloud Platforms. IEEE Internet of Things Journal, 2021, 8(10): 7954-7972
- [8] FusedInf: Efficient Swapping of DNN Models for On-Demand Serverless Inference Services on the Edge. 2024 IEEE/ACM Symposium on Edge Computing (SEC 2024), 2024: 98-109