50 records in total
- [21] Performance Trade-offs in Weight Quantization for Memory-Efficient Inference. 2019 IEEE International Conference on Artificial Intelligence Circuits and Systems (AICAS 2019), 2019: 246-250.
- [22] Evolutionary Bin Packing for Memory-Efficient Dataflow Inference Acceleration on FPGA. GECCO '20: Proceedings of the 2020 Genetic and Evolutionary Computation Conference, 2020: 1125-1133.
- [24] A Memory-Efficient Edge Inference Accelerator with XOR-based Model Compression. 2023 60th ACM/IEEE Design Automation Conference (DAC), 2023.
- [25] PENETRALIUM: Privacy-preserving and memory-efficient neural network inference at the edge. Future Generation Computer Systems: The International Journal of eScience, 2024, 156: 30-41.
- [26] Buffer Sizes Reduction for Memory-efficient CNN Inference on Mobile and Embedded Devices. 2020 23rd Euromicro Conference on Digital System Design (DSD 2020), 2020: 133-140.
- [27] dCSR: A Memory-Efficient Sparse Matrix Representation for Parallel Neural Network Inference. 2021 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), 2021.
- [29] Gerbil: A Fast and Memory-Efficient k-mer Counter with GPU-Support. Algorithms in Bioinformatics, 2016, 9838: 150-161.