50 records in total
- [2] Domain-specific knowledge distillation yields smaller and better models for conversational commerce. Proceedings of the 5th Workshop on e-Commerce and NLP (ECNLP 5), 2022: 151-160
- [3] GIN: Better going safe with personalized routes. 2020 IEEE Symposium on Computers and Communications (ISCC), 2020: 327-332
- [6] Better C-3 distillation pressure - distillation. Hydrocarbon Processing, 1979, 58(2): 95-98