50 records in total
- [1] GAPFORMER: Fast Autoregressive Transformers meet RNNs for Personalized Adaptive Cruise Control. 2022 IEEE 25th International Conference on Intelligent Transportation Systems (ITSC), 2022: 2528-2535.
- [2] Finetuning Pretrained Transformers into RNNs. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 10630-10643.
- [3] Fast Vision Transformers with HiLo Attention. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [4] Linear Transformers Are Secretly Fast Weight Programmers. International Conference on Machine Learning, Vol. 139, 2021.
- [6] Pose Transformers (POTR): Human Motion Prediction with Non-Autoregressive Transformers. 2021 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW 2021), 2021: 2276-2284.
- [7] On the Learning of Non-Autoregressive Transformers. International Conference on Machine Learning, Vol. 162, 2022.
- [8] LeaPformer: Enabling Linear Transformers for Autoregressive and Simultaneous Tasks via Learned Proportions. International Conference on Machine Learning, Vol. 235, 2024: 452-470.
- [9] Training-free Neural Architecture Search for RNNs and Transformers. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023), Vol. 1, 2023: 2522-2540.