Spatial Mixture-of-Experts

Cited by: 0
Authors
Dryden, Nikoli [1 ]
Hoefler, Torsten [1 ]
Affiliations
[1] Swiss Fed Inst Technol, Zurich, Switzerland
Funding
EU Horizon 2020
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Many kinds of data have an underlying dependence on spatial location: weather on the Earth, a simulation on a mesh, or a registered image. Yet this structure is rarely taken advantage of, and it violates common assumptions made by many neural network layers, such as translation equivariance. Further, many works that do incorporate locality fail to capture fine-grained structure. To address this, we introduce the Spatial Mixture-of-Experts (SMOE) layer, a sparsely-gated layer that learns spatial structure in the input domain and routes experts at a fine-grained level to take advantage of it. We also develop new techniques to train SMOEs, including a self-supervised routing loss and damping of expert errors. Finally, we show strong results for SMOEs on numerous tasks and set new state-of-the-art results for medium-range weather prediction and post-processing of ensemble weather forecasts.
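The abstract only summarizes the layer, so the following is a minimal sketch of one plausible reading of the idea: a sparsely-gated layer whose routing is a learned function of spatial location alone, so each position in the input domain is assigned its own expert. The class name `SpatialMoE`, the 1x1-convolution experts, the straight-through top-1 routing, and all hyperparameters are illustrative assumptions, not the paper's implementation; the training techniques the abstract mentions (the self-supervised routing loss and expert-error damping) are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SpatialMoE(nn.Module):
    """Minimal sketch of a spatial mixture-of-experts layer.

    Each spatial location (h, w) is routed to one of `num_experts`
    experts by a gate that depends only on location, so the learned
    routing map reflects fixed spatial structure in the input domain.
    Expert form and routing details are assumptions for illustration.
    """

    def __init__(self, in_ch: int, out_ch: int, num_experts: int,
                 height: int, width: int):
        super().__init__()
        # One learned logit per (expert, location): the spatial routing map.
        self.gate_logits = nn.Parameter(torch.zeros(num_experts, height, width))
        # Experts as pointwise (1x1) convolutions; an illustrative choice.
        self.experts = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, kernel_size=1) for _ in range(num_experts)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_ch, height, width)
        probs = F.softmax(self.gate_logits, dim=0)             # (E, H, W)
        # Hard top-1 routing per location, kept differentiable with a
        # straight-through estimator: the forward pass uses the one-hot
        # mask, the backward pass flows through the softmax probabilities.
        top1 = F.one_hot(probs.argmax(dim=0), probs.shape[0])  # (H, W, E)
        mask = top1.permute(2, 0, 1).float() + probs - probs.detach()
        # Dense combine for clarity: every expert runs on the full input
        # and is masked afterward. A genuinely sparse implementation would
        # gather only the locations routed to each expert.
        return sum(m[None, None] * expert(x)
                   for m, expert in zip(mask, self.experts))


# Usage: route a (2, 3, 32, 32) batch through 4 experts -> (2, 8, 32, 32).
layer = SpatialMoE(in_ch=3, out_ch=8, num_experts=4, height=32, width=32)
y = layer(torch.randn(2, 3, 32, 32))
```

Making the gate a function of location alone (rather than of the input features, as in standard mixture-of-experts routing) is what ties each expert to a region of the domain; the paper's actual gate and losses may condition on more than position, so treat this strictly as a sketch of the routing idea.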
Pages: 17