Trapezoidal Step Scheduler for Model-Agnostic Meta-Learning in Medical Imaging

Cited by: 0
Authors
Voon, Wingates [1 ]
Hum, Yan Chai [1 ]
Tee, Yee Kai [1 ]
Yap, Wun-She [2 ]
Lai, Khin Wee [3 ]
Nisar, Humaira [4 ]
Mokayed, Hamam [5 ]
Affiliations
[1] Univ Tunku Abdul Rahman, Lee Kong Chian Fac Engn & Sci, Dept Mechatron & Biomed Engn, Sungai Long, Malaysia
[2] Univ Tunku Abdul Rahman, Lee Kong Chian Fac Engn & Sci, Dept Elect & Elect Engn, Sungai Long, Malaysia
[3] Univ Malaya, Dept Biomed Engn, Kuala Lumpur, Malaysia
[4] Univ Tunku Abdul Rahman, Fac Engn & Green Technol, Dept Elect Engn, Kampar, Malaysia
[5] Lulea Univ Technol, Dept Comp Sci Elect & Space Engn, Lulea, Sweden
Keywords
Few-shot learning; Medical image classification; Trapezoidal step scheduler; Model-agnostic meta-learning;
DOI
10.1016/j.patcog.2024.111316
CLC classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Model-Agnostic Meta-Learning (MAML) is a widely adopted few-shot learning (FSL) method designed to mitigate the dependency of deep learning-based methods on large, labeled datasets in medical image analysis. However, MAML's reliance on a fixed number of gradient descent (GD) steps for task adaptation results in computational inefficiency and task-level overfitting. To address this issue, we introduce Tra-MAML, which optimizes the balance between model adaptation capacity and computational efficiency through a trapezoidal step scheduler (TRA). The TRA scheduler dynamically adjusts the number of GD steps in the inner optimization loop: initially increasing the steps uniformly to reduce variance, maintaining the maximum number of steps to enhance adaptation capacity, and finally decreasing the steps uniformly to mitigate overfitting. Our evaluation of Tra-MAML against selected FSL methods across four medical imaging datasets demonstrates its superior performance. Notably, Tra-MAML outperforms MAML by 13.36% on the BreaKHis40X dataset in the 3-way 10-shot scenario.
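The abstract describes the TRA scheduler's three phases (linear ramp-up, plateau at the maximum step count, linear ramp-down). A minimal sketch of such a trapezoidal schedule is shown below; the function name, ramp fraction, and step bounds are illustrative assumptions, since the paper's actual parameterization is not given in this record:

```python
def trapezoidal_steps(epoch: int, total_epochs: int,
                      min_steps: int = 1, max_steps: int = 5,
                      ramp_frac: float = 0.3) -> int:
    """Illustrative trapezoidal schedule for the number of inner-loop
    GD steps: ramp up linearly, hold at the maximum, ramp down linearly."""
    ramp = max(1, int(total_epochs * ramp_frac))  # length of each ramp phase
    if epoch < ramp:                              # rising edge
        frac = epoch / ramp
    elif epoch >= total_epochs - ramp:            # falling edge
        frac = (total_epochs - 1 - epoch) / ramp
    else:                                         # plateau at the maximum
        frac = 1.0
    return round(min_steps + frac * (max_steps - min_steps))
```

In this sketch the inner loop would run `trapezoidal_steps(epoch, total_epochs)` adaptation steps per task at each meta-training epoch, so early and late epochs adapt with few steps while mid-training epochs use the full budget.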
Pages: 16
Related Papers (50 total)
  • [1] Bayesian Model-Agnostic Meta-Learning
    Yoon, Jaesik
    Kim, Taesup
    Dia, Ousmane
    Kim, Sungwoong
    Bengio, Yoshua
    Ahn, Sungjin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [2] Probabilistic Model-Agnostic Meta-Learning
    Finn, Chelsea
    Xu, Kelvin
    Levine, Sergey
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [3] Theoretical Convergence of Multi-Step Model-Agnostic Meta-Learning
    Ji, Kaiyi
    Yang, Junjie
    Liang, Yingbin
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [4] Knowledge Distillation for Model-Agnostic Meta-Learning
    Zhang, Min
    Wang, Donglin
    Gai, Sibo
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 1355 - 1362
  • [5] Meta weight learning via model-agnostic meta-learning
    Xu, Zhixiong
    Chen, Xiliang
    Tang, Wei
    Lai, Jun
    Cao, Lei
    NEUROCOMPUTING, 2021, 432 : 124 - 132
  • [6] Peer-to-Peer Model-Agnostic Meta-Learning
    Qureshi, Muhammad I.
    Khan, Usman A.
    2024 IEEE 13TH SENSOR ARRAY AND MULTICHANNEL SIGNAL PROCESSING WORKSHOP, SAM 2024, 2024
  • [7] Task-Robust Model-Agnostic Meta-Learning
    Collins, Liam
    Mokhtari, Aryan
    Shakkottai, Sanjay
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] Combining Model-Agnostic Meta-Learning and Transfer Learning for Regression
    Satrya, Wahyu Fadli
    Yun, Ji-Hoon
    SENSORS, 2023, 23 (02)
  • [9] Learning Symbolic Model-Agnostic Loss Functions via Meta-Learning
    Raymond, Christian
    Chen, Qi
    Xue, Bing
    Zhang, Mengjie
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (11) : 13699 - 13714