A Transformer-Embedded Multi-Task Model for Dose Distribution Prediction

Cited by: 22
Authors
Wen, Lu [1 ]
Xiao, Jianghong [2 ]
Tan, Shuai [1 ]
Wu, Xi [3 ]
Zhou, Jiliu [1 ]
Peng, Xingchen [4 ]
Wang, Yan [1 ]
Affiliations
[1] Sichuan Univ, Sch Comp Sci, Chengdu, Peoples R China
[2] Sichuan Univ, Canc Ctr, Dept Radiat Oncol, West China Hosp, Chengdu, Peoples R China
[3] Chengdu Univ Informat Technol, Sch Comp Sci, Chengdu, Peoples R China
[4] Sichuan Univ, Canc Ctr, Dept Biotherapy, West China Hosp, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Dose prediction; multi-task learning; isodose lines; gradient information; consistency constraint; transformer; INTENSITY-MODULATED RADIOTHERAPY; PLAN QUALITY; NECK-CANCER; NETWORK; HEAD;
DOI
10.1142/S0129065723500430
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Radiation therapy is a fundamental cancer treatment in the clinic. However, to satisfy clinical requirements, radiologists have to iteratively adjust the radiotherapy plan based on experience, making it extremely subjective and time-consuming to obtain a clinically acceptable plan. To this end, we introduce a transformer-embedded multi-task dose prediction (TransMTDP) network to automatically predict the dose distribution in radiotherapy. Specifically, to achieve more stable and accurate dose predictions, our TransMTDP network comprises three highly correlated tasks: a main dose prediction task that assigns each pixel a fine-grained dose value, an auxiliary isodose lines prediction task that produces coarse-grained dose ranges, and an auxiliary gradient prediction task that learns subtle gradient information such as radiation patterns and edges in the dose maps. The three correlated tasks are integrated through a shared encoder, following the multi-task learning strategy. To strengthen the connection between the output layers of the different tasks, we further apply two additional constraints, an isodose consistency loss and a gradient consistency loss, to reinforce the match between the dose distribution features generated by the auxiliary tasks and the main task. Additionally, considering that many organs in the human body are symmetrical and that dose maps present abundant global features, we embed a transformer into our framework to capture the long-range dependencies of the dose maps. Evaluated on an in-house rectal cancer dataset and a public head and neck cancer dataset, our method achieves superior performance compared with state-of-the-art approaches. Code is available at https://github.com/luuuwen/TransMTDP.
Pages: 16
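To make the pipeline described in the abstract concrete, below is a minimal PyTorch sketch of the multi-task idea: a shared convolutional encoder whose bottleneck is refined by a transformer, three decoder heads (fine-grained dose, coarse isodose bins, dose gradients), and the two consistency terms tying the auxiliary outputs back to the predicted dose map. The layer widths, the ten-bin isodose discretization, the finite-difference gradient target, and the loss weights are all illustrative assumptions, not the authors' actual TransMTDP architecture, which is available at the GitHub link above.

# Minimal sketch, NOT the authors' TransMTDP: widths, bin count, loss weights
# and the finite-difference gradient target are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedEncoder(nn.Module):
    """Conv downsampling followed by a transformer over the spatial tokens,
    so the bottleneck can capture long-range dependencies in the dose map."""
    def __init__(self, in_ch=1, width=32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, stride=2, padding=1), nn.ReLU(),
        )
        layer = nn.TransformerEncoderLayer(d_model=width, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):
        f = self.conv(x)                                          # (B, C, H/4, W/4)
        b, c, h, w = f.shape
        tokens = self.transformer(f.flatten(2).transpose(1, 2))  # (B, H*W/16, C)
        return tokens.transpose(1, 2).reshape(b, c, h, w)

def make_head(width, out_ch):
    """Upsampling decoder head; the three tasks share this structure."""
    return nn.Sequential(
        nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
        nn.Conv2d(width, out_ch, 3, padding=1),
    )

class MultiTaskDoseNet(nn.Module):
    def __init__(self, in_ch=1, width=32, n_bins=10):
        super().__init__()
        self.encoder = SharedEncoder(in_ch, width)
        self.dose_head = make_head(width, 1)       # main task: per-pixel dose
        self.iso_head = make_head(width, n_bins)   # auxiliary: coarse isodose bins
        self.grad_head = make_head(width, 2)       # auxiliary: dose gradients (x, y)

    def forward(self, x):
        f = self.encoder(x)
        return self.dose_head(f), self.iso_head(f), self.grad_head(f)

def finite_diff_gradients(dose):
    """Forward-difference gradient target; stands in for whatever gradient
    operator the paper actually uses."""
    gx = F.pad(dose[..., :, 1:] - dose[..., :, :-1], (0, 1, 0, 0))
    gy = F.pad(dose[..., 1:, :] - dose[..., :-1, :], (0, 0, 0, 1))
    return torch.cat([gx, gy], dim=1)

def multitask_loss(pred_dose, pred_iso, pred_grad, gt_dose, gt_iso,
                   w=(1.0, 0.5, 0.5, 0.1, 0.1)):
    """Three task losses plus the two consistency constraints from the abstract."""
    n_bins = pred_iso.shape[1]
    l_dose = F.l1_loss(pred_dose, gt_dose)                         # main dose task
    l_iso = F.cross_entropy(pred_iso, gt_iso)                      # isodose-bin task
    l_grad = F.l1_loss(pred_grad, finite_diff_gradients(gt_dose))  # gradient task
    # Isodose consistency: bins implied by the predicted dose map (detached,
    # assuming dose is normalized to [0, 1)) should agree with the isodose head.
    implied = (pred_dose.detach() * n_bins).long().clamp(0, n_bins - 1).squeeze(1)
    l_iso_cons = F.cross_entropy(pred_iso, implied)
    # Gradient consistency: the gradient head should agree with the gradients
    # of the predicted dose map itself.
    l_grad_cons = F.l1_loss(pred_grad, finite_diff_gradients(pred_dose))
    return (w[0] * l_dose + w[1] * l_iso + w[2] * l_grad
            + w[3] * l_iso_cons + w[4] * l_grad_cons)

# Toy usage: a single-channel 64x64 input (e.g. a CT slice; structure masks
# could be stacked as extra channels) and a dose map normalized to [0, 1).
net = MultiTaskDoseNet()
x = torch.randn(2, 1, 64, 64)
gt_dose = torch.rand(2, 1, 64, 64)
gt_iso = (gt_dose * 10).long().clamp(0, 9).squeeze(1)
loss = multitask_loss(*net(x), gt_dose, gt_iso)
loss.backward()

Detaching the predicted dose when deriving the implied isodose bins is one possible design choice: it keeps the isodose consistency term from pushing gradients back into the main regression head through a non-smooth binning step. How TransMTDP itself couples these terms is best checked against the released code.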
Related Papers
50 records in total
  • [1] TFUT: Task fusion upward transformer model for multi-task learning on dense prediction
    Xin, Zewei
    Sirejiding, Shalayiding
    Lu, Yuxiang
    Ding, Yue
    Wang, Chunlin
    Alsarhan, Tamam
    Lu, Hongtao
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2024, 244
  • [2] Prompt Guided Transformer for Multi-Task Dense Prediction
    Lu, Yuxiang
    Sirejiding, Shalayiding
    Ding, Yue
    Wang, Chunlin
    Lu, Hongtao
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 6375 - 6385
  • [3] Multi-Task Learning With Multi-Query Transformer for Dense Prediction
    Xu, Yangyang
    Li, Xiangtai
    Yuan, Haobo
    Yang, Yibo
    Zhang, Lefei
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (02) : 1228 - 1240
  • [4] Multimodal radiotherapy dose prediction using a multi-task deep learning model
    Maniscalco, Austen
    Mathew, Ezek
    Parsons, David
    Visak, Justin
    Arbab, Mona
    Alluri, Prasanna
    Li, Xingzhe
    Wandrey, Narine
    Lin, Mu-Han
    Rahimi, Asal
    Jiang, Steve
    Nguyen, Dan
    MEDICAL PHYSICS, 2024, 51 (06) : 3932 - 3949
  • [5] Multi-Task CNN Model for Attribute Prediction
    Abdulnabi, Abrar H.
    Wang, Gang
    Lu, Jiwen
    Jia, Kui
    IEEE TRANSACTIONS ON MULTIMEDIA, 2015, 17 (11) : 1949 - 1959
  • [6] Multi-Task Transformer Visualization to build Trust for Clinical Outcome Prediction
    Antweiler, Dario
    Gallusser, Florian
    Fuchs, Georg
    2023 WORKSHOP ON VISUAL ANALYTICS IN HEALTHCARE, VAHC, 2023 : 21 - 26
  • [7] DeMT: Deformable Mixer Transformer for Multi-Task Learning of Dense Prediction
    Xu, Yangyang
    Yang, Yibo
    Zhang, Lefei
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 3, 2023 : 3072 - 3080
  • [8] A Multi-task Transformer Architecture for Drone State Identification and Trajectory Prediction
    Souli, Nicolas
    Palamas, Andreas
    Panayiotou, Tania
    Kolios, Panayiotis
    Ellinas, Georgios
    2024 20TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING IN SMART SYSTEMS AND THE INTERNET OF THINGS, DCOSS-IOT 2024, 2024 : 285 - 291
  • [9] Multi-Task Transformer with LSTM Model for Question Set Generation
    Institute of Electrical and Electronics Engineers Inc.
  • [10] Multi-task learning for automated contouring and dose prediction in radiotherapy
    Kim, Sangwook
    Khalifa, Aly
    Purdie, Thomas G.
    Mcintosh, Chris
    PHYSICS IN MEDICINE AND BIOLOGY, 2025, 70 (05)