Multi-task Learning Based Skin Segmentation

Cited by: 0
Authors
Tan, Taizhe [1 ,2 ]
Shan, Zhenghao [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Comp Sci & Technol, Guangzhou 510006, Peoples R China
[2] Heyuan Bay Area Digital Econ Technol Innovat Ctr, Heyuan 517001, Peoples R China
Keywords
Skin segmentation; query-based; multi-task learning; encoder-decoder; deep learning;
DOI
10.1007/978-3-031-40289-0_29
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104; 0812; 0835; 1405;
Abstract
Skin segmentation is a critical task in computer vision, with diverse applications in fields such as biometrics, medical imaging, and video surveillance. Despite its importance, acquiring high-quality data remains a significant challenge in skin segmentation research. In this paper, we propose a novel skin segmentation algorithm for single-person images that uses a dual-task neural network built on the multi-task learning framework. Specifically, the algorithm employs an encoder-decoder architecture consisting of a shared backbone, two dynamic encoders, and a decoder. The dynamic encoders use dynamic convolution to extract richer spatial location information, while the decoder adopts a query-based dual-task approach that lets each task efficiently exploit the information generated by the other. Experimental results indicate that the proposed skin segmentation algorithm matches or outperforms current state-of-the-art techniques on the benchmark test set.
Pages: 360-369
Page count: 10
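
The abstract describes a shared backbone feeding two dynamic-convolution encoders and a query-based dual-task decoder in which each task reads the information produced by the other. The minimal PyTorch sketch below illustrates one way such a design could be wired together; it is not the authors' implementation. The class names (DualTaskSkinNet, DynamicConv2d, QueryDualTaskDecoder), the use of a generic auxiliary second task, the additive fusion of the two encoder streams, and all hyperparameters are assumptions made for illustration only.

# Minimal sketch of the dual-task encoder-decoder described in the abstract,
# assuming PyTorch. All class and parameter names, the auxiliary second task,
# and the fusion scheme are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn


class DynamicConv2d(nn.Module):
    """Simplified dynamic convolution: K parallel 3x3 kernels mixed by
    input-conditioned attention weights (a common formulation)."""
    def __init__(self, in_ch, out_ch, k_kernels=4):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, 3, padding=1) for _ in range(k_kernels)]
        )
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(in_ch, k_kernels, 1),
        )

    def forward(self, x):
        w = torch.softmax(self.attn(x), dim=1)              # (B, K, 1, 1)
        outs = torch.stack([c(x) for c in self.convs], 1)   # (B, K, C, H, W)
        return (w.unsqueeze(2) * outs).sum(dim=1)


class QueryDualTaskDecoder(nn.Module):
    """Two sets of learnable task queries; each set attends to the encoder
    features and to the other task's queries, then predicts a mask."""
    def __init__(self, dim=256, num_queries=8):
        super().__init__()
        self.q_skin = nn.Parameter(torch.randn(num_queries, dim))
        self.q_aux = nn.Parameter(torch.randn(num_queries, dim))
        self.feat_attn = nn.MultiheadAttention(dim, 8, batch_first=True)
        self.cross_task_attn = nn.MultiheadAttention(dim, 8, batch_first=True)

    def _decode(self, queries, other, feats):
        q, _ = self.feat_attn(queries, feats, feats)     # read image features
        q, _ = self.cross_task_attn(q, other, other)     # exchange task info
        return q

    def forward(self, feats_2d):
        b, c, h, w = feats_2d.shape
        feats = feats_2d.flatten(2).transpose(1, 2)      # (B, HW, C)
        qs = self.q_skin.unsqueeze(0).expand(b, -1, -1)
        qa = self.q_aux.unsqueeze(0).expand(b, -1, -1)
        qs2 = self._decode(qs, qa, feats)
        qa2 = self._decode(qa, qs2, feats)
        # Mask prediction: dot product of queries with per-pixel embeddings.
        skin_logits = torch.einsum("bqc,bnc->bqn", qs2, feats).mean(1).view(b, 1, h, w)
        aux_logits = torch.einsum("bqc,bnc->bqn", qa2, feats).mean(1).view(b, 1, h, w)
        return skin_logits, aux_logits


class DualTaskSkinNet(nn.Module):
    def __init__(self, dim=256):
        super().__init__()
        self.backbone = nn.Sequential(                   # stand-in shared backbone
            nn.Conv2d(3, dim, 3, stride=4, padding=1), nn.ReLU())
        self.enc_skin = DynamicConv2d(dim, dim)          # task-specific dynamic encoders
        self.enc_aux = DynamicConv2d(dim, dim)
        self.decoder = QueryDualTaskDecoder(dim)

    def forward(self, x):
        f = self.backbone(x)
        f = self.enc_skin(f) + self.enc_aux(f)           # fuse the two encoder streams
        return self.decoder(f)


if __name__ == "__main__":
    net = DualTaskSkinNet()
    skin, aux = net(torch.randn(1, 3, 128, 128))
    print(skin.shape, aux.shape)                         # (1, 1, 32, 32) for each task

The dynamic convolution here follows the widely used attention-over-kernels formulation, and the decoder mimics query-based mask prediction with an added cross-task attention step so that each task's queries can read the other's, which is the property the abstract attributes to the query-based dual-task decoder.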