Retinal artery/vein classification based on multi-scale category fusion

Cited: 2
Authors
Duan, Kunyi [1]
Wang, Suyu [1]
Liu, Hongyu [1]
He, Jian [1]
Affiliations
[1] Beijing Univ Technol, Sch Software Engn, Beijing, Peoples R China
Keywords
fundus image; artery and vein classification; deep learning
DOI
10.1109/ICTAI56018.2022.00158
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Morphological abnormalities of the retinal blood vessels are generally associated with cardiovascular, cerebrovascular, and systemic diseases, so automatic artery/vein classification is particularly important for medical image analysis and clinical decision making. To improve the accuracy of artery/vein classification, this work proposes a retinal artery/vein classification model based on multi-scale category fusion. To address the high similarity between arterial and venous features, a multi-scale feature extraction module is proposed that strengthens feature extraction by aggregating the multi-scale features produced by the hierarchical residual connections within a single Res2Net residual block. Furthermore, a multi-layer semantic supervision structure is designed to supervise and fuse artery/vein features at different layers, capturing richer semantic detail and making the features more discriminative. Finally, a category-weighted fusion module is introduced that concatenates feature maps of the same category to refine the overall segmentation results. The proposed method is evaluated on two publicly available fundus image datasets of different scales, DRIVE and LES-AV. Experimental results show that the proposed method performs well on artery/vein classification and outperforms most existing methods.
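The multi-scale feature extraction module described in the abstract builds on the hierarchical residual connections of Res2Net. The sketch below is a minimal PyTorch illustration of such a hierarchical residual block, assuming the publicly documented Res2Net formulation rather than the authors' code; the class name Res2NetBlock, the channels and scales parameters, and the 1x1 fusion convolution are illustrative assumptions.

import torch
import torch.nn as nn


class Res2NetBlock(nn.Module):
    """Illustrative Res2Net-style hierarchical residual block (not the paper's code).

    The input channels are split into `scales` groups; each group after the
    first is summed with the previous group's output before its own 3x3
    convolution, so multi-scale features are aggregated inside one block.
    """

    def __init__(self, channels: int, scales: int = 4):
        super().__init__()
        assert channels % scales == 0, "channels must be divisible by scales"
        self.scales = scales
        width = channels // scales
        # One 3x3 conv per group except the first, which is passed through.
        self.convs = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(width, width, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(width),
                nn.ReLU(inplace=True),
            )
            for _ in range(scales - 1)
        ])
        # 1x1 convolution to fuse the concatenated multi-scale features.
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        splits = torch.chunk(x, self.scales, dim=1)
        outs = [splits[0]]                  # first group: identity branch
        prev = None
        for i, conv in enumerate(self.convs, start=1):
            inp = splits[i] if prev is None else splits[i] + prev
            prev = conv(inp)                # hierarchical residual aggregation
            outs.append(prev)
        fused = self.fuse(torch.cat(outs, dim=1))
        return fused + x                    # outer residual connection


if __name__ == "__main__":
    block = Res2NetBlock(channels=64, scales=4)
    feats = torch.randn(1, 64, 128, 128)    # e.g. a fundus-image feature map
    print(block(feats).shape)               # torch.Size([1, 64, 128, 128])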
Pages: 1036-1041 (6 pages)
Related papers (50 records)
  • [31] Dilated Multi-scale Fusion for Point Cloud Classification and Segmentation
    Guo, Fan
    Ren, Qingquan
    Tang, Jin
    Li, Zhiyong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (05) : 6069 - 6090
  • [33] FUSION MULTI-SCALE SUPERPIXEL FEATURES FOR CLASSIFICATION OF HYPERSPECTRAL IMAGES
    Li, Shanshan
    Zhang, Bing
    Jia, Xiuping
    Wu, Hua
2016 8TH WORKSHOP ON HYPERSPECTRAL IMAGE AND SIGNAL PROCESSING: EVOLUTION IN REMOTE SENSING (WHISPERS), 2016
  • [34] Fast multi-scale feature fusion for ECG heartbeat classification
    Danni Ai
    Jian Yang
    Zeyu Wang
    Jingfan Fan
    Changbin Ai
    Yongtian Wang
    EURASIP Journal on Advances in Signal Processing, 2015
  • [35] Multi-scale predictions fusion for robust hand detection and classification
    Lu Ding
    Yong Wang
    Robert Laganière
    Xinbin Luo
    Shan Fu
    Multimedia Tools and Applications, 2019, 78 : 35633 - 35650
  • [36] Unbalanced Graph Multi-Scale Fusion Node Classification Method
    Zhang J.
    He X.
    Qi Z.
    Ma C.
    Li J.
    Journal of Shanghai Jiaotong University (Science), 2024, 29 (03) : 557 - 565
  • [37] Object-oriented classification based on multi-scale segmentation and data fusion technique
    Liu, F. J.
    Wu, X. C.
    Mei, L. L.
    Guo, Y.
    Hu, Y. S.
    2008 PROCEEDINGS OF INFORMATION TECHNOLOGY AND ENVIRONMENTAL SYSTEM SCIENCES: ITESS 2008, VOL 4, 2008, : 1058 - 1062
  • [38] Ship fine-grained classification network based on multi-scale feature fusion
    Chen, Lisu
    Wang, Qian
    Zhu, Enyan
    Feng, Daolun
    Wu, Huafeng
    Liu, Tao
    OCEAN ENGINEERING, 2025, 318
  • [39] Classification model for citrus canopy spraying deposition based on multi-scale feature fusion
    Lu J.
    Lin J.
    Deng X.
    Lan Y.
    Qiu H.
    Yang R.
    Chen P.
Nongye Gongcheng Xuebao/Transactions of the Chinese Society of Agricultural Engineering, 2020, 36 (23) : 70 - 76
  • [40] Study on Image Classification Algorithm Based on Multi-Scale Feature Fusion and Domain Adaptation
    Guo, Yu
    Cheng, Ziyi
    Zhang, Yuanlong
    Wang, Gaoxuan
    Zhang, Jundong
APPLIED SCIENCES-BASEL, 2024, 14 (22)