Multimodal Data Fusion Using Non-Sparse Multi-Kernel Learning With Regularized Label Softening

Cited by: 7
Authors
Wang, Peihua [1 ]
Qiu, Chengyu [1 ]
Wang, Jiali [1 ]
Wang, Yulong [1 ]
Tang, Jiaxi [1 ]
Huang, Bin [1 ]
Su, Jian [2 ]
Zhang, Yuanpeng [1 ]
Affiliations
[1] Nantong Univ, Dept Med Informat, Nantong 226001, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing 210044, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Kernel; Matrix decomposition; Data integration; Softening; Fitting; Task analysis; Training; Label softening; manifold learning; multi-Kernel learning; remote sensing; semantic-based multimodal fusion; REGRESSION; CLASSIFICATION; FRAMEWORK;
DOI
10.1109/JSTARS.2021.3087738
CLC Classification
TM [Electrotechnics]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Due to the needs of practical applications, multiple sensors are often used for data acquisition so as to obtain a multimodal description of the same object. How to effectively fuse multimodal data has therefore become a challenging problem in many scenarios, including remote sensing. Non-sparse multi-kernel learning has achieved many successful applications in multimodal data fusion because it makes full use of all the base kernels. However, most existing models assume that the non-sparse combination of multiple kernels should come arbitrarily close to a strict binary label matrix during training; this assumption is so strict that label fitting has very little freedom. To address this issue, in this article we develop a novel non-sparse multi-kernel model for multimodal data fusion. Specifically, we introduce a label softening strategy that softens the binary label matrix and thus gives label fitting more freedom, and we add a regularization term based on manifold learning to counter the overfitting caused by label softening. Experimental results on one synthetic dataset, several UCI multimodal datasets, and one multimodal remote sensing dataset demonstrate the promising performance of the proposed model.
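For intuition about the label-softening idea described in the abstract, the following is a minimal sketch, not the authors' exact model: it assumes a uniform, non-sparse combination of RBF base kernels standing in for the learned kernel weights, an epsilon-dragging style softening of the one-hot label matrix, and a plain kernel ridge fit, and it omits the manifold-learning regularizer. The function names (rbf_kernel, fit_mkl_label_softening) and all parameter values are hypothetical choices for illustration.

# A minimal sketch of non-sparse multi-kernel learning with label softening.
# Assumptions (not from the paper): uniform kernel weights stand in for the
# learned non-sparse combination, epsilon-dragging style softening, kernel
# ridge fitting, and no manifold regularizer.
import numpy as np

def rbf_kernel(X, gamma):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def fit_mkl_label_softening(X, y, gammas=(0.1, 1.0, 10.0), lam=1.0, n_iter=20):
    n, c = X.shape[0], int(y.max()) + 1
    # Non-sparse combination: every base kernel keeps a nonzero (here uniform) weight.
    K = sum(rbf_kernel(X, g) for g in gammas) / len(gammas)
    Y = np.eye(c)[y]                      # strict binary (one-hot) label matrix
    B = np.where(Y > 0, 1.0, -1.0)        # dragging directions: +1 true class, -1 others
    M = np.zeros_like(Y)                  # nonnegative softening offsets
    for _ in range(n_iter):
        T = Y + B * M                     # softened targets give label fitting more freedom
        A = np.linalg.solve(K + lam * np.eye(n), T)   # kernel ridge step
        P = K @ A                         # current predictions on the training set
        M = np.maximum(0.0, B * (P - Y))  # closed-form update of the softening matrix
    return A, K

# Toy usage on two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
A, K = fit_mkl_label_softening(X, y)
print("training accuracy:", ((K @ A).argmax(axis=1) == y).mean())

The alternating update keeps the nonnegative softening matrix M in closed form, so each target entry can drift away from the strict 0/1 value in the direction that eases fitting, which is the extra freedom the abstract refers to.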
Pages: 6244-6252
Page count: 9
Related Papers
50 records in total
  • [1] A non-sparse multi-kernel learning method based on primal problem
    School of Computer, State Key Laboratory of Software Engineering, Wuhan University, Wuhan, Hubei 430072, China; [affiliation not specified], Guangxi 541004, China
    Huanan Ligong Daxue Xuebao, 5 (78-85)
  • [2] Multi-Modality Fusion & Inductive Knowledge Transfer Underlying Non-Sparse Multi-Kernel Learning and Distribution Adaption
    Zhang, Yuanpeng
    Xia, Kaijian
    Jiang, Yizhang
    Qian, Pengjiang
    Cai, Weiwei
    Qiu, Chengyu
    Lai, Khin Wee
    Wu, Dongrui
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2023, 20 (04) : 2387 - 2397
  • [3] Sparse and Non-Sparse Multiple Kernel Learning for Recognition
    Alioscha-Perez, Mitchel
    Sahli, Hichem
    Gonzalez, Isabel
    Taboada-Crispi, Alberto
    COMPUTACION Y SISTEMAS, 2012, 16 (02): : 167 - 174
  • [4] Multi-task Learning via Non-sparse Multiple Kernel Learning
    Samek, Wojciech
    Binder, Alexander
    Kawanabe, Motoaki
    COMPUTER ANALYSIS OF IMAGES AND PATTERNS: 14TH INTERNATIONAL CONFERENCE, CAIP 2011, PT I, 2011, 6854 : 335 - 342
  • [5] Learning rates of multi-kernel regularized regression
    Chen, Hong
    Li, Luoqing
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2010, 140 (09) : 2562 - 2568
  • [6] Multi-Kernel Learning for Heterogeneous Data
    Liao, Chunlan
    Peng, Shili
    IEEE ACCESS, 2025, 13 : 45340 - 45349
  • [7] Multi-kernel partial label learning using graph contrast disambiguation
    Li, Hongyan
    Wan, Zhonglin
    Vong, Chi Man
    APPLIED INTELLIGENCE, 2024, 54 (20) : 9760 - 9782
  • [8] Non-Sparse Multiple Kernel Learning for Fisher Discriminant Analysis
    Yan, Fei
    Kittler, Josef
    Mikolajczyk, Krystian
    Tahir, Atif
    2009 9TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, 2009, : 1064 - 1069
  • [9] Non-sparse label specific features selection for multi-label classification
    Weng, Wei
    Chen, Yan-Nan
    Chen, Chin-Ling
    Wu, Shun-Xiang
    Liu, Jing-Hua
    NEUROCOMPUTING, 2020, 377 : 85 - 94
  • [10] Feature selection and multi-kernel learning for sparse representation on a manifold
    Wang, Jim Jing-Yan
    Bensmail, Halima
    Gao, Xin
    NEURAL NETWORKS, 2014, 51 : 9 - 16