Unsupervised Deep Cross-Modal Hashing by Knowledge Distillation for Large-scale Cross-modal Retrieval

Cited by: 18
Authors:
Li, Mingyong [1 ,2 ]
Wang, Hongya [1 ,3 ]
Affiliations:
[1] Donghua Univ, Coll Comp Sci & Technol, Shanghai, Peoples R China
[2] Chongqing Normal Univ, Coll Comp & Informat Sci, Chongqing, Peoples R China
[3] Shanghai Key Lab Comp Software Evaluating & Testi, Shanghai, Peoples R China
Keywords:
cross-modal hashing; unsupervised learning; knowledge distillation; cross-modal retrieval;
DOI:
10.1145/3460426.3463626
CLC Number: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
Cross-modal hashing (CMH) maps heterogeneous multi-modal data into compact binary codes, enabling fast and flexible retrieval across modalities, especially at large scale. Because it does not require extensive manual annotation, unsupervised cross-modal hashing has broader application prospects than supervised methods. However, existing unsupervised methods struggle to achieve satisfactory performance due to the lack of reliable supervisory information. To address this problem, inspired by knowledge distillation, we propose a novel unsupervised Knowledge Distillation Cross-Modal Hashing method (KDCMH), which uses similarity information distilled by an unsupervised method to guide a supervised method. Specifically, the teacher model adopts an unsupervised distribution-based similarity hashing method that constructs a modal-fusion similarity matrix. Then, supervised by the teacher model's distilled information, the student model generates more discriminative hash codes. Extensive experiments on two public datasets, NUS-WIDE and MIRFLICKR-25K, demonstrate that KDCMH achieves significant improvements over several representative unsupervised cross-modal hashing methods.
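The abstract outlines a two-stage pipeline: a teacher fuses per-modality similarities into one matrix, and a student is trained so that its hash-code similarities match that matrix. A minimal NumPy sketch of this idea, assuming cosine similarity per modality, a simple weighted fusion (weight `alpha`), and an MSE distillation loss; these choices are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def cosine_similarity(feats):
    """Pairwise cosine similarity of row-wise feature vectors."""
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def fused_similarity(img_feats, txt_feats, alpha=0.5):
    """Teacher signal: weighted fusion of per-modality similarity matrices.

    The fusion rule (convex combination with weight alpha) is an assumption
    for illustration; KDCMH's distribution-based construction may differ.
    """
    s_img = cosine_similarity(img_feats)
    s_txt = cosine_similarity(txt_feats)
    return alpha * s_img + (1 - alpha) * s_txt

def distillation_loss(hash_codes, teacher_sim):
    """MSE between the student's code similarity and the teacher matrix.

    hash_codes: (n, k) real-valued relaxations in [-1, 1]; dividing the
    inner product by k keeps the student similarity in [-1, 1] as well.
    """
    k = hash_codes.shape[1]
    student_sim = hash_codes @ hash_codes.T / k
    return np.mean((student_sim - teacher_sim) ** 2)

# Toy example: 8 items, hypothetical 16-d image and 32-d text features.
rng = np.random.default_rng(0)
img = rng.normal(size=(8, 16))
txt = rng.normal(size=(8, 32))
S = fused_similarity(img, txt)
codes = np.sign(rng.normal(size=(8, 4)))  # stand-in 4-bit student codes
loss = distillation_loss(codes, S)
```

In a full training loop the loss would be minimized over a neural student network's relaxed codes; the sketch only shows the shape of the supervisory signal the teacher provides.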
Pages: 183-191
Page count: 9
Related Papers (50 total)
  • [41] Deep noise mitigation and semantic reconstruction hashing for unsupervised cross-modal retrieval
    Zhang, Cheng
    Wan, Yuan
    Qiang, Haopeng
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (10) : 5383 - 5397
  • [42] Coupled CycleGAN: Unsupervised Hashing Network for Cross-Modal Retrieval
    Li, Chao
    Deng, Cheng
    Wang, Lei
    Xie, De
    Liu, Xianglong
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 176 - 183
  • [43] Revising similarity relationship hashing for unsupervised cross-modal retrieval
    Wu, You
    Li, Bo
    Li, Zhixin
    NEUROCOMPUTING, 2025, 614
  • [45] Deep Semantic-Preserving Reconstruction Hashing for Unsupervised Cross-Modal Retrieval
    Cheng, Shuli
    Wang, Liejun
    Du, Anyu
    ENTROPY, 2020, 22 (11) : 1 - 22
  • [46] Pseudo-label driven deep hashing for unsupervised cross-modal retrieval
    Zeng, XianHua
    Xu, Ke
    Xie, YiCai
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 : 3437 - 3456
  • [47] Learning From Expert: Vision-Language Knowledge Distillation for Unsupervised Cross-Modal Hashing Retrieval
    Sun, Lina
    Li, Yewen
    Dong, Yumin
    PROCEEDINGS OF THE 2023 ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2023, 2023, : 499 - 507
  • [48] Deep Cross-Modal Proxy Hashing
    Tu, Rong-Cheng
    Mao, Xian-Ling
    Tu, Rong-Xin
    Bian, Binbin
    Cai, Chengfei
    Wang, Hongfa
    Wei, Wei
    Huang, Heyan
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (07) : 6798 - 6810
  • [49] Semantic deep cross-modal hashing
    Lin, Qiubin
    Cao, Wenming
    He, Zhihai
    He, Zhiquan
    NEUROCOMPUTING, 2020, 396 : 113 - 122
  • [50] Deep Lifelong Cross-Modal Hashing
    Xu, Liming
    Li, Hanqi
    Zheng, Bochuan
    Li, Weisheng
    Lv, Jiancheng
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (12) : 13478 - 13493