Strongly Constrained Discrete Hashing

Cited by: 44
Authors
Chen, Yong [1 ]
Tian, Zhibao [2 ]
Zhang, Hui [2 ]
Wang, Jun [3 ]
Zhang, Dell [4 ,5 ]
Affiliations
[1] Peking Univ, Sch EECS, Key Lab Machine Percept, Beijing 100871, Peoples R China
[2] Beihang Univ, Dept Comp Sci & Engn, Beijing 100191, Peoples R China
[3] UCL, Dept Comp Sci, London WC1E 6BT, England
[4] Birkbeck Univ London, Dept Comp Sci & Informat Syst, London WC1E 7HX, England
[5] Blue Prism AI Labs, London WC2B 6NH, England
Funding
China Postdoctoral Science Foundation
Keywords
Learning to hash; image retrieval; discrete optimization; iterative quantization; Procrustean approach
DOI
10.1109/TIP.2020.2963952
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Learning to hash is a fundamental technique widely used in large-scale image retrieval. Most existing methods address the underlying discrete optimization problem through continuous relaxation of the binary constraint, which usually leads to large quantization errors and consequently suboptimal binary codes. A few discrete hashing methods have emerged recently, but they either ignore useful constraints altogether (specifically the balance and decorrelation of hash bits) or merely turn those constraints into regularizers, which makes the optimization easier but less accurate. In this paper, we propose a novel supervised hashing method named Strongly Constrained Discrete Hashing (SCDH) that overcomes these limitations. It learns the binary codes for all examples in the training set while also producing a hash function for unseen samples, with the above-mentioned constraints preserved. Although the SCDH model is fairly sophisticated, we find closed-form solutions to all of its optimization subproblems and thus design an efficient algorithm that converges quickly. In addition, we extend SCDH to a kernelized version, SCDH_K. Our experiments on three large benchmark datasets demonstrate that SCDH and SCDH_K not only achieve substantially higher MAP scores than state-of-the-art baselines but also train much faster than the other supervised methods.
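
To make the abstract's two "strong" constraints concrete: bit balance requires each hash bit to take the value +1 on half of the training examples (B^T 1 = 0), and bit decorrelation requires different bits to be mutually uncorrelated (B^T B = nI). The Python sketch below is an illustration on assumed toy data, not the authors' SCDH implementation; it sign-thresholds a random projection and measures the quantization error and constraint violations that a relaxation-based method would incur.

import numpy as np

# Illustrative sketch only (assumed toy data; not the authors' SCDH code).
rng = np.random.default_rng(0)
n, d, r = 1000, 64, 16            # training examples, feature dim, hash bits

X = rng.standard_normal((n, d))   # toy features
W = rng.standard_normal((d, r))   # toy projection; a real method learns W

Y = X @ W                         # continuous (relaxed) codes
B = np.sign(Y)                    # discrete codes in {-1, +1}^(n x r)
B[B == 0] = 1                     # resolve the sign(0) edge case

# Quantization error introduced by the relaxation: ||B - Y||_F^2
print("quantization error:", float(np.linalg.norm(B - Y) ** 2))

# Balance constraint B^T 1 = 0: each bit should split the data evenly.
print("mean |bit imbalance|:", float(np.abs(B.sum(axis=0)).mean()))

# Decorrelation constraint B^T B = n I: bits should be uncorrelated.
print("decorrelation gap:", float(np.linalg.norm(B.T @ B / n - np.eye(r))))

Per the abstract, SCDH instead keeps the codes binary throughout the optimization and preserves both constraints exactly, rather than penalizing their violation.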
Pages: 3596-3611
Page count: 16
Related Papers
50 items in total
  • [1] Locality-constrained discrete graph hashing
    Ying, Wenjie
    Sang, Jitao
    Yu, Jian
    NEUROCOMPUTING, 2020, 398: 566-573
  • [2] Merkle hash tree improved strongly constrained discrete hashing function-based authentication scheme for enabling security for smart home IoT applications
    Sudha, K. Swapna
    Jeyanthi, N.
    PEER-TO-PEER NETWORKING AND APPLICATIONS, 2023, 16 (05): 2367-2379
  • [3] Supervised Discrete Hashing
    Shen, Fumin
    Shen, Chunhua
    Liu, Wei
    Shen, Heng Tao
    2015 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2015: 37-45
  • [4] Nonlinear Discrete Hashing
    Chen, Zhixiang
    Lu, Jiwen
    Feng, Jianjiang
    Zhou, Jie
    IEEE TRANSACTIONS ON MULTIMEDIA, 2017, 19 (01): 123-135
  • [5] Discrete Graph Hashing
    Liu, Wei
    Mu, Cun
    Kumar, Sanjiv
    Chang, Shih-Fu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [6] Strongly Universal String Hashing is Fast
    Lemire, Daniel
    Kaser, Owen
    COMPUTER JOURNAL, 2014, 57 (11): 1624-1638
  • [7] Discrete Hashing With Multiple Supervision
    Luo, Xin
    Zhang, Peng-Fei
    Huang, Zi
    Nie, Liqiang
    Xu, Xin-Shun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28 (06): 2962-2975
  • [8] Asymmetric Discrete Graph Hashing
    Shi, Xiaoshuang
    Xing, Fuyong
    Xu, Kaidi
    Sapkota, Manish
    Yang, Lin
    THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2541-2547
  • [9] Supervised Discrete Hashing With Relaxation
    Gui, Jie
    Liu, Tongliang
    Sun, Zhenan
    Tao, Dacheng
    Tan, Tieniu
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (03): 608-617