Learning Feature Hierarchies: A Layer-Wise Tag-Embedded Approach

Cited by: 7
|
Authors
Yuan, Zhaoquan [1 ]
Xu, Changsheng [1 ]
Sang, Jitao [1 ]
Yan, Shuicheng [2 ]
Hossain, M. Shamim [3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
[2] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore 119077, Singapore
[3] King Saud Univ, SWE Dept, Coll Comp & Informat Sci, Riyadh 11543, Saudi Arabia
Funding
Beijing Natural Science Foundation; National Research Foundation of Singapore; National Natural Science Foundation of China;
Keywords
Auto-encoder; deep learning; hierarchical feature learning; social tags; CLASSIFICATION; DICTIONARY; MULTIPLE; MODELS;
DOI
10.1109/TMM.2015.2417777
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Feature representation learning is an important and fundamental task in multimedia and pattern recognition research. In this paper, we propose a novel framework to explore the hierarchical structure inside images from the perspective of feature representation learning, which is applied to hierarchical image annotation. Departing from the current trend in multimedia analysis of using pre-defined features or focusing on "flat" end-task representations, we propose a novel layer-wise tag-embedded deep learning (LTDL) model to learn hierarchical features that correspond to hierarchical semantic structures in the tag hierarchy. Unlike most existing deep learning models, LTDL utilizes both the visual content of the image and the hierarchical information of associated social tags. In the training stage, the two kinds of information are fused in a bottom-up way: supervised training and multi-modal fusion alternate layer by layer to learn feature hierarchies. To validate the effectiveness of LTDL, we conduct extensive experiments for hierarchical image annotation on a large-scale public dataset. Experimental results show that the proposed LTDL can learn representative features with improved performance.
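The record gives only the abstract, so the following is a rough illustrative sketch of the general layer-wise idea it describes (training one layer at a time, with each layer's hidden code driven by both a reconstruction objective and a tag-prediction objective), not the paper's actual LTDL algorithm. The network sizes, the squared-error losses, the tanh units, and the `train_layer` helper are all assumptions introduced for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_layer(X, tags, hidden_dim, lr=0.1, epochs=200, alpha=0.5):
    """Train one layer: autoencoder reconstruction loss plus a tag-prediction
    term on the hidden code (a simplified stand-in for layer-wise
    tag-embedded training)."""
    n, d = X.shape
    k = tags.shape[1]
    W = rng.normal(0, 0.1, (d, hidden_dim))   # encoder weights
    V = rng.normal(0, 0.1, (hidden_dim, d))   # decoder weights
    U = rng.normal(0, 0.1, (hidden_dim, k))   # tag-prediction weights
    for _ in range(epochs):
        H = np.tanh(X @ W)                    # hidden code for this layer
        Xr = H @ V                            # reconstruction of the input
        T = H @ U                             # tag scores from the hidden code
        # gradients of the two squared-error terms, weighted by alpha
        dXr = (Xr - X) / n
        dT = alpha * (T - tags) / n
        dH = dXr @ V.T + dT @ U.T
        dpre = dH * (1 - H ** 2)              # tanh derivative
        W -= lr * X.T @ dpre
        V -= lr * H.T @ dXr
        U -= lr * H.T @ dT
    return W

# toy data: two clusters of 8-D points, each cluster carrying its own tag
X = np.vstack([rng.normal(0, 0.3, (20, 8)) + 1,
               rng.normal(0, 0.3, (20, 8)) - 1])
tags = np.vstack([np.tile([1, 0], (20, 1)), np.tile([0, 1], (20, 1))])

# stack two layers: each layer is trained on the previous layer's output,
# so supervision enters at every level rather than only at the top
W1 = train_layer(X, tags, hidden_dim=6)
H1 = np.tanh(X @ W1)
W2 = train_layer(H1, tags, hidden_dim=4)
H2 = np.tanh(H1 @ W2)
print(H2.shape)  # learned features from the second layer
```

In the paper's setting, each level of the tag hierarchy would supervise the corresponding layer (coarse tags for lower layers, finer tags for higher ones); the sketch above reuses one flat tag matrix at both layers purely to keep the example short.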
Pages: 816-827
Page count: 12
Related Papers
50 records in total
  • [21] FedScrap: Layer-Wise Personalized Federated Learning for Scrap Detection
    Zhang, Weidong
    Deng, Dongshang
    Wang, Lidong
    ELECTRONICS, 2024, 13 (03)
  • [22] Layer-wise partitioning and merging for efficient and scalable deep learning
    Akintoye, S. B.
    Han, L.
    Lloyd, H.
    Zhang, X.
    Dancey, D.
    Chen, H.
    Zhang, D.
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 149 : 432 - 444
  • [23] An analytic layer-wise deep learning framework with applications to robotics
    Huu-Thiet Nguyen
    Chien Chern Cheah
    Kar-Ann Toh
    AUTOMATICA, 2022, 135
  • [24] Collaborative Layer-Wise Discriminative Learning in Deep Neural Networks
    Jin, Xiaojie
    Chen, Yunpeng
    Dong, Jian
    Feng, Jiashi
    Yan, Shuicheng
    COMPUTER VISION - ECCV 2016, PT VII, 2016, 9911 : 733 - 749
  • [25] Layer-Wise Adaptive Weighting for Faster Convergence in Federated Learning
    Lanjewar, Vedant S.
    Tran, Hai-Anh
    Tran, Truong X.
    2024 IEEE INTERNATIONAL CONFERENCE ON INFORMATION REUSE AND INTEGRATION FOR DATA SCIENCE, IRI 2024, 2024, : 126 - 131
  • [26] Layer-Wise Adaptive Model Aggregation for Scalable Federated Learning
    Lee, Sunwoo
    Zhang, Tuo
    Avestimehr, Salman
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 7, 2023, : 8491 - 8499
  • [27] Layer-Wise Personalized Federated Learning for Mobile Traffic Prediction
    Lee, Seungyeol
    Sung, Jihoon
    Shin, Myung-Ki
    IEEE ACCESS, 2024, 12 : 53126 - 53140
  • [28] Enriching Variety of Layer-wise Learning Information by Gradient Combination
    Wang, Chien-Yao
    Liao, Hong-Yuan Mark
    Chen, Ping-Yang
    Hsieh, Jun-Wei
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW), 2019, : 2477 - 2484
  • [29] Layer-wise Adversarial Training Approach to Improve Adversarial Robustness
    Chen, Xiaoyi
    Zhang, Ni
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [30] LAYER-WISE APPROACH FOR THE BIFURCATION PROBLEM IN LAMINATED COMPOSITES WITH DELAMINATIONS
    LEE, JH
    GURDAL, Z
    GRIFFIN, OH
    AIAA JOURNAL, 1993, 31 (02) : 331 - 338