Heterogeneous Representation Learning with Structured Sparsity Regularization

Cited by: 0
Authors
Yang, Pei [1]
He, Jingrui [1]
Affiliation
[1] Arizona State Univ, Tempe, AZ 85281 USA
Funding
US National Science Foundation
Keywords
SELECTION; REGRESSION
DOI
10.1109/ICDM.2016.67
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Motivated by real applications, heterogeneous learning, which aims to model the co-existence of multiple types of heterogeneity, has emerged as an important research area. In this paper, we propose a HEterogeneous REpresentation learning model with structured Sparsity regularization (HERES) to learn from multiple types of heterogeneity. HERES leverages two kinds of information to build a robust learning system. One is the rich correlations among heterogeneous data, such as task relatedness, view consistency, and label correlation. The other is prior knowledge about the data, e.g., a soft clustering of the tasks. HERES is a generic framework for heterogeneous learning that integrates multi-task, multi-view, and multi-label learning into a principled framework based on representation learning. The objective of HERES is to minimize the reconstruction loss of using the factor matrices to recover the input matrix for heterogeneous data, regularized by a structured sparsity constraint. The resulting optimization problem is challenging due to the non-smoothness and non-separability of structured sparsity. We develop an iterative updating method to solve the problem. Furthermore, we prove that the reformulation of structured sparsity is separable, which leads to a family of efficient and scalable algorithms for solving structured-sparsity-penalized problems. Experimental results in comparison with state-of-the-art methods demonstrate the effectiveness of the proposed approach.
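For intuition, the objective described in the abstract can be sketched in a generic form. The notation below (data matrix X, factor matrices U and V, group index set \mathcal{G}, and weight \lambda) is assumed for illustration only; the exact HERES formulation appears in the paper itself, not in this record:

% illustrative sketch only: X, U, V, \mathcal{G}, and \lambda are assumed notation, not the paper's
\min_{U,\,V} \; \underbrace{\lVert X - U V \rVert_F^2}_{\text{reconstruction loss}} \;+\; \lambda \sum_{g \in \mathcal{G}} \lVert V_g \rVert_2

Here the group norm stands in for the structured sparsity regularizer; its non-smoothness and non-separability are what make the problem challenging and motivate the iterative updating method and the separable reformulation mentioned in the abstract.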
Pages: 539 - 548
Page count: 10