SCED: A General Framework for Sparse Tensor Decomposition with Constraints and Elementwise Dynamic Learning

Cited by: 3
Authors
Zhou, Shuo [1]
Erfani, Sarah M. [1]
Bailey, James [1]
Affiliations
[1] Univ Melbourne, Sch Comp & Informat Syst, Melbourne, Vic, Australia
Source
2017 17TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2017
DOI
10.1109/ICDM.2017.77
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
CANDECOMP/PARAFAC Decomposition (CPD) is one of the most popular tensor decomposition methods, extensively studied and widely applied. In recent years, sparse tensors, which contain a large proportion of zeros and only a limited number of non-zeros, have attracted increasing interest. Existing techniques are not directly applicable to sparse tensors, since they mainly target dense ones and are usually inefficient. Additional issues also arise for sparse tensors, depending on the data source and application: the role of zero entries can differ; constraints such as non-negativity and sparseness may need to be incorporated; and the ability to learn on the fly is essential in dynamic scenarios where new data arrives at high velocity. However, state-of-the-art algorithms address these issues only partially. To fill this gap, we propose a general framework for finding the CPD of sparse tensors. By modeling sparse tensor decomposition as a generalized weighted CPD problem and solving it efficiently, our method is also flexible enough to handle constraints and dynamic data streams. In experiments on both synthetic and real-world datasets, our method demonstrates significant improvements in effectiveness, efficiency, and scalability in the static case. Moreover, in the dynamic setting, it speeds up current technology by hundreds to thousands of times without sacrificing decomposition quality.
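This record does not reproduce the paper's model, but the "generalized weighted CPD formulation" named in the abstract can be sketched in standard notation. For a third-order tensor X of size I x J x K and rank R, CPD seeks factor matrices A (I x R), B (J x R), and C (K x R) whose sum of rank-one outer products approximates X. The objective below is a conventional weighted-CPD illustration, assuming an elementwise weight tensor W that is not given in this record; the paper's exact SCED objective may differ.

    % Minimal weighted-CPD sketch (assumed notation, not the authors' exact model).
    % [[A, B, C]] denotes \sum_{r=1}^{R} a_r \circ b_r \circ c_r, the CP reconstruction;
    % \ast is the elementwise (Hadamard) product, and W is an assumed weight tensor
    % with w_{ijk} = 1 on non-zeros and 0 \le w_{ijk} \le 1 on zeros.
    \min_{A, B, C} \; \bigl\| \mathcal{W} \ast \bigl( \mathcal{X} - [\![ A, B, C ]\!] \bigr) \bigr\|_F^2

Setting the zero-entry weights below 1 reflects the abstract's point that the role of zero entries can differ across applications. Constraints such as non-negativity (A, B, C >= 0) or an l1 penalty on the factors (sparseness) would be added to this objective, and the dynamic setting corresponds to updating A, B, C incrementally as new entries arrive rather than re-solving from scratch.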
Pages: 675-684
Page count: 10