Encoding Hierarchical Information in Neural Networks Helps in Subpopulation Shift

Cited by: 3
Authors:
Mukherjee A. [1]
Garg I. [1]
Roy K. [1]
Affiliations:
[1] Purdue University, Electrical and Computer Engineering, West Lafayette, IN 47907-2050
Keywords: Catastrophic mispredictions; hierarchical learning; representation learning; subpopulation shift
DOI: 10.1109/TAI.2023.3261861
Abstract
Over the past decade, deep neural networks have proven adept at image classification, often surpassing humans in accuracy. However, standard neural networks often fail to capture the hierarchical structure and dependencies among classes in vision tasks. Humans, on the other hand, seem to learn categories conceptually, progressing from high-level concepts down to granular categories. One issue arising from the inability of neural networks to encode such dependencies in their learned structure is subpopulation shift, where models are queried with novel unseen classes drawn from a shifted population of the training-set categories. Because a standard network treats each class as independent of all others, it struggles to categorize shifting populations that are related at higher levels of the hierarchy. In this work, we study these problems through the lens of a novel conditional supervised training framework. We tackle subpopulation shift with a structured learning procedure that incorporates hierarchical information conditionally through labels. Furthermore, we introduce a notion of hierarchical distance to model the catastrophic effect of mispredictions. We show that learning in this structured hierarchical manner yields networks that are more robust to subpopulation shift, with improvements of up to 3% in accuracy and up to 11% in hierarchical distance over standard models on subpopulation shift benchmarks. © 2023 IEEE.
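The abstract's "hierarchical distance" measures how semantically severe a misprediction is: confusing two dog breeds should cost less than confusing a dog for a truck. The paper's exact definition is not reproduced in this entry; a common formulation (assumed here) is the tree distance between the true and predicted leaf classes through their lowest common ancestor in the class taxonomy. A minimal sketch over a hypothetical toy hierarchy:

```python
# Sketch of a hierarchical misprediction distance (assumed LCA tree
# distance, not necessarily the paper's exact metric).

# Hypothetical toy class taxonomy: child -> parent.
PARENT = {
    "husky": "dog", "beagle": "dog",
    "tabby": "cat", "siamese": "cat",
    "sedan": "truckless", "dog": "animal", "cat": "animal",
    "animal": "root", "truckless": "root",
}

def ancestors(node):
    """Path from a node up to the root, inclusive, in order."""
    path = [node]
    while node in PARENT:
        node = PARENT[node]
        path.append(node)
    return path

def hierarchical_distance(true_cls, pred_cls):
    """Number of tree edges from true_cls to pred_cls via their
    lowest common ancestor: 0 for a correct prediction, and larger
    the further apart the classes sit in the taxonomy."""
    seen = {n: depth for depth, n in enumerate(ancestors(true_cls))}
    steps, node = 0, pred_cls
    while node not in seen:  # climb until we hit the true class's path
        node = PARENT[node]
        steps += 1
    return steps + seen[node]
```

Under this metric, `hierarchical_distance("husky", "beagle")` is 2 (both children of `dog`), while `hierarchical_distance("husky", "sedan")` is 5, matching the intuition that cross-branch errors are more catastrophic. Averaging this quantity over a test set gives one way to read the paper's reported "up to 11%" improvement in hierarchical distance.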
Pages: 827-838 (11 pages)