Encoding Hierarchical Information in Neural Networks Helps in Subpopulation Shift

Cited by: 3
Authors
Mukherjee A. [1 ]
Garg I. [1 ]
Roy K. [1 ]
Affiliations
[1] Purdue University, Electrical and Computer Engineering, West Lafayette, 47907-2050, IN
Source
IEEE Transactions on Artificial Intelligence
Keywords
Catastrophic mispredictions; hierarchical learning; representation learning; subpopulation shift
DOI
10.1109/TAI.2023.3261861
Abstract
Over the past decade, deep neural networks have proven adept at image classification tasks, often surpassing humans in accuracy. However, standard neural networks often fail to capture hierarchical structures and dependencies among classes in vision tasks. Humans, on the other hand, seem to learn categories conceptually, progressing from high-level concepts down to granular levels of categories. One issue arising from the inability of neural networks to encode such dependencies in their learned structure is subpopulation shift, where models are queried with novel, unseen classes drawn from a shifted population of the training-set categories. Because a standard network treats each class as independent of all others, it struggles to categorize shifting populations that are related at higher levels of the hierarchy. In this work, we study these problems through the lens of a novel conditional supervised training framework. We tackle subpopulation shift with a structured learning procedure that incorporates hierarchical information conditionally through labels. Furthermore, we introduce a notion of hierarchical distance to model the catastrophic effect of mispredictions. We show that learning in this structured hierarchical manner yields networks that are more robust against subpopulation shift, with improvements of up to 3% in accuracy and up to 11% in hierarchical distance over standard models on subpopulation shift benchmarks. © 2023 IEEE.
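The abstract does not give the paper's exact definition of hierarchical distance, but a common way to formalize it is as the number of tree edges between the true and predicted leaf classes in a class taxonomy, so that mispredictions within a superclass cost less than mispredictions across superclasses. The sketch below assumes that tree-edge definition and a hypothetical toy taxonomy; both are illustrative, not the paper's actual metric or label hierarchy.

```python
# Minimal sketch of a hierarchical misprediction distance, assuming the
# metric counts taxonomy-tree edges between the true and predicted
# leaf classes (the paper's exact formulation may differ).

# Hypothetical toy taxonomy, stored as child -> parent links.
PARENT = {
    "cat": "feline", "tiger": "feline",
    "dog": "canine", "wolf": "canine",
    "feline": "animal", "canine": "animal",
}

def path_to_root(label):
    """Return the list of nodes from a label up to the taxonomy root."""
    path = [label]
    while path[-1] in PARENT:
        path.append(PARENT[path[-1]])
    return path

def hierarchical_distance(true_label, predicted_label):
    """Sum of edges from each leaf up to their lowest common ancestor."""
    true_path = path_to_root(true_label)
    pred_path = path_to_root(predicted_label)
    ancestors = set(true_path)
    for depth, node in enumerate(pred_path):
        if node in ancestors:
            # depth edges on the predicted side, index(node) on the true side.
            return depth + true_path.index(node)
    # Labels from disjoint trees; treat both full paths as the distance.
    return len(true_path) + len(pred_path)

print(hierarchical_distance("cat", "cat"))    # 0: correct prediction
print(hierarchical_distance("cat", "tiger"))  # 2: wrong leaf, same superclass
print(hierarchical_distance("cat", "wolf"))   # 4: crosses superclasses
```

Under this toy metric, a model that confuses a cat with a tiger incurs half the penalty of one that confuses a cat with a wolf, which is the intuition behind calling distant mispredictions "catastrophic."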
Pages: 827–838 (11 pages)
Related Papers (50 total)
  • [31] Information capacity of a hierarchical neural network
    Dominguez, DRC
    PHYSICAL REVIEW E, 1998, 58 (04): : 4811 - 4815
  • [33] Hierarchical Graph Neural Network: A Lightweight Image Matching Model with Enhanced Message Passing of Local and Global Information in Hierarchical Graph Neural Networks
    Gyamfi, Enoch Opanin
    Qin, Zhiguang
    Danso, Juliana Mantebea
    Adu-Gyamfi, Daniel
    INFORMATION, 2024, 15 (10)
  • [34] Application of Hierarchical Encoding Scheme in Distribution Networks Reconfiguration
    Wen, Juan
    Tan, Yang-hong
    Jiang, Lin
    ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2019, 44 (03) : 2295 - 2304
  • [36] Hierarchical neural networks with exponential storage
    Willcox, C.R.
    Neural Networks, 1988, 1 (1 SUPPL)
  • [37] High Learning Hierarchical Neural Networks
    Bobrowski, Leon
    COMPUTATIONAL COLLECTIVE INTELLIGENCE, PT II, ICCCI 2024, 2024, 14811 : 295 - 304
  • [38] Hierarchical evolution of heterogeneous neural networks
    Weingaertner, D
    Tatai, VK
    Gudwin, RR
    Von Zuben, FJ
    CEC'02: PROCEEDINGS OF THE 2002 CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1 AND 2, 2002, : 1775 - 1780
  • [39] Hierarchical neural networks for pixel classification
    Schouten, TE
    Liu, ZK
    Feng, L
    Gu, JJ
    IMAGE AND SIGNAL PROCESSING FOR REMOTE SENSING VI, 2001, 4170 : 57 - 64
  • [40] Bayesian Hierarchical Convolutional Neural Networks
    Bensen, Alexis
    Kahana, Adam
    Woods, Zerotti
    ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING FOR MULTI-DOMAIN OPERATIONS APPLICATIONS V, 2023, 12538