Encoding Hierarchical Information in Neural Networks Helps in Subpopulation Shift

Cited by: 3
Authors
Mukherjee A. [1 ]
Garg I. [1 ]
Roy K. [1 ]
Institutions
[1] Purdue University, Electrical and Computer Engineering, West Lafayette, IN 47907-2050
Keywords
Catastrophic mispredictions; hierarchical learning; representation learning; subpopulation shift
DOI
10.1109/TAI.2023.3261861
Abstract
Over the past decade, deep neural networks have proven adept at image classification, often surpassing humans in accuracy. However, standard neural networks often fail to capture the hierarchical structure of, and dependencies among, different classes in vision tasks. Humans, on the other hand, seem to learn categories conceptually, progressing from high-level concepts down to granular subcategories. One consequence of a network's inability to encode such dependencies in its learned structure is subpopulation shift, where models are queried with novel, unseen classes drawn from a shifted population of the training categories. Because the network treats each class as independent of all others, it struggles to categorize shifted populations that are related at higher levels of the hierarchy. In this work, we study these problems through the lens of a novel conditional supervised training framework. We tackle subpopulation shift with a structured learning procedure that incorporates hierarchical information conditionally through labels. Furthermore, we introduce a notion of hierarchical distance to model the catastrophic effect of mispredictions. We show that learning in this structured hierarchical manner yields networks that are more robust to subpopulation shift, improving on standard models by up to 3% in accuracy and up to 11% in hierarchical distance on subpopulation shift benchmarks. © 2023 IEEE.
Pages: 827-838
Number of pages: 11
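The abstract's "hierarchical distance" between a prediction and the true label is not defined in this record; one plausible formulation is the tree distance between the two classes in a class taxonomy, i.e., the number of edges from the predicted leaf up to the lowest common ancestor and down to the true leaf. A minimal sketch under that assumption (the taxonomy, class names, and distance definition below are illustrative, not the paper's actual formulation):

```python
# Hypothetical class taxonomy, mapping each node to its parent (root -> None).
parent = {
    "dog": "mammal", "cat": "mammal",
    "eagle": "bird", "sparrow": "bird",
    "mammal": "animal", "bird": "animal",
    "animal": None,
}

def path_to_root(node):
    """Return the chain of nodes from `node` up to the root, inclusive."""
    path = []
    while node is not None:
        path.append(node)
        node = parent[node]
    return path

def hierarchical_distance(pred, true):
    """Tree distance: edges from `pred` up to the lowest common
    ancestor of the two classes, plus edges down to `true`."""
    pred_path, true_path = path_to_root(pred), path_to_root(true)
    true_ancestors = set(true_path)
    # Walk up from `pred`; the first node also on `true`'s path is the LCA.
    for up_steps, node in enumerate(pred_path):
        if node in true_ancestors:
            return up_steps + true_path.index(node)
    raise ValueError("classes share no common ancestor")

print(hierarchical_distance("dog", "cat"))    # siblings: prints 2
print(hierarchical_distance("dog", "eagle"))  # cross-branch: prints 4
```

Under this metric a misprediction to a sibling class (dog vs. cat) is "cheap," while one that crosses a high-level branch (dog vs. eagle) is penalized more, which matches the abstract's idea of modeling how catastrophic a misprediction is.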