SemiH: DFT Hamiltonian neural network training with semi-supervised learning

Cited by: 0
Authors
Cho, Yucheol [1 ]
Choi, Guenseok [1 ]
Ham, Gyeongdo [1 ]
Shin, Mincheol [1 ]
Kim, Daeshik [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol KAIST, Sch Elect Engn, Daejeon 34141, South Korea
Keywords
density functional theory; neural network Hamiltonian; message-passing neural network; semi-supervised learning; unlabeled data; pseudo Hamiltonian; graph neural network; DENSITY-FUNCTIONAL THEORY; ELECTRONIC-STRUCTURE;
DOI
10.1088/2632-2153/ad7227
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Over the past decades, density functional theory (DFT) calculations have been used in various fields such as materials science and semiconductor devices. However, because DFT rigorously accounts for the interactions between atoms, its calculations carry a significant computational cost. To address this, extensive research has recently focused on training neural networks to replace DFT calculations. Previous approaches, however, require a large number of DFT simulations to obtain ground-truth Hamiltonians, and when only a limited amount of training data is available, deep learning models show increased errors in the Hamiltonians and band structures they predict for test data. This raises the risk of inaccurate physical interpretations, including the emergence of unphysical branches in band structures. To tackle this challenge, we propose a novel deep learning-based method for calculating DFT Hamiltonians that is specifically designed to produce accurate results from limited training data. Our framework not only applies supervised learning to calculated Hamiltonians but also generates pseudo Hamiltonians (targets for unlabeled data) and trains the neural network on unlabeled data. To our knowledge, this use of unlabeled data is the first such attempt in the field of neural network Hamiltonians. Our experiments demonstrate the superior performance of our framework over the state-of-the-art approach across various datasets, including MoS2, Bi2Te3, HfO2, and InGaAs. Moreover, the framework shows enhanced generalization by effectively exploiting unlabeled data, achieving noteworthy results on data more complex than the training set, such as configurations with more atoms and temperatures outside the training range.
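To make the training scheme described in the abstract concrete, the following is a minimal sketch of pseudo-label-based semi-supervised training for a Hamiltonian-predicting network, written with PyTorch. It only illustrates the general idea (supervised loss on DFT Hamiltonians plus a loss against pseudo Hamiltonians generated for unlabeled structures); HamiltonianNet, lambda_u, and the toy tensor shapes are illustrative assumptions and not the SemiH implementation.

# Sketch: pseudo-label-based semi-supervised training for a Hamiltonian network.
# All names and shapes below are placeholders, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HamiltonianNet(nn.Module):
    """Toy stand-in for a message-passing network that maps atomic/structural
    features to Hamiltonian matrix elements."""
    def __init__(self, in_dim=16, out_dim=9):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(in_dim, 64), nn.SiLU(), nn.Linear(64, out_dim))

    def forward(self, x):
        return self.mlp(x)

def train_step(model, teacher, opt, labeled, unlabeled, lambda_u=0.5):
    """One semi-supervised update: supervised loss on DFT Hamiltonians plus a
    loss against pseudo Hamiltonians produced by a frozen teacher network."""
    x_l, h_l = labeled           # features with DFT (ground-truth) Hamiltonian blocks
    x_u = unlabeled              # features without DFT labels

    with torch.no_grad():
        h_pseudo = teacher(x_u)  # pseudo Hamiltonians: targets for unlabeled data

    loss_sup = F.mse_loss(model(x_l), h_l)
    loss_unsup = F.mse_loss(model(x_u), h_pseudo)
    loss = loss_sup + lambda_u * loss_unsup

    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Usage with random toy data (shapes are placeholders).
model, teacher = HamiltonianNet(), HamiltonianNet()
teacher.load_state_dict(model.state_dict())  # e.g. teacher initialized from a pre-trained student
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
labeled = (torch.randn(8, 16), torch.randn(8, 9))
unlabeled = torch.randn(32, 16)
print(train_step(model, teacher, opt, labeled, unlabeled))

In practice the unlabeled structures would be atomic configurations for which no DFT run was performed, and the weight lambda_u controls how strongly the pseudo Hamiltonians influence training.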
Pages: 16