SemiH: DFT Hamiltonian neural network training with semi-supervised learning

Cited by: 0
Authors
Cho, Yucheol [1 ]
Choi, Guenseok [1 ]
Ham, Gyeongdo [1 ]
Shin, Mincheol [1 ]
Kim, Daeshik [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol KAIST, Sch Elect Engn, Daejeon 34141, South Korea
Keywords
density functional theory; neural network Hamiltonian; message-passing neural network; semi-supervised learning; unlabeled data; pseudo Hamiltonian; graph neural network; DENSITY-FUNCTIONAL THEORY; ELECTRONIC-STRUCTURE
DOI
10.1088/2632-2153/ad7227
CLC number
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Over the past decades, density functional theory (DFT) calculations have been used in fields such as materials science and semiconductor devices. However, because DFT rigorously accounts for the interactions between atoms, its calculations carry a significant computational cost. To address this, extensive recent research has focused on training neural networks to replace DFT calculations. Previous methods, however, required a large number of DFT simulations to obtain the ground-truth Hamiltonians, and when only a limited amount of training data is available, deep learning models often exhibit increased errors when predicting Hamiltonians and band structures for test data. This raises the risk of inaccurate physical interpretations, including the emergence of unphysical branches in band structures. To tackle this challenge, we propose a novel deep-learning-based method for calculating DFT Hamiltonians that is specifically tailored to produce accurate results with limited training data. Our framework not only applies supervised learning to the calculated Hamiltonians but also generates pseudo Hamiltonians (targets for unlabeled data) and trains the neural network on unlabeled data. Notably, this is the first attempt in the field of neural-network Hamiltonians to leverage unlabeled data. Our framework outperforms the state-of-the-art approach across various datasets, including MoS2, Bi2Te3, HfO2, and InGaAs. Moreover, by effectively utilizing unlabeled data, it shows enhanced generalization, achieving strong results on data more complex than the training set, such as configurations with more atoms and temperatures outside the training range.
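The abstract describes a two-term training objective: a supervised loss against DFT-computed Hamiltonians plus a pseudo-label loss on unlabeled structures. The sketch below illustrates one generic way such a loop could be wired up in PyTorch; the student/teacher models, the data loaders, the EMA-teacher pseudo-Hamiltonian generator, and the weighting factor lambda_u are illustrative assumptions, not the authors' exact SemiH recipe.

# Minimal semi-supervised sketch (assumed PyTorch setup, not the SemiH code):
# a student GNN regresses Hamiltonian matrix elements; an EMA teacher produces
# pseudo Hamiltonians for unlabeled structures.

import copy
import torch
import torch.nn.functional as F


def ema_update(teacher, student, decay=0.999):
    # Exponential-moving-average update so the teacher slowly tracks the student.
    with torch.no_grad():
        for p_t, p_s in zip(teacher.parameters(), student.parameters()):
            p_t.mul_(decay).add_(p_s, alpha=1.0 - decay)


def train_semi_supervised(student, labeled_loader, unlabeled_loader,
                          epochs=100, lr=1e-3, lambda_u=0.5, device="cpu"):
    # Assumed data layout: `labeled_loader` yields (graph_batch, H_dft) pairs,
    # `unlabeled_loader` yields bare graph batches; both support .to(device).
    student = student.to(device)
    teacher = copy.deepcopy(student).to(device)
    for p in teacher.parameters():
        p.requires_grad_(False)

    optimizer = torch.optim.Adam(student.parameters(), lr=lr)

    for _ in range(epochs):
        for (graph_l, h_dft), graph_u in zip(labeled_loader, unlabeled_loader):
            graph_l, h_dft = graph_l.to(device), h_dft.to(device)
            graph_u = graph_u.to(device)

            # Supervised term: regress the DFT Hamiltonian matrix elements.
            loss_sup = F.mse_loss(student(graph_l), h_dft)

            # Unsupervised term: the teacher's prediction on an unlabeled
            # structure acts as a pseudo Hamiltonian target for the student.
            with torch.no_grad():
                h_pseudo = teacher(graph_u)
            loss_unsup = F.mse_loss(student(graph_u), h_pseudo)

            loss = loss_sup + lambda_u * loss_unsup
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

            ema_update(teacher, student)

    return student

In practice, how the pseudo Hamiltonians are generated and how unreliable pseudo targets are handled is the crux of the paper's contribution; the EMA teacher above is only one common choice from the broader semi-supervised learning literature.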
Pages: 16