Automatic Feature Selection for Atom-Centered Neural Network Potentials Using a Gradient Boosting Decision Algorithm

Cited by: 0
Authors
Li, Renzhe [1 ]
Wang, Jiaqi [1 ]
Singh, Akksay [1 ,2 ,3 ]
Li, Bai [1 ]
Song, Zichen [1 ,4 ]
Zhou, Chuan [1 ]
Li, Lei [1 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Mat Sci & Engn, Shenzhen 518055, Peoples R China
[2] Univ Texas Austin, Dept Chem, Austin, TX 78712 USA
[3] Univ Texas Austin, Inst Computat Engn & Sci, Austin, TX 78712 USA
[4] City Univ Hong Kong, Dept Mat Sci & Engn, Kowloon, Hong Kong 999077, Peoples R China
Funding
National Natural Science Foundation of China; National Key R&D Program of China;
Keywords
LIQUID-METAL; FORCE-FIELD; APPROXIMATION; DYNAMICS; PERFORMANCE; SIMULATION; MODEL;
DOI
10.1021/acs.jctc.4c01176
Chinese Library Classification
O64 [Physical chemistry (theoretical chemistry); chemical physics];
Subject classification codes
070304; 081704;
Abstract
Atom-centered neural network (ANN) potentials have shown high accuracy and computational efficiency in modeling atomic systems. A crucial step in developing reliable ANN potentials is the proper selection of atom-centered symmetry functions (ACSFs), also known as atomic features, which describe atomic environments. Inappropriate selection of ACSFs can lead to poor-quality ANN potentials. Here, we propose a gradient boosting decision tree (GBDT)-based framework for the automatic selection of optimal ACSFs. This framework takes uniformly distributed sets of ACSFs as input and evaluates their relative importance. The ACSFs with high average importance scores are selected and used to train an ANN potential. We applied this method to the Ge system, obtaining an ANN potential with root-mean-square errors (RMSE) of 10.2 meV/atom for energy and 84.8 meV/Å for force predictions, using only 18 ACSFs to balance accuracy and computational efficiency. The framework is validated against a grid search, demonstrating that the ACSFs selected with our framework lie in the optimal region. Furthermore, we compared our method with commonly used feature selection algorithms; the results show that our algorithm outperforms the others in effectiveness and accuracy. This study highlights the significant effect of ACSF parameters on ANN performance and presents a promising method for automatic ACSF selection, facilitating the development of machine learning potentials.
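The selection loop described in the abstract (rank candidate ACSFs by GBDT importance, keep the top-scoring ones for ANN training) can be sketched as follows. This is a minimal illustration using scikit-learn's `GradientBoostingRegressor`, not the authors' implementation; the synthetic `X` (candidate feature matrix) and `y` (per-structure energies) and the choice of 100 estimators are assumptions for the sketch, while `k = 18` mirrors the number of ACSFs retained in the paper.

```python
# Hypothetical sketch of GBDT-based feature selection: fit a gradient
# boosting regressor on candidate features, then keep the k features
# with the highest average importance scores.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))  # stand-in for 40 candidate ACSF values per structure
# Stand-in target: energies dominated by two of the candidate features.
y = X[:, 3] - 2.0 * X[:, 7] + 0.1 * rng.normal(size=200)

gbdt = GradientBoostingRegressor(n_estimators=100, random_state=0)
gbdt.fit(X, y)

k = 18  # number of ACSFs retained in the paper
top_idx = np.argsort(gbdt.feature_importances_)[::-1][:k]
X_selected = X[:, top_idx]  # reduced feature set passed on to ANN training
```

In this toy setup the two informative columns (3 and 7) dominate the importance ranking, so they survive the cut; in the paper's setting the same ranking step would be averaged over runs before selecting the final ACSF set.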
Pages: 10564-10573
Number of pages: 10