Multimodal convolutional neural networks for predicting evolution of gyrokinetic simulations

Cited: 0
Authors
Honda, Mitsuru [1 ]
Narita, Emi [2 ]
Maeyama, Shinya [3 ]
Watanabe, Tomo-Hiko [3 ]
Affiliations
[1] Kyoto Univ, Grad Sch Engn, Kyoto 6158530, Japan
[2] Natl Inst Quantum Sci & Technol, Naka Fus Inst, Ibaraki, Japan
[3] Nagoya Univ, Dept Phys, Nagoya, Aichi, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
convolutional neural network; deep learning; GKV gyrokinetic simulation; multimodal model; turbulent heat flux;
DOI
10.1002/ctpp.202200137
Chinese Library Classification
O35 [Fluid mechanics]; O53 [Plasma physics];
Discipline Classification Codes
070204 ; 080103 ; 080704 ;
Abstract
Gyrokinetic simulations are required for the quantitative calculation of fluxes due to turbulence, which dominates over other transport mechanisms in tokamaks. However, nonlinear gyrokinetic simulations are computationally expensive. A multimodal convolutional neural network model that reads images and values generated by nonlinear gyrokinetic simulations and predicts electrostatic turbulent heat fluxes was developed to support efficient runs. The model was extended to account for squared electrostatic potential fluctuations, which are proportional to the fluxes in the quasilinear model, as well as images containing fluctuating electron and ion distribution functions and fluctuating electrostatic potentials in wavenumber space. This multimodal model can predict the time and electron and ion turbulent heat fluxes corresponding to the input data. The model trained on the Cyclone base case data successfully predicted times and fluxes not only for its test data, but also for the completely different and unknown JT-60U case, with high accuracy. The predictive performance of the model depended on the similarity of the linear stability of the case used to train the model to the case being predicted.
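The abstract describes a multimodal network that fuses image inputs (fluctuating distribution functions and electrostatic potentials in wavenumber space) with scalar inputs (squared potential fluctuations) to predict time and electron/ion turbulent heat fluxes. The paper does not give the architecture here, so the following is only a minimal NumPy sketch of the general multimodal-fusion idea (convolutional image branch and dense scalar branch, concatenated into a shared output head); all layer sizes and parameter names are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive single-channel 2-D 'valid' convolution."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def multimodal_forward(image, scalars, params):
    """Image branch: conv -> ReLU -> global average pooling.
    Scalar branch: dense -> ReLU.
    Fusion: feature concatenation -> linear output head."""
    feat_img = relu(np.array([conv2d_valid(image, k).mean()
                              for k in params["kernels"]]))
    feat_sc = relu(params["W_sc"] @ scalars + params["b_sc"])
    fused = np.concatenate([feat_img, feat_sc])
    # Head outputs three targets: [time, electron heat flux, ion heat flux]
    return params["W_out"] @ fused + params["b_out"]

# Illustrative (untrained) parameters; sizes chosen arbitrarily.
params = {
    "kernels": rng.normal(size=(4, 3, 3)),  # 4 conv filters
    "W_sc": rng.normal(size=(8, 2)),        # 2 scalar inputs -> 8 features
    "b_sc": np.zeros(8),
    "W_out": rng.normal(size=(3, 12)),      # fused 4 + 8 features -> 3 outputs
    "b_out": np.zeros(3),
}

image = rng.normal(size=(16, 16))  # stand-in for a wavenumber-space image
scalars = rng.normal(size=2)       # stand-in for squared potential fluctuations
pred = multimodal_forward(image, scalars, params)
print(pred.shape)  # (3,)
```

The point of the sketch is the fusion step: each modality is reduced to a feature vector by its own branch, and only the concatenated vector reaches the regression head, so images and scalars can be trained jointly.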
Pages: 12
Related Papers
50 records total
  • [11] Convolutional Neural Networks for Multimodal Remote Sensing Data Classification
    Wu, Xin
    Hong, Danfeng
    Chanussot, Jocelyn
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [12] Multimodal MRI Volumetric Data Fusion With Convolutional Neural Networks
    Liu, Yu
    Shi, Yu
    Mu, Fuhao
    Cheng, Juan
    Li, Chang
    Chen, Xun
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [13] Early vs Late Fusion in Multimodal Convolutional Neural Networks
    Gadzicki, Konrad
    Khamsehashari, Razieh
    Zetzsche, Christoph
    PROCEEDINGS OF 2020 23RD INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION 2020), 2020, : 292 - 297
  • [14] Multimodal Convolutional Neural Networks for Sperm Motility and Concentration Predictions
    Goh, Voon Hueh
    Mansor, Muhammad Asraf
    As'ari, Muhammad Amir
    Ismail, Lukman Hakim
    MALAYSIAN JOURNAL OF FUNDAMENTAL AND APPLIED SCIENCES, 2024, 20 (02): : 347 - 359
  • [15] Predicting and Understanding Urban Perception with Convolutional Neural Networks
    Porzi, Lorenzo
    Bulo, Samuel Rota
    Lepri, Bruno
    Ricci, Elisa
    MM'15: PROCEEDINGS OF THE 2015 ACM MULTIMEDIA CONFERENCE, 2015, : 139 - 148
  • [16] Predicting Sleeping Quality Using Convolutional Neural Networks
    Sathish, Vidya Rohini Konanur
    Woo, Wai Lok
    Ho, Edmond S. L.
    ADVANCES IN CYBERSECURITY, CYBERCRIMES, AND SMART EMERGING TECHNOLOGIES, 2023, 4 : 175 - 184
  • [17] Predicting Magnetization Directions Using Convolutional Neural Networks
    Nurindrawati, Felicia
    Sun, Jiajia
    JOURNAL OF GEOPHYSICAL RESEARCH-SOLID EARTH, 2020, 125 (10)
  • [18] PREDICTING UNIVERSITY DROPOUT BY USING CONVOLUTIONAL NEURAL NETWORKS
    Mezzini, Mauro
    Bonavolonta, Gianmarco
    Agrusti, Francesco
    13TH INTERNATIONAL TECHNOLOGY, EDUCATION AND DEVELOPMENT CONFERENCE (INTED2019), 2019, : 9155 - 9163
  • [19] Convolutional neural networks for predicting creep and shrinkage of concrete
    Zhu, Jinsong
    Wang, Yanlei
    CONSTRUCTION AND BUILDING MATERIALS, 2021, 306
  • [20] Convolutional Neural Networks in Predicting Missing Text in Arabic
    Souri, Adnan
    Alachhab, Mohamed
    Eddine Elmohajir, Badr
    Zbakh, Abdelali
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2019, 10 (06) : 520 - 527