Event generation and statistical sampling for physics with deep generative models and a density information buffer

Citations: 0
Authors
Sydney Otten
Sascha Caron
Wieske de Swart
Melissa van Beekveld
Luc Hendriks
Caspar van Leeuwen
Damian Podareanu
Roberto Ruiz de Austri
Rob Verheyen
Affiliations
[1] Institute for Mathematics, Astro- and Particle Physics IMAPP, Radboud Universiteit
[2] GRAPPA, University of Amsterdam
[3] Nikhef
[4] SURFsara
[5] Instituto de Fisica Corpuscular, IFIC-UV/CSIC, University of Valencia
Source
Keywords
DOI
None available
CLC number
Subject classification
Abstract
Simulating nature, and in particular processes in particle physics, requires expensive computations that can take far longer than scientists can afford. Here, we explore routes toward a solution to this problem by investigating recent advances in generative modeling and present a study of the generation of events from a physical process with deep generative models. Simulating a physical process requires not only the production of physical events but also ensuring that these events occur with the correct frequencies. We investigate the feasibility of learning both the event generation and the frequency of occurrence with several generative machine learning models, so as to produce events like Monte Carlo generators. We study three processes: a simple two-body decay, the process e+e− → Z → l+l−, and pp → tt̄ including the decay of the top quarks and a simulation of the detector response. By buffering the density information of Monte Carlo events encoded by the encoder of a Variational Autoencoder, we are able to construct a prior for sampling new events from the decoder that yields distributions in very good agreement with real Monte Carlo events, generated several orders of magnitude faster. Applications of this work include generic density estimation and sampling, targeted event generation via a principal component analysis of encoded ground-truth data, anomaly detection, and more efficient importance sampling, e.g., for the phase-space integration of matrix elements in quantum field theories.
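The buffered-prior sampling described in the abstract can be sketched as follows. This is a minimal, self-contained illustration in plain NumPy, not the paper's method: the linear encoder/decoder maps and the fixed posterior width are hypothetical stand-ins for a trained Variational Autoencoder, and serve only to show the flow of buffering per-event posterior parameters and reusing them as a sampling prior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for a trained VAE encoder/decoder: simple linear
# maps between a 4-dim "event" space and a 2-dim latent space.
W_enc = rng.normal(size=(4, 2))   # event -> latent mean
W_dec = np.linalg.pinv(W_enc)     # latent -> event space

def encode(x):
    """Return (mu, sigma) of the approximate posterior for one event."""
    mu = x @ W_enc
    sigma = np.full_like(mu, 0.1)  # fixed width, purely illustrative
    return mu, sigma

def decode(z):
    return z @ W_dec

# 1) Encode a set of Monte Carlo "events" and buffer the density
#    information (per-event posterior parameters) instead of relying on
#    a standard normal prior over the latent space.
mc_events = rng.normal(size=(1000, 4))
buffer = [encode(x) for x in mc_events]

# 2) Generate new events: pick a buffered posterior at random, draw a
#    latent sample from it, and decode. The buffered mixture of
#    posteriors acts as the sampling prior.
def sample_events(n):
    idx = rng.integers(len(buffer), size=n)
    out = []
    for i in idx:
        mu, sigma = buffer[i]
        z = rng.normal(mu, sigma)
        out.append(decode(z))
    return np.asarray(out)

new_events = sample_events(500)
print(new_events.shape)  # (500, 4)
```

Because sampling only involves a buffer lookup, a Gaussian draw, and one decoder pass, generation is cheap once the buffer is built, which is the intuition behind the speed-up the abstract reports.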
Related papers (50 total)
  • [1] Otten, Sydney; Caron, Sascha; de Swart, Wieske; van Beekveld, Melissa; Hendriks, Luc; van Leeuwen, Caspar; Podareanu, Damian; de Austri, Roberto Ruiz; Verheyen, Rob. Event generation and statistical sampling for physics with deep generative models and a density information buffer. NATURE COMMUNICATIONS, 2021, 12(01).
  • [2] Coughlan, J; Yuille, A. Algorithms from statistical physics for generative models of images. IMAGE AND VISION COMPUTING, 2003, 21(01): 29-36.
  • [3] Halverson, James; Long, Cody. Statistical Predictions in String Theory and Deep Generative Models. FORTSCHRITTE DER PHYSIK-PROGRESS OF PHYSICS, 2020, 68(05).
  • [4] Shukla, Ankita; Mubarka, Yamen; Anirudh, Rushil; Kur, Eugene; Mariscal, Derek; Djordjevic, Blagoje; Kustowski, Bogdan; Swanson, Kelly; Spears, Brian; Bremer, Peer-Timo; Ma, Tammy; Turaga, Pavan; Thiagarajan, Jayaraman J. On the design and evaluation of generative models in high energy density physics. COMMUNICATIONS PHYSICS, 2025, 8(01).
  • [5] Guo, Xiaojie; Zhao, Liang. A Systematic Survey on Deep Generative Models for Graph Generation. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45(05): 5370-5390.
  • [6] Kivva, Bohdan; Rajendran, Goutham; Ravikumar, Pradeep; Aragam, Bryon. Identifiability of deep generative models without auxiliary information. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022.
  • [7] Hu, Changwei; Rai, Piyush; Carin, Lawrence. Deep Generative Models for Relational Data with Side Information. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017.
  • [8] Joshi, Ameya; Cho, Minsu; Shah, Viraj; Pokuri, Balaji; Sarkar, Soumik; Ganapathysubramanian, Baskar; Hegde, Chinmay. InvNet: Encoding Geometric and Statistical Invariances in Deep Generative Models. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI 2020), 2020, 34: 4377-4384.
  • [9] Ortiz-Haro, Joaquim; Ha, Jung-Su; Driess, Danny; Toussaint, Marc. Structured deep generative models for sampling on constraint manifolds in sequential manipulation. CONFERENCE ON ROBOT LEARNING, VOL 164, 2021: 213-223.
  • [10] Kirbiyik, Omer; Simsar, Enis; Cemgil, A. Taylan. Comparison of Deep Generative Models for the Generation of Handwritten Character Images. 2019 27TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2019.