Triggering dark showers with conditional dual auto-encoders

Cited by: 0
Authors
Anzalone, Luca [1 ,3 ]
Chhibra, Simranjit Singh [1 ,2 ,5 ]
Maier, Benedikt [2 ,4 ]
Chernyavskaya, Nadezda [2 ]
Pierini, Maurizio [2 ]
Affiliations
[1] Univ Bologna, Dept Phys & Astron DIFA, Bologna, Italy
[2] European Org Nucl Res CERN, Geneva, Switzerland
[3] Ist Nazl Fis Nucl INFN, Bologna, Italy
[4] Karlsruhe Inst Technol KIT, Karlsruhe, Germany
[5] Queen Mary Univ London QMUL, London, England
Keywords
anomaly detection; auto-encoders; deep learning; dark showers; high-energy physics;
DOI
10.1088/2632-2153/ad652b
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
We present a family of conditional dual auto-encoders (CoDAEs) for generic and model-independent new physics searches at colliders. New physics signals, which arise from new types of particles and interactions, are treated in our study as anomalies that cause deviations in the data with respect to the expected background events. In this work, we perform normal-only anomaly detection, which employs only background samples, to search for manifestations of a dark version of the strong force, applying (variational) auto-encoders to raw detector images, which are large and highly sparse, without leveraging any physics-based pre-processing or strong assumptions about the signals. The proposed CoDAE has a dual-encoder design, which is general and can learn an auxiliary yet compact latent space through spatial conditioning, showing a clear improvement over competitive physics-based baselines and related approaches, and thereby also reducing the gap with fully supervised models. This is the first time an unsupervised model has been shown to exhibit excellent discrimination against multiple dark shower models, illustrating the suitability of this method as an accurate, fast, and model-independent algorithm to deploy, e.g., in the real-time event triggering systems of Large Hadron Collider experiments such as ATLAS and CMS.
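For illustration only, the following is a minimal PyTorch sketch of the general idea described in the abstract: an auto-encoder with two encoders, one compressing the raw (sparse) detector image and one producing a compact auxiliary latent from spatial information, trained on background events only, with the reconstruction error used as the anomaly score. The layer sizes, the coordinate-moment "spatial_summary" conditioning, the plain MSE objective, and all names here are assumptions made for this sketch; they do not reproduce the published CoDAE architecture.

# Minimal, illustrative sketch of a conditional dual auto-encoder for
# normal-only anomaly detection on sparse detector images (assumed 1x64x64).
# All architectural choices below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoDAESketch(nn.Module):
    def __init__(self, img_channels=1, latent_dim=32, aux_dim=8):
        super().__init__()
        # Main encoder: compresses the raw detector image (64x64 -> 16x16).
        self.encoder = nn.Sequential(
            nn.Conv2d(img_channels, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, latent_dim),
        )
        # Auxiliary encoder: a second, compact latent conditioned on a crude
        # spatial summary of the image (hypothetical conditioning scheme).
        self.aux_encoder = nn.Sequential(
            nn.Linear(4, 16), nn.ReLU(),
            nn.Linear(16, aux_dim),
        )
        # Decoder reconstructs the image from the concatenated latents.
        self.fc = nn.Linear(latent_dim + aux_dim, 32 * 16 * 16)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, img_channels, 4, stride=2, padding=1),
        )

    @staticmethod
    def spatial_summary(x):
        # Crude spatial conditioning: energy-weighted mean/std of hit positions.
        b, _, h, w = x.shape
        ys = torch.linspace(0, 1, h, device=x.device).view(1, 1, h, 1)
        xs = torch.linspace(0, 1, w, device=x.device).view(1, 1, 1, w)
        mass = x.sum(dim=(1, 2, 3), keepdim=True).clamp_min(1e-6)
        my = (x * ys).sum(dim=(1, 2, 3), keepdim=True) / mass
        mx = (x * xs).sum(dim=(1, 2, 3), keepdim=True) / mass
        sy = ((x * (ys - my) ** 2).sum(dim=(1, 2, 3), keepdim=True) / mass).sqrt()
        sx = ((x * (xs - mx) ** 2).sum(dim=(1, 2, 3), keepdim=True) / mass).sqrt()
        return torch.cat([my, mx, sy, sx], dim=-1).view(b, 4)

    def forward(self, x):
        z = self.encoder(x)
        z_aux = self.aux_encoder(self.spatial_summary(x))
        h = self.fc(torch.cat([z, z_aux], dim=-1)).view(-1, 32, 16, 16)
        return self.decoder(h), z, z_aux

def anomaly_score(model, x):
    # Normal-only setup: trained on background images alone, so a large
    # reconstruction error flags a potentially anomalous (new physics) event.
    with torch.no_grad():
        recon, _, _ = model(x)
    return F.mse_loss(recon, x, reduction="none").mean(dim=(1, 2, 3))

if __name__ == "__main__":
    model = CoDAESketch()
    # Toy stand-in for sparse background detector images.
    background = torch.rand(8, 1, 64, 64) * (torch.rand(8, 1, 64, 64) > 0.95).float()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(2):  # illustrative background-only training loop
        recon, _, _ = model(background)
        loss = F.mse_loss(recon, background)
        opt.zero_grad(); loss.backward(); opt.step()
    print(anomaly_score(model, background))

In such a setup, a threshold on the per-event reconstruction error would play the role of the trigger decision; events above it are kept as anomaly candidates.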
Pages: 18