Unsupervised 3D Out-of-Distribution Detection with Latent Diffusion Models

Cited by: 4
Authors
Graham, Mark S. [1 ]
Pinaya, Walter Hugo Lopez [1 ]
Wright, Paul [1 ]
Tudosiu, Petru-Daniel [1 ]
Mah, Yee H. [1 ,2 ]
Teo, James T. [2 ,3 ]
Jager, H. Rolf [4 ]
Werring, David [5 ]
Nachev, Parashkev [4 ]
Ourselin, Sebastien [1 ]
Cardoso, M. Jorge [1 ]
Affiliations
[1] Kings Coll London, Sch Biomed Engn & Imaging Sci, Dept Biomed Engn, London, England
[2] Kings Coll Hosp NHS Fdn Trust, Denmark Hill, London, England
[3] Kings Coll London, Inst Psychiat Psychol & Neurosci, London, England
[4] UCL, Inst Neurol, London, England
[5] UCL, Ctr Stroke Res, Inst Neurol, Queen Sq, London, England
Funding
Engineering and Physical Sciences Research Council (EPSRC); Wellcome Trust; Innovate UK;
Keywords
Latent diffusion models; Out-of-distribution detection;
DOI
10.1007/978-3-031-43907-0_43
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Methods for out-of-distribution (OOD) detection that scale to 3D data are crucial components of any real-world clinical deep learning system. Classic denoising diffusion probabilistic models (DDPMs) have recently been proposed as a robust way to perform reconstruction-based OOD detection on 2D datasets, but they do not trivially scale to 3D data. In this work, we propose to use Latent Diffusion Models (LDMs), which enable the scaling of DDPMs to high-resolution 3D medical data. We validate the proposed approach on near- and far-OOD datasets and compare it to a recently proposed, 3D-enabled approach using Latent Transformer Models (LTMs). Not only does the proposed LDM-based approach achieve statistically significantly better performance, it also shows less sensitivity to the underlying latent representation, more favourable memory scaling, and produces better spatial anomaly maps. Code is available at https://github.com/marksgraham/ddpm-ood.
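The reconstruction-based pipeline described in the abstract can be summarised as: encode each 3D volume into a compact latent space, corrupt the latent with the DDPM forward process at one or more noise levels, denoise it back, decode, and use the reconstruction error as the OOD score and spatial anomaly map. The Python sketch below illustrates this idea only; the autoencoder, ddpm, and scheduler objects, their method names (encode, decode, add_noise, step), and the choice of starting timesteps are assumptions made for illustration and are not the authors' implementation (see the linked repository for that).

import torch

@torch.no_grad()
def ldm_ood_score(volume, autoencoder, ddpm, scheduler,
                  start_timesteps=(100, 300, 500)):
    # Encode the 3D volume into the compact latent space learned by an
    # autoencoder trained only on in-distribution data (assumed interface).
    z0 = autoencoder.encode(volume)
    error_maps = []
    for t_start in start_timesteps:
        # Forward process: corrupt the latent up to timestep t_start.
        noise = torch.randn_like(z0)
        z_t = scheduler.add_noise(z0, noise, torch.tensor([t_start]))
        # Reverse process: iteratively denoise the latent back to t = 0.
        for t in reversed(range(t_start)):
            eps = ddpm(z_t, torch.tensor([t]))   # predicted noise at step t
            z_t = scheduler.step(eps, t, z_t)    # assumed to return the less-noisy latent
        # Decode and compare with the input; OOD inputs reconstruct poorly.
        recon = autoencoder.decode(z_t)
        error_maps.append((recon - volume).abs())
    # Average over noise levels: a voxel-wise anomaly map and a scalar score.
    anomaly_map = torch.stack(error_maps).mean(dim=0)
    return anomaly_map.mean().item(), anomaly_map

Because the models see only in-distribution data during training, in-distribution inputs should survive the corrupt-and-denoise round trip largely intact; a high mean error therefore flags a near- or far-OOD sample, and the voxel-wise error provides the spatial anomaly map.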
Pages: 446-456
Page count: 11