Understanding Deep Learning with Statistical Relevance

Cited by: 10
Authors
Raez, Tim [1]
Affiliations
[1] Univ Bern, Inst Philosophy, Bern, Switzerland
DOI
10.1017/psa.2021.12
Chinese Library Classification (CLC)
N09 [History of Natural Science]; B [Philosophy and Religion];
Discipline Classification Codes
01 ; 0101 ; 010108 ; 060207 ; 060305 ; 0712 ;
Abstract
This paper argues that a notion of statistical explanation, based on Salmon's statistical relevance model, can help us better understand deep neural networks. It is proved that homogeneous partitions, the core notion of Salmon's model, are equivalent to minimal sufficient statistics, an important notion from statistical inference. This establishes a link to deep neural networks via the so-called Information Bottleneck method, an information-theoretic framework, according to which deep neural networks implicitly solve an optimization problem that generalizes minimal sufficient statistics. The resulting notion of statistical explanation is general, mathematical, and subcausal.
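The abstract names the Information Bottleneck method without stating the optimization problem it refers to. For orientation only, the standard Information Bottleneck objective (Tishby, Pereira, and Bialek), in notation assumed here rather than quoted from the paper, seeks a representation T of the input X that is maximally compressed while retaining information about the target Y:

\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)  % standard IB Lagrangian; symbols X, Y, T, \beta are assumed notation, not taken from the record

Here I(·;·) denotes mutual information and β > 0 trades compression of X against predictive relevance for Y; in the limit of large β (when exact sufficiency is attainable), the optimal T reduces to a minimal sufficient statistic of X for Y, which is the sense in which the abstract says the problem generalizes minimal sufficient statistics.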
Pages: 20-41 (22 pages)