Neural networks and rational McNaughton functions

Cited: 0
Authors
Amato, P
Di Nola, A
Gerla, B
Affiliations
[1] Univ Salerno, Dept Math & Informat, Soft Comp Lab, I-84081 Baronissi, SA, Italy
[2] STMicroelect, SST Corp R&D, I-80022 Arzano, Napoli, Italy
Keywords
neural networks; many-valued logic; Lukasiewicz logic; rational weights; McNaughton functions;
DOI: not available
CLC classification number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
In this paper we describe a correspondence between rational McNaughton functions (the truth tables of rational Lukasiewicz formulas) and neural networks whose activation function is the truncated identity and whose synaptic weights are rational numbers. On the one hand, a logical representation (in a given logic) of neural networks can improve the interpretability, amalgamability, and reusability of these objects. On the other hand, neural networks can be used to learn formulas from data and can serve as circuital counterparts of (the functions represented by) formulas.
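To make the correspondence concrete, the standard Lukasiewicz connectives can each be realized by a single neuron with rational (here integer) weights and the truncated-identity activation ρ(t) = min(1, max(0, t)). This is a minimal sketch using the textbook definitions of the connectives, not the paper's full construction:

```python
def rho(t):
    """Truncated identity: clamps its input to the unit interval [0, 1]."""
    return min(1.0, max(0.0, t))

def strong_disjunction(x, y):
    """x (+) y = min(1, x + y): one neuron, weights (1, 1), bias 0."""
    return rho(x + y)

def strong_conjunction(x, y):
    """x (*) y = max(0, x + y - 1): one neuron, weights (1, 1), bias -1."""
    return rho(x + y - 1)

def negation(x):
    """not x = 1 - x: one neuron, weight -1, bias 1."""
    return rho(1 - x)

print(strong_disjunction(0.5, 0.75))  # 1.0
print(strong_conjunction(0.5, 0.75))  # 0.25
print(negation(0.25))                 # 0.75
```

Composing such neurons layer by layer yields a network computing the truth table of any formula built from these connectives; conversely, a network of this shape with rational weights computes a piecewise-linear function with rational coefficients, i.e. a rational McNaughton function.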
Pages: 95-110 (16 pages)
Related papers (50 total)
  • [41] Loss Functions for Image Restoration With Neural Networks
    Zhao, Hang
    Gallo, Orazio
    Frosio, Iuri
    Kautz, Jan
    IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2017, 3 (01) : 47 - 57
  • [42] Exactly Predictable Functions for Simple Neural Networks
    Yost J.
    Rizo L.
    Fang X.
    Su Q.
    Grobe R.
    SN Computer Science, 2022, 3 (1)
  • [43] Unsupervised learning spectral functions with neural networks
    Wang, Lingxiao
    Shi, Shuzhe
    Zhou, Kai
    28TH INTERNATIONAL NUCLEAR PHYSICS CONFERENCE, INPC 2022, 2023, 2586
  • [44] A Comparison of Activation Functions in Artificial Neural Networks
    Bircanoglu, Cenk
    Arica, Nafiz
    2018 26TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2018,
  • [45] Representation of intermolecular potential functions by neural networks
    Gassner, H
    Probst, M
    Lauenstein, A
    Hermansson, K
    JOURNAL OF PHYSICAL CHEMISTRY A, 1998, 102 (24): : 4596 - 4605
  • [46] Neural Redshift: Random Networks are not Random Functions
    Teney, Damien
    Nicolicioiu, Armand Mihai
    Hartmann, Valentin
    Abbasnejad, Ehsan
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2024, 2024, : 4786 - 4796
  • [47] Hölder Continuous Activation Functions in Neural Networks
    Tatar, Nasser-Eddine
    ADVANCES IN DIFFERENTIAL EQUATIONS AND CONTROL PROCESSES, 2015, 15 (02): : 93 - 106
  • [48] The neural networks for logical functions with optical devices
    Degeratu, V
    Degeratu, S
    Schiopu, P
    7TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL XI, PROCEEDINGS: COMMUNICATION, NETWORK AND CONTROL SYSTEMS, TECHNOLOGIES AND APPLICATIONS: II, 2003, : 290 - 293
  • [49] Decomposing neural networks as mappings of correlation functions
    Fischer, Kirsten
    Rene, Alexandre
    Keup, Christian
    Layer, Moritz
    Dahmen, David
    Helias, Moritz
    PHYSICAL REVIEW RESEARCH, 2022, 4 (04):
  • [50] Learning Activation Functions for Sparse Neural Networks
    Loni, Mohammad
    Mohan, Aditya
    Asadi, Mehdi
    Lindauer, Marius
    INTERNATIONAL CONFERENCE ON AUTOMATED MACHINE LEARNING, VOL 224, 2023, 224