Structured pruning of neural networks for constraints learning

Cited by: 1
Authors
Cacciola, Matteo [1 ]
Frangioni, Antonio [2 ]
Lodi, Andrea [3 ,4 ]
Affiliations
[1] Polytech Montreal, CERC, Montreal, PQ, Canada
[2] Univ Pisa, Pisa, Italy
[3] Cornell Tech, New York, NY 10044 USA
[4] Technion IIT, New York, NY 10011 USA
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
Artificial neural networks; Mixed integer programming; Model compression; Pruning; Analytics
DOI
10.1016/j.orl.2024.107194
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In recent years, the integration of Machine Learning (ML) models with Operations Research (OR) tools has gained popularity in applications such as cancer treatment, algorithmic configuration, and chemical process optimization. This integration often uses Mixed Integer Programming (MIP) formulations to represent the chosen ML model, which is often an Artificial Neural Network (ANN) due to their widespread use. However, ANNs frequently contain a large number of parameters, resulting in MIP formulations that are impractical to solve. In this paper we showcase the effectiveness of ANN pruning when applied to models prior to their integration into MIPs. We discuss why pruning is more suitable in this context than other ML compression techniques, and we highlight the potential of appropriate pruning strategies via experiments on MIPs used to construct adversarial examples for ANNs. Our results demonstrate that pruning offers remarkable reductions in solution times without hindering the quality of the final decision, enabling the resolution of previously unsolvable instances.
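The core idea of the abstract can be illustrated with a minimal sketch (our own illustration, not the paper's code): structured, neuron-level magnitude pruning of a small fully-connected ReLU network. Removing whole neurons deletes entire rows and columns of the weight matrices, which directly shrinks the standard big-M MIP encoding of the network, since each surviving ReLU neuron contributes one binary activation variable and two linear constraints.

```python
import numpy as np

# Illustrative sketch (not the authors' method): structured,
# neuron-level magnitude pruning of a one-hidden-layer ReLU network.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 4))   # hidden layer: 8 neurons, 4 inputs
b1 = rng.normal(size=8)
W2 = rng.normal(size=(1, 8))   # output layer reads all 8 hidden neurons

def prune_hidden_neurons(W1, b1, W2, keep_ratio=0.5):
    """Drop hidden neurons with the smallest L1 norm of incoming weights."""
    scores = np.abs(W1).sum(axis=1)            # importance score per neuron
    k = max(1, int(keep_ratio * W1.shape[0]))  # number of neurons to keep
    keep = np.sort(np.argsort(scores)[-k:])    # indices of surviving neurons
    # Removing a neuron removes its row in (W1, b1) and its column in W2.
    return W1[keep], b1[keep], W2[:, keep]

W1p, b1p, W2p = prune_hidden_neurons(W1, b1, W2, keep_ratio=0.5)
# The pruned network has half the ReLU neurons, so a big-M MIP
# embedding needs half the binary activation variables.
print(W1p.shape, W2p.shape)  # (4, 4) (1, 4)
```

In practice the pruned network would be fine-tuned to recover accuracy before being embedded into the MIP; the `keep_ratio` and the L1 importance score here are illustrative choices, not the paper's specific strategy.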
Pages: 7
Related papers
(50 records)
  • [41] Learning from pairwise constraints by Similarity Neural Networks
    Maggini, Marco
    Melacci, Stefano
    Sarti, Lorenzo
    NEURAL NETWORKS, 2012, 26 : 141 - 158
  • [42] Compressing Deep Reinforcement Learning Networks With a Dynamic Structured Pruning Method for Autonomous Driving
    Su, Wensheng
    Li, Zhenni
    Xu, Minrui
    Kang, Jiawen
    Niyato, Dusit
    Xie, Shengli
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (12) : 18017 - 18030
  • [43] Learning Understandable Neural Networks With Nonnegative Weight Constraints
    Chorowski, Jan
    Zurada, Jacek M.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (01) : 62 - 69
  • [44] Structured Pruning of RRAM Crossbars for Efficient In-Memory Computing Acceleration of Deep Neural Networks
    Meng, Jian
    Yang, Li
    Peng, Xiaochen
    Yu, Shimeng
    Fan, Deliang
    Seo, Jae-Sun
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2021, 68 (05) : 1576 - 1580
  • [45] Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
    Hoefler, Torsten
    Alistarh, Dan
    Ben-Nun, Tal
    Dryden, Nikoli
    Peste, Alexandra
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [46] Interpretable Task-inspired Adaptive Filter Pruning for Neural Networks Under Multiple Constraints
    Guo, Yang
    Gao, Wei
    Li, Ge
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, 132 (06) : 2060 - 2076
  • [47] Structured pruning via feature channels similarity and mutual learning for convolutional neural network compression
    Yang, Wei
    Xiao, Yancai
    Applied Intelligence, 2022, 52 : 14560 - 14570
  • [48] Learning Contact Dynamics using Physically Structured Neural Networks
    Hochlehnert, Andreas
    Terenin, Alexander
    Saemundsson, Steindor
    Deisenroth, Marc Peter
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [49] Structured learning via convolutional neural networks for vehicle detection
    Maqueda, Ana I.
    del Blanco, Carlos R.
    Jaureguizar, Fernando
    Garcia, Narciso
    REAL-TIME IMAGE AND VIDEO PROCESSING 2017, 2017, 10223
  • [50] LEARNING AND CONVERGENCE ANALYSIS OF NEURAL-TYPE STRUCTURED NETWORKS
    POLYCARPOU, MM
    IOANNOU, PA
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1992, 3 (01): : 39 - 50