Authors
Filippo Martinini, Andriy Enttsel, Alex Marchioni, Mauro Mangia, Riccardo Rovatti, Gianluca Setti
Publication date
2023/8/6
Conference
2023 IEEE 66th International Midwest Symposium on Circuits and Systems (MWSCAS)
Pages
1020-1024
Publisher
IEEE
Description
The current trend of over-parameterized Deep Neural Networks makes deployment on resource-constrained systems challenging. To deal with this, optimization techniques, such as network pruning, can be adopted. We propose a novel pruning technique based on trainable probability masks that, when binarized, select the elements of the network to prune. Our method features i) an automatic selection of the elements to prune by jointly training the binary masks with the model, and ii) the capability of controlling the pruning level through the hyper-parameters of a novel regularization term. We assess the effectiveness of our method by employing it in the structured pruning of the fully connected layers of shallow and deep neural networks, where it outperforms magnitude-based pruning approaches.
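The following minimal PyTorch sketch illustrates the general idea described above (trainable probability masks binarized for structured pruning and trained jointly with the model under a regularizer that controls the pruning level). It is not the authors' implementation; the straight-through binarization, the target keep ratio, and the penalty weight `lam` are illustrative assumptions.

import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        # One trainable logit per output neuron (structured, row-wise mask).
        self.mask_logits = nn.Parameter(torch.zeros(out_features))

    def mask_probs(self):
        return torch.sigmoid(self.mask_logits)

    def forward(self, x):
        p = self.mask_probs()
        hard = (p > 0.5).float()
        # Straight-through estimator (assumption): forward pass uses the binary
        # mask, backward pass propagates gradients through the probabilities.
        mask = hard + p - p.detach()
        return self.linear(x) * mask

def mask_regularizer(layer, target_keep_ratio, lam=1e-2):
    # Penalize deviation of the expected kept fraction from a target level;
    # `lam` and `target_keep_ratio` stand in for the paper's hyper-parameters.
    expected_keep = layer.mask_probs().mean()
    return lam * (expected_keep - target_keep_ratio) ** 2

# Toy usage: jointly train the weights and the mask on random data.
layer, head = MaskedLinear(32, 64), nn.Linear(64, 10)
opt = torch.optim.Adam(list(layer.parameters()) + list(head.parameters()), lr=1e-3)
x, y = torch.randn(128, 32), torch.randint(0, 10, (128,))
for _ in range(100):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(head(layer(x)), y) \
           + mask_regularizer(layer, target_keep_ratio=0.5)
    loss.backward()
    opt.step()
print("kept neurons:", int((layer.mask_probs() > 0.5).sum().item()),
      "of", layer.mask_logits.numel())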