Conference paper, 2025

Importance Resides In Activations: Fast Input-Based Nonlinearity Pruning

Abstract

Deep learning has achieved tremendous successes across a broad range of applications, especially in computer vision with Convolutional Neural Networks (CNNs), which consist of successions of linear and nonlinear operations. In this study, our key contribution is a new procedure to linearize CNNs in the most cost-effective way possible. We leverage information from the inputs to each nonlinear function to identify which nonlinearities are less critical to the network's performance. Our method is versatile and adaptable to any common nonlinearity and CNN architecture. While it incurs a small drop in accuracy across a wide range of CNNs relative to state-of-the-art methods, it bypasses the significant computational effort those methods usually spend determining which nonlinearities are removable, whether layer-wise or channel-wise. Additionally, we provide a comprehensive analysis of network behavior during pruning, offering insights into internal damage, recovery, and effective retraining strategies.
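The page carries only the abstract, so the following is a minimal PyTorch sketch of one plausible input-based criterion, not the paper's code: it scores each ReLU by how often its input is actually negative (a ReLU whose inputs are rarely negative behaves almost linearly) and replaces the lowest-scoring ones with identity. The scoring statistic, the `keep_ratio` parameter, and all function names are illustrative assumptions.

```python
# Illustrative sketch only: ranks ReLUs by the fraction of negative inputs
# seen on calibration data, then swaps the least important ones for Identity.
# The statistic and keep_ratio are assumptions, not the paper's exact criterion.
import torch
import torch.nn as nn


def score_nonlinearities(model, data_loader, device="cpu"):
    """Hook every nn.ReLU and measure how often its input is negative;
    a ReLU that rarely clips its input behaves almost like the identity."""
    scores, hooks = {}, []

    def make_hook(name):
        def hook(module, inputs):
            x = inputs[0].detach()
            n, s = scores.get(name, (0, 0.0))
            scores[name] = (n + 1, s + (x < 0).float().mean().item())
        return hook

    for name, module in model.named_modules():
        if isinstance(module, nn.ReLU):  # functional F.relu calls are not seen
            hooks.append(module.register_forward_pre_hook(make_hook(name)))

    model.eval().to(device)
    with torch.no_grad():
        for x, _ in data_loader:  # a few calibration batches suffice
            model(x.to(device))

    for h in hooks:
        h.remove()
    return {name: s / n for name, (n, s) in scores.items()}


def prune_nonlinearities(model, scores, keep_ratio=0.5):
    """Replace the lowest-scoring ReLUs with nn.Identity (layer-wise removal).
    Note: if a model reuses one ReLU module (as torchvision ResNets do),
    replacing it removes the nonlinearity at every site where it is used."""
    ranked = sorted(scores, key=scores.get)  # ascending: least clipping first
    for name in ranked[: int(len(ranked) * (1 - keep_ratio))]:
        parent = model
        *path, leaf = name.split(".")
        for part in path:
            parent = getattr(parent, part)
        setattr(parent, leaf, nn.Identity())
    return model
```

After pruning, a short fine-tuning pass is the usual way to recover accuracy, which is where the abstract's analysis of internal damage, recovery, and retraining strategies applies.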
Main file: ICONIP_BR.pdf (793.41 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04920230, version 1 (30-01-2025)

Identifiers

  • HAL Id: hal-04920230, version 1

Cite

Baptiste Rossigneux, Vincent Lorrain, Inna Kucher, Emmanuel Casseau. Importance Resides In Activations: Fast Input-Based Nonlinearity Pruning. International Conference on Neural Information Processing (ICONIP), Dec 2024, Auckland, New Zealand. ⟨hal-04920230⟩