Neural Network Pruning by Cooperative Coevolution

Abstract

Neural network pruning is a popular model compression method which can significantly reduce the computing cost with negligible loss of accuracy. Recently, filters are often pruned directly by designing proper criteria or using auxiliary modules to measure their importance, which, however, requires expertise and trial-and-error. Due to the advantage of automation, pruning by evolutionary algorithms (EAs) has attracted much attention, but the performance is limited for deep neural networks as the search space can be quite large. In this paper, we propose a new filter pruning algorithm CCEP by cooperative coevolution, which prunes the filters in each layer by EAs separately. That is, CCEP reduces the pruning space by a divide-and-conquer strategy. The experiments show that CCEP achieves performance competitive with state-of-the-art pruning methods, e.g., pruning 63.42% of the FLOPs of ResNet56 on CIFAR10 with a −0.24% accuracy drop, and 44.56% of the FLOPs of ResNet50 on ImageNet with a 0.07% accuracy drop.
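To illustrate the divide-and-conquer idea in the abstract, here is a minimal toy sketch of cooperative coevolution for filter pruning. It is not the authors' implementation: the layer sizes, population settings, and the `fitness` function (standing in for the real objective of pruned-network accuracy minus a FLOPs penalty) are all assumptions for illustration.

```python
# Toy sketch: per-layer filter pruning by cooperative coevolution.
# One sub-population of binary filter masks is evolved per layer;
# each candidate is evaluated jointly with the best masks of the other layers.
import random

random.seed(0)

LAYER_SIZES = [16, 32, 64]            # assumed number of filters per layer
POP_SIZE, GENERATIONS, MUT_RATE = 8, 20, 0.1


def fitness(masks):
    """Hypothetical objective: reward keeping 'important' low-index filters,
    penalise every kept filter (stands in for accuracy minus FLOPs cost)."""
    score = 0.0
    for mask in masks:
        importance = [1.0 / (i + 1) for i in range(len(mask))]
        score += sum(imp for imp, keep in zip(importance, mask) if keep)
        score -= 0.05 * sum(mask)     # FLOPs-like penalty per kept filter
    return score


def mutate(mask):
    return [1 - bit if random.random() < MUT_RATE else bit for bit in mask]


# Initialise: every filter kept, one sub-population per layer.
pops = [[[1] * n for _ in range(POP_SIZE)] for n in LAYER_SIZES]
best = [pop[0][:] for pop in pops]    # current best mask per layer

for _ in range(GENERATIONS):
    for layer, pop in enumerate(pops):
        # Cooperative evaluation: combine this layer's candidate with the
        # best masks found so far for all other layers.
        def joint(mask):
            return fitness(best[:layer] + [mask] + best[layer + 1:])

        pop.sort(key=joint, reverse=True)
        best[layer] = pop[0][:]
        # Replace the worst half with mutated copies of the better half.
        half = POP_SIZE // 2
        pop[half:] = [mutate(random.choice(pop[:half])) for _ in range(half)]

print("kept filters per layer:", [sum(m) for m in best])
```

Because each layer is searched separately, the search space per sub-population is only 2^(filters in that layer) rather than 2^(all filters), which is the reduction the abstract attributes to the divide-and-conquer strategy.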

Publication
In the Thirty-First International Joint Conference on Artificial Intelligence
Jia-Liang Wu
M.Sc. student

My research interests include machine learning, evolutionary learning and model compression.