
Pruning CNN for resource efficient inference


Approach

The proposed scheme for pruning consists of the following steps:

  • Fine-tune the network until convergence on the target task;
  • Alternate iterations of pruning and fine-tuning;
  • Stop pruning when the required trade-off between accuracy and the pruning objective is reached (a sketch of this loop follows the list).
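
A minimal sketch of this loop in Python, assuming hypothetical helpers `fine_tune`, `rank_units`, `remove_lowest`, and `estimate_flops` (none of these names come from the paper):

```python
def prune_until_target(model, data, target_flops,
                       fine_tune, rank_units, remove_lowest, estimate_flops):
    """Alternate pruning and fine-tuning until the FLOPs budget is met.

    All four helpers are hypothetical stand-ins: `rank_units` scores neurons
    with one of the criteria below, `remove_lowest` drops the worst-ranked
    neuron, and `estimate_flops` measures the remaining inference cost.
    """
    fine_tune(model, data)                        # converge on the target task
    while estimate_flops(model) > target_flops:   # stop at the desired trade-off
        scores = rank_units(model, data)          # evaluate neuron importance
        remove_lowest(model, scores)              # prune ...
        fine_tune(model, data)                    # ... then fine-tune
    return model
```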

C(D|W) is the cost function of the network with parameters W on data D. The goal of pruning is to keep the cost of the pruned parameters W' as close as possible to the original while meeting a sparsity budget:

$$\min_{W'} \;\left|\, C(D|W') - C(D|W) \,\right| \quad \text{s.t.} \quad \|W'\|_0 \le B,$$

where B is the number of non-zero parameters allowed after pruning.

There are several criteria for pruning that evaluate the importance of neurons:

  • ORACLE pruning
    The best approximation of a neuron’s importance is the change in the network’s cost once that particular neuron is pruned. This can be implemented by setting the pruning gate of each neuron to 0 in turn and evaluating C(D|W') (a sketch follows this list).
  • Minimum weight
    Prune the feature maps whose kernel weights have the smallest magnitude, e.g. by scoring each filter with its mean squared weight; this criterion needs no data at all (see the data-free sketch below).
  • Activation based criteria
    One reason for ReLU’s popularity is that convolutional layers with this activation act as feature detectors. It is therefore reasonable to assume that if the activation value (the output of the neuron) is small, the corresponding feature detector is not important for predicting the network’s output (see the sketch below).
  • Taylor Expansion Approximation
    Intuitively, this criterion prunes neurons that have an almost flat influence on the cost function: the change in cost from removing a neuron is approximated by the first-order Taylor term Θ_TE(z) = |∂C/∂z · z|. This requires accumulating the product of the activation and the gradient of the cost with respect to that activation, both of which are already computed during back-propagation (see the last sketch below).
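
A sketch of the oracle criterion under the gate formulation above, assuming `gates` maps each prunable neuron to a scalar multiplier inside the model and `eval_cost` computes C(D|W) on held-out data (both hypothetical helpers):

```python
import torch

def oracle_ranking(model, gates, eval_cost):
    """Score each neuron by |C(D|W') - C(D|W)| when it alone is gated off."""
    with torch.no_grad():
        base = eval_cost(model)              # cost of the unpruned network
        scores = {}
        for name, gate in gates.items():
            saved = gate.clone()
            gate.zero_()                     # prune this one neuron
            scores[name] = abs(eval_cost(model) - base)
            gate.copy_(saved)                # restore the gate
    return scores
```

This is exact but expensive: it needs one full evaluation pass per neuron, which is why the cheaper criteria below exist.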
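A sketch of the minimum-weight criterion in PyTorch (the layer type and function name are assumptions, not the paper's code):

```python
import torch
import torch.nn as nn

def min_weight_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Mean squared kernel weight per output feature map (one score per filter)."""
    w = conv.weight.detach()              # shape: [out_ch, in_ch, kH, kW]
    return w.pow(2).mean(dim=(1, 2, 3))   # low score -> pruning candidate
```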
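A sketch of the mean-activation variant, assuming `feats` holds post-ReLU feature maps collected over a validation batch:

```python
import torch

def activation_scores(feats: torch.Tensor) -> torch.Tensor:
    """Mean post-ReLU activation per feature map; feats: [N, C, H, W]."""
    return feats.mean(dim=(0, 2, 3))      # small average response -> prune first
```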
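A sketch of the Taylor criterion; the formula is the paper's, while the shapes and function name are assumptions:

```python
import torch

def taylor_scores(activation: torch.Tensor, grad: torch.Tensor) -> torch.Tensor:
    """Theta_TE per feature map: average (dC/dz * z) over spatial positions,
    take the absolute value per example, then average over the batch.
    Both inputs have shape [N, C, H, W]."""
    per_example = (activation * grad).mean(dim=(2, 3)).abs()  # [N, C]
    return per_example.mean(dim=0)                            # [C]
```

In practice `grad` can be captured with `activation.register_hook(...)` during the normal backward pass, so this criterion adds almost no cost on top of training.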

Experiment

References:
Pavlo Molchanov, Stephen Tyree, Tero Karras, Timo Aila, Jan Kautz. Pruning Convolutional Neural Networks for Resource Efficient Inference. ICLR 2017.
