
Dynamic Channel Propagation

This repository provides the source code for the paper "Learning to Prune in Training via Dynamic Channel Propagation" (available on arXiv.org), accepted at ICPR 2020. In the paper, we propose a novel training mechanism called "dynamic channel propagation" that prunes deep neural networks during the training period.
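To give the flavor of the mechanism, here is a minimal, hedged sketch of selecting the top-k channels by a per-channel utility score. This illustrates the general idea only, not the repository's actual implementation; channel_mask, utility, and keep_ratio are hypothetical names.

import torch

def channel_mask(utility: torch.Tensor, keep_ratio: float) -> torch.Tensor:
    # Keep the top-k channels ranked by their utility score and zero
    # the rest, so low-utility channels neither propagate activations
    # nor accumulate meaningful gradients during training.
    k = max(1, int(keep_ratio * utility.numel()))
    mask = torch.zeros_like(utility)
    mask[utility.topk(k).indices] = 1.0
    return mask

# Hypothetical usage inside a conv block, where x has shape (N, C, H, W)
# and utility has shape (C,):
# x = x * channel_mask(utility, keep_ratio=0.5).view(1, -1, 1, 1)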

Preliminaries

Environment

Our code is built on the deep-learning framework PyTorch and closely follows its official examples.

  • python >= 3.5
  • cuda >= 10.0
  • torch >= 1.3.0, torchvision

Data set

For CIFAR-10, you may download the dataset directly via the PyTorch API:

from torchvision.datasets.cifar import CIFAR10 as dataset
# for training set
dataset(root='../data', train=True, download=True)
# for testing set
dataset(root='../data', train=False, download=True)
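For reference, a typical way to wrap the training set in a DataLoader; the batch size and normalization statistics below are the commonly used CIFAR-10 values, not necessarily those hard-coded in main.py.

import torch
from torchvision import transforms
from torchvision.datasets.cifar import CIFAR10 as dataset

# Standard CIFAR-10 preprocessing (the channel means/stds are the
# usual published statistics).
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2470, 0.2435, 0.2616)),
])
train_set = dataset(root='../data', train=True, download=True,
                    transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=128,
                                           shuffle=True, num_workers=2)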

As for ILSVRC-2012 (ImageNet), you have to download it from the URL, extract it, and move the validation images into class subfolders with a shell script.
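Once the validation images sit in one subfolder per class, the set can be loaded with torchvision's ImageFolder. A minimal sketch; the path and the preprocessing below are the common torchvision defaults, not necessarily what main2.py uses.

from torchvision import transforms
from torchvision.datasets import ImageFolder

# Standard ImageNet evaluation preprocessing.
val_transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
# '../data/imagenet/val' is an illustrative path; point root at the
# folder whose subdirectories are the 1000 class folders.
val_set = ImageFolder(root='../data/imagenet/val', transform=val_transform)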

Run the code

First, enter the root directory of the project and create a folder to store the results:

cd [root directory of the project]
mkdir model

Run the following command to train on CIFAR-10:

python3 main.py -architecture [Vgg or ResNet] -decay [initial value of decay factor] -pr [global pruning rate of channels] -data_dir [path to the dataset]
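For instance, to prune VGG on CIFAR-10 (the decay factor and pruning rate below are placeholder values, not recommended settings):

python3 main.py -architecture Vgg -decay 0.1 -pr 0.5 -data_dir ../data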

To prune ResNet on ILSVRC-2012, run:

python3 main2.py -pr [global pruning rate of channels] -data_dir [path to the dataset]
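For instance (again with a placeholder pruning rate and an illustrative dataset path):

python3 main2.py -pr 0.5 -data_dir ../data/imagenet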

Results

| Data set | Architecture | Top-1  | Top-5  | FLOPs pruned |
|----------|--------------|--------|--------|--------------|
| CIFAR-10 | VGG-16       | 93.50% | -      | 73.3%        |
| CIFAR-10 | ResNet-32    | 92.60% | -      | 50.2%        |
| ImageNet | ResNet-50    | 74.25% | 92.05% | 41.1%        |

Please refer to our paper for more details.


dynamic-channel-propagation's Issues

About training time changes

Hello, thanks for your work. When I use dynamic pruning, will the training take longer than training without dynamic pruning?

parameters and FLOPs

Thank you for your nice work. The number of parameters and FLOPs in the pruned model is the same as in the baseline. How do you handle this? I measured them with the THOP package.
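A plausible explanation, offered here as an assumption rather than the authors' answer: if pruning is realized by masking channels to zero instead of physically removing them, THOP still profiles the full architecture and reports baseline counts. A minimal THOP sketch (the model is a stand-in, not the pruned network from this repository):

import torch
from thop import profile
from torchvision.models import resnet18  # stand-in model for illustration

model = resnet18()
dummy = torch.randn(1, 3, 224, 224)
# profile() returns multiply-accumulate operations and parameter counts.
macs, params = profile(model, inputs=(dummy,))
print(f'MACs: {macs:.3e}, params: {params:.3e}')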
