
gaborvecsei / Ridurre Network Filter Pruning Keras

Licence: MIT
Keras model convolutional filter pruning package

Programming Languages

python, python3

Projects that are alternatives to or similar to Ridurre Network Filter Pruning Keras

Yolov3 Network Slimming
An implementation of yolov3 network slimming (pruning)
Stars: ✭ 320 (+814.29%)
Mutual labels:  pruning, network
Unrealnetworkprofiler
A modern WPF based Network Profiler for Unreal Engine.
Stars: ✭ 29 (-17.14%)
Mutual labels:  network
Metta
An information security preparedness tool to do adversarial simulation.
Stars: ✭ 867 (+2377.14%)
Mutual labels:  network
Androidutilcode
AndroidUtilCode 🔥 is a powerful & easy-to-use library for Android. It encapsulates functions commonly used in Android development, with complete demos and unit tests. By using its encapsulated APIs, you can greatly improve your development efficiency. The program mainly consists of two modules: utilcode, which is commonly used in development, and subutil, which is rarely used in development but whose utils help simplify the main module. 🔥
Stars: ✭ 30,239 (+86297.14%)
Mutual labels:  network
Bash Toolkit
This project is intended to help sysadmins
Stars: ✭ 13 (-62.86%)
Mutual labels:  network
Sensu Plugins Network Checks
This plugin provides native network instrumentation for monitoring and metrics collection, including: hardware, TCP response, RBLs, whois, port status, and more.
Stars: ✭ 28 (-20%)
Mutual labels:  network
Pytorch Forecasting
Time series forecasting with PyTorch
Stars: ✭ 849 (+2325.71%)
Mutual labels:  network
Eoip
EoIP/EoIPv6 for *nix.
Stars: ✭ 34 (-2.86%)
Mutual labels:  network
Networkmnt
Monitor the network flow of a process
Stars: ✭ 31 (-11.43%)
Mutual labels:  network
Phpnetmap
Web application for Ethernet network mapping. PHP software for network device monitoring with the SNMP v(1/2c/3) protocol.
Stars: ✭ 20 (-42.86%)
Mutual labels:  network
Subdue
The Subdue graph miner discovers highly-compressing patterns in an input graph.
Stars: ✭ 20 (-42.86%)
Mutual labels:  network
Libzmq
ZeroMQ core engine in C++, implements ZMTP/3.1
Stars: ✭ 7,418 (+21094.29%)
Mutual labels:  network
Xinblog
Front-end fundamentals. The Vue framework. Data structures and algorithms. Computer networks. Solidifying the basics.
Stars: ✭ 29 (-17.14%)
Mutual labels:  network
Clientserverproject
A client-server (C-S) template made up of three parts: a server-side program, a client-side program, and a shared component. It implements basic account management, version control, software upgrades, announcement management, bulk messaging, shared file upload/download, and batch file transfer; see the demo for how to operate it. One goal of this project is to provide a basic client-server framework for small and medium systems, with three client modes (seamless integrated access, a WinForms version, a WPF version, and an ASP.NET MVC version), convenient for secondary development by companies and for personal study. The network component also conveniently supports reading and writing data from Mitsubishi and Siemens PLCs; see the Readme for details.
Stars: ✭ 873 (+2394.29%)
Mutual labels:  network
Tina
A powerful Android network library based on OkHttp
Stars: ✭ 32 (-8.57%)
Mutual labels:  network
Powermodelsannex.jl
A PowerModels.jl Extension Package for Exploratory Work
Stars: ✭ 11 (-68.57%)
Mutual labels:  network
Hsdn
Analysis of the human symptoms–disease network
Stars: ✭ 15 (-57.14%)
Mutual labels:  network
Tk Listen
A library that allows listening on network sockets with proper resource limits and error handling
Stars: ✭ 27 (-22.86%)
Mutual labels:  network
Cytoscape.js
Graph theory (network) library for visualisation and analysis
Stars: ✭ 8,107 (+23062.86%)
Mutual labels:  network
Xiringuito
SSH-based "VPN for poors"
Stars: ✭ 969 (+2668.57%)
Mutual labels:  network

Ridurre - Filter Pruning in Deep Convolutional Networks

Pruning is the process of shrinking a network by removing its less significant or redundant filters.
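To make this concrete, here is a small self-contained sketch (the helper names are illustrative, not part of this package) of how removing filters shrinks a convolutional layer's parameter count. Note that pruning a layer's filters also shrinks the number of input channels of the next layer, so the real savings are even larger:

```python
def conv2d_param_count(nb_filters, kernel_h, kernel_w, in_channels):
    # Each filter has kernel_h * kernel_w * in_channels weights plus one bias
    return nb_filters * (kernel_h * kernel_w * in_channels + 1)


def pruned_filter_count(nb_filters, pruning_percent):
    # Drop the given fraction of filters (rounded down)
    return nb_filters - int(nb_filters * pruning_percent)


before = conv2d_param_count(64, 3, 3, 32)  # 18496 parameters
remaining = pruned_filter_count(64, 0.4)   # 39 filters left
after = conv2d_param_count(remaining, 3, 3, 32)  # 11271 parameters
```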

This package is a mini-framework which you can easily use on your existing models, and which also lets you define your own pruning methods without any struggle.

pruning framework diagram

Install

pip install ridurre

  • Install the packages listed in requirements.txt
    • pip install -r requirements.txt

Example Results

These results were achieved with the example provided:

  • Cifar10 dataset
  • ResNetV1
  • KMeans filter pruning
    • Clustering factor: 0.9 (which can be considered aggressive pruning)
training with pruning plot; pruning plot

Usage

Define your own pruning method

You can make your own pruning method by creating a new class whose parent is BasePruning. There is only one thing you need to take care of, and that is the implementation of the run_pruning_for_conv2d_layer function.

For an example, take a look at the RandomFilterPruning code.
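As a sketch of what such a method typically computes, the pure-Python function below ranks filters by their L2 norm and returns the indices of the weakest ones. This is only an illustration of the selection logic; the actual run_pruning_for_conv2d_layer signature and return values are defined by BasePruning, so check RandomFilterPruning for the real interface:

```python
import math


def select_filters_to_prune(filter_weights, pruning_factor):
    """Return the indices of the filters with the smallest L2 norm.

    filter_weights: one flattened weight list per output filter.
    pruning_factor: fraction of filters to remove (between 0 and 1).
    """
    norms = [math.sqrt(sum(w * w for w in f)) for f in filter_weights]
    nb_to_prune = int(len(filter_weights) * pruning_factor)
    # Rank filter indices by ascending norm and pick the weakest ones
    ranked = sorted(range(len(norms)), key=lambda i: norms[i])
    return sorted(ranked[:nb_to_prune])


# Filter 1 has near-zero weights, so it is the one selected for removal
print(select_filters_to_prune([[3.0, 4.0], [0.1, 0.1], [1.0, 0.0]], 1 / 3))  # [1]
```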

Use an already existing method

Check out example/model_pruning_example.py for a simple but extensive tutorial.

Callbacks

You will need to define two callbacks for the pruning:

  • Model compile function
    • One argument:
      • model, which is a keras.models.Model
    • This should define how to compile the model
    • Example:
      def compile_model(my_model):
          my_model.compile(optimizer=optimizers.Adam(lr=0.001),
                           loss=losses.categorical_crossentropy,
                           metrics=["accuracy"])
      
  • Finetune function
    • Three arguments:
      • model, which is a keras.models.Model
      • initial_epoch, an int: this defines the initial epoch state for the model fitting. For example, it is 12 if we trained the model for 12 epochs before this function was called
      • finetune_epochs, an int: defines how many epochs to train after a pruning step
    • This should define how to finetune our model
    • Example:
      def finetune_model(my_model, initial_epoch, finetune_epochs):
          my_model.fit(x_train,
                       y_train,
                       batch_size=32,
                       epochs=finetune_epochs,
                       validation_data=(x_test, y_test),
                       callbacks=callbacks,
                       initial_epoch=initial_epoch,
                       verbose=1)
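The initial_epoch bookkeeping above simply advances by finetune_epochs after every pruning round. A tiny hypothetical helper (not part of ridurre) makes the arithmetic explicit:

```python
def finetune_epoch_schedule(start_epoch, finetune_epochs, nb_prune_iterations):
    """Initial epoch passed to each successive finetune callback."""
    schedule = []
    epoch = start_epoch
    for _ in range(nb_prune_iterations):
        schedule.append(epoch)
        epoch += finetune_epochs  # each round trains finetune_epochs more epochs
    return schedule


# Trained 12 epochs up front, 6 finetune epochs per round, 3 pruning rounds
print(finetune_epoch_schedule(12, 6, 3))  # [12, 18, 24]
```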
      

Pruning

You will need to select which pruning method you would like to use. In this example, I will use KMeans pruning.

import ridurre

# Create the model
model = build_model(...)

# Define compile callback
def compile_my_model(model):
    model.compile(...)

# Compile with your callback (of course, you can use a different compilation for this training run and for the pruning)
compile_my_model(model)

# Train if you would like to start from a better position
model.fit(...)

# Define finetuning callback
def finetune_my_model(model, initial_epoch, finetune_epochs):
    model.fit(..., epochs=finetune_epochs, initial_epoch=initial_epoch)

# We can start pruning
pruning = ridurre.KMeansFilterPruning(0.9,
                                      compile_my_model,
                                      finetune_my_model,
                                      6,
                                      maximum_pruning_percent=0.4,
                                      maximum_prune_iterations=12)
model, _ = pruning.run_pruning(model)

At the end of the pruning step, you will have a trained and pruned model ready to use. I recommend training your model a little longer after the pruning as an extra step towards accuracy.
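The intuition behind similarity-based methods such as KMeans pruning is that filters which lie close to each other extract nearly the same features, so only one representative per group needs to be kept. The greedy, pure-Python sketch below illustrates that idea only; it is not ridurre's actual clustering implementation:

```python
import math


def prune_redundant_filters(filters, distance_threshold):
    """Keep one representative per group of near-duplicate filters."""
    kept, removed = [], []
    for i, candidate in enumerate(filters):
        # A filter close to an already-kept one is considered redundant
        if any(math.dist(candidate, filters[k]) < distance_threshold for k in kept):
            removed.append(i)
        else:
            kept.append(i)
    return kept, removed


# Filters 0/1 and 2/3 are near-duplicates, so one of each pair is removed
print(prune_redundant_filters([[0, 0], [0.1, 0], [5, 5], [5.1, 5]], 1.0))
# ([0, 2], [1, 3])
```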

Future work

  • Look for problematic cases where there is a merging operation (like add) and warn the user that the different inputs to that operation should be pruned in the same manner
    • A good example of a case like this is ResNet
  • Define a "prunable" set of layers
    • With regex or layer indices
    • This needs to find add, multiply, average, etc. operations (layers) which need the same number of filters from their different inputs
  • Different pruning factors for channels with different numbers of filters
  • More pruning solutions
  • Remove the dependency on kerassurgeon, as only its channel delete function is used

Papers

[1] Filter Level Pruning Based on Similar Feature Extraction for Convolutional Neural Networks

[2] Demystifying Neural Network Filter Pruning

About

Gábor Vecsei
