
VITA-Group / Deep K Means Pytorch

[ICML 2018] "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions"

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Deep K Means Pytorch

Triplet Attention
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
Stars: ✭ 222 (+93.04%)
Mutual labels:  paper, convolutional-neural-networks
Shiftresnet Cifar
ResNet with Shift, Depthwise, or Convolutional Operations for CIFAR-100, CIFAR-10 on PyTorch
Stars: ✭ 112 (-2.61%)
Mutual labels:  convolutional-neural-networks, compression
Recursive Cnns
Implementation of my paper "Real-time Document Localization in Natural Images by Recursive Application of a CNN."
Stars: ✭ 80 (-30.43%)
Mutual labels:  paper, convolutional-neural-networks
Deblurgan
Image Deblurring using Generative Adversarial Networks
Stars: ✭ 2,033 (+1667.83%)
Mutual labels:  paper, convolutional-neural-networks
Core50
CORe50: a new Dataset and Benchmark for Continual Learning
Stars: ✭ 91 (-20.87%)
Mutual labels:  paper, convolutional-neural-networks
Densepoint
DensePoint: Learning Densely Contextual Representation for Efficient Point Cloud Processing (ICCV 2019)
Stars: ✭ 110 (-4.35%)
Mutual labels:  convolutional-neural-networks
Makeself
A self-extracting archiving tool for Unix systems, in 100% shell script.
Stars: ✭ 1,582 (+1275.65%)
Mutual labels:  compression
Alexnet Experiments Keras
Code examples for training AlexNet using Keras and Theano
Stars: ✭ 109 (-5.22%)
Mutual labels:  convolutional-neural-networks
Shot Type Classifier
Detecting cinema shot types using a ResNet-50
Stars: ✭ 109 (-5.22%)
Mutual labels:  convolutional-neural-networks
Brainforge
A Neural Networking library based on NumPy only
Stars: ✭ 114 (-0.87%)
Mutual labels:  convolutional-neural-networks
Gzip
💾 Golang gzip middleware for Gin and net/http, ready to use out of the box and customizable
Stars: ✭ 113 (-1.74%)
Mutual labels:  compression
Nlp Papers
Papers and Book to look at when starting NLP 📚
Stars: ✭ 111 (-3.48%)
Mutual labels:  paper
Pytorch Estimate Flops
Estimate/count FLOPS for a given neural network using pytorch
Stars: ✭ 110 (-4.35%)
Mutual labels:  convolutional-neural-networks
Sigver wiwd
Learned representation for Offline Handwritten Signature Verification. Models and code to extract features from signature images.
Stars: ✭ 112 (-2.61%)
Mutual labels:  convolutional-neural-networks
Deepgaze
Computer Vision library for human-computer interaction. It implements Head Pose and Gaze Direction Estimation Using Convolutional Neural Networks, Skin Detection through Backprojection, Motion Detection and Tracking, Saliency Map.
Stars: ✭ 1,552 (+1249.57%)
Mutual labels:  convolutional-neural-networks
Facedet
Implementations of common deep-learning-based face detection algorithms
Stars: ✭ 109 (-5.22%)
Mutual labels:  convolutional-neural-networks
Optimus
Image conversion and optimization desktop app.
Stars: ✭ 111 (-3.48%)
Mutual labels:  compression
Deep learning notes
a collection of my notes on deep learning
Stars: ✭ 112 (-2.61%)
Mutual labels:  convolutional-neural-networks
Singleviewreconstruction
Official Code: 3D Scene Reconstruction from a Single Viewport
Stars: ✭ 110 (-4.35%)
Mutual labels:  paper
Libbrotli
meta project to build libraries from the brotli source code
Stars: ✭ 110 (-4.35%)
Mutual labels:  compression

PyTorch Code for 'Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions'

Introduction

PyTorch Implementation of our ICML 2018 paper "Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions".

[Poster]
[PPT]

In our paper, we proposed a simple yet effective scheme for compressing convolutions by applying k-means clustering to the weights. Compression is achieved through weight sharing: only the K cluster centers and the per-weight assignment indexes need to be recorded.
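To make the weight-sharing idea concrete, here is a minimal sketch (not the code from this repository) that clusters individual scalar weights; it assumes NumPy and scikit-learn, and the function names are our own:

import numpy as np
from sklearn.cluster import KMeans

def compress_weights(W, K=16):
    # Cluster the flattened weights into K centers; the compressed form
    # is just the K centers plus a small cluster index per weight.
    km = KMeans(n_clusters=K, n_init=10).fit(W.reshape(-1, 1))
    centers = km.cluster_centers_.ravel()     # K float values
    indexes = km.labels_.astype(np.uint8)     # log2(K)-bit index per weight
    return centers, indexes, W.shape

def decompress_weights(centers, indexes, shape):
    # Lossy reconstruction of the weight tensor by table lookup.
    return centers[indexes].reshape(shape)

Storing 16 centers plus 4-bit indexes in place of 32-bit floats gives roughly an 8x size reduction for large layers.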

We then introduced a novel spectrally relaxed k-means regularization, which encourages hard assignments of convolutional layer weights to K learned cluster centers during re-training.
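One way such a regularizer can be realized in PyTorch is sketched below (our reconstruction under stated assumptions, not necessarily the paper's exact formulation): the hard cluster-indicator matrix F is relaxed to an orthonormal matrix, whose optimum is given by the top-K singular vectors, and the projection residual is added to the training loss.

import torch

def spectral_kmeans_reg(W, K):
    # W: 2-D matrix whose rows are the weight vectors to be clustered.
    # Under the spectral relaxation, the orthonormal F minimizing
    # ||W - F F^T W||_F^2 consists of the top-K left singular vectors.
    U, _, _ = torch.linalg.svd(W.detach(), full_matrices=False)
    F = U[:, :K]                      # relaxed cluster indicators (no grad)
    residual = W - F @ (F.t() @ W)    # distance to the clustered subspace
    return residual.pow(2).sum()

# During re-training (lam is a hypothetical trade-off weight):
# loss = task_loss + lam * spectral_kmeans_reg(W_2d, K)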

We additionally proposed an improved set of metrics to estimate the energy consumption of CNN hardware implementations; the resulting estimates are verified to be consistent with a previously proposed energy estimation tool extrapolated from actual hardware measurements.
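For orientation, metrics of this kind are typically built as a weighted sum of compute and data-movement counts. The sketch below is generic and illustrative only; the per-operation cost constants are placeholders, not values from the paper or from hardware.

def estimate_energy(num_macs, num_dram_accesses, num_sram_accesses,
                    e_mac=1.0, e_dram=200.0, e_sram=6.0):
    # Total energy as a weighted sum of operation counts. The relative
    # costs e_* are placeholder ratios; real values come from hardware
    # measurements (a DRAM access typically costs far more than a MAC).
    return (num_macs * e_mac
            + num_dram_accesses * e_dram
            + num_sram_accesses * e_sram)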

Finally, we evaluated Deep k-Means on several CNN models in terms of both compression ratio and energy consumption reduction, observing promising results with no loss of accuracy.

PyTorch Model

  • [x] Wide ResNet
  • [ ] LeNet-Caffe-5

Dependencies

Python 3.5

Testing Deep k-Means

  • Wide ResNet
python WideResNet_Deploy.py

Filters Visualization

Sample Visualization of Wide ResNet (Conv2)

[Figure: Conv2 filter visualizations of the pre-trained model (before and after compression) and the Deep k-Means re-trained model (before and after compression)]

Citation

If you find this code useful, please cite the following paper:

@inproceedings{deepkmeans,
    title={Deep k-Means: Re-Training and Parameter Sharing with Harder Cluster Assignments for Compressing Deep Convolutions},
    author={Junru Wu and Yue Wang and Zhenyu Wu and Zhangyang Wang and Ashok Veeraraghavan and Yingyan Lin},
    booktitle={International Conference on Machine Learning (ICML)},
    year={2018}
}

Acknowledgment

We would like to thank the author of libKMCUDA, a CUDA-based k-means library, without which we would not have been able to run large-scale k-means efficiently.
