
mmasana / Dalr

License: MIT
Implementation of "Domain-adaptive deep network compression", ICCV 2017

Projects that are alternatives to or similar to Dalr

Pacmap
PaCMAP: Large-scale Dimension Reduction Technique Preserving Both Global and Local Structure
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Shadowmusic
A temporal music synthesizer
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Uc berkeley Applied Machine Learning
Materials for Applied Machine Learning Taught in Python
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Idb Idb Invest Coronavirus Impact Dashboard
Follow the impact of the COVID-19 outbreak in Latin America in real time
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Medium Article
Repo for articles in my personal blog and Medium
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Tensorflow2.0 eager execution tutorials
Tutorials of TensorFlow eager execution
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Alfabattle2 1stproblem
Alfabattle 2.0 1st task Top-6 solution: 8-folds lgbm blend
Stars: ✭ 27 (-3.57%)
Mutual labels:  jupyter-notebook
Deep learning projects
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Sports Type Classifier
Classify the type of sports from images
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Yancheng Sales
Tianchi "Impression Yancheng" car sales prediction competition
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Data driven science python demos
IPython notebooks with demo code intended as a companion to the book "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by J. Nathan Kutz and Steven L. Brunton
Stars: ✭ 27 (-3.57%)
Mutual labels:  jupyter-notebook
Chexpert
CheXpert competition models -- attention augmented convolutions on DenseNet, ResNet; EfficientNet
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Data Visualizations Medium
Understanding Data and Machine Learning Models with Visualizations
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Advanced Gradient Obfuscating
Take further steps in the arms race of adversarial examples with only preprocessing.
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Linguistic and stylistic complexity
Linguistic and stylistic complexity measures for (literary) texts
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Sid
Official implementation for ICCV19 "Shadow Removal via Shadow Image Decomposition"
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Resimnet
Implementation of ReSimNet for drug response similarity prediction
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Imageretrieval
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
Personal History Archive
An experiment in creating a dump of your personal browser history for analysis
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook
S gd2
Stress-based Graph Drawing by Stochastic Gradient Descent
Stars: ✭ 28 (+0%)
Mutual labels:  jupyter-notebook

Domain-adaptive deep network compression

The ICCV 2017 open-access paper is available and the poster can be found here. The arXiv pre-print is also available.

How to run the TensorFlow code

The example is a simple vanilla experiment without domain transfer: we train a LeNet network from scratch on the MNIST dataset and then compress it using either the truncated SVD baseline or our proposed DALR method. The example code is provided as a Jupyter notebook.

cd code/tensorflow
jupyter notebook Experiment_LeNet_MNIST.ipynb
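For intuition, here is a minimal NumPy sketch of the two compression schemes the notebook compares. This is not the repository's code: the function names, the shapes, and the exact DALR formulation are assumptions based on the paper's description. Both routines factor a fully-connected layer's m x n weight matrix W into two smaller matrices A (m x k) and B (k x n), so the layer can be replaced by two thinner layers.

import numpy as np

def svd_compress(W, k):
    # Baseline: truncated SVD of the weights alone; activation
    # statistics are ignored.
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :k] * s[:k]   # m x k, singular values folded in
    B = Vt[:k, :]          # k x n
    return A, B

def dalr_compress(W, X, k):
    # Assumed DALR formulation: minimize ||W @ X - W_hat @ X|| over
    # rank-k W_hat, where X (n x samples) holds input activations
    # collected on the target domain. The closed-form optimum projects
    # the responses Y = W @ X onto their top-k left singular vectors
    # U_k, giving W_hat = U_k @ (U_k.T @ W).
    Y = W @ X
    U, _, _ = np.linalg.svd(Y, full_matrices=False)
    A = U[:, :k]           # m x k
    B = A.T @ W            # k x n
    return A, B

Replacing the m x n layer by the pair (B, A) cuts its parameter count from m*n to k*(m+n).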

How to run the MATLAB code

The example network can be downloaded from here and copied into a new "nets/" folder:

mkdir nets
cd nets
wget http://mmasana.foracoffee.org/DALR_ICCV_2017/birds_vgg19_net.mat

The example can then be run from the "code/" folder by calling the "mainScript_compress_DALR.m" script in the MATLAB command window.
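For example, assuming MATLAB is started from the repository root:

cd code
mainScript_compress_DALR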

Citation

@InProceedings{Masana_2017_ICCV,
author = {Masana, Marc and van de Weijer, Joost and Herranz, Luis and Bagdanov, Andrew D. and Alvarez, Jose M.},
title = {Domain-Adaptive Deep Network Compression},
booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
month = {Oct},
year = {2017}
}

Code by Marc Masana, PhD student in the LAMP research group at the Computer Vision Center, Barcelona

Abstract

Deep Neural Networks trained on large datasets can be easily transferred to new domains with far fewer labeled examples by a process called fine-tuning. This has the advantage that representations learned in the large source domain can be exploited on smaller target domains. However, networks designed to be optimal for the source task are often prohibitively large for the target task. In this work we address the compression of networks after domain transfer.

We focus on compression algorithms based on low-rank matrix decomposition. Existing methods base compression solely on the learned network weights and ignore the statistics of network activations. We show that domain transfer leads to large shifts in network activations, and that it is desirable to take this into account when compressing. We demonstrate that considering activation statistics when compressing weights leads to a rank-constrained regression problem with a closed-form solution. Because our method takes the target domain into account, it can better remove the redundancy in the weights. Experiments show that our Domain Adaptive Low Rank (DALR) method significantly outperforms existing low-rank compression techniques. With our approach, the fc6 layer of VGG19 can be compressed more than 4x further than with truncated SVD alone, with only a minor or no loss in accuracy. When applied to domain-transferred networks, it allows compression down to 5-20% of the original number of parameters with only a minor drop in performance.
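As a back-of-the-envelope check of the parameter counts, a rank-k factorization of an m x n layer keeps k*(m+n) of the original m*n weights. The snippet below uses the standard VGG19 fc6 shape (7*7*512 = 25088 inputs, 4096 outputs); the ranks are illustrative choices, not values from the paper.

m, n = 4096, 25088                # VGG19 fc6: 25088 inputs, 4096 outputs
full = m * n                      # ~102.8M weights in the original layer
for k in (128, 256, 512):
    kept = k * (m + n)            # weights after a rank-k factorization
    print(k, round(100 * kept / full, 1), "% of the original")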
