marload / Aquvitae

License: MIT
The Easiest Knowledge Distillation Library for Lightweight Deep Learning

Programming Languages

Python

Projects that are alternatives to or similar to Aquvitae

Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (+309.86%)
Mutual labels:  model-compression
Lightctr
A lightweight and scalable framework that combines mainstream Click-Through-Rate prediction algorithms on a computational DAG with the Parameter Server and Ring-AllReduce collective-communication philosophies.
Stars: ✭ 644 (+807.04%)
Mutual labels:  model-compression
Model Optimization
A toolkit to optimize ML models for deployment for Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (+1297.18%)
Mutual labels:  model-compression
Amc
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Stars: ✭ 298 (+319.72%)
Mutual labels:  model-compression
Ghostnet.pytorch
[CVPR2020] GhostNet: More Features from Cheap Operations
Stars: ✭ 440 (+519.72%)
Mutual labels:  model-compression
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+873.24%)
Mutual labels:  model-compression
A- Guide -to Data Sciecne from mathematics
A blueprint for data science, from the mathematics to the algorithms. Not yet complete.
Stars: ✭ 25 (-64.79%)
Mutual labels:  model-compression
Awesome Knowledge Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.
Stars: ✭ 1,031 (+1352.11%)
Mutual labels:  model-compression
Knowledge Distillation Zoo
Pytorch implementation of various Knowledge Distillation (KD) methods.
Stars: ✭ 514 (+623.94%)
Mutual labels:  model-compression
Knowledge Distillation Pytorch
A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
Stars: ✭ 986 (+1288.73%)
Mutual labels:  model-compression
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (+376.06%)
Mutual labels:  model-compression
Knowledge Distillation Papers
knowledge distillation papers
Stars: ✭ 422 (+494.37%)
Mutual labels:  model-compression
Bipointnet
This project is the official implementation of our accepted ICLR 2021 paper BiPointNet: Binary Neural Network for Point Clouds.
Stars: ✭ 27 (-61.97%)
Mutual labels:  model-compression
Model Compression Papers
Papers for deep neural network compression and acceleration
Stars: ✭ 296 (+316.9%)
Mutual labels:  model-compression
Compress
Compressing Representations for Self-Supervised Learning
Stars: ✭ 43 (-39.44%)
Mutual labels:  model-compression
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-29.58%)
Mutual labels:  model-compression
Paddleslim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+853.52%)
Mutual labels:  model-compression
Keras model compression
Model compression in Keras based on Geoffrey Hinton's logit-based knowledge distillation method, applied to MNIST: 16x compression at over 95% accuracy. An implementation of "Distilling the Knowledge in a Neural Network" by Geoffrey Hinton et al.
Stars: ✭ 59 (-16.9%)
Mutual labels:  model-compression
Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+1332.39%)
Mutual labels:  model-compression
Channel Pruning
Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17)
Stars: ✭ 979 (+1278.87%)
Mutual labels:  model-compression


AquVitae: The Easiest Knowledge Distillation Library

AquVitae is a Python library that makes Knowledge Distillation easy to perform through a very simple API, with support for both TensorFlow and PyTorch. Knowledge Distillation is one of the most representative model compression techniques, alongside weight pruning and quantization, and AquVitae ships with popular and diverse Knowledge Distillation algorithms. If the deep learning model used in your project is too heavy, AquVitae lets you make it much faster with little loss of performance.

Getting Started

TensorFlow Tutorial (Colab)

In AquVitae, Knowledge Distillation takes a single call to the dist function.

from aquvitae import dist, ST

# Load the dataset
train_ds = ...
test_ds = ...

# Load the teacher and student model
teacher = ...
student = ...

optimizer = ...

# Knowledge Distillation
student = dist(
    teacher=teacher,
    student=student,
    algo=ST(alpha=0.6, T=2.5),
    optimizer=optimizer,
    train_ds=train_ds,
    test_ds=test_ds,
    iterations=3000
)
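
The snippet above leaves the dataset, models, and optimizer as placeholders. For illustration, here is a minimal TensorFlow sketch that fills them in, assuming (as the Colab badge above suggests) that the TensorFlow backend accepts plain Keras models and batched tf.data.Dataset objects, and that the teacher has already been trained. The dataset handling and model definitions are ordinary Keras code, not part of the AquVitae API.

import tensorflow as tf
from aquvitae import dist, ST

# MNIST as batched tf.data.Dataset objects (assumed input format for the TF backend)
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.astype("float32") / 255.0
x_test = x_test.astype("float32") / 255.0
train_ds = tf.data.Dataset.from_tensor_slices((x_train, y_train)).shuffle(10000).batch(64)
test_ds = tf.data.Dataset.from_tensor_slices((x_test, y_test)).batch(64)

# A larger teacher and a smaller student, both ordinary Keras models
def make_model(hidden_units):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(28, 28)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(hidden_units, activation="relu"),
        tf.keras.layers.Dense(10),
    ])

teacher = make_model(512)   # assumed to be trained (or loaded) before distillation
student = make_model(64)

optimizer = tf.keras.optimizers.Adam(1e-3)

# Distill the teacher into the student with the soft-target (ST) algorithm
student = dist(
    teacher=teacher,
    student=student,
    algo=ST(alpha=0.6, T=2.5),
    optimizer=optimizer,
    train_ds=train_ds,
    test_ds=test_ds,
    iterations=3000,
)

A PyTorch setup would look the same on the AquVitae side; only the dataset and model definitions change.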

Installation

$ pip install aquvitae

Algorithms

The Knowledge Distillation algorithms in AquVitae (✔️ marks the backends where an algorithm is implemented).

Algo    | Hyperparameters | Paper                                        | TF | TORCH
ST      | alpha, T        | Distilling the Knowledge in a Neural Network | ✔️ | ✔️
DML     | -               | Deep Mutual Learning                         | -  | -
FitNets | -               | FitNets: Hints for Thin Deep Nets            | -  | -
RKD     | -               | Relational Knowledge Distillation            | -  | -
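
For reference, the ST hyperparameters follow the usual convention from the Hinton et al. paper: T is the softmax temperature used to soften both teacher and student logits, and alpha balances the distillation term against the ordinary cross-entropy on the true labels. The sketch below shows that textbook loss to illustrate what the hyperparameters control; it is not AquVitae's exact internals, and which term alpha weights can differ between implementations.

import tensorflow as tf

def soft_target_loss(labels, teacher_logits, student_logits, alpha=0.6, T=2.5):
    # Distillation term: cross-entropy between temperature-softened teacher and
    # student distributions (equivalent to a KL divergence up to a constant),
    # scaled by T^2 so its gradients stay comparable to the hard-label term.
    soft_teacher = tf.nn.softmax(teacher_logits / T)
    log_soft_student = tf.nn.log_softmax(student_logits / T)
    kd = -tf.reduce_mean(tf.reduce_sum(soft_teacher * log_soft_student, axis=-1)) * (T ** 2)
    # Hard-label term: ordinary cross-entropy against the ground-truth labels.
    ce = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=tf.cast(labels, tf.int32), logits=student_logits
        )
    )
    return alpha * kd + (1.0 - alpha) * ce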

License

Copyright © marload

AquVitae is open-sourced software licensed under the MIT License.
