GitPlanet
Top 9 distillation open source projects
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
✭ 3,760
Jupyter Notebook
python
jupyter-notebook
pytorch
deep-neural-networks
onnx
quantization
pruning
regularization
group-lasso
distillation
truncated-svd
network-compression
pruning-structures
early-exit
automl-for-compression
Yolov5-distillation-train-inference
Yolov5 distillation training | YOLOv5 knowledge-distillation training; supports training on your own data
✭ 84
python
Jupyter Notebook
shell
Dockerfile
object-detection
model-compression
distillation
yolov5
knowledge-distillation
FKD
A Fast Knowledge Distillation Framework for Visual Recognition
✭ 49
python
deep-learning
pytorch
knowledge-distillation
efficient-algorithm
distillation
self-supervised-learning
efficient-training
efficientnet-pytorch
simpleAICV-pytorch-ImageNet-COCO-training
SimpleAICV: PyTorch training examples on the ImageNet (ILSVRC2012), COCO2017, and VOC2007+2012 datasets. Includes ResNet, DarkNet, RetinaNet, FCOS, CenterNet, TTFNet, YOLOv3, YOLOv4, YOLOv5, and YOLOX.
✭ 276
python
shell
pytorch
classification
imagenet
coco
resnet
object-detection
darknet
cifar100
voc
distillation
retinanet
yolov3
fcos
centernet
yolov4
yolov5
ttfnet
ilsvrc2012
yolox
mosaic-augment
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
✭ 56
python
nlp
transformers
lstm
theseus
pruning
quantization
bert
distillation
pytorch-lightning
fastbert
deebert
roberta-wwm-base-distill
A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large as the teacher
✭ 61
python
natural-language-processing
tensorflow
pretrained-models
bert
distillation
roberta
CCL
PyTorch implementation of the CVPR 2021 paper "Distilling Audio-Visual Knowledge by Compositional Contrastive Learning"
✭ 76
python
shell
pytorch
video-recognition
distillation
audio-visual-learning
contrastive-learning
cvpr2021
compositional-contrastive-learning
audio-teacher-models
multi-modal-distillation
adaptive-wavelets
Adaptive, interpretable wavelets across domains (NeurIPS 2021)
✭ 58
Jupyter Notebook
python
machine-learning
statistics
deep-learning
neural-network
pytorch
interpretability
wavelets
wavelet-analysis
distillation
xai
explainability
ZAQ-code
CVPR 2021 : Zero-shot Adversarial Quantization (ZAQ)
✭ 59
python
quantization
zero-shot
model-compression
distillation
1-9 of 9 distillation projects
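Most of the projects above build on the same core technique: training a small student network to match the temperature-softened output distribution of a larger teacher. As a minimal, hypothetical sketch (assuming PyTorch; the function name, temperature, and weighting are illustrative defaults, not taken from any listed repository):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic soft-label knowledge-distillation loss (illustrative sketch)."""
    # Soft-target term: KL divergence between temperature-softened distributions.
    # Scaling by T*T keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: a batch of 8 samples over 10 classes.
student = torch.randn(8, 10)
teacher = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student, teacher, labels)
```

The individual repositories vary this recipe (feature-level distillation, contrastive objectives, region-level soft labels for detection), but the teacher-student loss above is the common starting point.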