
bhheo / AB_distillation

License: MIT
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to AB_distillation

Awesome Knowledge Distillation
Stars: ✭ 2,634 (+2408.57%)
Mutual labels:  knowledge-distillation, knowledge-transfer
self-driving-car
Implementation of the paper "End to End Learning for Self-Driving Cars"
Stars: ✭ 54 (-48.57%)
Mutual labels:  transfer-learning
BA3US
code for our ECCV 2020 paper "A Balanced and Uncertainty-aware Approach for Partial Domain Adaptation"
Stars: ✭ 31 (-70.48%)
Mutual labels:  transfer-learning
FisherPruning
Group Fisher Pruning for Practical Network Compression(ICML2021)
Stars: ✭ 127 (+20.95%)
Mutual labels:  network-compression
ATA-GAN
Demo code for Attention-Aware Generative Adversarial Networks paper
Stars: ✭ 13 (-87.62%)
Mutual labels:  transfer-learning
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+666.67%)
Mutual labels:  transfer-learning
Warehouse Robot Path Planning
A multi-agent path planning solution for a warehouse scenario using Q-learning and transfer learning. 🤖️
Stars: ✭ 59 (-43.81%)
Mutual labels:  transfer-learning
speech-recognition-transfer-learning
Speech command recognition with DenseNet transfer learning from UrbanSound8k, in Keras/TensorFlow
Stars: ✭ 18 (-82.86%)
Mutual labels:  transfer-learning
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (-7.62%)
Mutual labels:  knowledge-distillation
Transfer-Learning-for-Fault-Diagnosis
This repository is for transfer learning and domain adaptation applied to fault diagnosis.
Stars: ✭ 123 (+17.14%)
Mutual labels:  transfer-learning
smoke-detection-transfer-learning
use transfer learning to detect smoke in images and videos
Stars: ✭ 16 (-84.76%)
Mutual labels:  transfer-learning
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.
Stars: ✭ 666 (+534.29%)
Mutual labels:  knowledge-distillation
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-62.86%)
Mutual labels:  transfer-learning
Transformers-Domain-Adaptation
Adapt Transformer-based language models to new text domains
Stars: ✭ 67 (-36.19%)
Mutual labels:  transfer-learning
ReinventCommunity
No description or website provided.
Stars: ✭ 103 (-1.9%)
Mutual labels:  transfer-learning
ProteinLM
Protein Language Model
Stars: ✭ 76 (-27.62%)
Mutual labels:  transfer-learning
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+118.1%)
Mutual labels:  transfer-learning
object detection
Implementation of object detection using TensorFlow 2.1.0 | this can be used in a car for object detection
Stars: ✭ 13 (-87.62%)
Mutual labels:  transfer-learning
WSDM2022-PTUPCDR
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.
Stars: ✭ 65 (-38.1%)
Mutual labels:  transfer-learning
TransforLearning TensorFlow
Classify your own data with a pre-trained InceptionV3 model; if you use this code, please give it a star.
Stars: ✭ 58 (-44.76%)
Mutual labels:  transfer-learning

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons

Official PyTorch implementation of the paper:

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019).

Slides and poster are available on the homepage.
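
For reference, the sketch below illustrates the core idea behind activation-boundary distillation: the student is trained so that the signs of its hidden-layer pre-activations (i.e., which neurons fire) match the teacher's, via a margin-based hinge penalty rather than a plain L2 loss. This is a minimal sketch only; the margin value, the connector layer that matches feature dimensions, and the tensor shapes are assumptions, not the repository's exact implementation.

    import torch.nn.functional as F

    def ab_transfer_loss(student_pre, teacher_pre, margin=1.0):
        # student_pre / teacher_pre: pre-activation feature maps (before ReLU).
        # They are assumed to have the same shape; in practice a 1x1-conv
        # "connector" maps student features to the teacher's dimensionality.
        teacher_active = (teacher_pre > 0).float()  # teacher activation indicator
        # Where the teacher neuron is active, push the student pre-activation
        # above +margin; where it is inactive, push it below -margin.
        per_neuron = (teacher_active * F.relu(margin - student_pre) ** 2
                      + (1.0 - teacher_active) * F.relu(margin + student_pre) ** 2)
        return per_neuron.mean()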

Environment

Python 3.6, PyTorch 0.4.1, torchvision
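
One way to set up a matching environment (the torchvision version below is an assumption; the README only names the package):

    pip install torch==0.4.1 torchvision==0.2.1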

Knowledge distillation (CIFAR-10)

cifar10_AB_distillation.py


Distillation from WRN 22-4 (teacher) to WRN 16-2 (student) on the CIFAR-10 dataset.

A pre-trained teacher network (WRN 22-4) is included; just run the code.
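
For example, with the default arguments:

    python cifar10_AB_distillation.py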

Transfer learning (MIT_scenes)

MITscenes_AB_distillation.py


Transfer learning from an ImageNet pre-trained model (teacher) to a randomly initialized model (student).

Teacher : ImageNet pre-trained ResNet 50

Student : MobileNet or MobileNetV2 (randomly initialized model)

Please change the base learning rate to 0.1 for MobileNetV2.


The MIT_scenes dataset should be arranged for torchvision's ImageFolder function:

Train set : $dataset_path / train / $class_name / $image_name

Test set : $dataset_path / test / $class_name / $image_name

Then run the script with the dataset path.
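
For reference, a minimal sketch of how this layout is typically read with torchvision's ImageFolder (the path and transforms are placeholders, not the repository's exact settings):

    import os
    import torchvision.transforms as transforms
    from torchvision.datasets import ImageFolder

    dataset_path = '/path/to/MIT_scenes'   # placeholder path
    transform = transforms.Compose([
        transforms.Resize((224, 224)),     # input size is an assumption
        transforms.ToTensor(),
    ])
    train_set = ImageFolder(os.path.join(dataset_path, 'train'), transform=transform)
    test_set = ImageFolder(os.path.join(dataset_path, 'test'), transform=transform)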

MobileNet

python MITscenes_AB_distillation.py --data_root $dataset_path

MobileNet V2

python MITscenes_AB_distillation.py --data_root $dataset_path --network mobilenetV2

Other implementations

TensorFlow: https://github.com/sseung0703/Knowledge_distillation_methods_wtih_Tensorflow

Citation

@inproceedings{ABdistill,
	title = {Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons},
	author = {Byeongho Heo and Minsik Lee and Sangdoo Yun and Jin Young Choi},
	booktitle = {AAAI Conference on Artificial Intelligence (AAAI)},
	year = {2019}
}