yxgeee / BAKE

License: MIT
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to BAKE

SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (-48.1%)
Mutual labels:  imagenet, knowledge-distillation
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Stars: ✭ 27 (-65.82%)
Mutual labels:  imagenet, knowledge-distillation
Labelimg
🖍️ LabelImg is a graphical image annotation tool for labeling object bounding boxes in images
Stars: ✭ 16,088 (+20264.56%)
Mutual labels:  imagenet
alexnet
A custom implementation of AlexNet in TensorFlow
Stars: ✭ 21 (-73.42%)
Mutual labels:  imagenet
Mobilenetv3 Pytorch
An implementation of the "Searching for MobileNetV3" paper in PyTorch
Stars: ✭ 243 (+207.59%)
Mutual labels:  imagenet
Moga
MoGA: Searching Beyond MobileNetV3
Stars: ✭ 220 (+178.48%)
Mutual labels:  imagenet
Dawn Bench Entries
DAWNBench: An End-to-End Deep Learning Benchmark and Competition
Stars: ✭ 254 (+221.52%)
Mutual labels:  imagenet
Atomnas
Code for ICLR 2020 paper 'AtomNAS: Fine-Grained End-to-End Neural Architecture Search'
Stars: ✭ 197 (+149.37%)
Mutual labels:  imagenet
distill-and-select
The authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" [IJCV 2022]
Stars: ✭ 43 (-45.57%)
Mutual labels:  knowledge-distillation
Pyramidnet Pytorch
A PyTorch implementation for PyramidNets (Deep Pyramidal Residual Networks, https://arxiv.org/abs/1610.02915)
Stars: ✭ 234 (+196.2%)
Mutual labels:  imagenet
nested-transformer
Nested Hierarchical Transformer https://arxiv.org/pdf/2105.12723.pdf
Stars: ✭ 174 (+120.25%)
Mutual labels:  imagenet
Pyconv
Pyramidal Convolution: Rethinking Convolutional Neural Networks for Visual Recognition (https://arxiv.org/pdf/2006.11538.pdf)
Stars: ✭ 231 (+192.41%)
Mutual labels:  imagenet
Triplet Attention
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
Stars: ✭ 222 (+181.01%)
Mutual labels:  imagenet
kdtf
Knowledge distillation using TensorFlow
Stars: ✭ 139 (+75.95%)
Mutual labels:  knowledge-distillation
Mini Imagenet Tools
Tools for generating mini-ImageNet dataset and processing batches
Stars: ✭ 209 (+164.56%)
Mutual labels:  imagenet
Sequential Imagenet Dataloader
A plug-in replacement for DataLoader to load ImageNet disk-sequentially in PyTorch.
Stars: ✭ 198 (+150.63%)
Mutual labels:  imagenet
Fusenet
A deep fusion project on deeply-fused nets, with a study of their connection to ensembling
Stars: ✭ 230 (+191.14%)
Mutual labels:  imagenet
Selecsls Pytorch
Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
Stars: ✭ 251 (+217.72%)
Mutual labels:  imagenet
Object-Detection-Knowledge-Distillation
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
Stars: ✭ 189 (+139.24%)
Mutual labels:  knowledge-distillation
cozmo-tensorflow
🤖 Cozmo the Robot recognizes objects with TensorFlow
Stars: ✭ 61 (-22.78%)
Mutual labels:  imagenet

Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification

Updates

[2021-06-08] The implementation of BAKE on small-scale datasets has been added; please refer to small_scale.
[2021-06-09] The implementation of BAKE on ImageNet has been added; please refer to imagenet.
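BAKE's core idea is to build better soft targets for self-distillation by ensembling the knowledge of the samples within a single mini-batch, weighted by their feature affinities, so that no auxiliary teacher network is required. Below is a minimal PyTorch sketch of that mechanism; the function names, the closed-form propagation step, and the hyper-parameters tau, omega, and temperature are illustrative assumptions rather than the repository's exact code. Please refer to the imagenet and small_scale directories for the official implementation.

import torch
import torch.nn.functional as F

@torch.no_grad()
def bake_soft_targets(features, logits, tau=0.5, omega=0.5, temperature=4.0):
    """Batch knowledge ensembling (illustrative sketch): propagate each
    sample's tempered prediction to its batch neighbours through a
    feature-affinity matrix, yielding ensembled soft targets."""
    # Row-stochastic affinities between L2-normalised features;
    # the -inf diagonal stops a sample from teaching itself.
    z = F.normalize(features, dim=1)
    sim = z @ z.t()
    sim.fill_diagonal_(float('-inf'))
    A = F.softmax(sim / tau, dim=1)

    # Closed-form knowledge propagation, Q = (1 - omega) * (I - omega * A)^-1 * P,
    # i.e. the limit of repeated affinity-weighted averaging of predictions.
    P = F.softmax(logits / temperature, dim=1)
    eye = torch.eye(A.size(0), device=A.device)
    return (1.0 - omega) * torch.linalg.solve(eye - omega * A, P)

def bake_loss(logits, soft_targets, temperature=4.0):
    # Standard distillation objective: KL divergence between the model's
    # tempered log-probabilities and the batch-ensembled soft targets.
    log_p = F.log_softmax(logits / temperature, dim=1)
    return F.kl_div(log_p, soft_targets, reduction='batchmean') * temperature ** 2

In training, this distillation term would typically be added to the usual cross-entropy loss on the ground-truth labels, with features taken from the backbone's penultimate layer and logits from its classifier head.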

Citation

If you find BAKE helpful in your research, please consider citing:

@misc{ge2020bake,
    title={Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification},
    author={Yixiao Ge and Ching Lam Choi and Xiao Zhang and Peipei Zhao and Feng Zhu and Rui Zhao and Hongsheng Li},
    year={2021},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}