
87 open-source projects that are alternatives to or similar to Efficient-Computing

Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+328.9%)
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
Stars: ✭ 121 (-74.47%)
Mutual labels:  knowledge-distillation
ACCV TinyGAN
BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16x compression
Stars: ✭ 62 (-86.92%)
Mutual labels:  knowledge-distillation
Kd lib
A PyTorch knowledge distillation library for benchmarking and extending work in the domains of knowledge distillation, pruning, and quantization.
Stars: ✭ 173 (-63.5%)
Mutual labels:  model-compression
Awesome Knowledge Distillation
Awesome Knowledge Distillation
Stars: ✭ 2,634 (+455.7%)
Mutual labels:  knowledge-distillation
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (-91.35%)
Mutual labels:  knowledge-distillation
Bert Of Theseus
⛵️The official PyTorch implementation for "BERT-of-Theseus: Compressing BERT by Progressive Module Replacing" (EMNLP 2020).
Stars: ✭ 209 (-55.91%)
Mutual labels:  model-compression
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-91.35%)
Mutual labels:  model-compression
BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Stars: ✭ 79 (-83.33%)
Mutual labels:  knowledge-distillation
Pytorch Weights pruning
PyTorch implementation of weight pruning.
Stars: ✭ 158 (-66.67%)
Mutual labels:  model-compression
Collaborative Distillation
PyTorch code for our CVPR'20 paper "Collaborative Distillation for Ultra-Resolution Universal Style Transfer"
Stars: ✭ 138 (-70.89%)
Mutual labels:  model-compression
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (-56.96%)
Mutual labels:  model-compression
Structured-Bayesian-Pruning-pytorch
PyTorch implementation of Structured Bayesian Pruning.
Stars: ✭ 18 (-96.2%)
Mutual labels:  model-compression
Knowledge distillation via TF2.0
Code for recent knowledge distillation algorithms and benchmark results, implemented with the TF2.0 low-level API.
Stars: ✭ 87 (-81.65%)
Mutual labels:  knowledge-distillation
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+35.86%)
Mutual labels:  knowledge-distillation
Pocketflow
An Automatic Model Compression (AutoMC) framework for developing smaller and faster AI applications.
Stars: ✭ 2,672 (+463.71%)
Mutual labels:  model-compression
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (-42.83%)
Mutual labels:  knowledge-distillation
Jfasttext
Java interface for fastText
Stars: ✭ 193 (-59.28%)
Mutual labels:  model-compression
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-95.57%)
Mutual labels:  knowledge-distillation
Keras compressor
Model Compression CLI Tool for Keras.
Stars: ✭ 160 (-66.24%)
Mutual labels:  model-compression
LabelRelaxation-CVPR21
Official PyTorch Implementation of Embedding Transfer with Label Relaxation for Improved Metric Learning, CVPR 2021
Stars: ✭ 37 (-92.19%)
Mutual labels:  knowledge-distillation
Ld Net
Efficient Contextualized Representation: Language Model Pruning for Sequence Labeling
Stars: ✭ 148 (-68.78%)
Mutual labels:  model-compression
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (-87.34%)
Mutual labels:  knowledge-distillation
distill-and-select
Authors' official PyTorch implementation of "DnS: Distill-and-Select for Efficient and Accurate Video Indexing and Retrieval" [IJCV 2022].
Stars: ✭ 43 (-90.93%)
Mutual labels:  knowledge-distillation
Microexpnet
MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Frontal Face Images
Stars: ✭ 121 (-74.47%)
Mutual labels:  model-compression
Tf2
An open-source deep learning inference engine based on FPGAs.
Stars: ✭ 113 (-76.16%)
Mutual labels:  model-compression
Auto-Compression
Automatic DNN compression tool with various model compression and neural architecture search techniques
Stars: ✭ 19 (-95.99%)
Mutual labels:  model-compression
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (-79.54%)
Mutual labels:  knowledge-distillation
kdtf
Knowledge distillation using TensorFlow.
Stars: ✭ 139 (-70.68%)
Mutual labels:  knowledge-distillation
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-94.3%)
Mutual labels:  knowledge-distillation
Hawq
Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
Stars: ✭ 108 (-77.22%)
Mutual labels:  model-compression
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (+40.51%)
Mutual labels:  knowledge-distillation
model optimizer
Model optimizer used in Adlik.
Stars: ✭ 22 (-95.36%)
Mutual labels:  knowledge-distillation
SemCKD
Official implementation of the AAAI 2021 paper "Cross-Layer Distillation with Semantic Calibration".
Stars: ✭ 42 (-91.14%)
Mutual labels:  knowledge-distillation
FGD
Focal and Global Knowledge Distillation for Detectors (CVPR 2022)
Stars: ✭ 124 (-73.84%)
Mutual labels:  knowledge-distillation
torch-model-compression
An automated toolset for analyzing and modifying the structure of PyTorch models, including a model compression algorithm library that automatically analyzes model structure.
Stars: ✭ 126 (-73.42%)
Mutual labels:  model-compression
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (-52.95%)
Mutual labels:  model-compression
ESNAC
Learnable Embedding Space for Efficient Neural Architecture Compression
Stars: ✭ 27 (-94.3%)
Mutual labels:  model-compression
Torch Pruning
A PyTorch pruning toolkit for structured neural network pruning that maintains layer dependencies.
Stars: ✭ 193 (-59.28%)
Mutual labels:  model-compression
MLIC-KD-WSD
Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
Stars: ✭ 58 (-87.76%)
Mutual labels:  knowledge-distillation
Mobile Id
Deep Face Model Compression
Stars: ✭ 187 (-60.55%)
Mutual labels:  model-compression
Yolov5-distillation-train-inference
YOLOv5 knowledge distillation training; supports training on your own data.
Stars: ✭ 84 (-82.28%)
Mutual labels:  model-compression
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (-64.98%)
Mutual labels:  model-compression
BitPack
BitPack is a practical tool to efficiently save ultra-low precision/mixed-precision quantized models.
Stars: ✭ 36 (-92.41%)
Mutual labels:  model-compression
Pruning
Code for "Co-Evolutionary Compression for Unpaired Image Translation" (ICCV 2019) and "SCOP: Scientific Control for Reliable Neural Network Pruning" (NeurIPS 2020).
Stars: ✭ 159 (-66.46%)
Mutual labels:  model-compression
Zero-shot Knowledge Distillation Pytorch
ZSKD with PyTorch
Stars: ✭ 26 (-94.51%)
Mutual labels:  knowledge-distillation
Amc Models
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Stars: ✭ 154 (-67.51%)
Mutual labels:  model-compression
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+5.06%)
Mutual labels:  knowledge-distillation
Yolov3
YOLOv3 implemented in PyTorch.
Stars: ✭ 142 (-70.04%)
Mutual labels:  model-compression
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-90.72%)
Mutual labels:  model-compression
Condensa
Programmable Neural Network Compression
Stars: ✭ 129 (-72.78%)
Mutual labels:  model-compression
Object-Detection-Knowledge-Distillation
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
Stars: ✭ 189 (-60.13%)
Mutual labels:  knowledge-distillation
Awesome Model Compression
Papers about model compression.
Stars: ✭ 119 (-74.89%)
Mutual labels:  model-compression
FastPose
PyTorch real-time multi-person keypoint estimation.
Stars: ✭ 36 (-92.41%)
Mutual labels:  model-compression
ZAQ-code
CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)
Stars: ✭ 59 (-87.55%)
Mutual labels:  model-compression
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-90.51%)
Mutual labels:  knowledge-distillation
allie
🤖 A machine learning framework for audio, text, image, video, or .CSV files (50+ featurizers and 15+ model trainers).
Stars: ✭ 93 (-80.38%)
Mutual labels:  model-compression
FKD
A Fast Knowledge Distillation Framework for Visual Recognition
Stars: ✭ 49 (-89.66%)
Mutual labels:  knowledge-distillation
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (-77.85%)
Mutual labels:  knowledge-distillation
head-network-distillation
[IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural Networks for Edge-assisted Real-time Systems"
Stars: ✭ 27 (-94.3%)
Mutual labels:  knowledge-distillation
1-60 of 87 similar projects