
huawei-noah / Efficient-Computing

Licence: other

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to, or similar to, Efficient-Computing

Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+328.9%)
Mutual labels:  knowledge-distillation, model-compression
cool-papers-in-pytorch
Reimplementing cool papers in PyTorch...
Stars: ✭ 21 (-95.57%)
Mutual labels:  knowledge-distillation
SAN
[ECCV 2020] Scale Adaptive Network: Learning to Learn Parameterized Classification Networks for Scalable Input Images
Stars: ✭ 41 (-91.35%)
Mutual labels:  knowledge-distillation
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-91.35%)
Mutual labels:  model-compression
Structured-Bayesian-Pruning-pytorch
pytorch implementation of Structured Bayesian Pruning
Stars: ✭ 18 (-96.2%)
Mutual labels:  model-compression
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (+35.86%)
Mutual labels:  knowledge-distillation
LD
Localization Distillation for Dense Object Detection (CVPR 2022)
Stars: ✭ 271 (-42.83%)
Mutual labels:  knowledge-distillation
Yolov5-distillation-train-inference
Yolov5 distillation training | YOLOv5 knowledge distillation training; supports training on your own data
Stars: ✭ 84 (-82.28%)
Mutual labels:  model-compression
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-90.72%)
Mutual labels:  model-compression
Zero-shot Knowledge Distillation Pytorch
ZSKD with PyTorch
Stars: ✭ 26 (-94.51%)
Mutual labels:  knowledge-distillation
MoTIS
Mobile (iOS) text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP). Accepted at NAACL 2022.
Stars: ✭ 60 (-87.34%)
Mutual labels:  knowledge-distillation
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
Stars: ✭ 97 (-79.54%)
Mutual labels:  knowledge-distillation
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-94.3%)
Mutual labels:  knowledge-distillation
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.
Stars: ✭ 666 (+40.51%)
Mutual labels:  knowledge-distillation
SemCKD
This is the official implementation for the AAAI-2021 paper (Cross-Layer Distillation with Semantic Calibration).
Stars: ✭ 42 (-91.14%)
Mutual labels:  knowledge-distillation
torch-model-compression
An automated toolkit for analyzing and modifying the structure of PyTorch models, including a model compression algorithm library that automatically analyzes model structure.
Stars: ✭ 126 (-73.42%)
Mutual labels:  model-compression
FastPose
PyTorch real-time multi-person keypoint estimation
Stars: ✭ 36 (-92.41%)
Mutual labels:  model-compression
ESNAC
Learnable Embedding Space for Efficient Neural Architecture Compression
Stars: ✭ 27 (-94.3%)
Mutual labels:  model-compression
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-90.51%)
Mutual labels:  knowledge-distillation
allie
🤖 A machine learning framework for audio, text, image, video, or .CSV files (50+ featurizers and 15+ model trainers).
Stars: ✭ 93 (-80.38%)
Mutual labels:  model-compression

Efficient-Computing

This repo is a collection of efficient computing methods.

Data-Efficient-Model-Compression is a series of model compression methods that require little or no training data.
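
To illustrate the general idea, the sketch below shows a generator-driven, data-free distillation loop in PyTorch: a student network is trained to match a frozen teacher's softened outputs on synthesized images, so no real training data is needed. This is only a hypothetical sketch of the technique; the generator architecture, losses, and hyper-parameters are placeholder assumptions, not code from this repository.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps random noise to synthetic images for the student to learn from."""
    def __init__(self, noise_dim=100, img_channels=3, img_size=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 128 * (img_size // 4) ** 2),
            nn.Unflatten(1, (128, img_size // 4, img_size // 4)),
            nn.BatchNorm2d(128), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(64, img_channels, 3, padding=1), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

def distill_step(teacher, student, generator, opt_student, opt_generator,
                 batch_size=128, noise_dim=100, temperature=4.0, device="cpu"):
    """One training step with no real data: synthesize images, then distill."""
    # 1) Update the student to mimic the teacher's softened predictions.
    z = torch.randn(batch_size, noise_dim, device=device)
    fake = generator(z).detach()
    with torch.no_grad():
        teacher_logits = teacher(fake)
    student_logits = student(fake)
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    opt_student.zero_grad()
    kd_loss.backward()
    opt_student.step()

    # 2) Update the generator to produce samples the teacher classifies
    #    confidently (a simple surrogate objective; published data-free
    #    methods add activation / diversity terms on top of this).
    fake = generator(torch.randn(batch_size, noise_dim, device=device))
    teacher_logits = teacher(fake)
    gen_loss = F.cross_entropy(teacher_logits, teacher_logits.argmax(dim=1))
    opt_generator.zero_grad()
    gen_loss.backward()
    opt_generator.step()
    return kd_loss.item(), gen_loss.item()

In practice the teacher is kept in eval mode with frozen weights, and the student's compact architecture is chosen to fit the target deployment budget; the distillation loss above is the standard temperature-scaled KL objective.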

Efficient-NAS is a series of neural architecture search (NAS) methods that can efficiently search for the desired model.
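
For flavor, here is a minimal, hypothetical sketch of efficient architecture search: candidates are sampled at random from a small search space and ranked with a cheap proxy evaluation (a few training batches per candidate). The search space, proxy metric, and budget are illustrative assumptions, not the algorithms implemented in this repository.

import random
import torch
import torch.nn as nn
import torch.nn.functional as F

# A toy search space: each candidate is a (depth, width, kernel) choice.
SEARCH_SPACE = {
    "depth": [2, 3, 4],      # number of conv blocks
    "width": [16, 32, 64],   # channels of the first block
    "kernel": [3, 5],        # kernel size of every block
}

def sample_architecture():
    """Draw one candidate configuration uniformly at random."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def build_model(cfg, num_classes=10):
    """Instantiate a small CNN from a sampled configuration."""
    layers, in_ch = [], 3
    for i in range(cfg["depth"]):
        out_ch = cfg["width"] * (2 ** i)
        layers += [
            nn.Conv2d(in_ch, out_ch, cfg["kernel"], padding=cfg["kernel"] // 2),
            nn.BatchNorm2d(out_ch), nn.ReLU(), nn.MaxPool2d(2),
        ]
        in_ch = out_ch
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(in_ch, num_classes)]
    return nn.Sequential(*layers)

def proxy_score(model, loader, device="cpu", max_batches=20):
    """Cheap fitness estimate: accuracy after a very short training run."""
    model.to(device).train()
    opt = torch.optim.SGD(model.parameters(), lr=0.05, momentum=0.9)
    for i, (x, y) in enumerate(loader):
        if i >= max_batches:
            break
        loss = F.cross_entropy(model(x.to(device)), y.to(device))
        opt.zero_grad()
        loss.backward()
        opt.step()
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for i, (x, y) in enumerate(loader):
            if i >= max_batches:
                break
            pred = model(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / max(total, 1)

def random_search(loader, trials=10):
    """Return the best configuration found within a fixed trial budget."""
    best_cfg, best_score = None, -1.0
    for _ in range(trials):
        cfg = sample_architecture()
        score = proxy_score(build_model(cfg), loader)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

Random search with a proxy metric is only a baseline; efficient NAS methods replace it with smarter strategies (weight sharing, performance predictors, evolutionary or gradient-based search) to cut the evaluation cost further.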

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].