
291 Open source projects that are alternatives of or similar to Paddleslim

Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+2.07%)
Model Optimization
A toolkit for optimizing Keras and TensorFlow ML models for deployment, including quantization and pruning.
Stars: ✭ 992 (+46.53%)
Awesome Ml Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (-75.48%)
Kd lib
A Pytorch Knowledge Distillation library for benchmarking and extending works in the domains of Knowledge Distillation, Pruning, and Quantization.
Stars: ✭ 173 (-74.45%)
torch-model-compression
An automated model-structure analysis and modification toolset for PyTorch models, including a library of model compression algorithms that automatically analyze model structure.
Stars: ✭ 126 (-81.39%)
Micronet
micronet, a model compression and deployment library. Compression: 1) quantization: quantization-aware training (QAT), high-bit (>2b) methods (DoReFa; Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference) and low-bit (≤2b)/ternary and binary methods (TWN/BNN/XNOR-Net); post-training quantization (PTQ), 8-bit (TensorRT); 2) pruning: normal, regular, and group convolutional channel pruning; 3) group convolution structure; 4) batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), dynamic shape.
Stars: ✭ 1,232 (+81.98%)
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-93.94%)
Awesome Ai Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (-67.06%)
Nni
An open-source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.
Stars: ✭ 10,698 (+1480.21%)
Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+50.22%)
Mutual labels:  model-compression, pruning
Hawq
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
Stars: ✭ 108 (-84.05%)
Mutual labels:  quantization, model-compression
Tf2
An Open Source Deep Learning Inference Engine Based on FPGA
Stars: ✭ 113 (-83.31%)
Mutual labels:  quantization, model-compression
BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-81.54%)
Mutual labels:  nas, neural-architecture-search
SSD-Pruning-and-quantization
Pruning and quantization for SSD. Model compression.
Stars: ✭ 19 (-97.19%)
Mutual labels:  pruning, quantization
Ntagger
reference pytorch code for named entity tagging
Stars: ✭ 58 (-91.43%)
Mutual labels:  quantization, pruning
Pretrained Language Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Stars: ✭ 2,033 (+200.3%)
Mutual labels:  model-compression, quantization
Nncf
PyTorch*-based Neural Network Compression Framework for enhanced OpenVINO™ inference
Stars: ✭ 218 (-67.8%)
Mutual labels:  quantization, pruning
Model compression
PyTorch Model Compression
Stars: ✭ 150 (-77.84%)
Mutual labels:  quantization, pruning
Soft Filter Pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
Stars: ✭ 291 (-57.02%)
Mutual labels:  model-compression, pruning
Awesome Autodl
A curated list of automated deep learning (including neural architecture search and hyper-parameter optimization) resources.
Stars: ✭ 1,819 (+168.69%)
Mutual labels:  nas, neural-architecture-search
Awesome Edge Machine Learning
A curated list of awesome edge machine learning resources, including research papers, inference engines, challenges, books, meetups and others.
Stars: ✭ 139 (-79.47%)
Mutual labels:  quantization, pruning
Autodl Projects
Automated deep learning algorithms implemented in PyTorch.
Stars: ✭ 1,187 (+75.33%)
Mutual labels:  nas, neural-architecture-search
Dna
Block-wisely Supervised Neural Architecture Search with Knowledge Distillation (CVPR 2020)
Stars: ✭ 147 (-78.29%)
Mutual labels:  nas, neural-architecture-search
Archai
Reproducible Rapid Research for Neural Architecture Search (NAS)
Stars: ✭ 266 (-60.71%)
Mutual labels:  nas, neural-architecture-search
BitPack
BitPack is a practical tool to efficiently save ultra-low precision/mixed-precision quantized models.
Stars: ✭ 36 (-94.68%)
Mutual labels:  quantization, model-compression
sparsezoo
Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (-61%)
Mutual labels:  pruning, quantization
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+455.39%)
Mutual labels:  quantization, pruning
TF-NAS
TF-NAS: Rethinking Three Search Freedoms of Latency-Constrained Differentiable Neural Architecture Search (ECCV2020)
Stars: ✭ 66 (-90.25%)
Mutual labels:  nas, neural-architecture-search
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-91.73%)
Mutual labels:  pruning, quantization
Neural-Architecture-Search
This repo is about NAS
Stars: ✭ 26 (-96.16%)
Mutual labels:  nas, neural-architecture-search
Auto-Compression
Automatic DNN compression tool with various model compression and neural architecture search techniques
Stars: ✭ 19 (-97.19%)
deep-learning-roadmap
my own deep learning mastery roadmap
Stars: ✭ 40 (-94.09%)
Mutual labels:  nas, neural-architecture-search
Filter Pruning Geometric Median
Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration (CVPR 2019 Oral)
Stars: ✭ 338 (-50.07%)
Mutual labels:  model-compression, pruning
Torch Pruning
A PyTorch pruning toolkit for structured neural network pruning that maintains layer dependencies.
Stars: ✭ 193 (-71.49%)
Mutual labels:  model-compression, pruning
CM-NAS
CM-NAS: Cross-Modality Neural Architecture Search for Visible-Infrared Person Re-Identification (ICCV2021)
Stars: ✭ 39 (-94.24%)
Mutual labels:  nas, neural-architecture-search
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
Stars: ✭ 644 (-4.87%)
Mutual labels:  pruning, nas
Regularization-Pruning
[ICLR'21] PyTorch code for our paper "Neural Pruning via Growing Regularization"
Stars: ✭ 44 (-93.5%)
Mutual labels:  pruning, model-compression
sparsify
Easy-to-use UI for automatically sparsifying neural networks and creating sparsification recipes for better inference performance and a smaller footprint
Stars: ✭ 138 (-79.62%)
Mutual labels:  pruning, quantization
Nas Benchmark
"NAS evaluation is frustratingly hard", ICLR2020
Stars: ✭ 126 (-81.39%)
Mutual labels:  nas, neural-architecture-search
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
Stars: ✭ 50 (-92.61%)
Mutual labels:  pruning, model-compression
Awesome Nas Papers
Awesome Neural Architecture Search Papers
Stars: ✭ 213 (-68.54%)
Mutual labels:  nas, neural-architecture-search
neural-compressor
Intel® Neural Compressor (formerly known as Intel® Low Precision Optimization Tool) provides unified APIs for network compression techniques, such as low-precision quantization, sparsity, pruning, and knowledge distillation, across different deep learning frameworks in pursuit of optimal inference performance.
Stars: ✭ 666 (-1.62%)
Mutual labels:  pruning, quantization
torchprune
A research library for pytorch-based neural network pruning, compression, and more.
Stars: ✭ 133 (-80.35%)
Hypernets
A General Automated Machine Learning framework to simplify the development of End-to-end AutoML toolkits in specific domains.
Stars: ✭ 221 (-67.36%)
Mutual labels:  nas, neural-architecture-search
Aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Stars: ✭ 453 (-33.09%)
Mutual labels:  quantization, pruning
ZAQ-code
CVPR 2021 : Zero-shot Adversarial Quantization (ZAQ)
Stars: ✭ 59 (-91.29%)
Mutual labels:  quantization, model-compression
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (-69.87%)
Mutual labels:  pruning, model-compression
ESNAC
Learnable Embedding Space for Efficient Neural Architecture Compression
Stars: ✭ 27 (-96.01%)
nas-encodings
Encodings for neural architecture search
Stars: ✭ 29 (-95.72%)
Mutual labels:  nas, neural-architecture-search
Awesome Emdl
Embedded and mobile deep learning research resources
Stars: ✭ 554 (-18.17%)
Mutual labels:  quantization, pruning
Config
Armbian configuration utility
Stars: ✭ 317 (-53.18%)
Mutual labels:  nas
Deephash
An Open-Source Package for Deep Learning to Hash (DeepHash)
Stars: ✭ 417 (-38.4%)
Mutual labels:  quantization
Real Time Network
real-time network architecture for mobile devices and semantic segmentation
Stars: ✭ 308 (-54.51%)
Pnasnet.pytorch
PyTorch implementation of PNASNet-5 on ImageNet
Stars: ✭ 309 (-54.36%)
Nas Bench 201
NAS-Bench-201 API and Instruction
Stars: ✭ 537 (-20.68%)
Mutual labels:  nas
Autogan
[ICCV 2019] "AutoGAN: Neural Architecture Search for Generative Adversarial Networks" by Xinyu Gong, Shiyu Chang, Yifan Jiang and Zhangyang Wang
Stars: ✭ 388 (-42.69%)
Deephash Papers
Must-read papers on deep learning to hash (DeepHash)
Stars: ✭ 302 (-55.39%)
Mutual labels:  quantization
Ftpgrab
Grab your files periodically from a remote FTP or SFTP server easily
Stars: ✭ 300 (-55.69%)
Mutual labels:  nas
Pngquant
Lossy PNG compressor — pngquant command based on libimagequant library
Stars: ✭ 4,086 (+503.55%)
Mutual labels:  quantization
Amc
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
Stars: ✭ 298 (-55.98%)
Mutual labels:  model-compression
1-60 of 291 similar projects
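Since nearly every project above implements some form of pruning and/or quantization, here is a minimal, library-agnostic sketch of both ideas in plain Python. This is an illustrative example written for this list, not code from any of the projects; real toolkits apply these operations per-layer on tensors with far more sophistication.

```python
# Illustrative sketch of the two core model-compression techniques the
# listed projects implement: magnitude pruning and uniform 8-bit quantization.

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of weights with smallest magnitude."""
    k = int(len(weights) * sparsity)
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else float("-inf")
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize_uint8(weights):
    """Affine (asymmetric) uint8 quantization: w ~ scale * (q - zero_point)."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 or 1.0            # avoid div-by-zero for constant weights
    zero_point = round(-lo / scale)
    q = [max(0, min(255, round(w / scale) + zero_point)) for w in weights]
    dequant = [scale * (v - zero_point) for v in q]
    return q, dequant

weights = [0.9, -0.05, 0.4, -0.7, 0.01, 0.3]
pruned = magnitude_prune(weights, 0.5)   # half the weights become exactly zero
q, approx = quantize_uint8(weights)      # 8-bit codes and their reconstruction
```

The pruned weights compress well because zeros can be stored sparsely, while the quantized codes need only one byte each; the toolkits above combine both, then fine-tune (or calibrate) to recover accuracy.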