
78 open-source projects that are alternatives to, or similar to, batchnorm-pruning

Awesome ML Model Compression
Awesome machine learning model compression research papers, tools, and learning material.
Stars: ✭ 166 (+151.52%)
Mutual labels:  pruning
Adventures in TensorFlow Lite
This repository contains notebooks that show the usage of TensorFlow Lite for quantizing deep neural networks.
Stars: ✭ 79 (+19.7%)
Mutual labels:  pruning
PyTorch-Deep-Compression
A PyTorch implementation of the iterative pruning method described in Han et al. (2015)
Stars: ✭ 39 (-40.91%)
Mutual labels:  pruning
Neuralnetworks.thought Experiments
Observations and notes on understanding the workings of neural network models and other thought experiments, using TensorFlow
Stars: ✭ 199 (+201.52%)
Mutual labels:  pruning
Tf Keras Surgeon
Pruning and other network surgery for trained TF.Keras models.
Stars: ✭ 25 (-62.12%)
Mutual labels:  pruning
deep-compression
Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626
Stars: ✭ 156 (+136.36%)
Mutual labels:  pruning
Awesome Edge Machine Learning
A curated list of awesome edge machine learning resources, including research papers, inference engines, challenges, books, meetups and others.
Stars: ✭ 139 (+110.61%)
Mutual labels:  pruning
fasterai1
FasterAI: A repository for making smaller and faster models with the FastAI library.
Stars: ✭ 34 (-48.48%)
Mutual labels:  pruning
Delve
PyTorch and Keras model training and layer saturation monitor
Stars: ✭ 49 (-25.76%)
Mutual labels:  pruning
AWD-LSTM-LM
LSTM and QRNN Language Model Toolkit for PyTorch
Stars: ✭ 1,834 (+2678.79%)
Mutual labels:  sgd
Awesome AI Infrastructures
Infrastructures™ for Machine Learning Training/Inference in Production.
Stars: ✭ 223 (+237.88%)
Mutual labels:  pruning
AIMET
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Stars: ✭ 453 (+586.36%)
Mutual labels:  pruning
torchprune
A research library for pytorch-based neural network pruning, compression, and more.
Stars: ✭ 133 (+101.52%)
Mutual labels:  pruning
Mobile Yolov5 Pruning Distillation
Pruning and distillation for mobilev2-yolov5s, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!
Stars: ✭ 192 (+190.91%)
Mutual labels:  pruning
FactorizationMachine
Implementation of a factorization machine, with support for classification.
Stars: ✭ 19 (-71.21%)
Mutual labels:  sgd
Lottery Ticket Hypothesis in PyTorch
This repository contains a Pytorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model/dataset.
Stars: ✭ 162 (+145.45%)
Mutual labels:  pruning
vgg16 batchnorm
VGG16 architecture with BatchNorm
Stars: ✭ 14 (-78.79%)
Mutual labels:  batchnorm
Filter Grafting
Filter Grafting for Deep Neural Networks (CVPR 2020)
Stars: ✭ 110 (+66.67%)
Mutual labels:  pruning
SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
Stars: ✭ 165 (+150%)
Mutual labels:  sgd
Grasp
Code for "Picking Winning Tickets Before Training by Preserving Gradient Flow" https://openreview.net/pdf?id=SkgsACVKPH
Stars: ✭ 58 (-12.12%)
Mutual labels:  pruning
AutoOpt
Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient Descent
Stars: ✭ 44 (-33.33%)
Mutual labels:  sgd
Model Optimization
A toolkit for optimizing Keras and TensorFlow ML models for deployment, including quantization and pruning.
Stars: ✭ 992 (+1403.03%)
Mutual labels:  pruning
pytorch-network-slimming
A package to make Network Slimming a little easier
Stars: ✭ 40 (-39.39%)
Mutual labels:  pruning
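
Several entries here, such as pytorch-network-slimming and batchnorm-pruning itself, build on the Network Slimming idea: rank channels by the magnitude of their BatchNorm scale factors (gamma) and prune the smallest ones. Below is a minimal PyTorch sketch of that selection step; it is a generic illustration, not the API of any project listed on this page.

```python
# Generic illustration of BatchNorm-scale (gamma) channel selection in the
# spirit of Network Slimming; not the API of any repository listed here.
import torch
import torch.nn as nn

def bn_channel_masks(model: nn.Module, prune_ratio: float = 0.5):
    """Return {bn_module: keep_mask} keeping channels with the largest |gamma|."""
    bns = [m for m in model.modules() if isinstance(m, nn.BatchNorm2d)]
    gammas = torch.cat([bn.weight.detach().abs().flatten() for bn in bns])
    k = max(1, int(prune_ratio * gammas.numel()))
    threshold = torch.kthvalue(gammas, k).values  # k-th smallest |gamma| overall
    return {bn: bn.weight.detach().abs() > threshold for bn in bns}

# Example: decide which channels of a small conv-BN stack to keep.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.BatchNorm2d(16), nn.ReLU())
for bn, mask in bn_channel_masks(model, prune_ratio=0.5).items():
    print(f"keep {int(mask.sum())}/{mask.numel()} channels")
```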
PaddleSlim
PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+925.76%)
Mutual labels:  pruning
a-tour-of-pytorch-optimizers
A tour of different optimization algorithms in PyTorch.
Stars: ✭ 46 (-30.3%)
Mutual labels:  sgd
SkimCaffe
Caffe for Sparse Convolutional Neural Network
Stars: ✭ 230 (+248.48%)
Mutual labels:  pruning
Distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Stars: ✭ 3,760 (+5596.97%)
Mutual labels:  pruning
sparsezoo
Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (+300%)
Mutual labels:  pruning
NNCF
PyTorch*-based Neural Network Compression Framework for enhanced OpenVINO™ inference
Stars: ✭ 218 (+230.3%)
Mutual labels:  pruning
ATMC
[NeurIPS'2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, “Model Compression with Adversarial Robustness: A Unified Optimization Framework”
Stars: ✭ 41 (-37.88%)
Mutual labels:  pruning
Torch Pruning
A PyTorch pruning toolkit for structured neural network pruning and layer-dependency maintenance.
Stars: ✭ 193 (+192.42%)
Mutual labels:  pruning
GAN-LTH
[ICLR 2021] "GANs Can Play Lottery Too" by Xuxi Chen, Zhenyu Zhang, Yongduo Sui, Tianlong Chen
Stars: ✭ 24 (-63.64%)
Mutual labels:  pruning
KD-Lib
A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization.
Stars: ✭ 173 (+162.12%)
Mutual labels:  pruning
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (-15.15%)
Mutual labels:  pruning
HRank
PyTorch implementation of our CVPR 2020 (Oral) paper "HRank: Filter Pruning using High-Rank Feature Map"
Stars: ✭ 164 (+148.48%)
Mutual labels:  pruning
prunnable-layers-pytorch
Prunable nn layers for pytorch.
Stars: ✭ 47 (-28.79%)
Mutual labels:  pruning
Model compression
PyTorch Model Compression
Stars: ✭ 150 (+127.27%)
Mutual labels:  pruning
FisherPruning
Group Fisher Pruning for Practical Network Compression (ICML 2021)
Stars: ✭ 127 (+92.42%)
Mutual labels:  pruning
Cen
[NeurIPS 2020] Code release for paper "Deep Multimodal Fusion by Channel Exchanging" (In PyTorch)
Stars: ✭ 112 (+69.7%)
Mutual labels:  pruning
LinkOS-Android-Samples
Java based sample code for developing on Android. The demos in this repository are stored on separate branches. To navigate to a demo, please click branches.
Stars: ✭ 52 (-21.21%)
Mutual labels:  sgd
Micronet
micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT) with high-bit (>2b) schemes (DoReFa; "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b)/ternary and binary schemes (TWN/BNN/XNOR-Net), plus 8-bit post-training quantization (PTQ, TensorRT); (2) pruning: normal, regular, and group-convolution channel pruning; (3) group convolution structure; (4) batch-normalization fusion for quantization. Deployment: TensorRT, FP32/FP16/INT8 (PTQ calibration), op adaptation (upsample), dynamic shape.
Stars: ✭ 1,232 (+1766.67%)
Mutual labels:  pruning
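
The "batch-normalization fusion" that micronet mentions is the standard trick of folding a BatchNorm layer's affine transform and running statistics into the preceding convolution before quantization or deployment. Here is a minimal sketch, assuming a Conv2d followed directly by a BatchNorm2d; this is generic code, not micronet's API.

```python
# Fold BatchNorm2d statistics into a preceding Conv2d (inference-time fusion).
# Generic illustration; not the API of the micronet project.
import torch
import torch.nn as nn

@torch.no_grad()
def fuse_conv_bn(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    fused = nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                      stride=conv.stride, padding=conv.padding, bias=True)
    scale = bn.weight / torch.sqrt(bn.running_var + bn.eps)   # gamma / sigma
    fused.weight.copy_(conv.weight * scale.reshape(-1, 1, 1, 1))
    conv_bias = conv.bias if conv.bias is not None else torch.zeros(conv.out_channels)
    fused.bias.copy_((conv_bias - bn.running_mean) * scale + bn.bias)
    return fused

# Sanity check: the fused conv matches conv -> bn in eval mode.
conv, bn = nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8)
bn.eval()
x = torch.randn(1, 3, 16, 16)
print(torch.allclose(fuse_conv_bn(conv, bn)(x), bn(conv(x)), atol=1e-5))
```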
NaiveNASflux.jl
Your local Flux surgeon
Stars: ✭ 20 (-69.7%)
Mutual labels:  pruning
Model Compression And Acceleration Progress
Repository to track the progress in model compression and acceleration
Stars: ✭ 63 (-4.55%)
Mutual labels:  pruning
theedhum-nandrum
A sentiment classifier on mixed language (and mixed script) reviews in Tamil, Malayalam and English
Stars: ✭ 16 (-75.76%)
Mutual labels:  sgd
Ntagger
Reference PyTorch code for named-entity tagging
Stars: ✭ 58 (-12.12%)
Mutual labels:  pruning
neural-compressor
Intel® Neural Compressor (formerly Intel® Low Precision Optimization Tool) aims to provide unified APIs for network compression technologies such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.
Stars: ✭ 666 (+909.09%)
Mutual labels:  pruning
Awesome Pruning
A curated list of neural network pruning resources.
Stars: ✭ 1,017 (+1440.91%)
Mutual labels:  pruning
DS-Net
(CVPR 2021, Oral) Dynamic Slimmable Network
Stars: ✭ 204 (+209.09%)
Mutual labels:  pruning
Ridurre Network Filter Pruning Keras
Keras model convolutional filter pruning package
Stars: ✭ 35 (-46.97%)
Mutual labels:  pruning
Pruning filters for efficient convnets
PyTorch implementation of "Pruning Filters For Efficient ConvNets"
Stars: ✭ 96 (+45.45%)
Mutual labels:  pruning
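
Projects such as the one above follow the L1-norm criterion from "Pruning Filters For Efficient ConvNets": score each convolutional filter by the sum of its absolute weights and drop the lowest-scoring ones. The sketch below is a generic illustration of that ranking step, with hypothetical helper names not taken from the listed repository.

```python
# Generic illustration of L1-norm filter ranking ("Pruning Filters for
# Efficient ConvNets" style); helper names are illustrative only.
import torch
import torch.nn as nn

def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    """Sum of absolute weights per output filter (shape: [out_channels])."""
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def filters_to_prune(conv: nn.Conv2d, n_prune: int) -> list:
    """Indices of the n_prune filters with the smallest L1 norm."""
    scores = l1_filter_scores(conv)
    return torch.argsort(scores)[:n_prune].tolist()

conv = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3)
print(filters_to_prune(conv, n_prune=8))  # indices of the 8 weakest filters
```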
Pytorch Pruning
PyTorch Implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference
Stars: ✭ 740 (+1021.21%)
Mutual labels:  pruning
DiFacto2 ffm
Distributed field-aware factorization machines based on a parameter server
Stars: ✭ 11 (-83.33%)
Mutual labels:  sgd
Awesome EMDL
Embedded and mobile deep learning research resources
Stars: ✭ 554 (+739.39%)
Mutual labels:  pruning
torch-model-compression
An automated model-structure analysis and modification toolset for PyTorch models, including a model compression algorithm library that automatically analyzes model structures.
Stars: ✭ 126 (+90.91%)
Mutual labels:  pruning
TransE
A Python implementation of the TransE method, explaining TransE's vector updates under SGD.
Stars: ✭ 31 (-53.03%)
Mutual labels:  sgd
Generalizing-Lottery-Tickets
This repository contains code to replicate the experiments given in NeurIPS 2019 paper "One ticket to win them all: generalizing lottery ticket initializations across datasets and optimizers"
Stars: ✭ 48 (-27.27%)
Mutual labels:  pruning
jetbrains-utility
Remove/Backup – settings & cli for macOS (OS X) – DataGrip, AppCode, CLion, Gogland, IntelliJ, PhpStorm, PyCharm, Rider, RubyMine, WebStorm
Stars: ✭ 62 (-6.06%)
Mutual labels:  pruning
numpy-neuralnet-exercise
Implementation of key neural network concepts using NumPy
Stars: ✭ 49 (-25.76%)
Mutual labels:  sgd
Dynamic Model Pruning with Feedback
Implementation of "Dynamic Model Pruning with Feedback" in PyTorch
Stars: ✭ 25 (-62.12%)
Mutual labels:  pruning
SelecSLS PyTorch
Reference ImageNet implementation of SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on implicit sparsity emerging from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".
Stars: ✭ 251 (+280.3%)
Mutual labels:  pruning
Showing 1–60 of 78 similar projects