
JasonZHM / CAE-ADMM

License: MIT
CAE-ADMM: Implicit Bitrate Optimization via ADMM-Based Pruning in Compressive Autoencoders

Programming Languages

python, shell

Projects that are alternatives of or similar to CAE-ADMM

autoencoder based image compression
Autoencoder based image compression: can the learning be quantization independent? https://arxiv.org/abs/1802.09371
Stars: ✭ 21 (-38.24%)
Mutual labels:  image-compression, autoencoders
qoix
Elixir implementation of the Quite OK Image format
Stars: ✭ 18 (-47.06%)
Mutual labels:  image-compression
tinify-ruby
Ruby client for the Tinify API.
Stars: ✭ 41 (+20.59%)
Mutual labels:  image-compression
continuous Bernoulli
C programs for simulating, transforming, and computing test statistics of the continuous Bernoulli distribution; also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-35.29%)
Mutual labels:  autoencoders
sqip
SQIP is a tool for SVG-based LQIP image creation, written in Go
Stars: ✭ 46 (+35.29%)
Mutual labels:  image-compression
Deep-Drug-Coder
A tensorflow.keras generative neural network for de novo drug design, first-authored in Nature Machine Intelligence while working at AstraZeneca.
Stars: ✭ 143 (+320.59%)
Mutual labels:  autoencoders
tinify-net
.NET client for the Tinify API.
Stars: ✭ 45 (+32.35%)
Mutual labels:  image-compression
ElasticModels
ElasticModels is an Elasticsearch object modeling tool designed to work in an asynchronous environment. Built for the official Elasticsearch client library; its main inspiration was the mongoose project.
Stars: ✭ 13 (-61.76%)
Mutual labels:  model-architecture
xice7-imageKit
Simple image processing implemented in Java
Stars: ✭ 23 (-32.35%)
Mutual labels:  image-compression
imagezero
Fast Lossless Color Image Compression Library
Stars: ✭ 49 (+44.12%)
Mutual labels:  image-compression
amr
Official adversarial mixup resynthesis repository
Stars: ✭ 31 (-8.82%)
Mutual labels:  autoencoders
nimPNG
PNG (Portable Network Graphics) decoder and encoder written in Nim
Stars: ✭ 81 (+138.24%)
Mutual labels:  image-compression
api
docs.nekos.moe/
Stars: ✭ 31 (-8.82%)
Mutual labels:  image-compression
kesi
Knowledge distillation from Ensembles of Iterative pruning (BMVC 2020)
Stars: ✭ 23 (-32.35%)
Mutual labels:  network-pruning
image-optimizer
A free and open source tool for optimizing images and vector graphics.
Stars: ✭ 740 (+2076.47%)
Mutual labels:  image-compression
Imgbot
An Azure Function solution to crawl through all of your image files in GitHub and losslessly compress them. This will make the file size go down, but leave the dimensions and quality untouched. Once it's done, ImgBot will open a pull request for you to review and merge. [email protected]
Stars: ✭ 1,017 (+2891.18%)
Mutual labels:  image-compression
zImageOptimizer
Simple image optimizer for JPEG, PNG and GIF images on Linux, MacOS and FreeBSD.
Stars: ✭ 108 (+217.65%)
Mutual labels:  image-compression
ImagePicker
Android library to choose image from gallery or camera with option to compress result image
Stars: ✭ 73 (+114.71%)
Mutual labels:  image-compression
docker-imgproxy
🌐 An ultra fast, production-grade on-the-fly image processing web server. Designed for high throughput with Nginx caching. Powered by imgproxy.
Stars: ✭ 45 (+32.35%)
Mutual labels:  image-compression
autoencoders tensorflow
Automatic feature engineering using deep learning and Bayesian inference using TensorFlow.
Stars: ✭ 66 (+94.12%)
Mutual labels:  autoencoders

CAE-ADMM: IMPLICIT BITRATE OPTIMIZATION VIA ADMM-BASED PRUNING IN COMPRESSIVE AUTOENCODERS

Haimeng Zhao, Peiyuan Liao

Abstract

We introduce the ADMM-pruned Compressive AutoEncoder (CAE-ADMM), which uses the Alternating Direction Method of Multipliers (ADMM) to optimize the trade-off between distortion and efficiency in lossy image compression. Specifically, ADMM in our method promotes sparsity to implicitly optimize the bitrate, in contrast to the entropy estimators used in previous research. Experiments on public datasets show that our method outperforms the original CAE and some traditional codecs in terms of SSIM/MS-SSIM, at a reasonable inference speed.
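The ADMM pruning idea in the abstract can be sketched as follows. This is a minimal illustration of the usual l0-constrained ADMM formulation (minimize the loss subject to at most k nonzero weights), not the repository's actual training code; `project_topk`, `admm_prune_step`, and all hyperparameters are assumptions for the sketch:

```python
import numpy as np

def project_topk(v, k):
    """Euclidean projection onto the l0 ball: keep the k largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]   # indices of the k largest magnitudes
    out[idx] = v[idx]
    return out

def admm_prune_step(W, Z, U, grad_loss, rho=1e-2, lr=1e-1, k=2):
    """One ADMM iteration for: minimize loss(W) s.t. ||W||_0 <= k."""
    # W-update: gradient step on loss(W) + (rho/2) * ||W - Z + U||^2
    W = W - lr * (grad_loss(W) + rho * (W - Z + U))
    # Z-update: project the penalized weights onto the sparsity constraint
    Z = project_topk(W + U, k)
    # Dual update: accumulate the constraint violation
    U = U + W - Z
    return W, Z, U
```

Iterating these three updates drives `W` toward a solution whose projection `Z` satisfies the sparsity budget, which is what makes the bitrate optimization implicit: no entropy estimator appears in the objective.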

Paper & Citation

arXiv:1901.07196 [cs.CV]

If you use these models in your research, please cite:

@article{zhao2019cae,
  title={CAE-ADMM: Implicit Bitrate Optimization via ADMM-based Pruning in Compressive Autoencoders},
  author={Zhao, Haimeng and Liao, Peiyuan},
  journal={arXiv preprint arXiv:1901.07196},
  year={2019}
}

Model Architecture


The architecture of CAE-ADMM. "Conv k/sP" stands for a convolutional layer with a k×k kernel, a stride of s, and a reflection padding of P; "Conv Down" reduces the height and width by a factor of 2.
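The caption's notation maps onto standard convolution arithmetic. A quick sketch (the specific k/s/P combinations below are illustrative assumptions, not values taken from the paper):

```python
def conv_out_size(size, k, s, p):
    """Spatial output size of a convolution: floor((size + 2p - k) / s) + 1."""
    return (size + 2 * p - k) // s + 1

# A "Conv Down" layer halving a 128-pixel dimension could use, e.g., k=5, s=2, P=2:
assert conv_out_size(128, k=5, s=2, p=2) == 64
# while k=3, s=1, P=1 preserves the spatial size:
assert conv_out_size(128, k=3, s=1, p=1) == 128
```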

Performance

Comparison of different methods with respect to SSIM and MS-SSIM on the Kodak PhotoCD dataset. Note that Toderici et al. use an RNN structure instead of entropy coding, while CAE-ADMM (ours) replaces entropy coding with a pruning method.
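For reference, SSIM compares the luminance, contrast, and structure statistics of two images. Below is a simplified sketch that uses global image statistics; the actual metric (and MS-SSIM) averages these terms over local sliding windows and, for MS-SSIM, over multiple scales, so this is only an illustration of the formula, not the evaluation code used in the paper:

```python
import numpy as np

def global_ssim(x, y, data_range=1.0):
    """SSIM computed from global statistics of two images in [0, data_range]."""
    C1 = (0.01 * data_range) ** 2  # stabilizes the luminance term
    C2 = (0.03 * data_range) ** 2  # stabilizes the contrast/structure term
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    num = (2 * mx * my + C1) * (2 * cov + C2)
    den = (mx ** 2 + my ** 2 + C1) * (vx + vy + C2)
    return num / den
```

By construction the score is 1 for identical images and decreases as structure is degraded.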

Example

[Latent feature maps at 0.3 bpp: before and after pruning]

Comparison of the latent code before and after pruning for kodim21. For clarity, zero values in the feature map before normalization are marked in black.

Acknowledgement

pytorch-msssim: Implementation of MS-SSIM in PyTorch is from pytorch-msssim

huffmancoding.py: Implementation of Huffman coding is from Deep-Compression-PyTorch
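As a rough illustration of what Huffman coding does here (assigning shorter bitstrings to more frequent latent symbols), the sketch below builds a code table from a symbol sequence. It is a minimal textbook version for illustration, not the vendored huffmancoding.py:

```python
import heapq
from collections import Counter

def huffman_code(symbols):
    """Build a Huffman code table (symbol -> bitstring) from a symbol sequence."""
    freq = Counter(symbols)
    if len(freq) == 1:  # degenerate case: only one distinct symbol
        return {next(iter(freq)): "0"}
    # heap entries: (frequency, tiebreak, {symbol: code-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}  # left branch
        merged.update({s: "1" + c for s, c in c2.items()})  # right branch
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]
```

Because the two rarest subtrees are merged first, frequent symbols end up closer to the root and get shorter codes, and the resulting code is prefix-free.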
