
DuaneNielsen / Deepinfomaxpytorch

Learning deep representations by mutual information estimation and maximization

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to, or similar to, Deepinfomaxpytorch

Kvdo
A pair of kernel modules which provide pools of deduplicated and/or compressed block storage.
Stars: ✭ 168 (-20.75%)
Mutual labels:  compression
Fastlz
Small & portable byte-aligned LZ77 compression
Stars: ✭ 180 (-15.09%)
Mutual labels:  compression
Lq Nets
LQ-Nets: Learned Quantization for Highly Accurate and Compact Deep Neural Networks
Stars: ✭ 195 (-8.02%)
Mutual labels:  compression
Compress
Optimized Go Compression Packages
Stars: ✭ 2,478 (+1068.87%)
Mutual labels:  compression
Tensorflow Tutorials
Source code for practicing TensorFlow step by step, from the basics through to applications
Stars: ✭ 2,096 (+888.68%)
Mutual labels:  autoencoder
Timeseries Clustering Vae
Variational Recurrent Autoencoder for timeseries clustering in pytorch
Stars: ✭ 190 (-10.38%)
Mutual labels:  autoencoder
Zip
Swift framework for zipping and unzipping files.
Stars: ✭ 2,120 (+900%)
Mutual labels:  compression
Meshoptimizer
Mesh optimization library that makes meshes smaller and faster to render
Stars: ✭ 2,930 (+1282.08%)
Mutual labels:  compression
Cidlib
The CIDLib general purpose C++ development environment
Stars: ✭ 179 (-15.57%)
Mutual labels:  compression
Streamvbyte
Fast integer compression in C using the StreamVByte codec
Stars: ✭ 195 (-8.02%)
Mutual labels:  compression
Tensorflow 101
Chinese-language TensorFlow tutorial with Jupyter notebooks
Stars: ✭ 172 (-18.87%)
Mutual labels:  autoencoder
Stegcloak
Hide secrets with invisible characters in plain text securely using passwords 🧙🏻‍♂️⭐
Stars: ✭ 2,379 (+1022.17%)
Mutual labels:  compression
Datacompression
Swift libcompression wrapper as an extension for the Data type (GZIP, ZLIB, LZFSE, LZMA, LZ4, deflate, RFC-1950, RFC-1951, RFC-1952)
Stars: ✭ 191 (-9.91%)
Mutual labels:  compression
Uzlib
Radically unbloated DEFLATE/zlib/gzip compression/decompression library. Can decompress any gzip/zlib data, and offers simplified compressor which produces gzip-compatible output, while requiring much less resources (and providing less compression ratio of course).
Stars: ✭ 168 (-20.75%)
Mutual labels:  compression
Util
A collection of useful utility functions
Stars: ✭ 201 (-5.19%)
Mutual labels:  compression
Deep image prior
Image reconstruction done with untrained neural networks.
Stars: ✭ 168 (-20.75%)
Mutual labels:  autoencoder
Deep white balance
Reference code for the paper: Deep White-Balance Editing, CVPR 2020 (Oral). Our method is a deep learning multi-task framework for white-balance editing.
Stars: ✭ 184 (-13.21%)
Mutual labels:  autoencoder
Turbobench
Compression Benchmark
Stars: ✭ 211 (-0.47%)
Mutual labels:  compression
Pylzma
Python bindings for the LZMA library
Stars: ✭ 202 (-4.72%)
Mutual labels:  compression
Nn compression
Stars: ✭ 193 (-8.96%)
Mutual labels:  compression

Deep InfoMax Pytorch

PyTorch implementation of Deep InfoMax (https://arxiv.org/abs/1808.06670).

Encodes data by maximizing the mutual information between the latent space and, in this case, CIFAR-10 images.

Most of the code was ported from rcalland's Chainer implementation (https://github.com/rcalland/deep-INFOMAX). Thanks buddy!

A PyTorch implementation by the research team is available here.
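
As a rough illustration of the objective, the sketch below shows one way the local mutual-information term can be estimated: the global code is broadcast over every location of the local feature map, matching (positive) and shuffled (negative) pairs are scored by a small discriminator, and a Jensen-Shannon lower bound on mutual information is maximized. The module names, feature sizes, and discriminator architecture here are illustrative assumptions, not the exact code in this repository.

```python
# Minimal sketch of a local DIM objective (illustrative, not this repo's exact modules).
import torch
import torch.nn as nn
import torch.nn.functional as F


class LocalDiscriminator(nn.Module):
    """Scores (global vector, local feature) pairs with 1x1 convolutions."""

    def __init__(self, y_dim=64, m_channels=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(y_dim + m_channels, 512, kernel_size=1),
            nn.ReLU(),
            nn.Conv2d(512, 512, kernel_size=1),
            nn.ReLU(),
            nn.Conv2d(512, 1, kernel_size=1),
        )

    def forward(self, y, m):
        # Broadcast the global code y over every spatial location of the local map M.
        b, _, h, w = m.shape
        y_map = y.unsqueeze(-1).unsqueeze(-1).expand(-1, -1, h, w)
        return self.net(torch.cat([y_map, m], dim=1))


def local_dim_loss(discriminator, y, m):
    """Jensen-Shannon based mutual-information lower bound.

    Positive pairs: (y_i, M_i). Negative pairs: (y_i, M_j), produced by rolling
    M along the batch so each global code is paired with another image's map.
    """
    m_fake = torch.roll(m, shifts=1, dims=0)
    pos = discriminator(y, m)        # scores for matching pairs
    neg = discriminator(y, m_fake)   # scores for mismatched pairs
    e_pos = -F.softplus(-pos).mean() # E_P[-softplus(-T)]
    e_neg = F.softplus(neg).mean()   # E_N[softplus(T)]
    return e_neg - e_pos             # minimise this to maximise the MI bound
```

In the paper this local term is combined with an optional global term and a prior-matching term, each with its own weight; the sketch above only covers the local piece.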

Current Results (work in progress)

| Model | airplane | automobile | bird | cat | deer | dog | frog | horse | ship | truck |
|---|---|---|---|---|---|---|---|---|---|---|
| Fully supervised | 0.7780 | 0.8907 | 0.6233 | 0.5606 | 0.6891 | 0.6420 | 0.7967 | 0.8206 | 0.8619 | 0.8291 |
| DeepInfoMax-Local | 0.6120 | 0.6969 | 0.4020 | 0.4226 | 0.4917 | 0.5806 | 0.6871 | 0.5806 | 0.6855 | 0.5647 |
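
The numbers above are per-class classification accuracies on top of the learned representation. A common way to produce such a table is to freeze the encoder and fit a linear classifier on its output; the sketch below shows that protocol under the assumption that the encoder returns a flat feature vector (the evaluation head actually used in this repository may differ).

```python
# Hedged sketch of a linear probe on a frozen encoder; `encoder` and its
# output dimension are assumptions, not this repo's exact evaluation code.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms


def linear_probe(encoder, feature_dim=64, epochs=10, device="cpu"):
    tfm = transforms.ToTensor()
    train = datasets.CIFAR10("data", train=True, download=True, transform=tfm)
    loader = DataLoader(train, batch_size=128, shuffle=True)

    probe = nn.Linear(feature_dim, 10).to(device)
    opt = torch.optim.Adam(probe.parameters(), lr=1e-3)
    encoder.eval()

    for _ in range(epochs):
        for x, target in loader:
            x, target = x.to(device), target.to(device)
            with torch.no_grad():
                y = encoder(x)  # frozen representation, assumed shape (B, feature_dim)
            loss = nn.functional.cross_entropy(probe(y), target)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return probe
```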

[Figure 1]

Figure 1. Top: a red Lamborghini. Middle: the 10 closest images in the latent space (L2 distance). Bottom: the 10 farthest images in the latent space.
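
The neighbours in Figure 1 can be reproduced by encoding the dataset and ranking images by Euclidean distance to the query's latent code. A minimal sketch, assuming `codes` is an (N, D) tensor of pre-computed latent vectors (the name and shape are illustrative):

```python
# Minimal sketch of the latent-space retrieval behind Figure 1:
# rank images by L2 distance from a query code.
import torch


def nearest_and_farthest(codes, query_index, k=10):
    query = codes[query_index].unsqueeze(0)       # (1, D)
    dists = torch.cdist(query, codes).squeeze(0)  # (N,) pairwise L2 distances
    order = torch.argsort(dists)                  # ascending by distance
    nearest = order[1:k + 1]                      # skip index 0: the query itself
    farthest = order[-k:]
    return nearest, farthest
```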

Some more results:

[additional result figures]
