
cwkx / Gon

License: MIT
Gradient Origin Networks - a new type of generative model that is able to quickly learn a latent representation without an encoder

Programming Languages

Python

Projects that are alternatives to or similar to Gon

Paragraph Vectors
📄 A PyTorch implementation of Paragraph Vectors (doc2vec).
Stars: ✭ 337 (+167.46%)
Mutual labels:  neural-networks, unsupervised-learning
Sealion
The first machine learning framework that encourages learning ML concepts instead of memorizing class functions.
Stars: ✭ 278 (+120.63%)
Mutual labels:  neural-networks, unsupervised-learning
Pyod
A Python Toolbox for Scalable Outlier Detection (Anomaly Detection)
Stars: ✭ 5,083 (+3934.13%)
Mutual labels:  neural-networks, unsupervised-learning
Dgi
Deep Graph Infomax (https://arxiv.org/abs/1809.10341)
Stars: ✭ 326 (+158.73%)
Mutual labels:  neural-networks, unsupervised-learning
Minisom
🔴 MiniSom is a minimalistic implementation of the Self Organizing Maps
Stars: ✭ 801 (+535.71%)
Mutual labels:  neural-networks, unsupervised-learning
Nnpack
Acceleration package for neural networks on multi-core CPUs
Stars: ✭ 1,538 (+1120.63%)
Mutual labels:  neural-networks
Cleanlab
The standard package for machine learning with noisy labels, finding mislabeled data, and uncertainty quantification. Works with most datasets and models.
Stars: ✭ 2,526 (+1904.76%)
Mutual labels:  unsupervised-learning
Growing Neural Cellular Automata
A reproduction of growing neural cellular automata using PyTorch.
Stars: ✭ 116 (-7.94%)
Mutual labels:  neural-networks
Learn Machine Learning
Learn to Build a Machine Learning Application from Top Articles
Stars: ✭ 116 (-7.94%)
Mutual labels:  neural-networks
Pytorch Model Zoo
A collection of deep learning models implemented in PyTorch
Stars: ✭ 125 (-0.79%)
Mutual labels:  neural-networks
Hyperdensenet
This repository contains the code of HyperDenseNet, a hyper-densely connected CNN to segment medical images in multi-modal image scenarios.
Stars: ✭ 124 (-1.59%)
Mutual labels:  neural-networks
Nlp Pretrained Model
A collection of Natural language processing pre-trained models.
Stars: ✭ 122 (-3.17%)
Mutual labels:  neural-networks
Amla
AutoML frAmework for Neural Networks
Stars: ✭ 119 (-5.56%)
Mutual labels:  neural-networks
Clicr
Machine reading comprehension on clinical case reports
Stars: ✭ 123 (-2.38%)
Mutual labels:  neural-networks
Deephyper
DeepHyper: Scalable Asynchronous Neural Architecture and Hyperparameter Search for Deep Neural Networks
Stars: ✭ 117 (-7.14%)
Mutual labels:  neural-networks
Keras2cpp
A small library for running trained Keras 2 models from native C++ code.
Stars: ✭ 124 (-1.59%)
Mutual labels:  neural-networks
Openann
An open source library for artificial neural networks.
Stars: ✭ 117 (-7.14%)
Mutual labels:  neural-networks
Sfmlearner
An unsupervised learning framework for depth and ego-motion estimation from monocular videos
Stars: ✭ 1,661 (+1218.25%)
Mutual labels:  unsupervised-learning
3dpose gan
The authors' implementation of Unsupervised Adversarial Learning of 3D Human Pose from 2D Joint Locations
Stars: ✭ 124 (-1.59%)
Mutual labels:  unsupervised-learning
Neuroner
Named-entity recognition using neural networks. Easy-to-use and state-of-the-art results.
Stars: ✭ 1,579 (+1153.17%)
Mutual labels:  neural-networks

Gradient Origin Networks

This paper has been accepted at ICLR 2021.

This paper proposes a new type of generative model that is able to quickly learn a latent representation without an encoder. This is achieved using empirical Bayes to calculate the expectation of the posterior, which is implemented by initialising a latent vector with zeros, then using the gradient of the log-likelihood of the data with respect to this zero vector as new latent points. The approach has similar characteristics to autoencoders, but with a simpler architecture, and is demonstrated in a variational autoencoder equivalent that permits sampling. This also allows implicit representation networks to learn a space of implicit functions without requiring a hypernetwork, retaining their representation advantages across datasets. The experiments show that the proposed method converges faster, with significantly lower reconstruction error than autoencoders, while requiring half the parameters.
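The core step described above can be sketched in a few lines. The toy below uses a linear decoder with an analytic gradient so it stays self-contained; the actual model in GON.py uses a deep network with autograd, so every name here is illustrative, not the repository's API:

```python
import numpy as np

# Toy linear "decoder": f(z) = W @ z + b. This stands in for a neural
# decoder purely for illustration; GON.py trains a deep network instead.
rng = np.random.default_rng(0)
latent_dim, data_dim = 4, 16
W = rng.standard_normal((data_dim, latent_dim))
b = np.zeros(data_dim)
x = rng.standard_normal(data_dim)   # one data point

# Step 1: initialise the latent vector at the origin.
z0 = np.zeros(latent_dim)

# Step 2: squared-error reconstruction loss (a negative log-likelihood
# up to scale) and its gradient with respect to z0. For the linear case
# the gradient is analytic: d/dz ||Wz + b - x||^2 = 2 W^T (Wz + b - x).
recon = W @ z0 + b
grad = 2.0 * W.T @ (recon - x)

# Step 3: the negative gradient at the origin becomes the latent code.
z = -grad

# Step 4: decode again from this latent; the decoder is trained on the
# reconstruction loss of this second pass (no encoder is ever needed).
recon2 = W @ z + b
```

In the full model the same two-pass pattern applies, with the analytic gradient replaced by automatic differentiation through the decoder.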

GON

Variational GON

Implicit GON

The code is available in GON.py and licensed under the MIT license. For more information, see the Project Page and the paper. The implicit GON version uses a SIREN (Implicit Neural Representations with Periodic Activation Functions, Sitzmann et al., 2020).
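A SIREN layer is simply a sine-activated affine map over input coordinates. The sketch below is a minimal NumPy illustration (the real implementation uses PyTorch; the layer function and initialisation range shown here follow Sitzmann et al. but are otherwise hypothetical names):

```python
import numpy as np

def siren_layer(x, W, b, omega_0=30.0):
    """One SIREN layer: sin(omega_0 * (x @ W + b)).

    omega_0 = 30 is the frequency factor suggested by Sitzmann et al.
    (2020); W and b are ordinary affine parameters.
    """
    return np.sin(omega_0 * (x @ W + b))

# Implicit networks map pixel coordinates in [-1, 1]^2 to features,
# e.g. for a 4x4 coordinate grid:
side = 4
axis = np.linspace(-1.0, 1.0, side)
coords = np.stack(np.meshgrid(axis, axis), axis=-1).reshape(-1, 2)  # (16, 2)

rng = np.random.default_rng(0)
# First-layer weights drawn from U(-1/fan_in, 1/fan_in), fan_in = 2 here.
W = rng.uniform(-0.5, 0.5, size=(2, 8))
b = np.zeros(8)
features = siren_layer(coords, W, b)  # shape (16, 8), values in [-1, 1]
```

In the implicit GON, stacking such layers lets a single network represent a whole space of implicit functions, conditioned on the gradient-derived latent, without a hypernetwork.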

YouTube Preview

Citation

If you find this useful, please cite:

@inproceedings{bond2020gradient,
   title     = {Gradient Origin Networks},
   author    = {Sam Bond-Taylor and Chris G. Willcocks},
   booktitle = {International Conference on Learning Representations},
   year      = {2021},
   url       = {https://openreview.net/pdf?id=0O_cQfw6uEh}
}