
vikasverma1077 / Manifold_mixup

Code for reproducing Manifold Mixup results (ICML 2019)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Manifold_mixup

Reptile Pytorch
A PyTorch implementation of OpenAI's REPTILE algorithm
Stars: ✭ 129 (-58.92%)
Mutual labels:  deep-neural-networks, supervised-learning
Free Ai Resources
FREE AI Resources - Courses, Jobs, Blogs, AI Research, and many more - for everyone!
Stars: ✭ 192 (-38.85%)
Mutual labels:  deep-neural-networks, supervised-learning
L2c
Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (-16.56%)
Mutual labels:  deep-neural-networks, supervised-learning
Deep Diamond
A fast Clojure Tensor & Deep Learning library
Stars: ✭ 288 (-8.28%)
Mutual labels:  deep-neural-networks
Alphagozero Python Tensorflow
Congratulations to DeepMind! This is a re-engineering implementation (building on many other git repos in /support/) of DeepMind's Oct 19th publication: [Mastering the Game of Go without Human Knowledge]. The supervised learning approach is more practical for individuals. (This repository is for educational purposes only.)
Stars: ✭ 292 (-7.01%)
Mutual labels:  supervised-learning
Awesome Deep Vision Web Demo
A curated list of awesome deep vision web demos
Stars: ✭ 298 (-5.1%)
Mutual labels:  deep-neural-networks
Dancinggaga
AI dance machine (尬舞机)
Stars: ✭ 315 (+0.32%)
Mutual labels:  deep-neural-networks
Bigdata18
Transfer learning for time series classification
Stars: ✭ 284 (-9.55%)
Mutual labels:  deep-neural-networks
Tensorflow Image Detection
A generic image detection program that uses Google's machine learning library TensorFlow and a pre-trained deep convolutional neural network model called Inception.
Stars: ✭ 306 (-2.55%)
Mutual labels:  deep-neural-networks
Deep Learning Uncertainty
Literature survey, paper reviews, experimental setups, and a collection of implementations of baseline methods for predictive uncertainty estimation in deep learning models.
Stars: ✭ 296 (-5.73%)
Mutual labels:  deep-neural-networks
Cascaded Fcn
Source code for the MICCAI 2016 paper "Automatic Liver and Lesion Segmentation in CT Using Cascaded Fully Convolutional Neural Networks and 3D Conditional Random Fields"
Stars: ✭ 296 (-5.73%)
Mutual labels:  deep-neural-networks
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-6.37%)
Mutual labels:  deep-neural-networks
Yolo V2 Pytorch
YOLO for object detection tasks
Stars: ✭ 302 (-3.82%)
Mutual labels:  deep-neural-networks
Sednn
Deep learning based speech enhancement using Keras or PyTorch, made easy to use
Stars: ✭ 288 (-8.28%)
Mutual labels:  deep-neural-networks
Pytorch Vdsr
VDSR (CVPR 2016) PyTorch implementation
Stars: ✭ 313 (-0.32%)
Mutual labels:  deep-neural-networks
100 Days Of Ml Code
Chinese version of 100-Days-Of-ML-Code
Stars: ✭ 16,797 (+5249.36%)
Mutual labels:  supervised-learning
Rgn
Recurrent Geometric Networks for end-to-end differentiable learning of protein structure
Stars: ✭ 302 (-3.82%)
Mutual labels:  deep-neural-networks
Machine Learning Algorithms From Scratch
Implementing machine learning algorithms from scratch.
Stars: ✭ 297 (-5.41%)
Mutual labels:  supervised-learning
Model Compression Papers
Papers for deep neural network compression and acceleration
Stars: ✭ 296 (-5.73%)
Mutual labels:  deep-neural-networks
Agentnet
Deep Reinforcement Learning library for humans
Stars: ✭ 298 (-5.1%)
Mutual labels:  deep-neural-networks

Manifold_mixup (ICML 2019)

This repo contains PyTorch code for the ICML 2019 paper Manifold Mixup: Better Representations by Interpolating Hidden States (arXiv: https://arxiv.org/abs/1806.05236, ICML version: http://proceedings.mlr.press/v97/verma19a.html).

The goal of our proposed algorithm, Manifold Mixup, is to learn robust features by interpolating the hidden states of training examples. The representations learned by our method are more discriminative and more compact; please refer to Figures 1 and 2 of our paper for details.
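For intuition, here is a minimal PyTorch sketch of the idea: pick a random depth in the network, interpolate the hidden states of two examples with a Beta-distributed coefficient lambda, and train against the correspondingly interpolated labels. The toy network, the layer sizes, and the ManifoldMixupNet class name below are illustrative placeholders, not the code used for the experiments in this repo; see the subfolders for the actual training scripts and architectures.

import numpy as np
import torch
import torch.nn as nn

class ManifoldMixupNet(nn.Module):
    # Illustrative toy network: two hidden blocks and a linear classifier head.
    def __init__(self, in_dim=784, hidden=256, num_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU()),
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU()),
        ])
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x, y=None, mixup_alpha=2.0):
        if y is None:  # plain inference path: no mixing
            h = x
            for block in self.blocks:
                h = block(h)
            return self.head(h)

        # Pick a random depth k to mix at (k = 0 mixes the raw input, i.e. Input Mixup).
        k = np.random.randint(0, len(self.blocks) + 1)
        lam = float(np.random.beta(mixup_alpha, mixup_alpha))
        # Pair each example in the batch with a randomly shuffled partner.
        index = torch.randperm(x.size(0), device=x.device)

        h = x
        for i, block in enumerate(self.blocks):
            if i == k:
                h = lam * h + (1.0 - lam) * h[index]  # interpolate hidden states
            h = block(h)
        if k == len(self.blocks):
            h = lam * h + (1.0 - lam) * h[index]
        return self.head(h), y, y[index], lam

During training the targets are interpolated with the same lambda, e.g. loss = lam * cross_entropy(out, y_a) + (1 - lam) * cross_entropy(out, y_b), where (out, y_a, y_b, lam) is the tuple returned above.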

The repo consists of two subfolders, one for the supervised learning experiments and one for the GAN experiments. Each subfolder is self-contained (it can be used independently of the other) and has its own instructions on how to run the experiments in its README.md file.

If you find this work useful and use it in your own research, please consider citing our paper:

@InProceedings{pmlr-v97-verma19a,
  title     = {Manifold Mixup: Better Representations by Interpolating Hidden States},
  author    = {Verma, Vikas and Lamb, Alex and Beckham, Christopher and Najafi, Amir and Mitliagkas, Ioannis and Lopez-Paz, David and Bengio, Yoshua},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6438--6447},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  address   = {Long Beach, California, USA},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/verma19a/verma19a.pdf},
  url       = {http://proceedings.mlr.press/v97/verma19a.html},
}


Note: Please refer to our new repo for interpolation-based semi-supervised learning (ICT): https://github.com/vikasverma1077/ICT

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].