gabrielhuang / Reptile Pytorch

License: BSD-2-Clause
A PyTorch implementation of OpenAI's Reptile algorithm


Reptile

PyTorch implementation of OpenAI's Reptile algorithm for supervised learning.

Currently, it runs on Omniglot but not yet on MiniImagenet.

The code has not been tested extensively. Contributions and feedback are more than welcome!
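For intuition, the core of the Reptile meta-update is only a few lines: run a few SGD steps on one sampled base-task, then move the shared initialization toward the task-adapted weights. The sketch below is a minimal pure-Python illustration on a toy scalar task family — not this repo's PyTorch code, and the helper names are invented for the example.

```python
import random

def inner_sgd(w, target, lr=0.02, steps=5):
    # Inner loop: a few SGD steps on one task.
    # Toy task loss: 0.5 * (w - target)**2, so the gradient is (w - target).
    for _ in range(steps):
        w -= lr * (w - target)
    return w

def reptile(w, targets, meta_lr=0.5, meta_iterations=2000, seed=0):
    rng = random.Random(seed)
    for _ in range(meta_iterations):
        target = rng.choice(targets)      # sample one base-task
        w_adapted = inner_sgd(w, target)  # adapt to it with inner SGD
        # Reptile meta-update: step the initialization toward the
        # task-adapted weights.
        w += meta_lr * (w_adapted - w)
    return w
```

With a symmetric family of tasks (e.g. targets at +1 and -1), the learned initialization drifts toward the task mean, which is exactly the behavior Reptile exploits for fast adaptation.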

Omniglot meta-learning dataset

There is already an Omniglot dataset class in torchvision; however, it seems better suited to standard supervised learning than to few-shot learning.

The `omniglot.py` module provides a way to sample K-shot N-way base-tasks from Omniglot, along with utilities for splitting meta-training sets as well as base-tasks.
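As a sketch of what K-shot N-way sampling involves: pick N character classes, then draw K support examples and a few query examples per class, without overlap. The helper below is hypothetical and does not match `omniglot.py`'s actual API; `class_to_images` is an assumed mapping from character class to image paths.

```python
import random

def sample_task(class_to_images, n_way=5, k_shot=1, k_query=1, rng=None):
    """Sample one N-way K-shot base-task: (support set, query set).

    Each set is a list of (image, label) pairs, with labels 0..n_way-1
    assigned per-task.  Hypothetical helper, for illustration only.
    """
    rng = rng or random.Random()
    classes = rng.sample(sorted(class_to_images), n_way)  # pick N classes
    support, query = [], []
    for label, cls in enumerate(classes):
        # Draw k_shot + k_query distinct images, so support and query
        # never share an example.
        images = rng.sample(class_to_images[cls], k_shot + k_query)
        support += [(img, label) for img in images[:k_shot]]
        query += [(img, label) for img in images[k_shot:]]
    return support, query
```

The per-task relabeling (0..N-1) is the key difference from a standard supervised dataset: the model must classify among the sampled classes, not among all Omniglot characters.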

Features

  • [x] Monitor training with TensorboardX.
  • [x] Interrupt and resume training.
  • [x] Train and evaluate on Omniglot.
  • [ ] Meta-batch size > 1.
  • [ ] Train and evaluate on Mini-Imagenet.
  • [ ] Clarify Transductive vs. Non-transductive setting.
  • [ ] Add training curves in README.
  • [ ] Reproduce all settings from OpenAI's code.
  • [ ] Shell script to download datasets.

How to train on Omniglot

Download the two parts of the Omniglot dataset, `images_background.zip` and `images_evaluation.zip`, from the official Omniglot repository.

Create an omniglot/ folder in the repo, then unzip both archives and merge their contents so that you have the following folder structure:

./train_omniglot.py
...
./omniglot/Alphabet_of_the_Magi/
./omniglot/Angelic/
./omniglot/Anglo-Saxon_Futhorc/
...
./omniglot/ULOG/

Now start training with

python train_omniglot.py log $HYPERPARAMETERS  # with CPU
python train_omniglot.py log --cuda 0 $HYPERPARAMETERS  # with CUDA

where $HYPERPARAMETERS depends on your task and hyperparameters.

Behavior:

  • If no checkpoints are found in log/, this will create a log/ folder to store TensorBoard logs and checkpoints.
  • If checkpoints are found in log/, this will resume from the last checkpoint.

Training can be interrupted at any time with ^C, and resumed from the last checkpoint by re-running the same command.
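The resume logic amounts to: look for a checkpoint in the log directory, load it if present, otherwise start fresh, and always write checkpoints atomically so an interrupt cannot corrupt them. A simplified sketch of that pattern, using JSON for illustration (the repo itself serializes model and optimizer state with torch.save/torch.load):

```python
import json
import os

def load_or_init(logdir, init_state):
    # Resume from the last checkpoint if one exists, else start fresh.
    os.makedirs(logdir, exist_ok=True)  # also holds the TensorBoard logs
    path = os.path.join(logdir, "checkpoint.json")
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return init_state

def save_checkpoint(logdir, state):
    # Write to a temp file then rename, so ^C mid-save cannot leave a
    # half-written checkpoint behind.
    path = os.path.join(logdir, "checkpoint.json")
    tmp = path + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)
```

Re-running the same training command then picks up from whatever `load_or_init` returns, which is why interrupt-and-resume needs no extra flags.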

Omniglot Hyperparameters

The following hyperparameters work decently. They are taken from the OpenAI implementation and adapted slightly for a meta-batch size of 1.

For 5-way 5-shot (red curve):

python train_omniglot.py log/o55 --classes 5 --shots 5 --train-shots 10 --meta-iterations 100000 --iterations 5 --test-iterations 50 --batch 10 --meta-lr 0.2 --lr 0.001

For 5-way 1-shot (blue curve):

python train_omniglot.py log/o51 --classes 5 --shots 1 --train-shots 12 --meta-iterations 200000 --iterations 12 --test-iterations 86 --batch 10 --meta-lr 0.33 --lr 0.00044
