haeusser / Learning_by_association

License: apache-2.0
This repository contains code for the paper Learning by Association - A versatile semi-supervised training method for neural networks (CVPR 2017) and the follow-up work Associative Domain Adaptation (ICCV 2017).

Projects that are alternatives of or similar to Learning by association

Phonetic Similarity Vectors
Source code to accompany my paper "Poetic sound similarity vectors using phonetic features"
Stars: ✭ 148 (-1.33%)
Mutual labels:  jupyter-notebook
Slayerpytorch
PyTorch implementation of SLAYER for training Spiking Neural Networks
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Autonomousdrivingcookbook
Scenarios, tutorials and demos for Autonomous Driving
Stars: ✭ 1,939 (+1192.67%)
Mutual labels:  jupyter-notebook
Pyomogallery
A collection of Pyomo examples
Stars: ✭ 149 (-0.67%)
Mutual labels:  jupyter-notebook
Time Series Forecasting Of Amazon Stock Prices Using Neural Networks Lstm And Gan
Project analyzes Amazon Stock data using Python. Feature Extraction is performed and ARIMA and Fourier series models are made. LSTM is used with multiple features to predict stock prices and then sentimental analysis is performed using news and reddit sentiments. GANs are used to predict stock data too where Amazon data is taken from an API as Generator and CNNs are used as discriminator.
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Nlp adversarial examples
Implementation code for the paper "Generating Natural Language Adversarial Examples"
Stars: ✭ 149 (-0.67%)
Mutual labels:  jupyter-notebook
Pytorch Tutorials Kr
🇰🇷 A repository for the Korean translation of the official PyTorch tutorials.
Stars: ✭ 148 (-1.33%)
Mutual labels:  jupyter-notebook
Ml Workspace
🛠 All-in-one web-based IDE specialized for machine learning and data science.
Stars: ✭ 2,337 (+1458%)
Mutual labels:  jupyter-notebook
Person Reid Tiny Baseline
Open source person re-identification in Pytorch
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Computer vision
C/C++/Python based computer vision models using OpenPose, OpenCV, DLIB, Keras and Tensorflow libraries. Object Detection, Tracking, Face Recognition, Gesture, Emotion and Posture Recognition
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Www old.julialang.org
Julia Project web site (Old)
Stars: ✭ 149 (-0.67%)
Mutual labels:  jupyter-notebook
Transformers Ru
A list of pretrained Transformer models for the Russian language.
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Data Science Portfolio
A Portfolio of my Data Science Projects
Stars: ✭ 149 (-0.67%)
Mutual labels:  jupyter-notebook
Carnd Mercedes Sf Utilities
Tools for Sensor Fusion processing.
Stars: ✭ 149 (-0.67%)
Mutual labels:  jupyter-notebook
Face Depixelizer
Face Depixelizer based on "PULSE: Self-Supervised Photo Upsampling via Latent Space Exploration of Generative Models" repository.
Stars: ✭ 1,912 (+1174.67%)
Mutual labels:  jupyter-notebook
Deeplab v2
DeepLab v2 with a VGG16 backbone, trained on multiple datasets including VOC2012, Pascal-context, and NYU-v2.
Stars: ✭ 149 (-0.67%)
Mutual labels:  jupyter-notebook
Machinelearning Watermelonbook
Notes on Zhou Zhihua's "Machine Learning" (the watermelon book).
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Deeplearning keras2
Modification of fast.ai deep learning course notebooks for usage with Keras 2 and Python 3.
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook
Feature Selector
Feature selector is a tool for dimensionality reduction of machine learning datasets
Stars: ✭ 1,913 (+1175.33%)
Mutual labels:  jupyter-notebook
Python For Financial Analysis And Algorithmic Trading
https://www.udemy.com/python-for-finance-and-trading-algorithms/
Stars: ✭ 150 (+0%)
Mutual labels:  jupyter-notebook

The code is implemented in TensorFlow; please refer to the TensorFlow documentation for setup instructions and further information.

The core functions are implemented in semisup/backend.py. The files train.py and eval.py demonstrate how to use them. A quick example is contained in mnist_train_eval.py.
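To give a feel for what the backend computes, here is a minimal NumPy sketch of the association (walker + visit) losses described in the paper. This is an illustrative re-derivation under the paper's formulation, not the repository's TensorFlow implementation in semisup/backend.py; function and variable names are ours.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def association_losses(emb_sup, labels_sup, emb_unsup, visit_weight=1.0):
    """emb_sup: (n_sup, d) labeled embeddings; emb_unsup: (n_unsup, d)."""
    m = emb_sup @ emb_unsup.T              # pairwise similarities
    p_ab = softmax(m, axis=1)              # labeled -> unlabeled transition
    p_ba = softmax(m.T, axis=1)            # unlabeled -> labeled transition
    p_aba = p_ab @ p_ba                    # round-trip probabilities
    # Walker loss: a round trip should end at a sample of the same class.
    same = (labels_sup[:, None] == labels_sup[None, :]).astype(float)
    targets = same / same.sum(axis=1, keepdims=True)
    walker = -np.mean(np.sum(targets * np.log(p_aba + 1e-8), axis=1))
    # Visit loss: all unlabeled samples should be visited equally often.
    visit = p_ab.mean(axis=0)
    uniform = np.full_like(visit, 1.0 / visit.size)
    visit_loss = -np.sum(uniform * np.log(visit + 1e-8))
    return walker, visit_weight * visit_loss
```

In the repository these terms are added to the supervised classification loss, with the visit term scaled by the visit_weight hyperparameter listed below.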

To reproduce the results from the paper, please use the architectures and pipelines from tools/{stl10,svhn,synth}.py. They are loaded automatically by setting the [target_]dataset flag in {train,eval}.py accordingly.

Before you get started, please make sure to add the following to your ~/.bashrc:

export PYTHONPATH=/path/to/learning_by_association:$PYTHONPATH

Copy the file semisup/tools/data_dirs.py.template to semisup/tools/data_dirs.py, adapt the paths inside it, and add the copy to your .gitignore.

Domain Adaptation Hyperparameters

Synth. Signs -> GTSRB

"target_dataset": "gtsrb",
"walker_weight_envelope_delay": "0",
"max_checkpoints": 5,
"dataset": "synth_signs",
"visit_weight": "0.1",
"sup_per_batch": 24,
"walker_weight_envelope_steps": 1,
"eval_batch_size": 24,
"walker_weight_envelope": "linear",
"unsup_batch_size": 1032,
"visit_weight_envelope": "linear",
"decay_steps": 9000,
"sup_per_class": -1,
"max_steps": 12000,
"architecture": "svhn_model"

MNIST -> MNIST-M

"target_dataset": "mnistm",
"walker_weight_envelope_delay": "500",
"max_checkpoints": 5,
"new_size": 32,
"dataset": "mnist3",
"visit_weight": "0.6",
"augmentation": true,
"walker_weight_envelope_steps": 1,
"walker_weight_envelope": "linear",
"unsup_batch_size": 1000,
"visit_weight_envelope": "linear",
"decay_steps": 9000,
"architecture": "svhn_model",
"sup_per_class": -1,
"sup_per_batch": 100,
"max_steps": "12000",

SVHN -> MNIST

"target_dataset": "mnist3",
"walker_weight_envelope_delay": "500",
"max_checkpoints": 5,
"new_size": 32,
"dataset": "svhn",
"sup_per_batch": 100,
"decay_steps": 9000,
"unsup_batch_size": 1000,
"sup_per_class": -1,
"walker_weight_envelope_steps": 1,
"walker_weight_envelope": "linear",
"visit_weight_envelope": "linear",
"architecture": "svhn_model",
"visit_weight": 0.2,
"max_steps": "12000"

Synth. Digits -> SVHN

"target_dataset": "svhn",
"walker_weight_envelope_delay": "2000",
"max_checkpoints": 5,
"dataset": "synth",
"sup_per_class": -1,
"sup_per_batch": 100,
"walker_weight_envelope_steps": 1,
"walker_weight_envelope": "linear",
"decay_steps": 9000,
"unsup_batch_size": 1000,
"visit_weight_envelope": "linear",
"architecture": "svhn_model",
"visit_weight": 0.2,
"max_steps": "20000",

If you use the code, please cite the paper "Learning by Association - A versatile semi-supervised training method for neural networks" or "Associative Domain Adaptation":

@string{cvpr="IEEE Conference on Computer Vision and Pattern Recognition (CVPR)"}
@InProceedings{haeusser-cvpr-17,
  author = 	 "P. Haeusser and A. Mordvintsev and D. Cremers",
  title = 	 "Learning by Association - A versatile semi-supervised training method for neural networks",
  booktitle = cvpr,
  year = 	 "2017",
}

@string{iccv="IEEE International Conference on Computer Vision (ICCV)"}
@InProceedings{haeusser-iccv-17,
  author = 	 "P. Haeusser and T. Frerix and A. Mordvintsev and D. Cremers",
  title = 	 "Associative Domain Adaptation",
  booktitle = iccv,
  year = 	 "2017",
}

For questions please contact Philip Haeusser ([email protected]).
