microsoft / Snca.pytorch

License: MIT
Improving Generalization via Scalable Neighborhood Component Analysis


This repo contains the PyTorch implementation for the ECCV 2018 paper (paper). We use deep networks to learn feature representations optimized for nearest neighbor classifiers, which generalize better to new object categories. This project is a re-investigation of Neighborhood Component Analysis (NCA) with recent techniques to make it scalable to deep networks and large-scale datasets.
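As an illustrative sketch (not the repo's exact implementation), the NCA objective can be written as a temperature-scaled softmax over similarities between batch embeddings and a memory bank of stored embeddings: each sample should place high probability on stored neighbors sharing its label. All names below are hypothetical.

```python
import torch
import torch.nn.functional as F

def scalable_nca_loss(features, memory_bank, labels, bank_labels, temperature=0.05):
    """Illustrative NCA objective.

    features:    (B, D) L2-normalized embeddings of the current batch
    memory_bank: (N, D) L2-normalized embeddings of all training images
    labels:      (B,)   class labels of the batch samples
    bank_labels: (N,)   class labels of the memory-bank entries
    """
    # Similarity of each batch sample to every stored embedding,
    # sharpened by the temperature.
    sims = features @ memory_bank.t() / temperature          # (B, N)
    # Probability of selecting each stored embedding as a neighbor.
    probs = F.softmax(sims, dim=1)                           # (B, N)
    # Probability mass assigned to same-class neighbors.
    same_class = (labels.unsqueeze(1) == bank_labels.unsqueeze(0)).float()
    p_correct = (probs * same_class).sum(dim=1).clamp_min(1e-12)
    return -p_correct.log().mean()
```

Minimizing this loss pulls each embedding toward stored embeddings of its own class, which is what makes the learned features well suited to nearest neighbor classification.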

Much of the code is extended from our previous unsupervised learning project. Please refer to that repo for more details.

Pretrained Models

Currently, we provide three pretrained ResNet models. Each release contains the feature representations of all ImageNet training images (~600 MB) and the model weights (100-200 MB). The models and their performance with nearest neighbor classifiers are as follows.

Code to reproduce the rest of the experiments is coming soon.

Nearest Neighbors

Please follow this link for a list of nearest neighbors on ImageNet. Results are visualized from our ResNet50 features and compared with baseline ResNet50 features, raw image features, and previous unsupervised features. The first column is the query image, followed by 20 retrievals ranked by similarity.
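The retrieval described above is plain nearest-neighbor search in feature space. A minimal sketch, assuming L2-normalized features and cosine similarity (function and variable names are illustrative):

```python
import torch

def top_k_neighbors(query, gallery, k=20):
    """Rank gallery features by cosine similarity to one query feature.

    query:   (D,)   feature of the query image
    gallery: (N, D) features of the images to retrieve from
    Returns indices of the k most similar gallery images, best first.
    """
    q = torch.nn.functional.normalize(query, dim=0)
    g = torch.nn.functional.normalize(gallery, dim=1)
    sims = g @ q                      # (N,) cosine similarities
    return sims.topk(k).indices
```

With k=20 this produces exactly the kind of ranked retrieval row shown in the visualizations: the query, then its 20 closest images.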

Usage

Our code extends the official PyTorch ImageNet classification example. Please refer to that repo for details of data preparation and hardware configuration.

  • Install Python 2 and PyTorch >= 0.4

  • Clone this repo: git clone https://github.com/Microsoft/snca.pytorch

  • Training on ImageNet:

    python main.py DATAPATH --arch resnet18 -j 32 --temperature 0.05 --low-dim 128 -b 256

    • During training, we monitor the supervised validation accuracy with a k-nearest-neighbor classifier using k=1, since it is faster and gives a good estimate of feature quality.
  • Testing on ImageNet:

    python main.py DATAPATH --arch resnet18 --resume input_model.pth.tar -e

    This runs testing with the default K=30 neighbors.

  • Memory Consumption and Computation Issues

    Memory consumption is more of an issue than computation time. Currently, the NCA module is not parallelized across multiple GPUs, so the first GPU consumes much more memory than the others. For example, when training a ResNet18 network, GPU 0 consumes 11 GB of memory while each of the others takes 2.5 GB. You will need to use Caffe-style gradient accumulation, "-b 128 --iter-size 2", to train deeper networks. Our released models were trained on V100 machines.

  • Training on CIFAR10:

    python cifar.py --temperature 0.05 --lr 0.1
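The k=1 validation monitoring mentioned in the training step above amounts to labeling each validation sample with the class of its single most similar training feature. A sketch under that assumption (names are illustrative, not the repo's API):

```python
import torch

def knn_accuracy_k1(val_feats, val_labels, train_feats, train_labels):
    """Classify each validation sample by its single nearest training
    neighbor under cosine similarity, and report accuracy."""
    v = torch.nn.functional.normalize(val_feats, dim=1)
    t = torch.nn.functional.normalize(train_feats, dim=1)
    nearest = (v @ t.t()).argmax(dim=1)   # index of the nearest neighbor
    preds = train_labels[nearest]
    return (preds == val_labels).float().mean().item()
```

Using k=1 avoids the vote-weighting needed for larger k, which is why it is a cheap proxy during training; the final evaluation uses K=30.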
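The "-b 128 --iter-size 2" workaround in the memory note above is gradient accumulation: one optimizer step is built from several smaller forward/backward passes, so the effective batch size stays at 256 while peak memory drops. A generic sketch (not the repo's training loop; names are hypothetical):

```python
import torch

def accumulated_step(model, criterion, optimizer, batches, iter_size=2):
    """One effective optimization step built from `iter_size` smaller
    batches, mimicking Caffe-style '-b 128 --iter-size 2' behavior."""
    optimizer.zero_grad()
    total_loss = 0.0
    for inputs, targets in batches:       # len(batches) == iter_size
        # Scale so the summed gradients match one large-batch step.
        loss = criterion(model(inputs), targets) / iter_size
        loss.backward()                   # gradients accumulate in .grad
        total_loss += loss.item()
    optimizer.step()                      # single update for all batches
    return total_loss
```

Because `.backward()` adds into `.grad` until `zero_grad()` is called, the two half-size batches produce the same update (up to batch-norm statistics) as one full batch.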

Citation

@inproceedings{wu2018improving,
  title={Improving Generalization via Scalable Neighborhood Component Analysis},
  author={Wu, Zhirong and Efros, Alexei A and Yu, Stella},
  booktitle={European Conference on Computer Vision (ECCV)},
  year={2018}
}

Contact

For any questions, please feel free to reach out to

Zhirong Wu: [email protected]

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.
