fmu2 / Gradfeat20

License: MIT
Gradients as Features for Deep Representation Learning

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to Gradfeat20

awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+2583.33%)
Mutual labels:  transfer-learning, representation-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, tutorials. - 迁移学习 (transfer learning)
Stars: ✭ 8,481 (+28170%)
Mutual labels:  transfer-learning, representation-learning
awesome-contrastive-self-supervised-learning
A comprehensive list of awesome contrastive self-supervised learning papers.
Stars: ✭ 748 (+2393.33%)
Mutual labels:  transfer-learning, representation-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+170%)
Mutual labels:  transfer-learning, representation-learning
Modelsgenesis
Official Keras & PyTorch Implementation and Pre-trained Models for Models Genesis - MICCAI 2019
Stars: ✭ 416 (+1286.67%)
Mutual labels:  transfer-learning, representation-learning
Awesome Federated Learning
Federated Learning Library: https://fedml.ai
Stars: ✭ 624 (+1980%)
Mutual labels:  transfer-learning
Skin Cancer Image Classification
Skin cancer classification using Inceptionv3
Stars: ✭ 16 (-46.67%)
Mutual labels:  transfer-learning
Unsupervised Classification
SCAN: Learning to Classify Images without Labels (ECCV 2020), incl. SimCLR.
Stars: ✭ 605 (+1916.67%)
Mutual labels:  representation-learning
Easytransfer
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
Stars: ✭ 563 (+1776.67%)
Mutual labels:  transfer-learning
Graphvite
GraphVite: A General and High-performance Graph Embedding System
Stars: ✭ 865 (+2783.33%)
Mutual labels:  representation-learning
Bert language understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Stars: ✭ 933 (+3010%)
Mutual labels:  transfer-learning
Cutmix Pytorch
Official Pytorch implementation of CutMix regularizer
Stars: ✭ 754 (+2413.33%)
Mutual labels:  transfer-learning
Tensorflow 101
TensorFlow 101: Introduction to Deep Learning for Python Within TensorFlow
Stars: ✭ 642 (+2040%)
Mutual labels:  transfer-learning
Bagofconcepts
Python implementation of bag-of-concepts
Stars: ✭ 18 (-40%)
Mutual labels:  representation-learning
Deepdrive
Deepdrive is a simulator that allows anyone with a PC to push the state-of-the-art in self-driving
Stars: ✭ 628 (+1993.33%)
Mutual labels:  transfer-learning
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+1790%)
Mutual labels:  transfer-learning
Simclr
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations
Stars: ✭ 750 (+2400%)
Mutual labels:  representation-learning
Deepfakes video classification
Deepfakes Video classification via CNN, LSTM, C3D and triplets
Stars: ✭ 24 (-20%)
Mutual labels:  transfer-learning
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+2360%)
Mutual labels:  transfer-learning
Naacl transfer learning tutorial
Repository of code for the tutorial on Transfer Learning in NLP held at NAACL 2019 in Minneapolis, MN, USA
Stars: ✭ 687 (+2190%)
Mutual labels:  transfer-learning

Gradients as Features for Deep Representation Learning

This code repository is under construction.

Overview

This repository contains code for reproducing the results in Gradients as Features for Deep Representation Learning, published as a conference paper at ICLR 2020. The code has been tested in a conda environment with Python 3 and PyTorch >= 1.3.

Quick Start

Download the base networks here. We currently support a BiGAN/ALI encoder pre-trained on CIFAR-10/-100 or SVHN as the base network. In the download link, "ali" stands for ALI trained with the Jensen-Shannon divergence, and "wali" stands for ALI trained with the Wasserstein distance. See (and please star :)) our repository on Wasserstein BiGAN.

  • File names with a trailing zero correspond to randomly initialized networks (e.g., fnet0.pt, std_hnet0.pt, etc.).
  • File names with a trailing one correspond to networks pre-trained with generative modeling (e.g., fnet1.pt, std_hnet1.pt, etc.).
  • File names with the prefix "std" correspond to networks under standard parametrization.
  • File names with the prefix "ntk" correspond to networks under NTK parametrization.
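A minimal loading sketch, assuming the .pt files are standard torch.save checkpoints (whether each stores a state_dict or a full module depends on the saving code):

import torch

# Naming convention from the list above: trailing 0 = random init,
# trailing 1 = generatively pre-trained; "std"/"ntk" prefix = parametrization.
ckpt = torch.load('std_hnet1.pt', map_location='cpu')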

Update the loading and saving paths in the configuration files before you try out the sample commands.

  • Activation baseline (i.e., the standard multi-class logistic regressor)
python ./src/benchmark.py -c ./configs/cifar10/ali/actv.config
  • Full model (i.e., the proposed linear model)
python ./src/benchmark.py -c ./configs/cifar10/ali/linear_conv3.config
  • Gradient baseline (i.e., the gradient term alone in the proposed model)
python ./src/benchmark.py -c ./configs/cifar10/ali/grad_conv3.config
  • Network fine-tuning
python ./src/benchmark.py -c ./configs/cifar10/ali/finetune_conv3.config
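To make the feature construction behind these variants concrete, below is a minimal, hypothetical PyTorch sketch of the idea, not the actual implementation in this repo: the stand-in networks are invented, and for simplicity the gradient is taken of the summed head output, whereas the paper builds its linear model from a first-order Taylor expansion of the network around the pre-trained weights (e.g., w.r.t. the conv3 parameters).

import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical stand-ins for the base network and its linear head.
base = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64), nn.ReLU())
head = nn.Linear(64, 10)

x = torch.randn(4, 3, 32, 32)

# Activation features (what the activation baseline uses).
actv = base(x).detach()

# Gradient features: per-sample gradient of the (summed) head output
# w.r.t. the head parameters (roughly what the gradient baseline uses).
grad_feats = []
for xi in x:
    out = head(base(xi.unsqueeze(0))).sum()
    grads = torch.autograd.grad(out, list(head.parameters()))
    grad_feats.append(torch.cat([g.flatten() for g in grads]))
grad_feat = torch.stack(grad_feats)

# The full model fits a linear classifier on both feature sets combined.
features = torch.cat([actv, grad_feat], dim=1)
print(features.shape)  # torch.Size([4, 714]) = 64 + 10*64 + 10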

Please note that we use a slightly different set of hyperparameters for training than those originally used in the paper. In particular, we apply stochastic gradient descent (SGD) instead of Adam as the default optimizer, to respect the common convention, and we modify the learning rate schedule accordingly, as we found this leads to faster convergence.
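For illustration, this setup corresponds to something like the following in PyTorch (all numeric values below are assumptions; the actual settings live in the .config files):

import torch
import torch.nn as nn

model = nn.Linear(128, 10)  # stand-in for the model being trained
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[60, 80], gamma=0.1)

for epoch in range(90):
    x, y = torch.randn(32, 128), torch.randint(0, 10, (32,))
    loss = nn.functional.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # decay the learning rate at the milestones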

Contact

Fangzhou Mu ([email protected])

BibTeX

@inproceedings{mu2020gradfeat,
  title={Gradients as Features for Deep Representation Learning},
  author={Mu, Fangzhou and Liang, Yingyu and Li, Yin},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2020}
}