
akshitac8 / Generative_MLZSL

License: GPL-3.0
[TPAMI Under Submission] Generative Multi-Label Zero-Shot Learning

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to Generative MLZSL

tfvaegan
[ECCV 2020] Official Pytorch implementation for "Latent Embedding Feedback and Discriminative Features for Zero-Shot Classification". SOTA results for ZSL and GZSL
Stars: ✭ 107 (+189.19%)
Mutual labels:  pytorch-implementation, gzsl, zsl, clswgan
Walk-Transformer
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (In Pytorch and Tensorflow)
Stars: ✭ 26 (-29.73%)
Mutual labels:  self-attention, pytorch-implementation
pytorch-gans
PyTorch implementation of GANs (Generative Adversarial Networks). DCGAN, Pix2Pix, CycleGAN, SRGAN
Stars: ✭ 21 (-43.24%)
Mutual labels:  generative-adversarial-network, pytorch-implementation
Deep-Learning-Pytorch
A repo containing code covering various aspects of deep learning in PyTorch. Great for beginners and intermediate learners in the field
Stars: ✭ 59 (+59.46%)
Mutual labels:  generative-adversarial-network, pytorch-implementation
gan-vae-pretrained-pytorch
Pretrained GANs + VAEs + classifiers for MNIST/CIFAR in pytorch.
Stars: ✭ 134 (+262.16%)
Mutual labels:  generative-adversarial-network, pytorch-implementation
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+8489.19%)
Mutual labels:  generative-adversarial-network, pytorch-implementation
subjectiveqe-esrgan
PyTorch implementation of ESRGAN (ECCVW 2018) for compressed image subjective quality enhancement.
Stars: ✭ 12 (-67.57%)
Mutual labels:  generative-adversarial-network, pytorch-implementation
deep-blueberry
If you've always wanted to learn about deep-learning but don't know where to start, then you might have stumbled upon the right place!
Stars: ✭ 17 (-54.05%)
Mutual labels:  generative-adversarial-network, pytorch-implementation
MobileHumanPose
This repo is the official PyTorch implementation of MobileHumanPose: Toward real-time 3D human pose estimation in mobile devices (CVPRW 2021).
Stars: ✭ 206 (+456.76%)
Mutual labels:  pytorch-implementation
mtss-gan
MTSS-GAN: Multivariate Time Series Simulation with Generative Adversarial Networks (by @firmai)
Stars: ✭ 77 (+108.11%)
Mutual labels:  generative-adversarial-network
GalaXC
GalaXC: Graph Neural Networks with Labelwise Attention for Extreme Classification
Stars: ✭ 28 (-24.32%)
Mutual labels:  multi-label-classification
RandLA-Net-pytorch
🍀 Pytorch Implementation of RandLA-Net (https://arxiv.org/abs/1911.11236)
Stars: ✭ 69 (+86.49%)
Mutual labels:  pytorch-implementation
CycleGAN-gluon-mxnet
This repo attempts to reproduce Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks (CycleGAN) with a Gluon reimplementation
Stars: ✭ 31 (-16.22%)
Mutual labels:  generative-adversarial-network
Text-Classification-LSTMs-PyTorch
The aim of this repository is to show a baseline model for text classification by implementing an LSTM-based model in PyTorch. To provide a better understanding of the model, a Tweets dataset provided by Kaggle is used.
Stars: ✭ 45 (+21.62%)
Mutual labels:  pytorch-implementation
multi-label-text-classification
Multi-label text classification using ConvNet and graph embedding (TensorFlow implementation)
Stars: ✭ 44 (+18.92%)
Mutual labels:  multi-label-classification
onn
Online Deep Learning: Learning Deep Neural Networks on the Fly / Non-linear Contextual Bandit Algorithm (ONN_THS)
Stars: ✭ 139 (+275.68%)
Mutual labels:  pytorch-implementation
MCAR
Learning to Discover Multi-Class Attentional Regions for Multi-Label Image Recognition
Stars: ✭ 32 (-13.51%)
Mutual labels:  multi-label-classification
svae cf
[ WSDM '19 ] Sequential Variational Autoencoders for Collaborative Filtering
Stars: ✭ 38 (+2.7%)
Mutual labels:  pytorch-implementation
precision-recall-distributions
Assessing Generative Models via Precision and Recall (official repository)
Stars: ✭ 80 (+116.22%)
Mutual labels:  generative-adversarial-network
cosine-ood-detector
Hyperparameter-Free Out-of-Distribution Detection Using Softmax of Scaled Cosine Similarity
Stars: ✭ 30 (-18.92%)
Mutual labels:  pytorch-implementation


Generative Multi-Label Zero-Shot Learning

Akshita Gupta*, Sanath Narayan*, Salman Khan, Fahad Shahbaz Khan, Ling Shao, Joost van de Weijer

(* denotes equal contribution)

Webpage: https://akshitac8.github.io/GAN_MLZSL/

Overview

This repository contains the implementation of Generative Multi-Label Zero-Shot Learning.

In this work, we tackle the problem of synthesizing multi-label features in the zero-shot setting for recognizing all seen and unseen labels, using a novel training mechanism.

Installation

The codebase is built on PyTorch 1.1.0 and tested in an Ubuntu 16.04 environment (Python 3.6, CUDA 9.0, cuDNN 7.5).

For installation, follow these instructions:

conda create -n mlzsl python=3.6
conda activate mlzsl
conda install pytorch=1.1 torchvision=0.3 cudatoolkit=9.0 -c pytorch
pip install matplotlib scikit-image scikit-learn opencv-python yacs joblib natsort tqdm pandas h5py==2.10.0
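
A quick environment check (a minimal sketch, not part of the official instructions) can confirm that the expected PyTorch/torchvision versions and the CUDA toolkit are visible inside the mlzsl environment:

# check_env.py -- optional sanity check for the mlzsl environment (not from the repo)
import torch
import torchvision

print("PyTorch:", torch.__version__)            # expected: 1.1.0
print("torchvision:", torchvision.__version__)  # expected: 0.3.x
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))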

Data Preparation

Training using the NUS-WIDE dataset:

Download the NUS-WIDE features, tags and other required training files from the drive link shared below.

link: https://drive.google.com/drive/folders/1tCo-xawWrnGQGaWYJEKQOQ31ts__rAse?usp=sharing

Extract them into the ./datasets folder.

Training using a custom dataset:

Download the custom dataset into the same data folder. Please make sure to convert your custom dataset into the same format as NUS-WIDE.

python preprocess.py --image_dir data/custom_data/ --output_dir data/custom_data_jsons/ --train_json custom_data_train  --test_json custom_data_test

The above preprocessing step creates train and test JSONs that satisfy the ZSL and GZSL requirements.
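
To sanity-check the generated files, the snippet below (a hedged sketch; the exact schema of the JSONs is repo-specific, and the file path is assumed from the --output_dir and --train_json flags above) simply loads one JSON and reports its top-level structure:

# check_json.py -- minimal sketch; file path and schema are assumptions, adjust as needed
import json

with open("data/custom_data_jsons/custom_data_train.json") as f:
    data = json.load(f)

if isinstance(data, dict):
    print("top-level keys:", list(data.keys())[:10])
elif isinstance(data, list):
    print("number of entries:", len(data))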

The train and test JSONs are used as input when running the feature extraction code.

python extract_4096_features.py --train_json custom_data_train  --test_json custom_data_test --gpu

The above feature extraction step saves features in .h5 format, which are used for training our CLF model.
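
To verify the extracted features, the .h5 file can be opened with h5py (installed above); the file name and dataset layout below are assumptions, so adjust them to whatever extract_4096_features.py writes in your setup:

# check_h5.py -- minimal sketch; the .h5 file name and dataset names are assumptions
import h5py

with h5py.File("custom_data_train_features.h5", "r") as f:
    # print every dataset with its shape (features are expected to be 4096-dimensional)
    def show(name, obj):
        if isinstance(obj, h5py.Dataset):
            print(name, obj.shape, obj.dtype)
    f.visititems(show)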

Training and Evaluation

NUS-WIDE

To train and evaluate the zero-shot learning model on the full NUS-WIDE dataset, run:

sh ./scripts/train_nus_wide.sh

Model Checkpoint

We also include a checkpoint of the zero-shot generative model on NUS-WIDE for fast evaluation in the weights folder. Please download the pretrained weights according to the instructions within the folder. To reproduce results, run:

sh ./scripts/eval_nus_wide.sh

Citation

If this code is helpful for your research, we would appreciate it if you cite the work:

@article{gupta2021generative,
  title={Generative Multi-Label Zero-Shot Learning},
  author={Gupta, Akshita and Narayan, Sanath and Khan, Salman and Khan, Fahad Shahbaz and Shao, Ling and van de Weijer, Joost},
  journal={arXiv preprint arXiv:2101.11606},
  year={2021}
}

Acknowledgments

I thank Dat Huynh for discussions and feedback regarding the evaluation protocol and sharing details for the baseline zero-shot methods. I thank Aditya Arora for suggestions on the figure aesthetics.
