
kanchen-usc / Grounder

License: MIT
Implementation of "Grounding of Textual Phrases in Images by Reconstruction" in TensorFlow

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Grounder

Pytorch Attention Guided Cyclegan
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Stars: ✭ 67 (-19.28%)
Mutual labels:  attention-mechanism
Karateclub
Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs (CIKM 2020)
Stars: ✭ 1,190 (+1333.73%)
Mutual labels:  unsupervised-learning
Mug
Learning Video Object Segmentation from Unlabeled Videos (CVPR2020)
Stars: ✭ 81 (-2.41%)
Mutual labels:  unsupervised-learning
Concrete Autoencoders
Stars: ✭ 68 (-18.07%)
Mutual labels:  unsupervised-learning
Sarcasm Detection
Detecting Sarcasm on Twitter using both traditional machine learning and deep learning techniques.
Stars: ✭ 73 (-12.05%)
Mutual labels:  attention-mechanism
Hierarchical Attention Networks
TensorFlow implementation of the paper "Hierarchical Attention Networks for Document Classification"
Stars: ✭ 75 (-9.64%)
Mutual labels:  attention-mechanism
Deepattention
Deep Visual Attention Prediction (TIP18)
Stars: ✭ 65 (-21.69%)
Mutual labels:  attention-mechanism
Openselfsup
Self-Supervised Learning Toolbox and Benchmark
Stars: ✭ 1,239 (+1392.77%)
Mutual labels:  unsupervised-learning
Fake news detection deep learning
Fake News Detection using Deep Learning models in Tensorflow
Stars: ✭ 74 (-10.84%)
Mutual labels:  attention-mechanism
Image similarity
PyTorch Blog Post On Image Similarity Search
Stars: ✭ 80 (-3.61%)
Mutual labels:  unsupervised-learning
Group Level Emotion Recognition
Model submitted for the ICMI 2018 EmotiW Group-Level Emotion Recognition Challenge
Stars: ✭ 70 (-15.66%)
Mutual labels:  attention-mechanism
Se3 Transformer Pytorch
Implementation of SE3-Transformers for Equivariant Self-Attention, in Pytorch. This specific repository is geared towards integration with eventual Alphafold2 replication.
Stars: ✭ 73 (-12.05%)
Mutual labels:  attention-mechanism
Deepaffinity
Protein-compound affinity prediction through unified RNN-CNN
Stars: ✭ 75 (-9.64%)
Mutual labels:  attention-mechanism
Insta Dm
Learning Monocular Depth in Dynamic Scenes via Instance-Aware Projection Consistency (AAAI 2021)
Stars: ✭ 67 (-19.28%)
Mutual labels:  unsupervised-learning
Simplednn
SimpleDNN is a machine learning lightweight open-source library written in Kotlin designed to support relevant neural network architectures in natural language processing tasks
Stars: ✭ 81 (-2.41%)
Mutual labels:  attention-mechanism
Sine
A PyTorch Implementation of "SINE: Scalable Incomplete Network Embedding" (ICDM 2018).
Stars: ✭ 67 (-19.28%)
Mutual labels:  unsupervised-learning
Attention Based Aspect Extraction
Code for unsupervised aspect extraction, using Keras and its Backends
Stars: ✭ 75 (-9.64%)
Mutual labels:  unsupervised-learning
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-1.2%)
Mutual labels:  attention-mechanism
Sturcture Inpainting
Source code of AAAI 2020 paper 'Learning to Incorporate Structure Knowledge for Image Inpainting'
Stars: ✭ 78 (-6.02%)
Mutual labels:  attention-mechanism
Marta Gan
MARTA GANs: Unsupervised Representation Learning for Remote Sensing Image Classification
Stars: ✭ 75 (-9.64%)
Mutual labels:  unsupervised-learning

GroundeR

This repository contains a TensorFlow implementation of Grounding of Textual Phrases in Images by Reconstruction (ECCV 2016).

Setup

Note: Please read the feature representation files in the feature and annotation directories before using the code.

Platform: TensorFlow 1.0.1 (Python 2.7)
Visual features: We use a Faster R-CNN pre-trained on PASCAL VOC 2012 for Flickr30K Entities, and one pre-trained on ImageNet for Referit Game. Please put the visual features in the feature directory (more details are given in the README.md in that directory). Fine-tuned features, which are available in this repository, achieve better performance.
Sentence features: We encode a one-hot vector for each query, as well as the annotation for each query-image pair (an illustrative encoding sketch follows this list). Please put the encoded features in the annotation directory (more details are given in the README.md in that directory).
File list: We generate a file list for each image in Flickr30K Entities. If you would like to train and test on another dataset (e.g. Referit Game), please follow the same format as flickr_train_val.lst and flickr_test.lst.
Hyperparameters: Please check the Config class in train_supervise.py and train_unsupervise.py.
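
The exact on-disk formats are described in the README.md files under the feature and annotation directories. Purely as an illustration of the one-hot query encoding mentioned above (the vocabulary and maximum query length below are made up, not taken from this repository), a minimal sketch could look like:

import numpy as np

# Hypothetical vocabulary (word -> index); the real one is built from the dataset.
vocab = {'a': 0, 'man': 1, 'in': 2, 'blue': 3, 'shirt': 4}

def encode_query(query, vocab, max_len=19):
    """Return a [max_len, vocab_size] one-hot matrix for a query, zero-padded."""
    one_hot = np.zeros((max_len, len(vocab)), dtype=np.float32)
    for t, word in enumerate(query.lower().split()[:max_len]):
        if word in vocab:
            one_hot[t, vocab[word]] = 1.0
    return one_hot

print(encode_query('a man in blue shirt', vocab).shape)  # (19, 5)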

Training & Test

We implement both the supervised and unsupervised scenarios of the GroundeR model.

Supervised Model

For training, please enter the root folder of GroundeR, then type

$ python train_supervise.py -m [Model Name] -g [GPU ID]

For testing, please enter the root folder of GroundeR, then type

$ python evaluate_supervise.py -m [Model Name] -g [GPU ID] --restore_id [Restore epoch ID]

Make sure the model name used for evaluation is the same as the one used in training, and that the epoch ID exists.
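
As a concrete example (the model name grounder_sup, GPU 0, and epoch 10 below are placeholders, not values required by the code), a typical train-then-evaluate session could be:

$ python train_supervise.py -m grounder_sup -g 0
$ python evaluate_supervise.py -m grounder_sup -g 0 --restore_id 10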

Unsupervised Model

The implementation of the unsupervised GroundeR model differs slightly from the paper: in Equation 5, the original GroundeR uses a softmax function to compute the attention weights, whereas we use a ReLU to generate these weights. We observed a performance drop when using the softmax. To try the original GroundeR formulation, please uncomment line 96 and comment line 97 in model_unsupervise.py.
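
As a rough sketch of the difference (this is not the repository's code; the tensor name and proposal count are made up, TensorFlow 1.x API), the two variants differ only in the nonlinearity applied to the raw attention scores over region proposals:

import tensorflow as tf

# Hypothetical raw attention scores over proposals: [batch_size, num_proposals].
scores = tf.placeholder(tf.float32, [None, 100])

# Original GroundeR (Equation 5): softmax normalizes the weights across proposals.
attn_softmax = tf.nn.softmax(scores)

# This implementation: ReLU produces non-negative, unnormalized weights instead.
attn_relu = tf.nn.relu(scores)
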
For training, please enter the root folder of GroundeR, then type

$ python train_unsupervise.py -m [Model Name] -g [GPU ID]

For testing, please enter the root folder of GroundeR, then type

$ python evaluate_unsupervise.py -m [Model Name] -g [GPU ID] --restore_id [Restore epoch ID]

Make sure the model name used for evaluation is the same as the one used in training, and that the epoch ID exists.
