
YicongHong / Entity-Graph-VLN

Licence: other
Code of the NeurIPS 2020 paper: Language and Visual Entity Relationship Graph for Agent Navigation

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives to or similar to Entity-Graph-VLN

DiGCL
The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS-2021
Stars: ✭ 27 (-20.59%)
Mutual labels:  graph-neural-networks, neurips-2021
Social-Recommendation
Summary of social recommendation papers and codes
Stars: ✭ 143 (+320.59%)
Mutual labels:  graph-neural-networks
SelfGNN
A PyTorch implementation of "SelfGNN: Self-supervised Graph Neural Networks without explicit negative sampling" paper, which appeared in The International Workshop on Self-Supervised Learning for the Web (SSL'21) @ the Web Conference 2021 (WWW'21).
Stars: ✭ 24 (-29.41%)
Mutual labels:  graph-neural-networks
DCGCN
Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning (authors' MXNet implementation for the TACL19 paper)
Stars: ✭ 73 (+114.71%)
Mutual labels:  graph-neural-networks
kaggle-champs
Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (+44.12%)
Mutual labels:  graph-neural-networks
AC-VRNN
PyTorch code for CVIU paper "AC-VRNN: Attentive Conditional-VRNN for Multi-Future Trajectory Prediction"
Stars: ✭ 21 (-38.24%)
Mutual labels:  graph-neural-networks
GNNLens2
Visualization tool for Graph Neural Networks
Stars: ✭ 155 (+355.88%)
Mutual labels:  graph-neural-networks
ASAP
AAAI 2020 - ASAP: Adaptive Structure Aware Pooling for Learning Hierarchical Graph Representations
Stars: ✭ 83 (+144.12%)
Mutual labels:  graph-neural-networks
EgoCNN
Code for "Distributed, Egocentric Representations of Graphs for Detecting Critical Structures" (ICML 2019)
Stars: ✭ 16 (-52.94%)
Mutual labels:  graph-neural-networks
SIAN
Code and data for ECML-PKDD paper "Social Influence Attentive Neural Network for Friend-Enhanced Recommendation"
Stars: ✭ 25 (-26.47%)
Mutual labels:  graph-neural-networks
cisip-FIRe
Fast Image Retrieval (FIRe) is an open source project to promote image retrieval research. It implements most of the major binary hashing methods to date, together with different popular backbone networks and public datasets.
Stars: ✭ 40 (+17.65%)
Mutual labels:  neurips-2021
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+1364.71%)
Mutual labels:  graph-neural-networks
visual-compatibility
Context-Aware Visual Compatibility Prediction (https://arxiv.org/abs/1902.03646)
Stars: ✭ 92 (+170.59%)
Mutual labels:  graph-neural-networks
GalaXC
GalaXC: Graph Neural Networks with Labelwise Attention for Extreme Classification
Stars: ✭ 28 (-17.65%)
Mutual labels:  graph-neural-networks
robust-gcn
Implementation of the paper "Certifiable Robustness and Robust Training for Graph Convolutional Networks".
Stars: ✭ 35 (+2.94%)
Mutual labels:  graph-neural-networks
ntds 2019
Material for the EPFL master course "A Network Tour of Data Science", edition 2019.
Stars: ✭ 62 (+82.35%)
Mutual labels:  graph-neural-networks
Pro-GNN
Implementation of the KDD 2020 paper "Graph Structure Learning for Robust Graph Neural Networks"
Stars: ✭ 202 (+494.12%)
Mutual labels:  graph-neural-networks
SimP-GCN
Implementation of the WSDM 2021 paper "Node Similarity Preserving Graph Convolutional Networks"
Stars: ✭ 43 (+26.47%)
Mutual labels:  graph-neural-networks
Fine-Grained-R2R
Code and data of the Fine-Grained R2R Dataset proposed in the EMNLP 2021 paper Sub-Instruction Aware Vision-and-Language Navigation
Stars: ✭ 34 (+0%)
Mutual labels:  vision-and-language-navigation
SubGNN
Subgraph Neural Networks (NeurIPS 2020)
Stars: ✭ 136 (+300%)
Mutual labels:  graph-neural-networks

Entity-Graph-VLN

Code of the NeurIPS 2020 paper: Language and Visual Entity Relationship Graph for Agent Navigation
Yicong Hong, Cristian Rodriguez-Opazo, Yuankai Qi, Qi Wu, Stephen Gould

[Paper] [Supplemental] [GitHub]

"Halliday hated making rules. Why is that line sticking in my head? Maybe it's because Art3mis said it, and she's hot. Maybe it's because she called me out. Sitting here in my tiny corner of nowhere, protecting my small slice of nothing." --- Ready Player One 2018.

Prerequisites

Installation

Install the Matterport3D Simulator.

Please find the versions of the packages in our environment here. In particular, we use the versions below (a setup sketch follows the list):

  • Python 3.6.9
  • NumPy 1.18.1
  • OpenCV 3.4.2
  • PyTorch 1.3.0
  • Torchvision 0.4.1
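
If you are building a fresh environment, a minimal pip-based setup matching the versions above might look like the sketch below. This is an assumption for convenience (it presumes python3.6 is on your PATH and that prebuilt wheels exist for your platform); the environment file linked above is the authoritative reference.

# create and activate a virtual environment (assumes python3.6 is on PATH)
python3.6 -m venv entity-graph-vln
source entity-graph-vln/bin/activate
# install the package versions listed above; the older torch/torchvision wheels
# may need to be fetched from the official PyTorch archive on some platforms
pip install numpy==1.18.1 'opencv-python==3.4.2.*' torch==1.3.0 torchvision==0.4.1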

Data Preparation

Please follow the instructions below to prepare the data in the required directories:

Trained Network Weights

R2R Navigation

Please read Peter Anderson's VLN paper for the R2R Navigation task.

Our code is based on the code structure of EnvDrop.

Reproduce Testing Results

To replicate the performance reported in our paper, load the trained network weights and run validation:

bash run/agent.bash
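
A hypothetical sketch of the relevant lines in run/agent.bash is shown below. The --train validlistener mode, the --load flag, and the checkpoint path follow EnvDrop conventions and are assumptions; match them to the weights you actually downloaded and to the flags defined in this repository's script.

# sketch: run validation with the downloaded weights (flag names and path are placeholders)
flag="--train validlistener
      --load snap/<downloaded_weights>/state_dict/best_val_unseen"
python r2r_src/train.py $flag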

Training

Navigator

To train the network from scratch, first train a Navigator on the R2R training split:

Modify run/agent.bash: remove the --load argument and set --train listener. Then run:

bash run/agent.bash
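
For reference, a hypothetical sketch of the modified run/agent.bash is shown below. The script layout, the train.py path, and the --name flag are assumptions based on the EnvDrop-style code this repository builds on; keep all other arguments from the original script unchanged.

# sketch of run/agent.bash for training a Navigator from scratch;
# note that there is no --load argument when training from scratch
name=agent_from_scratch        # placeholder experiment name
flag="--train listener
      --name $name"
python r2r_src/train.py $flag  # train.py path assumed from the EnvDrop layout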

The trained Navigator will be saved under snap/.

Speaker

You also need to train a Speaker for augmented training:

bash run/speak.bash

The trained Speaker will be saved under snap/.

Augmented Navigator

Finally, continue training the Navigator on a mixture of the original and augmented data:

bash run/bt_envdrop.bash

We apply a one-step learning rate decay to 1e-5 when training saturates.
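
In practice this means relaunching the augmented training from the latest snapshot with the lower learning rate. The sketch below is a hypothetical continuation command: the --load and --lr flag names are assumed from the EnvDrop-style argument parser, and the checkpoint path is a placeholder for your own snapshot under snap/.

# resume augmented training with the decayed learning rate, keeping the other
# arguments in run/bt_envdrop.bash unchanged (flag names and path are placeholders)
python r2r_src/train.py $flag --load snap/<experiment_name>/state_dict/latest_dict --lr 1e-5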

Citation

If you use or discuss our Entity Relationship Graph, please cite our paper:

@article{hong2020language,
  title={Language and Visual Entity Relationship Graph for Agent Navigation},
  author={Hong, Yicong and Rodriguez, Cristian and Qi, Yuankai and Wu, Qi and Gould, Stephen},
  journal={Advances in Neural Information Processing Systems},
  volume={33},
  year={2020}
}