
Nik-V9 / HEAPUtil

License: MIT
Code for the RA-L (IROS) 2021 paper "A Hierarchical Dual Model of Environment- and Place-Specific Utility for Visual Place Recognition"

Programming Languages

Python, Jupyter Notebook

Projects that are alternatives of or similar to HEAPUtil

suspend
suspend/resume a list of processes in Windows
Stars: ✭ 23 (-50%)
Mutual labels:  utility
ReaperJPN-Phroneris
Japanese localization patch for the retail version of REAPER (Mori)
Stars: ✭ 41 (-10.87%)
Mutual labels:  localization
DiGCL
The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS-2021
Stars: ✭ 27 (-41.3%)
Mutual labels:  contrastive-learning
awesome-graph-self-supervised-learning-based-recommendation
A curated list of awesome graph & self-supervised-learning-based recommendation.
Stars: ✭ 37 (-19.57%)
Mutual labels:  contrastive-learning
FastDMG
Fast, no-nonsense disk image mounting for macOS
Stars: ✭ 72 (+56.52%)
Mutual labels:  utility
Lingo-Vapor
Vapor provider for Lingo - the Swift localization library
Stars: ✭ 45 (-2.17%)
Mutual labels:  localization
lingua
A PHP-7 language codes converter, from and to the most common formats (ISO or not)
Stars: ✭ 35 (-23.91%)
Mutual labels:  localization
xd-storage-helper
A little helper to make storing key-value-pairs (e.g. settings) for Adobe XD plugins easier.
Stars: ✭ 22 (-52.17%)
Mutual labels:  utility
dart.cn
Dart docs localization, get started from the wiki page here: https://github.com/cfug/dart.cn/wiki
Stars: ✭ 64 (+39.13%)
Mutual labels:  localization
french
French language pack to localize the Flarum forum software plus its official and third-party extensions.
Stars: ✭ 17 (-63.04%)
Mutual labels:  localization
laika
Log, test, intercept and modify Apollo Client's operations
Stars: ✭ 99 (+115.22%)
Mutual labels:  utility
awesome-translations
😎 Awesome lists about Internationalization & localization stuff. l10n, g11n, m17n, i18n. Translations! 🌎🌍
Stars: ✭ 54 (+17.39%)
Mutual labels:  localization
vanilla-docker
A sweet Docker setup for Vanilla Forums
Stars: ✭ 34 (-26.09%)
Mutual labels:  utility
discord-paginationembed
A pagination utility for MessageEmbed in Discord.JS
Stars: ✭ 93 (+102.17%)
Mutual labels:  utility
micSwitch
macOS menu bar application for the mic mute/unmute with single click or shortcut with walkie-talkie style support
Stars: ✭ 37 (-19.57%)
Mutual labels:  utility
GA SLAM
🚀 SLAM for autonomous planetary rovers with global localization
Stars: ✭ 40 (-13.04%)
Mutual labels:  localization
utilsac
Utility functions
Stars: ✭ 13 (-71.74%)
Mutual labels:  utility
laravel-translate
Generate translation files for Laravel using Google Translate
Stars: ✭ 22 (-52.17%)
Mutual labels:  localization
codac
Codac is a library for constraint programming over reals, trajectories and sets.
Stars: ✭ 31 (-32.61%)
Mutual labels:  localization
CVC
CVC: Contrastive Learning for Non-parallel Voice Conversion (INTERSPEECH 2021, in PyTorch)
Stars: ✭ 45 (-2.17%)
Mutual labels:  contrastive-learning

A Hierarchical Dual Model of Environment- and Place-Specific Utility for Visual Place Recognition

Badges: License MIT · QUT Centre for Robotics · arXiv · IEEE Xplore (RA-L 2021) · Open In Colab · YouTube · Papers with Code

Introduction

HEAPUtil accompanies our IEEE RA-L & IROS 2021 paper. In this work, we present a method for unsupervised estimation of the Environment-Specific (ES) and Place-Specific (PS) Utility of unique visual cues in a reference map represented as VLAD clusters. We then employ this Utility in a unified, hierarchical global-to-local VPR pipeline to enable better place recognition and localization for robots, with reduced storage and compute-time requirements. This repo contains the official code for estimating the Utility of visual cues and for the hierarchical global-to-local VPR pipeline.
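At a high level, the pipeline first retrieves candidate places with a global (NetVLAD) descriptor and then re-ranks the shortlisted candidates with utility-guided local feature matching. Below is a minimal, self-contained sketch of that global-to-local flow on synthetic data; the variable names and the placeholder re-ranking score are illustrative assumptions, not the repo's actual implementation.

# Minimal sketch of a hierarchical global-to-local VPR loop on synthetic data.
# Names (ref_global, local_match_score) are illustrative, not the repo's API.
import numpy as np

rng = np.random.default_rng(0)
num_refs, dim, top_k = 100, 256, 5

# Global descriptors (e.g. NetVLAD) for the reference map and one query.
ref_global = rng.normal(size=(num_refs, dim)).astype(np.float32)
ref_global /= np.linalg.norm(ref_global, axis=1, keepdims=True)
query_global = ref_global[42] + 0.1 * rng.normal(size=dim).astype(np.float32)
query_global /= np.linalg.norm(query_global)

# Stage 1: global retrieval -> top-k candidate reference images.
sims = ref_global @ query_global
candidates = np.argsort(-sims)[:top_k]

# Stage 2: re-rank the candidates with a (placeholder) local matching score,
# standing in for utility-guided SuperPoint/SuperGlue matching.
def local_match_score(ref_idx: int) -> float:
    return float(sims[ref_idx] + 0.01 * rng.normal())

best = max(candidates, key=local_match_score)
print("top-k candidates:", candidates, "-> final match:", best)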


Utility-guided Hierarchical Visual Place Recognition.

For more details, please see the paper (arXiv / IEEE Xplore) and the video linked above.

Dependencies

Simply run the following command:

pip install -r requirements.txt

Conda

conda create -n heaputil python=3.8 mamba -c conda-forge -y
conda activate heaputil
mamba install numpy opencv pytorch matplotlib faiss-gpu scipy scikit-image=0.18.2 torchvision scikit-learn h5py -c conda-forge

Data

For data loading, we use .mat files which contain the Reference Image Paths, Query Image Paths, Ground-truth Coordinates for the Reference and Query Images, and the Positive Localization Distance Threshold. These .mat files for the Berlin Kudamm, Nordland Summer vs. Winter, and Oxford Day vs. Night datasets are present in the ./dataset-mat-files folder.
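As a quick sanity check, the .mat files can be inspected with SciPy. The sketch below only assumes scipy.io.loadmat; the file path and key names are hypothetical, so print the keys first to see the actual field names.

# Inspect one of the dataset .mat files; the path and keys here are hypothetical,
# so list the keys first to see what the file actually contains.
from scipy.io import loadmat

data = loadmat('./dataset-mat-files/berlin.mat')   # path is an assumption
print([k for k in data.keys() if not k.startswith('__')])

# Per the README, the file holds reference/query image paths, ground-truth
# coordinates, and a positive localization distance threshold, e.g. (assumed keys):
# ref_paths   = data['dbImageFns']
# query_paths = data['qImageFns']
# threshold   = data['posDistThr']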

We provide the Berlin Kudamm Dataset for Inference:

For more details regarding the Berlin Kudamm dataset, please refer to this paper.

For all the scripts apart from SuperPoint Extraction, you may use the --dataset flag to specify the dataset. By default, it is set to 'berlin'; the available choices are ['oxford', 'nordland', 'berlin'].

Quick Start

Here's a Colab Notebook to effortlessly run tests on the Berlin Dataset.

Scripts

Please use the --help flag to see all available arguments for the scripts.

NetVLAD (Global Descriptor)

Extract NetVLAD Descriptors, Predictions and Cluster Masks:

python NetVLAD/main.py --resume './data/NetVLAD/netvlad-checkpoint-cc16' --root_dir './data' --save --save_path './data/NetVLAD'
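The cluster masks associate image regions with the VLAD clusters they contribute to. As a rough illustration of the idea (not the actual code in NetVLAD/main.py), a hard-assignment mask can be built by assigning each cell of a dense feature map to its nearest cluster centroid:

# Illustrative hard-assignment cluster mask: each spatial cell of a dense
# feature map is assigned to the nearest of 16 VLAD cluster centroids.
# This mirrors the idea of NetVLAD cluster masks, not the exact implementation.
import numpy as np

rng = np.random.default_rng(0)
H, W, D, num_clusters = 30, 40, 256, 16

features = rng.normal(size=(H, W, D)).astype(np.float32)    # dense local descriptors
centroids = rng.normal(size=(num_clusters, D)).astype(np.float32)

# Squared distance of every cell to every centroid, then argmin over clusters.
dists = ((features[..., None, :] - centroids[None, None]) ** 2).sum(-1)
cluster_mask = dists.argmin(-1)          # (H, W) integer mask in [0, num_clusters)
print(cluster_mask.shape, cluster_mask.min(), cluster_mask.max())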

Environment- and Place-Specific Utility Estimation

Estimate the Environment- and Place-Specific Utility of VLAD Clusters for the Reference Map:

python utility.py --root_dir './data' --netvlad_extracts_path './data/NetVLAD' --save_path './data/Utility' --save_viz

You may use the --save_viz flag to visualize the Environment-Specific and Place-Specific Utility as shown below:

Visualizing ES (left) & PS (right) Utility (Red indicates low utility and blue/gray indicates high utility)

SuperPoint Feature Extraction

Generate path lists which are required for SuperPoint Extraction & SuperGlue:

python generate_path_lists.py --root_dir './data' --netvlad_predictions './data/NetVLAD' --save_path './data'

Extract SuperPoint features for the Reference Map:

python SuperGlue/superpoint_extraction.py --input_images './data/db_list.txt' --split 'db' --input_dir './data' --output_dir './data/SuperPoint'

Extract SuperPoint features for the Queries:

python SuperGlue/superpoint_extraction.py --input_images './data/q_list.txt' --split 'query' --input_dir './data' --output_dir './data/SuperPoint'
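To sanity-check the extraction output, the saved features can be loaded back. The per-image file name and array names below are assumptions; adjust them to whatever superpoint_extraction.py actually writes.

# Quick check of extracted SuperPoint features. The file layout is an assumption
# here (inspect the output directory for the actual naming and format).
import numpy as np

feats = np.load('./data/SuperPoint/example_image.npz')   # hypothetical path
print(feats.files)   # expected arrays such as keypoints, descriptors, scores (assumed)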

Utility-guided Local Feature Matching

You may use the --viz flag to visualize the best matches as a gif.

Vanilla

Run Vanilla SuperPoint based Local Feature Matching:

python local_feature_matching.py --input_dir './data' --output_dir './data/LFM/Vanilla' \
--netvlad_extracts_path './data/NetVLAD' --superpoint_extracts_path './data/SuperPoint' --utility_path './data/Utility'

Environment-Specific (ES) Utility

Run ES-Utility guided Local Feature Matching:

python local_feature_matching.py --input_dir './data' --output_dir './data/LFM/ES_Utility' \
--netvlad_extracts_path './data/NetVLAD' --superpoint_extracts_path './data/SuperPoint' --utility_path './data/Utility' \
--es_utility

Place-Specific (PS) Utility

Run PS-Utility guided Local Feature Matching:

python local_feature_matching.py --input_dir './data' --output_dir './data/LFM/PS_Utility' \
--netvlad_extracts_path './data/NetVLAD' --superpoint_extracts_path './data/SuperPoint' --utility_path './data/Utility' \
--ps_utility

By default, the top 10 utility clusters are used for Local Feature Matching. Please use the --k flag to use a different number of top utility clusters.
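Conceptually, utility-guided matching keeps only the local features that fall inside the top-k utility clusters of the reference image. The following is a rough sketch of that filtering step with placeholder arrays (not the internals of local_feature_matching.py):

# Sketch of utility-guided keypoint filtering: keep only SuperPoint keypoints
# that land in the top-k utility clusters of the reference image.
# Variable names and resolutions are placeholders, not the repo's internals.
import numpy as np

rng = np.random.default_rng(1)
H, W, num_clusters, k = 480, 640, 16, 10

cluster_mask = rng.integers(0, num_clusters, size=(H, W))   # per-pixel VLAD cluster id
utility = rng.random(num_clusters)                          # per-cluster utility score
keypoints = np.stack([rng.integers(0, W, 200), rng.integers(0, H, 200)], axis=1)  # (x, y)

top_clusters = np.argsort(-utility)[:k]
kp_clusters = cluster_mask[keypoints[:, 1], keypoints[:, 0]]
keep = np.isin(kp_clusters, top_clusters)

filtered_keypoints = keypoints[keep]
print(f"kept {keep.sum()} / {len(keypoints)} keypoints in the top-{k} utility clusters")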

Combined ES & PS Utility

Run ES & PS-Utility guided Local Feature Matching:

python local_feature_matching.py --input_dir './data' --output_dir './data/LFM/Utility' \
--netvlad_extracts_path './data/NetVLAD' --superpoint_extracts_path './data/SuperPoint' --utility_path './data/Utility' \
--es_utility --ps_utility --viz

By default, X-1 top utility clusters are used for Local Feature Matching, where X is the number of useful clusters determined by the Environment-Specific system. To use a different number of top utility clusters, please use the --non_default_k and --k flags.
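A small sketch of how that default could be derived, assuming the Environment-Specific stage exposes a boolean "useful" flag per cluster (the names and thresholding below are illustrative, not the repo's exact logic):

# Illustrative default-k rule for combined ES & PS utility: if the ES stage
# flags X clusters as useful, keep the top X-1 PS-utility clusters among them.
import numpy as np

rng = np.random.default_rng(2)
num_clusters = 16

es_useful = rng.random(num_clusters) > 0.4     # boolean ES "useful" flags (assumed)
ps_utility = rng.random(num_clusters)          # per-cluster PS utility scores

useful_ids = np.flatnonzero(es_useful)
k = max(1, len(useful_ids) - 1)                # default: X - 1 clusters

top_clusters = useful_ids[np.argsort(-ps_utility[useful_ids])][:k]
print(f"{len(useful_ids)} ES-useful clusters -> keeping top {k}: {top_clusters}")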

We use the --viz flag to visualize the best matches along with utility reference masks as a gif as shown below:


ES & PS Utility-guided Local Feature Matching (Cyan mask represents regions with high utility)

Utility-guided SuperGlue

Similar to Local Feature Matching, you may run the superglue_match_pairs.py file for Vanilla SuperGlue & Utility-guided SuperGlue. You may use the --viz flag to visualize all the matches and dump the SuperGlue-style plots.

Run ES & PS-Utility guided SuperGlue:

python superglue_match_pairs.py --input_pairs './data/berlin_netvlad_candidate_list.txt' --input_dir './data' --output_dir './data/SuperGlue/Utility' \
--netvlad_extracts_path './data/NetVLAD' --utility_path './data/Utility' \
--es_utility --ps_utility

BibTeX Citation

If any ideas from the paper or code from this repo are used, please consider citing:

@article{keetha2021hierarchical,
  author={Keetha, Nikhil Varma and Milford, Michael and Garg, Sourav},
  journal={IEEE Robotics and Automation Letters}, 
  title={A Hierarchical Dual Model of Environment- and Place-Specific Utility for Visual Place Recognition}, 
  year={2021},
  volume={6},
  number={4},
  pages={6969-6976},
  doi={10.1109/LRA.2021.3096751}}

The code is licensed under the MIT License.

Acknowledgements

The authors acknowledge the support from the Queensland University of Technology (QUT) through the Centre for Robotics.

Furthermore, we would like to acknowledge the PyTorch implementation of NetVLAD from Nanne and the original implementation of SuperGlue.

Related works

Please check out this collection of related works on place recognition.
