
mchancan / Flynet

License: MIT
Official PyTorch implementation of paper "A Hybrid Compact Neural Architecture for Visual Place Recognition" by M. Chancán (RA-L & ICRA 2020) https://doi.org/10.1109/LRA.2020.2967324


Projects that are alternatives of or similar to Flynet

Emotion Recognition Using Speech
Building and training a Speech Emotion Recognizer that predicts human emotions using Python, scikit-learn, and Keras
Stars: ✭ 159 (+329.73%)
Mutual labels:  neural-networks, recurrent-neural-networks
Lstm anomaly thesis
Anomaly detection for temporal data using LSTMs
Stars: ✭ 178 (+381.08%)
Mutual labels:  neural-networks, recurrent-neural-networks
Pytorch Esn
An Echo State Network module for PyTorch.
Stars: ✭ 98 (+164.86%)
Mutual labels:  neural-networks, recurrent-neural-networks
Ml In Tf
Get started with Machine Learning in TensorFlow with a selection of good reads and implemented examples!
Stars: ✭ 45 (+21.62%)
Mutual labels:  neural-networks, recurrent-neural-networks
Stock Price Prediction Lstm
OHLC Average Prediction of Apple Inc. Using LSTM Recurrent Neural Network
Stars: ✭ 232 (+527.03%)
Mutual labels:  neural-networks, recurrent-neural-networks
Udacity Deep Learning Nanodegree
A collection of projects made during my Deep Learning Nanodegree at Udacity
Stars: ✭ 15 (-59.46%)
Mutual labels:  neural-networks, recurrent-neural-networks
Deep Spying
Spying using Smartwatch and Deep Learning
Stars: ✭ 172 (+364.86%)
Mutual labels:  neural-networks, recurrent-neural-networks
Coursera Deep Learning Specialization
Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models
Stars: ✭ 188 (+408.11%)
Mutual labels:  neural-networks, recurrent-neural-networks
Echotorch
A Python toolkit for Reservoir Computing and Echo State Network experimentation based on PyTorch. EchoTorch is the only Python module available to easily create Deep Reservoir Computing models.
Stars: ✭ 231 (+524.32%)
Mutual labels:  neural-networks, recurrent-neural-networks
Deep Learning With Python
Deep learning codes and projects using Python
Stars: ✭ 195 (+427.03%)
Mutual labels:  neural-networks, recurrent-neural-networks
Komputation
Komputation is a neural network framework for the Java Virtual Machine written in Kotlin and CUDA C.
Stars: ✭ 295 (+697.3%)
Mutual labels:  neural-networks, recurrent-neural-networks
Carrot
🥕 Evolutionary Neural Networks in JavaScript
Stars: ✭ 261 (+605.41%)
Mutual labels:  neural-networks, recurrent-neural-networks
Tensorflow Tutorial
TensorFlow and Deep Learning Tutorials
Stars: ✭ 748 (+1921.62%)
Mutual labels:  neural-networks, recurrent-neural-networks
Hyperpose
HyperPose: a collection of real-time human pose estimation models
Stars: ✭ 961 (+2497.3%)
Mutual labels:  neural-networks
Named Entity Recognition
Named entity recognition with a recurrent neural network (RNN) in TensorFlow
Stars: ✭ 20 (-45.95%)
Mutual labels:  recurrent-neural-networks
Rnn lstm gesture recog
Recognising hand gestures using RNNs and LSTMs; implementation in TensorFlow
Stars: ✭ 14 (-62.16%)
Mutual labels:  recurrent-neural-networks
Deepaudioclassification
Finding the genre of a song with Deep Learning
Stars: ✭ 969 (+2518.92%)
Mutual labels:  neural-networks
Lstmvis
Visualization Toolbox for Long Short Term Memory networks (LSTMs)
Stars: ✭ 959 (+2491.89%)
Mutual labels:  recurrent-neural-networks
Yerevann.github.io
YerevaNN blog
Stars: ✭ 13 (-64.86%)
Mutual labels:  neural-networks
3d Semantic Segmentation For Scene Parsing
A new approach for real-time 3D semantic segmentation based on feature abstraction and deep learning
Stars: ✭ 13 (-64.86%)
Mutual labels:  neural-networks

A Hybrid Compact Neural Architecture for Visual Place Recognition

In this release, we provide an open-source implementation of the FlyNet supervised-learning experiments from A Hybrid Compact Neural Architecture for Visual Place Recognition, published in IEEE Robotics and Automation Letters (RA-L) (DOI: 10.1109/LRA.2020.2967324). A preprint is available at https://arxiv.org/abs/1910.06840.

Project page: https://mchancan.github.io/projects/FlyNet

News!

  • (Mar 1, 2021) New (relevant) research on Sequential Place Learning, which addresses the main limitations of CANNs, is now available!
  • (Nov 22, 2020) A demo of the CANN component has been released here.

Dataset

The dataset used to run this code can be downloaded from here; it is a small subset of the Nordland dataset. The code can easily be adapted to run on other, much larger datasets.
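When adapting the demo to another dataset, the key assumption is that frames at the same index across traverses depict the same place. A minimal sketch of that pairing logic (the function name and directory layout below are illustrative, not part of this repo):

```python
from pathlib import Path

def paired_traverses(ref_dir, query_dir, exts=(".png", ".jpg")):
    """Pair reference and query frames by sorted filename order.

    Nordland-style traverses are frame-aligned: index i in each
    traverse shows the same place under different conditions.
    """
    ref = sorted(p for p in Path(ref_dir).iterdir() if p.suffix.lower() in exts)
    qry = sorted(p for p in Path(query_dir).iterdir() if p.suffix.lower() in exts)
    n = min(len(ref), len(qry))  # tolerate traverses of unequal length
    return list(zip(ref[:n], qry[:n]))
```

Pairs produced this way can then be fed to whatever loader your training code expects.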

Dependencies

This code was tested on PyTorch v1.0 and Python 3.6.

Use FlyNet

We provide a demo of FlyNet on the Nordland dataset. After downloading the dataset, extract it into the dataset/ directory and run:

python main.py
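As described in the paper, FlyNet's compact front end is inspired by the fly olfactory circuit: a sparse random binary projection followed by a winner-take-all step. A rough NumPy sketch of that idea (the dimensions, connectivity ratio, and keep fraction here are illustrative defaults, not the repo's actual hyperparameters):

```python
import numpy as np

def fly_layer(x, out_dim=64, conn_frac=0.1, keep_frac=0.5, seed=0):
    """Fly-inspired layer sketch: each output unit samples a small
    random subset of inputs (sparse binary weights), then a
    winner-take-all step keeps only the most active units."""
    rng = np.random.default_rng(seed)
    # Sparse random binary connectivity from inputs to output units.
    W = (rng.random((out_dim, x.shape[-1])) < conn_frac).astype(float)
    a = W @ x                             # random projection
    k = max(1, int(keep_frac * out_dim))  # number of winners to keep
    thresh = np.partition(a, -k)[-k]      # k-th largest activation
    return (a >= thresh).astype(float)    # binary winner-take-all output
```

In the full model, such a compact binary code would feed the recurrent component; see the paper for the actual architecture and hyperparameters.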

Sample results

License

FlyNet is released under the MIT License (refer to the LICENSE file for details) for academic use. For commercial usage, please contact us via [email protected]

Citation

If you find this project useful for your research, please use the following BibTeX entry.

@article{chancan2020hybrid,
	author = {M. {Chanc\'an} and L. {Hernandez-Nunez} and A. {Narendra} and A. B. {Barron} and M. {Milford}},
	journal = {IEEE Robotics and Automation Letters},
	title = {A Hybrid Compact Neural Architecture for Visual Place Recognition},
	year = {2020},
	volume = {5},
	number = {2},
	pages = {993--1000},
	keywords = {Biomimetics;localization;visual-based navigation},
	doi = {10.1109/LRA.2020.2967324},
	ISSN = {2377-3774},
	month = {April}
}