Variational Neural Annealing

Variational neural annealing (VNA) is a framework for variationally simulating classical and quantum annealing with neural networks in order to solve optimization problems. In our paper (Nature Machine Intelligence: https://www.nature.com/articles/s42256-021-00401-3; arXiv version: https://arxiv.org/abs/2101.10154), we show that a variational version of classical annealing (VCA) and its quantum counterpart (VQA) can be implemented using recurrent neural networks. We find that our implementation significantly outperforms traditional simulated annealing in the asymptotic limit on prototypical spin models, suggesting that this is a promising route to optimization.
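Schematically (our paraphrase of the setup, not a verbatim formulation from the paper), VCA anneals the temperature in a variational free energy of the RNN's distribution p_λ, while VQA anneals a transverse field in the variational energy of an RNN wave function:

```latex
% VCA: minimize the variational free energy of the RNN distribution p_\lambda,
% with the temperature T(t) annealed from T_0 down to zero:
F_\lambda(t) = \langle H_{\text{target}} \rangle_{p_\lambda} - T(t)\, S(p_\lambda)

% VQA: minimize the variational energy of a driven Hamiltonian whose
% transverse field \Gamma(t) is annealed from \Gamma_0 down to zero:
\hat{H}(t) = \hat{H}_{\text{target}} - \Gamma(t) \sum_i \hat{\sigma}^x_i
```

Here S(p_λ) denotes the entropy of the RNN distribution; see the paper for the precise objectives and annealing schedules.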

This repository aims to facilitate the reproducibility of the results of our paper.

Our implementation is based on the RNN wave functions codebase.

Content

This repository contains the source code of our implementation, along with tutorials in the form of Jupyter notebooks for demonstration purposes.

src

This section contains our source code with the following implementations:

  1. src/VNA_1DTRNN: an implementation of VNA using 1D Tensorized RNNs to find the ground state of random Ising chains with open boundary conditions. All you need to do is run the file src/run_VNA_randomisingchain.py.

  2. src/VNA_2DTRNN: an implementation of VNA using 2D Tensorized RNNs to find the ground state of the 2D Edwards-Anderson model with open boundary conditions. To execute this module, you can run the file src/run_VNA_EdwardsAnderson.py.

  3. src/VNA_DilatedRNN: an implementation of VNA using Dilated RNNs to find the ground state of the Sherrington-Kirkpatrick model. To execute this implementation, you can run the Python file src/run_VNA_SherringtonKirkpatrick.py.
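For orientation, the three modules above minimize classical spin-glass energy functions. A minimal NumPy sketch of these cost functions (ours, for illustration only, not the repository's code; in the paper the couplings J are drawn at random per instance):

```python
import numpy as np

def chain_energy(spins, J):
    # Random Ising chain, open boundaries: E = -sum_i J_i s_i s_{i+1}
    return -np.sum(J * spins[:-1] * spins[1:])

def ea_energy(spins, Jh, Jv):
    # 2D Edwards-Anderson model on an L x L grid with open boundaries:
    # Jh couples horizontal neighbour pairs, Jv couples vertical pairs.
    return (-np.sum(Jh * spins[:, :-1] * spins[:, 1:])
            - np.sum(Jv * spins[:-1, :] * spins[1:, :]))

def sk_energy(spins, J):
    # Sherrington-Kirkpatrick model: all-to-all couplings
    # (only the strict upper triangle of J is used, each pair counted once).
    return -np.sum(np.triu(J, k=1) * np.outer(spins, spins))

# Example: a random chain instance with Gaussian couplings.
rng = np.random.default_rng(0)
N = 6
spins = rng.choice([-1, 1], size=N)
J = rng.normal(size=N - 1)
energy = chain_energy(spins, J)
```

VNA trains an RNN to sample low-energy configurations of these cost functions while the annealing parameter is slowly reduced.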

To run VCA in any of these modules, set Bx0 (the initial transverse magnetic field) to zero in the hyperparameters section of the execution Python files. Similarly, to run VQA, set T0 (the initial temperature) to zero. To run RVQA, set both Bx0 and T0 to non-zero values. Finally, to run classical-quantum optimization (CQO), set both Bx0 and T0 to zero. More details about the acronyms VCA, VQA, RVQA and CQO are provided in our paper.
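The mode selection above depends only on the two initial values. A hypothetical helper (not part of the repository) that mirrors this mapping:

```python
def annealing_mode(Bx0, T0):
    """Map the initial transverse field Bx0 and initial temperature T0
    to the annealing mode described in the paper."""
    if Bx0 == 0 and T0 == 0:
        return "CQO"   # classical-quantum optimization: no annealing
    if Bx0 == 0:
        return "VCA"   # classical annealing: only the temperature is annealed
    if T0 == 0:
        return "VQA"   # quantum annealing: only the transverse field is annealed
    return "RVQA"      # regularized VQA: both are annealed
```

For example, `annealing_mode(0, 1.0)` selects VCA, matching the Bx0 = 0 setting described above.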

We note that in this code we use the tensordot2 operation from the TensorNetwork package to speed up tensorized operations.
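To illustrate what a tensorized operation looks like, here is a schematic tensorized RNN cell in NumPy (our sketch, not the repository's implementation; `np.tensordot` stands in for TensorNetwork's tensordot2, and all shapes are illustrative):

```python
import numpy as np

def tensorized_cell(h, x, T, b):
    """Schematic tensorized RNN cell: the hidden state h (shape (dh,)) and
    input x (shape (dx,)) are combined multiplicatively through a 3-index
    weight tensor T (shape (dh, dh, dx)) before the nonlinearity."""
    # pre[a] = sum_{b,c} T[a, b, c] * h[b] * x[c] + bias[a]
    pre = np.tensordot(T, np.outer(h, x), axes=([1, 2], [0, 1])) + b
    return np.tanh(pre)

# Example: hidden dimension 4, one-hot spin input of dimension 2.
rng = np.random.default_rng(0)
h = rng.normal(size=4)
x = np.array([1.0, 0.0])          # one-hot encoding of a spin value
T = rng.normal(size=(4, 4, 2))
b = rng.normal(size=4)
h_next = tensorized_cell(h, x, T, b)
```

Contracting the weight tensor against h ⊗ x in a single tensordot call avoids materializing intermediate matrices, which is the kind of contraction tensordot2 accelerates.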

tools

This section contains the tools we used to generate the random instances of the models we considered in our paper.

tutorials

In this section of the repository, we demonstrate how our source code works in simple cases through Jupyter notebooks, which you can run on Google Colaboratory to take advantage of GPU speedups. These tutorials will help you become more familiar with the content of the source code. The tutorials module contains the following:

  1. tutorials/VNA_1DTRNNs.ipynb: a demonstration of VNA using 1D Tensorized RNNs applied to random Ising chains with open boundary conditions.
  2. tutorials/VNA_2DTRNNs.ipynb: a demonstration of VNA using 2D Tensorized RNNs on the 2D Edwards-Anderson model with open boundary conditions.
  3. tutorials/VNA_DilatedRNNs.ipynb: a demonstration of VNA using Dilated RNNs applied to the Sherrington-Kirkpatrick model.

For more details, you can check our manuscript on arXiv: https://arxiv.org/abs/2101.10154 or on Nature Machine Intelligence: https://www.nature.com/articles/s42256-021-00401-3 (free access at https://rdcu.be/cAIyS).

For questions or inquiries, you can reach out at [email protected].

Dependencies

This code works with Python (3.6.10), TensorFlow (1.13.1), and NumPy (1.16.3). We also note that this code runs much faster on a GPU than on a CPU. No installation is required, provided that the dependencies are available.
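One possible environment setup under these pinned versions (a sketch; adjust to your platform, since TensorFlow 1.13.1 requires an older Python such as 3.6):

```shell
# Create an isolated environment with the pinned dependency versions.
python3.6 -m venv vna-env
source vna-env/bin/activate
pip install "tensorflow==1.13.1" "numpy==1.16.3"
# For GPU runs, use the GPU build instead (requires a matching CUDA setup):
# pip install "tensorflow-gpu==1.13.1"
# If tensordot2 is not bundled with the repository, you may also need:
# pip install tensornetwork
```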

Disclaimer

This code can be freely used for academic purposes that are socially and scientifically beneficial; however, profit-related activities fall under Vector Institute's Intellectual Property (IP) policy.

License

This code is released under the Creative Commons 'Attribution-NonCommercial-ShareAlike 4.0 International' license.

Citing

@Article{VNA2021,
author={Hibat-Allah, Mohamed and Inack, Estelle M. and Wiersema, Roeland and Melko, Roger G. and Carrasquilla, Juan},
title={Variational neural annealing},
journal={Nature Machine Intelligence},
year={2021},
month={Nov},
day={01},
volume={3},
number={11},
pages={952-961},
issn={2522-5839},
doi={10.1038/s42256-021-00401-3},
url={https://doi.org/10.1038/s42256-021-00401-3}
}