
ngarneau / Understanding Pytorch Batching Lstm

Understanding and visualizing PyTorch Batching with LSTM

Projects that are alternatives of or similar to Understanding Pytorch Batching Lstm

Rnn lstm from scratch
How to build RNNs and LSTMs from scratch with NumPy.
Stars: ✭ 156 (+24.8%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Lstm anomaly thesis
Anomaly detection for temporal data using LSTMs
Stars: ✭ 178 (+42.4%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Image Caption Generator
[DEPRECATED] A Neural Network based generative model for captioning images using Tensorflow
Stars: ✭ 141 (+12.8%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Scribe
Realistic Handwriting with Tensorflow
Stars: ✭ 193 (+54.4%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Deep Learning Time Series
List of papers, code and experiments using deep learning for time series forecasting
Stars: ✭ 796 (+536.8%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Automatic Image Captioning
Generating Captions for images using Deep Learning
Stars: ✭ 84 (-32.8%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Aulas
Classes from the Escola de Inteligência Artificial de São Paulo (São Paulo School of Artificial Intelligence)
Stars: ✭ 166 (+32.8%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Da Rnn
📃 **Unofficial** PyTorch Implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (+104.8%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Stockpriceprediction
Stock Price Prediction using Machine Learning Techniques
Stars: ✭ 700 (+460%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Bitcoin Price Prediction Using Lstm
Bitcoin price prediction (time series) using an LSTM recurrent neural network
Stars: ✭ 67 (-46.4%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Pytorch Learners Tutorial
PyTorch tutorial for learners
Stars: ✭ 97 (-22.4%)
Mutual labels:  jupyter-notebook, lstm-neural-networks
Onlineminingtripletloss
PyTorch conversion of https://omoindrot.github.io/triplet-loss
Stars: ✭ 125 (+0%)
Mutual labels:  jupyter-notebook
Gdeltpyr
Python-based framework to retrieve Global Database of Events, Language, and Tone (GDELT) version 1.0 and version 2.0 data.
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Error Detection
A Baseline for Detecting Misclassified and Out-of-Distribution Examples in Neural Networks
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Jupyterlab Demo
Demonstrations of JupyterLab
Stars: ✭ 122 (-2.4%)
Mutual labels:  jupyter-notebook
Nlp Beginner Guide Keras
NLP model implementations with Keras for beginners
Stars: ✭ 125 (+0%)
Mutual labels:  jupyter-notebook
Pandaset Devkit
Stars: ✭ 121 (-3.2%)
Mutual labels:  jupyter-notebook
Carnd Lenet Lab
Implement the LeNet deep neural network model with TensorFlow.
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Predictive Maintenance
Data Wrangling, EDA, Feature Engineering, Model Selection, Regression, Binary and Multi-class Classification (Python, scikit-learn)
Stars: ✭ 124 (-0.8%)
Mutual labels:  jupyter-notebook
Midi Rnn
Generate monophonic melodies with machine learning using a basic LSTM RNN
Stars: ✭ 124 (-0.8%)
Mutual labels:  lstm-neural-networks

Understanding and visualizing PyTorch Batching with LSTM

This is a small notebook that I wrote to help me understand how batching is done in PyTorch with a Recurrent Neural Network (LSTM).

If you see anything wrong in this notebook, please feel free to contribute or submit an issue; I may have misunderstood, misinterpreted, or misrepresented some things here.

The point here wasn't to build a state-of-the-art model, but to visualize properly how PyTorch handles the tensors while batching them into an LSTM.
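
As a minimal sketch of the pattern the notebook visualizes (not code taken from the notebook itself): variable-length sequences are padded into a rectangular tensor, packed so the LSTM skips the padding, then unpacked again. The token ids, vocabulary size, and layer dimensions below are invented purely for illustration.

```python
import torch
import torch.nn as nn
from torch.nn.utils.rnn import pad_sequence, pack_padded_sequence, pad_packed_sequence

# Three sequences of different lengths, already mapped to integer token ids.
sequences = [torch.tensor([4, 7, 2, 9]),
             torch.tensor([5, 1]),
             torch.tensor([8, 3, 6])]
lengths = torch.tensor([len(s) for s in sequences])

# Pad into one rectangular batch: shape (batch=3, max_len=4), padded with 0.
padded = pad_sequence(sequences, batch_first=True)

embedding = nn.Embedding(num_embeddings=10, embedding_dim=5)
lstm = nn.LSTM(input_size=5, hidden_size=6, batch_first=True)

embedded = embedding(padded)                       # shape (3, 4, 5)

# Pack so the LSTM does no work on the padded timesteps.
packed = pack_padded_sequence(embedded, lengths, batch_first=True,
                              enforce_sorted=False)
packed_output, (h_n, c_n) = lstm(packed)

# Unpack back to a padded tensor; h_n holds each sequence's last real step.
output, output_lengths = pad_packed_sequence(packed_output, batch_first=True)
print(padded.shape)   # torch.Size([3, 4])
print(output.shape)   # torch.Size([3, 4, 6])
print(h_n.shape)      # torch.Size([1, 3, 6])
```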

Thanks to Tushar-N, whose work inspired this repo, and of course to the Classifying Names with a Character-Level RNN PyTorch tutorial.
