
rajesh-bhat / dhs_summit_2019_image_captioning

License: MIT
Image captioning using attention models

Programming Languages

Jupyter Notebook: 11,667 projects
Python: 139,335 projects (#7 most used programming language)

Projects that are alternatives to or similar to dhs_summit_2019_image_captioning

Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+9952.94%)
Mutual labels:  lstm, attention, sequence-to-sequence, encoder-decoder
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+417.65%)
Mutual labels:  lstm, attention, encoder-decoder
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-38.24%)
Mutual labels:  lstm, attention, encoder-decoder
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention mechanism model based on relational news extraction for stock prediction
Stars: ✭ 33 (-2.94%)
Mutual labels:  lstm, attention
Screenshot To Code
A neural network that transforms a design mock-up into a static website.
Stars: ✭ 13,561 (+39785.29%)
Mutual labels:  lstm, encoder-decoder
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+20.59%)
Mutual labels:  attention, sequence-to-sequence
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (+405.88%)
Mutual labels:  lstm, attention
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-2.94%)
Mutual labels:  lstm, attention
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (-29.41%)
Mutual labels:  lstm, attention
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-41.18%)
Mutual labels:  lstm, attention
iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (+52.94%)
Mutual labels:  lstm, attention
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+441.18%)
Mutual labels:  lstm, attention
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (+438.24%)
Mutual labels:  lstm, attention
protein-transformer
Predicting protein structure through sequence modeling
Stars: ✭ 77 (+126.47%)
Mutual labels:  attention, sequence-to-sequence
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (+26.47%)
Mutual labels:  lstm, attention
learningspoons
nlp lecture-notes and source code
Stars: ✭ 29 (-14.71%)
Mutual labels:  lstm, attention
ConvLSTM-PyTorch
ConvLSTM/ConvGRU (Encoder-Decoder) with PyTorch on Moving-MNIST
Stars: ✭ 202 (+494.12%)
Mutual labels:  lstm, encoder-decoder
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (+276.47%)
Mutual labels:  lstm, encoder-decoder
Deep News Summarization
News summarization using sequence to sequence model with attention in TensorFlow.
Stars: ✭ 167 (+391.18%)
Mutual labels:  lstm, encoder-decoder
deep-trans
Transliterating English to Hindi using Recurrent Neural Networks
Stars: ✭ 44 (+29.41%)
Mutual labels:  lstm, sequence-to-sequence

Image Captioning using attention models

Update: 26th July 2020

Blog published at Weights and Biases: https://app.wandb.ai/authors/image-captioning/reports/Generate-Meaningful-Captions-for-Images-with-Attention-Models--VmlldzoxNzg0ODA

Code for image captioning is now available in TF 2.0: tensorflow-image-captioning.ipynb

Presented at Data Hack Summit 2019 and 3rd Kaggle Days Meetup Bangalore - Senior Track

[Image: Data Hack Summit and Kaggle Days Meetup presentations]

Task

[Image: task overview]

Training Data

[Image: training data example]

Attention weight calculation

[Image: attention weight calculation]

Context Vector

[Image: context vector computation]
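The figures above are not reproduced here, but the two quantities they illustrate are the standard ones in additive (Bahdanau-style) attention, which the referenced PyTorch tutorial also uses: attention weights are a softmax over scores e_i = v^T tanh(W_enc h_i + W_dec s_t), and the context vector is the weights-weighted sum of the encoder features. A minimal NumPy sketch (all dimensions and parameter names below are illustrative assumptions, not taken from this repository's code):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical dimensions: 64 image regions with 256-d encoder features,
# a 512-d decoder (LSTM) hidden state, and a 128-d attention space.
num_regions, enc_dim, dec_dim, attn_dim = 64, 256, 512, 128

rng = np.random.default_rng(0)
features = rng.normal(size=(num_regions, enc_dim))  # encoder outputs h_i
hidden = rng.normal(size=(dec_dim,))                # decoder state s_t

# Learned parameters (randomly initialised here purely for illustration)
W_enc = rng.normal(size=(enc_dim, attn_dim))
W_dec = rng.normal(size=(dec_dim, attn_dim))
v = rng.normal(size=(attn_dim,))

# Attention weights: e_i = v^T tanh(W_enc h_i + W_dec s_t), alpha = softmax(e)
scores = np.tanh(features @ W_enc + hidden @ W_dec) @ v
weights = softmax(scores)       # non-negative, sums to 1 over regions

# Context vector: weighted sum of encoder features
context = weights @ features    # shape (enc_dim,)
```

At each decoding step the context vector is concatenated with the word embedding and fed to the LSTM, so the decoder attends to different image regions for different words.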

Reference repository used while coding:

https://github.com/sgrvinod/a-PyTorch-Tutorial-to-Image-Captioning

Modification made: shared weights are used when calculating attention weights.
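The note above does not spell out what "shared weights" means. One possible reading, sketched below purely as an illustration and not as this repository's actual implementation, is to replace the two separate projections of standard additive attention with a single matrix applied to both the encoder features and the decoder state; this assumes the two have the same dimensionality:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Assumed dimensions: encoder features and decoder state share dim 256
num_regions, dim, attn_dim = 64, 256, 128
rng = np.random.default_rng(1)
features = rng.normal(size=(num_regions, dim))  # encoder features h_i
hidden = rng.normal(size=(dim,))                # decoder state s_t

# One shared projection instead of separate W_enc / W_dec (hypothetical)
W_shared = rng.normal(size=(dim, attn_dim))
v = rng.normal(size=(attn_dim,))

scores = np.tanh(features @ W_shared + hidden @ W_shared) @ v
weights = softmax(scores)
context = weights @ features
```

Sharing the projection halves the attention module's parameter count at the cost of coupling the encoder and decoder representations.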

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].