
menon92 / Banglatranslator

Licence: apache-2.0
Bangla Machine Translator

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Banglatranslator

Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+738.1%)
Mutual labels:  lstm, attention, encoder-decoder
dhs summit 2019 image captioning
Image captioning using attention models
Stars: ✭ 34 (+61.9%)
Mutual labels:  lstm, attention, encoder-decoder
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+16176.19%)
Mutual labels:  lstm, attention, encoder-decoder
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (+57.14%)
Mutual labels:  lstm, attention
Screenshot To Code
A neural network that transforms a design mock-up into a static website.
Stars: ✭ 13,561 (+64476.19%)
Mutual labels:  lstm, encoder-decoder
EBIM-NLI
Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (+14.29%)
Mutual labels:  lstm, attention
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (+104.76%)
Mutual labels:  lstm, attention
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-4.76%)
Mutual labels:  lstm, attention
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (+276.19%)
Mutual labels:  lstm, attention
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
Dual-stage attention mechanism model for stock prediction based on a relational news extraction method
Stars: ✭ 33 (+57.14%)
Mutual labels:  lstm, attention
Encoder decoder
Four styles of encoder decoder model by Python, Theano, Keras and Seq2Seq
Stars: ✭ 269 (+1180.95%)
Mutual labels:  attention, encoder-decoder
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+776.19%)
Mutual labels:  lstm, attention
Vad
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+2861.9%)
Mutual labels:  lstm, attention
learningspoons
nlp lecture-notes and source code
Stars: ✭ 29 (+38.1%)
Mutual labels:  lstm, attention
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (+771.43%)
Mutual labels:  lstm, attention
iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (+147.62%)
Mutual labels:  lstm, attention
Deep News Summarization
News summarization using sequence to sequence model with attention in TensorFlow.
Stars: ✭ 167 (+695.24%)
Mutual labels:  lstm, encoder-decoder
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (+719.05%)
Mutual labels:  lstm, attention
ConvLSTM-PyTorch
ConvLSTM/ConvGRU (Encoder-Decoder) with PyTorch on Moving-MNIST
Stars: ✭ 202 (+861.9%)
Mutual labels:  lstm, encoder-decoder
Crnn attention ocr chinese
CRNN with attention for OCR, with Chinese recognition added
Stars: ✭ 315 (+1400%)
Mutual labels:  lstm, attention

BanglaTranslator

Translates Bangla to English. The model is trained with an encoder-decoder architecture and an attention mechanism. This repository may be a starting point for approaching the Bangla machine translation problem; if it helps other people working on Bangla machine translation, I would be very grateful.
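
To illustrate the encoder-decoder-with-attention idea in TensorFlow 2, here is a minimal sketch of a Bahdanau-style additive attention layer. This is an illustration only; the exact layers used in this repository are defined in translator/models.py and may differ.

```python
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    """Additive attention: scores each encoder time step against the decoder state."""

    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)
        self.W2 = tf.keras.layers.Dense(units)
        self.V = tf.keras.layers.Dense(1)

    def call(self, query, values):
        # query: decoder hidden state, shape (batch, hidden)
        # values: encoder outputs, shape (batch, src_len, hidden)
        query_with_time_axis = tf.expand_dims(query, 1)
        score = self.V(tf.nn.tanh(self.W1(query_with_time_axis) + self.W2(values)))
        attention_weights = tf.nn.softmax(score, axis=1)   # (batch, src_len, 1)
        context_vector = tf.reduce_sum(attention_weights * values, axis=1)
        # The context vector is concatenated with the decoder input at each step.
        return context_vector, attention_weights
```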

Dataset

I use the dataset provided at http://www.manythings.org/anki/ben-eng.zip. It contains English-Bangla sentence pairs in the following format:

I'm counting on you.	আমি আপনার উপর নির্ভর করে আছি।
I want your opinion.	আমি আপনার মতামত চাই।
How is your daughter?	আপনার মেয়ে কেমন আছে?
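
For illustration only, pairs in this tab-separated format could be loaded with a small helper like the one below. The path data/ben-eng/ben.txt comes from the project tree in the next section; the repository's own loading code lives in translator/datasets.py and may differ.

```python
import io

def load_pairs(path="data/ben-eng/ben.txt", num_examples=None):
    """Read tab-separated English/Bangla pairs, ignoring any extra columns."""
    pairs = []
    with io.open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.strip().split("\t")
            if len(parts) >= 2:
                pairs.append((parts[0], parts[1]))  # (english, bangla)
            if num_examples and len(pairs) >= num_examples:
                break
    return pairs

pairs = load_pairs(num_examples=3)
print(pairs[0])  # e.g. ("I'm counting on you.", "আমি আপনার উপর নির্ভর করে আছি।")
```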

Project structure

BanglaTranslator
├── assets
│   └── banglafonts
│       └── Siyamrupali.ttf
├── data
│   ├── ben-eng
│   │   ├── _about.txt
│   │   └── ben.txt
├── docs
│   └── U0980.pdf
├── models
│   ├── input_language_tokenizer.json
│   ├── target_language_tokenizer.json
├── translator
│   ├── config.py
│   ├── datasets.py
│   ├── infer.py
│   ├── __init__.py
│   ├── models.py
│   ├── train.py
│   └── utils.py
├── infer-example.ipynb
├── README.md
└── training-example.ipynb
  • assets contains the Bangla font used in plotting
  • data contains the English-Bangla pair dataset
  • docs contains documentation of the Bangla Unicode code points and their character mappings
  • models contains the saved tokenizers and, if you run training, the training checkpoints (see the loading sketch after this list)
  • translator is the core of the project and contains all the scripts it requires
  • infer-example.ipynb is an example notebook showing how to predict on a single sentence using saved checkpoints
  • training-example.ipynb can be used to train the Bangla-to-English translator model
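
As a rough sketch of how the saved tokenizers under models/ could be reused (the repository's actual loading code is inside translator/, so treat the details as assumptions):

```python
import tensorflow as tf

# Restore the tokenizers saved during training (file names from the tree above).
with open("models/input_language_tokenizer.json", encoding="utf-8") as f:
    inp_tokenizer = tf.keras.preprocessing.text.tokenizer_from_json(f.read())
with open("models/target_language_tokenizer.json", encoding="utf-8") as f:
    targ_tokenizer = tf.keras.preprocessing.text.tokenizer_from_json(f.read())

# Turn a Bangla sentence into the padded integer sequence an encoder expects
# (assumes the input language is Bangla, as described above).
seq = inp_tokenizer.texts_to_sequences(["আপনার মেয়ে কেমন আছে?"])
seq = tf.keras.preprocessing.sequence.pad_sequences(seq, padding="post")
```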

Dependencies

python 3.7
tensorflow 2.x
matplotlib
sklearn
tqdm
jupyter notebook

Pre-trained model

If you just want to test the model, download the pre-trained model from the Google Drive link and extract training_checkpoints.zip under the models directory.
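
After extracting the archive, TensorFlow's checkpoint utilities can confirm the files landed in the expected place. The directory name below is an assumption based on the archive name; the actual restore logic (which objects the checkpoint tracks) lives in translator/infer.py and infer-example.ipynb.

```python
import tensorflow as tf

# Locate the newest checkpoint prefix after extracting training_checkpoints.zip.
latest = tf.train.latest_checkpoint("models/training_checkpoints")
print(latest)  # e.g. models/training_checkpoints/ckpt-<N>, or None if not found
```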

Test result

I tested the pre-trained model and got results like those shown below.

  • If you want to test it yourself, please check infer-example.ipynb and also download the pre-trained model
