
MurtyShikhar / Question Answering

TensorFlow implementation of Match-LSTM and Answer Pointer for the popular SQuAD dataset.

Programming Languages

python

Projects that are alternatives of or similar to Question Answering

Reading Comprehension Question Answering Papers
Survey on Machine Reading Comprehension
Stars: ✭ 101 (-24.06%)
Mutual labels:  question-answering
Dynamic Coattention Network Plus
Dynamic Coattention Network Plus (DCN+) TensorFlow implementation. Question answering using Deep NLP.
Stars: ✭ 117 (-12.03%)
Mutual labels:  question-answering
Midi Rnn
Generate monophonic melodies with machine learning using a basic LSTM RNN
Stars: ✭ 124 (-6.77%)
Mutual labels:  lstm-neural-networks
Ama
[[I'm slow at replying these days, but I hope to get back to answering questions eventually]] Ask me anything!
Stars: ✭ 102 (-23.31%)
Mutual labels:  question-answering
Tableqa
AI tool for querying tabular data in natural language.
Stars: ✭ 109 (-18.05%)
Mutual labels:  question-answering
Clicr
Machine reading comprehension on clinical case reports
Stars: ✭ 123 (-7.52%)
Mutual labels:  question-answering
Aspect Based Sentiment Analysis
Aspect Based Sentiment Analysis
Stars: ✭ 99 (-25.56%)
Mutual labels:  lstm-neural-networks
Medquad
Medical Question Answering Dataset of 47,457 QA pairs created from 12 NIH websites
Stars: ✭ 129 (-3.01%)
Mutual labels:  question-answering
Bi Att Flow
The Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
Stars: ✭ 1,472 (+1006.77%)
Mutual labels:  question-answering
Dan Jurafsky Chris Manning Nlp
My solutions to the Natural Language Processing course taught by Dan Jurafsky and Chris Manning in Winter 2012.
Stars: ✭ 124 (-6.77%)
Mutual labels:  question-answering
Repo 2016
R, Python and Mathematica Codes in Machine Learning, Deep Learning, Artificial Intelligence, NLP and Geolocation
Stars: ✭ 103 (-22.56%)
Mutual labels:  lstm-neural-networks
Video2description
Video to Text: Generates description in natural language for given video (Video Captioning)
Stars: ✭ 107 (-19.55%)
Mutual labels:  lstm-neural-networks
Dynamic Memory Networks Plus Pytorch
Implementation of Dynamic memory networks plus in Pytorch
Stars: ✭ 123 (-7.52%)
Mutual labels:  question-answering
Deep Generation
In this project I used a recurrent neural network to generate C code, based on a dataset of C files from the Linux repository.
Stars: ✭ 101 (-24.06%)
Mutual labels:  lstm-neural-networks
Understanding Pytorch Batching Lstm
Understanding and visualizing PyTorch Batching with LSTM
Stars: ✭ 125 (-6.02%)
Mutual labels:  lstm-neural-networks
Flexneuart
Flexible classic and NeurAl Retrieval Toolkit
Stars: ✭ 99 (-25.56%)
Mutual labels:  question-answering
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+2463.16%)
Mutual labels:  question-answering
Kbqa Ar Smcnn
Question answering over Freebase (single-relation)
Stars: ✭ 129 (-3.01%)
Mutual labels:  question-answering
Ml Projects
ML based projects such as Spam Classification, Time Series Analysis, Text Classification using Random Forest, Deep Learning, Bayesian, Xgboost in Python
Stars: ✭ 127 (-4.51%)
Mutual labels:  lstm-neural-networks
Knowledge Aware Reader
PyTorch implementation of the ACL 2019 paper "Improving Question Answering over Incomplete KBs with Knowledge-Aware Reader"
Stars: ✭ 123 (-7.52%)
Mutual labels:  question-answering

Match-LSTM and Answer Pointer (Wang and Jiang, ICLR 2017)

This repo attempts to reproduce the Match-LSTM and Answer Pointer experiments from the paper above. Much of the preprocessing boilerplate code is taken from Stanford CS224D.
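
For context on the second component: the boundary variant of the Answer Pointer reduces answer extraction to predicting a start and an end distribution over passage positions, then choosing the span that maximizes their joint probability. Below is a minimal NumPy sketch of that decoding step; it illustrates the idea from the paper and is not code from this repo (the function name and the max_answer_len cutoff are illustrative).

import numpy as np

def best_span(p_start, p_end, max_answer_len=15):
    # Exhaustively score every span (s, e) with s <= e < s + max_answer_len
    # and return the one maximizing p_start[s] * p_end[e].
    best_score, best = -1.0, (0, 0)
    for s in range(len(p_start)):
        for e in range(s, min(s + max_answer_len, len(p_end))):
            score = p_start[s] * p_end[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

# Toy distributions over a 4-token passage: the best span is tokens 1..2.
p_start = np.array([0.1, 0.6, 0.2, 0.1])
p_end = np.array([0.1, 0.1, 0.7, 0.1])
print(best_span(p_start, p_end))  # -> (1, 2)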

The core of the model is in qa_model.py. I had to modify TensorFlow's original attention mechanism implementation for the code to be correct. Run train.py to train the model and qa_answer.py to generate answers for a given set of paragraphs. Contact me at [email protected] for more information.

This code also serves as an example of how TensorFlow's attention mechanism can be wired together; as of August 13th, 2017, no such example was available anywhere else.
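
For orientation, here is a minimal sketch of that wiring, assuming the TensorFlow 1.x tf.contrib.seq2seq API that a 2017-era repo like this one targets. The tensor names and shapes are illustrative, not taken from qa_model.py.

import tensorflow as tf

hidden_size = 100

# Encoded question tokens act as the attention "memory"; the passage is
# then re-read by an attention-wrapped LSTM (the Match-LSTM idea).
question_states = tf.placeholder(tf.float32, [None, 30, hidden_size])
question_lengths = tf.placeholder(tf.int32, [None])
passage_states = tf.placeholder(tf.float32, [None, 300, hidden_size])

attention = tf.contrib.seq2seq.BahdanauAttention(
    num_units=hidden_size,
    memory=question_states,
    memory_sequence_length=question_lengths)

match_cell = tf.contrib.seq2seq.AttentionWrapper(
    tf.nn.rnn_cell.BasicLSTMCell(hidden_size),
    attention,
    attention_layer_size=hidden_size)

# Each passage timestep attends over the question while being encoded.
match_states, _ = tf.nn.dynamic_rnn(match_cell, passage_states, dtype=tf.float32)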

Preprocessing

Before training, you're going to want to do some preprocessing of the data. Run the following from the command line:

$ python preprocessing/dwr.py
$ python preprocessing/squad_preprocess.py
$ python qa_data.py

The last step can take a bit of time (~30 minutes).

Training

After preprocessing is complete, you can train your model by running the following command:

$ python train.py

Note that depending on your configs, this model can train for a very long time! With the default configs, training on a modern laptop takes roughly 2 hours per epoch, and the default is 30 epochs (~60 hours total).

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].