co-attention - PyTorch implementation of "Dynamic Coattention Networks For Question Answering"
Stars: ✭ 54 (-83.02%)
qa - TensorFlow Models for the Stanford Question Answering Dataset
Stars: ✭ 72 (-77.36%)
FastFusionNet - A PyTorch implementation of FastFusionNet on SQuAD 1.1
Stars: ✭ 38 (-88.05%)
Medi-CoQA - Conversational Question Answering on Clinical Text
Stars: ✭ 22 (-93.08%)
PersianQA - Persian (Farsi) Question Answering Dataset (+ Models)
Stars: ✭ 114 (-64.15%)
extractive rc by runtime mt - Code and datasets of "Multilingual Extractive Reading Comprehension by Runtime Machine Translation"
Stars: ✭ 36 (-88.68%)
Squad - Dockerfile for automated build of a Squad gameserver: https://hub.docker.com/r/cm2network/squad/
Stars: ✭ 21 (-93.4%)
Transformer-QG-on-SQuAD - Implements a question generator with SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-91.19%)
diwako dui - A UI showing unit positions and names of units in your squad
Stars: ✭ 39 (-87.74%)
Dawn Bench Entries - DAWNBench: An End-to-End Deep Learning Benchmark and Competition
Stars: ✭ 254 (-20.13%)
R Net - R-net in PyTorch, with ELMo
Stars: ✭ 194 (-38.99%)
X3daudio1 7 hrtf - HRTF for Arma 3, Skyrim, and other titles that use XAudio2 + X3DAudio
Stars: ✭ 192 (-39.62%)
R Net In Keras - Open R-NET implementation and detailed analysis: https://git.io/vd8dx
Stars: ✭ 181 (-43.08%)
Albert Tf2.0 - ALBERT model pretraining and fine-tuning using TF 2.0
Stars: ✭ 180 (-43.4%)
Allure - Allure of the Stars is a near-future sci-fi roguelike and tactical squad combat game written in Haskell; please offer feedback, e.g., after trying out the web frontend version at
Stars: ✭ 149 (-53.14%)
Mnemonicreader - A PyTorch implementation of Mnemonic Reader for the machine comprehension task
Stars: ✭ 137 (-56.92%)
Haystack - 🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+972.01%)
Bi Att Flow - Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
Stars: ✭ 1,472 (+362.89%)
Match Lstm - A PyTorch implementation of Match-LSTM, R-NET and M-Reader for Machine Reading Comprehension
Stars: ✭ 92 (-71.07%)
Bidaf Pytorch - An implementation of Bidirectional Attention Flow
Stars: ✭ 42 (-86.79%)
Qanet - A TensorFlow implementation of QANet for machine reading comprehension
Stars: ✭ 996 (+213.21%)
Fusionnet - My implementation of FusionNet for machine comprehension
Stars: ✭ 29 (-90.88%)
Awesome Qa - 😎 A curated list of Question Answering (QA) resources
Stars: ✭ 596 (+87.42%)
R Net - TensorFlow implementation of R-Net
Stars: ✭ 582 (+83.02%)
Lambdahack - Haskell game engine library for roguelike dungeon crawlers; please offer feedback, e.g., after trying out the sample game with the web frontend at
Stars: ✭ 439 (+38.05%)
Drqa - A PyTorch implementation of "Reading Wikipedia to Answer Open-Domain Questions"
Stars: ✭ 378 (+18.87%)
R Net - A TensorFlow implementation of R-Net: machine reading comprehension with self-matching networks
Stars: ✭ 321 (+0.94%)