oaqa / Flexneuart

Licence: other
Flexible classic and NeurAl Retrieval Toolkit

Programming Languages

java
68154 projects - #9 most used programming language

Projects that are alternatives of or similar to Flexneuart

COVID19-IRQA
No description or website provided.
Stars: ✭ 32 (-67.68%)
Mutual labels:  information-retrieval, question-answering
text2text
Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+89.9%)
Mutual labels:  information-retrieval, question-answering
ProQA
Progressively Pretrained Dense Corpus Index for Open-Domain QA and Information Retrieval
Stars: ✭ 44 (-55.56%)
Mutual labels:  information-retrieval, question-answering
Clicr
Machine reading comprehension on clinical case reports
Stars: ✭ 123 (+24.24%)
Mutual labels:  question-answering, neural-networks
Drl4nlp.scratchpad
Notes on Deep Reinforcement Learning for Natural Language Processing papers
Stars: ✭ 26 (-73.74%)
Mutual labels:  information-retrieval, neural-networks
Dan Jurafsky Chris Manning Nlp
My solution to the Natural Language Processing course made by Dan Jurafsky, Chris Manning in Winter 2012.
Stars: ✭ 124 (+25.25%)
Mutual labels:  question-answering, information-retrieval
cherche
📑 Neural Search
Stars: ✭ 196 (+97.98%)
Mutual labels:  information-retrieval, question-answering
HAR
Code for WWW2019 paper "A Hierarchical Attention Retrieval Model for Healthcare Question Answering"
Stars: ✭ 22 (-77.78%)
Mutual labels:  information-retrieval, question-answering
Awesome Neural Models For Semantic Match
A curated list of papers dedicated to neural text (semantic) matching.
Stars: ✭ 669 (+575.76%)
Mutual labels:  question-answering, information-retrieval
Cdqa
⛔ [NOT MAINTAINED] An End-To-End Closed Domain Question Answering System.
Stars: ✭ 500 (+405.05%)
Mutual labels:  question-answering, information-retrieval
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+3343.43%)
Mutual labels:  question-answering, information-retrieval
Bert Vietnamese Question Answering
Vietnamese question answering system with BERT
Stars: ✭ 57 (-42.42%)
Mutual labels:  question-answering, information-retrieval
Chatbot
A Russian-language chatbot
Stars: ✭ 106 (+7.07%)
Mutual labels:  question-answering, neural-networks
FinBERT-QA
Financial Domain Question Answering with pre-trained BERT Language Model
Stars: ✭ 70 (-29.29%)
Mutual labels:  information-retrieval, question-answering
cdQA-ui
⛔ [NOT MAINTAINED] A web interface for cdQA and other question answering systems.
Stars: ✭ 19 (-80.81%)
Mutual labels:  information-retrieval, question-answering
Knowledge Graphs
A collection of research on knowledge graphs
Stars: ✭ 845 (+753.54%)
Mutual labels:  question-answering, information-retrieval
Bidaf Keras
Bidirectional Attention Flow for Machine Comprehension implemented in Keras 2
Stars: ✭ 60 (-39.39%)
Mutual labels:  question-answering, neural-networks
Riddle
Race and ethnicity Imputation from Disease history with Deep LEarning
Stars: ✭ 91 (-8.08%)
Mutual labels:  neural-networks
Factorized Tdnn
PyTorch implementation of the Factorized TDNN (TDNN-F) from "Semi-Orthogonal Low-Rank Matrix Factorization for Deep Neural Networks" and Kaldi
Stars: ✭ 98 (-1.01%)
Mutual labels:  neural-networks
Forte
Forte is a flexible and powerful NLP builder FOR TExt. This is part of the CASL project: http://casl-project.ai/
Stars: ✭ 89 (-10.1%)
Mutual labels:  information-retrieval

FlexNeuART (flex-noo-art)

Flexible classic and NeurAl Retrieval Toolkit, or FlexNeuART for short (intended pronunciation: flex-noo-art), is a substantially reworked knn4qa package. An overview can be found in our EMNLP OSS workshop paper: Leonid Boytsov, Eric Nyberg. "Flexible retrieval with NMSLIB and FlexNeuART," 2020.

In Aug-Dec 2020, we used this framework to generate the best traditional and/or neural runs in the MSMARCO Document ranking task. In fact, our best traditional (non-neural) run slightly outperformed a couple of neural submissions. The code for the best-performing neural model will be published within 2-3 months. This model is described in our ECIR 2021 paper: Boytsov, Leonid, and Zico Kolter. "Exploring Classic and Neural Lexical Translation Models for Information Retrieval: Interpretability, Effectiveness, and Efficiency Benefits." ECIR 2021.
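The "traditional (non-neural)" runs mentioned above are built on classic lexical ranking functions such as BM25. As a general illustration of that family of scorers (a minimal sketch of the standard Okapi BM25 formula, not FlexNeuART's actual implementation), one document can be scored against a query like this:

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, doc_freqs, num_docs, avg_doc_len,
               k1=1.2, b=0.75):
    """Okapi BM25 score of one document for a query.

    doc_freqs maps a term to the number of documents containing it;
    k1 and b are the usual saturation / length-normalization knobs.
    """
    tf = Counter(doc_terms)
    dl = len(doc_terms)
    score = 0.0
    for term in query_terms:
        df = doc_freqs.get(term, 0)
        if df == 0:
            continue  # term absent from the collection contributes nothing
        idf = math.log(1 + (num_docs - df + 0.5) / (df + 0.5))
        denom = tf[term] + k1 * (1 - b + b * dl / avg_doc_len)
        score += idf * tf[term] * (k1 + 1) / denom
    return score
```

With length normalization (b > 0), a shorter document matching the same query term scores higher than a longer one, which is the behavior production lexical rankers rely on.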

FlexNeuART is under active development. A more detailed description and documentation are to appear. Currently we have:

For neural network training, FlexNeuART incorporates a reworked variant of CEDR (MacAvaney et al., 2019).
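CEDR feeds contextualized (BERT) token embeddings into classic neural IR architectures such as KNRM, whose core operation is kernel pooling over a query-document token similarity matrix. A minimal pure-Python sketch of that pooling step (the kernel centers and width below are illustrative assumptions, not FlexNeuART's configuration):

```python
import math

def rbf_kernel_features(sim_matrix, mus=(1.0, 0.7, 0.4, 0.1), sigma=0.1):
    """KNRM-style kernel pooling.

    sim_matrix has one row per query token, one column per document token,
    with entries being token-token similarities (e.g. cosine). Each RBF
    kernel, centered at one mu, soft-counts similarities near that value;
    the result is a fixed-size feature vector, one feature per kernel.
    """
    features = []
    for mu in mus:
        total = 0.0
        for row in sim_matrix:  # one row per query token
            # soft count of document tokens whose similarity is close to mu
            k = sum(math.exp(-((s - mu) ** 2) / (2 * sigma ** 2)) for s in row)
            total += math.log(1.0 + k)  # log squashes per-token counts
        features.append(total)
    return features
```

In a full model, these features would be fed to a small learned scoring layer; here they merely illustrate how a variable-size similarity matrix becomes a fixed-size input.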
