FasterTransformer: Transformer-related optimization, including BERT, GPT
Stars: ✭ 1,571 (+11121.43%)
deepnlp: NLP projects written for practice in my early days of learning
Stars: ✭ 11 (-21.43%)
Transformers: 🤗 State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX (a minimal usage sketch follows below).
Stars: ✭ 55,742 (+398057.14%)
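As a quick illustration of how the 🤗 Transformers `pipeline` API is typically used, here is a minimal sketch; the default checkpoint it downloads and the exact output scores are illustrative only.

```python
# Minimal sketch: sentiment analysis with the Hugging Face Transformers pipeline API.
# Which default checkpoint gets downloaded is an assumption of this example.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers makes state-of-the-art NLP easy to use.")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```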
NLP-paper: 🎨 Natural language processing tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (+64.29%)
syntaxdot: Neural syntax annotator, supporting sequence labeling, lemmatization, and dependency parsing.
Stars: ✭ 32 (+128.57%)
bert-as-a-service TFX: End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (+128.57%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+314.29%)
semantic-document-relations: Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (+50%)
Bertviz: Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+24492.86%)
Kevinpro-NLP-demo: All the NLP you need here. Personal implementations of fun NLP demos, currently including PyTorch implementations of 13 NLP applications.
Stars: ✭ 117 (+735.71%)
bert in a flask: A dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0 (see the sketch below).
Stars: ✭ 32 (+128.57%)
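To make the serving pattern this project describes concrete, here is a minimal hypothetical sketch of a Flask endpoint in front of a BERT-family model; the `/predict` route, port, and the use of a Transformers pipeline instead of raw TensorFlow 2.0 code are assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: a Flask endpoint serving BERT-style sentiment predictions.
# The /predict route and the Transformers pipeline are illustrative choices only.
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)
classifier = pipeline("sentiment-analysis")  # loads a default sentiment model

@app.route("/predict", methods=["POST"])
def predict():
    text = request.get_json(force=True).get("text", "")
    return jsonify(classifier(text))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```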
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+285.71%)
Spark NLP: State of the Art Natural Language Processing
Stars: ✭ 2,518 (+17885.71%)
SIGIR2021 Conure: One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (+64.29%)
TabFormer: Code & data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021)
Stars: ✭ 209 (+1392.86%)
PDN: The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (+214.29%)
Filipino-Text-Benchmarks: Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (+57.14%)
golgotha: Contextualised embeddings and language modelling with BERT and friends, from R
Stars: ✭ 39 (+178.57%)
Bert Pytorch: Google AI 2018 BERT PyTorch implementation
Stars: ✭ 4,642 (+33057.14%)
Nlp Tutorial: Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+70578.57%)
datalinguist: Stanford CoreNLP in idiomatic Clojure.
Stars: ✭ 93 (+564.29%)
sister: SImple SenTence EmbeddeR
Stars: ✭ 66 (+371.43%)
vietnamese-roberta: A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (+57.14%)
question generator: An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+1242.86%)
CAIL: Entry model for the CAIL 2019 (Challenge of AI in Law) reading comprehension task
Stars: ✭ 34 (+142.86%)
Udacity: This repo includes all the projects I have finished in the Udacity Nanodegree programs
Stars: ✭ 57 (+307.14%)
keyword-transformer: Official implementation of the Keyword Transformer: https://arxiv.org/abs/2104.00769
Stars: ✭ 76 (+442.86%)
JointIDSF: BERT-based joint intent detection and slot filling with intent-slot attention mechanism (INTERSPEECH 2021)
Stars: ✭ 55 (+292.86%)
php-json-api: JSON API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 68 (+385.71%)
Transformer-QG-on-SQuAD: Implementation of a question generator using SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.); a generic generation sketch follows below.
Stars: ✭ 28 (+100%)
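The seq2seq generation loop behind such a question generator looks roughly as follows; the checkpoint name below is a placeholder, not a model released by this project, and the input format is likewise an assumption.

```python
# Hypothetical sketch of question generation with a seq2seq language model.
# "some-org/t5-question-generation" is a placeholder checkpoint name.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "some-org/t5-question-generation"  # placeholder, replace with a real QG model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

context = "SQuAD is a reading comprehension dataset built from Wikipedia articles."
inputs = tokenizer(context, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```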
ClusterTransformer: Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT base transformers from Hugging Face (see the sketch below).
Stars: ✭ 36 (+157.14%)
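The general technique named here, clustering texts by cosine similarity of Transformer embeddings, can be sketched as follows; the mean pooling and agglomerative clustering choices are assumptions for illustration, not ClusterTransformer's own API.

```python
# Illustrative sketch: topic clustering with Transformer embeddings and cosine distance.
# Mean pooling and agglomerative clustering are assumed choices, not ClusterTransformer's API.
import torch
from sklearn.cluster import AgglomerativeClustering  # scikit-learn >= 1.2 (metric= argument)
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state      # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1)       # ignore padding tokens
    return (hidden * mask).sum(1) / mask.sum(1)        # mean-pooled sentence vectors

texts = ["BERT is a language model", "GPT generates text", "Cats sleep all day"]
vectors = torch.nn.functional.normalize(embed(texts), dim=1).numpy()
labels = AgglomerativeClustering(n_clusters=2, metric="cosine", linkage="average").fit_predict(vectors)
print(labels)  # cluster id per input text
```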
DolboNet: Russian-language Discord chatbot built on the Transformer architecture
Stars: ✭ 53 (+278.57%)
neural-ranking-kd: Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
Stars: ✭ 74 (+428.57%)
POS-Taggers: Part-of-Speech Tagging Models in Python
Stars: ✭ 16 (+14.29%)
nlp-cheat-sheet-python: NLP cheat sheet in Python covering spaCy, LexNLP, NLTK, tokenization, stemming, sentence detection, and named entity recognition (a compact sketch follows below).
Stars: ✭ 69 (+392.86%)
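The topics this cheat sheet lists map onto a few lines of spaCy and NLTK; a compact sketch, assuming the `en_core_web_sm` model has been downloaded:

```python
# Compact sketch of the cheat-sheet topics: tokenization, sentence detection,
# named entity recognition (spaCy) and stemming (NLTK).
# Assumes `python -m spacy download en_core_web_sm` has been run.
import spacy
from nltk.stem import PorterStemmer

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is opening a new office in Berlin. The team starts hiring in June.")

tokens = [token.text for token in doc]                    # tokenization
sentences = [sent.text for sent in doc.sents]             # sentence detection
entities = [(ent.text, ent.label_) for ent in doc.ents]   # named entity recognition
stems = [PorterStemmer().stem(w) for w in ["running", "studies", "wolves"]]  # stemming

print(tokens, sentences, entities, stems, sep="\n")
```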
kospeech: Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+3157.14%)
backprop: Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+1535.71%)
CheXbert: Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
Stars: ✭ 51 (+264.29%)
TitleStylist: Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (+414.29%)
NMeCab: Japanese morphological analyzer on .NET
Stars: ✭ 65 (+364.29%)
ganbert: Enhancing BERT training with Semi-supervised Generative Adversarial Networks
Stars: ✭ 205 (+1364.29%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+835.71%)
BERT-QE: Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
Stars: ✭ 43 (+207.14%)
Sohu2019: Entry for the 2019 Sohu Campus Algorithm Competition
Stars: ✭ 26 (+85.71%)
TriB-QA: We take bragging seriously
Stars: ✭ 45 (+221.43%)
Transformer-MM-Explainability: [ICCV 2021, Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
Stars: ✭ 484 (+3357.14%)
Graphormer: A deep learning package that allows researchers and developers to train custom models for molecule modeling tasks. It aims to accelerate research and applications of AI for molecular science, such as material design and drug discovery.
Stars: ✭ 1,194 (+8428.57%)
SA-BERT: CIKM 2020: Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots
Stars: ✭ 71 (+407.14%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+764.29%)
svelte-jest: Jest Svelte component transformer
Stars: ✭ 37 (+164.29%)
korpatbert: KorPatBERT, a Korean AI language model specialized for the patent domain
Stars: ✭ 48 (+242.86%)