LAMB Optimizer TF: LAMB Optimizer for Large Batch Training (TensorFlow version)
Stars: ✭ 119 (+153.19%)
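The entry above refers to the LAMB optimizer for large-batch training. As a rough illustrative sketch (not the repository's code; the function name `lamb_step` and the hyperparameter defaults are assumptions), one LAMB update for a single parameter tensor combines Adam-style moments with a layer-wise trust ratio:

```python
import numpy as np

def lamb_step(w, g, m, v, lr=1e-3, beta1=0.9, beta2=0.999,
              eps=1e-6, weight_decay=0.01, t=1):
    """One simplified LAMB update for a single parameter tensor."""
    # Adam-style first and second moment estimates with bias correction
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Update direction, with decoupled weight decay added in
    r = m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w
    # Layer-wise trust ratio: scale the step by ||w|| / ||r||
    w_norm, r_norm = np.linalg.norm(w), np.linalg.norm(r)
    trust = w_norm / r_norm if w_norm > 0 and r_norm > 0 else 1.0
    return w - lr * trust * r, m, v
```

The trust ratio is what lets LAMB keep per-layer step sizes sensible at very large batch sizes, where a single global learning rate would over- or under-shoot some layers.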
oreilly-bert-nlp: This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-59.57%)
Sohu2019: 2019 Sohu Campus Algorithm Competition
Stars: ✭ 26 (-44.68%)
TypewriterCli: Typewriter .NET Core version with a command-line interface and single-file processing.
Stars: ✭ 20 (-57.45%)
neural-ranking-kd: Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
Stars: ✭ 74 (+57.45%)
korpatbert: KorPatBERT, a Korean AI language model specialized for the patent domain
Stars: ✭ 48 (+2.13%)
TwinBert: PyTorch implementation of the TwinBERT paper
Stars: ✭ 36 (-23.4%)
ganbert: Enhancing BERT training with semi-supervised generative adversarial networks
Stars: ✭ 205 (+336.17%)
ExpBERT: Code for our ACL '20 paper "Representation Engineering with Natural Language Explanations"
Stars: ✭ 28 (-40.43%)
JointIDSF: BERT-based joint intent detection and slot filling with an intent-slot attention mechanism (INTERSPEECH 2021)
Stars: ✭ 55 (+17.02%)
tools-generation-detection-synthetic-content: A compilation of state-of-the-art tools, articles, forums, and links of interest for generating and detecting any type of synthetic content using deep learning.
Stars: ✭ 107 (+127.66%)
DiscEval: Discourse-Based Evaluation of Language Understanding
Stars: ✭ 18 (-61.7%)
bert attn viz: Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-12.77%)
DE-LIMIT: DeEpLearning models for MultIlingual haTespeech (DELIMIT), benchmarking multilingual models across 9 languages and 16 datasets.
Stars: ✭ 90 (+91.49%)
question generator: An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+300%)
sister: SImple SenTence EmbeddeR
Stars: ✭ 66 (+40.43%)
CAIL: Entry model for the CAIL2019 (Challenge of AI in Law) reading comprehension competition
Stars: ✭ 34 (-27.66%)
neuro-comma: 🇷🇺 Production-ready punctuation restoration model for the Russian language 🇷🇺
Stars: ✭ 46 (-2.13%)
CheXbert: Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
Stars: ✭ 51 (+8.51%)
TriB-QA: We take bragging seriously.
Stars: ✭ 45 (-4.26%)
pixitar: 🧝 Pixitar is an avatar generation library written in Ruby.
Stars: ✭ 20 (-57.45%)
backprop: Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+387.23%)
Cartographer: Persistent multiplayer map of pins and stories.
Stars: ✭ 43 (-8.51%)
TabFormer: Code & data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021)
Stars: ✭ 209 (+344.68%)
clickbaiter: Generates clickbait tech headlines. Don't ask why.
Stars: ✭ 40 (-14.89%)
R-AT: Regularized Adversarial Training
Stars: ✭ 19 (-59.57%)
sticker2: Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-70.21%)
probabilistic nlg: TensorFlow implementation of "Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation" (NAACL 2019).
Stars: ✭ 28 (-40.43%)
banglabert: This repository contains the official release of the model "BanglaBERT" and associated downstream finetuning code and datasets introduced in the paper "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla", accepted in Findings of the Annual Conference of the North American Chap…
Stars: ✭ 186 (+295.74%)
beir: A heterogeneous benchmark for information retrieval. Easy to use; evaluate your models across 15+ diverse IR datasets.
Stars: ✭ 738 (+1470.21%)
cli-template: ⚗ The most advanced CLI template on earth! Featuring automatic releases, website generation, and a custom CI system out of the box.
Stars: ✭ 43 (-8.51%)
BERTOverflow: A pre-trained BERT on the StackOverflow corpus
Stars: ✭ 40 (-14.89%)
PromptPapers: Must-read papers on prompt-based tuning for pre-trained language models.
Stars: ✭ 2,317 (+4829.79%)
AliceMind: ALIbaba's Collection of Encoder-decoders from MinD (Machine IntelligeNce of Damo) Lab
Stars: ✭ 1,479 (+3046.81%)
BertSimilarity: Computing the similarity of two sentences with Google's BERT algorithm. Uses BERT to compute sentence similarity, covering semantic similarity and text similarity.
Stars: ✭ 348 (+640.43%)
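Tools like the one above typically encode each sentence into a fixed-size vector with BERT and then compare the vectors. A minimal sketch of the comparison step (the embeddings here are placeholders, not actual BERT outputs, and `cosine_similarity` is an assumed helper name):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two sentence-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# In a real pipeline, these vectors would come from a BERT encoder
# (e.g. a pooled [CLS] representation); here they are stand-ins.
emb_a = np.array([0.2, 0.8, 0.1])
emb_b = np.array([0.25, 0.75, 0.05])
score = cosine_similarity(emb_a, emb_b)  # close to 1.0 for similar sentences
```

Scores near 1.0 indicate semantically similar sentences, near 0.0 unrelated ones.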
Transformer-QG-on-SQuAD: Implementation of a question generator with SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-40.43%)
NLPDataAugmentation: Chinese NLP data augmentation, BERT contextual augmentation
Stars: ✭ 94 (+100%)
SA-BERT: Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots (CIKM 2020)
Stars: ✭ 71 (+51.06%)
Fill-the-GAP: [ACL-WS] 4th-place solution to the gendered pronoun resolution challenge on Kaggle
Stars: ✭ 13 (-72.34%)
Sceelix: Procedural generation software for automating 2D/3D content creation.
Stars: ✭ 98 (+108.51%)
generaptr: Generaptr is a Node package that helps when starting a project by generating boilerplate code for an Express API.
Stars: ✭ 16 (-65.96%)
wechsel: Code for "WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models"
Stars: ✭ 39 (-17.02%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+23.4%)
FasterTransformer: Transformer-related optimization, including BERT and GPT
Stars: ✭ 1,571 (+3242.55%)
BERT-QE: Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking"
Stars: ✭ 43 (-8.51%)