
205 open source projects that are alternatives to, or similar to, ark-nlp

parsbert-ner
🤗 ParsBERT Persian NER Tasks
Stars: ✭ 15 (-93.53%)
Mutual labels:  bert
bert for corrector
Chinese text error correction based on BERT
Stars: ✭ 199 (-14.22%)
Mutual labels:  bert
label-studio-transformers
Label data using HuggingFace's transformers and automatically get a prediction service
Stars: ✭ 117 (-49.57%)
Mutual labels:  bert
Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-87.93%)
Mutual labels:  bert
muse-as-service
REST API for sentence tokenization and embedding using Multilingual Universal Sentence Encoder.
Stars: ✭ 45 (-80.6%)
Mutual labels:  bert
BiaffineDependencyParsing
BERT + self-attention encoder; biaffine decoder; PyTorch implementation
Stars: ✭ 67 (-71.12%)
Mutual labels:  bert
BERT-BiLSTM-CRF
Keras implementation of BERT-BiLSTM-CRF
Stars: ✭ 40 (-82.76%)
Mutual labels:  bert
BERT-QE
Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
Stars: ✭ 43 (-81.47%)
Mutual labels:  bert
bert-movie-reviews-sentiment-classifier
Build a Movie Reviews Sentiment Classifier with Google's BERT Language Model
Stars: ✭ 12 (-94.83%)
Mutual labels:  bert
korpatbert
KorPatBERT, a Korean AI language model specialized for the patent domain
Stars: ✭ 48 (-79.31%)
Mutual labels:  bert
consistency
Implementation of models in our EMNLP 2019 paper: A Logic-Driven Framework for Consistency of Neural Models
Stars: ✭ 26 (-88.79%)
Mutual labels:  bert
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (-9.91%)
Mutual labels:  bert
ganbert-pytorch
Enhancing the BERT training with Semi-supervised Generative Adversarial Networks in Pytorch/HuggingFace
Stars: ✭ 60 (-74.14%)
Mutual labels:  bert
R-AT
Regularized Adversarial Training
Stars: ✭ 19 (-91.81%)
Mutual labels:  bert
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+1118.97%)
Mutual labels:  bert
GEANet-BioMed-Event-Extraction
Code for the paper Biomedical Event Extraction with Hierarchical Knowledge Graphs
Stars: ✭ 52 (-77.59%)
Mutual labels:  bert
tfbert
Pre-trained model usage based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
Stars: ✭ 54 (-76.72%)
Mutual labels:  bert
datagrand bert
5th-place code for the 2019 Datagrand Cup information extraction competition
Stars: ✭ 20 (-91.38%)
Mutual labels:  bert
roberta-wwm-base-distill
A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large
Stars: ✭ 61 (-73.71%)
Mutual labels:  bert
bert attn viz
Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (-82.33%)
Mutual labels:  bert
Tianchi2020ChineseMedicineQuestionGeneration
2020 Alibaba Cloud Tianchi Big Data Competition: Traditional Chinese Medicine Literature Question Generation Challenge
Stars: ✭ 20 (-91.38%)
Mutual labels:  bert
LAMB Optimizer TF
LAMB Optimizer for Large Batch Training (TensorFlow version)
Stars: ✭ 119 (-48.71%)
Mutual labels:  bert
AnnA Anki neuronal Appendix
Using machine learning on your Anki collection to enhance scheduling via semantic clustering and semantic similarity
Stars: ✭ 39 (-83.19%)
Mutual labels:  bert
embedding study
Generating character embeddings from Chinese pre-trained models and evaluating the Chinese-language performance of BERT and ELMo
Stars: ✭ 94 (-59.48%)
Mutual labels:  bert
COVID-19-Tweet-Classification-using-Roberta-and-Bert-Simple-Transformers
Rank 1 / 216
Stars: ✭ 24 (-89.66%)
Mutual labels:  bert
les-military-mrc-rank7
LES Cup: the 2nd National "Military Intelligence Machine Reading" Challenge - Rank 7 solution
Stars: ✭ 37 (-84.05%)
Mutual labels:  bert
NLP-paper
🎨 NLP (natural language processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-90.09%)
Mutual labels:  bert
NLPDataAugmentation
Chinese NLP Data Augmentation, BERT Contextual Augmentation
Stars: ✭ 94 (-59.48%)
Mutual labels:  bert
bert-AAD
Adversarial Adaptation with Distillation for BERT Unsupervised Domain Adaptation
Stars: ✭ 27 (-88.36%)
Mutual labels:  bert
neuro-comma
🇷🇺 Punctuation restoration production-ready model for Russian language 🇷🇺
Stars: ✭ 46 (-80.17%)
Mutual labels:  bert
NAG-BERT
[EACL'21] Non-Autoregressive Text Generation with Pretrained Language Models
Stars: ✭ 47 (-79.74%)
Mutual labels:  bert
wisdomify
A BERT-based reverse dictionary of Korean proverbs
Stars: ✭ 95 (-59.05%)
Mutual labels:  bert
BERT-embedding
A simple wrapper class for extracting features (embeddings) and comparing them using BERT in TensorFlow
Stars: ✭ 24 (-89.66%)
Mutual labels:  bert
OpenUE
OpenUE is a lightweight toolkit for knowledge graph extraction (An Open Toolkit for Universal Extraction from Text, published at EMNLP 2020: https://aclanthology.org/2020.emnlp-demos.1.pdf)
Stars: ✭ 274 (+18.1%)
Mutual labels:  bert
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
Stars: ✭ 2,317 (+898.71%)
Mutual labels:  bert
ChineseNER
All about Chinese NER
Stars: ✭ 241 (+3.88%)
Mutual labels:  bert
contextualSpellCheck
✔️Contextual word checker for better suggestions
Stars: ✭ 274 (+18.1%)
Mutual labels:  bert
gpl
Powerful unsupervised domain adaptation method for dense retrieval. Requires only an unlabeled corpus and yields large improvements: "GPL: Generative Pseudo Labeling for Unsupervised Domain Adaptation of Dense Retrieval" https://arxiv.org/abs/2112.07577
Stars: ✭ 216 (-6.9%)
Mutual labels:  bert
sticker2
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-93.97%)
Mutual labels:  bert
bert-sentiment
Fine-grained Sentiment Classification Using BERT
Stars: ✭ 49 (-78.88%)
Mutual labels:  bert
task-transferability
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (-84.91%)
Mutual labels:  bert
TradeTheEvent
Implementation of "Trade the Event: Corporate Events Detection for News-Based Event-Driven Trading." In Findings of ACL2021
Stars: ✭ 64 (-72.41%)
Mutual labels:  bert
SA-BERT
CIKM 2020: Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots
Stars: ✭ 71 (-69.4%)
Mutual labels:  bert
FinBERT-QA
Financial Domain Question Answering with pre-trained BERT Language Model
Stars: ✭ 70 (-69.83%)
Mutual labels:  bert
Pytorch-NLU
Pytorch-NLU, a Chinese text classification and sequence labeling toolkit. Supports multi-class and multi-label classification of Chinese long and short text, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (-34.91%)
Mutual labels:  bert
npo classifier
Automated coding using machine-learning and remapping the U.S. nonprofit sector: A guide and benchmark
Stars: ✭ 18 (-92.24%)
Mutual labels:  bert
Romanian-Transformers
This repo is the home of Romanian Transformers.
Stars: ✭ 60 (-74.14%)
Mutual labels:  bert
AiSpace
AiSpace: Better practices for deep learning model development and deployment for TensorFlow 2.0
Stars: ✭ 28 (-87.93%)
Mutual labels:  bert
berserker
Berserker - BERt chineSE woRd toKenizER
Stars: ✭ 17 (-92.67%)
Mutual labels:  bert
GLUE-bert4keras
GLUE benchmark code based on bert4keras
Stars: ✭ 59 (-74.57%)
Mutual labels:  bert
banglabert
This repository contains the official release of the model "BanglaBERT" and associated downstream finetuning code and datasets introduced in the paper titled "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla" accpeted in Findings of the Annual Conference of the North American Chap…
Stars: ✭ 186 (-19.83%)
Mutual labels:  bert
protonet-bert-text-classification
Fine-tune BERT for small-dataset text classification in a few-shot learning manner using ProtoNet
Stars: ✭ 28 (-87.93%)
Mutual labels:  bert
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-76.72%)
Mutual labels:  bert
BertSimilarity
Computing the similarity of two sentences with Google's BERT. Semantic and text similarity computation.
Stars: ✭ 348 (+50%)
Mutual labels:  bert
LMMS
Language Modelling Makes Sense - WSD (and more) with Contextual Embeddings
Stars: ✭ 79 (-65.95%)
Mutual labels:  bert
GoEmotions-pytorch
Pytorch Implementation of GoEmotions 😍😢😱
Stars: ✭ 95 (-59.05%)
Mutual labels:  bert
DeepNER
An easy-to-use, modular, and extensible package of deep-learning-based Named Entity Recognition models.
Stars: ✭ 9 (-96.12%)
Mutual labels:  bert
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-81.03%)
Mutual labels:  bert
BERT-for-Chinese-Question-Answering
No description or website provided.
Stars: ✭ 75 (-67.67%)
Mutual labels:  bert
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (-1.29%)
Mutual labels:  bert
61-120 of 205 similar projects