All Projects → Tianchi2020ChineseMedicineQuestionGeneration → Similar Projects or Alternatives

279 open-source projects that are alternatives to, or similar to, Tianchi2020ChineseMedicineQuestionGeneration

Transformer-QG-on-SQuAD
Implementation of a question generator built on SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (+40%)
Mutual labels:  bert, question-generation, roberta
berserker
Berserker - BERt chineSE woRd toKenizER
Stars: ✭ 17 (-15%)
Mutual labels:  sequence-to-sequence, bert
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (+10%)
Mutual labels:  bert, roberta
Albert zh
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale Chinese pre-trained ALBERT models
Stars: ✭ 3,500 (+17400%)
Mutual labels:  bert, roberta
erc
Emotion recognition in conversation
Stars: ✭ 34 (+70%)
Mutual labels:  bert, roberta
KLUE
📖 Korean NLU Benchmark
Stars: ✭ 420 (+2000%)
Mutual labels:  bert, roberta
roberta-wwm-base-distill
A distilled RoBERTa-wwm-base model, distilled from RoBERTa-wwm-large
Stars: ✭ 61 (+205%)
Mutual labels:  bert, roberta
text2text
Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+840%)
Mutual labels:  bert, question-generation
CLUE pytorch
CLUE baselines in PyTorch (the PyTorch version of the CLUE baselines)
Stars: ✭ 72 (+260%)
Mutual labels:  bert, roberta
les-military-mrc-rank7
LES Cup: Rank 7 solution for the 2nd National "Military Intelligence Machine Reading" Challenge
Stars: ✭ 37 (+85%)
Mutual labels:  bert, roberta
Chinese Bert Wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
Stars: ✭ 6,357 (+31685%)
Mutual labels:  bert, roberta
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (+205%)
Mutual labels:  sequence-to-sequence, bert
Clue
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard
Stars: ✭ 2,425 (+12025%)
Mutual labels:  bert, roberta
text-generation-transformer
Text generation based on the Transformer architecture
Stars: ✭ 36 (+80%)
Mutual labels:  sequence-to-sequence, bert
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+17115%)
Mutual labels:  bert, roberta
Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
Stars: ✭ 38 (+90%)
Mutual labels:  bert, roberta
Roberta zh
RoBERTa pre-trained models for Chinese (RoBERTa for Chinese)
Stars: ✭ 1,953 (+9665%)
Mutual labels:  bert, roberta
beir
A heterogeneous benchmark for information retrieval: easy to use, with evaluation of your models across 15+ diverse IR datasets.
Stars: ✭ 738 (+3590%)
Mutual labels:  bert, question-generation
question generator
An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+840%)
Mutual labels:  bert, question-generation
HE2LaTeX
Converting handwritten equations to LaTeX
Stars: ✭ 84 (+320%)
Mutual labels:  sequence-to-sequence
BERT-embedding
A simple wrapper class for extracting features (embeddings) and comparing them using BERT in TensorFlow
Stars: ✭ 24 (+20%)
Mutual labels:  bert
BERT-BiLSTM-CRF
A Keras implementation of BERT-BiLSTM-CRF
Stars: ✭ 40 (+100%)
Mutual labels:  bert
consistency
Implementation of models in our EMNLP 2019 paper: A Logic-Driven Framework for Consistency of Neural Models
Stars: ✭ 26 (+30%)
Mutual labels:  bert
A-Persona-Based-Neural-Conversation-Model
No description or website provided.
Stars: ✭ 22 (+10%)
Mutual labels:  sequence-to-sequence
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (+180%)
Mutual labels:  bert
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+14040%)
Mutual labels:  bert
Transformer-Transducer
PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (+205%)
Mutual labels:  sequence-to-sequence
keras-transformer-xl
Transformer-XL with checkpoint loader
Stars: ✭ 65 (+225%)
Mutual labels:  transformer-xl
bert nli
A Natural Language Inference (NLI) model based on Transformers (BERT and ALBERT)
Stars: ✭ 97 (+385%)
Mutual labels:  bert
ai web RISKOUT BTS
A defense risk-management platform (🏅 Minister of National Defense Award)
Stars: ✭ 18 (-10%)
Mutual labels:  bert
explicit memory tracker
[ACL 2020] Explicit Memory Tracker with Coarse-to-Fine Reasoning for Conversational Machine Reading
Stars: ✭ 35 (+75%)
Mutual labels:  question-generation
anonymisation
Anonymization of legal cases (Fr) based on Flair embeddings
Stars: ✭ 85 (+325%)
Mutual labels:  bert
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+170%)
Mutual labels:  bert
task-transferability
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (+75%)
Mutual labels:  bert
NLP-Review-Scorer
Score your NLP paper review
Stars: ✭ 25 (+25%)
Mutual labels:  bert
bert-movie-reviews-sentiment-classifier
Build a Movie Reviews Sentiment Classifier with Google's BERT Language Model
Stars: ✭ 12 (-40%)
Mutual labels:  bert
Kevinpro-NLP-demo
All the NLP you need here: personal implementations of fun NLP demos, currently covering 13 NLP applications in PyTorch
Stars: ✭ 117 (+485%)
Mutual labels:  bert
rasa milktea chatbot
A Chinese chatbot built on the Rasa framework with BERT-based intent analysis
Stars: ✭ 97 (+385%)
Mutual labels:  bert
AnnA Anki neuronal Appendix
Using machine learning on your anki collection to enhance the scheduling via semantic clustering and semantic similarity
Stars: ✭ 39 (+95%)
Mutual labels:  bert
openroberta-lab
The programming environment »Open Roberta Lab« by Fraunhofer IAIS enables children and adolescents to program robots. A variety of different programming blocks are provided to program motors and sensors of the robot. Open Roberta Lab uses an approach of graphical programming so that beginners can seamlessly start coding. As a cloud-based applica…
Stars: ✭ 98 (+390%)
Mutual labels:  roberta
hard-label-attack
Natural Language Attacks in a Hard Label Black Box Setting.
Stars: ✭ 26 (+30%)
Mutual labels:  bert
NLP-paper
🎨 An NLP (natural language processing) tutorial: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (+15%)
Mutual labels:  bert
contextualSpellCheck
✔️ Contextual word checker for better suggestions
Stars: ✭ 274 (+1270%)
Mutual labels:  bert
Sequence-to-Sequence-Learning-of-Financial-Time-Series-in-Algorithmic-Trading
My bachelor's thesis: analyzing the application of LSTM-based RNNs to financial markets. 🤓
Stars: ✭ 64 (+220%)
Mutual labels:  sequence-to-sequence
textwiser
[AAAI 2021] TextWiser: Text Featurization Library
Stars: ✭ 26 (+30%)
Mutual labels:  bert
bert-tensorflow-pytorch-spacy-conversion
Instructions for how to convert a BERT Tensorflow model to work with HuggingFace's pytorch-transformers, and spaCy. This walk-through uses DeepPavlov's RuBERT as example.
Stars: ✭ 26 (+30%)
Mutual labels:  bert
Cross-Lingual-MRC
Cross-Lingual Machine Reading Comprehension (EMNLP 2019)
Stars: ✭ 66 (+230%)
Mutual labels:  bert
NAG-BERT
[EACL'21] Non-Autoregressive Text Generation with Pre-trained Language Models
Stars: ✭ 47 (+135%)
Mutual labels:  bert
trove
Weakly supervised medical named entity classification
Stars: ✭ 55 (+175%)
Mutual labels:  bert
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (+95%)
Mutual labels:  bert
PromptPapers
Must-read papers on prompt-based tuning for pre-trained language models.
Stars: ✭ 2,317 (+11485%)
Mutual labels:  bert
tfbert
Pre-trained model usage based on TensorFlow 1.x, with support for single-node multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision; flexible training, validation, and prediction.
Stars: ✭ 54 (+170%)
Mutual labels:  bert
BERT-for-Chinese-Question-Answering
No description or website provided.
Stars: ✭ 75 (+275%)
Mutual labels:  bert
gender-unbiased BERT-based pronoun resolution
Source code for the ACL workshop paper and Kaggle competition by Google AI team
Stars: ✭ 42 (+110%)
Mutual labels:  bert
sticker2
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-30%)
Mutual labels:  bert
CVAE Dial
CVAE_XGate model in paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-20%)
Mutual labels:  sequence-to-sequence
DE-LIMIT
DeEpLearning models for MultIlingual haTespeech (DELIMIT): Benchmarking multilingual models across 9 languages and 16 datasets.
Stars: ✭ 90 (+350%)
Mutual labels:  bert
SA-BERT
CIKM 2020: Speaker-Aware BERT for Multi-Turn Response Selection in Retrieval-Based Chatbots
Stars: ✭ 71 (+255%)
Mutual labels:  bert
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (+120%)
Mutual labels:  bert
1-60 of 279 similar projects