340 Open source projects that are alternatives of or similar to Fast Bert

Fastai
The fastai deep learning library
Stars: ✭ 21,718 (+1194.28%)
Mutual labels:  fastai
T3
[EMNLP 2020] "T3: Tree-Autoencoder Constrained Adversarial Text Generation for Targeted Attack" by Boxin Wang, Hengzhi Pei, Boyuan Pan, Qian Chen, Shuohang Wang, Bo Li
Stars: ✭ 25 (-98.51%)
Mutual labels:  bert
syntaxdot
Neural syntax annotator, supporting sequence labeling, lemmatization, and dependency parsing.
Stars: ✭ 32 (-98.09%)
Mutual labels:  bert
bert tokenization for java
A Java version of the Chinese tokenization described in BERT.
Stars: ✭ 39 (-97.68%)
Mutual labels:  bert
Vit Pytorch
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+329.02%)
Mutual labels:  transformers
n-grammer-pytorch
Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch
Stars: ✭ 50 (-97.02%)
Mutual labels:  transformers
keras-bert-ner
Keras solution of Chinese NER task using BiLSTM-CRF/BiGRU-CRF/IDCNN-CRF model with Pretrained Language Model: supporting BERT/RoBERTa/ALBERT
Stars: ✭ 7 (-99.58%)
Mutual labels:  bert
Chinese Bert Wwm
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm model series)
Stars: ✭ 6,357 (+278.84%)
Mutual labels:  bert
NER-FunTool
This NER project covers multiple Chinese datasets; models include BiLSTM+CRF, BERT+Softmax, BERT+Cascade, and BERT+WOL, with final deployment via TFServing for both online and offline inference.
Stars: ✭ 56 (-96.66%)
Mutual labels:  bert
icedata
IceData: Datasets Hub for the *IceVision* Framework
Stars: ✭ 41 (-97.56%)
Mutual labels:  fastai
docker-containers
Docker images for fastai
Stars: ✭ 143 (-91.48%)
Mutual labels:  fastai
VideoBERT
Using VideoBERT to tackle video prediction
Stars: ✭ 56 (-96.66%)
Mutual labels:  bert
Reformer Pytorch
Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (-2.03%)
Mutual labels:  transformers
minicons
Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-98.33%)
Mutual labels:  transformers
knowledge-graph-nlp-in-action
From model training to deployment: hands-on Knowledge Graph & NLP. Involves Tensorflow, BERT+Bi-LSTM+CRF, Neo4j, etc., covering Named Entity Recognition, Text Classification, Information Extraction, and Relation Extraction tasks.
Stars: ✭ 58 (-96.54%)
Mutual labels:  bert
NSP-BERT
The code for our paper "NSP-BERT: A Prompt-based Zero-Shot Learner Through an Original Pre-training Task —— Next Sentence Prediction"
Stars: ✭ 166 (-90.11%)
Mutual labels:  bert
hashformers
Hashformers is a framework for hashtag segmentation with transformers.
Stars: ✭ 18 (-98.93%)
Mutual labels:  transformers
ADL2019
Applied Deep Learning (2019 Spring) @ NTU
Stars: ✭ 20 (-98.81%)
Mutual labels:  bert
KAREN
KAREN: Unifying Hatespeech Detection and Benchmarking
Stars: ✭ 18 (-98.93%)
Mutual labels:  bert
textgo
Text preprocessing, representation, similarity calculation, text search and classification. Let's go and play with text!
Stars: ✭ 33 (-98.03%)
Mutual labels:  bert
Mengzi
Mengzi Pretrained Models
Stars: ✭ 238 (-85.82%)
Mutual labels:  bert
KoELECTRA-Pipeline
Transformers Pipeline with KoELECTRA
Stars: ✭ 37 (-97.79%)
Mutual labels:  transformers
nuwa-pytorch
Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in Pytorch
Stars: ✭ 347 (-79.32%)
Mutual labels:  transformers
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-98.69%)
Mutual labels:  bert
FewCLUE
FewCLUE: a few-shot learning evaluation benchmark for Chinese
Stars: ✭ 251 (-85.04%)
Mutual labels:  bert
SemEval2019Task3
Code for ANA at SemEval-2019 Task 3
Stars: ✭ 41 (-97.56%)
Mutual labels:  bert
Product-Categorization-NLP
Multi-Class Text Classification for products based on their description with Machine Learning algorithms and Neural Networks (MLP, CNN, Distilbert).
Stars: ✭ 30 (-98.21%)
Mutual labels:  transformers
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-98.63%)
Mutual labels:  bert
MobileQA
Offline, on-device reading comprehension: QA for mobile, Android & iPhone
Stars: ✭ 49 (-97.08%)
Mutual labels:  bert
SQUAD2.Q-Augmented-Dataset
Augmented version of SQUAD 2.0 for Questions
Stars: ✭ 31 (-98.15%)
Mutual labels:  bert
Transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+3221.93%)
Mutual labels:  bert
spark-transformers
Spark-Transformers: Library for exporting Apache Spark MLLIB models to use them in any Java application with no other dependencies.
Stars: ✭ 39 (-97.68%)
Mutual labels:  transformers
trapper
State-of-the-art NLP through transformer models in a modular design and consistent APIs.
Stars: ✭ 28 (-98.33%)
Mutual labels:  transformers
bern
A neural named entity recognition and multi-type normalization tool for biomedical text mining
Stars: ✭ 151 (-91%)
Mutual labels:  bert
fast.ai notes
📓 Notes for fast.ai courses: intro to ML, Practical DL and Cutting edge DL.
Stars: ✭ 65 (-96.13%)
Mutual labels:  fastai
RTX-2080Ti-Vs-GTX-1080Ti-CIFAR-100-Benchmarks
No description or website provided.
Stars: ✭ 16 (-99.05%)
Mutual labels:  fastai
deepflash2
A deep-learning pipeline for segmentation of ambiguous microscopic images.
Stars: ✭ 34 (-97.97%)
Mutual labels:  fastai
WSDM-Cup-2019
[ACM-WSDM] 3rd place solution at WSDM Cup 2019, Fake News Classification on Kaggle.
Stars: ✭ 62 (-96.31%)
Mutual labels:  bert
Bert Pytorch
Google AI 2018 BERT pytorch implementation
Stars: ✭ 4,642 (+176.64%)
Mutual labels:  bert
fastai-visual-guide
Notebooks for the Fastai Visual Guide
Stars: ✭ 25 (-98.51%)
Mutual labels:  fastai
deepprojects
A non-ending collection of jupyter notebooks
Stars: ✭ 30 (-98.21%)
Mutual labels:  fastai
ttt
A package for fine-tuning Transformers with TPUs, written in Tensorflow2.0+
Stars: ✭ 35 (-97.91%)
Mutual labels:  transformers
PIE
Fast + Non-Autoregressive Grammatical Error Correction using BERT. Code and Pre-trained models for paper "Parallel Iterative Edit Models for Local Sequence Transduction": www.aclweb.org/anthology/D19-1435.pdf (EMNLP-IJCNLP 2019)
Stars: ✭ 164 (-90.23%)
Mutual labels:  bert
semantic-document-relations
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-98.75%)
Mutual labels:  bert
BERT-Chinese-Couplet
BERT for Chinese couplets | BERT for automatic couplet matching
Stars: ✭ 19 (-98.87%)
Mutual labels:  bert
fastai-docker-deploy
Deploy fastai models with Docker
Stars: ✭ 19 (-98.87%)
Mutual labels:  fastai
text2keywords
Trained T5 and T5-large models for extracting keywords from text
Stars: ✭ 53 (-96.84%)
Mutual labels:  transformers
small-text
Active Learning for Text Classification in Python
Stars: ✭ 241 (-85.64%)
Mutual labels:  transformers
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+489.69%)
Mutual labels:  bert
Ernie
Official implementations for various pre-training models of ERNIE-family, covering topics of Language Understanding & Generation, Multimodal Understanding & Generation, and beyond.
Stars: ✭ 4,659 (+177.65%)
Mutual labels:  bert
Yuno
Yuno is a context-based search engine for anime.
Stars: ✭ 320 (-80.93%)
Mutual labels:  transformers
simple transformers
Simple transformer implementations that I can understand
Stars: ✭ 18 (-98.93%)
Mutual labels:  transformers
DocProduct
Medical Q&A with Deep Language Models
Stars: ✭ 527 (-68.59%)
Mutual labels:  bert
eve-bot
EVE bot, a customer service chatbot to enhance virtual engagement for Twitter Apple Support
Stars: ✭ 31 (-98.15%)
Mutual labels:  transformers
are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
Stars: ✭ 128 (-92.37%)
Mutual labels:  bert
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (-94.64%)
Mutual labels:  bert
CLUE pytorch
PyTorch version of the CLUE baselines
Stars: ✭ 72 (-95.71%)
Mutual labels:  bert
FinBERT
A Pretrained BERT Model for Financial Communications. https://arxiv.org/abs/2006.08097
Stars: ✭ 193 (-88.5%)
Mutual labels:  bert
ark-nlp
An NLP coding package that quickly implements SOTA solutions.
Stars: ✭ 232 (-86.17%)
Mutual labels:  bert
robustness-vit
Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (-95.35%)
Mutual labels:  transformers
61-120 of 340 similar projects