
555 open-source projects that are alternatives to or similar to FasterTransformer

NLP-paper
🎨 NLP (natural language processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-98.54%)
Mutual labels:  transformer, gpt, bert
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (-86.7%)
Mutual labels:  transformer, gpt, bert
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-96.56%)
Mutual labels:  transformer, bert
bert in a flask
A dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-97.96%)
Mutual labels:  transformer, bert
les-military-mrc-rank7
"Les Cup" (莱斯杯): Rank 7 solution for the 2nd national "Military Intelligent Machine Reading" challenge
Stars: ✭ 37 (-97.64%)
Mutual labels:  transformer, bert
bert-as-a-service TFX
End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-97.96%)
Mutual labels:  transformer, bert
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-98.54%)
Mutual labels:  transformer, bert
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning" (from logistic regression to BERT and GPT-3)
Stars: ✭ 245 (-84.4%)
Mutual labels:  transformer, bert
semantic-document-relations
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-98.66%)
Mutual labels:  transformer, bert
are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
Stars: ✭ 128 (-91.85%)
Mutual labels:  transformer, bert
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-98.6%)
Mutual labels:  transformer, bert
Transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+3448.19%)
Mutual labels:  transformer, bert
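Several of the projects in this list build on the Hugging Face Transformers library above. As a minimal sketch of how a listed BERT-style model is typically loaded through its pipeline API (the checkpoint name and example sentence are illustrative assumptions, not part of FasterTransformer):

```python
# Minimal sketch: sentiment analysis via the Transformers pipeline API.
# The checkpoint and input text are illustrative assumptions.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",  # assumed example checkpoint
)

print(classifier("FasterTransformer makes inference noticeably quicker."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```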
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (-94.27%)
Mutual labels:  transformer, bert
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-98.6%)
Mutual labels:  transformer, bert
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-96.31%)
Mutual labels:  transformer, bert
pytorch-gpt-x
Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (-98.66%)
Mutual labels:  transformer, gpt
Tokenizers
💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
Stars: ✭ 5,077 (+223.17%)
Mutual labels:  gpt, bert
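As a rough sketch of what the Tokenizers library looks like in use, assuming a pretrained BERT vocabulary fetched from the Hugging Face Hub (the checkpoint name and sentence are illustrative):

```python
# Minimal sketch: encoding text with the Tokenizers library.
# The checkpoint name and sentence are illustrative assumptions.
from tokenizers import Tokenizer

tokenizer = Tokenizer.from_pretrained("bert-base-uncased")  # downloads tokenizer.json from the Hub
encoding = tokenizer.encode("Fast tokenization for BERT and GPT models.")

print(encoding.tokens)  # WordPiece tokens, e.g. ['[CLS]', 'fast', 'token', ...]
print(encoding.ids)     # corresponding vocabulary ids
```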
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+529.85%)
Mutual labels:  transformer, bert
sticker2
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-99.11%)
Mutual labels:  transformer, bert
Kevinpro-NLP-demo
All the NLP you need here. Personal implementations of fun NLP demos, currently including PyTorch implementations of 13 NLP applications.
Stars: ✭ 117 (-92.55%)
Mutual labels:  transformer, bert
golgotha
Contextualised embeddings and language modelling using BERT and friends in R
Stars: ✭ 39 (-97.52%)
Mutual labels:  transformer, bert
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-97.2%)
Mutual labels:  transformer, bert
Bert Pytorch
Google AI 2018 BERT PyTorch implementation
Stars: ✭ 4,642 (+195.48%)
Mutual labels:  transformer, bert
text-generation-transformer
Text generation based on the Transformer
Stars: ✭ 36 (-97.71%)
Mutual labels:  transformer, bert
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+119.16%)
Mutual labels:  transformer, bert
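A minimal sketch of the attention visualization Bertviz provides, assuming a Jupyter notebook and an illustrative model name and sentence:

```python
# Minimal sketch: visualizing BERT self-attention with bertviz's head_view
# (renders inside a Jupyter notebook). Model name and sentence are illustrative.
from bertviz import head_view
from transformers import AutoModel, AutoTokenizer

name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name, output_attentions=True)

inputs = tokenizer.encode("The transformer visualizes its own attention.", return_tensors="pt")
attention = model(inputs)[-1]                       # tuple of per-layer attention tensors
tokens = tokenizer.convert_ids_to_tokens(inputs[0])
head_view(attention, tokens)                        # interactive head-by-head attention view
```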
sister
SImple SenTence EmbeddeR
Stars: ✭ 66 (-95.8%)
Mutual labels:  transformer, bert
imdb-transformer
A simple Neural Network for sentiment analysis, embedding sentences using a Transformer network.
Stars: ✭ 26 (-98.35%)
Mutual labels:  transformer
ClusterTransformer
Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-based transformers from Hugging Face.
Stars: ✭ 36 (-97.71%)
Mutual labels:  transformer
BERT-chinese-text-classification-pytorch
This repo contains a PyTorch implementation of a pretrained BERT model for text classification.
Stars: ✭ 92 (-94.14%)
Mutual labels:  bert
Zero-Shot-TTS
Unofficial Implementation of Zero-Shot Text-to-Speech for Text-Based Insertion in Audio Narration
Stars: ✭ 33 (-97.9%)
Mutual labels:  transformer
neural-ranking-kd
Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
Stars: ✭ 74 (-95.29%)
Mutual labels:  bert
Basic-UI-for-GPT-J-6B-with-low-vram
A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Model loading requires 12 GB of free RAM.
Stars: ✭ 90 (-94.27%)
Mutual labels:  gpt
ExpBERT
Code for our ACL '20 paper "Representation Engineering with Natural Language Explanations"
Stars: ✭ 28 (-98.22%)
Mutual labels:  bert
BiaffineDependencyParsing
BERT + self-attention encoder; biaffine decoder; PyTorch implementation
Stars: ✭ 67 (-95.74%)
Mutual labels:  bert
catr
Image Captioning Using Transformer
Stars: ✭ 206 (-86.89%)
Mutual labels:  transformer
R-AT
Regularized Adversarial Training
Stars: ✭ 19 (-98.79%)
Mutual labels:  bert
Highway-Transformer
[ACL '20] Highway Transformer: A Gated Transformer.
Stars: ✭ 26 (-98.35%)
Mutual labels:  transformer
proc-that
proc(ess)-that: an easily extendable ETL tool for Node.js, written in TypeScript.
Stars: ✭ 25 (-98.41%)
Mutual labels:  transformer
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (-70.97%)
Mutual labels:  transformer
rasa-bert-finetune
BERT fine-tuning with support for rasa-nlu
Stars: ✭ 46 (-97.07%)
Mutual labels:  bert
Video-Action-Transformer-Network-Pytorch-
Implementation of the paper "Video Action Transformer Network"
Stars: ✭ 126 (-91.98%)
Mutual labels:  transformer
rx-scheduler-transformer
RxJava scheduler transformer tools for Android
Stars: ✭ 15 (-99.05%)
Mutual labels:  transformer
GEANet-BioMed-Event-Extraction
Code for the paper "Biomedical Event Extraction with Hierarchical Knowledge Graphs"
Stars: ✭ 52 (-96.69%)
Mutual labels:  bert
text simplification
Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq variants).
Stars: ✭ 66 (-95.8%)
Mutual labels:  transformer
Neural-Machine-Translation
Several basic neural machine translation models implemented by PyTorch & TensorFlow
Stars: ✭ 29 (-98.15%)
Mutual labels:  transformer
CheXbert
Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
Stars: ✭ 51 (-96.75%)
Mutual labels:  bert
dingo-serializer-switch
Middleware to switch Fractal serializers in Dingo
Stars: ✭ 49 (-96.88%)
Mutual labels:  transformer
bert extension tf
BERT Extension in TensorFlow
Stars: ✭ 29 (-98.15%)
Mutual labels:  bert
TitleStylist
Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (-95.42%)
Mutual labels:  transformer
Neural-Scam-Artist
Web Scraping, Document Deduplication & GPT-2 Fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-98.85%)
Mutual labels:  transformer
VideoTransformer-pytorch
PyTorch implementation of a collection of scalable Video Transformer benchmarks.
Stars: ✭ 159 (-89.88%)
Mutual labels:  transformer
ganbert
Enhancing the BERT training with Semi-supervised Generative Adversarial Networks
Stars: ✭ 205 (-86.95%)
Mutual labels:  bert
TRAR-VQA
[ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering -- Official Implementation
Stars: ✭ 49 (-96.88%)
Mutual labels:  transformer
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-98.79%)
Mutual labels:  bert
En-transformer
Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (-91.66%)
Mutual labels:  transformer
transformer-ls
Official PyTorch Implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (-87.21%)
Mutual labels:  transformer
datagrand bert
5th-place code for the 2019 Datagrand Cup information extraction competition
Stars: ✭ 20 (-98.73%)
Mutual labels:  bert
BERT-QE
Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
Stars: ✭ 43 (-97.26%)
Mutual labels:  bert
beir
A Heterogeneous Benchmark for Information Retrieval. Easy to use: evaluate your models across 15+ diverse IR datasets.
Stars: ✭ 738 (-53.02%)
Mutual labels:  bert
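To give a sense of how beir is typically used, a hedged sketch of its documented quickstart flow follows; the dataset name, checkpoint, and output path are illustrative assumptions:

```python
# Rough sketch of the beir quickstart: download a dataset, retrieve with a
# Sentence-BERT model, and report standard IR metrics. Names and paths are
# illustrative assumptions.
from beir import util
from beir.datasets.data_loader import GenericDataLoader
from beir.retrieval import models
from beir.retrieval.evaluation import EvaluateRetrieval
from beir.retrieval.search.dense import DenseRetrievalExactSearch as DRES

url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip"
data_path = util.download_and_unzip(url, "datasets")  # unzips to datasets/scifact
corpus, queries, qrels = GenericDataLoader(data_folder=data_path).load(split="test")

retriever = EvaluateRetrieval(
    DRES(models.SentenceBERT("msmarco-distilbert-base-v3")),  # assumed example checkpoint
    score_function="cos_sim",
)
results = retriever.retrieve(corpus, queries)  # query_id -> {doc_id: score}
ndcg, _map, recall, precision = retriever.evaluate(qrels, results, retriever.k_values)
print(ndcg)
```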
Showing 1-60 of 555 similar projects.