1162 open source projects that are alternatives to or similar to tensorflow-ml-nlp-tf2

Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-91.02%)
Mutual labels:  transformer, bert, nli
Transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+22651.84%)
Mutual labels:  transformer, seq2seq, bert
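As a quick illustration of the kind of usage this library enables, here is a minimal sentiment-analysis sketch via its pipeline helper (this assumes the transformers package and a PyTorch or TensorFlow backend are installed; the default model is downloaded on first use and is not something specified by this listing):

    from transformers import pipeline

    # Build a ready-to-use sentiment-analysis pipeline with the library's default model.
    classifier = pipeline("sentiment-analysis")

    # Returns a list of dicts such as [{'label': 'POSITIVE', 'score': 0.99...}].
    print(classifier("Transformers makes state-of-the-art NLP easy to use."))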
Kashgari
Kashgari is a production-level NLP transfer learning framework built on top of tf.keras for text labeling and text classification, including Word2Vec, BERT, and GPT2 language embeddings.
Stars: ✭ 2,235 (+812.24%)
Mutual labels:  seq2seq, ner, bert
transformer-models
Deep Learning Transformer models in MATLAB
Stars: ✭ 90 (-63.27%)
Mutual labels:  transformer, bert, gpt2
pytorch-transformer-chatbot
A simple chitchat chatbot using the Transformer API introduced in PyTorch v1.2.
Stars: ✭ 44 (-82.04%)
Mutual labels:  transformer, seq2seq, korean-nlp
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+1295.1%)
Mutual labels:  transformer, rnn, seq2seq
Bertviz
Tool for visualizing attention in the Transformer model (BERT, GPT-2, Albert, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+1305.31%)
Mutual labels:  transformer, bert, gpt2
NLP-paper
🎨 NLP (natural language processing) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-90.61%)
Mutual labels:  transformer, seq2seq, bert
FasterTransformer
Transformer-related optimizations, including BERT and GPT.
Stars: ✭ 1,571 (+541.22%)
Mutual labels:  transformer, bert
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-90.61%)
Mutual labels:  transformer, bert
Nlp Interview Notes
Study notes and materials for natural language processing (NLP) interview preparation, compiled by the authors from their own interviews and experience; the collection currently covers interview questions from all areas of NLP.
Stars: ✭ 207 (-15.51%)
Mutual labels:  transformer, ner
transformer
Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-75.51%)
Mutual labels:  transformer, seq2seq
deep-molecular-optimization
Molecular optimization by capturing a chemist's intuition, using seq2seq with attention and the Transformer.
Stars: ✭ 60 (-75.51%)
Mutual labels:  transformer, seq2seq
bert-as-a-service TFX
End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-86.94%)
Mutual labels:  transformer, bert
semantic-document-relations
Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-91.43%)
Mutual labels:  transformer, bert
Tsai
State-of-the-art deep learning with time series and sequences in PyTorch / fastai.
Stars: ✭ 407 (+66.12%)
Mutual labels:  transformer, rnn
teanaps
An open-source Python library for natural language processing and text analysis.
Stars: ✭ 91 (-62.86%)
Bert Pytorch
PyTorch implementation of Google AI's 2018 BERT.
Stars: ✭ 4,642 (+1794.69%)
Mutual labels:  transformer, bert
Meta Emb
Multilingual Meta-Embeddings for Named Entity Recognition (RepL4NLP & EMNLP 2019)
Stars: ✭ 28 (-88.57%)
Mutual labels:  transformer, ner
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+54.69%)
Mutual labels:  transformer, seq2seq
Seq2seqchatbots
A wrapper around tensor2tensor to flexibly train, interact, and generate data for neural chatbots.
Stars: ✭ 466 (+90.2%)
Mutual labels:  transformer, seq2seq
Nlp Experiments In Pytorch
PyTorch repository for text categorization and NER experiments in Turkish and English.
Stars: ✭ 35 (-85.71%)
Mutual labels:  transformer, ner
Machine Translation
Stars: ✭ 51 (-79.18%)
Mutual labels:  transformer, seq2seq
Nlp research
NLP research: a TensorFlow-based NLP deep learning project supporting four major tasks: text classification, sentence matching, sequence labeling, and text generation.
Stars: ✭ 141 (-42.45%)
Mutual labels:  transformer, ner
Eeg Dl
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (-32.65%)
Mutual labels:  transformer, rnn
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (-22.45%)
Mutual labels:  transformer, seq2seq
Embedding
A summary of embedding model code and study notes.
Stars: ✭ 25 (-89.8%)
Mutual labels:  transformer, seq2seq
Ner Bert Pytorch
PyTorch solution for the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (+1.63%)
Mutual labels:  transformer, ner
are-16-heads-really-better-than-1
Code for the paper "Are Sixteen Heads Really Better than One?"
Stars: ✭ 128 (-47.76%)
Mutual labels:  transformer, bert
golgotha
Contextualised embeddings and language modelling with BERT and friends, using R.
Stars: ✭ 39 (-84.08%)
Mutual labels:  transformer, bert
text-generation-transformer
Text generation based on the Transformer.
Stars: ✭ 36 (-85.31%)
Mutual labels:  transformer, bert
Paddlenlp
NLP Core Library and Model Zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (-13.47%)
Mutual labels:  transformer, seq2seq
transformer-tensorflow2.0
Transformer implemented in TensorFlow 2.0.
Stars: ✭ 53 (-78.37%)
Mutual labels:  tf2, transformer
Swin-Transformer-Tensorflow
Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (-81.63%)
Mutual labels:  tf2, transformer
bert in a flask
A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-86.94%)
Mutual labels:  transformer, bert
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+66.53%)
Mutual labels:  transformer, seq2seq
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+60.82%)
Mutual labels:  transformer, seq2seq
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+71.43%)
Mutual labels:  transformer, seq2seq
Bert Multitask Learning
BERT for Multitask Learning
Stars: ✭ 380 (+55.1%)
Mutual labels:  transformer, ner
Transformer-QG-on-SQuAD
Implement Question Generator with SOTA pre-trained Language Models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-88.57%)
Mutual labels:  bert, gpt2
Rust Bert
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Stars: ✭ 510 (+108.16%)
Mutual labels:  transformer, ner
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+304.08%)
Mutual labels:  transformer, seq2seq
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-77.96%)
Mutual labels:  transformer, bert
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-56.73%)
Mutual labels:  transformer, seq2seq
Tensorflow Ml Nlp
Natural language processing with TensorFlow and machine learning (from logistic regression to a Transformer chatbot).
Stars: ✭ 176 (-28.16%)
Mutual labels:  transformer, seq2seq
Etagger
Reference TensorFlow code for named entity tagging.
Stars: ✭ 100 (-59.18%)
Mutual labels:  transformer, ner
neuro-comma
🇷🇺 Production-ready punctuation restoration model for the Russian language 🇷🇺
Stars: ✭ 46 (-81.22%)
Mutual labels:  ner, bert
sister
SImple SenTence EmbeddeR
Stars: ✭ 66 (-73.06%)
Mutual labels:  transformer, bert
Transformer Temporal Tagger
Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging".
Stars: ✭ 55 (-77.55%)
Mutual labels:  transformer, seq2seq
kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+86.12%)
Mutual labels:  transformer, seq2seq
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+3938.78%)
Mutual labels:  transformer, bert
KorQuAD-Question-Generation
A question generation model built with the KorQuAD dataset.
Stars: ✭ 27 (-88.98%)
Mutual labels:  gpt2, korquad
hangul-search-js
🇰🇷 Simple Korean text search module
Stars: ✭ 22 (-91.02%)
les-military-mrc-rank7
LES Cup (莱斯杯): Rank 7 solution for the 2nd National "Military Intelligent Machine Reading" Challenge.
Stars: ✭ 37 (-84.9%)
Mutual labels:  transformer, bert
ChineseNER
All about Chinese NER.
Stars: ✭ 241 (-1.63%)
Mutual labels:  ner, bert
Neural-Scam-Artist
Web Scraping, Document Deduplication & GPT-2 Fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-92.65%)
Mutual labels:  transformer, gpt2
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-88.57%)
Mutual labels:  transformer, seq2seq
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-82.04%)
Mutual labels:  transformer, bert
Asr
Stars: ✭ 54 (-77.96%)
Mutual labels:  transformer, seq2seq
vietnamese-roberta
A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-91.02%)
Mutual labels:  transformer, bert
1-60 of 1162 similar projects