gpt-j-api: API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend.
Stars: ✭ 248 (-62.76%)
gpt-j: A GPT-J API for Python 3 to generate text, blogs, code, and more.
Stars: ✭ 101 (-84.83%)
subword-lstm-lm: LSTM language model with subword-unit input representations.
Stars: ✭ 45 (-93.24%)
Xlnet Pytorch: An implementation of Google Brain's 2019 XLNet in PyTorch.
Stars: ✭ 304 (-54.35%)
open clip: An open-source implementation of CLIP.
Stars: ✭ 1,534 (+130.33%)
Neural sp: End-to-end ASR/LM implementation in PyTorch.
Stars: ✭ 408 (-38.74%)
minGPT-TF: A minimal TF2 re-implementation of OpenAI's GPT training.
Stars: ✭ 36 (-94.59%)
tying-wv-and-wc: Implementation of "Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling".
Stars: ✭ 39 (-94.14%)
mlp-gpt-jax: A GPT made only of MLPs, in JAX.
Stars: ✭ 53 (-92.04%)
Trankit: A light-weight Transformer-based Python toolkit for multilingual natural language processing.
Stars: ✭ 311 (-53.3%)
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms.
Stars: ✭ 204 (-69.37%)
Tokenizers: 💥 Fast, state-of-the-art tokenizers optimized for research and production.
Stars: ✭ 5,077 (+662.31%)
Black-Box-Tuning: Black-Box Tuning for Language-Model-as-a-Service (ICML 2022).
Stars: ✭ 99 (-85.14%)
Bertweet: BERTweet, a pre-trained language model for English Tweets (EMNLP 2020).
Stars: ✭ 282 (-57.66%)
cscg: Code generation as a dual task of code summarization.
Stars: ✭ 28 (-95.8%)
Deberta: The implementation of DeBERTa.
Stars: ✭ 541 (-18.77%)
gdc: Code for the ICLR 2021 paper "A Distributional Approach to Controlled Text Generation".
Stars: ✭ 94 (-85.89%)
PCPM (Presenting Collection of Pretrained Models): Links to pretrained models in NLP and voice.
Stars: ✭ 21 (-96.85%)
Zamia Speech: Open tools and data for cloudless automatic speech recognition.
Stars: ✭ 374 (-43.84%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (-65.62%)
minicons: Utility for analyzing Transformer-based representations of language.
Stars: ✭ 28 (-95.8%)
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems.
Stars: ✭ 61 (-90.84%)
dasher-web: Dasher text entry in HTML, CSS, JavaScript, and SVG.
Stars: ✭ 34 (-94.89%)
Azureml Bert: End-to-end recipes for pre-training and fine-tuning BERT using the Azure Machine Learning service.
Stars: ✭ 342 (-48.65%)
CodeT5: Code for CodeT5, a new code-aware pre-trained encoder-decoder model.
Stars: ✭ 390 (-41.44%)
Word-Prediction-Ngram: Next-word prediction using an n-gram probabilistic model with various smoothing techniques.
Stars: ✭ 25 (-96.25%)
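The idea behind such repositories is standard n-gram language modeling: estimate P(word | history) from counts, with smoothing to keep unseen pairs from getting zero probability. A minimal sketch of a bigram model with add-one (Laplace) smoothing, not taken from any repo above:

```python
from collections import Counter

def train_bigram(tokens):
    """Count unigrams and bigrams from a token sequence."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def laplace_prob(unigrams, bigrams, prev, word):
    """P(word | prev) with add-one smoothing over the observed vocabulary."""
    vocab_size = len(unigrams)
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

def predict_next(unigrams, bigrams, prev):
    """Return the most probable next word after `prev`."""
    return max(unigrams, key=lambda w: laplace_prob(unigrams, bigrams, prev, w))

tokens = "the cat sat on the mat the cat ran".split()
uni, bi = train_bigram(tokens)
print(predict_next(uni, bi, "the"))  # -> "cat" ("cat" follows "the" most often)
```

Fancier smoothing schemes (Good-Turing, Kneser-Ney) change only `laplace_prob`; the counting and argmax steps stay the same.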
Gpt Neox: An implementation of model-parallel GPT-3-like models on GPUs, based on the DeepSpeed library and designed to train models with hundreds of billions of parameters or more.
Stars: ✭ 303 (-54.5%)
language-planner: Official code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents".
Stars: ✭ 84 (-87.39%)
Transfer Nlp: An NLP library designed for reproducible experiment management.
Stars: ✭ 287 (-56.91%)
Bert Pytorch: PyTorch implementation of Google AI's 2018 BERT.
Stars: ✭ 4,642 (+597%)
mongolian-nlp: Useful resources for Mongolian NLP.
Stars: ✭ 119 (-82.13%)
Bluebert: BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
Stars: ✭ 273 (-59.01%)
CoLAKE: Contextualized Language and Knowledge Embedding (COLING 2020).
Stars: ✭ 86 (-87.09%)
Kobert: Korean BERT pre-trained cased (KoBERT).
Stars: ✭ 591 (-11.26%)
LM-CNLC: Chinese natural language correction via language model.
Stars: ✭ 15 (-97.75%)
few-shot-lm: The source code of "Language Models are Few-shot Multilingual Learners" (MRL @ EMNLP 2021).
Stars: ✭ 32 (-95.2%)
Ctcwordbeamsearch: Connectionist Temporal Classification (CTC) decoder with dictionary and language model for TensorFlow.
Stars: ✭ 398 (-40.24%)
wechsel: Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-94.14%)
python-arpa: 🐍 Python library for n-gram models in ARPA format.
Stars: ✭ 35 (-94.74%)
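For context, the ARPA format such libraries read is a plain-text standard: a `\data\` header lists the n-gram counts, and each n-gram line carries a base-10 log probability, the tokens, and an optional backoff weight. A minimal (hypothetical) bigram model file:

```
\data\
ngram 1=3
ngram 2=2

\1-grams:
-0.5 <s> -0.3
-0.5 hello -0.2
-0.9 </s>

\2-grams:
-0.3 <s> hello
-0.2 hello </s>

\end\
```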
Albert pytorch: A Lite BERT for Self-Supervised Learning of Language Representations.
Stars: ✭ 539 (-19.07%)
ml: Machine learning.
Stars: ✭ 29 (-95.65%)
SDLM-pytorch: Code accompanying the EMNLP 2018 paper "Language Modeling with Sparse Product of Sememe Experts".
Stars: ✭ 27 (-95.95%)
Tf chatbot seq2seq antilm: Seq2seq chatbot with attention and an anti-language model to suppress generic responses, with an option for further improvement via deep reinforcement learning.
Stars: ✭ 369 (-44.59%)
Dl Nlp Readings: My reading lists for deep learning and natural language processing.
Stars: ✭ 656 (-1.5%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (-14.86%)
Ctcdecoder: Connectionist Temporal Classification (CTC) decoding algorithms implemented in Python: best path, prefix search, beam search, and token passing.
Stars: ✭ 529 (-20.57%)
Kogpt2: Korean GPT-2 pretrained cased (KoGPT2).
Stars: ✭ 368 (-44.74%)
pyVHDLParser: Streaming-based VHDL parser.
Stars: ✭ 51 (-92.34%)