responsible-ai-toolbox: This project provides responsible AI user interfaces for Fairlearn, interpret-community, and Error Analysis, as well as foundational building blocks that they rely on.
Stars: ✭ 615 (+554.26%)
factedit: 🧐 Code & Data for Fact-based Text Editing (Iso et al.; ACL 2020)
Stars: ✭ 16 (-82.98%)
asr24: 24-hour Automatic Speech Recognition
Stars: ✭ 27 (-71.28%)
Bit Rnn: Quantize weights and activations in Recurrent Neural Networks.
Stars: ✭ 86 (-8.51%)
Advanced Models: Provides implementations of several well-known neural network models (DCGAN, VAE, ResNet, etc.).
Stars: ✭ 48 (-48.94%)
Mead Baseline: Deep-Learning Model Exploration and Development for NLP
Stars: ✭ 238 (+153.19%)
Vaaku2Vec: Language Modeling and Text Classification in the Malayalam Language using ULMFiT
Stars: ✭ 68 (-27.66%)
Xlnet zh: Pre-trained Chinese XLNet model (XLNet_Large)
Stars: ✭ 207 (+120.21%)
deepstory: Deepstory turns text (or generated text) into a video in which the character is animated to speak your story in his/her voice.
Stars: ✭ 61 (-35.11%)
Attention Mechanisms: Implementations of a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (+115.96%)
rnn-theano: RNNs (LSTM, GRU) in Theano with mini-batch training; character-level language models in Theano
Stars: ✭ 68 (-27.66%)
calm: Context-Aware Language Models
Stars: ✭ 29 (-69.15%)
Char Rnn Chinese: Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch. Based on https://github.com/karpathy/char-rnn, with support for Chinese and other additions.
Stars: ✭ 192 (+104.26%)
pd3f: 🏭 PDF text extraction pipeline: self-hosted, local-first, Docker-based
Stars: ✭ 132 (+40.43%)
Nlp learning: Learning natural language processing (NLP) with Python: language models, HMM, PCFG, Word2vec, cloze-style reading comprehension, naive Bayes classifiers, TF-IDF, PCA, SVD
Stars: ✭ 188 (+100%)
dasher-web: Dasher text entry in HTML, CSS, JavaScript, and SVG
Stars: ✭ 34 (-63.83%)
Bert Sklearn: A sklearn wrapper for Google's BERT model (https://github.com/google-research/bert)
Stars: ✭ 182 (+93.62%)
TF-NNLM-TK: A toolkit for neural language modeling using TensorFlow, including basic models such as RNNs and LSTMs as well as more advanced models.
Stars: ✭ 20 (-78.72%)
Optimus: The first large-scale pre-trained VAE language model
Stars: ✭ 180 (+91.49%)
subword-lstm-lm: LSTM Language Model with Subword-Unit Input Representations
Stars: ✭ 45 (-52.13%)
Gpt Neo: An implementation of model-parallel GPT-2 and GPT-3-style models, with the ability to scale up to full GPT-3 sizes (and possibly more!), using the mesh-tensorflow library.
Stars: ✭ 1,252 (+1231.91%)
PLBART: Official code for Unified Pre-training for Program Understanding and Generation [NAACL 2021].
Stars: ✭ 151 (+60.64%)
Xlnet Gen: XLNet for language generation.
Stars: ✭ 164 (+74.47%)
auto coding: A basic and simple tool for code auto-completion
Stars: ✭ 42 (-55.32%)
Lotclass: [EMNLP 2020] Text Classification Using Label Names Only: A Language Model Self-Training Approach
Stars: ✭ 160 (+70.21%)
Chatette: A powerful dataset generator for Rasa NLU, inspired by Chatito
Stars: ✭ 205 (+118.09%)
F Lm: Language Modeling
Stars: ✭ 156 (+65.96%)
deep-explanation-penalization: Code for using CDEP from the paper "Interpretations are useful: penalizing explanations to align neural networks with prior knowledge" (https://arxiv.org/abs/1909.13584)
Stars: ✭ 110 (+17.02%)
Speecht: An open-source speech-to-text tool written in TensorFlow
Stars: ✭ 152 (+61.7%)
swig-srilm: SWIG wrapper for the SRILM toolkit
Stars: ✭ 33 (-64.89%)
Nlp Papers: Papers and books to read when starting NLP 📚
Stars: ✭ 111 (+18.09%)
Tupe: Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT.
Stars: ✭ 143 (+52.13%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+143.62%)
Electra: Pre-trained Chinese ELECTRA model, pre-trained with adversarial learning
Stars: ✭ 132 (+40.43%)
gap-text2sql: GAP-text2SQL: Learning Contextual Representations for Semantic Parsing with Generation-Augmented Pre-Training
Stars: ✭ 83 (-11.7%)
Robbert: A Dutch RoBERTa-based language model
Stars: ✭ 120 (+27.66%)
Nlg Eval: Evaluation code for various unsupervised automated metrics for Natural Language Generation.
Stars: ✭ 822 (+774.47%)
Lingo: Package lingo provides the data structures and algorithms required for natural language processing
Stars: ✭ 113 (+20.21%)
TERM: Tilted Empirical Risk Minimization (ICLR '21)
Stars: ✭ 37 (-60.64%)
Getlang: Natural language detection package in pure Go
Stars: ✭ 110 (+17.02%)
Tc Bot: User Simulation for Task-Completion Dialogues
Stars: ✭ 733 (+679.79%)
Easy Bert: A dead-simple BERT API for Python and Java (https://github.com/google-research/bert)
Stars: ✭ 106 (+12.77%)
personality-prediction: Experiments in automated personality detection using language models and psycholinguistic features on various well-known personality datasets, including the Essays (Big Five) dataset
Stars: ✭ 109 (+15.96%)
Pytorch gbw lm: PyTorch Language Model for the 1-Billion-Word (LM1B / GBW) Dataset
Stars: ✭ 101 (+7.45%)
Chatito: 🎯🗯 Generate datasets for AI chatbots, NLP tasks, named-entity recognition, or text-classification models using a simple DSL!
Stars: ✭ 678 (+621.28%)
Tongrams: A C++ library providing fast language model queries in compressed space.
Stars: ✭ 88 (-6.38%)
awesome-nlg: A curated list of resources dedicated to Natural Language Generation (NLG)
Stars: ✭ 386 (+310.64%)
Accelerated Text: A no-code natural language generation platform. It helps you construct document plans that define how your data is converted into textual descriptions varying in wording and structure.
Stars: ✭ 256 (+172.34%)
Transformers-Tutorials: This repository contains demos made with the Transformers library by Hugging Face.
Stars: ✭ 2,828 (+2908.51%)
PCPM: Presenting a Collection of Pretrained Models: links to pretrained models in NLP and voice.
Stars: ✭ 21 (-77.66%)
awesome-codex: A list dedicated to products, demos, and articles related to 🤖 OpenAI's Codex.
Stars: ✭ 115 (+22.34%)