ajaech / calm

Licence: other
Context Aware Language Models

Programming Languages

- Jupyter Notebook
- Python
- Shell

Projects that are alternatives of or similar to calm

Automatic Speech Recognition
🎧 Automatic Speech Recognition: DeepSpeech & Seq2Seq (TensorFlow)
Stars: ✭ 192 (+562.07%)
Mutual labels:  language-model
Relational Rnn Pytorch
An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (+713.79%)
Mutual labels:  language-model
pd3f
🏭 PDF text extraction pipeline: self-hosted, local-first, Docker-based
Stars: ✭ 132 (+355.17%)
Mutual labels:  language-model
Gpt Scrolls
A collaborative collection of open-source safe GPT-3 prompts that work well
Stars: ✭ 195 (+572.41%)
Mutual labels:  language-model
Pytorch Nce
The Noise Contrastive Estimation for softmax output written in Pytorch
Stars: ✭ 204 (+603.45%)
Mutual labels:  language-model
Zeroth
Kaldi-based Korean ASR (speech recognition) open-source project
Stars: ✭ 248 (+755.17%)
Mutual labels:  language-model
Bert As Language Model
BERT as a language model, forked from https://github.com/google-research/bert
Stars: ✭ 185 (+537.93%)
Mutual labels:  language-model
KB-ALBERT
A Korean ALBERT model specialized for the economic/financial domain, provided by KB Kookmin Bank
Stars: ✭ 215 (+641.38%)
Mutual labels:  language-model
Xlnet zh
Chinese pre-trained XLNet model: Pre-Trained Chinese XLNet_Large
Stars: ✭ 207 (+613.79%)
Mutual labels:  language-model
COCO-LM
[NeurIPS 2021] COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+275.86%)
Mutual labels:  language-model
Protein Sequence Embedding Iclr2019
Source code for "Learning protein sequence embeddings using information from structure" - ICLR 2019
Stars: ✭ 194 (+568.97%)
Mutual labels:  language-model
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (+600%)
Mutual labels:  language-model
PLBART
Official code of our work, Unified Pre-training for Program Understanding and Generation [NAACL 2021].
Stars: ✭ 151 (+420.69%)
Mutual labels:  language-model
Char Rnn Chinese
Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch. Based on code of https://github.com/karpathy/char-rnn. Support Chinese and other things.
Stars: ✭ 192 (+562.07%)
Mutual labels:  language-model
rnn-theano
RNN(LSTM, GRU) in Theano with mini-batch training; character-level language models in Theano
Stars: ✭ 68 (+134.48%)
Mutual labels:  language-model
Nlp learning
Learning natural language processing (NLP) with Python: language models, HMM, PCFG, Word2vec, cloze-style reading comprehension, naive Bayes classifier, TF-IDF, PCA, SVD
Stars: ✭ 188 (+548.28%)
Mutual labels:  language-model
Mead Baseline
Deep-Learning Model Exploration and Development for NLP
Stars: ✭ 238 (+720.69%)
Mutual labels:  language-model
asr24
24-hour Automatic Speech Recognition
Stars: ✭ 27 (-6.9%)
Mutual labels:  language-model
Vaaku2Vec
Language Modeling and Text Classification in Malayalam Language using ULMFiT
Stars: ✭ 68 (+134.48%)
Mutual labels:  language-model
TF-NNLM-TK
A toolkit for neural language modeling using Tensorflow including basic models like RNNs and LSTMs as well as more advanced models.
Stars: ✭ 20 (-31.03%)
Mutual labels:  language-model

CALM

Context Aware Language Models

Code for building language models that adapt to different contexts. This code was originally written to support the experiments in the paper Improving Context Aware Language Models. It has since been modified to support experiments for the paper Low-Rank RNN Adaptation for Context-Aware Language Modeling (to appear in TACL). Read the paper for a complete description of the model.

The main idea is that metadata or other context information can be used to adapt or control a language model. Adaptation works by letting a context embedding transform the weights of the model's recurrent layer; we call this the FactorCell model.
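The low-rank, context-dependent weight transformation can be sketched roughly as follows. This is an illustrative NumPy sketch, not the repository's actual API: the dimensions, variable names, and the exact factorization (a context embedding contracted against two low-rank tensors to produce a rank-limited additive update) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 8, 8   # recurrent layer input/hidden sizes (illustrative)
k, r = 4, 2        # context embedding size and adaptation rank (illustrative)

W = rng.normal(size=(d_in, d_h))     # shared base recurrent weight matrix
Z_L = rng.normal(size=(k, d_in, r))  # left adaptation tensor
Z_R = rng.normal(size=(k, r, d_h))   # right adaptation tensor

c = rng.normal(size=(k,))            # context embedding for one context

# The context contracts against each tensor to produce two thin factors,
# so the weight update (left @ right) has rank at most r.
left = np.einsum('k,kir->ir', c, Z_L)   # shape (d_in, r)
right = np.einsum('k,krj->rj', c, Z_R)  # shape (r, d_h)
W_c = W + left @ right                  # context-adapted recurrent weight
```

The point of the low-rank structure is that each context gets its own effective recurrent matrix `W_c` while the number of context-specific parameters stays small (proportional to `r`, not to `d_in * d_h`).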

I will work on documenting the code more. Send me a message if you want some help getting started.
