PLBART: Official code of our work, "Unified Pre-training for Program Understanding and Generation" [NAACL 2021].
Stars: ✭ 151 (-61.28%)
Deberta: The implementation of DeBERTa
Stars: ✭ 541 (+38.72%)
COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining [NeurIPS 2021]
Stars: ✭ 109 (-72.05%)
PCPM: Presenting Collection of Pretrained Models. Links to pretrained models in NLP and voice.
Stars: ✭ 21 (-94.62%)
minGPT-TF: A minimal TF2 re-implementation of OpenAI GPT training
Stars: ✭ 36 (-90.77%)
GLOM-TensorFlow: An attempt at implementing GLOM, Geoffrey Hinton's paper on emergent part-whole hierarchies from data
Stars: ✭ 32 (-91.79%)
cscg: Code Generation as a Dual Task of Code Summarization.
Stars: ✭ 28 (-92.82%)
subword-lstm-lm: LSTM Language Model with Subword Units Input Representations
Stars: ✭ 45 (-88.46%)
MTL-AQA: "What and How Well You Performed? A Multitask Learning Approach to Action Quality Assessment" [CVPR 2019]
Stars: ✭ 38 (-90.26%)
FLIP: A collection of tasks to probe the effectiveness of protein sequence representations in modeling aspects of protein design
Stars: ✭ 35 (-91.03%)
Black-Box-Tuning: Black-Box Tuning for Language-Model-as-a-Service [ICML 2022]
Stars: ✭ 99 (-74.62%)
amr: Official adversarial mixup resynthesis repository
Stars: ✭ 31 (-92.05%)
proto: Proto-RL: Reinforcement Learning with Prototypical Representations
Stars: ✭ 67 (-82.82%)
TCE: Code for the paper "Temporally Coherent Embeddings for Self-Supervised Video Representation Learning" (TCE).
Stars: ✭ 51 (-86.92%)
open clip: An open source implementation of CLIP.
Stars: ✭ 1,534 (+293.33%)
Word-Prediction-Ngram: Next Word Prediction using n-gram Probabilistic Model with various Smoothing Techniques
Stars: ✭ 25 (-93.59%)
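As a quick illustration of the technique named above (this is a sketch, not the linked repository's code), a bigram next-word predictor with add-one (Laplace) smoothing can be written in a few lines:

```python
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams from a list of tokenized sentences."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent          # sentence-start marker
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def next_word_probs(context, unigrams, bigrams):
    """P(w | context) with add-one (Laplace) smoothing over the vocabulary."""
    vocab = [w for w in unigrams if w != "<s>"]
    denom = unigrams[context] + len(vocab)
    return {w: (bigrams[(context, w)] + 1) / denom for w in vocab}

# Toy corpus for demonstration (entirely made up)
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["the", "cat", "ran"]]
uni, bi = train_bigram(corpus)
probs = next_word_probs("the", uni, bi)
best = max(probs, key=probs.get)  # most likely next word after "the"
```

Smoothing ensures unseen bigrams still receive non-zero probability, which is the point of the "various Smoothing Techniques" the repository explores (it also covers more elaborate schemes than add-one).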
ml: machine learning
Stars: ✭ 29 (-92.56%)
ShapeFormer: Official repository for the ShapeFormer project
Stars: ✭ 97 (-75.13%)
MSF: Official code for "Mean Shift for Self-Supervised Learning"
Stars: ✭ 42 (-89.23%)
causal-ml: Must-read papers and resources related to causal inference and machine (deep) learning
Stars: ✭ 387 (-0.77%)
LM-CNLC: Chinese Natural Language Correction via Language Model
Stars: ✭ 15 (-96.15%)
Pose2vec: A repository of human skeleton preprocessing steps in NumPy and TensorFlow, along with a TensorFlow model that learns pose embeddings.
Stars: ✭ 25 (-93.59%)
FEATHER: The reference implementation of FEATHER from the CIKM '20 paper "Characteristic Functions on Graphs: Birds of a Feather, from Statistical Descriptors to Parametric Models".
Stars: ✭ 34 (-91.28%)
language-planner: Official code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (-78.46%)
pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference
Stars: ✭ 62 (-84.1%)
gnn-lspe: Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (-57.69%)
wechsel: Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-90%)
autoencoders tensorflow: Automatic feature engineering using deep learning and Bayesian inference, in TensorFlow.
Stars: ✭ 66 (-83.08%)
ParametricUMAP paper: Parametric UMAP embeddings for representation and semi-supervised learning. From the paper "Parametric UMAP: learning embeddings with deep neural networks for representation and semi-supervised learning" (Sainburg, McInnes, Gentner, 2020).
Stars: ✭ 132 (-66.15%)
rl singing voice: Unsupervised Representation Learning for Singing Voice Separation
Stars: ✭ 18 (-95.38%)
ExCon: Explanation-driven Supervised Contrastive Learning
Stars: ✭ 17 (-95.64%)
reprieve: A library for evaluating representations.
Stars: ✭ 68 (-82.56%)
mongolian-nlp: Useful resources for Mongolian NLP
Stars: ✭ 119 (-69.49%)
M-NMF: An implementation of "Community Preserving Network Embedding" (AAAI 2017)
Stars: ✭ 119 (-69.49%)
gpt-j: A GPT-J API for Python 3 to generate text, blogs, code, and more
Stars: ✭ 101 (-74.1%)
REGAL: Representation-learning-based graph alignment via implicit matrix factorization and structural embeddings
Stars: ✭ 78 (-80%)
CoLAKE: Contextualized Language and Knowledge Embedding [COLING 2020]
Stars: ✭ 86 (-77.95%)
backprop: Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (-41.28%)
gpt-j-api: API for the GPT-J language model 🦜, including a FastAPI backend and a Streamlit frontend
Stars: ✭ 248 (-36.41%)
mlp-gpt-jax: A GPT, made only of MLPs, in Jax
Stars: ✭ 53 (-86.41%)
SimCLR: PyTorch implementation of "A Simple Framework for Contrastive Learning of Visual Representations"
Stars: ✭ 65 (-83.33%)
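The objective behind SimCLR, the NT-Xent (normalized temperature-scaled cross-entropy) loss, can be sketched in plain Python; this is an illustrative toy version, not the linked repository's PyTorch implementation:

```python
import math

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss sketch. z1 and z2 are lists of embedding vectors;
    z1[i] and z2[i] are two augmented views of the same image (a positive
    pair), and every other embedding in the batch acts as a negative."""
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    z = [normalize(v) for v in list(z1) + list(z2)]  # 2N unit vectors
    n = len(z1)
    total = 0.0
    for i in range(2 * n):
        pos = (i + n) % (2 * n)  # index of i's positive pair
        # temperature-scaled cosine similarities to all other embeddings
        sims = [sum(a * b for a, b in zip(z[i], z[j])) / temperature
                for j in range(2 * n) if j != i]
        pos_sim = sum(a * b for a, b in zip(z[i], z[pos])) / temperature
        m = max(sims)  # log-sum-exp with max-shift for numerical stability
        log_denom = m + math.log(sum(math.exp(s - m) for s in sims))
        total += -(pos_sim - log_denom)  # cross-entropy toward the positive
    return total / (2 * n)
```

Pulling the two views of each image together while pushing all other batch members apart is what makes the loss contrastive; in practice the full pipeline adds the augmentation module, encoder, and projection head described in the paper.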
dasher-web: Dasher text entry in HTML, CSS, JavaScript, and SVG
Stars: ✭ 34 (-91.28%)
gdc: Code for the ICLR 2021 paper "A Distributional Approach to Controlled Text Generation"
Stars: ✭ 94 (-75.9%)
swig-srilm: SWIG wrapper for the SRILM toolkit
Stars: ✭ 33 (-91.54%)
lm-scorer: 📃 Language-model-based sentence scoring library
Stars: ✭ 264 (-32.31%)
Learning-From-Rules: Implementation of experiments in the paper "Learning from Rules Generalizing Labeled Exemplars", ICLR 2020 (https://openreview.net/forum?id=SkeuexBtDr)
Stars: ✭ 46 (-88.21%)
meta-embeddings: Meta-embeddings are a probabilistic generalization of embeddings in machine learning.
Stars: ✭ 22 (-94.36%)
game-feature-learning: Code for the paper "Cross-Domain Self-supervised Multi-task Feature Learning using Synthetic Imagery", Ren et al., CVPR'18
Stars: ✭ 68 (-82.56%)