jax-models: Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+272.41%)
awesome-huggingface: 🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries.
Stars: ✭ 436 (+1403.45%)
graphsignal: Graphsignal Python agent
Stars: ✭ 158 (+444.83%)
ttt: A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+
Stars: ✭ 35 (+20.69%)
question generator: An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+548.28%)
omd: JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation"
Stars: ✭ 43 (+48.28%)
policy-data-analyzer: Builds a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US, and India, bringing NLP to policy analysis through an extensible framework that includes scraping, preprocessing, active learning, and text analysis pipelines.
Stars: ✭ 22 (-24.14%)
Transformers: 🤗 State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+192113.79%)
text2keywords: Trained T5 and T5-large models for creating keywords from text
Stars: ✭ 53 (+82.76%)
jax-resnet: Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
Stars: ✭ 61 (+110.34%)
erc: Emotion recognition in conversation
Stars: ✭ 34 (+17.24%)
jax-rl: JAX implementations of core Deep RL algorithms
Stars: ✭ 61 (+110.34%)
efficientnet-jax: EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX with Flax Linen and Objax
Stars: ✭ 114 (+293.1%)
converse: Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+406.9%)
Pyprobml: Python code for "Machine Learning: A Probabilistic Perspective" (2nd edition)
Stars: ✭ 4,197 (+14372.41%)
score flow: Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Stars: ✭ 49 (+68.97%)
clip-italian: CLIP (Contrastive Language–Image Pre-training) for Italian
Stars: ✭ 113 (+289.66%)
uvadlc notebooks: Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022
Stars: ✭ 901 (+3006.9%)
robustness-vit: Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+168.97%)
koclip: KoCLIP, a Korean port of OpenAI CLIP in Flax
Stars: ✭ 80 (+175.86%)
get-started-with-JAX: A repo that makes it easy to get started with JAX, Flax, and Haiku. It contains the author's "Machine Learning with JAX" tutorial series (YouTube videos and Jupyter notebooks), along with other content the author found useful while learning the JAX ecosystem.
Stars: ✭ 229 (+689.66%)
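As a taste of the JAX basics these tutorials cover, here is a minimal sketch (assuming `jax` is installed; the toy loss function is an arbitrary example, not taken from the repo):

```python
import jax
import jax.numpy as jnp

# A toy scalar loss: f(x) = sum(x**2)
def loss(x):
    return jnp.sum(x ** 2)

# jax.grad builds a function computing df/dx; jax.jit compiles it with XLA.
grad_fn = jax.jit(jax.grad(loss))

x = jnp.array([1.0, 2.0, 3.0])
print(grad_fn(x))  # gradient of sum(x**2) is 2*x, i.e. [2. 4. 6.]
```

The same `grad`/`jit` composition pattern underlies Flax and Haiku, which layer neural-network abstractions on top of these transformations.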
HugsVision: An easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (+431.03%)
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (+31.03%)
RecipeManagerApi: A tool to manage your family's and friends' recipes like a chef.
Stars: ✭ 32 (+10.34%)
DocSum: A tool to automatically summarize documents abstractively using the BART or PreSumm machine learning models.
Stars: ✭ 58 (+100%)
ingredients: Extract recipe ingredients from any recipe website on the internet.
Stars: ✭ 96 (+231.03%)
anonymisation: Anonymization of French legal cases based on Flair embeddings
Stars: ✭ 85 (+193.1%)
Transformers-Tutorials: A repository of demos built with the Transformers library by Hugging Face.
Stars: ✭ 2,828 (+9651.72%)
molecule-attention-transformer: PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (+58.62%)
nalcos: Search Git commits in natural language
Stars: ✭ 50 (+72.41%)
xpandas: Universal 1D/2D data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-13.79%)
wechsel: Code for "WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models".
Stars: ✭ 39 (+34.48%)
GPJax: A didactic Gaussian process package for researchers, in JAX.
Stars: ✭ 159 (+448.28%)
jaxfg: Factor graphs and nonlinear optimization for JAX
Stars: ✭ 124 (+327.59%)
ml-with-audio: Hugging Face's ML for Audio study group
Stars: ✭ 104 (+258.62%)
transformers-interpret: Model explainability that works seamlessly with 🤗 Transformers. Explain your transformers model in just 2 lines of code.
Stars: ✭ 861 (+2868.97%)
WellcomeML: Machine learning utilities used at the Wellcome Trust
Stars: ✭ 31 (+6.9%)
elastic transformers: Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers
Stars: ✭ 153 (+427.59%)
pysentimiento: A Python multilingual toolkit for sentiment analysis and social NLP tasks
Stars: ✭ 274 (+844.83%)
parsbert-ner: 🤗 ParsBERT for Persian NER tasks
Stars: ✭ 15 (-48.28%)
bert-squeeze: 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (+93.1%)
transformer generalization: The official repository for the paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers", which significantly improves the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (+100%)
Minecord: A lightweight but powerful Minecraft Discord bot with recipe lookup, server pings, name history, skin renders, and more.
Stars: ✭ 30 (+3.45%)
backprop: Backprop makes it simple to use, fine-tune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+689.66%)
birdskitchen: A desktop recipe manager application using Electron & React.
Stars: ✭ 64 (+120.69%)
deepfrog: An NLP suite powered by deep learning
Stars: ✭ 16 (-44.83%)
dm pix: PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+834.48%)
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch
Stars: ✭ 473 (+1531.03%)
Chinese-Minority-PLM: CINO, pre-trained language models for Chinese minority languages
Stars: ✭ 133 (+358.62%)
huggingpics: 🤗🖼️ HuggingPics: Fine-tune Vision Transformers for anything using images found on the web.
Stars: ✭ 161 (+455.17%)
recipes: Application for managing recipes, planning meals, building shopping lists, and much more!
Stars: ✭ 3,570 (+12210.34%)
modules: The official repository for the paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks", which develops a method for analyzing emergent functional modularity in neural networks based on differentiable weight masks and uses it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-13.79%)
code-transformer: Implementation of the paper "Language-agnostic representation learning of source code from structure and context".
Stars: ✭ 130 (+348.28%)