score flow: Official code for "Maximum Likelihood Training of Score-Based Diffusion Models" (NeurIPS 2021, spotlight)
Stars: ✭ 49 (-38.75%)
jax-models: Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+35%)
Transformers: 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+69577.5%)
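As a quick taste of what the Transformers library provides, here is a minimal sketch using its documented `pipeline` API (the default sentiment model is chosen by the library and downloaded on first use):

```python
# Minimal sketch of the Hugging Face Transformers pipeline API.
from transformers import pipeline

# Downloads a default sentiment-analysis model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("JAX makes numerical computing a joy."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```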
jax-rl: JAX implementations of core Deep RL algorithms
Stars: ✭ 61 (-23.75%)
Pyprobml: Python code for "Machine learning: a probabilistic perspective" (2nd edition)
Stars: ✭ 4,197 (+5146.25%)
omd: JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation"
Stars: ✭ 43 (-46.25%)
efficientnet-jax: EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX with Flax Linen and Objax
Stars: ✭ 114 (+42.5%)
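For readers unfamiliar with Flax Linen, the module style this repo builds on looks like the following generic sketch (illustrative only; not efficientnet-jax's own API):

```python
# A generic Flax Linen module, to show the style efficientnet-jax uses.
import jax
import jax.numpy as jnp
import flax.linen as nn

class MLP(nn.Module):
    hidden: int

    @nn.compact
    def __call__(self, x):
        x = nn.relu(nn.Dense(self.hidden)(x))
        return nn.Dense(1)(x)

model = MLP(hidden=16)
params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 8)))  # initialize parameters
y = model.apply(params, jnp.ones((1, 8)))                     # forward pass
```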
uvadlc notebooks: Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022
Stars: ✭ 901 (+1026.25%)
get-started-with-JAX: The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" tutorial series (YouTube videos and Jupyter notebooks) as well as content I found useful while learning the JAX ecosystem.
Stars: ✭ 229 (+186.25%)
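The two JAX primitives that most getting-started material (including this series) leads with are `grad` and `jit`; a minimal self-contained example:

```python
# Automatic differentiation and XLA compilation, the core of the JAX basics.
import jax
import jax.numpy as jnp

def loss(w):
    return jnp.sum((w * 2.0 - 1.0) ** 2)

grad_loss = jax.grad(loss)        # gradient function via autodiff
fast_grad = jax.jit(grad_loss)    # compile it with XLA
print(fast_grad(jnp.ones(3)))     # [4. 4. 4.]
```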
jax-resnet: Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
Stars: ✭ 61 (-23.75%)
dm pix: PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+238.75%)
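A short sketch of PIX in use, assuming a channels-last float image and the `flip_left_right` / `gaussian_blur` functions from its documentation:

```python
# Sketch of dm_pix on a channels-last (H, W, C) float image.
import jax.numpy as jnp
import dm_pix as pix

image = jnp.zeros((128, 128, 3))                              # placeholder image
flipped = pix.flip_left_right(image)                          # horizontal flip
blurred = pix.gaussian_blur(image, sigma=1.0, kernel_size=5)  # 5x5 Gaussian blur
```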
RECCON: This repository contains the dataset and the PyTorch implementations of the models from the paper "Recognizing Emotion Cause in Conversations".
Stars: ✭ 126 (+57.5%)
rA9: JAX-based Spiking Neural Network framework
Stars: ✭ 60 (-25%)
Transformer-QG-on-SQuAD: Implements a question generator with SOTA pre-trained language models (RoBERTa, BERT, GPT, BART, T5, etc.)
Stars: ✭ 28 (-65%)
pytorch-vit: "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale"
Stars: ✭ 250 (+212.5%)
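The "image as 16x16 words" idea in that paper boils down to a patch-embedding step, sketched here in plain PyTorch (illustrative; not taken from the pytorch-vit repo itself):

```python
# ViT patch embedding: one 768-d token per 16x16 image patch.
import torch
import torch.nn as nn

to_patches = nn.Conv2d(3, 768, kernel_size=16, stride=16)  # strided conv = patchify + embed
x = torch.randn(1, 3, 224, 224)
tokens = to_patches(x).flatten(2).transpose(1, 2)          # shape (1, 196, 768)
```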
PoLitBert: Polish RoBERTa model trained on Polish literature, Wikipedia, and Oscar, on the assumption that high-quality text yields a good model.
Stars: ✭ 25 (-68.75%)
feed forward vqgan clip: Feed-forward VQGAN-CLIP model, where the goal is to eliminate the need to optimize the VQGAN latent space for each input prompt
Stars: ✭ 135 (+68.75%)
image-classification: A collection of SOTA image classification models in PyTorch
Stars: ✭ 70 (-12.5%)
transformer-ls: Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (+151.25%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+136.25%)
towhee: Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+926.25%)
ShinRL: A Library for Evaluating RL Algorithms from Theoretical and Practical Perspectives (Deep RL Workshop 2021)
Stars: ✭ 30 (-62.5%)
ADAM: Implements a collection of algorithms for calculating rigid-body dynamics in JAX, CasADi, PyTorch, and NumPy.
Stars: ✭ 51 (-36.25%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+665%)
InterpretDL: Interpretation of deep learning models; a model interpretability algorithm library based on PaddlePaddle (飞桨).
Stars: ✭ 121 (+51.25%)
mlp-gpt-jax: A GPT, made only of MLPs, in JAX
Stars: ✭ 53 (-33.75%)
ImageNet21K: Official PyTorch implementation of the paper "ImageNet-21K Pretraining for the Masses" (NeurIPS 2021)
Stars: ✭ 565 (+606.25%)
robustness-vit: Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (-2.5%)
mobilevit-pytorch: A PyTorch implementation of "MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer".
Stars: ✭ 349 (+336.25%)
roberta-wwm-base-distill: A RoBERTa-wwm base model distilled from RoBERTa-wwm-large
Stars: ✭ 61 (-23.75%)
pytorch-cifar-model-zoo: Implementations of conv-based and ViT-based networks designed for CIFAR.
Stars: ✭ 62 (-22.5%)
jax-cfd: Computational Fluid Dynamics in JAX
Stars: ✭ 399 (+398.75%)
Text-Summarization: Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-52.5%)
vietnamese-roberta: A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-72.5%)
LaTeX-OCR: pix2tex, using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+1857.5%)
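Programmatic use, following the usage documented in the LaTeX-OCR README (the API may differ across versions, and the image path here is a placeholder):

```python
# Convert an image of an equation to a LaTeX string with pix2tex.
from PIL import Image
from pix2tex.cli import LatexOCR

model = LatexOCR()              # loads the ViT-based recognition model
img = Image.open("equation.png")
print(model(img))               # prints the predicted LaTeX string
```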
libai: LiBai (李白): A Toolbox for Large-Scale Distributed Parallel Training
Stars: ✭ 284 (+255%)
Transformers-Tutorials: This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+3435%)
zero-shot-object-tracking: Object tracking implemented with the Roboflow Inference API, DeepSort, and OpenAI CLIP.
Stars: ✭ 242 (+202.5%)
wax-ml: A Python library for machine learning and feedback loops on streaming data
Stars: ✭ 36 (-55%)
Evo-ViT: Official implementation of "Evo-ViT: Slow-Fast Token Evolution for Dynamic Vision Transformer"
Stars: ✭ 50 (-37.5%)
VT-UNet: Official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation" (MICCAI 2022)
Stars: ✭ 151 (+88.75%)
openroberta-lab: The programming environment »Open Roberta Lab« by Fraunhofer IAIS enables children and adolescents to program robots. A variety of programming blocks are provided for programming the robot's motors and sensors. Open Roberta Lab uses a graphical programming approach so that beginners can start coding seamlessly. As a cloud-based applica…
Stars: ✭ 98 (+22.5%)
RoBERTaABSA: Implementation of the paper "Does Syntax Matter? A Strong Baseline for Aspect-based Sentiment Analysis with RoBERTa".
Stars: ✭ 112 (+40%)
graphsignal: Graphsignal Python agent
Stars: ✭ 158 (+97.5%)
GPJax: A didactic Gaussian process package for researchers in JAX.
Stars: ✭ 159 (+98.75%)
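GPJax's own API has changed across releases, so here is the underlying object it wraps, sketched in plain JAX: one draw from a zero-mean GP prior with an RBF kernel (illustrative only; not GPJax's API):

```python
# Sample from a zero-mean GP prior with an RBF kernel, in plain JAX.
import jax.numpy as jnp
from jax import random

def rbf(x1, x2, lengthscale=1.0):
    sqdist = (x1[:, None] - x2[None, :]) ** 2
    return jnp.exp(-sqdist / (2.0 * lengthscale**2))

xs = jnp.linspace(-3.0, 3.0, 50)
K = rbf(xs, xs) + 1e-6 * jnp.eye(50)   # jitter for numerical stability
f = random.multivariate_normal(random.PRNGKey(0), jnp.zeros(50), K)
```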
Flaxengine: Flax Engine – multi-platform 3D game engine
Stars: ✭ 3,127 (+3808.75%)
CLIP-Guided-Diffusion: Just playing with getting CLIP Guided Diffusion running locally, rather than having to use Colab.
Stars: ✭ 328 (+310%)
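The CLIP half of that pipeline, scoring image-text similarity, follows the usage from the openai/clip README (the image path and prompts below are placeholders):

```python
# Score image-text similarity with OpenAI's CLIP, the model that steers
# the diffusion process in this repo.
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

image = preprocess(Image.open("photo.png")).unsqueeze(0).to(device)
text = clip.tokenize(["a photo of a dog", "a photo of a cat"]).to(device)

with torch.no_grad():
    logits_per_image, _ = model(image, text)
    probs = logits_per_image.softmax(dim=-1)   # similarity as probabilities
```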
madam: 👩 PyTorch and JAX code for the Madam optimiser.
Stars: ✭ 46 (-42.5%)
fedpa: Federated posterior averaging implemented in JAX
Stars: ✭ 38 (-52.5%)