score sde pytorch - PyTorch implementation of "Score-Based Generative Modeling through Stochastic Differential Equations" (ICLR 2021, Oral)
Stars: ✭ 755 (+1440.82%)
Pyprobml - Python code for "Machine Learning: A Probabilistic Perspective" (2nd edition)
Stars: ✭ 4,197 (+8465.31%)
jax-models - Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (+120.41%)
jax-resnet - Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
Stars: ✭ 61 (+24.49%)
awesome-ebm - A collection of research materials on EBMs/EBL (Energy-Based Models, Energy-Based Learning)
Stars: ✭ 143 (+191.84%)
koclip - KoCLIP: Korean port of OpenAI CLIP, in Flax
Stars: ✭ 80 (+63.27%)
efficientnet-jax - EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc. in JAX with Flax Linen and Objax
Stars: ✭ 114 (+132.65%)
uvadlc notebooks - Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022
Stars: ✭ 901 (+1738.78%)
jax-rl - JAX implementations of core Deep RL algorithms
Stars: ✭ 61 (+24.49%)
Transformers - 🤗 Transformers: State-of-the-art machine learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+113659.18%)
omd - JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation"
Stars: ✭ 43 (-12.24%)
get-started-with-JAX - A repo that makes it easy to get started with JAX, Flax, and Haiku. It contains the "Machine Learning with JAX" tutorial series (YouTube videos and Jupyter notebooks) as well as content the author found useful while learning the JAX ecosystem.
Stars: ✭ 229 (+367.35%)
cflow-ad - Official PyTorch code for the WACV 2022 paper "CFLOW-AD: Real-Time Unsupervised Anomaly Detection with Localization via Conditional Normalizing Flows"
Stars: ✭ 138 (+181.63%)
madam - 👩 PyTorch and JAX code for the Madam optimiser.
Stars: ✭ 46 (-6.12%)
ifl-tpp - Implementation of "Intensity-Free Learning of Temporal Point Processes" (Spotlight @ ICLR 2020)
Stars: ✭ 58 (+18.37%)
SoCo - [NeurIPS 2021 Spotlight] Aligning Pretraining for Detection via Object-Level Contrastive Learning
Stars: ✭ 125 (+155.1%)
UMNN - Implementation of Unconstrained Monotonic Neural Networks and the related experiments. These architectures are particularly useful for modelling monotonic transformations in normalizing flows.
Stars: ✭ 63 (+28.57%)
semi-supervised-NFs - Code for the paper "Semi-Conditional Normalizing Flows for Semi-Supervised Learning"
Stars: ✭ 23 (-53.06%)
DiGCL - The PyTorch implementation of Directed Graph Contrastive Learning (DiGCL), NeurIPS 2021
Stars: ✭ 27 (-44.9%)
bayex - Bayesian Optimization in JAX
Stars: ✭ 24 (-51.02%)
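Bayesian optimization libraries such as this one typically pick the next evaluation point by maximizing an acquisition function over a Gaussian-process posterior. A common choice is expected improvement; the sketch below is a generic, plain-Python illustration of that formula (it is not bayex's actual API, and the function name is made up for this example):

```python
import math

def expected_improvement(mu, sigma, best):
    """Expected improvement of a Gaussian posterior N(mu, sigma^2)
    over the current best observed value, for maximization:
    EI = (mu - best) * Phi(z) + sigma * phi(z), with z = (mu - best) / sigma."""
    if sigma == 0.0:
        # No posterior uncertainty: improvement is deterministic.
        return max(mu - best, 0.0)
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (mu - best) * cdf + sigma * pdf
```

The optimizer would evaluate this at many candidate points and query the objective where EI is largest, trading off the posterior mean (exploitation) against its standard deviation (exploration).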
robustness-vit - Code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+59.18%)
GPJax - A didactic Gaussian process package for researchers in JAX.
Stars: ✭ 159 (+224.49%)
unsup-parts - Unsupervised Part Discovery from Contrastive Reconstruction (NeurIPS 2021)
Stars: ✭ 35 (-28.57%)
SemiSeg-AEL - Semi-Supervised Semantic Segmentation via Adaptive Equalization Learning, NeurIPS 2021 (Spotlight)
Stars: ✭ 79 (+61.22%)
WaveGrad2 - PyTorch implementation of Google Brain's WaveGrad 2: Iterative Refinement for Text-to-Speech Synthesis
Stars: ✭ 55 (+12.24%)
Normalizing Flows - Implementation of normalizing flows on MNIST (https://arxiv.org/abs/1505.05770)
Stars: ✭ 14 (-71.43%)
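For orientation, the linked paper (arXiv 1505.05770) builds flows from simple invertible maps with cheap log-determinants, such as the planar flow f(z) = z + u·tanh(wᵀz + b). A minimal pure-Python sketch of one such transform (variable names are illustrative, not taken from the repo):

```python
import math

def planar_flow(z, u, w, b):
    """Apply one planar flow f(z) = z + u * tanh(w.z + b) and return
    (f(z), log|det df/dz|). z, u, w are plain Python lists; b is a float."""
    a = sum(wi * zi for wi, zi in zip(w, z)) + b        # w.z + b
    h = math.tanh(a)
    out = [zi + ui * h for zi, ui in zip(z, u)]         # f(z)
    h_prime = 1.0 - h * h                               # tanh'(a)
    psi_dot_u = h_prime * sum(wi * ui for wi, ui in zip(w, u))
    log_det = math.log(abs(1.0 + psi_dot_u))            # |det| = |1 + u.psi(z)|
    return out, log_det
```

Stacking several such transforms and summing the log-determinants gives the log-density change of the full flow, which is what makes maximum-likelihood training tractable.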
DiffuseVAE - A combination of VAEs and diffusion models for efficient, controllable, and high-fidelity generation from low-dimensional latents
Stars: ✭ 81 (+65.31%)
dm pix - PIX is an image processing library in JAX, for JAX.
Stars: ✭ 271 (+453.06%)
sliced score matching - Code for reproducing the results in the sliced score matching paper (UAI 2019)
Stars: ✭ 68 (+38.78%)
SymJAX - Documentation:
Stars: ✭ 103 (+110.2%)
fedpa - Federated posterior averaging implemented in JAX
Stars: ✭ 38 (-22.45%)
MongeAmpereFlow - Continuous-time gradient flow for generative modeling and variational inference
Stars: ✭ 29 (-40.82%)
ShinRL - A Library for Evaluating RL Algorithms from Theoretical and Practical Perspectives (Deep RL Workshop 2021)
Stars: ✭ 30 (-38.78%)
MISE - Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+336.73%)
flaxOptimizers - A collection of optimizers for Flax, some arcane, others well known.
Stars: ✭ 21 (-57.14%)
deeprob-kit - A Python Library for Deep Probabilistic Modeling
Stars: ✭ 32 (-34.69%)
rA9 - JAX-based Spiking Neural Network framework
Stars: ✭ 60 (+22.45%)
continuous-time-flow-process - PyTorch code for "Modeling Continuous Stochastic Processes with Dynamic Normalizing Flows" (NeurIPS 2020)
Stars: ✭ 34 (-30.61%)
pcan - Prototypical Cross-Attention Networks for Multiple Object Tracking and Segmentation, NeurIPS 2021 Spotlight
Stars: ✭ 294 (+500%)
ML-Optimizers-JAX - Toy implementations of some popular ML optimizers using Python/JAX
Stars: ✭ 37 (-24.49%)
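To give a flavor of what such toy optimizer implementations look like, here is a minimal SGD-with-momentum step in plain Python (the repo itself uses JAX; the function name and parameter defaults here are illustrative, not the repo's):

```python
def sgd_momentum_step(params, grads, velocity, lr=0.1, beta=0.9):
    """One SGD-with-momentum update on flat lists of floats:
    v <- beta * v + g;  p <- p - lr * v."""
    new_velocity = [beta * v + g for v, g in zip(velocity, grads)]
    new_params = [p - lr * v for p, v in zip(params, new_velocity)]
    return new_params, new_velocity

# Minimizing f(p) = p^2 (gradient 2p), starting from p = 1.0:
p, v = [1.0], [0.0]
for _ in range(50):
    g = [2.0 * p[0]]
    p, v = sgd_momentum_step(p, g, v)
```

Writing the update as a pure function of (params, grads, state) mirrors how JAX optimizer libraries structure their step functions, which is what makes them easy to `jit` and `vmap`.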
normalizing-flows - Implementations of normalizing flows using Python and TensorFlow
Stars: ✭ 15 (-69.39%)
mlp-gpt-jax - A GPT made only of MLPs, in JAX
Stars: ✭ 53 (+8.16%)
Entity-Graph-VLN - Code for the NeurIPS 2021 paper "Language and Visual Entity Relationship Graph for Agent Navigation"
Stars: ✭ 34 (-30.61%)
brax - Massively parallel rigid-body physics simulation on accelerator hardware.
Stars: ✭ 1,208 (+2365.31%)
jaxdf - A JAX-based research framework for writing differentiable numerical simulators with arbitrary discretizations
Stars: ✭ 50 (+2.04%)
cisip-FIRe - Fast Image Retrieval (FIRe) is an open-source project to promote image retrieval research. It implements most of the major binary hashing methods to date, together with popular backbone networks and public datasets.
Stars: ✭ 40 (-18.37%)
NeuroSEED - Implementation of Neural Distance Embeddings for Biological Sequences (NeuroSEED) in PyTorch (NeurIPS 2021)
Stars: ✭ 40 (-18.37%)
wax-ml - A Python library for machine learning and feedback loops on streaming data
Stars: ✭ 36 (-26.53%)
jax-cfd - Computational Fluid Dynamics in JAX
Stars: ✭ 399 (+714.29%)
annotated-s4 - Implementation of the Annotated S4 (https://srush.github.io/annotated-s4)
Stars: ✭ 133 (+171.43%)