
phlippe / uvadlc_notebooks

License: MIT
Repository of Jupyter notebook tutorials for teaching the Deep Learning Course at the University of Amsterdam (MSc AI), Fall 2022/Spring 2022

Programming Languages

Jupyter Notebook

Projects that are alternatives to or similar to uvadlc_notebooks

get-started-with-JAX
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as the content I found useful while learning about the JAX ecosystem.
Stars: ✭ 229 (-74.58%)
Mutual labels:  flax, jax, optax
chef-transformer
Chef Transformer 🍲.
Stars: ✭ 29 (-96.78%)
Mutual labels:  flax, jax
efficientnet-jax
EfficientNet, MobileNetV3, MobileNetV2, MixNet, etc in JAX w/ Flax Linen and Objax
Stars: ✭ 114 (-87.35%)
Mutual labels:  flax, jax
koclip
KoCLIP: Korean port of OpenAI CLIP, in Flax
Stars: ✭ 80 (-91.12%)
Mutual labels:  flax, jax
score flow
Official code for "Maximum Likelihood Training of Score-Based Diffusion Models", NeurIPS 2021 (spotlight)
Stars: ✭ 49 (-94.56%)
Mutual labels:  flax, jax
jax-rl
JAX implementations of core Deep RL algorithms
Stars: ✭ 61 (-93.23%)
Mutual labels:  flax, jax
jax-resnet
Implementations and checkpoints for ResNet, Wide ResNet, ResNeXt, ResNet-D, and ResNeSt in JAX (Flax).
Stars: ✭ 61 (-93.23%)
Mutual labels:  flax, jax
Transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+6086.68%)
Mutual labels:  flax, jax
Pyprobml
Python code for "Machine learning: a probabilistic perspective" (2nd edition)
Stars: ✭ 4,197 (+365.82%)
Mutual labels:  flax, jax
omd
JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation"
Stars: ✭ 43 (-95.23%)
Mutual labels:  flax, jax
jax-models
Unofficial JAX implementations of deep learning research papers
Stars: ✭ 108 (-88.01%)
Mutual labels:  flax, jax
ionic4-sidemenu-auth
Building a Basic Ionic 4 Login Flow with Angular Router & Side Menu UI
Stars: ✭ 34 (-96.23%)
Mutual labels:  tutorials
react-native-curated
💁‍♂️ Hand-picked collection of packages, tutorials and more for React Native.
Stars: ✭ 43 (-95.23%)
Mutual labels:  tutorials
tutorials
Collection of tutorials for various libraries and technologies
Stars: ✭ 98 (-89.12%)
Mutual labels:  tutorials
YouTube tutorial
I store all the code I used in my YouTube tutorial here. Feel free to download and play around with them 😉
Stars: ✭ 56 (-93.78%)
Mutual labels:  tutorials
deepfillv2-pylightning
Clean minimal implementation of Free-Form Image Inpainting with Gated Convolutions in PyTorch Lightning. Inspired by the PyTorch implementation by @avalonstrel.
Stars: ✭ 13 (-98.56%)
Mutual labels:  pytorch-lightning
docs
Documentation site for LFE
Stars: ✭ 23 (-97.45%)
Mutual labels:  tutorials
deepaudio-speaker
neural network based speaker embedder
Stars: ✭ 19 (-97.89%)
Mutual labels:  pytorch-lightning
UnityTutorials-RTS
The code for my series of tutorials on how to make a real-time strategy (RTS) game in the well-known Unity game engine (with C# scripting)!
Stars: ✭ 256 (-71.59%)
Mutual labels:  tutorials
Tutorials
Start solving PDEs in Julia with Gridap.jl
Stars: ✭ 79 (-91.23%)
Mutual labels:  tutorials

UvA Deep Learning Tutorials

Note: To look at the notebooks in a nicer format, visit our RTD website: https://uvadlc-notebooks.readthedocs.io/en/latest/

Course website: https://uvadlc.github.io/
Course edition: Fall 2022 (Nov. 01 - Dec. 24) - Being kept up to date
Recordings: YouTube Playlist
Author: Phillip Lippe

For this year's course edition, we created a series of Jupyter notebooks that are designed to help you understand the "theory" from the lectures by seeing corresponding implementations. We will visit various topics such as optimization techniques, transformers, graph neural networks, and more (for a full list, see below). The notebooks are there to help you understand the material and teach you details of the PyTorch framework, including PyTorch Lightning. Further, we provide one-to-one translations of the notebooks to JAX+Flax as an alternative framework.
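To give a flavor of the framework style the notebooks teach, here is a minimal, hypothetical PyTorch Lightning module in the pattern the tutorials build on; the architecture and hyperparameters below are placeholders, not taken from any specific notebook:

import torch
import torch.nn as nn
import torch.nn.functional as F
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    """Hypothetical minimal classifier in the style used throughout the tutorials."""

    def __init__(self, num_inputs=784, num_hidden=128, num_classes=10, lr=1e-3):
        super().__init__()
        self.save_hyperparameters()  # stores the arguments for checkpointing/logging
        self.net = nn.Sequential(
            nn.Linear(num_inputs, num_hidden),
            nn.ReLU(),
            nn.Linear(num_hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self(x), y)
        self.log("train_loss", loss)  # Lightning handles the logging backend
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.hparams.lr)

Training then reduces to creating a pl.Trainer and calling trainer.fit(model, train_loader); the JAX+Flax translations of the notebooks mirror the same structure with Flax modules (and typically Optax optimizers).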

The notebooks are presented in the first hour of every group tutorial session. During the tutorial sessions, we will present the content and explain the implementation of the notebooks. You can decide for yourself whether you just want to look at the filled notebook, try it yourself, or code along during the practical session. The notebooks are not directly part of any mandatory assignment on which you would be graded. However, we encourage you to get familiar with the notebooks and experiment with or extend them yourself. Further, the content presented will be relevant for the graded assignment and exam.

The tutorials have been integrated as official tutorials of PyTorch Lightning. Thus, you can also view them in their documentation.

How to run the notebooks

On this website, you will find the notebooks exported to HTML so that you can read them on whatever device you prefer. However, we suggest that you also give them a try and run them yourself. We recommend three main ways of running the notebooks:

  • Locally on CPU: All notebooks are stored in the GitHub repository that also builds this website. You can find them here: https://github.com/phlippe/uvadlc_notebooks/tree/master/docs/tutorial_notebooks. The notebooks are designed so that you can execute them on common laptops without needing a GPU. We provide pretrained models that are automatically downloaded when running the notebooks, or can be downloaded manually from this Google Drive. The required disk space for the pretrained models and datasets is less than 1GB. To ensure that you have all the required Python packages installed, we provide a conda environment in the same repository (choose the CPU or GPU version depending on your system).

  • Google Colab: If you prefer to run the notebooks on a different platform than your own computer, or want to experiment with GPU support, we recommend using Google Colab. Each notebook on this documentation website has a badge with a link to open it on Google Colab. Remember to enable GPU support before running the notebook (Runtime -> Change runtime type). Each notebook can be executed independently and doesn't require you to connect your Google Drive or similar. However, when closing the session, your changes might be lost if you don't save them to your local computer or have copied the notebook to your Google Drive beforehand.

  • Lisa cluster: If you want to train your own (larger) neural networks based on the notebooks, you can make use of the Lisa cluster. However, this is only suggested if you really want to train a new model, and use the other two options to go through the discussion and analysis of the models. Your student account on Lisa might not allow you to run Jupyter notebooks directly on the gpu_shared partition. Instead, you can first convert a notebook to a script using jupyter nbconvert --to script ...ipynb, and then start a job on Lisa that runs the script. A few pieces of advice for running on Lisa (see the sketch after this list):

    • Disable the tqdm statements in the notebook. Otherwise, your slurm output file might overflow and grow to several MB. In PyTorch Lightning, you can do this by setting progress_bar_refresh_rate=0 in the trainer.
    • Comment out the matplotlib plotting statements, or change plt.show() to plt.savefig(...).
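As a rough sketch of these two adjustments (the epoch count, plot data, and file name below are placeholders; progress_bar_refresh_rate refers to the PyTorch Lightning version used in the course, while newer releases use enable_progress_bar=False instead):

import matplotlib
matplotlib.use("Agg")  # render to files only; no display is attached on the cluster
import matplotlib.pyplot as plt
import pytorch_lightning as pl

# Disable the tqdm progress bar so the slurm output file stays small
trainer = pl.Trainer(max_epochs=10, progress_bar_refresh_rate=0)
# trainer.fit(model, train_loader)  # model/loader as defined in the converted script

# Save figures to disk instead of opening an interactive window
fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0.9, 0.5, 0.3])  # placeholder loss curve
fig.savefig("loss_curve.png")  # instead of plt.show()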

Tutorial-Lecture alignment

We will discuss 7 of the tutorials in the course, spread across lectures to cover something from every area. You can align the tutorials with the lectures based on their topics. The list of tutorials is:

  • Guide 1: Working with the Lisa cluster
  • Tutorial 2: Introduction to PyTorch
  • Tutorial 3: Activation functions
  • Tutorial 4: Optimization and Initialization
  • Tutorial 5: Inception, ResNet and DenseNet
  • Tutorial 6: Transformers and Multi-Head Attention
  • Tutorial 7: Graph Neural Networks
  • Tutorial 8: Deep Energy Models
  • Tutorial 9: Autoencoders
  • Tutorial 10: Adversarial attacks
  • Tutorial 11: Normalizing Flows on image modeling
  • Tutorial 12: Autoregressive Image Modeling
  • Tutorial 15: Vision Transformers
  • Tutorial 16: Meta Learning - Learning to Learn
  • Tutorial 17: Self-Supervised Contrastive Learning with SimCLR

Feedback, Questions or Contributions

This is the first time we are presenting these tutorials during the Deep Learning course. As with any other project, small bugs and issues are expected. We appreciate any feedback from students, whether it is about a spelling mistake, an implementation bug, or suggestions for improvements/additions to the notebooks. Please use the following link to submit feedback, or feel free to reach out to me directly by mail (p dot lippe at uva dot nl) or grab me during any TA session.

If you find the tutorials helpful and would like to cite them, you can use the following bibtex:

@misc{lippe2022uvadlc,
   title        = {{UvA Deep Learning Tutorials}},
   author       = {Phillip Lippe},
   year         = 2022,
   howpublished = {\url{https://uvadlc-notebooks.readthedocs.io/en/latest/}}
}