
deepmipt / Tdl

Course "Theories of Deep Learning"

Projects that are alternatives to or similar to Tdl

Bitcoin trading bot
This is the code for "Bitcoin Trading Bot" By Siraj Raval on Youtube
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Notes
Contains Example Programs and Notebooks for some courses at Bogazici University, Department of Computer Engineering
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Csharp with csharpfritz
Show notes, slides, and samples from the CSharp with CSharpFritz show
Stars: ✭ 164 (+0%)
Mutual labels:  jupyter-notebook
Repo 2018
Deep Learning Summer School + Tensorflow + OpenCV cascade training + YOLO + COCO + CycleGAN + AWS EC2 Setup + AWS IoT Project + AWS SageMaker + AWS API Gateway + Raspberry Pi3 Ubuntu Core
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Imagecompletion Dcgan
Image completion using deep convolutional generative adversarial nets in tensorflow
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Blog posts
Blog posts for matatat.org
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Bigdata docker
Big Data Ecosystem Docker
Stars: ✭ 161 (-1.83%)
Mutual labels:  jupyter-notebook
Competition Baseline
Knowledge, code, and ideas for data science competitions
Stars: ✭ 2,553 (+1456.71%)
Mutual labels:  jupyter-notebook
Neuralnets
Deep Learning libraries tested on images and time series
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Dl Course
Deep Learning with Catalyst
Stars: ✭ 162 (-1.22%)
Mutual labels:  jupyter-notebook
Ai Toolkit Iot Edge
AI Toolkit for Azure IoT Edge
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Learnpythonforresearch
This repository provides everything you need to get started with Python for (social science) research.
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Machine Learning
Machine learning library written in readable python code
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Keraspersonlab
Keras-tensorflow implementation of PersonLab (https://arxiv.org/abs/1803.08225)
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Workshops
Workshops organized to introduce students to security, AI, AR/VR, hardware and software
Stars: ✭ 162 (-1.22%)
Mutual labels:  jupyter-notebook
Coursera Machine Learning Solutions Python
A repository with solutions to the assignments on Andrew Ng's machine learning MOOC on Coursera
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Scientific graphics in python
An electronic textbook on scientific graphics in Python
Stars: ✭ 163 (-0.61%)
Mutual labels:  jupyter-notebook
Maml Jax
Implementation of Model-Agnostic Meta-Learning (MAML) in Jax
Stars: ✭ 164 (+0%)
Mutual labels:  jupyter-notebook
Ta
Technical Analysis Library using Pandas and Numpy
Stars: ✭ 2,649 (+1515.24%)
Mutual labels:  jupyter-notebook
Tensorflow2 Crash Course
A quick crash course in understanding the essentials of TensorFlow 2 and the integrated Keras API
Stars: ✭ 162 (-1.22%)
Mutual labels:  jupyter-notebook

TDL logo

This is the GitHub page of the Theoretical Deep Learning course held by the Neural Networks and Deep Learning Lab, MIPT. The working language of this course is Russian.

The second part: there is a second part of this course devoted to generalization ability. The first part of this course is NOT a prerequisite for the second part.

Location: Moscow Institute of Physics and Technology, ФИЗТЕХ.ЦИФРА building, room 5-22 (earlier venues this semester: Phystech.Bio building, room 512; ФИЗТЕХ.ЦИФРА rooms 5-6 and 5-19).

Time: Friday, 10:45, starting February 15, 2019.

Videos are available here.

Lecture slides, homework assignments and videos will appear in this repo and will be available for everyone. However, we can guarantee that we will check your homework only if you are a MIPT student.

For MIPT students: in case you want to add this course to your personal syllabus, the real name of the course is "Теоретический анализ подходов глубокого обучения" ("Theoretical analysis of deep learning approaches").

Further announcements will be in our Telegram chat: https://t.me/joinchat/D_ljjxJHIrD8IuFvfqVLPw

Syllabus:

This syllabus is not final and may change. The order of topics will change with high probability.

  1. 15.02.2019 Introduction. Loss landscape of linear networks.

  2. 22.02.2019 Loss landscape of linear networks.

  3. 1.03.2019 Loss landscape of linear res-nets. Loss landscape of wide, but shallow sigmoid nets.

  4. 8.03.2019 No class.

  5. 15.03.2019 Loss landscape of deep and wide sigmoid nets.

  6. 22.03.2019 Spin-glass model. Elimination of local minima.

  7. 29.03.2019 GD almost surely converges to local minima. Noisy GD converges to local minima for any initialization. Cancelled.

  8. 5.04.2019 General convergence guarantees for GD and its variants.

  9. 12.04.2019 GD dynamics on wide shallow ReLU nets.

  10. 19.04.2019 (Ivan Skorokhodov) The information bottleneck method. Part 1.

  11. 26.04.2019 (Ivan Skorokhodov) The information bottleneck method. Part 2.

  12. 3.05.2019 No class.

  13. 10.05.2019 No class.

  14. 17.05.2019 Neural nets in the limit of infinite width. Homework solution demonstration.

Prerequisites:

  • Basic calculus / probability / linear algebra (matrix differentiation, SVD, eigenvalues, eigenvectors, Hessian, Markov's inequality)
  • Labs are given as jupyter notebooks
  • We use Python 3; familiarity with numpy, pytorch, and matplotlib is needed (a quick self-check sketch follows this list)
  • Some experience in DL (this should not be your first time training on MNIST; familiarity with words such as BatchNorm, ResNet, and Dropout)
  • Labs can be done on a CPU, but training may then take quite a long time (~1-2 days).
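
If you want a quick self-check of the assumed toolkit, the sketch below is not part of the course materials; it merely illustrates the expected level of numpy/pytorch fluency (an SVD reconstruction and a Hessian via autograd):

```python
import numpy as np
import torch

# SVD of a random matrix and a reconstruction check (numpy).
A = np.random.randn(4, 3)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# Hessian of f(x) = sum(x_i^2) via pytorch autograd: should equal 2 * I.
f = lambda x: (x ** 2).sum()
H = torch.autograd.functional.hessian(f, torch.randn(3))
assert torch.allclose(H, 2 * torch.eye(3))
```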

Grading:

This course will contain one lab and four theoretical assignments. There will also be an oral exam (in the form of an interview) at the end of the course.

Let p_{hw} = "your points for homeworks" / "total possible points for homeworks (excluding extra points)". Define p_{exam} analogously.

Your final grade will be computed as follows: grade = min(10, p_{hw} * k_{hw} + p_{exam} * k_{exam}), where the coefficients are as follows (a worked example appears below):

  • k_{hw} = 8
  • k_{exam} = 4

These numbers are not final and may change slightly.
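
For concreteness, here is a minimal sketch of the grading arithmetic with the coefficients above; the function name and the sample scores are illustrative only, not part of the course rules:

```python
def final_grade(hw_points, hw_total, exam_points, exam_total, k_hw=8, k_exam=4):
    """grade = min(10, p_hw * k_hw + p_exam * k_exam)."""
    p_hw = hw_points / hw_total        # hw_total excludes extra points
    p_exam = exam_points / exam_total
    return min(10, p_hw * k_hw + p_exam * k_exam)

# Example: 75% of the homework points plus a perfect exam:
# min(10, 0.75 * 8 + 1.0 * 4) = min(10, 10.0) = 10.
print(final_grade(hw_points=75, hw_total=100, exam_points=10, exam_total=10))
```

Note that a perfect exam alone contributes at most k_{exam} = 4 points, so the homework carries most of the grade.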

The deadline for each assignment is computed as follows: the day the assignment appears on this page, 23:59 Moscow time, plus 3 weeks. All deadlines are strict.
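
A minimal sketch of this deadline arithmetic (the publication date below is hypothetical; zoneinfo requires Python 3.9+):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

# Hypothetical publication moment: 23:59 Moscow time on the day of posting.
published = datetime(2019, 2, 23, 23, 59, tzinfo=ZoneInfo("Europe/Moscow"))
deadline = published + timedelta(weeks=3)
print(deadline.strftime("%d.%m.%Y %H:%M"))  # 16.03.2019 23:59
```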

New homework assignments will appear no more often than once a week.

Send your homeworks to [email protected]

E-mail subjects should follow the pattern "Lab or theory" + "number" + "-" + "Your name and surname", e.g., "Theory 2 - Ivan Ivanov".

Current grades live here.

The exam (actually, a zachet with a grade) will take place during zachet week. The exam syllabus lives here.

Homeworks:

The first theoretical assignment is out! Deadline: 16.03.2019 23:59 Moscow time.

The first lab assignment is out! Deadline: 30.03.2019 23:59 Moscow time.

The second theoretical assignment is out! Deadline: 16.04.2019 23:59 Moscow time.

The third theoretical assignment is out! Deadline: 13.05.2019 23:59 Moscow time.

The fourth theoretical assignment is out! Deadline: 21.05.2019 23:59 Moscow time.

Theoretical assignments live here. Labs live here.

Course staff:

  • Eugene Golikov - course admin, lectures, homeworks
  • Ivan Skorokhodov - homework review and beta-testing, lecture on the information bottleneck, off-screen comments

We also thank Mikhail Arkhipov for gingerbread operations.

This course is dedicated to the memory of Maksim Kretov | 30.12.1986 - 13.02.2019, without whom this course would have never been created.
