
hhaji / Deep Learning

Course: Deep Learning

Projects that are alternatives of or similar to Deep Learning

Algorithms With Python
Solving the fundamentals of algorithms using Python
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Python Ecology Lesson
Data Analysis and Visualization in Python for Ecologists
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Blog
Source code for my personal blog
Stars: ✭ 117 (+0%)
Mutual labels:  jupyter-notebook
Ebookml src
Source code in ebook Machine Learning
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Advanced training
Advanced Scikit-learn training session
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Snns
Tutorials and implementations for "Self-normalizing networks"
Stars: ✭ 1,525 (+1203.42%)
Mutual labels:  jupyter-notebook
Rethinking Pyro
Statistical Rethinking with PyTorch and Pyro
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Speechcmdrecognition
A neural attention model for speech command recognition
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Deeplearningmodels
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
How To Build Own Text Summarizer Using Deep Learning
In this notebook, we will build an abstractive text summarizer using deep learning, from scratch, in Python using Keras
Stars: ✭ 117 (+0%)
Mutual labels:  jupyter-notebook
Pygoturn
PyTorch implementation of GOTURN object tracker: Learning to Track at 100 FPS with Deep Regression Networks (ECCV 2016)
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Demo Docker
Demo notebooks inside a docker for end-to-end examples
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Underwater detection
2020 National Underwater Robot Competition (Zhanjiang)
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Dogbreed gluon
kaggle Dog Breed Identification
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Objectdetection
Some experiments with object detection in PyTorch
Stars: ✭ 117 (+0%)
Mutual labels:  jupyter-notebook
Ds bowl 2018
Kaggle Data Science Bowl 2018
Stars: ✭ 116 (-0.85%)
Mutual labels:  jupyter-notebook
Data Science 45min Intros
IPython notebook presentations for getting started with basic programming, statistics, and machine learning techniques
Stars: ✭ 1,513 (+1193.16%)
Mutual labels:  jupyter-notebook
Hands On Data Analysis With Pandas
Materials for following along with Hands-On Data Analysis with Pandas.
Stars: ✭ 117 (+0%)
Mutual labels:  jupyter-notebook
Ruijin round1
Ruijin Hospital MMC Artificial Intelligence-Assisted Knowledge Graph Construction Competition, preliminary round
Stars: ✭ 117 (+0%)
Mutual labels:  jupyter-notebook
Theseus growth
Theseus is a Python library for cohort analysis
Stars: ✭ 117 (+0%)
Mutual labels:  jupyter-notebook

Deep Learning

Deep Learning Using PyTorch

Lecturer: Hossein Hajiabolhassan

Data Science Center

Shahid Beheshti University
Teaching Assistants:
Behnaz H.M. Hoseyni, Yavar T. Yeganeh, Mostafa Khodayari, Esmail Mafakheri

Index:


Course Overview:

In this course, you will learn the foundations of Deep Learning, understand how to build 
neural networks, and learn how to lead successful machine learning projects. You will learn 
about Convolutional networks, RNNs, LSTM, Adam, Dropout, BatchNorm, and more.
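
As a taste of what the course builds toward, the following is a minimal, hedged sketch (not part of the official course material) of a small PyTorch classifier that uses several of the components mentioned above: BatchNorm, Dropout, and the Adam optimizer. Layer sizes and hyperparameters are illustrative only.

    import torch
    import torch.nn as nn

    # A small fully connected classifier illustrating BatchNorm and Dropout.
    model = nn.Sequential(
        nn.Linear(784, 256),   # e.g., flattened 28x28 images
        nn.BatchNorm1d(256),   # batch normalization
        nn.ReLU(),
        nn.Dropout(p=0.5),     # dropout regularization
        nn.Linear(256, 10),    # 10 output classes
    )

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # Adam optimizer
    loss_fn = nn.CrossEntropyLoss()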

Main TextBooks:

Book 1 Book 2 Book 3 Book 4 Book 5

Additional TextBooks:

Slides and Papers:

Recommended Slides & Papers:

  1. Introduction (1 Session)

Required Reading:
Suggested Reading:
Additional Resources:
  • Video of lecture by Ian Goodfellow and discussion of Chapter 1 at a reading group in San Francisco organized by Alena Kruchkova
  • Paper: On the Origin of Deep Learning by Haohan Wang and Bhiksha Raj
Applied Mathematics and Machine Learning Basics:
  2. Toolkit Lab 1: Google Colab and Anaconda (1 Session)

Required Reading:
Suggested Reading:
Additional Resources:
  3. Toolkit Lab 2: Getting Started with PyTorch (2 Sessions)

Required Reading:
Suggested Reading:
Additional Resources:
  • Blog: Learning PyTorch with Examples by Justin Johnson. This tutorial introduces the fundamental concepts of PyTorch through self-contained examples.
Building Dynamic Models Using the Subclassing API:
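
To give a flavor of the "subclassing" style referred to above, here is a minimal sketch, assuming PyTorch (the class name and layer sizes are hypothetical), of building a dynamic model by subclassing nn.Module:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TwoLayerNet(nn.Module):
        """A hypothetical two-layer network defined via the subclassing API."""
        def __init__(self, d_in, h, d_out):
            super().__init__()
            self.fc1 = nn.Linear(d_in, h)
            self.fc2 = nn.Linear(h, d_out)

        def forward(self, x):
            # Arbitrary Python control flow is allowed here (dynamic graphs).
            h = F.relu(self.fc1(x))
            return self.fc2(h)

    net = TwoLayerNet(d_in=64, h=32, d_out=10)
    y = net(torch.randn(8, 64))  # forward pass on a random batch
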
  4. Deep Feedforward Networks (6 Sessions)

Required Reading:
Interesting Questions:
Suggested Reading:
Additional Resources:
  5. Toolkit Lab 3: Preprocessing Datasets by PyTorch (1 Session)

Required Reading:
Suggested Reading:
Additional Resources:
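
As a hedged sketch of the kind of preprocessing this lab covers, the snippet below wraps a synthetic in-memory dataset (placeholder tensors, not course data) in PyTorch's Dataset and DataLoader:

    import torch
    from torch.utils.data import Dataset, DataLoader

    class ToyDataset(Dataset):
        """A hypothetical dataset of (feature, label) pairs."""
        def __init__(self, n=100):
            self.x = torch.randn(n, 20)          # synthetic features
            self.y = torch.randint(0, 2, (n,))   # synthetic binary labels

        def __len__(self):
            return len(self.x)

        def __getitem__(self, idx):
            return self.x[idx], self.y[idx]

    loader = DataLoader(ToyDataset(), batch_size=16, shuffle=True)
    for xb, yb in loader:
        pass  # each iteration yields a shuffled mini-batch
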
  6. Regularization for Deep Learning (5 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  7. Toolkit Lab 4: Using a Neural Network to Fit the Data with PyTorch (2 Sessions)

Required Reading:
Suggested Reading:
Additional Resources:
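
A minimal sketch of the kind of fitting loop this lab is about, using synthetic 1-D regression data (the model, data, and hyperparameters are placeholders; the course notebooks may organize this differently):

    import torch
    import torch.nn as nn

    # Synthetic data: y = 3x + 1 plus noise.
    x = torch.linspace(-1, 1, 100).unsqueeze(1)
    y = 3 * x + 1 + 0.1 * torch.randn_like(x)

    model = nn.Linear(1, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.MSELoss()

    for epoch in range(200):
        optimizer.zero_grad()        # reset accumulated gradients
        loss = loss_fn(model(x), y)  # forward pass and loss
        loss.backward()              # backpropagation
        optimizer.step()             # parameter update
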
  8. Optimization for Training Deep Models (5 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  9. Convolutional Networks (3 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:  
Fourier Transformation:
  10. Toolkit Lab 5: Using Convolutions to Generalize (2 Sessions)

Required Reading:    
Suggested Reading:
Additional Resources:
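
A minimal sketch, assuming PyTorch and 32x32 RGB inputs (e.g., CIFAR-10-sized images), of the kind of convolutional model this lab generalizes to; the architecture is illustrative, not the lab's exact network:

    import torch
    import torch.nn as nn

    cnn = nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),   # 3x32x32 -> 16x32x32
        nn.ReLU(),
        nn.MaxPool2d(2),                              # -> 16x16x16
        nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x16x16
        nn.ReLU(),
        nn.MaxPool2d(2),                              # -> 32x8x8
        nn.Flatten(),
        nn.Linear(32 * 8 * 8, 10),                    # 10 output classes
    )

    logits = cnn(torch.randn(4, 3, 32, 32))  # forward pass on a dummy batch
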
  11. Sequence Modeling: Recurrent and Recursive Networks (4 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  12. Toolkit Lab 6: Transfer Learning and Other Tricks (1 Session)

Required Reading:    
Suggested Reading:
Additional Resources:
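
A hedged sketch of the basic transfer-learning recipe (assuming torchvision's pretrained ResNet-18; the number of target classes is a placeholder):

    import torch
    import torch.nn as nn
    from torchvision import models

    model = models.resnet18(pretrained=True)  # load ImageNet weights

    for param in model.parameters():          # freeze the backbone
        param.requires_grad = False

    model.fc = nn.Linear(model.fc.in_features, 5)  # new head for 5 classes

    # Only the new head's parameters are updated during training.
    optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
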
  13. Practical Methodology (2 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  14. Toolkit Lab 7: Optuna: Automatic Hyperparameter Optimization Software (1 Session)

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning.

Required Reading:
Suggested Reading:
Additional Resources:
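
A minimal sketch of an Optuna study; the objective here is a toy quadratic rather than a real training run, just to show the define-by-run API:

    import optuna

    def objective(trial):
        # Optuna suggests a value for x; we minimize (x - 2)^2.
        x = trial.suggest_float("x", -10.0, 10.0)
        return (x - 2) ** 2

    study = optuna.create_study(direction="minimize")
    study.optimize(objective, n_trials=50)
    print(study.best_params)  # best value found for x, close to 2
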
  15. Applications (1 Session)

Required Reading:
Suggested Reading:
Additional Reading:
  16. Autoencoders (2 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  17. Generative Adversarial Networks (1 Session)

Required Reading:

Slide: Generative Adversarial Networks (GANs) by Binglin, Shashank, and Bhargav
Paper: NIPS 2016 Tutorial: Generative Adversarial Networks by Ian Goodfellow
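
For reference, the central objective introduced in these readings is the two-player minimax game between a generator G and a discriminator D (the standard formulation from Goodfellow et al., stated here for convenience):

    $$
    \min_G \max_D V(D, G) =
    \mathbb{E}_{\mathbf{x} \sim p_{\text{data}}}\big[\log D(\mathbf{x})\big] +
    \mathbb{E}_{\mathbf{z} \sim p_{\mathbf{z}}}\big[\log\big(1 - D(G(\mathbf{z}))\big)\big]
    $$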

Suggested Reading:
Additional Reading:
  18. Graph Neural Networks (1 Session)

Required Reading:
Suggested Reading:
Additional Reading:

Additional Resources:

Class Time and Location:

Saturday and Monday, 10:30-12:00 (Fall 2020)

Recitation and Assignments:

Tuesday, 16:00-18:00 (Fall 2020). Refer to the following link to check the assignments.

Projects:

Projects are programming assignments that cover the topics of this course. Each project is written as a Jupyter Notebook. Projects will require the use of Python 3.7, as well as additional Python libraries.

Google Colab:

Google Colab is a free cloud service that provides free GPU access.
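
For example, after switching the Colab runtime to GPU (Runtime → Change runtime type), you can confirm that PyTorch sees the GPU with a short check like the following (an illustrative snippet, not part of the course material):

    import torch

    print(torch.cuda.is_available())  # True when a GPU runtime is active
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    x = torch.randn(3, 3).to(device)  # move a tensor to the GPU, if available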

Fascinating Guides For Machine Learning:

Latex:

Students can include mathematical notation within markdown cells using LaTeX in their Jupyter Notebooks.
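
For instance, a markdown cell containing the following LaTeX (a generic example, not taken from the course notebooks) renders as a displayed equation for the cross-entropy loss:

    $$
    \mathcal{L}(\mathbf{y}, \hat{\mathbf{y}}) = -\sum_{i=1}^{C} y_i \log \hat{y}_i
    $$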

  • A Brief Introduction to LaTeX PDF
  • Math in LaTeX PDF
  • Sample Document PDF
  • TikZ: A collection of LaTeX files of PGF/TikZ figures (including various neural networks) by Petar Veličković.

Grading:

  • Projects and Midterm – 50%
  • Endterm – 50%

Three Exams:

  • First Midterm Examination: Saturday 1399/09/01, 10:30-12:00
  • Second Midterm Examination: Saturday 1399/10/06, 10:30-12:00
  • Final Examination: Wednesday 1399/10/24, 14:00-16:00

Prerequisites:

General mathematical sophistication, and a solid understanding of algorithms, linear algebra, and probability theory at the advanced undergraduate or beginning graduate level, or equivalent.

Linear Algebra:

Probability and Statistics:

Topics:

Have a look at some reports of Kaggle or Stanford students (CS224N, CS224D) to get some general inspiration.

Account:

It is necessary to have a GitHub account to share your projects. GitHub offers both free accounts and plans with private repositories. GitHub is like the hammer in your toolbox; you need to have it!

Academic Honor Code:

Honesty and integrity are vital elements of academic work. All your submitted assignments must be entirely your own (or your own group's).

We will follow the standard approach of the Department of Mathematical Sciences:

  • You can get help, but you MUST acknowledge the help on the work you hand in.
  • Failure to acknowledge your sources is a violation of the Honor Code.
  • You can talk to others about the algorithm(s) to be used to solve a homework problem, as long as you then mention their name(s) on the work you submit.
  • You should not use or look at others' code when writing your own: you can talk to people, but you have to write your own solution/code.

Questions?

I will hold office hours for this course on Saturdays (09:00-10:00 AM). If this is not convenient, email me at [email protected] or talk to me after class.
