
ahmedbesbes / Neural Network From Scratch

Ever wondered how to code your Neural Network using NumPy, with no frameworks involved?

Programming Languages

python

Projects that are alternatives of or similar to Neural Network From Scratch

Pytorch Lesson Zh
A PyTorch tutorial in Chinese (tagline: "instruction guaranteed, mastery not guaranteed")
Stars: ✭ 279 (+21.3%)
Mutual labels:  jupyter-notebook, tutorial, neural-networks
Basic reinforcement learning
An introductory series to Reinforcement Learning (RL) with comprehensive step-by-step tutorials.
Stars: ✭ 826 (+259.13%)
Mutual labels:  jupyter-notebook, tutorial, neural-networks
Gdrl
Grokking Deep Reinforcement Learning
Stars: ✭ 304 (+32.17%)
Mutual labels:  jupyter-notebook, neural-networks, numpy
Tutorials
AI-related tutorials. Access any of them for free → https://towardsai.net/editorial
Stars: ✭ 204 (-11.3%)
Mutual labels:  jupyter-notebook, tutorial, neural-networks
Teaching Monolith
Data science teaching materials
Stars: ✭ 126 (-45.22%)
Mutual labels:  jupyter-notebook, backpropagation, numpy
Thesemicolon
This repository contains IPython notebooks and datasets for the data analytics YouTube tutorials on The Semicolon.
Stars: ✭ 345 (+50%)
Mutual labels:  jupyter-notebook, tutorial, numpy
Dsp Theory
Theory of digital signal processing (DSP): signals, filtration (IIR, FIR, CIC, MAF), transforms (FFT, DFT, Hilbert, Z-transform) etc.
Stars: ✭ 437 (+90%)
Mutual labels:  jupyter-notebook, tutorial, numpy
Ncar Python Tutorial
Numerical & Scientific Computing with Python Tutorial
Stars: ✭ 50 (-78.26%)
Mutual labels:  jupyter-notebook, tutorial, numpy
Learning python
Source material for Python Like You Mean it
Stars: ✭ 78 (-66.09%)
Mutual labels:  jupyter-notebook, tutorial, numpy
Deeplearning
Deep Learning From Scratch
Stars: ✭ 66 (-71.3%)
Mutual labels:  jupyter-notebook, tutorial, backpropagation
Shape Detection
🟣 Object detection of abstract shapes with neural networks
Stars: ✭ 170 (-26.09%)
Mutual labels:  jupyter-notebook, tutorial, neural-networks
Gasyori100knock
Image processing code for understanding algorithms
Stars: ✭ 1,988 (+764.35%)
Mutual labels:  jupyter-notebook, tutorial, numpy
Psi4numpy
Combining Psi4 and Numpy for education and development.
Stars: ✭ 170 (-26.09%)
Mutual labels:  jupyter-notebook, tutorial, numpy
Dlsys Course.github.io
Deep learning system course
Stars: ✭ 207 (-10%)
Mutual labels:  jupyter-notebook, tutorial
Gluon Api
A clear, concise, simple yet powerful and efficient API for deep learning.
Stars: ✭ 2,322 (+909.57%)
Mutual labels:  jupyter-notebook, neural-networks
Windrose
A Python Matplotlib, Numpy library to manage wind data, draw windrose (also known as a polar rose plot), draw probability density function and fit Weibull distribution
Stars: ✭ 208 (-9.57%)
Mutual labels:  jupyter-notebook, numpy
Rl Tutorial Jnrr19
Stable-Baselines tutorial for Journées Nationales de la Recherche en Robotique 2019
Stars: ✭ 204 (-11.3%)
Mutual labels:  jupyter-notebook, tutorial
Sc17
SuperComputing 2017 Deep Learning Tutorial
Stars: ✭ 211 (-8.26%)
Mutual labels:  jupyter-notebook, tutorial
Tensorflow
Deep Learning Zero to All - Tensorflow
Stars: ✭ 216 (-6.09%)
Mutual labels:  jupyter-notebook, tutorial
Tcdf
Temporal Causal Discovery Framework (PyTorch): discovering causal relationships between time series
Stars: ✭ 217 (-5.65%)
Mutual labels:  jupyter-notebook, neural-networks

Learn backpropagation the hard way

Backpropagation

In this repository, I will show you how to build a neural network from scratch (yes, in plain Python with NumPy, no framework involved) that trains on mini-batches using gradient descent. Check nn.py for the code.
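The idea behind that training loop can be sketched as follows. This is a minimal, illustrative version with made-up names and layer sizes, not the repository's actual code: a one-hidden-layer network with sigmoid activations, trained on mini-batches with gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 2-D inputs, binary labels (label = sign of x1 * x2).
X = rng.normal(size=(256, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

# Parameters of a 2 -> 8 -> 1 network (sizes are illustrative).
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

lr, batch_size = 0.5, 32
for epoch in range(200):
    perm = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx], y[idx]
        # Forward pass.
        a1 = sigmoid(xb @ W1 + b1)
        out = sigmoid(a1 @ W2 + b2)
        # Backward pass (MSE loss, sigmoid derivatives).
        d_out = (out - yb) * out * (1 - out)
        d_a1 = (d_out @ W2.T) * a1 * (1 - a1)
        # Gradient descent update, averaged over the mini-batch.
        W2 -= lr * a1.T @ d_out / len(xb); b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * xb.T @ d_a1 / len(xb); b1 -= lr * d_a1.mean(axis=0)

accuracy = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5) == y).mean()
```

Shuffling the permutation each epoch and averaging gradients over each mini-batch are the two details that make this "mini-batch" rather than full-batch gradient descent.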

In the companion notebook Neural_Network_from_scratch_with_Numpy.ipynb, we test nn.py on a set of non-linear classification problems:

  • Train the neural network for a chosen number of epochs and set of hyperparameters
  • Plot a live, interactive decision boundary during training
  • Plot the training and validation metrics (loss and accuracy)
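A dataset like the noisy moons used below can be generated and split along these lines. The notebook may well use scikit-learn's make_moons; this hand-rolled sketch, with illustrative names, needs only NumPy:

```python
import numpy as np

rng = np.random.default_rng(42)

def make_moons(n=400, noise=0.1):
    # Two interleaving half-circles with Gaussian noise.
    t = rng.uniform(0, np.pi, size=n // 2)
    upper = np.column_stack([np.cos(t), np.sin(t)])
    lower = np.column_stack([1 - np.cos(t), 0.5 - np.sin(t)])
    X = np.vstack([upper, lower]) + rng.normal(scale=noise, size=(n, 2))
    y = np.concatenate([np.zeros(n // 2), np.ones(n // 2)])
    return X, y

X, y = make_moons()
# Shuffle, then hold out 20% as a validation set.
perm = rng.permutation(len(X))
split = int(0.8 * len(X))
X_train, y_train = X[perm[:split]], y[perm[:split]]
X_val, y_val = X[perm[split:]], y[perm[split:]]
```

The two classes are not linearly separable, which is what makes this a useful sanity check for a hidden layer.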

Example: Noisy Moons (Check the notebook for other kinds of problems)

Decision boundary (this graph is animated during training)

Decision boundary

Loss and accuracy monitoring on train and validation sets

Loss/Accuracy monitoring on train/val

Where to go from here?

nn.py is a toy neural network meant for educational purposes only, so there is plenty of room for improvement if you want to extend it. Here are some guidelines:

  • Implement a different loss function, such as binary cross-entropy, which works better than mean squared error for classification problems.
  • Make the code generic with respect to activation functions so that any of them can be chosen: ReLU, sigmoid, tanh, etc.
  • Implement other optimizers: plain SGD works, but it can get stuck in local minima; look into Adam or RMSProp.
  • Play with the hyperparameters and check the validation metrics.
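The binary cross-entropy suggestion can be sketched as follows (helper names are illustrative, not nn.py's API). A nice property for a sigmoid output layer: the gradient of BCE with respect to the logits simplifies to prediction minus target.

```python
import numpy as np

def bce_loss(y_pred, y_true, eps=1e-12):
    # Clip to avoid log(0) on confident predictions.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def bce_grad_wrt_logits(y_pred, y_true):
    # d(loss)/d(logit) per sample, for a sigmoid output; the mean in the
    # loss contributes the 1/N factor.
    return (y_pred - y_true) / len(y_true)

y_true = np.array([1.0, 0.0, 1.0])
perfect = np.array([1.0, 0.0, 1.0])
print(bce_loss(perfect, y_true))  # near zero for perfect predictions
```

Unlike MSE, BCE does not flatten out when a sigmoid saturates on a wrong answer, which is why it tends to train classifiers faster.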
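The generic-activation idea might look like this: register each activation alongside its derivative, so the forward and backward passes never hard-code a particular function. Names are illustrative, not nn.py's actual structure.

```python
import numpy as np

# Each entry maps a name to (function, derivative-with-respect-to-input).
ACTIVATIONS = {
    "sigmoid": (lambda z: 1 / (1 + np.exp(-z)),
                lambda z: (s := 1 / (1 + np.exp(-z))) * (1 - s)),
    "tanh":    (np.tanh,
                lambda z: 1 - np.tanh(z) ** 2),
    "relu":    (lambda z: np.maximum(0, z),
                lambda z: (z > 0).astype(z.dtype)),
}

def forward(z, name="relu"):
    fn, _ = ACTIVATIONS[name]
    return fn(z)

def backward(z, name="relu"):
    _, dfn = ACTIVATIONS[name]
    return dfn(z)
```

A layer then only needs to store the activation name it was built with; adding a new activation is a one-line registry entry.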
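The Adam optimizer mentioned above can be sketched for a single scalar parameter as follows. Hyperparameter defaults follow the Adam paper's conventions; this is a standalone illustration, not nn.py code.

```python
import numpy as np

class Adam:
    def __init__(self, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = self.v = 0.0  # running first/second moment estimates
        self.t = 0             # step counter for bias correction

    def step(self, param, grad):
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        m_hat = self.m / (1 - self.beta1 ** self.t)  # bias-corrected
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return param - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

# Minimize f(x) = x**2 (gradient 2x) starting from x = 5.
opt, x = Adam(lr=0.1), 5.0
for _ in range(500):
    x = opt.step(x, 2 * x)
```

In a real network you would keep one (m, v, t) state per parameter array; the per-parameter step size adapting to the gradient history is what helps it escape the plateaus where plain SGD stalls.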