
karpathy / Micrograd

License: MIT
A tiny scalar-valued autograd engine and a neural net library on top of it with a PyTorch-like API

Programming Languages

Jupyter Notebook (11,667 projects)
Python (139,335 projects; #7 most used programming language)

Projects that are alternatives to or similar to Micrograd

Griffon Vm
Griffon Data Science Virtual Machine
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Pytorch Ensembles
Pitfalls of In-Domain Uncertainty Estimation and Ensembling in Deep Learning, ICLR 2020
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Predictive Filter Flow
Predictive Filter Flow for fully/self-supervised learning on various vision tasks
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Pyvi
Python Vietnamese Core NLP Toolkit
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Python Workshop
A series of Jupyter Notebooks on exploring Unidata technology with Python. See website for more information.
Stars: ✭ 127 (-93.13%)
Mutual labels:  jupyter-notebook
Nb2xls
Convert Jupyter notebook to Excel spreadsheet
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Rsis
Recurrent Neural Networks for Semantic Instance Segmentation
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Reptile Pytorch
A PyTorch implementation of OpenAI's REPTILE algorithm
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Contactpose
Large dataset of hand-object contact, hand- and object-pose, and 2.9 M RGB-D grasp images.
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
My deep project
A collection of my personal deep learning projects
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Pydata Chicago2016 Ml Tutorial
Machine learning with scikit-learn tutorial at PyData Chicago 2016
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
2013 fall astr599
Content for my Astronomy 599 Course: Intro to scientific computing in Python
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Eewpython
A series of Jupyter notebook to learn Google Earth Engine with Python
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Gumbel
Gumbel-Softmax Variational Autoencoder with Keras
Stars: ✭ 127 (-93.13%)
Mutual labels:  jupyter-notebook
Dask Tutorial Pycon 2018
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook
Real Time Sentiment Tracking On Twitter For Brand Improvement And Trend Recognition
A real-time interactive web app based on data pipelines using streaming Twitter data, automated sentiment analysis, and MySQL&PostgreSQL database (Deployed on Heroku)
Stars: ✭ 127 (-93.13%)
Mutual labels:  jupyter-notebook
Accelerated dl pytorch
Accelerated Deep Learning with PyTorch at Jupyter Day Atlanta II
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Pytorch Book
Source codes for the book "Application of Neural Network and PyTorch"
Stars: ✭ 129 (-93.02%)
Mutual labels:  jupyter-notebook
Stanford Machine Learning Camp
Stars: ✭ 128 (-93.07%)
Mutual labels:  jupyter-notebook

micrograd

A tiny Autograd engine (with a bite! :)). Implements backpropagation (reverse-mode autodiff) over a dynamically built DAG and a small neural networks library on top of it with a PyTorch-like API. Both are tiny, with about 100 and 50 lines of code respectively. The DAG only operates over scalar values, so e.g. we chop up each neuron into all of its individual tiny adds and multiplies. However, this is enough to build up entire deep neural nets doing binary classification, as the demo notebook shows. Potentially useful for educational purposes.
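
To give a flavor of how small the engine is, here is a condensed sketch of the core Value idea (an illustration, not the actual source; the real engine supports more operations and operator conveniences):

class Value:
    # each Value remembers its children in the DAG and a closure
    # that knows how to propagate its gradient to them
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad       # d(a+b)/da = 1
            other.grad += out.grad      # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the DAG, then apply the chain rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()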

Installation

pip install micrograd

Example usage

Below is a slightly contrived example showing a number of the supported operations:

from micrograd.engine import Value

a = Value(-4.0)
b = Value(2.0)
c = a + b
d = a * b + b**3
c += c + 1
c += 1 + c + (-a)
d += d * 2 + (b + a).relu()
d += 3 * d + (b - a).relu()
e = c - d
f = e**2
g = f / 2.0
g += 10.0 / f
print(f'{g.data:.4f}') # prints 24.7041, the outcome of this forward pass
g.backward()
print(f'{a.grad:.4f}') # prints 138.8338, i.e. the numerical value of dg/da
print(f'{b.grad:.4f}') # prints 645.5773, i.e. the numerical value of dg/db
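
A quick way to sanity-check a gradient like dg/da by hand is a finite-difference estimate: nudge an input by a small h and compare the resulting slope against a.grad. The forward helper below simply re-runs the expression above (this check is illustrative, not part of the library):

def forward(a0, b0):
    # re-run the expression above for the given inputs, return g's value
    a, b = Value(a0), Value(b0)
    c = a + b
    d = a * b + b**3
    c += c + 1
    c += 1 + c + (-a)
    d += d * 2 + (b + a).relu()
    d += 3 * d + (b - a).relu()
    e = c - d
    f = e**2
    g = f / 2.0
    g += 10.0 / f
    return g.data

h = 1e-6
print((forward(-4.0 + h, 2.0) - forward(-4.0, 2.0)) / h)  # ≈ 138.8338, matches a.grad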

Training a neural net

The notebook demo.ipynb provides a full demo of training a 2-layer neural network (MLP) binary classifier. This is achieved by initializing a neural net from the micrograd.nn module, implementing a simple SVM "max-margin" binary classification loss, and using SGD for optimization. As shown in the notebook, a 2-layer neural net with two 16-node hidden layers achieves the following decision boundary on the moon dataset:

[figure: decision boundary of the 2-layer MLP on the moon dataset]
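
For orientation, here is a condensed sketch of that training loop. The toy data, learning rate, and step count are invented for illustration; demo.ipynb is the authoritative version and additionally uses batching, L2 regularization, and learning-rate decay:

from micrograd.nn import MLP

# made-up toy data; demo.ipynb uses sklearn's make_moons instead
X = [[0.0, 1.0], [1.0, 0.0], [-1.0, 0.0], [0.0, -1.0]]
y = [1, 1, -1, -1]  # labels are +1 / -1

model = MLP(2, [16, 16, 1])  # 2 inputs, two 16-node hidden layers, 1 output

for step in range(20):
    # forward pass: SVM "max-margin" hinge loss, averaged over the batch
    scores = [model(xi) for xi in X]
    loss = sum((1 + -yi * si).relu() for yi, si in zip(y, scores)) * (1.0 / len(X))

    # backward pass
    model.zero_grad()
    loss.backward()

    # SGD update (fixed learning rate here)
    for p in model.parameters():
        p.data -= 0.05 * p.grad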

Tracing / visualization

For added convenience, the notebook trace_graph.ipynb produces Graphviz visualizations. E.g., the one below is of a simple 2D neuron, arrived at by calling draw_dot on the code that follows; it shows both the data (left number in each node) and the gradient (right number in each node).

from micrograd import nn
from micrograd.engine import Value

n = nn.Neuron(2)               # a single neuron with 2 inputs
x = [Value(1.0), Value(-2.0)]  # the inputs
y = n(x)                       # forward pass
dot = draw_dot(y)              # draw_dot is defined in trace_graph.ipynb

[figure: computation graph of the 2D neuron, traced by draw_dot]
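
draw_dot itself lives in the notebook rather than the package. A simplified sketch of what it does, walking the DAG from the output via each Value's _prev set and emitting one Graphviz record per node, looks roughly like this:

from graphviz import Digraph  # pip install graphviz

def draw_dot(root):
    # collect all nodes and edges reachable backwards from the output
    nodes, edges = set(), set()
    def build(v):
        if v not in nodes:
            nodes.add(v)
            for child in v._prev:
                edges.add((child, v))
                build(child)
    build(root)
    dot = Digraph(graph_attr={'rankdir': 'LR'})  # left-to-right layout
    for n in nodes:
        uid = str(id(n))
        # one record per Value: data on the left, grad on the right
        dot.node(uid, label=f"{{ data {n.data:.4f} | grad {n.grad:.4f} }}", shape='record')
        if n._op:  # non-leaf: add a small node for the op that produced it
            dot.node(uid + n._op, label=n._op)
            dot.edge(uid + n._op, uid)
    for child, parent in edges:
        dot.edge(str(id(child)), str(id(parent)) + parent._op)
    return dot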

Running tests

To run the unit tests you will have to install PyTorch, which the tests use as a reference for verifying the correctness of the calculated gradients. Then simply:

python -m pytest
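
The tests essentially build the same expression in micrograd and in PyTorch and assert that both the forward value and the gradients agree. Schematically (a made-up expression for illustration, not a verbatim test from the repo):

import torch
from micrograd.engine import Value

# micrograd side
a = Value(-4.0)
b = (a * 2 + 1).relu() + a**2
b.backward()

# PyTorch reference
at = torch.tensor([-4.0], dtype=torch.double, requires_grad=True)
bt = (at * 2 + 1).relu() + at**2
bt.backward()

assert abs(b.data - bt.data.item()) < 1e-6   # forward values agree
assert abs(a.grad - at.grad.item()) < 1e-6   # gradients agree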

License

MIT
