
rsokl / MyGrad

License: MIT
A pure-Python/NumPy autograd tensor library

Programming language: Python

Projects that are alternatives to or similar to MyGrad

Data Science Complete Tutorial
For extensive instructor-led learning
Stars: ✭ 1,027 (+1233.77%)
Mutual labels:  numpy
Numpy ringbuffer
Ring-buffer implementation that thinly wraps a numpy array
Stars: ✭ 52 (-32.47%)
Mutual labels:  numpy
Cloud Volume
Read and write Neuroglancer datasets programmatically.
Stars: ✭ 63 (-18.18%)
Mutual labels:  numpy
Eulerian Remote Heartrate Detection
Remote heart rate detection through Eulerian magnification of face videos
Stars: ✭ 48 (-37.66%)
Mutual labels:  numpy
Awesome Ebooks
A collection of classic open-source technical books (PDF) and related websites, continuously updated...
Stars: ✭ 51 (-33.77%)
Mutual labels:  numpy
Mlkatas
A series of self-correcting challenges for practicing your Machine Learning and Deep Learning skills
Stars: ✭ 58 (-24.68%)
Mutual labels:  numpy
Wynullview
An easy way to handle a view's empty state: show an empty-state view with one line of code, highly customizable
Stars: ✭ 44 (-42.86%)
Mutual labels:  numpy
Double pendulum
Animations of random double pendulums
Stars: ✭ 73 (-5.19%)
Mutual labels:  numpy
Gait Recognition
Distance Recognition of a Human Being with Deep CNNs
Stars: ✭ 51 (-33.77%)
Mutual labels:  numpy
Dicomweb Client
Python client for DICOMweb RESTful services
Stars: ✭ 60 (-22.08%)
Mutual labels:  numpy
Ncar Python Tutorial
Numerical & Scientific Computing with Python Tutorial
Stars: ✭ 50 (-35.06%)
Mutual labels:  numpy
Numpy Convnet
A small and pure Numpy Convolutional Neural Network library.
Stars: ✭ 50 (-35.06%)
Mutual labels:  numpy
Perlin Numpy
A fast and simple perlin noise generator using numpy
Stars: ✭ 58 (-24.68%)
Mutual labels:  numpy
Iml
The "Introduction to Machine Learning" course (Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University)
Stars: ✭ 46 (-40.26%)
Mutual labels:  numpy
Accupy
Accurate sums and dot products for Python.
Stars: ✭ 65 (-15.58%)
Mutual labels:  numpy
Machine Learning
Notebooks with machine learning examples
Stars: ✭ 45 (-41.56%)
Mutual labels:  numpy
25daysinmachinelearning
A repository for learning machine learning with Python, including statistics content and materials
Stars: ✭ 53 (-31.17%)
Mutual labels:  numpy
Docker Alpine Python Machinelearning
Small Docker image with Python Machine Learning tools (~180MB) https://hub.docker.com/r/frolvlad/alpine-python-machinelearning/
Stars: ✭ 76 (-1.3%)
Mutual labels:  numpy
Dicom Numpy
Properly generate a 3D numpy array from a set of DICOM files.
Stars: ✭ 64 (-16.88%)
Mutual labels:  numpy
Dask
Parallel computing with task scheduling
Stars: ✭ 9,309 (+11989.61%)
Mutual labels:  numpy

[Badges: Tested with Hypothesis · codecov · Documentation Status · Build Status · PyPI version · Python version support]

MyGrad's Documentation

Introducing mygrad

mygrad is a simple, NumPy-centric autograd library. An autograd library enables you to automatically compute derivatives of mathematical functions. This library is designed to serve both as a tool for prototyping/testing and as an educational tool for learning about gradient-based machine learning; it is easy to install, has a readable and easily customizable code base, and provides a sleek interface that mimics NumPy. Furthermore, it leverages NumPy's vectorization to achieve good performance despite the library's simplicity.

This is not meant to be a competitor to libraries like PyTorch (which mygrad most closely resembles) or TensorFlow. Rather, it is meant to serve as a useful tool for students who are learning about training neural networks using backpropagation.

Installing mygrad

To install MyGrad, you can pip-install it:

pip install mygrad

or clone this repository and navigate to the MyGrad directory, then run:

python setup.py install

MyGrad requires NumPy. It is highly recommended that you use a release of NumPy built with MKL for access to optimized math routines.

A Simple Application

Let's use mygrad to compute the derivative of f(x) = x², which is df/dx = 2x.

mygrad.Tensor behaves nearly identically to NumPy's ndarray, in addition to having the machinery needed to compute the analytic derivatives of functions. Suppose we want to compute this derivative at x = 3. We can create a 0-dimensional tensor (a scalar) for x and compute f(x):

>>> import mygrad as mg
>>> x = mg.Tensor(3.0)
>>> f = x ** 2
>>> f
Tensor(9.0)

Invoking f.backward() instructs mygrad to trace through the computational graph that produced f and compute the derivatives of f with respect to all of its independent variables. Thus, executing f.backward() will compute df/dx = 2x = 6 at x = 3, and will store this value in x.grad:

>>> f.backward()  # triggers computation of `df/dx`
>>> x.grad  # df/dx = 2x = 6.0
array(6.0)
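
Given MyGrad's educational aims, it can be instructive to check the autograd result against a numerical derivative. Here is a minimal sketch (not part of the original example) comparing x.grad to a central finite difference:

>>> import numpy as np
>>> eps = 1e-6
>>> numeric = ((3.0 + eps) ** 2 - (3.0 - eps) ** 2) / (2 * eps)  # central difference of x**2 at x=3
>>> np.isclose(x.grad, numeric)
True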

This is the absolute tip of the iceberg. mygrad can compute derivatives of multivariable composite functions of tensor-valued variables!
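
For instance, here is a minimal sketch of differentiating a function of two tensor-valued variables; f.backward() populates the gradient of every tensor involved:

>>> x = mg.Tensor(2.0)
>>> y = mg.Tensor(3.0)
>>> f = x * y + x ** 2  # f(x, y) = x*y + x**2
>>> f.backward()
>>> x.grad  # df/dx = y + 2x = 7.0
array(7.0)
>>> y.grad  # df/dy = x = 2.0
array(2.0)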

Some Bells and Whistles

mygrad supports all of NumPy's essential features, including broadcasting, basic and advanced indexing, and in-place operations (all demonstrated in the advanced example below).

mygrad.Tensor plays nicely with NumPy arrays, which behave as constants when they are used in computational graphs:

>>> import numpy as np
>>> x = mg.Tensor([2.0, 2.0, 2.0])
>>> y = np.array([1.0, 2.0, 3.0])
>>> f = x ** y  # (2 ** 1, 2 ** 2, 2 ** 3)
>>> f.backward()
>>> x.grad
array([ 1.,  4., 12.])
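
By contrast, if y were itself a mygrad.Tensor rather than a NumPy array, backward() would populate its gradient as well (a brief sketch):

>>> y2 = mg.Tensor([1.0, 2.0, 3.0])
>>> f = x ** y2
>>> f.backward()
>>> y2.grad  # df/dy = (x ** y) * ln(x), with x = 2
array([1.38629436, 2.77258872, 5.54517744])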

mygrad.nnet supplies essential functions for machine learning, including the following (a brief convolution sketch appears after this list):

  • N-dimensional convolutions (with striding, dilation, and padding)
  • N-dimensional pooling
  • A gated recurrent unit for sequence-learning (with input-level dropout and variational hidden-hidden dropout)
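
As an illustration of the convolution functionality, the following sketch assumes conv_nd is importable from mygrad.nnet with a signature along the lines of conv_nd(x, filter_bank, stride, ...), as in MyGrad's documentation; verify the import path and signature against your installed version:

>>> import numpy as np
>>> import mygrad as mg
>>> from mygrad.nnet import conv_nd  # assumed import path; check MyGrad's docs
>>> x = mg.Tensor(np.random.rand(1, 1, 5, 5))  # (N, C, H, W): one 5x5 single-channel image
>>> w = mg.Tensor(np.random.rand(2, 1, 3, 3))  # (F, C, Hf, Wf): two 3x3 filters
>>> out = conv_nd(x, w, stride=1)              # shape-(1, 2, 3, 3) output
>>> out.sum().backward()                       # gradients flow back through the convolution
>>> x.grad.shape
(1, 1, 5, 5)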

These convolution and pooling operations leverage a nice sliding-window-view function, which produces convolution-style windowed views of arrays/tensors without making copies of them, so that neural network-style convolutions and pooling can be performed intuitively and quite efficiently.
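
The windowed-view idea can be illustrated with NumPy's analogous function, numpy.lib.stride_tricks.sliding_window_view (MyGrad's own utility may differ in name and signature):

>>> import numpy as np
>>> from numpy.lib.stride_tricks import sliding_window_view
>>> arr = np.arange(16).reshape(4, 4)
>>> windows = sliding_window_view(arr, (3, 3))  # every 3x3 window, produced without copying
>>> windows.shape  # a 2x2 grid of 3x3 views
(2, 2, 3, 3)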

Advanced Example

The following is an example of using mygrad to compute the hinge loss of classification scores and to "backpropagate" through (compute the gradient of) this loss. This example demonstrates some of mygrad's ability to perform backpropagation through broadcasted operations, basic indexing, advanced indexing, and in-place assignments.

>>> from mygrad import Tensor
>>> import numpy as np
>>> class_scores = Tensor(10 * np.random.rand(100, 10))         # 100 samples, 10 possible classes for each
>>> class_labels = np.random.randint(low=0, high=10, size=100)  # correct label for each datum
>>> class_labels = (range(len(class_labels)), class_labels)
>>> correct_class_scores = class_scores[class_labels]

>>> Lij = class_scores - correct_class_scores[:, np.newaxis] + 1.  # 100x10 margins
>>> Lij[Lij <= 0] = 0      # scores within the hinge incur no loss
>>> Lij[class_labels] = 0  # the score corresponding to the correct label incurs no loss

>>> loss = Lij.sum() / class_scores.shape[0]  # compute mean hinge loss
>>> loss.backward()    # compute gradient of loss w.r.t all dependent tensors
>>> class_scores.grad  # d(loss)/d(class_scores)
array([[ 0.  ,  0.01,  0.  , -0.04,  0.  ,  0.  ,  0.01,  0.  ,  0.01, 0.01], ...])
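
With the gradient in hand, one could take a plain gradient-descent step on the underlying array. The following is an illustrative sketch, where lr is a hypothetical learning rate and Tensor's .data attribute is assumed to expose the underlying NumPy array:

>>> lr = 0.1  # hypothetical learning rate
>>> class_scores.data -= lr * class_scores.grad  # vanilla gradient-descent update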

Computational Graph Visualization

mygrad uses Graphviz and a Python interface for Graphviz to render the computational graphs built using tensors. These graphs can be rendered in Jupyter notebooks, allowing for quick checks of graph structure, or can be saved to file for later reference.

The dependencies can be installed with:

conda install graphviz
conda install python-graphviz
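
As a minimal sketch, MyGrad's build_graph utility (assumed here to live at mygrad.computational_graph, per MyGrad's documentation; verify against your installed version) can render a small graph inline in a notebook:

>>> import mygrad as mg
>>> from mygrad.computational_graph import build_graph  # assumed import path
>>> x = mg.Tensor(2.0)
>>> y = mg.Tensor(3.0)
>>> f = x * y + 2.0
>>> build_graph(f, names=locals())  # displays the graph of `f` in a Jupyter notebook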