
Renatochaz / Mathematics_for_Machine_Learning

Licence: other
Notes and step-by-step exercise resolutions to help students learn the base math for machine learning. Content summarised from the Imperial College London course on Coursera.

Programming Languages

Jupyter Notebook

Projects that are alternatives to or similar to Mathematics for Machine Learning

Mathematics for Machine Learning
Learn mathematics behind machine learning and explore different mathematics in machine learning.
Stars: ✭ 28 (-36.36%)
Mutual labels:  matrix, linear-algebra, mathematics
Math Php
Powerful modern math library for PHP: features descriptive statistics and regressions; continuous and discrete probability distributions; linear algebra with matrices and vectors; numerical analysis; special mathematical functions; algebra
Stars: ✭ 2,009 (+4465.91%)
Mutual labels:  matrix, linear-algebra, mathematics
Nalgebra
Linear algebra library for Rust.
Stars: ✭ 2,433 (+5429.55%)
Mutual labels:  matrix, linear-algebra
Phpsci Carray
PHP library for scientific computing powered by C
Stars: ✭ 176 (+300%)
Mutual labels:  matrix, linear-algebra
Mathnet Numerics
Math.NET Numerics
Stars: ✭ 2,688 (+6009.09%)
Mutual labels:  matrix, linear-algebra
Eigen Git Mirror
THIS MIRROR IS DEPRECATED -- New url: https://gitlab.com/libeigen/eigen
Stars: ✭ 1,659 (+3670.45%)
Mutual labels:  matrix, linear-algebra
Node Sylvester
🐱 Sylvester is a vector, matrix, and geometry library for JavaScript that runs in the browser and on the server.
Stars: ✭ 144 (+227.27%)
Mutual labels:  matrix, linear-algebra
Peroxide
Rust numeric library with R, MATLAB & Python syntax
Stars: ✭ 191 (+334.09%)
Mutual labels:  matrix, linear-algebra
Korma
Mathematics library focused on geometry for Multiplatform Kotlin 1.3
Stars: ✭ 65 (+47.73%)
Mutual labels:  matrix, mathematics
eigen-js
⚡ Eigen-js is a port of the Eigen C++ linear algebra library
Stars: ✭ 78 (+77.27%)
Mutual labels:  matrix, linear-algebra
Pygraphblas
GraphBLAS for Python
Stars: ✭ 252 (+472.73%)
Mutual labels:  matrix, linear-algebra
eigen
Owl's OCaml Interface to Eigen3 C++ Library
Stars: ✭ 30 (-31.82%)
Mutual labels:  matrix, linear-algebra
Gonum
An open-source numerical algorithm library written purely in Go
Stars: ✭ 128 (+190.91%)
Mutual labels:  matrix, mathematics
Numphp
Mathematical PHP library for scientific computing
Stars: ✭ 120 (+172.73%)
Mutual labels:  matrix, linear-algebra
Lacaml
OCaml bindings for BLAS/LAPACK (high-performance linear algebra Fortran libraries)
Stars: ✭ 101 (+129.55%)
Mutual labels:  matrix, linear-algebra
Ugm
Ubpa Graphics Mathematics
Stars: ✭ 178 (+304.55%)
Mutual labels:  matrix, mathematics
susa
High Performance Computing (HPC) and Signal Processing Framework
Stars: ✭ 55 (+25%)
Mutual labels:  linear-algebra, mathematics
Pycm
Multi-class confusion matrix library in Python
Stars: ✭ 1,076 (+2345.45%)
Mutual labels:  matrix, mathematics
Spmp
sparse matrix pre-processing library
Stars: ✭ 62 (+40.91%)
Mutual labels:  matrix, linear-algebra
Blasjs
Pure JavaScript, manually written 👌 implementation of BLAS. Many numerical software applications use BLAS computations, including Armadillo, LAPACK, LINPACK, GNU Octave, Mathematica, MATLAB, NumPy, R, and Julia.
Stars: ✭ 241 (+447.73%)
Mutual labels:  matrix, linear-algebra

Mathematics for Machine Learning

Graded assignments and exams required for the certificate will in no way be shared, in accordance with the Coursera honor code.

Please note that GitHub sometimes fails to load an .ipynb file or renders a matrix diagram incorrectly; feel free to download the notebook and open it in your own reader.

The aim of this repository is to give students learning the base math for machine learning (especially those taking the Imperial College London Mathematics for Machine Learning course) some helpful resources and guidance for the practice exercises available in the course.

This repository covers the following topics:

  • Linear Algebra;
  • Multivariate Calculus;
  • Principal Component Analysis (PCA).

Basic course description (from the Imperial College London course)

For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it’s used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science.

In the first course on Linear Algebra we look at what linear algebra is and how it relates to data. Then we look through what vectors and matrices are and how to work with them.

The second course, Multivariate Calculus, builds on this to look at how to optimize fitting functions to get good fits to data. It starts from introductory calculus and then uses the matrices and vectors from the first course to look at data fitting.

The third course, Dimensionality Reduction with Principal Component Analysis, uses the mathematics from the first two courses to compress high-dimensional data. This course is of intermediate difficulty and will require Python and numpy knowledge.

At the end of this specialization you will have gained the prerequisite mathematical knowledge to continue your journey and take more advanced courses in machine learning.

Course link

Available notes and exercise resolutions

Linear Algebra

  • Week 1:

    Solving simultaneous equations

  • Week 2:

    Modulus and inner products

    Cosine and dot products

    Scalar and vector projections

    Basis change

    Linear dependence

  • Week 3:

    Matrix multiplication

    Matrix properties

    Identity matrix

    Matrix transformation

    Solving simultaneous equations with the matrix method (see the numpy sketch after this list)

    Inverse matrix

  • Week 4:

    Einstein summation

    Symmetry of the dot product

    Notes on non-square matrix multiplication

    Changing basis in matrices

    Transformation in changed basis

    Orthogonal matrices

    Gram-Schmidt process

  • Week 5:

    Eigenvalues

    Eigenvectors

    Special eigen-cases

    Changing to the eigenbasis
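
As a hedged illustration of how this material translates into code (this sketch is mine, not taken from the course notebooks; the matrices are arbitrary examples), the snippet below solves a small system of simultaneous equations with the matrix method and computes the eigenvalues and eigenvectors of a symmetric matrix using numpy:

```python
# Minimal numpy sketch (not course code): simultaneous equations and eigen-decomposition.
import numpy as np

# Solve  2x + 3y = 8,  x - y = -1  written as  A @ v = b
A = np.array([[2.0, 3.0],
              [1.0, -1.0]])
b = np.array([8.0, -1.0])
v = np.linalg.solve(A, b)   # numerically safer than forming the inverse explicitly
print(v)                    # [1. 2.]

# Eigenvalues and eigenvectors of a symmetric matrix
S = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eig(S)
print(eigenvalues)          # e.g. [3. 1.] (order is not guaranteed)
print(eigenvectors)         # columns are unit-length eigenvectors
```

np.linalg.solve is preferred over computing the inverse and multiplying because it is faster and more numerically stable for the same result.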

Multivariate Calculus

  • Week 1:

    Differentiation and definition of a derivative;

    Sum rule;

    Power rule;

    Special-case derivatives;

    Product rule;

    Chain rule;

    All around application.

  • Week 2:

    Dependent and independent variables;

    Extension to multivariate differentiation;

    Multivariate complex example;

    Multivariate partial differentiation;

    Jacobian vector;

    Hessian matrix.

  • Week 3:

    Multivariate chain rule;

    Neural network in matrix form;

    Applying the chain rule to a neural network.

  • Week 4:

    Why approximate functions;

    Power series;

    Maclaurin series;

    Taylor series;

    Linearisation;

    Multivariate Taylor series.

  • Week 5:

    One-dimensional Newton-Raphson;

    Gradient descent (see the numpy sketch after this list);

    Constrained optimisation.

  • Week 6 not provided.
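
As a hedged illustration of how the Jacobian and gradient-descent material fits together (this is my own sketch, not course code; the objective function, starting point, and learning rate are arbitrary choices), the snippet below minimises a simple two-variable function using its analytic Jacobian:

```python
# Minimal numpy sketch (not course code): gradient descent with an analytic Jacobian.
import numpy as np

def f(p):
    x, y = p
    return (x - 1.0) ** 2 + 2.0 * (y + 2.0) ** 2

def jacobian(p):
    x, y = p
    # Partial derivatives from the power and chain rules
    return np.array([2.0 * (x - 1.0), 4.0 * (y + 2.0)])

p = np.array([4.0, 3.0])          # arbitrary starting point
learning_rate = 0.1
for _ in range(200):
    p = p - learning_rate * jacobian(p)

print(p)                          # converges towards the minimum at (1, -2)
```

Each iteration steps in the direction of steepest descent, scaled by the learning rate; for this quadratic function the iterates contract geometrically towards the minimum.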

Principal Component Analysis

  • Week 1:

    Mean of a dataset;

    One dimensional variance;

    Covariance matrix;

    Linear transformation properties for the mean, variance and covariance;

    Numpy tutorial (from the course lab);

    A little gift for you, if you come this far :)

  • Week 2:

    Dot product, angles and distance between vectors;

    Inner products;

    Inner products and length of vectors;

    Inner products, orthogonality and angle between vectors.

  • Week 3:

    Projections onto 1-D subspace;

    Projections in higher dimensions (N-D subspace).

  • Week 4:

    PCA objective and key ideas;

    Coordinates of projected data;

    Derivation of the average square reconstruction error;

    Finding the basis vectors that span the principal subspace;

    Summary of key equations. My personal note on how to create a PCA function (to help in the final assignment); a minimal numpy sketch follows this list.
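
For reference, here is a minimal numpy sketch of a PCA function along those lines. It is my own illustration, assuming the usual recipe of mean-centring, covariance matrix, and eigendecomposition; it is not the course's official solution, and the toy data at the end is made up.

```python
# Minimal numpy sketch (not the course solution): PCA via the covariance eigendecomposition.
import numpy as np

def pca(X, n_components):
    """Project the rows of X onto the top n_components principal directions."""
    X_centered = X - X.mean(axis=0)                  # subtract the mean of the dataset
    cov = np.cov(X_centered, rowvar=False)           # covariance matrix of the features
    eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh: covariance matrix is symmetric
    order = np.argsort(eigenvalues)[::-1]            # sort by decreasing variance
    basis = eigenvectors[:, order[:n_components]]    # basis of the principal subspace
    return X_centered @ basis                        # coordinates of the projected data

# Toy data: 100 points in 3-D that mostly vary along one direction
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1)) @ np.array([[3.0, 2.0, 1.0]]) + 0.1 * rng.normal(size=(100, 3))
print(pca(X, n_components=2).shape)                  # (100, 2)
```

np.linalg.eigh is used rather than np.linalg.eig because the covariance matrix is symmetric, which guarantees real eigenvalues and orthonormal eigenvectors.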
