ddbourgin / numpy-ml

License: GPL-3.0
Machine learning, in numpy

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to numpy-ml

Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with TensorFlow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-98.79%)
Mutual labels:  neural-networks, lstm, vae
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-99.26%)
Mutual labels:  neural-networks, attention, vae
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-99.74%)
Mutual labels:  word2vec, lstm, attention
Deep Time Series Prediction
Seq2Seq, BERT, Transformer, and WaveNet for time series prediction.
Stars: ✭ 183 (-98.35%)
Mutual labels:  lstm, attention, wavenet
Neural Tangents
Fast and Easy Infinite Neural Networks in Python
Stars: ✭ 1,357 (-87.77%)
Mutual labels:  neural-networks, bayesian-inference, gaussian-processes
Gam
A PyTorch implementation of "Graph Classification Using Structural Attention" (KDD 2018).
Stars: ✭ 227 (-97.95%)
Mutual labels:  reinforcement-learning, neural-networks, attention
Pycadl
Python package with source code from the course "Creative Applications of Deep Learning w/ TensorFlow"
Stars: ✭ 356 (-96.79%)
Mutual labels:  word2vec, vae, wavenet
Ai Reading Materials
Some of the ML- and DL-related reading materials and research papers that I've read
Stars: ✭ 79 (-99.29%)
Mutual labels:  reinforcement-learning, lstm
Tensorflow seq2seq chatbot
Stars: ✭ 81 (-99.27%)
Mutual labels:  neural-networks, lstm
Gpstuff
GPstuff - Gaussian process models for Bayesian analysis
Stars: ✭ 106 (-99.05%)
Mutual labels:  bayesian-inference, gaussian-processes
Nlp Journey
Documents, papers, and code related to Natural Language Processing, including Topic Model, Word Embedding, Named Entity Recognition, Text Classification, Text Generation, Text Similarity, Machine Translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (-88.38%)
Mutual labels:  word2vec, attention
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-99.3%)
Mutual labels:  lstm, attention
Outlace.github.io
Machine learning and data science blog.
Stars: ✭ 65 (-99.41%)
Mutual labels:  reinforcement-learning, neural-networks
Aorun
Deep Learning over PyTorch
Stars: ✭ 61 (-99.45%)
Mutual labels:  neural-networks, bayesian-inference
Self Attention Classification
Document classification using LSTM + self-attention
Stars: ✭ 84 (-99.24%)
Mutual labels:  lstm, attention
Mojitalk
Code for "MojiTalk: Generating Emotional Responses at Scale" https://arxiv.org/abs/1711.04090
Stars: ✭ 107 (-99.04%)
Mutual labels:  reinforcement-learning, vae
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-99.42%)
Mutual labels:  word2vec, attention
Pumas.jl
Pharmaceutical Modeling and Simulation for Nonlinear Mixed Effects (NLME), Quantitative Systems Pharmacology (QSP), and Physiologically-Based Pharmacokinetics (PBPK) models, mixed with machine learning
Stars: ✭ 84 (-99.24%)
Mutual labels:  neural-networks, bayesian-inference
Safeopt
Safe Bayesian Optimization
Stars: ✭ 90 (-99.19%)
Mutual labels:  reinforcement-learning, gaussian-processes
Sarcasmdetection
Sarcasm detection on tweets using neural networks
Stars: ✭ 99 (-99.11%)
Mutual labels:  neural-networks, lstm

numpy-ml

Ever wish you had an inefficient but somewhat legible collection of machine learning algorithms implemented exclusively in NumPy? No?

Installation

For rapid experimentation

To use this code as a starting point for ML prototyping / experimentation, just clone the repository, create a new virtualenv, and start hacking:

$ git clone https://github.com/ddbourgin/numpy-ml.git
$ cd numpy-ml && virtualenv npml && source npml/bin/activate
$ pip3 install -r requirements-dev.txt
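
To sanity-check the environment, try importing the package from the repository root and running the test suite. This is a minimal sketch: it assumes pytest is provided by requirements-dev.txt, which is typical for a dev-requirements file but worth verifying.

$ python -c "import numpy_ml"  # the clone should be importable from the repo root
$ pytest numpy_ml              # assumes pytest is among the dev requirements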

As a package

If you don't plan to modify the source, you can also install numpy-ml as a Python package: pip3 install -U numpy_ml.

The reinforcement learning agents train on environments defined in the OpenAI Gym. To install these dependencies alongside numpy-ml, you can use pip3 install -U 'numpy_ml[rl]'.
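
For orientation, here is what the agent/environment interaction looks like with the classic (pre-0.26) Gym API, using a random policy. This sketch illustrates only the environment interface the RL agents train against; it does not use numpy-ml's agent classes, and note that newer Gym/Gymnasium releases changed the reset/step signatures.

import gym

# Minimal random-policy rollout against a classic-API Gym environment.
env = gym.make("CartPole-v0")
obs = env.reset()                # initial observation
done, total_reward = False, 0.0
while not done:
    action = env.action_space.sample()           # random policy
    obs, reward, done, info = env.step(action)   # advance one timestep
    total_reward += reward
env.close()
print(f"episode return: {total_reward}")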

Documentation

For more details on the available models, see the project documentation.

Available models

  1. Gaussian mixture model

    • EM training
  2. Hidden Markov model

    • Viterbi decoding
    • Likelihood computation
    • MLE parameter estimation via Baum-Welch/forward-backward algorithm
  3. Latent Dirichlet allocation (topic model)

    • Standard model with MLE parameter estimation via variational EM
    • Smoothed model with MAP parameter estimation via MCMC
  4. Neural networks

    • Layers / Layer-wise ops
      • Add
      • Flatten
      • Multiply
      • Softmax
      • Fully-connected/Dense
      • Sparse evolutionary connections
      • LSTM
      • Elman-style RNN
      • Max + average pooling
      • Dot-product attention
      • Embedding layer
      • Restricted Boltzmann machine (w. CD-n training)
      • 2D deconvolution (w. padding and stride)
      • 2D convolution (w. padding, dilation, and stride)
      • 1D convolution (w. padding, dilation, stride, and causality)
    • Modules
      • Bidirectional LSTM
      • ResNet-style residual blocks (identity and convolution)
      • WaveNet-style residual blocks with dilated causal convolutions
      • Transformer-style multi-headed scaled dot product attention
    • Regularizers
      • Dropout
    • Normalization
      • Batch normalization (spatial and temporal)
      • Layer normalization (spatial and temporal)
    • Optimizers
      • SGD w/ momentum
      • AdaGrad
      • RMSProp
      • Adam
    • Learning Rate Schedulers
      • Constant
      • Exponential
      • Noam/Transformer
      • Dlib scheduler
    • Weight Initializers
      • Glorot/Xavier uniform and normal
      • He/Kaiming uniform and normal
      • Standard and truncated normal
    • Losses
      • Cross entropy
      • Squared error
      • Bernoulli VAE loss
      • Wasserstein loss with gradient penalty
      • Noise contrastive estimation loss
    • Activations
      • ReLU
      • Tanh
      • Affine
      • Sigmoid
      • Leaky ReLU
      • ELU
      • SELU
      • Exponential
      • Hard Sigmoid
      • Softplus
    • Models
      • Bernoulli variational autoencoder
      • Wasserstein GAN with gradient penalty
      • word2vec encoder with skip-gram and CBOW architectures
    • Utilities
      • col2im (MATLAB port)
      • im2col (MATLAB port)
      • conv1D
      • conv2D
      • deconv2D
      • minibatch
  5. Tree-based models

    • Decision trees (CART)
    • [Bagging] Random forests
    • [Boosting] Gradient-boosted decision trees
  6. Linear models

    • Ridge regression (see the NumPy sketch after this list)
    • Logistic regression
    • Ordinary least squares
    • Bayesian linear regression w/ conjugate priors
      • Unknown mean, known variance (Gaussian prior)
      • Unknown mean, unknown variance (Normal-Gamma / Normal-Inverse-Wishart prior)
  7. n-Gram sequence models

    • Maximum likelihood scores
    • Additive/Lidstone smoothing
    • Simple Good-Turing smoothing
  8. Multi-armed bandit models

    • UCB1
    • LinUCB
    • Epsilon-greedy (a second NumPy sketch follows the list)
    • Thompson sampling w/ conjugate priors
      • Beta-Bernoulli sampler
  9. Reinforcement learning models

    • Cross-entropy method agent
    • First visit on-policy Monte Carlo agent
    • Weighted incremental importance sampling Monte Carlo agent
    • Expected SARSA agent
    • TD(0) Q-learning agent
    • Dyna-Q / Dyna-Q+ with prioritized sweeping
  10. Nonparametric models

    • Nadaraya-Watson kernel regression
    • k-Nearest neighbors classification and regression
    • Gaussian process regression
  11. Matrix factorization

    • Regularized alternating least-squares
    • Non-negative matrix factorization
  12. Preprocessing

    • Discrete Fourier transform (1D signals)
    • Discrete cosine transform (type-II) (1D signals)
    • Bilinear interpolation (2D signals)
    • Nearest neighbor interpolation (1D and 2D signals)
    • Autocorrelation (1D signals)
    • Signal windowing
    • Text tokenization
    • Feature hashing
    • Feature standardization
    • One-hot encoding / decoding
    • Huffman coding / decoding
    • Term frequency-inverse document frequency (TF-IDF) encoding
    • MFCC encoding
  13. Utilities

    • Similarity kernels
    • Distance metrics
    • Priority queue
    • Ball tree
    • Discrete sampler
    • Graph processing and generators
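
Most of the models above reduce to a handful of NumPy operations. As an illustration of the style (an independent sketch of the standard closed-form solution, not code from this repository), here is the ridge regression listed under item 6:

import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X^T X + alpha * I)^{-1} X^T y."""
    A = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)

# Usage on synthetic data: the recovered weights should be near [2, -1, 0.5].
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=100)
print(ridge_fit(X, y, alpha=0.1))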

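Similarly, the epsilon-greedy strategy from the bandit models in item 8 fits in a few lines. This is again a standalone sketch of the technique rather than this package's implementation; the pull(arm) reward callback is an interface invented for the example.

import numpy as np

def epsilon_greedy(pull, n_arms, n_steps=1000, eps=0.1, seed=0):
    """With probability eps pull a random arm (explore); otherwise pull the
    arm with the highest running mean reward (exploit)."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(n_arms)  # number of pulls per arm
    values = np.zeros(n_arms)  # running mean reward per arm
    for _ in range(n_steps):
        arm = rng.integers(n_arms) if rng.random() < eps else int(np.argmax(values))
        reward = pull(arm)
        counts[arm] += 1
        values[arm] += (reward - values[arm]) / counts[arm]  # incremental mean
    return values, counts

# Example: three Bernoulli arms; the estimate for arm 2 should approach 0.8.
probs = np.array([0.2, 0.5, 0.8])
arm_rng = np.random.default_rng(1)
values, counts = epsilon_greedy(lambda a: float(arm_rng.random() < probs[a]), 3)
print(values, counts)
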
Contributing

Am I missing your favorite model? Is there something that could be cleaner / less confusing? Did I mess something up? Submit a PR! The only requirement is that your models are written with just the Python standard library and NumPy. The SciPy library is also permitted under special circumstances ;)

See full contributing guidelines here.
