blue-season / Pywarm

License: MIT
A cleaner way to build neural networks for PyTorch.

Programming Languages

python
139335 projects - #7 most used programming language
python3
1442 projects

Projects that are alternatives to or similar to Pywarm

Tageditor
🏖TagEditor - Annotation tool for spaCy
Stars: ✭ 92 (-50%)
Mutual labels:  data-science, neural-networks
Learn Machine Learning
Learn to Build a Machine Learning Application from Top Articles
Stars: ✭ 116 (-36.96%)
Mutual labels:  data-science, neural-networks
Codesearchnet
Datasets, tools, and benchmarks for representation learning of code.
Stars: ✭ 1,378 (+648.91%)
Mutual labels:  data-science, neural-networks
Dltk
Deep Learning Toolkit for Medical Image Analysis
Stars: ✭ 1,249 (+578.8%)
Mutual labels:  data-science, neural-networks
Fixy
Our goal is to build an open-source spelling assistant/checker that solves many different problems in the Turkish NLP literature at once, proposes unique approaches, and addresses the shortcomings of existing work. It resolves spelling errors in users' texts with a deep learning approach, and also performs semantic analysis on the text to detect and correct errors that arise in that context.
Stars: ✭ 165 (-10.33%)
Mutual labels:  data-science, neural-networks
Knet.jl
Koç University deep learning framework.
Stars: ✭ 1,260 (+584.78%)
Mutual labels:  data-science, neural-networks
Keras Contrib
Keras community contributions
Stars: ✭ 1,532 (+732.61%)
Mutual labels:  data-science, neural-networks
Machine Learning From Scratch
Succinct Machine Learning algorithm implementations from scratch in Python, solving real-world problems (Notebooks and Book). Examples of Logistic Regression, Linear Regression, Decision Trees, K-means clustering, Sentiment Analysis, Recommender Systems, Neural Networks and Reinforcement Learning.
Stars: ✭ 42 (-77.17%)
Mutual labels:  data-science, neural-networks
Ml Workspace
🛠 All-in-one web-based IDE specialized for machine learning and data science.
Stars: ✭ 2,337 (+1170.11%)
Mutual labels:  data-science, neural-networks
Autograd.jl
Julia port of the Python autograd package.
Stars: ✭ 147 (-20.11%)
Mutual labels:  data-science, neural-networks
Mit Deep Learning
Tutorials, assignments, and competitions for MIT Deep Learning related courses.
Stars: ✭ 8,912 (+4743.48%)
Mutual labels:  data-science, neural-networks
Jaxnet
Concise deep learning for JAX
Stars: ✭ 171 (-7.07%)
Mutual labels:  data-science, neural-networks
Ai Platform
An open-source platform for automating tasks using machine learning models
Stars: ✭ 61 (-66.85%)
Mutual labels:  data-science, neural-networks
Vvedenie Mashinnoe Obuchenie
📝 A curated collection of machine learning resources
Stars: ✭ 1,282 (+596.74%)
Mutual labels:  data-science, neural-networks
Mckinsey Smartcities Traffic Prediction
An adventure into using multi-attention recurrent neural networks for time series (city traffic) for the 2017-11-18 McKinsey IronMan (24h non-stop) prediction challenge
Stars: ✭ 49 (-73.37%)
Mutual labels:  data-science, neural-networks
Sigmoidal ai
Python, Data Science, Machine Learning, and Deep Learning tutorials - Sigmoidal
Stars: ✭ 103 (-44.02%)
Mutual labels:  data-science, neural-networks
Sciblog support
Support content for my blog
Stars: ✭ 694 (+277.17%)
Mutual labels:  data-science, neural-networks
Pyclustering
pyclustering is a Python/C++ data mining library.
Stars: ✭ 806 (+338.04%)
Mutual labels:  data-science, neural-networks
Uncertainty Metrics
An easy-to-use interface for measuring uncertainty and robustness.
Stars: ✭ 145 (-21.2%)
Mutual labels:  data-science, neural-networks
Auptimizer
An automatic ML model optimization tool.
Stars: ✭ 166 (-9.78%)
Mutual labels:  data-science, neural-networks

PyWarm

A cleaner way to build neural networks for PyTorch.

Examples | Tutorial | API reference


Introduction

PyWarm is a lightweight, high-level neural network construction API for PyTorch. It lets you define all parts of a network in a functional style.

With PyWarm, you can put all of the network's data flow logic in the forward() method of your model, without having to define child modules in the __init__() method and then reference them again in forward(). This results in a much more readable model definition in fewer lines of code.

PyWarm only aims to simplify the network definition, and does not attempt to cover model training, validation or data handling.


For example, here is a convnet for MNIST, shown first in the Warm version and then in the vanilla Torch version:

# powered by PyWarm
import torch.nn as nn
import torch.nn.functional as F
import warm
import warm.functional as W


class ConvNet(nn.Module):

    def __init__(self):
        super().__init__()
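        # warm.up initializes all parameters by tracing forward() once with a dummy input of this shape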
        warm.up(self, [2, 1, 28, 28])

    def forward(self, x):
        x = W.conv(x, 20, 5, activation='relu')
        x = F.max_pool2d(x, 2)
        x = W.conv(x, 50, 5, activation='relu')
        x = F.max_pool2d(x, 2)
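        # flatten: 50 channels x 4 x 4 spatial positions = 800 features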
        x = x.view(-1, 800)
        x = W.linear(x, 500, activation='relu')
        x = W.linear(x, 10)
        return F.log_softmax(x, dim=1)
# vanilla PyTorch version, taken from
# pytorch tutorials/beginner_source/blitz/neural_networks_tutorial.py 
import torch.nn as nn
import torch.nn.functional as F


class ConvNet(nn.Module):

    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5, 1)
        self.conv2 = nn.Conv2d(20, 50, 5, 1)
        self.fc1 = nn.Linear(4*4*50, 500)
        self.fc2 = nn.Linear(500, 10)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = F.max_pool2d(x, 2, 2)
        x = F.relu(self.conv2(x))
        x = F.max_pool2d(x, 2, 2)
        x = x.view(-1, 4*4*50)
        x = F.relu(self.fc1(x))
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)
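
Either version can be used the same way. A quick sanity check, assuming one of the ConvNet definitions above is in scope:

import torch

net = ConvNet()
logits = net(torch.randn(2, 1, 28, 28))  # a batch of two 1x28x28 grayscale images
print(logits.shape)  # torch.Size([2, 10])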

A couple of things you may have noticed:

  • First of all, in the PyWarm version, the entire network definition and data flow logic resides in the forward() method. You don't have to look up and down repeatedly to understand what self.conv1, self.fc1, etc. are doing.

  • You do not need to track and specify in_channels (or in_features, etc.) for network layers. PyWarm infers this information for you, e.g.:

# Warm
x = W.conv(x, 20, 5, activation='relu')
x = W.conv(x, 50, 5, activation='relu')


# Torch
self.conv1 = nn.Conv2d(1, 20, 5, 1)
self.conv2 = nn.Conv2d(20, 50, 5, 1)
  • One unified W.conv for all 1D, 2D, and 3D cases. Fewer things to keep track of!

  • activation='relu'. All warm.functional APIs accept an optional activation keyword, which is basically equivalent to wrapping the call, as in F.relu(W.conv(...)). The activation keyword can also take a callable, for example activation=torch.nn.ReLU(inplace=True) or activation=swish; see the sketch after this list.

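To make the callable form concrete, here is a minimal sketch; SwishNet is a hypothetical example model, and swish is defined locally since PyWarm does not provide one:

import torch
import torch.nn as nn
import warm
import warm.functional as W


def swish(x):
    # swish(x) = x * sigmoid(x), a smooth alternative to relu
    return x * torch.sigmoid(x)


class SwishNet(nn.Module):  # hypothetical example model
    def __init__(self):
        super().__init__()
        warm.up(self, [2, 1, 28, 28])

    def forward(self, x):
        # a string name, an nn.Module instance, or a plain callable all work
        x = W.conv(x, 20, 5, activation=swish)
        x = W.conv(x, 50, 5, activation=nn.ReLU(inplace=True))
        return x
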
For deeper neural networks, see additional examples.


Installation

pip3 install pywarm

Quick start: 30 seconds to PyWarm

If you already have experience with PyTorch, using PyWarm is very straightforward:

  • First, import PyWarm in your model file:
import warm
import warm.functional as W
  • Second, remove child module definitions from the model's __init__() method. Instead, use W.conv, W.linear, etc. in the model's forward() method, just as you would use the torch.nn.functional APIs F.max_pool2d, F.relu, etc.

    For example, instead of writing:

# Torch
class MyModule(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size)
        # other child module definitions
    def forward(self, x):
        x = self.conv1(x)
        # more forward steps
  • You can now write it in the warm way:
# Warm
class MyWarmModule(nn.Module):
    def __init__(self):
        super().__init__()
        warm.up(self, input_shape_or_data)
    def forward(self, x):
        x = W.conv(x, out_channels, kernel_size) # no in_channels needed
        # more forward steps
  • Finally, don't forget to warmify the model by adding

    warm.up(self, input_shape_or_data)

    at the end of the model's __init__() method. You need to supply input_shape_or_data, which is either a tensor of input data, or just its shape, e.g. [2, 1, 28, 28] for MNIST inputs.

    The model is now ready to use, just like any other PyTorch model. A complete minimal sketch follows below.

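Putting the steps together, here is a complete minimal sketch; TinyNet and its layer sizes are illustrative assumptions, not part of the PyWarm API:

import torch
import torch.nn as nn
import warm
import warm.functional as W


class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # a plain shape works; an actual tensor such as torch.randn(2, 8) would too
        warm.up(self, [2, 8])

    def forward(self, x):
        x = W.linear(x, 16, activation='relu')
        return W.linear(x, 4)


net = TinyNet()
y = net(torch.randn(2, 8))  # y.shape == torch.Size([2, 4])
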
Check out the tutorial and examples if you want to learn more!


Testing

Clone the repository first, then

cd pywarm
pytest -v

Documentation

Documentation is generated using the excellent Portray package.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].