
lucidrains / Siren Pytorch

License: MIT
PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to Siren Pytorch

Tensorflow Dataset Tutorial
Notebook for my medium article about how to use Dataset API in TensorFlow
Stars: ✭ 158 (-6.51%)
Mutual labels:  artificial-intelligence
Wyrm
Autodifferentiation package in Rust.
Stars: ✭ 164 (-2.96%)
Mutual labels:  artificial-intelligence
Perfect Tensorflow
TensorFlow C API Class Wrapper in Server Side Swift.
Stars: ✭ 166 (-1.78%)
Mutual labels:  artificial-intelligence
Ecg Arrhythmia Classification
ECG arrhythmia classification using a 2-D convolutional neural network
Stars: ✭ 159 (-5.92%)
Mutual labels:  artificial-intelligence
Visualizer
A single-page website aiming to provide innovative and intuitive visualizations of common and AI algorithms.
Stars: ✭ 163 (-3.55%)
Mutual labels:  artificial-intelligence
Mindpark
Testbed for deep reinforcement learning
Stars: ✭ 163 (-3.55%)
Mutual labels:  artificial-intelligence
Rnnt Speech Recognition
End-to-end speech recognition using RNN Transducers in Tensorflow 2.0
Stars: ✭ 158 (-6.51%)
Mutual labels:  artificial-intelligence
Awesome Machine Learning In Compilers
Must read research papers and links to tools and datasets that are related to using machine learning for compilers and systems optimisation
Stars: ✭ 168 (-0.59%)
Mutual labels:  artificial-intelligence
Awesome Ai
A curated list of artificial intelligence resources (courses, tools, apps, open source projects)
Stars: ✭ 161 (-4.73%)
Mutual labels:  artificial-intelligence
Blog
Technical blog repo of metaflow
Stars: ✭ 165 (-2.37%)
Mutual labels:  artificial-intelligence
Dynamics
A Compositional Object-Based Approach to Learning Physical Dynamics
Stars: ✭ 159 (-5.92%)
Mutual labels:  artificial-intelligence
Lazynlp
Library to scrape and clean web pages to create massive datasets.
Stars: ✭ 1,985 (+1074.56%)
Mutual labels:  artificial-intelligence
Iresnet
Improved Residual Networks (https://arxiv.org/pdf/2004.04989.pdf)
Stars: ✭ 163 (-3.55%)
Mutual labels:  artificial-intelligence
Avalanche
Avalanche: an End-to-End Library for Continual Learning.
Stars: ✭ 151 (-10.65%)
Mutual labels:  artificial-intelligence
Awesome Ml Courses
Awesome free machine learning and AI courses with video lectures.
Stars: ✭ 2,145 (+1169.23%)
Mutual labels:  artificial-intelligence
Edge Ai
A curated list of resources for embedded AI
Stars: ✭ 157 (-7.1%)
Mutual labels:  artificial-intelligence
Curvature
A full-featured editor for working with Utility-based AI
Stars: ✭ 163 (-3.55%)
Mutual labels:  artificial-intelligence
Colab
Continual Learning tutorials and demo running on Google Colaboratory.
Stars: ✭ 168 (-0.59%)
Mutual labels:  artificial-intelligence
Slot Attention
Implementation of Slot Attention from GoogleAI
Stars: ✭ 168 (-0.59%)
Mutual labels:  artificial-intelligence
Fixy
Our aim is to build an open source spelling assistant/checker that solves many different problems in the Turkish NLP literature at once, proposes unique approaches, and addresses the shortcomings of existing work. It corrects spelling mistakes in users' texts with a deep learning approach, and also performs semantic analysis on the text to detect and correct errors that arise in that context.
Stars: ✭ 165 (-2.37%)
Mutual labels:  artificial-intelligence

SIREN in PyTorch


PyTorch implementation of SIREN - Implicit Neural Representations with Periodic Activation Functions

Install

$ pip install siren-pytorch

Usage

A SIREN-based multi-layer neural network

import torch
from torch import nn
from siren_pytorch import SirenNet

net = SirenNet(
    dim_in = 2,                        # input dimension, e.g. 2D coordinates
    dim_hidden = 256,                  # hidden dimension
    dim_out = 3,                       # output dimension, e.g. RGB value
    num_layers = 5,                    # number of layers
    final_activation = nn.Sigmoid(),   # activation of the final layer (nn.Identity() for direct output)
    w0_initial = 30.                   # different signals may require a different omega_0 in the first layer - this is a hyperparameter
)

coor = torch.randn(1, 2)
net(coor) # (1, 3) <- rgb value
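
The network maps each input coordinate to an output value, so fitting an image amounts to evaluating it over a grid of pixel coordinates. Below is a minimal sketch (not part of the library) that continues the example above, assuming coordinates normalized to [-1, 1] for a 256 x 256 image:

import torch

h, w = 256, 256
ys = torch.linspace(-1., 1., h)        # normalizing to [-1, 1] is our assumption, not mandated by the library
xs = torch.linspace(-1., 1., w)
grid = torch.stack(torch.meshgrid(ys, xs, indexing = 'ij'), dim = -1)  # (h, w, 2)
coords = grid.reshape(-1, 2)           # (h * w, 2) batch of 2D coordinates

rgb = net(coords)                      # (h * w, 3), reusing `net` from the example above
image = rgb.reshape(h, w, 3)           # reassemble the predictions into an image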

One SIREN layer

import torch
from siren_pytorch import Siren

neuron = Siren(
    dim_in = 3,
    dim_out = 256
)

coor = torch.randn(1, 3)
neuron(coor) # (1, 256)
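
Under the hood, a SIREN layer applies a sine with frequency w0 to an affine transform of its input, i.e. sin(w0 * (Wx + b)). The following is a minimal sketch of such a layer, using the weight initialization proposed in the SIREN paper; the library's exact implementation may differ in detail:

import math
import torch
from torch import nn

class SirenLayerSketch(nn.Module):
    # illustration only: computes sin(w0 * (W x + b))
    def __init__(self, dim_in, dim_out, w0 = 1., is_first = False):
        super().__init__()
        self.w0 = w0
        self.linear = nn.Linear(dim_in, dim_out)
        # paper's scheme: U(-1/n, 1/n) for the first layer, U(-sqrt(6/n)/w0, sqrt(6/n)/w0) otherwise
        bound = 1 / dim_in if is_first else math.sqrt(6 / dim_in) / w0
        nn.init.uniform_(self.linear.weight, -bound, bound)
        nn.init.uniform_(self.linear.bias, -bound, bound)   # bias bound chosen to match the weights; a detail left open by the paper

    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))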

Sine activation (just a wrapper around torch.sin)

import torch
from siren_pytorch import Sine

act = Sine(1.)
coor = torch.randn(1, 2)
act(coor)

A wrapper for training a given SirenNet on a specific image of a specified height and width, and then generating that image from the trained network.

import torch
from torch import nn
from siren_pytorch import SirenNet, SirenWrapper

net = SirenNet(
    dim_in = 2,                        # input dimension, e.g. 2D coordinates
    dim_hidden = 256,                  # hidden dimension
    dim_out = 3,                       # output dimension, e.g. RGB value
    num_layers = 5,                    # number of layers
    w0_initial = 30.                   # different signals may require a different omega_0 in the first layer - this is a hyperparameter
)

wrapper = SirenWrapper(
    net,
    image_width = 256,
    image_height = 256
)

img = torch.randn(1, 3, 256, 256)
loss = wrapper(img)
loss.backward()

# after much training ...
# simply invoke the wrapper without passing in anything

pred_img = wrapper() # (1, 3, 256, 256)
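
The "after much training ..." step is left open above; a minimal fitting loop continuing that example might look like the following sketch, assuming a plain Adam optimizer and an arbitrary number of steps (neither choice is prescribed by the library):

from torch.optim import Adam

opt = Adam(wrapper.parameters(), lr = 1e-4)   # learning rate is an arbitrary choice here

for _ in range(1000):                         # number of steps is arbitrary here
    opt.zero_grad()
    loss = wrapper(img)                       # reconstruction loss against the target image
    loss.backward()
    opt.step()

# once the loss has converged, wrapper() (with no arguments) returns the reconstructed image as above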

Citations

@misc{sitzmann2020implicit,
    title={Implicit Neural Representations with Periodic Activation Functions},
    author={Vincent Sitzmann and Julien N. P. Martel and Alexander W. Bergman and David B. Lindell and Gordon Wetzstein},
    year={2020},
    eprint={2006.09661},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}