djsaunde / spiketorch

Licence: other
Experiments with spiking neural networks (SNNs) in PyTorch. See https://github.com/BINDS-LAB-UMASS/bindsnet for the successor to this project.

Programming Languages

python
shell

Projects that are alternatives of or similar to spiketorch

IJCNN2016
Diverse, Noisy and Parallel: a New Spiking Neural Network Approach for Humanoid Robot Control
Stars: ✭ 14 (-83.13%)
Mutual labels:  spiking-neural-networks, snn
BrainModels
Brain models implementation with BrainPy
Stars: ✭ 36 (-56.63%)
Mutual labels:  neurons, spiking-neural-networks
spikeflow
Python library for easy creation and running of spiking neural networks in tensorflow.
Stars: ✭ 30 (-63.86%)
Mutual labels:  spiking-neural-networks, snn
WheatNNLeek
Spiking neural network system
Stars: ✭ 26 (-68.67%)
Mutual labels:  spiking-neural-networks, snn
SNNs-In-Tensorflow
Implementation of a Spiking Neural Network in Tensorflow.
Stars: ✭ 24 (-71.08%)
Mutual labels:  spiking-neural-networks, snn
hybrid-snn-conversion
Training spiking networks with hybrid ann-snn conversion and spike-based backpropagation
Stars: ✭ 72 (-13.25%)
Mutual labels:  spiking-neural-networks, snn
DL-NC
spiking-neural-networks
Stars: ✭ 34 (-59.04%)
Mutual labels:  spiking-neural-networks, snn
navis
Python 3 library for analysis of neuroanatomical data
Stars: ✭ 68 (-18.07%)
Mutual labels:  neurons
Neurapse
Neurapse simulations for SNNs
Stars: ✭ 22 (-73.49%)
Mutual labels:  snn
rA9
JAX-based Spiking Neural Network framework
Stars: ✭ 60 (-27.71%)
Mutual labels:  spiking-neural-networks
norse
Deep learning for spiking neural networks
Stars: ✭ 59 (-28.92%)
Mutual labels:  spiking-neural-networks
snn object recognition
One-Shot Object Appearance Learning using Spiking Neural Networks
Stars: ✭ 23 (-72.29%)
Mutual labels:  spiking-neural-networks
CARLsim4
CARLsim is an efficient, easy-to-use, GPU-accelerated software framework for simulating large-scale spiking neural network (SNN) models with a high degree of biological detail.
Stars: ✭ 75 (-9.64%)
Mutual labels:  spiking-neural-networks
spore-nest-module
Synaptic Plasticity with Online Reinforcement learning
Stars: ✭ 24 (-71.08%)
Mutual labels:  spiking-neural-networks
models
This repository will host models, modules, algorithms and applications developed by the INRC Community to run on the Intel Loihi Platform.
Stars: ✭ 59 (-28.92%)
Mutual labels:  spiking-neural-networks
PyTorch-Spiking-YOLOv3
A PyTorch implementation of Spiking-YOLOv3. Two branches are provided, based on two common PyTorch implementations of YOLOv3 (ultralytics/yolov3 & eriklindernoren/PyTorch-YOLOv3), with support for Spiking-YOLOv3-Tiny at present.
Stars: ✭ 144 (+73.49%)
Mutual labels:  snn
LSM
Liquid State Machines in Python and NEST
Stars: ✭ 39 (-53.01%)
Mutual labels:  spiking-neural-networks
BrainPy
Brain Dynamics Programming in Python
Stars: ✭ 242 (+191.57%)
Mutual labels:  spiking-neural-networks
snn angular velocity
Event-Based Angular Velocity Regression with Spiking Networks
Stars: ✭ 91 (+9.64%)
Mutual labels:  spiking-neural-networks
bindsnet
Simulation of spiking neural networks (SNNs) using PyTorch.
Stars: ✭ 34 (-59.04%)
Mutual labels:  spiking-neural-networks

SpikeTorch

A Python package for simulating spiking neural networks (SNNs) in PyTorch.

At the moment, the focus is on replicating the SNN described in Unsupervised learning of digit recognition using spike-timing-dependent plasticity (original code found here, extensions thereof found in my previous project repository here).

We are currently interested in applying SNNs to simple machine learning (ML) tasks, but the code can be used for any purpose.

Requirements

All code was developed using Python 3.6.x and will fail if run with Python 2.x. Run pip install -r requirements.txt to install all project dependencies. You may have to consult the PyTorch webpage in order to get the right installation for your machine.

Setting things up

To begin, download and unzip the MNIST dataset by running ./data/get_MNIST.sh. To build the spiketorch package from source, change directory to the top level of this project and issue pip install . (PyPI support hopefully coming soon). After making changes to code in the spiketorch directory, reinstall with pip install . -U or pip install . --upgrade at the top level of the project.

To replicate the SNN from the above paper, run python examples/eth.py. A number of optional command-line arguments can be passed in, including --plot (displays useful monitoring figures), --n_neurons [int] (number of excitatory and inhibitory neurons simulated), --mode ['train' | 'test'] (sets network operation to the training or testing phase), and more. Run python examples/eth.py --help for more information on the command-line arguments.

Note: This is a work in progress, including the replication script examples/eth.py and other modifications in examples/.

Background

One computational challenge is simulating time-dependent neuronal dynamics. This is typically done by solving ordinary differential equations (ODEs) which describe said dynamics. PyTorch does not explicitly support the solution of differential equations (as opposed to brian2, for example), but we can convert the ODEs defining the dynamics into difference equations and solve them at regular, short intervals (a dt on the order of 1 millisecond) as an approximation. Of course, under the hood, packages like brian2 are doing the same thing. Doing this in PyTorch is exciting for a few reasons:

  1. We can use the powerful and flexible torch.Tensor object, an n-dimensional array analogous to numpy.ndarray which can be transferred to and from GPU devices.

  2. We can avoid "reinventing the wheel" by repurposing functions from the torch.nn.functional PyTorch submodule in our SNN architectures; e.g., convolution or pooling functions.
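As an illustration of the difference-equation approach described above, here is a minimal sketch of Euler-discretized leaky integrate-and-fire (LIF) dynamics. It uses NumPy for self-containedness; the same vectorized update translates directly to torch.Tensor operations (and hence to the GPU). The parameter values are illustrative only, not the ones used in this project.

```python
import numpy as np

def lif_step(v, input_current, dt=1.0, tau=20.0, v_rest=-65.0,
             v_reset=-65.0, v_thresh=-52.0):
    """One Euler step of leaky integrate-and-fire (LIF) dynamics.

    Continuous form:  tau * dv/dt = (v_rest - v) + I(t)
    Difference form:  v <- v + (dt / tau) * ((v_rest - v) + I)
    """
    v = v + (dt / tau) * ((v_rest - v) + input_current)
    spikes = v >= v_thresh             # boolean spike vector
    v = np.where(spikes, v_reset, v)   # reset neurons that spiked
    return v, spikes

# Simulate 100 neurons for 200 ms (dt = 1 ms) under constant drive.
v = np.full(100, -65.0)
total_spikes = 0
for _ in range(200):
    v, spikes = lif_step(v, input_current=20.0)
    total_spikes += int(spikes.sum())
```

In a PyTorch version, np.full becomes torch.full, np.where becomes torch.where, and the loop body is otherwise unchanged.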

The idea that the ordering and relative timing of neurons' spikes encode information is a central theme in neuroscience. Markram et al. (1997) proposed that synapses between neurons should strengthen or weaken based on this relative timing; before that, Donald Hebb proposed the theory of Hebbian learning, often summarized as "Neurons that fire together wire together." Markram et al.'s extension of Hebbian theory is known as spike-timing-dependent plasticity (STDP).
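A minimal sketch of a pair-based STDP update, using the exponentially decaying spike traces common to most simulators. The rule and constants here are illustrative; the exact STDP variant used in this project may differ.

```python
import numpy as np

def stdp_step(w, pre_spikes, post_spikes, x_pre, x_post,
              dt=1.0, tau_trace=20.0, a_plus=0.01, a_minus=0.012,
              w_min=0.0, w_max=1.0):
    """One timestep of pair-based STDP with synaptic traces.

    x_pre / x_post are exponentially decaying traces of recent
    pre- / post-synaptic spikes. Potentiate on post-synaptic spikes
    (pre-before-post pairings), depress on pre-synaptic spikes
    (post-before-pre pairings).
    """
    # Decay the traces, then bump them where spikes occurred.
    x_pre = x_pre * np.exp(-dt / tau_trace) + pre_spikes
    x_post = x_post * np.exp(-dt / tau_trace) + post_spikes

    # w has shape (n_pre, n_post).
    w = w + a_plus * np.outer(x_pre, post_spikes)    # LTP
    w = w - a_minus * np.outer(pre_spikes, x_post)   # LTD
    return np.clip(w, w_min, w_max), x_pre, x_post
```

Because the update is expressed as outer products over whole populations, it vectorizes naturally, which is exactly what makes the torch.Tensor formulation attractive.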

We are interested in applying SNNs to machine learning problems. We use STDP to modify weights of synapses connecting pairs or populations of neurons in SNNs. In the context of ML, we want to learn a setting of synapse weights which will generate appropriate data-dependent spiking activity in SNNs. This activity will allow us to subsequently perform some ML task of interest; e.g., discriminating or clustering input data.

For now, we use the MNIST handwritten digit dataset, which, though somewhat antiquated, is simple enough to develop new machine learning techniques on. The goal is to find a setting of synapse weights which will allow us to discriminate categories of input data. Based on historical spiking activity on training examples, we assign each neuron in an excitatory population an input category and subsequently classify test data based on these assignments.
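The assignment-and-vote scheme described above can be sketched as follows. The helper names are hypothetical and the project's actual implementation may differ; NumPy is used here, but the same reductions map directly onto torch.Tensor operations.

```python
import numpy as np

def assign_labels(spike_counts, labels, n_classes=10):
    """Assign each excitatory neuron the class for which it spiked
    most, averaged over the training examples of that class.

    spike_counts: (n_examples, n_neurons) spikes per example
    labels:       (n_examples,) true class of each example
    """
    n_neurons = spike_counts.shape[1]
    rates = np.zeros((n_classes, n_neurons))
    for c in range(n_classes):
        mask = labels == c
        if mask.any():
            rates[c] = spike_counts[mask].mean(axis=0)
    return rates.argmax(axis=0)  # (n_neurons,) one label per neuron

def classify(spike_counts, assignments, n_classes=10):
    """Predict by averaging spikes over neurons sharing each label."""
    preds = np.zeros((spike_counts.shape[0], n_classes))
    for c in range(n_classes):
        mask = assignments == c
        if mask.any():
            preds[:, c] = spike_counts[:, mask].mean(axis=1)
    return preds.argmax(axis=1)
```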
