
google-research / Computation Thru Dynamics

License: apache-2.0
Understanding computation in artificial and biological recurrent networks through the lens of dynamical systems.

Projects that are alternatives of or similar to Computation Thru Dynamics

Graph nn
Graph Classification with Graph Convolutional Networks in PyTorch (NeurIPS 2018 Workshop)
Stars: ✭ 268 (-1.83%)
Mutual labels:  jupyter-notebook
Streamingphish
Python-based utility that uses supervised machine learning to detect phishing domains from the Certificate Transparency log network.
Stars: ✭ 271 (-0.73%)
Mutual labels:  jupyter-notebook
Popular Rl Algorithms
PyTorch implementation of Soft Actor-Critic (SAC), Twin Delayed DDPG (TD3), Actor-Critic (AC/A2C), Proximal Policy Optimization (PPO), QT-Opt, PointNet…
Stars: ✭ 266 (-2.56%)
Mutual labels:  jupyter-notebook
Deep Learning
No description, website, or topics provided.
Stars: ✭ 3,058 (+1020.15%)
Mutual labels:  jupyter-notebook
Notebooks Statistics And Machinelearning
Jupyter Notebooks from the old UnsupervisedLearning.com (RIP) machine learning and statistics blog
Stars: ✭ 270 (-1.1%)
Mutual labels:  jupyter-notebook
Unintended Ml Bias Analysis
Stars: ✭ 271 (-0.73%)
Mutual labels:  jupyter-notebook
Facet
Human-explainable AI.
Stars: ✭ 269 (-1.47%)
Mutual labels:  jupyter-notebook
Drq
DrQ: Data regularized Q
Stars: ✭ 268 (-1.83%)
Mutual labels:  jupyter-notebook
Machine learing study
Stars: ✭ 270 (-1.1%)
Mutual labels:  jupyter-notebook
Cryptocurrency Price Prediction
Cryptocurrency Price Prediction Using LSTM neural network
Stars: ✭ 271 (-0.73%)
Mutual labels:  jupyter-notebook
Gophernotes
The Go kernel for Jupyter notebooks and nteract.
Stars: ✭ 3,100 (+1035.53%)
Mutual labels:  jupyter-notebook
Deeplearningwithtf2.0
Practical Exercises in TensorFlow 2.0 for Ian Goodfellow's Deep Learning Book
Stars: ✭ 270 (-1.1%)
Mutual labels:  jupyter-notebook
Rad
RAD: Reinforcement Learning with Augmented Data
Stars: ✭ 268 (-1.83%)
Mutual labels:  jupyter-notebook
Pytorch Kaggle Starter
Pytorch starter kit for Kaggle competitions
Stars: ✭ 268 (-1.83%)
Mutual labels:  jupyter-notebook
Numerai
Code from my experiments on Numerai
Stars: ✭ 272 (-0.37%)
Mutual labels:  jupyter-notebook
Noah Research
Noah Research
Stars: ✭ 265 (-2.93%)
Mutual labels:  jupyter-notebook
Introduction To Python For Computational Science And Engineering
Book: Introduction to Python for Computational Science and Engineering
Stars: ✭ 271 (-0.73%)
Mutual labels:  jupyter-notebook
Aws Deepcomposer Samples
Stars: ✭ 273 (+0%)
Mutual labels:  jupyter-notebook
Data Science Is Software
Stars: ✭ 272 (-0.37%)
Mutual labels:  jupyter-notebook
Fba matting
Official repository for the paper F, B, Alpha Matting
Stars: ✭ 272 (-0.37%)
Mutual labels:  jupyter-notebook

Computation Through Dynamics

This repository contains a number of subprojects related to the interlinking of computation and dynamics in artificial and biological neural systems.

This is not an officially supported Google product.

Prerequisites

The code is written to be compatible with Python 3. You will also need:

  • JAX version 0.1.75 or greater (install)
  • jaxlib, latest version (installed with JAX)
  • NumPy, SciPy, Matplotlib (install the SciPy stack, which contains all of them)
  • h5py (install)
  • A GPU: XLA compiles these examples for CPU very slowly, so it is best to use a GPU for now. A quick device check is sketched below.
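
As a quick sanity check that JAX can see your accelerator, you can list the devices XLA will target:

    import jax

    # A working GPU install of jaxlib should list a GPU device here,
    # rather than only the CPU.
    print(jax.devices())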

Analysis of the toy model associated with How recurrent networks implement contextual processing in sentiment analysis

Neural networks have a remarkable capacity for contextual processing: using recent or nearby inputs to modify the processing of the current input. For example, in natural language, contextual processing is necessary to correctly interpret negation (e.g. the phrase "not bad"). However, our ability to understand how networks process context is limited. Here, we propose general methods for reverse engineering recurrent neural networks (RNNs) to identify and elucidate contextual processing.

This Jupyter notebook runs through the analysis of the toy model found in How recurrent networks implement contextual processing in sentiment analysis.

LFADS - Latent Factor Analysis via Dynamical Systems

LFADS is a tool for inferring dynamics from noisy, high-dimensional observations of a dynamical system. It is a sequential auto-encoder with some very particular bells and whistles. Here we have released a tutorial version, written in Python / NumPy / JAX, intentionally implemented with readability, comprehension, and innovation in mind. The full TensorFlow implementation, with run manager support, is also available (here).

The LFADS tutorial uses the integrator RNN example (see below). The LFADS tutorial attempts to infer the hidden states of the integrator RNN as well as the white noise input to the RNN. Run the integrator RNN example, then copy the resulting data file, written to /tmp/, into /tmp/LFADS/data/. Edit the name of the data file in run_lfads.py and then execute run_lfads.py.
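
For example, the copy step might look like the sketch below; the source filename here is hypothetical and should be replaced with whatever the integrator RNN example actually wrote:

    import shutil
    from pathlib import Path

    # Hypothetical filename; substitute the file the integrator RNN
    # example wrote into /tmp/.
    src = Path('/tmp/integrator_rnn_data.h5')
    dst_dir = Path('/tmp/LFADS/data')
    dst_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy(src, dst_dir / src.name)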

The LFADS tutorial is run through this Jupyter notebook.

Integrator RNN - train a Vanilla RNN to integrate white noise.

Integration is a very simple task and highlights how to set up a loop over time, batch over multiple input/target examples, use just-in-time compilation to speed the computation up, and take a gradient in JAX. The data from this example is also used as input for the LFADS tutorial.
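
As a rough sketch of that pattern (illustrative names and shapes, not the tutorial's actual code), a vanilla RNN step can be scanned over time with lax.scan, batched with vmap, and compiled and differentiated with jit and grad:

    import jax
    import jax.numpy as jnp

    def rnn_step(params, h, x):
        # Vanilla RNN update: h' = tanh(W h + U x + b).
        W, U, b = params
        return jnp.tanh(W @ h + U @ x + b)

    def run_rnn(params, h0, inputs):
        # Loop over time with lax.scan; inputs has shape (T, input_dim).
        def step(h, x):
            h_new = rnn_step(params, h, x)
            return h_new, h_new
        _, hs = jax.lax.scan(step, h0, inputs)
        return hs

    def loss(params, readout, h0, inputs, targets):
        # Batch over examples with vmap; inputs: (batch, T, input_dim).
        hs = jax.vmap(run_rnn, in_axes=(None, None, 0))(params, h0, inputs)
        preds = hs @ readout  # (batch, T): linear readout of the integral
        return jnp.mean((preds - targets) ** 2)

    # jit compiles the whole loss-and-gradient computation with XLA.
    grad_loss = jax.jit(jax.grad(loss))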

This example is run through this Jupyter notebook.

Fixed point finding - train a GRU to make a binary decision and study it via fixed point finding.

The goal of this tutorial is to learn about fixed point finding by running the algorithm on a Gated Recurrent Unit (GRU), which is trained to make a binary decision, namely whether the integral of the white noise input is positive or negative in total, outputting either a +1 or a -1 to encode the decision.

Running the fixed point finder on this decision-making GRU will yield:

  1. the underlying fixed points
  2. the first-order Taylor series approximations around those fixed points (sketched below).
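
A minimal sketch of the core idea: a fixed point h* of the state-update function F satisfies F(h*) = h*, so one can minimize q(h) = ½ ||F(h) − h||² by gradient descent from many candidate states, then linearize around each solution with the Jacobian. The gru_step name below is illustrative; the tutorial's own code handles optimization, tolerances, and candidate selection much more carefully.

    import jax
    import jax.numpy as jnp

    def fixed_point_loss(rnn_fun, h):
        # q(h) = 1/2 ||F(h) - h||^2 is zero exactly at a fixed point.
        return 0.5 * jnp.sum((rnn_fun(h) - h) ** 2)

    def find_fixed_point(rnn_fun, h_init, lr=0.01, num_steps=10000):
        # Plain gradient descent on q, started from one candidate state.
        q_grad = jax.jit(jax.grad(lambda h: fixed_point_loss(rnn_fun, h)))
        h = h_init
        for _ in range(num_steps):
            h = h - lr * q_grad(h)
        return h

    # rnn_fun is the GRU update with its input clamped, e.g. for zero input:
    #   rnn_fun = lambda h: gru_step(params, h, jnp.zeros(input_dim))
    # The first-order Taylor approximation around a fixed point h_star is
    #   F(h) ≈ h_star + J @ (h - h_star),  with  J = jax.jacobian(rnn_fun)(h_star)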

Going through this tutorial will exercise the concepts defined in Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks.

This example is run through this Jupyter notebook.

FORCE learning in echo state networks

This Colab / IPython notebook trains an echo state network (ESN) to generate the chaotic output of another recurrent neural network. It implements a continuous-time ESN with FORCE learning via recursive least squares (RLS). It also lets you use a GPU and quickly get started with JAX! Two different implementations are explored: one at the JAX / Python level and another at the LAX level. After JIT compilation, the JAX implementation runs very fast.
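
At the heart of FORCE is a recursive least squares update of the readout weights; a minimal discrete-time sketch (our variable names, not the notebook's) looks like this:

    import jax.numpy as jnp

    def rls_step(P, w, r, target):
        # r: network state (N,); w: readout weights (N,);
        # P: running estimate of the inverse correlation matrix (N, N),
        #    typically initialized to the identity over a ridge constant.
        Pr = P @ r
        k = Pr / (1.0 + r @ Pr)    # gain vector
        P = P - jnp.outer(k, Pr)   # rank-one update of P
        e = w @ r - target         # readout error before the update
        w = w - e * k              # FORCE update of the readout weights
        return P, w

Running this update online while the output is fed back into the reservoir is what gradually pins the chaotic activity to the target trajectory.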
