
greydanus / Hamiltonian Nn

Code for our paper "Hamiltonian Neural Networks"

Projects that are alternatives of or similar to Hamiltonian Nn

Hyperlearn
50% faster, 50% less RAM Machine Learning. Numba rewritten Sklearn. SVD, NNMF, PCA, LinearReg, RidgeReg, Randomized, Truncated SVD/PCA, CSR Matrices all 50+% faster
Stars: ✭ 1,204 (+425.76%)
Mutual labels:  jupyter-notebook, research
Galpy
Galactic Dynamics in python
Stars: ✭ 134 (-41.48%)
Mutual labels:  dynamics, physics
Calogan
Generative Adversarial Networks for High Energy Physics extended to a multi-layer calorimeter simulation
Stars: ✭ 87 (-62.01%)
Mutual labels:  jupyter-notebook, physics
Instapy Research
📄 Research repository for InstaPy
Stars: ✭ 60 (-73.8%)
Mutual labels:  jupyter-notebook, research
Infiniteboost
InfiniteBoost: building infinite ensembles with gradient descent
Stars: ✭ 180 (-21.4%)
Mutual labels:  jupyter-notebook, research
Python nlp tutorial
This repository provides everything to get started with Python for Text Mining / Natural Language Processing (NLP)
Stars: ✭ 72 (-68.56%)
Mutual labels:  jupyter-notebook, research
Pydy Tutorial Human Standing
PyDy tutorial materials for MASB 2014, PYCON 2014, and SciPy 2014/2015.
Stars: ✭ 135 (-41.05%)
Mutual labels:  dynamics, jupyter-notebook
Research
Notebooks based on financial machine learning.
Stars: ✭ 714 (+211.79%)
Mutual labels:  jupyter-notebook, research
Learnpythonforresearch
This repository provides everything you need to get started with Python for (social science) research.
Stars: ✭ 163 (-28.82%)
Mutual labels:  jupyter-notebook, research
Physac
2D physics header-only library for videogames developed in C using raylib library.
Stars: ✭ 151 (-34.06%)
Mutual labels:  dynamics, physics
Whitehat
Information about my experiences on ethical hacking 💀
Stars: ✭ 54 (-76.42%)
Mutual labels:  jupyter-notebook, research
Atari Model Zoo
A binary release of trained deep reinforcement learning models trained in the Atari machine learning benchmark, and a software release that enables easy visualization and analysis of models, and comparison across training algorithms.
Stars: ✭ 198 (-13.54%)
Mutual labels:  jupyter-notebook, research
Openpmd Viewer
🐍 Python visualization tools for openPMD files
Stars: ✭ 41 (-82.1%)
Mutual labels:  jupyter-notebook, research
Gala
Galactic and gravitational dynamics in Python
Stars: ✭ 73 (-68.12%)
Mutual labels:  dynamics, physics
3 body problem bot
Simulations of gravitational interaction of the random n-body systems
Stars: ✭ 36 (-84.28%)
Mutual labels:  jupyter-notebook, physics
Rasalit
Visualizations and helpers to improve and debug machine learning models for Rasa Open Source
Stars: ✭ 101 (-55.9%)
Mutual labels:  jupyter-notebook, research
Picongpu
Particle-in-Cell Simulations for the Exascale Era ✨
Stars: ✭ 452 (+97.38%)
Mutual labels:  research, physics
Dnc Tensorflow
A TensorFlow implementation of DeepMind's Differential Neural Computers (DNC)
Stars: ✭ 587 (+156.33%)
Mutual labels:  jupyter-notebook, research
Datasets
🎁 3,000,000+ Unsplash images made available for research and machine learning
Stars: ✭ 1,805 (+688.21%)
Mutual labels:  jupyter-notebook, research
Pybotics
The Python Toolbox for Robotics
Stars: ✭ 192 (-16.16%)
Mutual labels:  dynamics, research

Hamiltonian Neural Networks

Sam Greydanus, Misko Dzamba, Jason Yosinski | 2019

overall-idea.png

Basic usage

To train a Hamiltonian Neural Network (HNN):

  • Task 1: Ideal mass-spring system: python3 experiment-spring/train.py --verbose
  • Task 2: Ideal pendulum: python3 experiment-pend/train.py --verbose
  • Task 3: Real pendulum (from this Science paper): python3 experiment-real/train.py --verbose
  • Task 4: Two-body problem: python3 experiment-2body/train.py --verbose
  • Task 4b: Three-body problem: python3 experiment-3body/train.py --verbose
  • Task 5: Pixel pendulum (from OpenAI Gym): python3 experiment-pixels/train.py --verbose

To analyze results

Summary

Even though neural networks enjoy widespread use, they still struggle to learn the basic laws of physics. How might we endow them with better inductive biases? In this paper, we draw inspiration from Hamiltonian mechanics to train models that learn and respect exact conservation laws in an unsupervised manner. We evaluate our models on problems where conservation of energy is important, including the two-body problem and pixel observations of a pendulum. Our model trains faster and generalizes better than a regular neural network. An interesting side effect is that our model is perfectly reversible in time.
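The core mechanism described above is to learn a single scalar function H(q, p) and obtain the dynamics from its symplectic gradient (dq/dt = ∂H/∂p, dp/dt = -∂H/∂q), so conservation of H is built in. A minimal PyTorch sketch of that idea, assuming a 2D state [q, p]; the layer sizes and names here are illustrative, not the repo's actual code:

```python
import torch

class HNN(torch.nn.Module):
    """Minimal Hamiltonian Neural Network: learns a scalar H(q, p)
    and derives the time evolution from its symplectic gradient."""
    def __init__(self, input_dim=2, hidden_dim=200):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(input_dim, hidden_dim),
            torch.nn.Tanh(),
            torch.nn.Linear(hidden_dim, 1),  # scalar Hamiltonian
        )

    def time_derivative(self, x):
        # x = [q, p]; differentiate the learned H with respect to the state
        x = x.requires_grad_(True)
        H = self.net(x).sum()
        dHdq, dHdp = torch.autograd.grad(H, x, create_graph=True)[0].chunk(2, dim=-1)
        # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq
        return torch.cat([dHdp, -dHdq], dim=-1)

model = HNN()
x = torch.randn(32, 2)          # batch of states
dx_hat = model.time_derivative(x)  # predicted dx/dt, shape (32, 2)
```

Because the predicted vector field is a rotated gradient of H, the directional derivative of H along it is identically zero, which is why the learned energy is conserved along trajectories.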

The HNN recipe

  1. Make a dataset of pixel-space observations of a physical system where energy is conserved. Here we're working with a pendulum.

pendulum-dataset.gif.png

  2. Train an autoencoder on the dataset. The autoencoder is a bit unusual: its latent representation gets fed to the HNN, which tries to model the system's dynamics in latent space.

autoencoder-hnn.png

  3. Since the HNN uses the latent representation to model dynamics, we can think of the latent factors as being analogous to canonical coordinates (e.g. position and velocity).

latents-hnn.png

  4. Phase space plots are a common way to visualize Hamiltonians. We can make a phase space plot in the autoencoder's latent space. We can also integrate along the energy contours of phase space to predict the dynamics of the system (in the figure below, we intentionally "add energy" halfway through).

integrate-latent-hnn.png

  5. After integrating in latent space, we can project back into pixel space to simulate the dynamics of the system.

pendulum-compare-labeled.gif

Here's what it looks like when we add energy halfway through the simulation:

pendulum-compare-labeled.gif

Modeling larger systems

We can also model larger systems, that is, systems with more than one pair of canonical coordinates. The two-body problem, for example, has four coordinate pairs.

orbits-compare.gif

Numbers

Train loss

  • Choose data of the form x=[x0, x1,...] and dx=[dx0, dx1,...] where dx is the time derivative of x
  • Let dx' = model.time_derivative(x)
  • Compute L2 distance between dx and dx'
                    Baseline NN                 Hamiltonian NN
Ideal mass-spring   3.7134e-02 +/- 1.9143e-03  3.6933e-02 +/- 1.9128e-03
Ideal pendulum      3.2606e-02 +/- 1.7434e-03  3.2787e-02 +/- 1.7567e-03
Real pendulum       2.7455e-03 +/- 2.0735e-04  9.2376e-03 +/- 5.0263e-04
2 body problem      3.2682e-05 +/- 1.2022e-06  2.9959e-06 +/- 6.5500e-08
3 body problem      9.5573e-02 +/- 6.5610e-02  8.0346e-02 +/- 2.1470e-02
Pixel pendulum      1.7731e-04 +/- 2.4202e-06  1.8683e-04 +/- 2.4238e-06
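The three-step loss recipe above can be sketched as follows. The `DummyModel` here is a hypothetical stand-in for either model; all that matters is that it exposes a `time_derivative(x)` method, as the models in this repo do:

```python
import torch

class DummyModel(torch.nn.Module):
    """Hypothetical stand-in: anything mapping states x to dx/dt."""
    def __init__(self, dim=2):
        super().__init__()
        self.net = torch.nn.Linear(dim, dim)

    def time_derivative(self, x):
        return self.net(x)

def l2_loss(model, x, dx):
    dx_hat = model.time_derivative(x)   # dx' in the recipe above
    return (dx - dx_hat).pow(2).mean()  # mean squared L2 distance

model = DummyModel()
x, dx = torch.randn(64, 2), torch.randn(64, 2)  # states and true derivatives
loss = l2_loss(model, x, dx)
loss.backward()  # gradients flow through time_derivative as usual
```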

Test loss

Same procedure as the train loss, computed on held-out test data.

                    Baseline NN                 Hamiltonian NN
Ideal mass-spring   3.6656e-02 +/- 1.8652e-03  3.5928e-02 +/- 1.8328e-03
Ideal pendulum      3.5273e-02 +/- 1.7970e-03  3.5586e-02 +/- 1.8178e-03
Real pendulum       2.1864e-03 +/- 3.3296e-04  5.9584e-03 +/- 6.1798e-04
2 body problem      2.9575e-05 +/- 8.8900e-07  2.8218e-06 +/- 4.2020e-08
3 body problem      3.8000e-01 +/- 4.1612e-01  4.8809e-01 +/- 4.7745e-01
Pixel pendulum      1.7306e-04 +/- 3.2413e-06  1.8451e-04 +/- 3.3422e-06

Energy MSE

  • Choose a trajectory [x0, x1,...] from test data
  • Use RK4 integration to estimate [x0', x1',...] using the model
  • Compute the L2 distance between [energy(x0), energy(x1),...] and [energy(x0'), energy(x1'),...]
                    Baseline NN               Hamiltonian NN
Ideal mass-spring   1.7077e-01 +/- 2.06e-02  3.8416e-04 +/- 6.53e-05
Ideal pendulum      4.1519e-02 +/- 9.62e-03  2.4852e-02 +/- 5.42e-03
Real pendulum       3.8564e-01 +/- 6.92e-02  1.4477e-02 +/- 4.65e-03
2 body problem      6.3276e-02 +/- 3.36e-02  3.8751e-05 +/- 5.04e-06
3 body problem      1.0906e+02 +/- 7.74e+01  4.1926e-02 +/- 3.39e-02
Pixel pendulum      9.2748e-03 +/- 1.14e-03  1.5315e-04 +/- 8.42e-06
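The RK4-based energy comparison above can be sketched as follows. For a self-contained illustration, the true vector field of an ideal mass-spring system (unit mass and spring constant, H = (q² + p²)/2) stands in for `model.time_derivative`; in practice you would pass the learned model's function instead:

```python
import numpy as np

def rk4_step(f, x, dt):
    """One classical fourth-order Runge-Kutta step for dx/dt = f(x)."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def energy(x):
    # Ideal mass-spring Hamiltonian: H = (q^2 + p^2) / 2
    q, p = x
    return 0.5 * (q ** 2 + p ** 2)

def time_derivative(x):
    # Stand-in for model.time_derivative: true spring dynamics
    q, p = x
    return np.array([p, -q])

# Roll out a trajectory from x0 and compare its energies to the true energy
x0 = np.array([1.0, 0.0])
xs = [x0]
for _ in range(200):
    xs.append(rk4_step(time_derivative, xs[-1], dt=0.1))
energies = np.array([energy(x) for x in xs])
energy_mse = np.mean((energies - energy(x0)) ** 2)
```

Since the true dynamics conserve energy exactly, `energy_mse` here only measures RK4's tiny integration error; a learned baseline model would instead drift off the energy surface, which is what the table above quantifies.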

Dependencies

  • OpenAI Gym
  • PyTorch
  • NumPy
  • ImageIO
  • SciPy

This project is written in Python 3.
