
tailintalent / AI_physicist

License: MIT
AI Physicist, a paradigm with algorithms for learning theories from data, by Wu and Tegmark (2019)

Programming Languages

Python
Jupyter Notebook

Projects that are alternatives to or similar to AI_physicist

class-norm
Class Normalization for Continual Zero-Shot Learning
Stars: ✭ 34 (+47.83%)
Mutual labels:  lifelong-learning
um-abt
An OCaml library implementing unifiable abstract binding trees (UABTs)
Stars: ✭ 25 (+8.7%)
Mutual labels:  unification
vita
Vita - Genetic Programming Framework
Stars: ✭ 24 (+4.35%)
Mutual labels:  symbolic-regression
life-disciplines-projects
Life-Disciplines-Projects (LDP) is a life-management framework built within Obsidian. Feel free to transform it for your own personal needs.
Stars: ✭ 130 (+465.22%)
Mutual labels:  lifelong-learning
Symbolic-Regression
predicting equations from raw data with deep learning
Stars: ✭ 48 (+108.7%)
Mutual labels:  symbolic-regression
Remembering-for-the-Right-Reasons
Official Implementation of Remembering for the Right Reasons (ICLR 2021)
Stars: ✭ 27 (+17.39%)
Mutual labels:  lifelong-learning
FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
Stars: ✭ 411 (+1686.96%)
Mutual labels:  lifelong-learning
CVPR21 PASS
PyTorch implementation of our CVPR2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (+139.13%)
Mutual labels:  lifelong-learning
Robust-Hexahedral-Re-Meshing
Hex-mesh simplification and optimization
Stars: ✭ 36 (+56.52%)
Mutual labels:  simplification
AirLoop
AirLoop: Lifelong Loop Closure Detection
Stars: ✭ 49 (+113.04%)
Mutual labels:  lifelong-learning
Generative Continual Learning
No description or website provided.
Stars: ✭ 51 (+121.74%)
Mutual labels:  lifelong-learning
Adam-NSCL
PyTorch implementation of our Adam-NSCL algorithm from our CVPR2021 (oral) paper "Training Networks in Null Space for Continual Learning"
Stars: ✭ 34 (+47.83%)
Mutual labels:  lifelong-learning
DeepSymReg
Official repository for the paper "Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery"
Stars: ✭ 35 (+52.17%)
Mutual labels:  symbolic-regression
MathExpressions.NET
➗ Library for parsing math expressions with rational numbers, finding their derivatives and compiling an optimal IL code
Stars: ✭ 63 (+173.91%)
Mutual labels:  simplification
HebbianMetaLearning
Meta-Learning through Hebbian Plasticity in Random Networks: https://arxiv.org/abs/2007.02686
Stars: ✭ 77 (+234.78%)
Mutual labels:  lifelong-learning
WarpPI
WarpPI Calculator, Step-by-step algebra calculator for Raspberry Pi. (abandoned project)
Stars: ✭ 93 (+304.35%)
Mutual labels:  simplification
reproducible-continual-learning
Continual learning baselines and strategies from popular papers, using Avalanche. We include EWC, SI, GEM, AGEM, LwF, iCarl, GDumb, and other strategies.
Stars: ✭ 118 (+413.04%)
Mutual labels:  lifelong-learning
cvpr clvision challenge
CVPR 2020 Continual Learning Challenge - Submit your CL algorithm today!
Stars: ✭ 57 (+147.83%)
Mutual labels:  lifelong-learning
lifelong-learning
lifelong learning: record and analysis of my knowledge structure
Stars: ✭ 18 (-21.74%)
Mutual labels:  lifelong-learning
ftor
ftor enables ML-like type-directed, functional programming with Javascript including reasonable debugging.
Stars: ✭ 44 (+91.3%)
Mutual labels:  unification

AI_physicist

AI_physicist provides a paradigm and algorithms for learning and manipulating theories, i.e., small specialized models together with the domains in which they are accurate. It contains algorithms for

  • Differentiable divide-and-conquer (DDAC) for simultaneously learning theories and their domains (a sketch of its generalized-mean loss follows this list)
  • Simplification of theories via Occam's razor, using the minimum-description-length (MDL) principle
  • Unification of theories into master theories that can generate a continuum of theories
  • Lifelong learning by storing theories in a theory hub and proposing them for novel environments
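
For intuition, here is a minimal sketch of the generalized-mean loss behind DDAC, following the form described in the paper: the per-sample losses of all theories are combined with a generalized mean of order gamma < 0, so each sample's gradient is dominated by its best-fitting theory and the theories specialize into domains. The function below is illustrative, not the repository's implementation.

    import torch

    def generalized_mean_loss(per_theory_losses, gamma=-1.0, eps=1e-10):
        # per_theory_losses: tensor of shape (num_theories, num_samples),
        # e.g. squared prediction errors of each theory on each sample.
        # gamma < 0 focuses each sample's gradient on its best-fitting
        # theory (the more negative gamma, the sharper the focus).
        per_sample = (per_theory_losses.clamp(min=eps) ** gamma).mean(dim=0) ** (1.0 / gamma)
        return per_sample.mean()  # average over samples

    # Toy usage: 3 random "theories" predicting a 1-D target.
    preds = torch.randn(3, 100, requires_grad=True)  # (num_theories, num_samples)
    target = torch.randn(100)
    generalized_mean_loss((preds - target) ** 2).backward()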

More details are provided in the paper "Toward an artificial intelligence physicist for unsupervised learning", Tailin Wu and Max Tegmark (2019) [Physical Review E] [arXiv].

Installation

First clone the repository. Then run

bash setup.sh

to initialize the submodule and download the datasets.

The PyTorch requirement is >=0.4.1. Other requirements are listed in requirements.txt.

Datasets

The datasets used in the paper are provided here. Put the unzipped "MYSTERIES" and "GROUND_TRUTH" folders directly under datasets/. To use your own dataset, put your CSV file under datasets/MYSTERIES/. For each dataset, the first num_output_dims * num_input_dims columns are used as the input X, the next num_output_dims columns are used as the target Y, and if is_classified = True, the last column should provide the true domain ID for evaluation. Taking num_output_dims = 2, num_input_dims = 2, is_classified = True as an example, the dataset should look like:

x1, y1, x2, y2, x3, y3, domain_id1

x2, y2, x3, y3, x4, y4, domain_id2

x3, y3, x4, y4, x5, y5, domain_id3

...

where the first 2 * 2 = 4 columns will be used as X, the next 2 columns will be used as Y, and the last column provides the true domain ID for evaluating domain prediction (not used in training).
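
As a concrete illustration, the following sketch writes a toy CSV in this layout for num_input_dims = 2, num_output_dims = 2 (the trajectory and the file name are made up for the example; it assumes the datasets/MYSTERIES/ directory already exists):

    import csv

    num_input_dims, num_output_dims = 2, 2
    trajectory = [(float(t), float(t) ** 2) for t in range(10)]  # toy 2-D states over time
    domain_id = 0  # single ground-truth domain in this toy example

    with open("datasets/MYSTERIES/toy_env.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for t in range(len(trajectory) - num_input_dims):
            # num_input_dims past states flattened as X, the next state as Y
            x_cols = [c for step in trajectory[t:t + num_input_dims] for c in step]
            y_cols = list(trajectory[t + num_input_dims])
            writer.writerow(x_cols + y_cols + [domain_id])  # drop domain_id if is_classified = False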

If using files in "GROUND_TRUTH" (where true domain IDs are provided for evaluation), set csv_dirname = "../datasets/GROUND_TRUTH/" and is_classified = True. If using files in "MYSTERIES", set csv_dirname = "../datasets/MYSTERIES/" and is_classified = False.

Usage

The main experiment file is theory_learning/theory_exp.ipynb (or its theory_exp.py counterpart for the terminal; both can be run directly), which contains the DDAC, simplification, and lifelong learning algorithms for the AI Physicist.

Before running the experiment, first set up the path and the correct settings for the datasets in lines 61-83 of theory_exp.py (or the corresponding cell of theory_exp.ipynb). In particular, the important settings, sketched after this list, are:

  • csv_dirname: path to the dataset directory
  • csv_filename_list: list of dataset files to run; each dataset's path is csv_dirname + env_name + ".csv", where env_name is an element of csv_filename_list
  • is_classified: whether the CSV files provide the true domain ID for evaluation
  • num_output_dims: dimension of the state at each time step
  • num_input_dims: number of past time steps used as X
  • Other important settings, e.g. num_theories_init (the number of random theories to start with) and add_theory_limit (the maximum allowed number of theories)
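
A minimal sketch of what these settings might look like for a toy dataset (the values and file name are illustrative; the variable names are the ones listed above):

    # Illustrative values only; the real settings live in lines 61-83 of theory_exp.py.
    csv_dirname = "../datasets/GROUND_TRUTH/"  # directory containing the CSV files
    csv_filename_list = ["mystery1"]           # loads "../datasets/GROUND_TRUTH/mystery1.csv"
    is_classified = True                       # last CSV column holds true domain IDs
    num_output_dims = 2                        # state dimension at each time step
    num_input_dims = 2                         # past time steps used as X
    num_theories_init = 4                      # random theories to start with
    add_theory_limit = 8                       # maximum allowed number of theories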

To run batch experiments on a cluster, set up the hyperparameters in run_exp/run_theory.py, and run

python run_exp/run_theory.py JOB_ID

where JOB_ID is a number (between 0 and the total number of hyperparameter combinations minus 1) that maps to a specific hyperparameter combination.
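
Such an ID-to-combination mapping is typically an index into the Cartesian product of the hyperparameter lists. The sketch below illustrates the idea with a made-up grid; run_exp/run_theory.py defines the actual one.

    import itertools
    import sys

    # Hypothetical hyperparameter grid, for illustration only.
    grid = {
        "gamma": [-1.0, -2.0],
        "num_theories_init": [2, 4],
        "lr": [1e-3, 1e-4],
    }

    combos = list(itertools.product(*grid.values()))  # 2 * 2 * 2 = 8 combinations
    job_id = int(sys.argv[1])                         # valid range: 0 .. len(combos) - 1
    setting = dict(zip(grid.keys(), combos[job_id]))
    print(f"Job {job_id} runs with {setting}")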

theory_learning/theory_unification.ipynb additionally contains the unification algorithm.

The AI Physicist uses pytorch_net for flexible construction of PyTorch neural networks with different types of layers, including Simple_Layer (a dense layer) and Symbolic_Layer (a symbolic layer using sympy with learnable parameters), as well as methods for training, simplification, and conversion between layer types. See its tutorial for how to use it.
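
As a conceptual illustration of a symbolic layer with learnable parameters (this is not pytorch_net's actual API; see its tutorial for that), one can pair a torch module with a sympy export of its fitted expression:

    import sympy
    import torch

    class ToySymbolicLayer(torch.nn.Module):
        # Conceptual sketch of the Symbolic_Layer idea: y = a*x + b, where the
        # functional form is symbolic and the constants a, b are learnable.
        def __init__(self):
            super().__init__()
            self.a = torch.nn.Parameter(torch.randn(1))
            self.b = torch.nn.Parameter(torch.randn(1))

        def forward(self, x):
            return self.a * x + self.b

        def symbolic_expression(self):
            # Export the layer as a sympy expression with the fitted constants.
            x = sympy.Symbol("x")
            return float(self.a) * x + float(self.b)

    layer = ToySymbolicLayer()  # train with any torch optimizer, then export
    print(layer.symbolic_expression())  # e.g. 0.31*x - 1.2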

New features added to the AI Physicist:

Features are continuously added to the AI Physicist. These features may or may not work, and can be turned on and off in the hyperparameter settings in theory_learning/theory_exp.ipynb or in run_exp/run_theory.py. Some features added since the original paper (Wu and Tegmark, 2019) are:

  • Autoencoder
  • Learning a Lagrangian instead of the equation of motion
  • Annealing of the order of the generalized-mean loss
  • Unification of theories with a neural network (instead of a symbolic one)

Citation

If you compare with, build on, or use aspects of the AI Physicist work, please cite the following:

@article{wu2019toward,
    title={Toward an artificial intelligence physicist for unsupervised learning},
    author={Wu, Tailin and Tegmark, Max},
    journal={Physical Review E},
    volume={100},
    number={3},
    pages={033311},
    year={2019},
    publisher={APS}
}