openai / Mlsh

Code for the paper "Meta-Learning Shared Hierarchies"

Programming Languages

python
139335 projects - #7 most used programming language

Labels

paper

Projects that are alternatives to or similar to Mlsh

Knowledge Distillation Papers
knowledge distillation papers
Stars: ✭ 422 (-22.99%)
Mutual labels:  paper
Rgan
Recurrent (conditional) generative adversarial networks for generating real-valued time series data.
Stars: ✭ 480 (-12.41%)
Mutual labels:  paper
Qlib
Qlib is an AI-oriented quantitative investment platform, which aims to realize the potential, empower the research, and create the value of AI technologies in quantitative investment. With Qlib, you can easily try your ideas to create better Quant investment strategies. An increasing number of SOTA Quant research works/papers are released in Qlib.
Stars: ✭ 7,582 (+1283.58%)
Mutual labels:  paper
Research Method
Paper writing and resource sharing
Stars: ✭ 436 (-20.44%)
Mutual labels:  paper
Iaf
Code for reproducing key results in the paper "Improving Variational Inference with Inverse Autoregressive Flow"
Stars: ✭ 468 (-14.6%)
Mutual labels:  paper
Mohist
Minecraft Forge Hybrid server implementing the Paper/Spigot/Bukkit API, formerly known as Thermos/Cauldron/MCPC+
Stars: ✭ 489 (-10.77%)
Mutual labels:  paper
Learning Deep Learning
Paper reading notes on Deep Learning and Machine Learning
Stars: ✭ 388 (-29.2%)
Mutual labels:  paper
Srflow
Official SRFlow training code: Super-Resolution using Normalizing Flow in PyTorch
Stars: ✭ 537 (-2.01%)
Mutual labels:  paper
Conditional Pixelcnn Decoder
Tensorflow implementation of Gated Conditional Pixel Convolutional Neural Network
Stars: ✭ 479 (-12.59%)
Mutual labels:  paper
Pycnn
Image Processing with Cellular Neural Networks in Python
Stars: ✭ 509 (-7.12%)
Mutual labels:  paper
Cvpr2021 Papers With Code
A collection of CVPR 2021 papers and open-source projects
Stars: ✭ 7,138 (+1202.55%)
Mutual labels:  paper
Awsome Deep Learning For Video Analysis
Papers, code and datasets about deep learning and multi-modal learning for video analysis
Stars: ✭ 452 (-17.52%)
Mutual labels:  paper
Daily Paper Computer Vision
Daily curated papers in computer vision, deep learning, and machine learning
Stars: ✭ 4,977 (+808.21%)
Mutual labels:  paper
Paper For Mac
🖥 Unofficial Dropbox Paper client for macOS
Stars: ✭ 427 (-22.08%)
Mutual labels:  paper
Cvpr 2019 Paper Statistics
Statistics and visualization of the acceptance rate and main keywords of papers accepted to CVPR 2019, the main computer vision conference
Stars: ✭ 527 (-3.83%)
Mutual labels:  paper
Yatopia
The Most Powerful and Feature Rich Minecraft Server Software!
Stars: ✭ 408 (-25.55%)
Mutual labels:  paper
Nlp Paper
NLP Paper
Stars: ✭ 484 (-11.68%)
Mutual labels:  paper
Hugo Paper
🥛 A simple, clean, flexible Hugo theme
Stars: ✭ 538 (-1.82%)
Mutual labels:  paper
Paper
On self sovereign human identity.
Stars: ✭ 537 (-2.01%)
Mutual labels:  paper
Arxiv Style
A Latex style and template for paper preprints (based on NIPS style)
Stars: ✭ 497 (-9.31%)
Mutual labels:  paper

Status: Archive (code is provided as-is, no updates expected)

Meta-Learning Shared Hierarchies

Code for Meta-Learning Shared Hierarchies.

Installation
Add the following to your .bash_profile (replace ... with the path to the directory):
export PYTHONPATH=$PYTHONPATH:/.../mlsh/gym;
export PYTHONPATH=$PYTHONPATH:/.../mlsh/rl-algs;

Install MovementBandits environments:
cd test_envs
pip install -e .
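
As a quick sanity check (this snippet is not from the repo), you can try building one of the environments from Python. The import name test_envs and the id MovementBandits-v0 below are assumptions; use whatever names the register() calls in test_envs actually define:

import gym
import test_envs  # assumed import name; importing it is expected to register the environments
env = gym.make('MovementBandits-v0')  # assumed id; check the register() calls in test_envs
obs = env.reset()
print(env.observation_space, env.action_space)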

Running Experiments
python main.py --task AntBandits-v1 --num_subs 2 --macro_duration 1000 --num_rollouts 2000 --warmup_time 20 --train_time 30 --replay False AntAgent

Once you've trained your agent, view it by running:

python main.py [...] --replay True --continue_iter [your iteration] AntAgent
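
For example, assuming [...] stands for the same flags used during training and that iteration 20 is only a placeholder for whichever checkpoint you want to load, the replay call might look like:

python main.py --task AntBandits-v1 --num_subs 2 --macro_duration 1000 --num_rollouts 2000 --warmup_time 20 --train_time 30 --replay True --continue_iter 20 AntAgent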

The MLSH script works on any Gym environment that implements the randomizeCorrect() function. See the envs/ folder for examples of such environments.
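
For illustration only (this is not code from the repo), a toy environment exposing randomizeCorrect() might look like the sketch below; the observation/action spaces and reward are made up, and the method's job is assumed to be resampling the hidden task so the training loop can change it between lifetimes:

import numpy as np
import gym
from gym import spaces

class ToyBanditEnv(gym.Env):
    """Hypothetical two-armed task: reward 1 for picking the currently correct arm."""

    def __init__(self):
        self.action_space = spaces.Discrete(2)
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(2,), dtype=np.float32)
        self.correct = 0

    def randomizeCorrect(self):
        # Resample the hidden task; the MLSH loop is expected to call this between lifetimes.
        self.correct = np.random.randint(2)

    def reset(self):
        return np.zeros(2, dtype=np.float32)

    def step(self, action):
        reward = 1.0 if action == self.correct else 0.0
        obs = np.zeros(2, dtype=np.float32)
        obs[int(action)] = 1.0
        return obs, reward, False, {}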

To run on multiple cores:

mpirun -np 12 python main.py ...