
Marathon Environments

A set of high-dimensional continuous control environments for use with Unity ML-Agents Toolkit.

MarathonEnvs

Preview MarathonEnvs using the Web Demo

MarathonEnvs is a set of high-dimensional continuous control benchmarks built on Unity’s native physics simulator, PhysX. MarathonEnvs can be trained using Unity ML-Agents or any OpenAI Gym-compatible algorithm (a minimal usage sketch appears below). MarathonEnvs may be useful for:

  • Video game researchers interested in applying bleeding-edge robotics research to the domain of locomotion and AI for video games.
  • Academic researchers looking to leverage the strengths of Unity and ML-Agents along with the body of existing research and benchmarks provided by projects such as the DeepMind Control Suite or the OpenAI MuJoCo environments.

Note: This project is the result of contributions from members of the Unity community (see below) who actively maintain the repository. As such, the contents of this repository are not officially supported by Unity Technologies.
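
As a quick illustration of the Gym-compatible route mentioned above, the snippet below is a minimal sketch (not taken from the project’s documentation) of driving a built MarathonEnvs binary through ml-agents’ Gym wrapper. It assumes ml-agents 0.14.1 with the matching gym_unity package installed; the binary path is a placeholder for your own build, and scenes containing several agents may additionally need the wrapper’s multiagent option.

```python
# Illustrative sketch only -- not MarathonEnvs source or official docs.
# Assumes: pip3 install mlagents==0.14.1 gym_unity==0.14.1, and a built
# environment binary at envs/MarathonEnvs (placeholder path).
from gym_unity.envs import UnityEnv

env = UnityEnv("envs/MarathonEnvs", worker_id=0, use_visual=False)

obs = env.reset()
for _ in range(1000):
    action = env.action_space.sample()          # random policy, API demo only
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()
```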

Need Help?


Environments

Controller (DReCon) - Preview

A controller-based agent, inspired by the DReCon paper (link below). The agent learns to follow a simple, traditional controller and exhibits emergent behavior. Currently in preview.

  • ControllerMarathonMan-v0
Style Transfer (DeepMimic)

Learning from motion-capture examples, inspired by the DeepMimic paper (link below). The agent learns the motion-capture sequence using a phase value (see the sketch after the environment list).

  • MarathonManWalking-v0
  • MarathonManRunning-v0
  • MarathonManJazzDancing-v0
  • MarathonManMMAKick-v0
  • MarathonManPunchingBag-v0
  • MarathonManBackflip-v0
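
To make the phase value concrete, here is a small illustrative helper (not MarathonEnvs source code): in the DeepMimic setup, the phase is the agent’s normalized progress through the reference motion-capture clip and is fed to the policy as an observation. The function and parameter names below are assumptions for the sketch.

```python
# Illustrative only -- not MarathonEnvs source.
def motion_phase(step: int, clip_length_steps: int) -> float:
    """Normalized progress through the reference clip, in [0, 1), wrapping each loop."""
    return (step % clip_length_steps) / clip_length_steps

# Example: for a 120-step clip, step 30 -> 0.25 and step 150 -> 0.25 again.
```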
Procedural Environments
Terrain

Procedurally generated terrains aimed at reducing overfitting in reinforcement learning and encouraging generalizable skills.

  • TerrainHopper-v0
  • TerrainWalker2d-v0
  • TerrainAnt-v0
  • TerrainMarathonMan-v0
Classical Environments

Classical implementations of Ant, Hopper, Walker2d, and Humanoid.

  • Hopper-v0
  • Walker2d-v0
  • Ant-v0
  • MarathonMan-v0
Sparse - Experimental

Sparse-reward version of a humanoid learning to walk. The agent receives a single reward at the end of the episode (a small sketch of the idea follows the list below).

  • MarathonManSparse-v0
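
As a sketch of what “sparse” means here (illustrative only, not the environment’s actual implementation): per-step rewards are withheld and a single summary reward is returned on the terminal step.

```python
# Illustrative wrapper, not MarathonEnvs source: converts a dense-reward,
# Gym-style environment into a sparse one that pays out only at episode end.
class SparseRewardWrapper:
    def __init__(self, env):
        self.env = env
        self._accumulated = 0.0

    def reset(self):
        self._accumulated = 0.0
        return self.env.reset()

    def step(self, action):
        obs, reward, done, info = self.env.step(action)
        self._accumulated += reward
        # Intermediate steps return 0; the terminal step returns the episode total.
        return obs, (self._accumulated if done else 0.0), done, info
```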

Releases

The latest version is v3.0.0

The following table lists releases, the required Unity version, and links to release notes, source code, and binaries:

| Version | Unity | Updated Environments | Source | MacOS | Windows | Linux | Web | Paper |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| master (unstable) | 2020.1 | ControllerMarathonMan-v0 | -- | -- | -- | -- | -- | -- |
| v3.0.0 | 2020.1 beta.12 | ControllerMarathonMan-v0 | Source | MacOS | -- | Linux | Web | arXiv |
| v2.0.0 | 2018.4 LTS | MarathonManWalking-v0, MarathonManRunning-v0, MarathonManJazzDancing-v0, MarathonManMMAKick-v0, MarathonManPunchingBag-v0 | Source | MacOS | -- | Linux | -- | -- |
| v2.0.0-alpha.2 | 2018.4 LTS | -- | Source | MacOS | Windows | Linux | -- | AAAI 2019 |
| v2.0.0-alpha.1 | 2018.4 LTS | MarathonManBackflip-v0, MarathonMan-v0, MarathonManSparse-v0, TerrainHopperEnv-v0, TerrainWalker2dEnv-v0, TerrainAntEnv-v0, TerrainMarathonManEnv-v0 | Source | -- | -- | -- | -- | -- |
| v0.5.0a | 2018.2 | Hopper-v0, Walker2d-v0, Ant-v0, Humanoid-v0 | Source | -- | -- | -- | -- | Blog |

Getting Started

Requirements

  • Unity 2018.4 (Download here).
  • Clone / download this repo
  • Install ml-agents version 0.14.1 via:
pip3 install mlagents==0.14.1
  • Build or install the correct runtime for your version into the envs\ folder

Training
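
A sketch of how training is typically launched (assumptions, not official documentation): with the requirements above in place and a build in the envs\ folder, ML-Agents training starts with the mlagents-learn command, pointing it at a trainer-configuration YAML file and the environment binary. The config and environment paths below are placeholders; substitute the ones from your checkout and build.

mlagents-learn config/marathon_envs_config.yaml --env=envs/MarathonEnvs --run-id=MarathonMan --train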

Guides

  • Video walkthrough

  • Getting started with Marathon Environments v0.5.0a BLOG


Publications & Usage

Publications

Research using ML-Agents + MarathonEnvs


References

Citing MarathonEnvs

If you use MarathonEnvs in your research, we ask that you please cite our paper.
