
HoME-Platform / Home Platform

License: BSD-3-Clause
HoME: a Household Multimodal Environment is a platform for artificial agents to learn from vision, audio, semantics, physics, and interaction with objects and other agents, all within a realistic context.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Home Platform

Ravens
Train robotic agents to learn pick and place with deep learning for vision-based manipulation in PyBullet. Transporter Nets, CoRL 2020.
Stars: ✭ 133 (-64.05%)
Mutual labels:  artificial-intelligence, reinforcement-learning, vision, openai-gym
Tsdf Fusion
Fuse multiple depth frames into a TSDF voxel volume.
Stars: ✭ 426 (+15.14%)
Mutual labels:  artificial-intelligence, 3d, vision
Arc Robot Vision
MIT-Princeton Vision Toolbox for Robotic Pick-and-Place at the Amazon Robotics Challenge 2017 - Robotic Grasping and One-shot Recognition of Novel Objects with Deep Learning.
Stars: ✭ 224 (-39.46%)
Mutual labels:  artificial-intelligence, 3d, vision
Tsdf Fusion Python
Python code to fuse multiple RGB-D images into a TSDF voxel volume.
Stars: ✭ 464 (+25.41%)
Mutual labels:  artificial-intelligence, 3d, vision
Apc Vision Toolbox
MIT-Princeton Vision Toolbox for the Amazon Picking Challenge 2016 - RGB-D ConvNet-based object segmentation and 6D object pose estimation.
Stars: ✭ 277 (-25.14%)
Mutual labels:  artificial-intelligence, 3d, vision
Rex Gym
OpenAI Gym environments for an open-source quadruped robot (SpotMicro)
Stars: ✭ 684 (+84.86%)
Mutual labels:  artificial-intelligence, reinforcement-learning, openai-gym
Pytorch Dense Correspondence
Code for "Dense Object Nets: Learning Dense Visual Object Descriptors By and For Robotic Manipulation"
Stars: ✭ 445 (+20.27%)
Mutual labels:  artificial-intelligence, 3d, vision
Gym Starcraft
StarCraft environment for OpenAI Gym, based on Facebook's TorchCraft. (In progress)
Stars: ✭ 514 (+38.92%)
Mutual labels:  artificial-intelligence, reinforcement-learning, openai-gym
3dmatch Toolbox
3DMatch - a 3D ConvNet-based local geometric descriptor for aligning 3D meshes and point clouds.
Stars: ✭ 571 (+54.32%)
Mutual labels:  artificial-intelligence, 3d, vision
Visual Pushing Grasping
Train robotic agents to learn to plan pushing and grasping actions for manipulation with deep reinforcement learning.
Stars: ✭ 516 (+39.46%)
Mutual labels:  artificial-intelligence, 3d, vision
Basic reinforcement learning
An introductory series to Reinforcement Learning (RL) with comprehensive step-by-step tutorials.
Stars: ✭ 826 (+123.24%)
Mutual labels:  artificial-intelligence, reinforcement-learning, openai-gym
Simulator
A ROS/ROS2 Multi-robot Simulator for Autonomous Vehicles
Stars: ✭ 1,260 (+240.54%)
Mutual labels:  artificial-intelligence, 3d, reinforcement-learning
Applied Reinforcement Learning
Reinforcement Learning and Decision Making tutorials explained at an intuitive level and with Jupyter Notebooks
Stars: ✭ 229 (-38.11%)
Mutual labels:  artificial-intelligence, reinforcement-learning
Evostra
A fast Evolution Strategy implementation in Python
Stars: ✭ 227 (-38.65%)
Mutual labels:  artificial-intelligence, reinforcement-learning
Polyaxon
Machine Learning Platform for Kubernetes (MLOps tools for experimentation and automation)
Stars: ✭ 2,966 (+701.62%)
Mutual labels:  artificial-intelligence, reinforcement-learning
Dreamer
Dream to Control: Learning Behaviors by Latent Imagination
Stars: ✭ 269 (-27.3%)
Mutual labels:  artificial-intelligence, reinforcement-learning
Pytorch Blender
Seamless, distributed, real-time integration of Blender into PyTorch data pipelines
Stars: ✭ 272 (-26.49%)
Mutual labels:  reinforcement-learning, openai-gym
He4o
和 (he, for Objective-C): an "information entropy reduction machine" system
Stars: ✭ 284 (-23.24%)
Mutual labels:  artificial-intelligence, reinforcement-learning
Dreamerv2
Mastering Atari with Discrete World Models
Stars: ✭ 287 (-22.43%)
Mutual labels:  artificial-intelligence, reinforcement-learning
Amazing Machine Learning Opensource 2019
Amazing Machine Learning Open Source Tools and Projects for the Past Year (v.2019)
Stars: ✭ 198 (-46.49%)
Mutual labels:  artificial-intelligence, reinforcement-learning

HoME Platform

HoME is a platform for artificial agents to learn from vision, audio, semantics, physics, and interaction with objects and other agents, all within a realistic context.

Check out the paper on arXiv for more details: HoME: a Household Multimodal Environment

Dependencies

Main requirements:

  • Python 2.7+ with NumPy, SciPy, and Matplotlib
  • (or Python 3.5+, but Python 3 support is currently a work in progress and may have bugs)
  • Panda3D game engine for 3D rendering
  • EVERT engine for 3D acoustic ray-tracing
  • PySoundFile for Ogg Vorbis decoding

To install dependencies on Ubuntu operating systems:

sudo apt-get install python-pip python-tk python-dev build-essential libsndfile1 portaudio19-dev
sudo pip2 install --upgrade pip numpy scipy matplotlib gym panda3d pysoundfile pyaudio resampy nose coverage Pillow imageio

or, for Python 3:

sudo apt-get install python3-pip python3-tk python3-dev build-essential libsndfile1 portaudio19-dev
sudo pip3 install --upgrade pip numpy scipy matplotlib gym panda3d pysoundfile pyaudio resampy nose coverage Pillow imageio

(Packages nose and coverage are for tests only and can be omitted)
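
To quickly verify that the Python-side dependencies are importable (use python3 for the Python 3 setup):

python -c "import numpy, scipy, matplotlib, gym, panda3d, soundfile; print('dependencies OK')"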

Finally, you have to install EVERT. To do so, please follow the instructions at https://github.com/sbrodeur/evert
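
For convenience, you can clone it next to this project; a minimal sketch (the actual build and installation steps are described in that repository's README):

cd $HOME/work
git clone https://github.com/sbrodeur/evert.git
# then follow the build and installation instructions in evert's README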

SUNCG Dataset

The HoME environment is based on the SUNCG dataset.

Important! Before you can use this library, you need to obtain the SUNCG dataset. In order to do so, please follow the instructions on their website.

For the test suite, a single small sample house is included in this repository.

Installing the library

Download the source code from the git repository:

mkdir -p $HOME/work
cd $HOME/work
git clone https://github.com/HoME-Platform/home-platform.git

Note that the library must be in the PYTHONPATH environment variable for Python to be able to find it:

export PYTHONPATH=$HOME/work/home-platform:$PYTHONPATH 

This line can also be added at the end of the configuration file $HOME/.bashrc so that it persists across shell sessions.
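
For example, to append it permanently:

echo 'export PYTHONPATH=$HOME/work/home-platform:$PYTHONPATH' >> $HOME/.bashrc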

Running unit tests

To ensure all libraries were correctly installed, it is advised to run the test suite:

cd $HOME/work/home-platform/tests
./run_tests.sh

Note that this can take some time.
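
If you prefer invoking the test runner directly, a rough equivalent is the following (a sketch assuming run_tests.sh wraps nose; check the script itself for the exact options):

cd $HOME/work/home-platform
nosetests --with-coverage tests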

Usage

Before you start, please read steps 0 and 1 so that you know what to expect. Installing and preparing the data can take some time (it is a big dataset).

0 Obtain and install the SUNCG dataset (only once)

The dataset can be acquired by contacting the authors at their website: http://suncg.cs.princeton.edu/

Currently, the downloaded dataset consists of a single zip file, which in turn contains multiple zip files (one for each of the house, object, object_vox, room, and texture directories).

Please unzip all files so that you end up with the following directory structure somewhere on your PC:

/some/path/doesnt/matter/SUNCG/
/some/path/doesnt/matter/SUNCG/house/
/some/path/doesnt/matter/SUNCG/house/0004d52d1aeeb8ae6de39d6bd993e992/
/some/path/doesnt/matter/SUNCG/house/0004dd3cb11e50530676f77b55262d38/
...
/some/path/doesnt/matter/SUNCG/object/100/
/some/path/doesnt/matter/SUNCG/object/101/
...

The unzipped files take up approximately 28.1 GB of disk space on an NTFS-formatted drive.

Warning: Unzipping may take considerable time and should not be done on a network drive due to the network-communication overhead. We advise extracting everything on a local machine and, if necessary, copying it to networked machines via rsync like so:

rsync -avh --info=progress2 --remove-source-files /some/path/doesnt/matter/SUNCG /cluster/datasets/

Once everything is unzipped, you can remove the original zip files.

As a final step, please "install" the dataset either by (a) symlinking it into your home directory

ln -s /some/path/doesnt/matter/SUNCG/ ~/.suncg 

or by (b) setting the SUNCG_DATA_DIR environment variable to the right path:

export SUNCG_DATA_DIR="/cluster/datasets/SUNCG"

If you do both, the environment variable takes precedence.
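
As a quick sanity check that the dataset is reachable (assuming the directory layout shown above):

ls ~/.suncg/house | head            # via the symlink
ls "$SUNCG_DATA_DIR"/house | head   # via the environment variable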

1 Convert the houses into a usable format (only once)

For Panda3D to be able to read the 3D house files, you first need to convert them from the OBJ/Wavefront format to Panda3D's EGG and BAM formats.

To do so, just cd into the scripts folder in this repository and run the conversion script:

# assuming you are currently in the directory of this readme file:
cd scripts

# if you installed the dataset via symlink:
./convert_suncg.sh ~/.suncg/ # the trailing slash is important

# if you installed the dataset via the environment variable:
./convert_suncg.sh $SUNCG_DATA_DIR/ # the trailing slash might be important
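
Once the script completes, a minimal sanity check (assuming the converted models are written back into the dataset tree) is to look for the newly created files:

find ~/.suncg/ -name '*.bam' | head   # should list converted Panda3D models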