
dougsm / mvp_grasp

Licence: BSD-3-Clause
Multi-Viewpoint Picking (ICRA 2019)

Programming Languages

  • Python
  • C++
  • CMake
  • Shell

Projects that are alternatives of or similar to mvp_grasp

All of the projects below share the label "grasping"; star counts are as recorded at the time of listing.

  • handle_detector - ROS package to localize handles in 3D point clouds (Stars: 24)
  • good_robot - "Good Robot! Now Watch This!": Repurposing Reinforcement Learning for Task-to-Task Transfer; and "Good Robot!": Efficient Reinforcement Learning for Multi-Step Visual Tasks with Sim to Real Transfer (Stars: 84)
  • icra20-hand-object-pose - [ICRA 2020] Robust, Occlusion-aware Pose Estimation for Objects Grasped by Adaptive Hands (Stars: 42)
  • graspnetAPI - Toolbox for our GraspNet-1Billion dataset (Stars: 105)
  • robotic-grasping - Antipodal Robotic Grasping using GR-ConvNet (IROS 2020) (Stars: 131)
  • grasp_multiObject - Robotic grasp dataset for multi-object, multi-grasp evaluation with RGB-D data; annotated using the same protocol as the Cornell Dataset and usable as a multi-object extension of it (Stars: 59)
  • graspnet-baseline - Baseline model for "GraspNet-1Billion: A Large-Scale Benchmark for General Object Grasping" (CVPR 2020) (Stars: 146)
  • obman_render - [CVPR 2019] Code to generate images from the ObMan dataset: synthetic renderings of hands holding objects (or hands in isolation) (Stars: 61)
  • multi-contact-grasping - A simulated grasp-and-lift process in V-REP using the Barrett Hand, with an interface through a Python remote API (Stars: 52)
  • PROBOT_Anno - ROS packages for PROBOT Anno (Stars: 75)
  • GrabNet - A generative model to generate realistic 3D hands grasping unseen objects (ECCV 2020) (Stars: 146)
  • drl_grasping - Deep Reinforcement Learning for Robotic Grasping from Octrees (Stars: 160)
  • kuka_rl - Reinforcement Learning experiments using PyBullet (Stars: 65)
  • graspit - The GraspIt! simulator (Stars: 142)
  • graspnetAPI - API for the large-scale robotic grasping benchmark GraspNet-1Billion: https://graspnet.net (Stars: 33)
  • graspit_interface - A GraspIt! plugin exposing a ROS interface via graspit-ros (Stars: 29)
  • gpg - Generate grasp pose candidates in point clouds (Stars: 81)

GG-CNN + Multi-View Picking

This repository contains the implementation of the Multi-View Picking system and experimental code for running on a Franka Emika Panda Robot from the paper:

Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter

Douglas Morrison, Peter Corke, Jürgen Leitner

International Conference on Robotics and Automation (ICRA), 2019

arXiv | Video

For more information about GG-CNN, see this repository or this arXiv paper.
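At its core, Multi-View Picking treats the grasp-quality predictions accumulated from successive viewpoints as a distribution and moves the camera towards the view that is expected to reduce the entropy of that distribution the most. The snippet below is only a toy sketch of that general idea (a random quality grid and made-up per-view "sharpening" factors), not the implementation in this repository; see the paper and the mvp_grasping package for the real approach.

import numpy as np

def entropy(p, eps=1e-12):
    # Shannon entropy of a normalised grasp-quality distribution.
    p = p / (p.sum() + eps)
    return float(-(p * np.log(p + eps)).sum())

def expected_entropy_after_view(quality_map, sharpen):
    # Toy model: a candidate view "sharpens" the quality map by an assumed factor.
    return entropy(quality_map ** sharpen)

# Toy accumulated grasp-quality grid (e.g. fused GG-CNN outputs).
rng = np.random.default_rng(0)
quality_map = rng.random((20, 20))

# Hypothetical candidate views with assumed sharpening factors.
candidate_views = {'left': 1.5, 'top': 3.0, 'right': 2.0}

current_h = entropy(quality_map)
information_gain = {name: current_h - expected_entropy_after_view(quality_map, s)
                    for name, s in candidate_views.items()}
print('next best view:', max(information_gain, key=information_gain.get))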

If you use this work, please cite the following as appropriate:

@inproceedings{morrison2019multiview, 
	title={{Multi-View Picking: Next-best-view Reaching for Improved Grasping in Clutter}}, 
	author={Morrison, Douglas and Corke, Peter and Leitner, J\"urgen}, 
	booktitle={2019 IEEE International Conference on Robotics and Automation (ICRA)}, 
	year={2019} 
}

@inproceedings{morrison2018closing, 
	title={{Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach}}, 
	author={Morrison, Douglas and Corke, Peter and Leitner, J\"urgen}, 
	booktitle={Proc.\ of Robotics: Science and Systems (RSS)}, 
	year={2018} 
}

Contact

For any questions or comments, contact Doug Morrison.

Setup

Hardware:

This code is designed around a Franka Emika Panda robot with an Intel RealSense D435 camera mounted on the wrist. A 3D-printable camera mount is available in the cad folder. DYMO M10 scales are used to detect grasp success (optional; see the scales_interface directory for more information).
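To illustrate how the scales factor in: grasp success is judged from the change in weight measured before and after a lift. The helper below is a minimal, hypothetical sketch of that check; the function name and threshold are illustrative assumptions, not part of the scales_interface package.

def grasp_succeeded(weight_before_g, weight_after_g, min_object_weight_g=5.0):
    # Hypothetical helper: enough weight must have left the scales to count as a lift.
    return (weight_before_g - weight_after_g) >= min_object_weight_g

# Example: 412 g on the scales before the grasp, 350 g after lifting an object.
print(grasp_succeeded(412.0, 350.0))  # True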

The following external packages are required to run everything completely:

Installation:

Clone this repository into your ROS workspace, run rosdep install --from-paths src --ignore-src --rosdistro=<your_rosdistro> -y, and then build with catkin_make or catkin build.

Local Python requirements can be installed with:

pip install -r requirements.txt

Packages Overview

  • dougsm_helpers: A set of common functions for dealing with ROS and TF that are used throughout.
  • scales_interface: A simple interface to a set of DYMO scales for reading weight.
  • ggcnn: Service and node for running GG-CNN. Provides two interfaces to GG-CNN grasp prediction (a service and a node); a hedged client sketch follows this list.
  • franka_control_wrappers: Adds a simple velocity controller node and a MoveIt commander for controlling the Panda robot.
  • mvp_grasping: ROS nodes for executing grasps using the Multi-View Picking approach, including baselines.
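As an example of how these pieces fit together, a client node could request a grasp prediction from the GG-CNN service before handing the pose to the controller. The sketch below is hedged: the service name, service type and response fields are assumptions made for illustration; check the ggcnn package's srv definitions and launch files for the actual interface.

#!/usr/bin/env python
# Hypothetical client sketch; the service name, type and fields are assumptions.
import rospy
from ggcnn.srv import GraspPrediction  # assumed service type

rospy.init_node('ggcnn_client_example')
rospy.wait_for_service('/ggcnn_service/predict')   # assumed service name
predict = rospy.ServiceProxy('/ggcnn_service/predict', GraspPrediction)

response = predict()
# Assumed response layout: a best grasp pose plus a quality score.
rospy.loginfo('Best grasp: %s (quality %.2f)',
              response.best_grasp.pose, response.best_grasp.quality)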

Running

To run grasping experiments:

# Start the robot and required extras.
roslaunch mvp_grasping robot_bringup.launch

# Start the camera, depth conversion and static transform
roslaunch mvp_grasping wrist_realsense.launch

# # Start the scales interface (disabled by default, useful if you have compatible scales)
# roslaunch scales_interface scales.launch

# Start the Multi-View Picking backend
roslaunch mvp_grasping grasp_entropy_service.launch
 
## Execute Grasping Experiment

# For Multi-View Picking
rosrun mvp_grasping panda_mvp_grasp.py

# For Fixed data-collection baseline
rosrun mvp_grasping panda_fixed_baseline.py

# For single-view open-loop grasping baseline
roslaunch ggcnn ggcnn_service.launch
rosrun mvp_grasping panda_open_loop_grasp.py

Configuration

While this code was written with specific hardware in mind, different physical setups or cameras can often be accommodated by customising ggcnn/cfg/ggcnn_service.yaml and mvp_grasping/cfg/mvp_grasp.yaml; supporting a new robot, or a camera that differs substantially, will require major changes.
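As a concrete (but made-up) example of the kind of change involved, pointing the pipeline at a different depth camera mostly means swapping topic names in the YAML files; the keys below are placeholders, not the real configuration schema.

# Illustrative only: the keys are placeholders, not the actual config schema.
import yaml

example_cfg = """
camera:
  depth_topic: /camera/depth/image_rect_raw
  info_topic: /camera/depth/camera_info
"""

cfg = yaml.safe_load(example_cfg)
# Swap in the topics published by a different camera driver.
cfg['camera']['depth_topic'] = '/my_camera/depth/image_raw'
print(yaml.safe_dump(cfg))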
