jhu-lcsr / Handeye_calib_camodocal

License: BSD-2-Clause

Easy to use and accurate hand eye calibration which has been working reliably for years (2016-present) with the Kinect, Kinect v2, RGBD cameras, optical trackers, and several robots including the UR5 and KUKA iiwa.


ROS + CamOdoCal Hand Eye Calibration

This is a ROS node integrating the Hand Eye Calibration implemented in CamOdoCal. See this stack exchange question explaining how Hand Eye Calibration works.

Example uses include determining exact transforms with both positions and orientations of a:

  • camera attached to the floor relative to a robot arm
  • camera attached to a robot arm tip relative to the robot arm's base
  • set of cameras attached to a moving vehicle (this is what camodocal itself implements)
  • two robot arms bolted together

There is a keynote presentation explaining many details about hand eye calibration for those who are interested. Practical code and instructions to calibrate your robot can be found below.

Hand Eye Calibration Basics

Two Common Solutions to Hand Eye Calibration

AX=XB Hand Eye Calibration Solution

Solves for a single unknown transform X given pairs of relative motions A (of the robot hand) and B (of the camera). This is the formulation CamOdoCal implements, using Daniilidis' dual quaternion method.

AX=ZB Hand Eye Calibration Solution

Simultaneously solves for two unknown transforms, such as the robot-world transform Z and the hand-eye transform X, as in Dornaika and Horaud's simultaneous robot-world and hand-eye calibration.
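
As a quick sketch of where AX = XB comes from (for a camera mounted on the hand observing a fixed target; the frame names below are illustrative):

    % Let ^{b}T_{g_i} be the robot base -> hand pose and C_i the camera -> target
    % pose at time step i, and let X = ^{g}T_{c} be the unknown hand-eye transform.
    % The base -> target transform is constant, so for any two time steps i, j:
    %   ^{b}T_{g_i} X C_i = ^{b}T_{g_j} X C_j
    % which rearranges into the AX = XB form:
    \[
      \underbrace{\left({}^{b}T_{g_j}\right)^{-1} {}^{b}T_{g_i}}_{A} \, X
        = X \, \underbrace{C_j \, C_i^{-1}}_{B}
    \]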

Feeding data into CamOdoCal

  1. Each measurement taken at a different time, position, and orientation narrows down the possible transforms that can represent the unknown X

  2. Record a list of many transforms A and B taken between different time steps, or relative to the first time step

    • Rotations are in AxisAngle = UnitAxis*Angle format, or [x_axis,y_axis,z_axis]*𝜃_angle
      • ||UnitAxis||=1
      • || AxisAngle || = 𝜃_angle
    • Translations are in the normal [x,y,z] format
  3. Pass both vectors into EstimateHandEyeScrew()

  4. Returns X in the form of a 4x4 transform estimate
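
As a rough sketch of what this data preparation looks like in C++ with Eigen (the camodocal solver call is left as a comment because its exact signature should be checked against your camodocal version; the input transforms here are placeholders):

    // Sketch: convert relative 4x4 transforms into the AxisAngle + translation
    // format described above. Requires Eigen 3.
    #include <vector>
    #include <Eigen/Dense>
    #include <Eigen/Geometry>

    // Rotation block of a 4x4 transform -> AxisAngle = UnitAxis * theta.
    Eigen::Vector3d toAxisAngle(const Eigen::Matrix4d &T) {
      Eigen::AngleAxisd aa(Eigen::Matrix3d(T.block<3, 3>(0, 0)));
      return aa.axis() * aa.angle();  // ||UnitAxis|| = 1, so the norm is theta
    }

    int main() {
      std::vector<Eigen::Vector3d> rvecsA, tvecsA;  // robot hand motions A
      std::vector<Eigen::Vector3d> rvecsB, tvecsB;  // camera motions B

      // For each pair of relative transforms (A_i, B_i) between time steps:
      Eigen::Matrix4d A = Eigen::Matrix4d::Identity();  // placeholder data
      Eigen::Matrix4d B = Eigen::Matrix4d::Identity();
      rvecsA.push_back(toAxisAngle(A));
      tvecsA.push_back(A.block<3, 1>(0, 3));
      rvecsB.push_back(toAxisAngle(B));
      tvecsB.push_back(B.block<3, 1>(0, 3));

      // Then pass both sets of vectors into camodocal's hand-eye solver
      // (EstimateHandEyeScrew() as described above; check the headers for the
      // exact signature in your version). It returns a 4x4 estimate of X.
      return 0;
    }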

Camodocal Hand Eye Calibration Details

When using this with a robot arm, move it around to a variety of poses and orientations, making sure any data sources that lag behind have settled. Then record each pair of poses: between the robot base and the robot tip, and between the eye/camera base and the marker, fiducial, or AR tag it is viewing.
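
For reference, here is a minimal sketch of reading one such pose pair from tf in a ROS node; the frame names are placeholders for your own setup:

    #include <ros/ros.h>
    #include <tf/transform_listener.h>

    int main(int argc, char **argv) {
      ros::init(argc, argv, "record_pose_pair");
      tf::TransformListener listener;
      tf::StampedTransform baseToTip, cameraToMarker;
      ros::Duration(1.0).sleep();  // crude: let the listener fill its buffer
      try {
        // Robot base -> end effector tip (placeholder frame names).
        listener.lookupTransform("/base_link", "/ee_link",
                                 ros::Time(0), baseToTip);
        // Camera -> AR tag / fiducial (placeholder frame names).
        listener.lookupTransform("/camera_link", "/ar_marker_0",
                                 ros::Time(0), cameraToMarker);
      } catch (tf::TransformException &ex) {
        ROS_ERROR("%s", ex.what());
      }
      return 0;
    }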

This will save out a yaml file with the results. Be sure to load the results into your system using the data formatted as a rotation matrix, dual quaternion, or quaternion + translation. Roll Pitch Yaw can degenerate and will often be inaccurate!

Installation

Linux

All dependencies can be installed via scripts in the robotics_setup repository on Ubuntu 14.04 or Ubuntu 16.04.

MacOS

On OS X you can use homebrew and the homebrew-robotics tap to install all dependencies.

ROS (both Linux + MacOS)

Once you've completed the Linux or MacOS steps, follow the normal ROS source package installation procedure and build with catkin build.

Dependencies

If installing manually, be sure to follow the instructions for each library as there are specific steps required depending on your OS.

  • ROS Indigo or Kinetic
  • OpenCV 2 or 3 with (recommended) nonfree components
    • Note: handeye_calib_camodocal does not call any nonfree components, but some users have had difficulty configuring CMake to compile and install all of the other dependencies without them.
    • OpenCV 3 puts its nonfree components in opencv_contrib.
  • Eigen3
  • ceres-solver
  • glog
    • If you encounter an error about providing "FindGlog.cmake" in CMAKE_MODULE_PATH, try installing glog from source.
  • gflags

Examples

There are example pre-recorded transforms in the example folder. All config files are expected to be in the handeye_calib_camodocal/launch folder by default; if that doesn't work, try checking the ~/.ros/ folder.

To verify that the software is working run:

roslaunch handeye_calib_camodocal handeye_example.launch

You should see output like the following:

# INFO: Before refinement: H_12 =
  -0.962926   -0.156063     0.22004 -0.00802514
  -0.176531    0.981315  -0.0765322   0.0242905
  -0.203985   -0.112539   -0.972484   0.0550756
          0           0           0           1
Ceres Solver Report: Iterations: 89, Initial cost: 1.367791e+01, Final cost: 6.005694e-04, Termination: CONVERGENCE
# INFO: After refinement: H_12 =
  -0.980558    0.184959   0.0655414  0.00771561
  0.0495028  -0.0900424    0.994707   0.0836796
   0.189881    0.978613   0.0791359 -0.00867321
          0           0           0           1
Result from /ee_link to /ar_marker_0:
  -0.980558    0.184959   0.0655414  0.00771561
  0.0495028  -0.0900424    0.994707   0.0836796
   0.189881    0.978613   0.0791359 -0.00867321
          0           0           0           1
Translation (x,y,z) :  0.00771561   0.0836796 -0.00867321
Rotation (w,x,y,z): -0.046193, 0.0871038, 0.672938, 0.733099

Result from /ar_marker_0 to /ee_link:
  -0.980558    0.184959   0.0655414  0.00771561
  0.0495028  -0.0900424    0.994707   0.0836796
   0.189881    0.978613   0.0791359 -0.00867321
          0           0           0           1
Inverted translation (x,y,z) : 0.00507012 0.0145954 -0.083056
Inverted rotation (w,x,y,z): -0.046193, 0.0871038, 0.672938, 0.733099
0.046193 0.0871038 0.672938 0.733099

Writing calibration to "/home/cpaxton/catkin_ws/src/handeye_calib_camodocal/launch/CalibratedTransform.yml"...
[handeye_calib_camodocal-1] process has finished cleanly
log file: /home/cpaxton/.ros/log/a829db0a-f96b-11e6-b1dd-fc4dd43dd90b/handeye_calib_camodocal-1*.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done

The full terminal session can be found at:

Recording your own Transforms

To record your own session, modify launch/handeye_file.launch to specify the ROS topics that will publish the poses between which you wish to calibrate, then run:

roslaunch handeye_calib_camodocal handeye_file.launch

If you have difficulty, we cover just about every problem we've seen in the troubleshooting section below. It can also help to see this stack exchange question explaining how Hand Eye Calibration works.

After a run, be sure to back up TransformPairsInput.yml and CalibratedTransform.yml so you don't lose all the transforms and positions you saved!

How do I get transforms between the camera and an object it sees?

If a camera is calibrated, it is possible to estimate the transform from the camera to a printed pattern with known dimensions. I don't recommend using a checkerboard for hand eye calibration because the pattern is ambiguous; use a fiducial marker library (such as an AR tag tracker) instead.

Such libraries provide instructions on how to set up your camera and create patterns that can be used to generate transforms.

Troubleshooting

Saved Files Not Loading?

If you have trouble finding the saved files, note that the default working directory of ROS applications is the ~/.ros/ folder, so try looking there. Be sure to also check your launch file, typically launch/handeye_file.launch; it determines whether transforms will be loaded from a running robot or from saved files, as well as where save files are placed.

Collecting Enough Data

We recommend you collect at least ~36 accurate transforms for a good calibration. If it fails to converge (i.e. you don't get a good result out), then you probably have your transforms flipped the wrong way or there is too much noise in your data to find a sufficiently accurate calibration.

Eliminating Sensor Noise

One simple method to help deal with this problem is to create a new node that reads the pose data you want and publishes a rolling average of the pose. This helps stabilize the results; better methods, such as a Kalman filter, could handle this even more effectively. If you take a rolling average, make sure the robot has been held in a single position for the entire duration over which the average is taken, because any motion during that window will throw off the results.
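
A minimal sketch of such a rolling-average node's core logic, using Eigen (all names are illustrative; quaternions are sign-aligned, summed, and renormalized, which is a reasonable approximation only when the poses are nearly identical, i.e. the robot is stationary):

    #include <cstddef>
    #include <deque>
    #include <utility>
    #include <Eigen/Dense>
    #include <Eigen/Geometry>

    // Rolling average over the last `window` poses. Build with C++17 (or use
    // Eigen::aligned_allocator) to satisfy Eigen's alignment requirements.
    class PoseAverager {
    public:
      explicit PoseAverager(std::size_t window) : window_(window) {}

      void add(const Eigen::Vector3d &t, const Eigen::Quaterniond &q) {
        samples_.emplace_back(t, q);
        if (samples_.size() > window_) samples_.pop_front();
      }

      // Assumes at least one sample has been added.
      void average(Eigen::Vector3d &t, Eigen::Quaterniond &q) const {
        t.setZero();
        Eigen::Vector4d qsum = Eigen::Vector4d::Zero();
        const Eigen::Quaterniond &ref = samples_.front().second;
        for (const auto &s : samples_) {
          t += s.first;
          // q and -q encode the same rotation; flip signs into one hemisphere.
          const double sign =
              ref.coeffs().dot(s.second.coeffs()) < 0.0 ? -1.0 : 1.0;
          qsum += sign * s.second.coeffs();
        }
        t /= static_cast<double>(samples_.size());
        q = Eigen::Quaterniond(qsum.normalized());
      }

    private:
      std::size_t window_;
      std::deque<std::pair<Eigen::Vector3d, Eigen::Quaterniond>> samples_;
    };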

Examples of "too much noise" when taking data

If there is too much noise you will probably see the following error:

normalization could not be handled. Your rotations and translations are probably either not aligned or not passed in properly

That means there is probably too much variation in the data you are reading to find an accurate solution. For example, if you watch the pose of an AR tag and it wobbles a little or flips, this will prevent an accurate solution from being found. One way to help is to ensure the system is completely stationary and then interpolate (average) the poses across several frames, again making sure the system is completely still before recording the frame, and only then moving to the next position and repeating the process.

Your cameras must be calibrated

Camera calibration is very important! If they aren't calibrated then the poses being fed into the algorithm will be inaccurate, won't correspond, and thus the algorithm won't be able to find even a decent approximate solution and will just exit, printing an error.

Your robot and cameras must be rigidly fixed

Hand eye calibration solves for a rigid body transform, so if the whole system isn't rigidly fixed, the transform you are solving for is constantly changing and thus impossible to find accurately. For example, if you have a camera and a fixed robot base, check that your robot is securely bolted to a surface. Tighten those bolts up! Also ensure the camera is securely and rigidly fixed in place in a similar fashion. Check for any wobbling and make sure to wait for everything to become still before taking your data points.

Sanity Check by Physically Measuring

Slight distortion, or variation in time stamps while the arm moves as you hold it, can still throw the calibration off. One additional way to test for this is to move the arm between two distant positions: the distance between the two checkerboard (or marker) poses should equal the distance between the two end effector tip poses, assuming you keep the orientation constant.
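
A tiny sketch of that check in C++ with Eigen, assuming the two pose pairs were recorded as 4x4 transforms (all names illustrative):

    #include <Eigen/Dense>

    // With constant orientation, the end effector tip and the marker should
    // travel the same distance between the two recorded positions.
    double travel(const Eigen::Matrix4d &pose1, const Eigen::Matrix4d &pose2) {
      return (pose2.block<3, 1>(0, 3) - pose1.block<3, 1>(0, 3)).norm();
    }

    // Compare travel(eePose1, eePose2) against travel(markerPose1, markerPose2);
    // a significant mismatch points to a measurement or transform problem.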

Sanity Check via Simulation

If you're concerned it is a bug in the algorithm, you can run it in simulation with V-REP or Gazebo (a ROS + V-REP python script is in the repo) to verify that it works, since that avoids all physical measurement problems. From there you could gradually incorporate real data to narrow down the source of the problem.

Sanity Check Transforms and when loading from files

If you're loading from a file you've modified by hand, check whether your matrices are transposed or inverted; in very unusual cases, even just the 3x3 rotation component of the 4x4 transform may be transposed.
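
A quick way to sanity check a hand-edited matrix is to verify that the rotation block is orthonormal with determinant +1; a minimal sketch:

    #include <cmath>
    #include <Eigen/Dense>

    // True if the upper-left 3x3 block of T is a valid rotation matrix:
    // R * R^T == I and det(R) == +1, within tolerance.
    bool isValidRotation(const Eigen::Matrix4d &T, double tol = 1e-6) {
      const Eigen::Matrix3d R = T.block<3, 3>(0, 0);
      const bool orthonormal =
          (R * R.transpose() - Eigen::Matrix3d::Identity()).norm() < tol;
      const bool rightHanded = std::abs(R.determinant() - 1.0) < tol;
      return orthonormal && rightHanded;
    }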

Example output

Here is an example output of what you should expect when a run is executed successfully:

Writing pairs to "/home/cpaxton/catkin_ws/src/handeye_calib_camodocal/launch/TransformPairsInput.yml"...
q[ INFO] [1473813682.393291696]: Calculating Calibration...
# INFO: Before refinement: H_12 =
-0.00160534     0.99916   0.0409473 -0.00813108
-0.00487176  -0.0409546    0.999149     0.10692
  0.999987  0.00140449  0.00493341   0.0155885
         0           0           0           1
Ceres Solver Report: Iterations: 99, Initial cost: 1.882582e-05, Final cost: 1.607494e-05, Termination: CONVERGENCE
# INFO: After refinement: H_12 =
-0.00282176     0.999009    0.0444162  -0.00746998
  0.0121142   -0.0443789     0.998941     0.101617
   0.999923   0.00335684    -0.011977 -0.000671928
          0            0            0            1
Result:
-0.00282176     0.999009    0.0444162  -0.00746998
  0.0121142   -0.0443789     0.998941     0.101617
   0.999923   0.00335684    -0.011977 -0.000671928
          0            0            0            1
Translation:  -0.00746998     0.101617 -0.000671928
Rotation: -0.48498 0.513209 0.492549 0.513209

Note that this run is not a perfect one: it has errors of about 5 mm over a motion of 1 m.

Cost

One key piece of information is the output of the cost function, which is a metric representing an estimate of solution accuracy:

Initial cost: 1.882582e-05, Final cost: 1.607494e-05

With a really good run where the calibration is dead on, the final cost should be on the order of 1e-13 or 1e-14.

Results

Now let's take a look at the results:

Translation:  -0.00746998     0.101617 -0.000671928
Rotation: -0.48498 0.513209 0.492549 0.513209

The translation is in xyz format, and the rotation is in quaternion format. It is important to note that this tool and camodocal use the Eigen Quaternion format, which orders the four values stored in a quaternion as wxyz. ROS launch files, by comparison, store the data in the order xyzw. That means when copying the results into ROS you must move the first entry of the rotation to the end.
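
A small sketch of that reordering in C++ with Eigen, using the values from the run above:

    #include <cstdio>
    #include <Eigen/Geometry>

    int main() {
      // Eigen's constructor takes (w, x, y, z)...
      Eigen::Quaterniond q(-0.48498, 0.513209, 0.492549, 0.513209);
      // ...but coeffs() is stored as (x, y, z, w), the order that ROS
      // (e.g. static_transform_publisher) expects.
      const Eigen::Vector4d xyzw = q.coeffs();
      std::printf("ROS order (x y z w): %f %f %f %f\n",
                  xyzw[0], xyzw[1], xyzw[2], xyzw[3]);
      return 0;
    }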

Here is an example of all 7 numbers from above correctly placed into a ROS launch file:

<node pkg="tf" type="static_transform_publisher" name="endpoint_to_marker" args=" -0.00746998     0.101617 -0.000671928  0.513209 0.492549 0.513209  -0.48498   $(arg ee_frame) /endpoint_marker 10"/>

Questions? Here is what we need to know.

If you try running this and have a question please create a diagram of your use case so we can understand how you are setting up the equations, then create a github issue.

See this stack exchange question explaining how Hand Eye Calibration works for an example of such a diagram.

Authors

Andrew Hundt <[email protected]>
Felix Jonathan <[email protected]>

Acknowledgements

Hand-Eye Calibration Using Dual Quaternions

@article{daniilidis1999hand,
  title={Hand-eye calibration using dual quaternions},
  author={Daniilidis, Konstantinos},
  journal={The International Journal of Robotics Research},
  volume={18},
  number={3},
  pages={286--298},
  year={1999},
  publisher={SAGE Publications}
}

CamOdoCal

Lionel Heng, Bo Li, and Marc Pollefeys,
CamOdoCal: Automatic Intrinsic and Extrinsic Calibration of a Rig with Multiple Generic Cameras and Odometry,
In Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2013.

Lionel Heng, Mathias Bürki, Gim Hee Lee, Paul Furgale, Roland Siegwart, and Marc Pollefeys,
Infrastructure-Based Calibration of a Multi-Camera Rig,
In Proc. IEEE International Conference on Robotics and Automation (ICRA), 2014.

Lionel Heng, Paul Furgale, and Marc Pollefeys,
Leveraging Image-based Localization for Infrastructure-based Calibration of a Multi-camera Rig,
Journal of Field Robotics (JFR), 2015.

References

  • Strobl, K., & Hirzinger, G. (2006). Optimal hand-eye calibration. In 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (pp. 4647–4653), October 2006.
  • Technical University of Munich (TUM) CAMP lab wiki

  • K. Daniilidis, “Hand–Eye Calibration Using Dual Quaternions,” Int. Journal of Robs. Research, vol. 18, no. 3, pp. 286–298, June 1999.
  • E. Bayro–Corrochano, K. Daniilidis, and G. Sommer, “Motor–Algebra for 3D Kinematics: The Case of Hand–Eye Calibration,” Journal for Mathem. Imaging and Vision, vol. 13, no. 2, pp. 79–100, Oct. 2000.
  • F. Dornaika and R. Horaud, “Simultaneous Robot–World and Hand– Eye Calibration,” IEEE Trans. on Robs. and Aut., vol. 14, no. 4, pp. 617–622, August 1998.
  • Note: figures and text are from mixed sources including the presentation author, the various papers referenced, and the TUM wiki.