
Phylliade / Ikpy

Licence: gpl-2.0
An Inverse Kinematics library aiming at performance and modularity

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Ikpy

Awesome Weekly Robotics
A list of projects that were or will be featured in Weekly Robotics newsletter
Stars: ✭ 255 (-18.01%)
Mutual labels:  robotics
Camlasercalibratool
Extrinsic Calibration of a Camera and 2d Laser
Stars: ✭ 277 (-10.93%)
Mutual labels:  robotics
Dreamerv2
Mastering Atari with Discrete World Models
Stars: ✭ 287 (-7.72%)
Mutual labels:  robotics
Gym Gazebo2
gym-gazebo2 is a toolkit for developing and comparing reinforcement learning algorithms using ROS 2 and Gazebo
Stars: ✭ 257 (-17.36%)
Mutual labels:  robotics
Bonnet
Bonnet: An Open-Source Training and Deployment Framework for Semantic Segmentation in Robotics.
Stars: ✭ 274 (-11.9%)
Mutual labels:  robotics
Furniture
IKEA Furniture Assembly Environment for Long-Horizon Complex Manipulation Tasks
Stars: ✭ 282 (-9.32%)
Mutual labels:  robotics
CLF reactive planning system
This package provides a CLF-based reactive planning system, described in paper: Efficient Anytime CLF Reactive Planning System for a Bipedal Robot on Undulating Terrain. The reactive planning system consists of a 5-Hz planning thread to guide a robot to a distant goal and a 300-Hz Control-Lyapunov-Function-based (CLF-based) reactive thread to co…
Stars: ✭ 21 (-93.25%)
Mutual labels:  robotics
Reward Learning Rl
[RSS 2019] End-to-End Robotic Reinforcement Learning without Reward Engineering
Stars: ✭ 310 (-0.32%)
Mutual labels:  robotics
Pyswip
PySwip is a Python–SWI-Prolog bridge that enables querying SWI-Prolog from your Python programs. It features an (incomplete) SWI-Prolog foreign language interface, a utility class that makes querying with Prolog easy, and a Pythonic interface.
Stars: ✭ 276 (-11.25%)
Mutual labels:  robotics
Open quadtree mapping
This is a monocular dense mapping system corresponding to IROS 2018 "Quadtree-accelerated Real-time Monocular Dense Mapping"
Stars: ✭ 292 (-6.11%)
Mutual labels:  robotics
Free gait
An Architecture for the Versatile Control of Legged Robots
Stars: ✭ 263 (-15.43%)
Mutual labels:  robotics
Dreamer
Dream to Control: Learning Behaviors by Latent Imagination
Stars: ✭ 269 (-13.5%)
Mutual labels:  robotics
Openvslam
OpenVSLAM: A Versatile Visual SLAM Framework
Stars: ✭ 2,945 (+846.95%)
Mutual labels:  robotics
Toppra
robotic motion planning library
Stars: ✭ 254 (-18.33%)
Mutual labels:  robotics
Orb slam 2 ros
A ROS implementation of ORB_SLAM2
Stars: ✭ 294 (-5.47%)
Mutual labels:  robotics
ros-docker-images
🐳 Bring ROS to any Linux distribution.
Stars: ✭ 15 (-95.18%)
Mutual labels:  robotics
Visual Slam Roadmap
Roadmap to becoming a Visual-SLAM developer in 2021
Stars: ✭ 277 (-10.93%)
Mutual labels:  robotics
Ros Sensor Fusion Tutorial
An in-depth step-by-step tutorial for implementing sensor fusion with robot_localization! 🛰
Stars: ✭ 306 (-1.61%)
Mutual labels:  robotics
Cherry Autonomous Racecar
Implementation of the CNN from End to End Learning for Self-Driving Cars on a Nvidia Jetson TX1 using Tensorflow and ROS
Stars: ✭ 294 (-5.47%)
Mutual labels:  robotics
Se2lam
(ICRA 2019) Visual-Odometric On-SE(2) Localization and Mapping
Stars: ✭ 285 (-8.36%)
Mutual labels:  robotics

IKPy


[Demo animation: IKPy on the baxter robot]

Demo

Live demos of what IKPy can do (click on the image below to see the video):

Also, a presentation of IKPy: Presentation.

Features

With IKPy, you can:

  • Compute the Inverse Kinematics of every existing robot.
  • Compute the Inverse Kinematics in position, orientation, or both.
  • Define your kinematic chain using arbitrary representations: DH (Denavit–Hartenberg), URDF, custom...
  • Automatically import a kinematic chain from a URDF file.
  • Use pre-configured robots, such as baxter or the poppy-torso.
  • IKPy is precise (up to 7 digits, the only limitation being your underlying model's precision) and fast: a complete IK computation takes from 7 ms to 50 ms, depending on the requested precision.
  • Plot your kinematic chain: no need to use a real robot (or a simulator) to test your algorithms!
  • Define your own Inverse Kinematics methods.
  • Parse and analyze URDF files with the bundled utilities.

Moreover, IKPy is a pure-Python library: the install is a matter of seconds, and no compiling is required.
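
As a rough illustration of the workflow above, the sketch below builds a chain from a URDF file and solves an IK problem. It is not taken verbatim from the documentation: the file name my_robot.urdf is a placeholder, and the exact call signatures may differ slightly between IKPy versions.

    import numpy as np
    import ikpy.chain

    # Placeholder URDF file: replace with the path to your own robot description.
    my_chain = ikpy.chain.Chain.from_urdf_file("my_robot.urdf")

    # Target position of the end effector, expressed in the base frame.
    target_position = [0.1, 0.2, 0.3]

    # Solve the IK problem: returns one joint value per link of the chain.
    joint_angles = my_chain.inverse_kinematics(target_position)

    # Verify the solution with forward kinematics (a 4x4 homogeneous matrix).
    reached_frame = my_chain.forward_kinematics(joint_angles)
    print("Position error:", np.linalg.norm(reached_frame[:3, 3] - target_position))

The position error printed at the end is a quick way to check the precision claim above against your own model.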

Installation

You have three options:

  1. From PyPI (recommended) - simply run:

    pip install ikpy
    

    If you intend to plot your robot, you can install the plotting dependencies (mainly matplotlib):

    pip install 'ikpy[plot]'
    
  2. If you work with Anaconda, there's also a Conda package of IKPy:

    conda install -c https://conda.anaconda.org/phylliade ikpy

  3. From source - first download and extract the archive, then run:

    pip install ./

    NB: You must have the proper rights to execute this command

Quickstart

Follow this IPython notebook.

Guides and Tutorials

Go to the wiki. It should introduce you to the basic concepts of IKPy.

API Documentation

Extensive documentation of the API can be found here.

Dependencies and compatibility

The library can work with both versions of Python (2.7 and 3.x). It requires numpy and scipy.

sympy is highly recommended for fast hybrid computations; that's why it is installed by default.

matplotlib is optional: it is used to plot your models (in 3D).
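
If you installed the plotting dependencies, a chain can be drawn on a matplotlib 3D axes. The snippet below is a sketch only, reusing the placeholder chain from the earlier example; the plot method's exact signature may vary between IKPy versions.

    import matplotlib.pyplot as plt
    from mpl_toolkits.mplot3d import Axes3D  # registers the 3D projection on older matplotlib
    import ikpy.chain

    # Placeholder URDF file, as in the earlier sketch.
    my_chain = ikpy.chain.Chain.from_urdf_file("my_robot.urdf")
    joint_angles = my_chain.inverse_kinematics([0.1, 0.2, 0.3])

    # Draw the chain on a standard 3D axes; no robot or simulator needed.
    fig = plt.figure()
    ax = fig.add_subplot(111, projection="3d")
    my_chain.plot(joint_angles, ax)
    plt.show()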

Contributing

IKPy is designed to be easily customisable: you can add your own IK methods or robot representations (such as DH-Parameters) using a dedicated developer API.

Contributions are welcome: if you have an awesome patented (but also open-source!) IK method, don't hesitate to propose adding it to the library!

Links

  • If performance is your main concern, aversive++ has an inverse kinematics module written in C++, which works the same way IKPy does.