
Udacity - Robotics NanoDegree Program

Robotic Arm Pick & Place Project

Using KUKA KR210 Robot

Writeup by Muthanna A. Attyah

Feb 2018

Rubric Points


Setting up the environment:

I have used the following environment to run this project:

  • Ubuntu 16.04 LTS OS
  • ROS Kinetic 1.12.12
  • Gazebo 7.9
  • RViz 1.12.15 (Qt 5.5.1 & OGRE 1.9.0)

Installation steps:

  • Clone this repository to your home directory:
$ git clone https://github.com/mkhuthir/RoboND-Kinematics-Project.git ~/catkin_ws 
  • As this project uses custom Gazebo 3D models, we need to add the path through environment variable:
$ echo "export GAZEBO_MODEL_PATH=~/catkin_ws/src/kuka_arm/models" >> ~/.bashrc
  • Install missing ROS dependencies using the rosdep install command:
$ cd ~/catkin_ws/
$ rosdep install --from-paths src --ignore-src --rosdistro=kinetic -y
  • Run catkin_make from within your workspace to build the project:
$ cd ~/catkin_ws/
$ catkin_make
  • Run the following shell commands to source the setup files:
$ echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
  • For demo mode, make sure the demo flag is set to true in the inverse_kinematics.launch file under ~/catkin_ws/src/kuka_arm/launch/

  • You can also control the spawn location of the target object on the shelf by modifying the spawn_location argument in the target_description.launch file under ~/catkin_ws/src/kuka_arm/launch/. Valid values for spawn_location are 0-9, with 0 being random mode.

  • To run the forward kinematics test, use:
$ roslaunch kuka_arm forward_kinematics.launch
  • To run the simulator, use:
$ rosrun kuka_arm safe_spawner.sh
  • To run the IK server, use:
$ rosrun kuka_arm IK_server.py

Forward Kinematic Analysis

Extracting joint positions and orientations from URDF file.

From the URDF file kr210.urdf.xacro we can extract the position (xyz) and orientation (rpy) of each joint from the origin tag in each joint XML section.

For example, consider the following fixed base joint XML section:

 <!-- joints -->
  <joint name="fixed_base_joint" type="fixed">
    <parent link="base_footprint"/>
    <child link="base_link"/>
    <origin xyz="0 0 0" rpy="0 0 0"/>
  </joint>

In the origin tag:

<origin xyz="0 0 0" rpy="0 0 0"/>

We can see that xyz="0 0 0" and rpy="0 0 0".

The following table shows the complete extracted list for all joints, from base to gripper:

O  joint       parent          child         x      y  z       r  p  y
0  fixed_base  base_footprint  base_link     0      0  0       0  0  0
1  joint_1     base_link       link_1        0      0  0.33    0  0  0
2  joint_2     link_1          link_2        0.35   0  0.42    0  0  0
3  joint_3     link_2          link_3        0      0  1.25    0  0  0
4  joint_4     link_3          link_4        0.96   0  -0.054  0  0  0
5  joint_5     link_4          link_5        0.54   0  0       0  0  0
6  joint_6     link_5          link_6        0.193  0  0       0  0  0
7  gripper     link_6          gripper_link  0.11   0  0       0  0  0
   Total (m)                                 2.153  0  1.946   0  0  0

Now, using the above table, we can draw the different frames with x and z translations from one frame to another, as shown in the figure below.

Then we can further simplify by combining the last three joints (4, 5, and 6) in joint_5, since their axes in the actual KR210 robot intersect at a single point which represents the center of the robot's spherical wrist:

Note that:

Origin O(i): the intersection between the X(i) and Z(i) axes.

Link Length a(i-1): the distance from Z(i-1) to Z(i), measured along the X(i-1) axis.

Link Offset d(i): the distance from X(i-1) to X(i), measured along the Z(i) axis.

Link Twist alpha(i-1): the angle from Z(i-1) to Z(i), measured about the X(i-1) axis using the right-hand rule.

Joint Angle theta(i): the angle from X(i-1) to X(i), measured about the Z(i) axis using the right-hand rule. All joint angles are zero in the initial robot state of the KR210, except joint 2, which has a constant -90 degree offset between X(1) and X(2).

Gripper frame: the end point that we care about; it is displaced from frame 6 by a translation along Z(6).

KUKA KR210 robot DH parameters.

Using the above formulas together with the URDF values, we can generate the DH parameter table as follows. Note how the URDF distances combine: d1 = 0.33 + 0.42 = 0.75, a1 = 0.35, a2 = 1.25, a3 = -0.054, d4 = 0.96 + 0.54 = 1.5, and d7 = 0.193 + 0.11 = 0.303.

Links  i  alpha(i-1)  a(i-1)   d(i)   theta(i)
0->1   1     0         0       0.75   q1
1->2   2   -90         0.35    0      -90 + q2
2->3   3     0         1.25    0      q3
3->4   4   -90        -0.054   1.5    q4
4->5   5    90         0       0      q5
5->6   6   -90         0       0      q6
6->7   7     0         0       0.303  q7

in which q(i) is our input joint angle variable (theta(i)), and the alpha values are given in degrees.

I will be using Python to code the forward kinematics.

To start with, we need the following imports:

import numpy as np
from numpy import array
from sympy import symbols, cos, sin, pi, simplify, sqrt, atan2, pprint
from sympy.matrices import Matrix

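The symbols used in the DH table below are not created by the imports above; a minimal sketch of the missing definitions (the names match the DH table code):

# DH parameter symbols
q1, q2, q3, q4, q5, q6, q7 = symbols('q1:8')    # joint angles (theta_i)
d1, d2, d3, d4, d5, d6, d7 = symbols('d1:8')    # link offsets
a0, a1, a2, a3, a4, a5, a6 = symbols('a0:7')    # link lengths
alpha0, alpha1, alpha2, alpha3, alpha4, alpha5, alpha6 = symbols('alpha0:7')  # twist angles
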
Python code to represent the DH parameter table is:

# DH Table
dh = {alpha0:      0, a0:      0, d1:  0.75, q1:        q1,
      alpha1: -pi/2., a1:   0.35, d2:     0, q2: -pi/2.+q2,
      alpha2:      0, a2:   1.25, d3:     0, q3:        q3,
      alpha3: -pi/2., a3: -0.054, d4:  1.50, q4:        q4,
      alpha4:  pi/2., a4:      0, d5:     0, q5:        q5,
      alpha5: -pi/2., a5:      0, d6:     0, q6:        q6,
      alpha6:      0, a6:      0, d7: 0.303, q7:         0}

Creating the individual transformation matrices about each joint:

Using the above DH parameter table, we can create the individual transforms between the various links. The DH convention builds each link transform from four elementary transforms:
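
In LaTeX, with $R$ denoting a rotation and $D$ a translation (this matches the matrix implemented by the TF_Mat function below):

$$ {}^{i-1}_{\;i}T = R_X(\alpha_{i-1}) \, D_X(a_{i-1}) \, R_Z(\theta_i) \, D_Z(d_i) $$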

Composing these four transforms gives the single matrix that maps frame i-1 to frame i:
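
The expanded form of that product (exactly what the TF_Mat function below returns) is:

$$
{}^{i-1}_{\;i}T =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i & 0 & a_{i-1} \\
\sin\theta_i\cos\alpha_{i-1} & \cos\theta_i\cos\alpha_{i-1} & -\sin\alpha_{i-1} & -d_i\sin\alpha_{i-1} \\
\sin\theta_i\sin\alpha_{i-1} & \cos\theta_i\sin\alpha_{i-1} & \cos\alpha_{i-1} & d_i\cos\alpha_{i-1} \\
0 & 0 & 0 & 1
\end{bmatrix}
$$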

Python code for a function that returns the individual frame transformation matrix is as follows:

# Function to return homogeneous transform matrix

def TF_Mat(alpha, a, d, q):
    TF = Matrix([[            cos(q),           -sin(q),           0,             a],
                 [ sin(q)*cos(alpha), cos(q)*cos(alpha), -sin(alpha), -sin(alpha)*d],
                 [ sin(q)*sin(alpha), cos(q)*sin(alpha),  cos(alpha),  cos(alpha)*d],
                 [                 0,                 0,           0,             1]])
    return TF

Then we use the following code to substitute the DH parameters into the transformation matrix for each link:

## Substitute DH table
T0_1 = TF_Mat(alpha0, a0, d1, q1).subs(dh)
T1_2 = TF_Mat(alpha1, a1, d2, q2).subs(dh)
T2_3 = TF_Mat(alpha2, a2, d3, q3).subs(dh)
T3_4 = TF_Mat(alpha3, a3, d4, q4).subs(dh)
T4_5 = TF_Mat(alpha4, a4, d5, q5).subs(dh)
T5_6 = TF_Mat(alpha5, a5, d6, q6).subs(dh)
T6_7 = TF_Mat(alpha6, a6, d7, q7).subs(dh)

To get the composition of all transforms from base to gripper we simply multiply the individual matrices using the following code:

# Composition of Homogeneous Transforms
# Transform from Base link to end effector (Gripper)
T0_2 = (T0_1 * T1_2) ## (Base) Link_0 to Link_2
T0_3 = (T0_2 * T2_3) ## (Base) Link_0 to Link_3
T0_4 = (T0_3 * T3_4) ## (Base) Link_0 to Link_4
T0_5 = (T0_4 * T4_5) ## (Base) Link_0 to Link_5
T0_6 = (T0_5 * T5_6) ## (Base) Link_0 to Link_6
T0_7 = (T0_6 * T6_7) ## (Base) Link_0 to Link_7 (End Effector)

To account for the orientation difference between the definition of the gripper link (link_7) in the URDF and the DH convention, we need to rotate about the y-axis by -90 degrees and then about the z-axis by 180 degrees:

R_y = Matrix([[ cos(-np.pi/2),           0, sin(-np.pi/2), 0],
              [             0,           1,             0, 0],
              [-sin(-np.pi/2),           0, cos(-np.pi/2), 0],
              [             0,           0,             0, 1]])

R_z = Matrix([[    cos(np.pi), -sin(np.pi),             0, 0],
              [    sin(np.pi),  cos(np.pi),             0, 0],
              [             0,           0,             1, 0],
              [             0,           0,             0, 1]])


R_corr = (R_z * R_y)

T_total= (T0_7 * R_corr)

To check the results, we can numerically evaluate the transforms with all thetas equal to zero and compare against the RViz simulator values. I have used pretty print (pprint) to show the resulting matrix, as shown in the code below.

### Numerically evaluate transforms (compare this to output of tf_echo/rviz)

print("\nT0_7 = \n")
pprint(T0_7.evalf(subs={q1: 0, q2: 0, q3: 0, q4: 0, q5: 0, q6: 0}))

Remember that the homogeneous transform consists of a rotation part and a translation part as follows:
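
In block form:

$$ {}^{0}_{7}T = \begin{bmatrix} R_T & P \\ 0_{1\times3} & 1 \end{bmatrix}, \qquad P = \begin{bmatrix} P_x \\ P_y \\ P_z \end{bmatrix} $$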

where Px, Py, Pz represent the position of the end-effector w.r.t. base_link, and R_T represents the rotation part, built from the Roll-Pitch-Yaw angles.

The individual transform matrices about each joint, using the DH table, are as follows:

In order to compare the output of the forward kinematics code with the simulator, I used the following ROS launch command:

$ roslaunch kuka_arm forward_kinematics.launch

Then, as shown below, I used RViz tf frames to check the values of Px, Py, and Pz against the output of the Python code.

Test Case 1: all thetas = 0, which is the robot's initial state.

T0_7.evalf(subs={q1: 0, q2: 0, q3: 0, q4: 0, q5: 0, q6: 0})
RViz output:

Python output:

Test Case 2: theta2 = 0.44 and all remaining thetas = 0.

T0_7.evalf(subs={q1: 0, q2: 0.44, q3: 0, q4: 0, q5: 0, q6: 0})
RViz output:

Python output:

Python code for the forward kinematics test, FK_kr210.py, is located at this link

Inverse Kinematics Analysis

Since the last three joints in the KUKA KR210 robot (Joint_4, Joint_5, and Joint_6) are revolute and their joint axes intersect at a single point (Joint_5), we have a spherical wrist, with joint_5 being the common intersection point: the wrist center (WC). This allows us to kinematically decouple the IK problem into Inverse Position and Inverse Orientation problems.

Inverse Position

The first step is to get the end-effector position (Px, Py, Pz) and orientation (Roll, Pitch, Yaw) from the test case data, as shown in the code below:

    # Requested end-effector (EE) position
    px = req.poses[x].position.x
    py = req.poses[x].position.y
    pz = req.poses[x].position.z
    
    # store EE position in a matrix
    EE = Matrix([[px],
                 [py],
                 [pz]])
    
    # Requested end-effector (EE) orientation
    (roll,pitch,yaw) = tf.transformations.euler_from_quaternion(
        [req.poses[x].orientation.x,
         req.poses[x].orientation.y,
         req.poses[x].orientation.z,
         req.poses[x].orientation.w])

We will need the rotation matrix for the end-effector:

R_rpy = Rot(Z, yaw) * Rot(Y, pitch) * Rot(X, roll)

and the orientation difference correction matrix (R_corr) discussed earlier in the FK section:

R_EE = R_rpy * R_corr

We substitute the obtained roll, pitch, and yaw into the final rotation matrix. The Python code is as follows (note that radians is not among the SymPy imports listed earlier; the project code imports it from mpmath, i.e. from mpmath import radians):

    # Find EE rotation matrix RPY (Roll, Pitch, Yaw)
    r,p,y = symbols('r p y')

    # Roll
    ROT_x = Matrix([[       1,       0,       0],
                    [       0,  cos(r), -sin(r)],
                    [       0,  sin(r),  cos(r)]])
    # Pitch
    ROT_y = Matrix([[  cos(p),       0,  sin(p)],
                    [       0,       1,       0],
                    [ -sin(p),       0,  cos(p)]])
    # Yaw
    ROT_z = Matrix([[  cos(y), -sin(y),       0],
                    [  sin(y),  cos(y),       0],
                    [       0,       0,       1]])

    ROT_EE = ROT_z * ROT_y * ROT_x

    # Correction Needed to Account for Orientation Difference Between
    # Definition of Gripper Link_G in URDF versus DH Convention

    ROT_corr = ROT_z.subs(y, radians(180)) * ROT_y.subs(p, radians(-90))
    
    ROT_EE = ROT_EE * ROT_corr
    ROT_EE = ROT_EE.subs({'r': roll, 'p': pitch, 'y': yaw})

The obtained matrix is the rotation part of the full homogeneous transform matrix between base_link and gripper_link:
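
Written out, with the rotation part on the left:

$$
{}^{0}_{7}T =
\begin{bmatrix}
l_x & m_x & n_x & P_x \\
l_y & m_y & n_y & P_y \\
l_z & m_z & n_z & P_z \\
0 & 0 & 0 & 1
\end{bmatrix}
$$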

where l, m and n are orthonormal vectors representing the end-effector orientation along X, Y, Z axes of the local coordinate frame.

Since n is the vector along the z-axis of the gripper_link, we can say the following:
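
In component form (consistent with the WC calculation in the code below):

$$
\begin{aligned}
X_{wc} &= P_x - (d_6 + d_7)\,n_x \\
Y_{wc} &= P_y - (d_6 + d_7)\,n_y \\
Z_{wc} &= P_z - (d_6 + d_7)\,n_z
\end{aligned}
$$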

where:

Px, Py, Pz = end-effector position obtained from the test case data

Xwc, Ywc, Zwc = wrist center position that we are trying to find

d6 = link_6 offset obtained from the DH table (d6 = 0)

d7 = end-effector offset obtained from the DH table (d7 = 0.303)

The same equation in vectorized version (d is the displacement):
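
$$ \mathbf{w} = \mathbf{p} - d\,\mathbf{n}, \qquad d = d_6 + d_7 = 0.303 $$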

In Python code:

    # Calculate Wrist Center: WC = EE - d * n, with d = d6 + d7 = 0.303
    WC = EE - (0.303) * ROT_EE[:,2]

WC now holds the position of the wrist center (Wx, Wy, Wz).

To find 𝜃1, we project the wrist center onto the ground (X-Y) plane. Thus:

Theta1 = atan2(Wy, Wx)

    # Calculate theta1
    theta1 = atan2(WC[1],WC[0])

Using trigonometry, we can calculate 𝜃2 and 𝜃3.

We have a triangle (shown in green in the figure below) with two sides known to us: A, the distance from joint 3 to the wrist center, A = sqrt(d4^2 + a3^2) = sqrt(1.5^2 + 0.054^2) ≈ 1.501, and C = a2 = 1.25. The third side, B, can be calculated as follows:
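
$$ B = \sqrt{\left(\sqrt{W_x^2 + W_y^2} - 0.35\right)^2 + \left(W_z - 0.75\right)^2} $$

where 0.35 is a1 and 0.75 is d1 from the DH table.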

Below is the same in Python code:

    #SSS triangle for theta2 and theta3
    A = 1.501
    C = 1.25
    B = sqrt(pow((sqrt(WC[0]*WC[0] + WC[1]*WC[1]) - 0.35), 2) + pow((WC[2] - 0.75), 2))

Now that all three sides of the triangle are known, we can calculate its three inner angles using the law of cosines (the SSS case):
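
$$ a = \arccos\frac{B^2 + C^2 - A^2}{2BC}, \qquad b = \arccos\frac{A^2 + C^2 - B^2}{2AC}, \qquad c = \arccos\frac{A^2 + B^2 - C^2}{2AB} $$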

The same in Python code:

    # Law of cosines (note: acos must be imported, e.g. from sympy import acos)
    a = acos((B*B + C*C - A*A) / (2*B*C))
    b = acos((A*A + C*C - B*B) / (2*A*C))
    c = acos((A*A + B*B - C*C) / (2*A*B))

Finally, we calculate 𝜃2 and 𝜃3:

    theta2 = pi/2 - a - atan2(WC[2]-0.75, sqrt(WC[0]*WC[0]+WC[1]*WC[1])-0.35)
    theta3 = pi/2 - (b + 0.036)  # 0.036 rad = atan2(0.054, 1.5) accounts for the -0.054m sag in link_4

Inverse Orientation

For the Inverse Orientation problem, we need to find values of the final three joint variables 𝜃4, 𝜃5 and 𝜃6.

Using the individual DH transforms we can obtain the resultant transform and hence resultant rotation by:

R0_6 = R0_1 * R1_2 * R2_3 * R3_4 * R4_5 * R5_6

Since the overall RPY (Roll-Pitch-Yaw) rotation between base_link and gripper_link must be equal to the product of the individual rotations between the respective links, the following holds true:

R0_6 = R_EE

where,

R_EE = Homogeneous RPY rotation between base_link and gripper_link as calculated above.

We can substitute the values we calculated for 𝜃1, 𝜃2, and 𝜃3 into their respective individual rotation matrices and pre-multiply both sides of the above equation by inv(R0_3), which leads to:

R3_6 = inv(R0_3) * R_EE

The resultant matrix on the RHS (Right Hand Side of the equation) does not have any variables after substituting the joint angle values, and hence comparing LHS (Left Hand Side of the equation) with RHS will result in equations for 𝜃4, 𝜃5 and 𝜃6.
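
Concretely, writing r_ij for the entry in row i, column j of R3_6, the Euler-angle extraction used in the code further below is (up to the sign choice handled by the if/else):

$$ \theta_4 = \operatorname{atan2}(r_{33}, -r_{13}), \qquad \theta_5 = \operatorname{atan2}\!\left(\sqrt{r_{13}^2 + r_{33}^2},\; r_{23}\right), \qquad \theta_6 = \operatorname{atan2}(-r_{22}, r_{21}) $$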

    # Extract rotation matrix R0_3 from transformation matrix T0_3, then substitute the angles q1-q3
    R0_3 = T0_1[0:3,0:3] * T1_2[0:3,0:3] * T2_3[0:3,0:3]
    R0_3 = R0_3.evalf(subs={q1: theta1, q2: theta2, q3:theta3})

    # Get rotation matrix R3_6 from (inverse of R0_3 * R_EE)
    R3_6 = R0_3.inv(method="LU") * ROT_EE

I have added if/else to select the best solution for 𝜃4, 𝜃5 and 𝜃6.

    # Euler angles from rotation matrix
    theta5 = atan2(sqrt(R3_6[0,2]*R3_6[0,2] + R3_6[2,2]*R3_6[2,2]), R3_6[1,2])

    # select best solution based on theta5
    if (theta5 > pi):
        theta4 = atan2(-R3_6[2,2], R3_6[0,2])
        theta6 = atan2(R3_6[1,1], -R3_6[1,0])
    else:
        theta4 = atan2(R3_6[2,2], -R3_6[0,2])
        theta6 = atan2(-R3_6[1,1], R3_6[1,0])

I have also added a forward kinematics evaluation to the code to help check for errors:

FK = T0_7.evalf(subs={q1:theta1,q2:theta2,q3:theta3,q4:theta4,q5:theta5,q6:theta6})

The rest of the code utilizes the wrist center position WC and the thetas to calculate the corresponding errors. Using these error values as a basis, we can gauge how well the current IK performs.

I have added one line of code to print out the test case number; the rest of the code is as provided.

    # Print test case number
    print ("Using Test Case Number %d" %test_case_number)

The output of all three provided test cases is as follows:

Test Case 1 output:

Total run time to calculate joint angles from pose is 0.6331 seconds
Using Test Case Number 1

Wrist error for x position is: 0.00000046
Wrist error for y position is: 0.00000032
Wrist error for z position is: 0.00000545
Overall wrist offset is: 0.00000548 units

Theta 1 error is: 0.00093770
Theta 2 error is: 0.00181024
Theta 3 error is: 0.00205031
Theta 4 error is: 0.00172067
Theta 5 error is: 0.00197873
Theta 6 error is: 0.00251871

**These theta errors may not be a correct representation of your code, due to the fact            
that the arm can have multiple positions. It is best to add your forward kinematics to            
confirm whether your code is working or not**
 

End effector error for x position is: 0.00002010
End effector error for y position is: 0.00001531
End effector error for z position is: 0.00002660
Overall end effector offset is: 0.00003668 units 

Test Case 2 output:

Total run time to calculate joint angles from pose is 0.6635 seconds
Using Test Case Number 2

Wrist error for x position is: 0.00002426
Wrist error for y position is: 0.00000562
Wrist error for z position is: 0.00006521
Overall wrist offset is: 0.00006980 units

Theta 1 error is: 3.14309971
Theta 2 error is: 0.27930449
Theta 3 error is: 1.86835102
Theta 4 error is: 3.08639294
Theta 5 error is: 0.06340564
Theta 6 error is: 6.13524247

**These theta errors may not be a correct representation of your code, due to the fact            
that the arm can have multiple positions. It is best to add your forward kinematics to            
confirm whether your code is working or not**
 

End effector error for x position is: 0.00002566
End effector error for y position is: 0.00002581
End effector error for z position is: 0.00000461
Overall end effector offset is: 0.00003668 units 

Test Case 3 output:

Total run time to calculate joint angles from pose is 0.6569 seconds
Using Test Case Number 3

Wrist error for x position is: 0.00000503
Wrist error for y position is: 0.00000512
Wrist error for z position is: 0.00000585
Overall wrist offset is: 0.00000926 units

Theta 1 error is: 0.00136747
Theta 2 error is: 0.00325738
Theta 3 error is: 0.00339563
Theta 4 error is: 6.28212730
Theta 5 error is: 0.00284405
Theta 6 error is: 6.28223850

**These theta errors may not be a correct representation of your code, due to the fact            
that the arm can have multiple positions. It is best to add your forward kinematics to            
confirm whether your code is working or not**
 

End effector error for x position is: 0.00000069
End effector error for y position is: 0.00000011
End effector error for z position is: 0.00003668
Overall end effector offset is: 0.00003668 units 

Python code for the inverse kinematics debug script, IK_debug.py, is located at this link

Project Implementation

In order to test the IK code in the simulator, I did the following:

  • Copied the relevant functions/code from IK_debug.py to IK_server.py
  • Further optimized the code by moving computations that do not change out of the service loop and pre-evaluating calculations that involve only constants.
  • Modified the shell script safe_spawner.sh to automatically run IK_server.py.
  • Changed the launch file inverse_kinematics.launch to set the "demo" value to "false".

Python code for the inverse kinematics server, IK_server.py, is located at this link

Issues faced during project

  • Handling the VMware virtual machine image was difficult considering its large size (4.5+ GB compressed and 12+ GB uncompressed). I decided to install ROS Kinetic and Gazebo 7.9 on Ubuntu 16.04 LTS with all required dependencies and updates. It took me some time, but I felt it was much more beneficial for learning Ubuntu/ROS/Gazebo/RViz installation, update, upgrade, and package management.

  • Gazebo was crashing frequently; I got better stability after updating it to the 7.9 release. killall gzserver was very handy for clearing stuck processes after a crash.

  • When compiling with catkin_make I was getting a "cannot convert to bool" error. I resolved it by adding static_cast<bool>(); see this

  • Slowness of the forward and inverse kinematics was one of the issues I faced at the initial stage of writing and testing the code. I improved performance by removing unneeded SymPy simplify calls, and improved it further by pre-calculating the equations that involve only constants.

  • When I use "continue" in RViz, it doesn't give the gripper enough time to close, so it wasn't picking up the cylinder. I added a delay at line 327 of /src/trajectory_sampler.cpp and changed the following code:

 // changed the value for close_gripper from 0.02 to 0.03 to allow proper grasping
 // Set finger joint values
  if (close_gripper)
  {
    gripper_joint_positions[0] = 0.03;  // radians 
    gripper_joint_positions[1] = 0.03;  // radians
  }
  else
  {
    gripper_joint_positions[0] = -0.01;  // radians
    gripper_joint_positions[1] = -0.01;  // radians
  }
  • rosdep install --from-paths src --ignore-src -r -y was a lifesaver for finding missing ROS packages and installing them automatically.

Future improvements

  • I recommend adding manual ROS/Ubuntu/Gazebo installation as one of the supported platforms for this NanoDegree and showing the required steps; it would be a great added value for students to learn how to install these tools on their own PCs or dev boards. Starting a new dedicated ROS NanoDegree would be great, and I would surely be one of the first to enroll.

  • I think the performance of the IK would be much better if some or all of the code were converted from SymPy to NumPy; see this link for more info, and the sketch after this list.

  • Wrap all symbols and symbolic matrices in a class that is created once when IK_server.py starts; this would speed up the code, since the matrices and symbols would be built only once instead of on every call.
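
As a minimal sketch of the SymPy-to-NumPy idea, one could compile the symbolic transform once with sympy.lambdify (this assumes the symbolic T0_7 and the joint symbols q1..q6 from the FK section are in scope; the name FK_fast is just for illustration):

from sympy import lambdify

# One-time cost at startup: compile the symbolic 4x4 transform into a plain NumPy function.
FK_fast = lambdify((q1, q2, q3, q4, q5, q6), T0_7, modules='numpy')

# Per-request cost: fast numeric evaluation instead of symbolic substitution.
ee_pose = FK_fast(0.0, 0.44, 0.0, 0.0, 0.0, 0.0)  # returns a 4x4 numpy array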

Recorded Test video

The full testing video, after implementing all the above mentioned steps, is uploaded to YouTube at the following link:

test video
