
youtalk / iknet

License: Apache-2.0
Inverse kinematics estimation of ROBOTIS Open Manipulator X with neural networks

Programming Languages

Python

Projects that are alternatives of or similar to iknet

dynamixel_control
ros2_control packages for ROBOTIS Dynamixel
Stars: ✭ 69 (+155.56%)
Mutual labels:  dynamixel, ros2, robotis
turtlebot3_msgs
ROS msgs package for TurtleBot3
Stars: ✭ 53 (+96.3%)
Mutual labels:  dynamixel, robotis
emanual
Welcome to the ROBOTIS e-Manual! The e-Manual page rendered from this repository is available for everyone; simply click the provided link below. :)
Stars: ✭ 105 (+288.89%)
Mutual labels:  dynamixel, robotis
ROBOTIS-OP3
ROS packages for the ROBOTIS OP3
Stars: ✭ 56 (+107.41%)
Mutual labels:  dynamixel, robotis
open_manipulator_simulations
ROS Simulation for OpenManipulator
Stars: ✭ 15 (-44.44%)
Mutual labels:  dynamixel, robotis
dynamixel-workbench
ROS packages for Dynamixel controllers, msgs, single_manager, toolbox, tutorials
Stars: ✭ 91 (+237.04%)
Mutual labels:  dynamixel, robotis
rmw_ecal
ROS2 middleware based on eCAL
Stars: ✭ 30 (+11.11%)
Mutual labels:  ros2
astuff_sensor_msgs
A set of messages specific to each sensor supported by AutonomouStuff.
Stars: ✭ 37 (+37.04%)
Mutual labels:  ros2
isaac_ros_visual_odometry
Visual odometry package based on the hardware-accelerated NVIDIA Elbrus library, with world-class quality and performance.
Stars: ✭ 101 (+274.07%)
Mutual labels:  ros2
HexapodHDA
University Project: Design of a six-legged Hexapod with 3 DoF at each leg. Communication and control implementation on an Arduino 2560.
Stars: ✭ 20 (-25.93%)
Mutual labels:  dynamixel
EvoArm
An open-source 3D-printable robotic arm
Stars: ✭ 114 (+322.22%)
Mutual labels:  dynamixel
ROS
Notes on learning the ROS robot operating system (written in the summer of 2020)
Stars: ✭ 102 (+277.78%)
Mutual labels:  ros2
raspimouse_ros2_examples
ROS 2 examples for Raspberry Pi Mouse
Stars: ✭ 29 (+7.41%)
Mutual labels:  ros2
li_slam_ros2
ROS 2 package for tightly coupled LiDAR-inertial NDT/GICP SLAM
Stars: ✭ 160 (+492.59%)
Mutual labels:  ros2
reachy-legacy
A 7-DoF prosthetic robotic arm.
Stars: ✭ 66 (+144.44%)
Mutual labels:  dynamixel
realant
RealAnt robot platform for low-cost, real-world reinforcement learning
Stars: ✭ 40 (+48.15%)
Mutual labels:  dynamixel
ros2-ORB_SLAM2
ROS2 node wrapping the ORB_SLAM2 library
Stars: ✭ 41 (+51.85%)
Mutual labels:  ros2
DDS-Router
The DDS Router is an application developed by eProsima that uses Fast DDS to bridge DDS communication between different networks.
Stars: ✭ 34 (+25.93%)
Mutual labels:  ros2
bootFromUSB
Boot an NVIDIA Jetson Nano Developer Kit from a USB mass-storage device (Jetson Nano A02, B01, and 2GB, and possibly Jetson TX1)
Stars: ✭ 96 (+255.56%)
Mutual labels:  jetson-nano
Awesome Robotic Tooling
Tooling for professional robotic development in C++ and Python with a touch of ROS, autonomous driving and aerospace.
Stars: ✭ 1,876 (+6848.15%)
Mutual labels:  ros2

IKNet: Inverse kinematics neural networks

IKNet is an inverse kinematics estimator built on simple neural networks. This repository also contains training and test datasets collected by manually moving the 4-DoF manipulator ROBOTIS Open Manipulator X.

IKNet can be trained and tested on an NVIDIA Jetson Nano 2GB, other Jetson-family devices, or a PC with or without an NVIDIA GPU. Training needs about 900 MB of GPU memory under the default options.
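For orientation, the sketch below shows the kind of network this implies: a small multilayer perceptron mapping a 7-dimensional pose to 4 joint angles. The hidden sizes and dropout are taken from the iknet_inference.py log further below; the class itself is illustrative, not the repository's exact code.

import torch
import torch.nn as nn

class IKNet(nn.Module):
    """Illustrative MLP: pose (x, y, z, qx, qy, qz, qw) -> 4 joint angles."""

    def __init__(self, hidden=(400, 300, 200, 100, 50), dropout=0.1):
        super().__init__()
        layers, in_dim = [], 7
        for h in hidden:
            layers += [nn.Linear(in_dim, h), nn.ReLU(), nn.Dropout(dropout)]
            in_dim = h
        layers.append(nn.Linear(in_dim, 4))
        self.net = nn.Sequential(*layers)

    def forward(self, pose):
        return self.net(pose)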

Data collection

Set up

Install ROS 2 on Ubuntu 18.04 by following the ROBOTIS e-Manual.

https://emanual.robotis.com/docs/en/platform/openmanipulator_x/ros2_setup/#ros-setup

Then build some additional packages that modify the open_manipulator_msgs/msg/KinematicsPose message to add a timestamp.

$ mkdir -p ~/ros2/src && cd ~/ros2/src
$ git clone https://github.com/youtalk/open_manipulator.git -b kinematics-pose-header
$ git clone https://github.com/youtalk/open_manipulator_msgs.git -b kinematics-pose-header
$ cd ~/ros2
$ colcon build
$ . install/setup.bash

Demo

First launch the Open Manipulator X controller, then turn the servos off so that the arm can be moved around by hand.

$ ros2 launch open_manipulator_x_controller open_manipulator_x_controller.launch.py
$ ros2 service call /set_actuator_state open_manipulator_msgs/srv/SetActuatorState

Then collect pairs of kinematics poses and joint angles by recording the /kinematics_pose and /joint_states topics in CSV format.

$ ros2 topic echo --csv /kinematics_pose > kinematics_pose.csv & \
  ros2 topic echo --csv /joint_states > joint_states.csv

Finally, prepend header rows so that the files can be loaded as pandas DataFrames.

$ sed -i "1s/^/sec,nanosec,frame_id,position_x,position_y,position_z,orientation_x,orientation_y,orientation_z,orientation_w,max_accelerations_scaling_factor,max_velocity_scaling_factor,tolerance\n/" kinematics_pose.csv
$ sed -i "1s/^/sec,nanosec,frame_id,name0,name1,name2,name3,name4,position0,position1,position2,position3,position4,velocity0,velocity1,velocity2,velocity3,velocity4,effort0,effort1,effort2,effort3,effort4\n/" joint_states.csv

[Figure: IKNet data collection with Open Manipulator X]

Training

Set up

Install PyTorch and related packages.

$ conda install pytorch cudatoolkit=11.0 -c pytorch
$ pip3 install pytorch-pfn-extras matplotlib

Demo

Train IKNet with the training dataset inside the dataset/train directory, or with one you have prepared yourself. The bundled dataset/train data contains a 5-minute movement sampled at 100 Hz.

Training may stop before the maximum number of epochs if the early-stopping trigger fires.

$ python3 iknet_training.py --help
usage: iknet_training.py [-h] [--kinematics-pose-csv KINEMATICS_POSE_CSV]
                         [--joint-states-csv JOINT_STATES_CSV] [--train-val-ratio TRAIN_VAL_RATIO]
                         [--batch-size BATCH_SIZE] [--epochs EPOCHS] [--lr LR] [--save-model]

optional arguments:
  -h, --help            show this help message and exit
  --kinematics-pose-csv KINEMATICS_POSE_CSV
  --joint-states-csv JOINT_STATES_CSV
  --train-val-ratio TRAIN_VAL_RATIO
  --batch-size BATCH_SIZE
  --epochs EPOCHS
  --lr LR
  --save-model

$ python3 iknet_training.py
epoch       iteration   train/loss  lr          val/loss
1           3           0.0188889   0.01        0.0130676
2           6           0.0165503   0.01        0.0132546
3           9           0.0167138   0.01        0.0134633
...
61          183         0.00267084  0.01        0.00428417
62          186         0.00266047  0.01        0.00461381
63          189         0.00260262  0.01        0.00461737

The training can be run even on an NVIDIA Jetson Nano 2GB.

[Figure: IKNet training on an NVIDIA Jetson Nano 2GB]

The loss is the L1 norm of the joint-angle error, so the final network reached an average accuracy of 0.00461737 rad.

[Figure: train/loss and val/loss]
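For concreteness, a single training step under this objective looks roughly like the sketch below. The stand-in model and the Adam optimizer are assumptions; only the L1 loss and the 0.01 learning rate come from the output above.

import torch
import torch.nn as nn

# Stand-in model with IKNet's input/output shape (7 -> 4); see the sketch above.
model = nn.Sequential(nn.Linear(7, 50), nn.ReLU(), nn.Linear(50, 4))
criterion = nn.L1Loss()  # train/loss and val/loss report this metric [rad]
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)  # optimizer is an assumption

pose = torch.rand(16, 7)    # batch of recorded poses (x, y, z, qx, qy, qz, qw)
angles = torch.rand(16, 4)  # corresponding recorded joint angles

optimizer.zero_grad()
loss = criterion(model(pose), angles)
loss.backward()
optimizer.step()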

Test

Demo

Evaluate the accuracy of IKNet with the test dataset inside the dataset/test directory, or with one you have prepared yourself. The bundled dataset/test data contains a 1-minute movement sampled at 100 Hz.

$ python3 iknet_test.py --help
usage: iknet_test.py [-h] [--kinematics-pose-csv KINEMATICS_POSE_CSV]
                     [--joint-states-csv JOINT_STATES_CSV] [--batch-size BATCH_SIZE]

optional arguments:
  -h, --help            show this help message and exit
  --kinematics-pose-csv KINEMATICS_POSE_CSV
  --joint-states-csv JOINT_STATES_CSV
  --batch-size BATCH_SIZE

$ python3 iknet_test.py
Total loss = 0.006885118103027344
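Under the hood, the evaluation amounts to an L1 loss accumulated over the test set without gradient tracking; the helper below is an illustrative sketch, and the exact reduction behind the reported total is an assumption.

import torch
import torch.nn.functional as F

def evaluate(model, loader, device):
    """Mean L1 loss over a DataLoader of (pose, joint_angles) batches."""
    model.eval()
    total = 0.0
    with torch.no_grad():
        for pose, angles in loader:
            total += F.l1_loss(model(pose.to(device)), angles.to(device)).item()
    return total / len(loader)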

Inference

Demo

Estimate inverse kinematics with IKNet on the Open Manipulator X. First launch the Open Manipulator X controller.

$ ros2 launch open_manipulator_x_controller open_manipulator_x_controller.launch.py

Then run iknet_inference.py to enter a pose (position and orientation) and move the robot. Note that the orientation is given as a quaternion (qx, qy, qz, qw).

$ . ~/ros2/install/setup.bash
$ python3 iknet_inference.py --help
usage: iknet_inference.py [-h] [--model MODEL] [--trt] [--x X] [--y Y] [--z Z]
                          [--qx QX] [--qy QY] [--qz QZ] [--qw QW]

optional arguments:
  -h, --help     show this help message and exit
  --model MODEL
  --trt
  --x X
  --y Y
  --z Z
  --qx QX
  --qy QY
  --qz QZ
  --qw QW

$ python3 iknet_inference.py --x 0.1 --z 0.1
input dimentsions: [400, 300, 200, 100, 50]
dropout: 0.1
input: tensor([0.1000, 0.0000, 0.1000, 0.0000, 0.0000, 0.0000, 1.0000],
       device='cuda:0')
output: tensor([-0.0769, -0.9976,  1.3582, -0.2827], device='cuda:0',
       grad_fn=<AddBackward0>)

[Figure: Inverse kinematics estimation by IKNet]
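Stripped of the ROS plumbing, the estimation step reduces to a single forward pass, as in the hedged sketch below; the stand-in model replaces the trained network that the script actually loads.

import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for the trained network (load a saved model in practice).
model = nn.Sequential(nn.Linear(7, 50), nn.ReLU(), nn.Linear(50, 4)).to(device)
model.eval()

# Matches `--x 0.1 --z 0.1`: unset components default to 0 and qw to 1.
pose = torch.tensor([[0.1, 0.0, 0.1, 0.0, 0.0, 0.0, 1.0]], device=device)
with torch.no_grad():
    joints = model(pose)  # four joint angles [rad] for the manipulator
print(joints)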

Demo with TensorRT

If you would like to run inference with TensorRT, first convert the PyTorch model to a TensorRT-enabled model using torch2trt.

$ python3 iknet_trt_export.py --help
usage: iknet_trt_export.py [-h] [--input-model INPUT_MODEL] [--output-model OUTPUT_MODEL]

optional arguments:
  -h, --help            show this help message and exit
  --input-model INPUT_MODEL
  --output-model OUTPUT_MODEL
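The conversion itself follows the standard torch2trt pattern sketched below; the stand-in model and file names are assumptions that mirror the command in the next step.

import torch
import torch.nn as nn
from torch2trt import torch2trt

# Stand-in for the trained network; in practice load the model saved by
# iknet_training.py --save-model.
model = nn.Sequential(nn.Linear(7, 50), nn.ReLU(), nn.Linear(50, 4))
model = model.eval().cuda()

x = torch.rand(1, 7).cuda()        # example input used to trace the network
model_trt = torch2trt(model, [x])  # build the TensorRT engine
torch.save(model_trt.state_dict(), "iknet-trt.pth")

At inference time, a torch2trt model is restored through torch2trt.TRTModule and load_state_dict rather than rebuilt, which is presumably what the --trt flag switches on.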

Then run iknet_inference.py as described above, adding the --trt and --model options.

$ python3 iknet_inference.py --trt --model iknet-trt.pth --x 0.1 --z 0.1
input dimentsions: [400, 300, 200, 100, 50]
dropout: 0.1
input: tensor([0.1000, 0.0000, 0.1000, 0.0000, 0.0000, 0.0000, 1.0000],
       device='cuda:0')
output: tensor([-0.0769, -0.9976,  1.3582, -0.2827], device='cuda:0',
       grad_fn=<AddBackward0>)

References

  • Theofanidis, M., Sayed, S., Cloud, J., Brady, J., & Makedon, F. (2018). Kinematic Estimation with Neural Networks for Robotic Manipulators. In Artificial Neural Networks and Machine Learning – ICANN 2018, Rhodes, Greece, October 4–7, 2018, Proceedings, Part III. doi:10.1007/978-3-030-01424-7_77
  • Duka, A.-V. (2014). Neural Network based Inverse Kinematics Solution for Trajectory Tracking of a Robotic Arm. Procedia Technology, 12, 20–27. doi:10.1016/j.protcy.2013.12.451