
vibhuthasak / Obstacle_Avoidance_ROS

Licence: other
Autonomous Obstacle Avoidance Robot ROS + GAZEBO

Programming Languages

CMake
9771 projects
Python
139335 projects - #7 most used programming language
C++
36643 projects - #6 most used programming language

Projects that are alternatives to or similar to Obstacle Avoidance ROS

leobot
LeoBot telepresence robot
Stars: ✭ 19 (-63.46%)
Mutual labels:  ros-kinetic
summit xl sim
Packages for the simulation of the Summit XL, Summit XL HL and Summit-X (including X-WAM) robots
Stars: ✭ 32 (-38.46%)
Mutual labels:  ros-kinetic
SASensorProcessing
ROS node to create pointcloud out of stereo images from the KITTI Vision Benchmark Suite
Stars: ✭ 26 (-50%)
Mutual labels:  ros-kinetic
robotiq 2finger grippers
ROS packages enabling the control, visualization and simulation of the Robotiq 2 Fingers Adaptive Grippers model version C3
Stars: ✭ 59 (+13.46%)
Mutual labels:  ros-kinetic
2019-UGRP-DPoom
2019 DGIST DPoom project under UGRP : SBC and RGB-D camera based full autonomous driving system for mobile robot with indoor SLAM
Stars: ✭ 35 (-32.69%)
Mutual labels:  ros-kinetic
robocar
Built on top of ROS and donkeycar hardware
Stars: ✭ 44 (-15.38%)
Mutual labels:  ros-kinetic
ORB-SLAM2 ROS
A ROS wrapper for ORB-SLAM2
Stars: ✭ 54 (+3.85%)
Mutual labels:  ros-kinetic
annotate
Create 3D labelled bounding boxes in RViz
Stars: ✭ 104 (+100%)
Mutual labels:  ros-kinetic
ROS-GPS
GPS Localization with ROS, OSM and rviz
Stars: ✭ 19 (-63.46%)
Mutual labels:  ros-kinetic
ROS-Intelligent-Service-Robot
A ROS robot supporting voice control, autonomous navigation and robot arm motion.
Stars: ✭ 55 (+5.77%)
Mutual labels:  ros-kinetic
roskinectic src
This is a ROS Kinetic workspace src folder created on Ubuntu 16.04. Here I worked on ROS1 projects like 2D & 3D SLAM, motion planning, a drone swarm, an RL-based drone, a surveillance robot, etc.
Stars: ✭ 44 (-15.38%)
Mutual labels:  ros-kinetic
ros openvino
A ROS package to wrap openvino inference engine and get it working with Myriad and GPU
Stars: ✭ 57 (+9.62%)
Mutual labels:  ros-kinetic
find moving objects
A ROS library that finds moving objects and derives their position and velocity, based on 2D laser scan or 3D point cloud 2 data streams.
Stars: ✭ 23 (-55.77%)
Mutual labels:  ros-kinetic
ddpg biped
Repository for a planar bipedal walking robot in a Gazebo environment using Deep Deterministic Policy Gradient (DDPG) with TensorFlow.
Stars: ✭ 65 (+25%)
Mutual labels:  ros-kinetic
RACLAB
No description or website provided.
Stars: ✭ 33 (-36.54%)
Mutual labels:  ros-kinetic
motion-planner-reinforcement-learning
End to end motion planner using Deep Deterministic Policy Gradient (DDPG) in gazebo
Stars: ✭ 99 (+90.38%)
Mutual labels:  ros-kinetic
FusionAD
An open source autonomous driving stack by San Jose State University Autonomous Driving Team
Stars: ✭ 30 (-42.31%)
Mutual labels:  ros-kinetic
ros webconsole
🌐 A ROS web console to remotely control your robot. Based on robotwebtools.
Stars: ✭ 71 (+36.54%)
Mutual labels:  ros-kinetic
Optical-Flow-based-Obstacle-Avoidance
Image based obstacle avoidance using optical flow
Stars: ✭ 24 (-53.85%)
Mutual labels:  ros-kinetic
hfsd
This is a ROS package used to detect directions of free space in enclosed areas where sensors fail to get returns
Stars: ✭ 13 (-75%)
Mutual labels:  ros-kinetic

Obstacle_Avoidance_ROS

Project in action: Watch on YouTube

Cloning the project

Create a new directory at <catkin_workspace>/src/testbot_description and clone all of the project files into that folder.

sample commands:

  1. mkdir ~/catkin_ws/src/testbot_description
  2. cd ~/catkin_ws/src/testbot_description
  3. git clone https://github.com/vibhuthasak/Obstacle_Avoidance_ROS.git
  4. cd ~/catkin_ws
  5. catkin_make
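
After building, you may also need to source the workspace so that ROS can find the new package (standard catkin workflow, in case your shell does not do it already):

    source ~/catkin_ws/devel/setup.bash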

Step by Step ROS Command Explanation

  1. First, you need to launch the ROS node.

    roslaunch testbot_description testbot_gazebo.launch

    testbot_description is the package name that I gave, and the testbot_gazebo.launch file can be found in the /launch folder. After executing this command, Gazebo will open as shown below.
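
Optionally, you can confirm that the simulated sensor is publishing by listing the active topics and checking that /scan shows up (assuming the default topic names from the launch file):

    rostopic list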

Everything looking great so far, right?

  2. Now you need to run the Python script that listens to the sensor data from our virtual robot and moves the robot.

    rosrun testbot_description sensor_data_listener.py

Python Script Explanation in brief:

The Python script is available at Obstacle_Avoidance_ROS/scripts/sensor_data_listener.py.

The script simply subscribes to the sensor_msgs.msg.LaserScan messages on the /scan topic. Each incoming message is processed by the LaserScanProcess() function.

rospy.Subscriber("scan", sensor_msgs.msg.LaserScan , LaserScanProcess)

It publishes velocity commands to the /cmd_vel topic using the Twist message type.

pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)

If you echo the /cmd_vel topic using the following command:

rostopic echo cmd_vel

you will see the linear and angular velocities being printed.
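
For reference, here is a minimal sketch of the subscribe-process-publish structure described above. It is not the full script from the repository; the 1 m obstacle threshold and the speed values are illustrative assumptions.

    #!/usr/bin/env python
    # Minimal sketch of the subscribe -> process -> publish structure described above.
    # The real logic is in Obstacle_Avoidance_ROS/scripts/sensor_data_listener.py;
    # the 1.0 m threshold and the turn/forward speeds are illustrative assumptions.
    import rospy
    import sensor_msgs.msg
    from geometry_msgs.msg import Twist

    def LaserScanProcess(scan):
        # scan.ranges holds one distance reading per laser beam
        cmd = Twist()
        if min(scan.ranges) < 1.0:   # something closer than ~1 m: turn in place
            cmd.angular.z = 0.5
        else:                        # path is clear: drive forward
            cmd.linear.x = 0.3
        pub.publish(cmd)

    if __name__ == '__main__':
        rospy.init_node('sensor_data_listener')
        pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
        rospy.Subscriber("scan", sensor_msgs.msg.LaserScan, LaserScanProcess)
        rospy.spin()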

DONE
