jhu-asco / aerial_autonomy

License: MPL-2.0
Easily extendable package for interacting with and defining state machines for autonomous aerial systems

Programming Languages

C++, Python, CMake, C, Shell, MATLAB


ASCO Aerial Autonomy

Introduction

The Doxygen documentation for the project can be found here.

Setup

Run the setup script in scripts/setup/setup.sh to configure Git hooks.
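For example, from the root of the cloned repository (assuming the script is executable):

./scripts/setup/setup.sh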

Install the following dependencies (lcov, protobuf, doxygen, doxypy, coverxygen, google-glog, class-loader). On Ubuntu 18.04, run the following in a terminal (for other Ubuntu releases, replace melodic with your ROS distribution):

sudo apt-get install lcov protobuf-compiler libprotobuf-dev doxygen doxypy libgoogle-glog-dev ros-melodic-class-loader ros-melodic-ar-track-alvar-msgs autoconf python-pip ros-melodic-serial ros-melodic-map-server libarmadillo-dev
sudo pip install coverxygen

Install protobuf 3.1. (Alternatively, protobuf 3.0.0, which is the default with ROS Melodic, can be used and these steps can be skipped. Check the installed version with protoc --version.)

git clone https://github.com/google/protobuf.git
cd protobuf
git checkout v3.1.0
./autogen.sh
./configure
make
sudo make install
sudo ldconfig
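To confirm that the newly installed compiler is the one being picked up, check the version again:

protoc --version  # should now report libprotoc 3.1.0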

Install googletest release 1.8.0. This version fixes a bug with ASSERT_TRUE as explained here. To install googletest, follow these steps

git clone https://github.com/google/googletest.git
cd googletest
git checkout release-1.8.0
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=ON -DBUILD_GMOCK=ON -DBUILD_GTEST=ON
make
sudo make install
sudo ldconfig

Install OpenCV with OpenCV Contrib (the version must include the tracking module). Follow the steps here to install from source on Ubuntu 18.04. If a version of OpenCV is already installed on your system, you may want to install that same version from source. Note: the source code for OpenCV 3.2.0 has an extra else statement on line 21 of cmake/OpenCVCompilerOptions.cmake; this block needs to be removed. The following commands can be used to check whether your system already has a version of OpenCV installed:

pkg-config --modversion opencv
python3 -c "import cv2; print(cv2.__version__)"
python2 -c "import cv2; print(cv2.__version__)"

Install our GCOP (Geometric Control, Optimization, and Planning) package. Build it with support for casadi (USE_CASADI) and install the dependencies listed in the GCOP README. Run the following after the required and optional dependencies from the GCOP README (numbers 5 and 6) have been installed:

git clone https://github.com/jhu-asco/gcop.git
cd gcop
mkdir build
cd build
cmake -DUSE_CASADI=ON ..
sudo make install

Create a ROS workspace (a minimal workspace setup is sketched after the commands below). Run the following in your ROS workspace src folder to set up the UAV hardware drivers:

git clone -b hydro-devel https://github.com/jhu-asco/quadcopter_parsers.git
git clone -b 3.2.3 https://github.com/jhu-asco/Onboard-SDK-ROS.git
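If you do not already have a catkin workspace, a minimal catkin-tools setup might look like this (the ~/catkin_ws path is illustrative):

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin init
cd src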

Install gcop_comm for trajectory visualization (other packages in the repository can be ignored) by cloning it into the ROS workspace src folder:

git clone -b hydro-devel https://github.com/jhu-asco/gcop_ros_packages.git

Optional: Manipulator packages

Optionally, to install drivers related to aerial manipulation, run the following in your ROS src folder

git clone https://git.lcsr.jhu.edu/mshecke1/arm_plugins.git
git clone https://git.lcsr.jhu.edu/ggarime1/controllers.git
git clone https://git.lcsr.jhu.edu/ggarime1/dynamixelsdk.git

Build

This package can be cloned into the same ROS workspace src folder and built with catkin build. Be sure to source the workspace's devel/setup.bash.
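Assuming the workspace layout sketched above and that the repository is cloned from GitHub (the URL below reflects the jhu-asco organization and may need adjusting), the build might look like:

cd ~/catkin_ws/src
git clone https://github.com/jhu-asco/aerial_autonomy.git
cd ..
catkin build aerial_autonomy
source devel/setup.bash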

Arm Plugins

Building with arm plugins can be turned off by setting the USE_ARM_PLUGINS CMake argument to OFF:

catkin build -DUSE_ARM_PLUGINS=OFF

When arm plugins are not needed, disabling them is recommended so that the code compiles faster and uses fewer system resources.

Running Executables

The package provides a uav_system_node executable which loads a state machine and hardware and waits for event commands from a ROS topic. The rqt_aerial_autonomy_gui script provides a GUI to generate events for the state machine. The rqt plugin can be loaded along with rqt_rviz in the rqt_gui framework.

The simulator.launch file in the launch folder executes the state machine node using simulated hardware. The GUI can be launched individually using rosrun, or with simulator_rqt_aerial_autonomy.launch. The steps to launch a simulated quadrotor with the state machine are

roslaunch aerial_autonomy simulator.launch
roslaunch aerial_autonomy simulator_rqt_aerial_autonomy.launch  # In a separate tab

Running Tests

To build and run tests, use catkin build aerial_autonomy --catkin-make-args run_tests. The output of an individual test can be checked using rosrun aerial_autonomy test_name. To see all test outputs, run catkin run_tests --this.
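For reference, these commands might be run as follows (test_name is a placeholder for an individual test executable):

catkin build aerial_autonomy --catkin-make-args run_tests
rosrun aerial_autonomy test_name
catkin run_tests --this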

Logging

GLOG is used to log messages from the state machine. Messages are divided into severity levels (INFO, WARNING, ERROR, etc.). Informational messages are further divided into verbosity levels (0, 1, 2, and so on). The verbosity level can be adjusted using the GLOG_v environment variable; if it is set to 1 (export GLOG_v=1), all messages with verbosity 0 and 1 are streamed to stderr.

Log messages are also recorded in log files in the logs folder. The symbolic links uav_system_node.INFO and uav_system_node.WARNING in that folder point to the latest log files. The log directory can be changed using the GLOG_log_dir environment variable. More information about the log files can be found in the Google Log documentation.
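For example, to increase verbosity and redirect the log files (the directory below is illustrative and should exist before launching):

export GLOG_v=1                     # stream verbosity levels 0 and 1 to stderr
export GLOG_log_dir=/tmp/uav_logs   # write log files here instead of the default logs folder
roslaunch aerial_autonomy simulator.launch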

The simulator launch file introduced above allows specifying the log level and log directory using the roslaunch arguments log_level and log_dir, respectively. For example:

roslaunch aerial_autonomy simulator.launch log_level:=1  # Prints all the verbose log messages with priority 0 and 1.
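Both arguments can be combined in the same invocation (the directory is illustrative):

roslaunch aerial_autonomy simulator.launch log_level:=1 log_dir:=/tmp/uav_logs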

Style

This repository uses clang-format for style checking. Pre-commit hooks ensure that all staged files conform to the style conventions. To skip pre-commit hooks and force a commit, use git commit -n.
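To format changed files manually before staging, something like the following can be used (assuming a .clang-format file is present at the repository root; the file paths are placeholders):

clang-format -i -style=file path/to/changed_file.h path/to/changed_file.cpp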

Documentation Coverage

We use coverxygen (https://github.com/psycofdj/coverxygen) to generate documentation coverage for the docs.

Use the script scripts/generate_documentation_coverage.bash to generate documentation coverage statistics in the .documentation_coverage_info folder. Check the HTML page at .documentation_coverage_info/index.html to verify the documentation coverage of the code.
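A typical invocation from the repository root might be:

./scripts/generate_documentation_coverage.bash
xdg-open .documentation_coverage_info/index.html  # or open the page in any browser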

Documentation coverage is also checked by a pre-push hook, which verifies that at least 95% of the code is documented before pushing to the remote. The check is skipped for branches whose names start with develop, and can also be bypassed using git push --no-verify.

Test Coverage

We use lcov to generate the test coverage report in the .test_coverage_info folder. The script scripts/generate_test_coverage.bash runs the tests in the project and generates the coverage report into that folder. The script itself is generated by CMake, so run catkin build aerial_autonomy to create it. Check the HTML page .test_coverage_info/index.html for the line and function coverage.
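A typical sequence might be:

catkin build aerial_autonomy            # generates the coverage script via CMake
./scripts/generate_test_coverage.bash   # runs the tests and writes .test_coverage_info
xdg-open .test_coverage_info/index.html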

The test coverage check is integrated into the pre-push hook, which runs the coverage generation script above and verifies that the coverage level is above the 95% threshold. It can be skipped by either naming the branch develop[your_branch_name] or using git push --no-verify.

Uploading documentation

The documentation is uploaded through the gh-pages branch. The docs are created on master and pushed to the gh-pages branch using the scripts/applydocs.bash script. The script checks that there are no uncommitted changes before uploading documentation, to avoid issues with git. It also requires that you explicitly link the gh-pages branch to the remote using git branch --set-upstream-to=[GH_PAGES_REMOTE].
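Assuming the remote is named origin and a local gh-pages branch exists, the linking step might look like:

git branch --set-upstream-to=origin/gh-pages gh-pages
./scripts/applydocs.bash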

Generating Visual graphs from state machines

The script scripts/generate_dot_files.py converts the transition tables in the state machines to dot and png formats. It automatically runs through all the state machines stored in the include/aerial_autonomy/state_machines folder.

Usage: ./generate_dot_files.py

Hand-eye Calibration

This section describes how to automatically calibrate a transform from a camera to an arm.

Data Collection

Attach an AR tag to the end effector of your arm.

Use rosbag record /ar_pose_marker /your_end_effector_position where /your_end_effector_position is published by your arm driver and gives the position of the end effector in the arm frame (probably based on forward kinematics).

Launch ar_track_alvar and move the arm around so that the end effector AR tag is visible in the camera.

Extract the AR marker data from the bag file to a csv: rostopic echo -b your_data.bag -p --nostr /ar_pose_marker > marker_data.csv
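Put together, the data-collection commands might look like this (topic and file names other than /ar_pose_marker are illustrative):

rosbag record -O your_data.bag /ar_pose_marker /your_end_effector_position
rostopic echo -b your_data.bag -p --nostr /ar_pose_marker > marker_data.csv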

Calibration script

Use the MATLAB script scripts/calib/arm_camera_calib.m along with your_data.bag and marker_data.csv to generate the calibrated transform. The script uses non-linear least squares to find the hand-eye transformation.
