
TamasSzepessy / DJITelloOpticalControl

License: MIT
Autonomous navigation and data collection for Tello using mainly OpenCV's ArUco libraries.

Programming Languages

python
139335 projects - #7 most used programming language
matlab
3953 projects

Projects that are alternatives of or similar to DJITelloOpticalControl

robo-playground
Games and examples built for RoboMaster EP with RoboMasterPy | Have fun with your DJI RoboMaster, built on RoboMasterPy
Stars: ✭ 33 (+37.5%)
Mutual labels:  dji, opencv-python
dji-tello
Java API for the DJI Tello Drone.
Stars: ✭ 13 (-45.83%)
Mutual labels:  dji, tello
Node-RED-Tello-Control
Node-RED flows to control the Ryze Tello Drone
Stars: ✭ 121 (+404.17%)
Mutual labels:  dji, tello
openCVtutorials
OpenCV Tutorials
Stars: ✭ 12 (-50%)
Mutual labels:  opencv-python
StructuredLight
Creating a 3D reconstruction of an object using multiple images
Stars: ✭ 42 (+75%)
Mutual labels:  opencv-python
aruco localization
ROS localization node using an ArUco Marker Map.
Stars: ✭ 30 (+25%)
Mutual labels:  aruco
QGISFMV
QGIS Full Motion Video (FMV)
Stars: ✭ 104 (+333.33%)
Mutual labels:  dji
Counting-people-video
Counting the number of people in a video.
Stars: ✭ 60 (+150%)
Mutual labels:  opencv-python
Realtime-OpenCV-Chess
♔ Chess-playing with Open-CV [Human vs AI (Stockfish engine)]
Stars: ✭ 18 (-25%)
Mutual labels:  opencv-python
Image deionising auto encoder
Noise removal from images using Convolutional autoencoder
Stars: ✭ 34 (+41.67%)
Mutual labels:  opencv-python
camera calibration
Code and resources for camera calibration using arUco markers and opencv
Stars: ✭ 38 (+58.33%)
Mutual labels:  opencv-python
MobyCAIRO
Computer-assisted image straightening and cropping
Stars: ✭ 16 (-33.33%)
Mutual labels:  opencv-python
Face-Detection-and-Tracking
Computer Vision model to detect face in the first frame of a video and to continue tracking it in the rest of the video. This is implemented in OpenCV 3.3.0 and Python 2.7
Stars: ✭ 24 (+0%)
Mutual labels:  opencv-python
Face-Detection-and-Tracking
Face Detection and tracking using CamShift, Kalman Filter, Optical Flow
Stars: ✭ 30 (+25%)
Mutual labels:  opencv-python
UAV-Stereo-Vision
A program for controlling a micro-UAV for obstacle detection and collision avoidance using disparity mapping
Stars: ✭ 30 (+25%)
Mutual labels:  opencv-python
haar-cascade-files
A complete collection of Haar-Cascade files. Every Haar-Cascades here!
Stars: ✭ 55 (+129.17%)
Mutual labels:  opencv-python
BlurryFaces
A tool to blur faces or other regions in images and videos 🤡🔍
Stars: ✭ 58 (+141.67%)
Mutual labels:  opencv-python
opencv-proto
Allows fast prototyping in Python for OpenCV
Stars: ✭ 18 (-25%)
Mutual labels:  opencv-python
britishMuseumFacesDetection
A python example for using OpenCV to identify faces within @BritishMuseum images.
Stars: ✭ 66 (+175%)
Mutual labels:  opencv-python
Feature-Detection-and-Matching
Feature Detection and Matching with SIFT, SURF, KAZE, BRIEF, ORB, BRISK, AKAZE and FREAK through the Brute Force and FLANN algorithms using Python and OpenCV
Stars: ✭ 95 (+295.83%)
Mutual labels:  opencv-python

Optical control for Tello drone

Built on Damià Fuentes Escoté's DJITelloPy library (https://github.com/damiafuentes/DJITelloPy).

For my bachelor's thesis I created a control system for the Tello that uses only its built-in monocular camera: autonomous navigation and data collection. Demonstration: https://www.youtube.com/watch?v=B8aU0DVYYco

Tested with Python 3.7, OpenCV 4.1.0 and a 7x7 ArUco dictionary.

Requirements

djitellopy
pygame
opencv-python 4.1.0
opencv-contrib-python 4.1.0
numpy
scipy
pykalman
matplotlib

Instructions

The control system is based on ArUco markers, which must be placed correctly for the drone to navigate through them. Before use, you must generate the ArUco markers with the OpenCV library and list the function of each marker in the configuration file src/marker_list/marker_conf.csv. Each marker type contributes a setpoint in the marker's local coordinate system (decoded by targeter.py), to which the drone navigates using a simple numeric PID control (pid.py); it can then continue its path to the nearest seen marker. All marker path calculations and storage are handled by the Markers class in marker_class.py.
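
The PID loop behind this is straightforward; below is a minimal sketch of the idea, not the exact pid.py implementation (class name, gains and the output limit are illustrative assumptions):

```python
import time

class PID:
    """Minimal discrete PID controller for one control axis (illustrative sketch)."""

    def __init__(self, kp, ki, kd, out_limit=100):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0
        self.prev_time = time.time()

    def update(self, setpoint, measured):
        now = time.time()
        dt = max(now - self.prev_time, 1e-3)
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error, self.prev_time = error, now
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # clamp the output to the RC command range expected by the drone
        return max(-self.out_limit, min(self.out_limit, out))

# one controller per axis, fed with the setpoint decoded from the marker's coordinate system
forward_pid = PID(kp=0.5, ki=0.0, kd=0.1)
```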

You must also calibrate the drone's camera beforehand with OpenCV's chessboard calibration algorithm. In cam_class.py you can set several parameters: the chessboard tile edge length used for calibration, the marker edge length used for measurements, and the edge filtering applied to distorted markers.
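
For reference, the standard OpenCV chessboard calibration flow looks roughly like this; a sketch assuming a 9x6 inner-corner board, a 25 mm tile edge and a list of captured frames (cam_class.py wraps the same calls with its own parameters):

```python
import numpy as np
import cv2

def calibrate(frames, board_size=(9, 6), square_size=0.025):
    """Chessboard calibration sketch; board_size and square_size are assumptions."""
    # 3D corner template in the chessboard's own plane (z = 0), scaled by the tile edge length
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points, img_size = [], [], None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        img_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    _, mtx, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, img_size, None, None)
    # saved in the same spirit as src/calibration_files/camcalib.npz
    np.savez("camcalib.npz", mtx=mtx, dist=dist)
    return mtx, dist
```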

I modified and extended the original djitellopy/tello.py script with Tello state reading, which runs in a separate thread so it does not block the main execution.
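
The Tello broadcasts its state as plain-text key:value pairs over UDP (port 8890). A minimal sketch of such a non-blocking reader thread (class and attribute names are illustrative, not the exact tello.py code):

```python
import socket
import threading

class StateReader:
    """Reads Tello state packets in a background thread (illustrative sketch)."""

    def __init__(self, ip="0.0.0.0", port=8890):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind((ip, port))
        self.state = {}
        self.running = True
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while self.running:
            data, _ = self.sock.recvfrom(1024)
            # packets look like "pitch:0;roll:0;yaw:0;...;bat:87;..."
            text = data.decode("ascii", errors="ignore").strip().rstrip(";")
            self.state = dict(item.split(":", 1) for item in text.split(";") if ":" in item)

reader = StateReader()
# later, in the main loop: battery = reader.state.get("bat")
```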

For flight, run the main.py script, which tries to connect to your Tello automatically. After a successful connection you can take off with the "T" key, navigate with the arrow keys, control altitude with W/S and yaw with A/D; after the flight, press "L" to land the drone.

In flight, you can use the following keys for further functions (a minimal key-handling sketch follows the list):

  • "M" takes a picture and saves it under images

  • "K" starts calibration mode, which waits for a chessboard, then collects 20 samples of it and calibrates camera (the calibration matrix is saved under src/calibration_files/camcalib.npz)

  • "C" starts capturing drone coordinates from seen ArUco markers, using the first as global origin (this can be used in manual flight mode)

  • "O" starts automatic navigation and video capture through a marker path between a placed "Start" and "End" marker

All control is done in the main loop, which calls the separate functions via events and flags. Four threads are running: one for the pygame window and GUI, two for Tello UDP communication and one for parallel video capture. Data is passed thread-safely between them using queues.
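
Passing frames or coordinates between threads through a queue.Queue keeps the exchange thread-safe without explicit locks. A minimal sketch of the pattern (not the exact video_writer.py code; the worker here only drains the queue):

```python
import queue
import threading

frame_queue = queue.Queue(maxsize=64)

def writer_worker():
    """Consumes frames pushed by the main loop and writes them out (sketch)."""
    while True:
        frame = frame_queue.get()
        if frame is None:          # sentinel: the main loop signals shutdown
            break
        # cv2.VideoWriter.write(frame) would go here
        frame_queue.task_done()

threading.Thread(target=writer_worker, daemon=True).start()

# in the main loop:
# frame_queue.put(frame)   # hand the latest camera frame to the writer thread
# frame_queue.put(None)    # on exit, tell the worker to stop
```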

All matrix transformations are done by the functions in transformations.py. The basic principle is that when two markers are seen in the same frame, the transformation matrix between their coordinate systems can be calculated (multiple samples are taken and the matrices between the two markers are averaged). The drone can map its path this way through a chain of transformations from the global origin, with the first seen marker used as the base. All coordinate transformations are done in real time; the global points are then saved.
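
In homogeneous coordinates this chaining is plain matrix multiplication. A sketch with numpy, simplified relative to transformations.py (the averaging of multiple samples is omitted and the poses are made up):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation vector."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(t).ravel()
    return T

# T_0_1: pose of marker 1 in marker 0's frame, T_1_2: pose of marker 2 in marker 1's frame
# (both estimated while the two markers are visible in the same image)
T_0_1 = make_transform(np.eye(3), [1.0, 0.0, 0.0])
T_1_2 = make_transform(np.eye(3), [0.0, 2.0, 0.0])

# chain back to the global origin (the first marker seen)
T_0_2 = T_0_1 @ T_1_2

# a point given in marker 2's coordinate system, expressed in the global frame
p_marker2 = np.array([0.1, 0.0, 0.5, 1.0])
p_global = T_0_2 @ p_marker2
print(p_global[:3])
```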

The remaining scripts are for post processing only:

  • plot3d.py uses Matplotlib Axes3D to show and animate the flight path in a Cartesian coordinate system (a minimal plotting sketch follows this list)

  • video_writer.py runs in a separate thread to capture the drone's camera feed during navigation
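
A minimal version of such a 3D path plot, assuming the recorded coordinates are already loaded into x, y, z arrays (plot3d.py additionally animates the path; the helix below is dummy data):

```python
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3D projection)

# dummy flight path; plot3d.py would load the saved global coordinates instead
t = np.linspace(0, 4 * np.pi, 200)
x, y, z = np.cos(t), np.sin(t), 0.1 * t

fig = plt.figure()
ax = fig.add_subplot(111, projection="3d")
ax.plot(x, y, z, label="flight path")
ax.scatter(x[0], y[0], z[0], color="green", label="start")
ax.scatter(x[-1], y[-1], z[-1], color="red", label="end")
ax.set_xlabel("x [m]"); ax.set_ylabel("y [m]"); ax.set_zlabel("z [m]")
ax.legend()
plt.show()
```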

Documentation

The documentation for the project can be found under doc. It is a bachelor's thesis written at the Budapest University of Technology and Economics, Faculty of Mechatronics. The original is in Hungarian with only an English abstract, but I used the DeepL translator to create a full English version. (Any misinterpretations or broken links stem from the machine translation.)

Author

  • Tamás Szepessy

License

This project is licensed under the MIT License.
