
yazeedalrubyli / SDRC

License: GPL-3.0
Self-Driving RC Car - DIY

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to SDRC

Formula1Epoch
An autonomous R.C. racecar which detects people.
Stars: ✭ 63 (+70.27%)
Mutual labels:  self-driving-car
Self-Driving-Car
Implemented a Convolutional Neural Network for end-to-end driving in a simulator using Tensorflow and Keras. The project involves training over 13,000 images in a unity3d simulator to steer the car successfully throughout the track
Stars: ✭ 29 (-21.62%)
Mutual labels:  self-driving-car
Awesome-Lane-Detection
A paper list with code of lane detection.
Stars: ✭ 34 (-8.11%)
Mutual labels:  self-driving-car
Autonomous-RC-Car
Self-driving RC Car ROS Software
Stars: ✭ 17 (-54.05%)
Mutual labels:  self-driving-car
Lane-Lines-Detection-Python-OpenCV
Lane Lines Detection using Python and OpenCV for self-driving car
Stars: ✭ 77 (+108.11%)
Mutual labels:  self-driving-car
Monocular-Vehicle-Localization
Estimating the orientation and the relative dimensions of vehicles by producing a 3d bounding frame
Stars: ✭ 28 (-24.32%)
Mutual labels:  self-driving-car
end-to-end-deep-learning
Autonomous driving simulation in the Unity engine.
Stars: ✭ 27 (-27.03%)
Mutual labels:  self-driving-car
Lyft-Perception-Challenge
The 4th place and the fastest solution of the Lyft Perception Challenge (Image semantic segmentation with PyTorch)
Stars: ✭ 69 (+86.49%)
Mutual labels:  self-driving-car
Hybrid-A-Star-U-Turn-Solution
Autonomous driving trajectory planning solution for U-Turn scenario
Stars: ✭ 75 (+102.7%)
Mutual labels:  self-driving-car
Traffic-Signs
Second Project of the Udacity Self-Driving Car Nanodegree Program
Stars: ✭ 35 (-5.41%)
Mutual labels:  self-driving-car
fusion-ekf
An extended Kalman Filter implementation in C++ for fusing lidar and radar sensor measurements.
Stars: ✭ 113 (+205.41%)
Mutual labels:  self-driving-car
Auto-Birds-Eye
Bird's eye/Top Down view generation and mapping with deep learning.
Stars: ✭ 129 (+248.65%)
Mutual labels:  self-driving-car
Self-Driving-Car-Steering-Simulator
The aim of this project is to allow a self driving car to steer autonomously in a virtual environment.
Stars: ✭ 15 (-59.46%)
Mutual labels:  self-driving-car
sdc-behaviour-cloning
Teaching a deep learning model to drive a car based on image and telemetry inputs
Stars: ✭ 14 (-62.16%)
Mutual labels:  self-driving-car
openpilot
FOR PRE-AP/AP1/AP2 TESLA CARS ONLY: open source driving agent. You can help development by donating @ https://github.com/sponsors/BogGyver
Stars: ✭ 30 (-18.92%)
Mutual labels:  self-driving-car
ROSE
ROSE project car race game
Stars: ✭ 24 (-35.14%)
Mutual labels:  self-driving-car
MachineLearning
Machine learning code base of Meng Li
Stars: ✭ 22 (-40.54%)
Mutual labels:  self-driving-car
opendlv
OpenDLV - A modern microservice-based software ecosystem powered by libcluon to make vehicles autonomous.
Stars: ✭ 67 (+81.08%)
Mutual labels:  self-driving-car
FisheyeDistanceNet
FisheyeDistanceNet
Stars: ✭ 33 (-10.81%)
Mutual labels:  self-driving-car
SecondaryAwesomeCollection
A collection of awesome resource lists related to deep learning; contributions welcome.
Stars: ✭ 75 (+102.7%)
Mutual labels:  self-driving-car

SDRC

Step 1: Equipment

1. RC Car

2. Raspberry Pi 3

3. Raspberry Pi Camera Module V2

4. Jumper Wires

5. Portable Battery

6. L293D


Step 2: Setup

My RC car has two motors: a front motor that controls left/right steering, and a rear motor that controls forward/backward motion. Starting with the Raspberry Pi, the figure below shows its pinout diagram; you will hook up wires from the Pi to the motors through the L293D. The L293D contains two full H-bridges, which let us control two DC motors bidirectionally. For consistency when we go over the software setup, we will use pins 40, 38, and 36 for the front motor and pins 37, 35, and 33 for the rear. Feel free to change them to whatever you prefer if you know what you're doing.
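As a rough idea of how this wiring maps to code, here is a minimal sketch using the RPi.GPIO library with BOARD (physical) pin numbering. Which of the three pins serves as the L293D enable line versus the two direction inputs is an assumption here, so match it to your own wiring:

import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BOARD)       # Address pins by physical header position

FRONT = (40, 38, 36)           # (enable, input 1, input 2) -- assumed roles
REAR  = (37, 35, 33)

for pin in FRONT + REAR:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def drive(pins, forward=True):
    """Spin one motor in the given direction through its L293D half."""
    enable, in1, in2 = pins
    GPIO.output(in1, GPIO.HIGH if forward else GPIO.LOW)
    GPIO.output(in2, GPIO.LOW if forward else GPIO.HIGH)
    GPIO.output(enable, GPIO.HIGH)   # Energise the H-bridge

def stop(pins):
    GPIO.output(pins[0], GPIO.LOW)   # De-energise: motor coasts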


Step 3: Manual Control Using RPi & Python

For the car to drive autonomously, it first needs to be controllable in software. So we need a way to drive it from the keyboard, and I chose the W, A, S, D keys, as used in games: w: Forward, s: Straight, a: Left, d: Right. You can find the code for controlling the RC car in my GitHub repo; a rough sketch of the idea follows below. Note: please submit a pull request if you have a better implementation.
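The repo contains the full implementation; this is only a hedged sketch of the control loop, reusing the hypothetical drive/stop helpers and FRONT/REAR pin tuples from the Step 2 sketch, and reading raw keypresses with the standard termios/tty modules:

import sys
import termios
import tty

def getch():
    """Read a single keypress from stdin without waiting for Enter."""
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        return sys.stdin.read(1)
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

while True:
    key = getch()
    if key == 'w':
        drive(REAR, forward=True)     # Forward
    elif key == 'a':
        drive(FRONT, forward=True)    # Left (direction depends on wiring)
    elif key == 'd':
        drive(FRONT, forward=False)   # Right
    elif key == 's':
        stop(FRONT)                   # Release steering -> straight
    elif key == 'q':
        break                         # Quit the loop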

Youtube Video


Step 4: Put Everything In Place

Youtube Video


Step 5: Server-Less Control Using Computer Vision

First, we grab an image from the stream and take the Y component of the YUV color space, which represents the grayscale version of the original RGB image.

import picamera
import picamera.array

camera = picamera.PiCamera()                # Initialise camera object
stream = picamera.array.PiYUVArray(camera)  # Initialise YUV stream
camera.capture(stream, format='yuv')        # Capture a YUV frame into the stream
img = stream.array[:, :, 0]                 # Keep only the Y (luma) channel

Next, we threshold the grayscale image so that darker pixels (intensity between 0 and 135) become 1 and everything else stays 0, producing a binary mask:

import numpy as np

thresh_min = 0    # Lower intensity bound
thresh_max = 135  # Upper intensity bound
binary = np.zeros_like(img)                            # All-zero mask, same shape as img
binary[(img >= thresh_min) & (img <= thresh_max)] = 1  # Mark in-range (dark) pixels

Then we sum the binary mask column-wise over the lower half of the image, giving a histogram of where the line pixels sit horizontally:

histogram = np.sum(binary[binary.shape[0]//2:, :], axis=0)  # Column sums, lower half only

The exact decision logic varies from one setup to another; in my case I split the histogram into left and right halves: if the right half carries more weight than the left, steer left, otherwise steer right. I run the Raspberry Pi, camera, and motors off the same portable battery (5 V, 2 A) and it works just fine, though as you may notice in the video below, it is relatively slow.
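As a minimal sketch of that decision rule (steer_left and steer_right are hypothetical helpers wrapping the front-motor GPIO calls from Step 2):

mid = histogram.shape[0] // 2          # Split the histogram at the image centre
left_weight = histogram[:mid].sum()    # Line-pixel mass on the left half
right_weight = histogram[mid:].sum()   # ... and on the right half

if right_weight > left_weight:
    steer_left()                       # More mass on the right -> steer left
else:
    steer_right()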

Youtube Video


What Next?

I'll continue to improve the algorithm and add documentation on GitHub, and I'll post a new story if a major improvement is made. With that, we have come to the end of this series; if you want to go further, you may want to look into object detection and avoidance. Your support and input are highly appreciated, and if you find this useful, please share it.


Originally Posted on Medium

Building Self-Driving RC Car Series
