
caipeide / drift_drl

License: MIT
High-speed Autonomous Drifting with Deep Reinforcement Learning

Programming Languages

python
shell

Projects that are alternatives of or similar to drift_drl

Airsim
Open source simulator for autonomous vehicles built on Unreal Engine / Unity, from Microsoft AI & Research
Stars: ✭ 12,528 (+15178.05%)
Mutual labels:  deep-reinforcement-learning, autonomous-vehicles
Carla
Open-source simulator for autonomous driving research.
Stars: ✭ 7,012 (+8451.22%)
Mutual labels:  deep-reinforcement-learning, autonomous-vehicles
LC NGSIM
lane change trajectories extracted from NGSIM
Stars: ✭ 98 (+19.51%)
Mutual labels:  driving-behavior
FinRL Podracer
Cloud-native Financial Reinforcement Learning
Stars: ✭ 179 (+118.29%)
Mutual labels:  deep-reinforcement-learning
automile-php
Automile offers a simple, smart, cutting-edge telematics solution for businesses to track and manage their business vehicles.
Stars: ✭ 28 (-65.85%)
Mutual labels:  driving-behavior
Deep-Reinforcement-Learning-for-Automated-Stock-Trading-Ensemble-Strategy-ICAIF-2020
Live Trading. Please star.
Stars: ✭ 1,251 (+1425.61%)
Mutual labels:  deep-reinforcement-learning
community-projects
Webots projects (PROTO files, controllers, simulation worlds, etc.) contributed by the community.
Stars: ✭ 20 (-75.61%)
Mutual labels:  autonomous-vehicles
automile-net
Automile offers a simple, smart, cutting-edge telematics solution for businesses to track and manage their business vehicles.
Stars: ✭ 24 (-70.73%)
Mutual labels:  driving-behavior
SCUTTLE
SCUTTLE™: Sensing, Connected, Utility Transport Taxi for Level Environments [An open-source Mobile Robot]
Stars: ✭ 58 (-29.27%)
Mutual labels:  autonomous-vehicles
mmn
Moore Machine Networks (MMN): Learning Finite-State Representations of Recurrent Policy Networks
Stars: ✭ 39 (-52.44%)
Mutual labels:  deep-reinforcement-learning
Visualizing-lidar-data
Visualizing lidar data using Uber Autonomous Visualization System (AVS) and Jupyter Notebook Application
Stars: ✭ 75 (-8.54%)
Mutual labels:  autonomous-vehicles
dig-into-apollo
Apollo notes (Apollo学习笔记) - Apollo learning notes for beginners.
Stars: ✭ 1,786 (+2078.05%)
Mutual labels:  autonomous-vehicles
pomdp-baselines
Simple (but often Strong) Baselines for POMDPs in PyTorch - ICML 2022
Stars: ✭ 162 (+97.56%)
Mutual labels:  deep-reinforcement-learning
deep-rts
A Real-Time-Strategy game for Deep Learning research
Stars: ✭ 152 (+85.37%)
Mutual labels:  deep-reinforcement-learning
Master-Thesis
Deep Reinforcement Learning in Autonomous Driving: the A3C algorithm used to make a car learn to drive in TORCS; Python 3.5, Tensorflow, tensorboard, numpy, gym-torcs, ubuntu, latex
Stars: ✭ 33 (-59.76%)
Mutual labels:  deep-reinforcement-learning
LWDRLC
Lightweight deep RL Libraray for continuous control.
Stars: ✭ 14 (-82.93%)
Mutual labels:  deep-reinforcement-learning
motion-planner-reinforcement-learning
End to end motion planner using Deep Deterministic Policy Gradient (DDPG) in gazebo
Stars: ✭ 99 (+20.73%)
Mutual labels:  deep-reinforcement-learning
decentralized-rl
Decentralized Reinforcment Learning: Global Decision-Making via Local Economic Transactions (ICML 2020)
Stars: ✭ 40 (-51.22%)
Mutual labels:  deep-reinforcement-learning
TF RL
Eagerly Experimentable!!!
Stars: ✭ 22 (-73.17%)
Mutual labels:  deep-reinforcement-learning
pyMHT
Track oriented, multi target, multi hypothesis tracker
Stars: ✭ 66 (-19.51%)
Mutual labels:  autonomous-vehicles

High-speed Autonomous Drifting with Deep Reinforcement Learning

IEEE Robotics and Automation Letters & ICRA-2020

🖥️ Homepage 📜 Paper

High-speed drifting cornering by the proposed deep RL controller

Requirements

  1. Tested on Ubuntu 16.04 and Ubuntu 20.04.
  2. Nvidia GPU equipped and driver installed. Tested on a GTX 1080Ti.
  3. Install Anaconda, which is a package manager, environment manager, and Python distribution.
  4. Install the environment:
conda env create -f environment_drift.yaml

This command will create a conda environment named drift.

Reference trajectories for seven maps

Seven maps designed in this work

Reference trajectories for the maps are located in code/ref_trajectory

traj_0: for map(a), for first-stage training.

traj_1...traj_5: for map(b-f), for second-stage training.

traj_6: for map(g), for evaluation.
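During training and evaluation, the controllers track these reference trajectories. As an illustration of the kind of lookup a tracking controller performs at each step, here is a minimal nearest-waypoint sketch (the (x, y) array layout is an assumption for illustration, not the repository's actual trajectory file schema):

```python
import numpy as np

def nearest_waypoint(trajectory, position):
    """Return (index, distance) of the reference waypoint closest to the car.

    trajectory: (N, 2) array of x/y waypoints along the reference path
    position:   length-2 sequence with the car's current x/y location
    """
    deltas = trajectory - np.asarray(position, dtype=float)
    dists = np.linalg.norm(deltas, axis=1)
    idx = int(np.argmin(dists))
    return idx, float(dists[idx])

# Tiny synthetic trajectory, purely for illustration
traj = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.5], [3.0, 1.5]])
idx, dist = nearest_waypoint(traj, [2.1, 0.4])
```

A tracking error like this distance (together with heading and slip-angle terms) is the typical ingredient of the reward signal in trajectory-following RL setups.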

Start the Simulator

We built the simulator based on Carla 0.9.5. You can download our pre-built version from this link.

Then add these two lines to your ~/.bashrc (assuming you downloaded the simulator to your Downloads folder):

export PYTHONPATH=$PYTHONPATH:~/Downloads/CARLA_DRIFT_0.9.5/PythonAPI/carla/dist/carla-0.9.5-py3.5-linux-x86_64.egg
export PYTHONPATH=$PYTHONPATH:~/Downloads/CARLA_DRIFT_0.9.5/PythonAPI/carla/
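Instead of hard-coding the egg path, a script can also locate it at runtime. A small sketch (the directory layout mirrors the exports above, but find_carla_egg is an illustrative helper, not part of the repository; note the shipped egg targets Python 3.5 on Linux, so it only matches an interpreter of that version):

```python
import glob
import os
import sys

def find_carla_egg(carla_root):
    """Return the path of the CARLA egg matching this interpreter, or None.

    Searches PythonAPI/carla/dist under carla_root for an egg built for
    the running Python version, mirroring the PYTHONPATH export above.
    """
    pattern = os.path.join(
        carla_root, "PythonAPI", "carla", "dist",
        "carla-*-py%d.%d-linux-x86_64.egg" % sys.version_info[:2],
    )
    matches = glob.glob(pattern)
    return matches[0] if matches else None

egg = find_carla_egg(os.path.expanduser("~/Downloads/CARLA_DRIFT_0.9.5"))
if egg is not None:
    sys.path.append(egg)  # makes `import carla` resolvable
```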

Then open a new terminal and start the simulator:

cd ~/Downloads/CARLA_DRIFT_0.9.5
./CarlaUE4.sh /Game/Carla/ExportedMaps/test_refined

You can use W A S D and mouse to navigate in the simulator. Press Alt+Tab to restore your cursor.

Test the Model

Model weights are located in weights/, which includes four kinds of models: SAC, SAC-WOS, DDPG, and DQN.

Note that sac-stg1 and sac-stg2 are different stages of our SAC controller during training: sac-stg2 is the final version, while sac-stg1 was trained only on map(a).

To test the models, make sure you have started the simulator, then open a new terminal and run the following:

cd code
conda activate drift
sh test.sh

Then different models will be tested on map(g). The driving data (timestamp, speed, location, heading, slip angle, control commands, etc.) will be recorded in code/test/ after the testing process.
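The recorded driving data can be post-processed with standard tooling. A sketch computing summary statistics from one log (the column names here are illustrative assumptions; the actual schema is defined by the test scripts in code/):

```python
import csv
import io

# Illustrative log excerpt; real files live in code/test/ and the
# actual columns may differ from these assumed names.
SAMPLE = """timestamp,speed,slip_angle
0.0,20.1,5.2
0.1,21.4,8.9
0.2,22.0,7.5
"""

def summarize(log_text):
    """Compute mean speed and peak slip angle from a CSV driving log."""
    rows = list(csv.DictReader(io.StringIO(log_text)))
    speeds = [float(r["speed"]) for r in rows]
    slips = [float(r["slip_angle"]) for r in rows]
    return {
        "mean_speed": sum(speeds) / len(speeds),
        "max_slip_angle": max(slips),
    }

stats = summarize(SAMPLE)
```

Aggregates like these are a quick way to compare controllers (e.g. whether SAC sustains higher speed and larger slip angles than DQN) before plotting full trajectories.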

If you want to test a single model, for example, DQN:

cd code
conda activate drift
python test_dqn.py

Citation

Please consider citing our paper if this work helps you:

@article{Cai2020HighSpeedAD,
  title={High-Speed Autonomous Drifting With Deep Reinforcement Learning},
  author={Peide Cai and X. Mei and L. Tai and Yuxiang Sun and M. Liu},
  journal={IEEE Robotics and Automation Letters},
  year={2020},
  volume={5},
  pages={1247-1254}
}