
bli-tamu / Virtual-Lane-Boundary-Generation

License: LGPL-3.0
Virtual Lane Boundary Generation for Human-Compatible Autonomous Driving

Programming Languages

Makefile, C++, CMake, C, Shell, MATLAB

Projects that are alternatives to or similar to Virtual-Lane-Boundary-Generation

conde simulator
Autonomous Driving Simulator for the Portuguese Robotics Open
Stars: ✭ 31 (+40.91%)
Mutual labels:  autonomous-driving, lane-detection, lane-tracking
copilot
Lane and obstacle detection for active assistance during driving. Uses windowed sweep for lane detection. Combination of object tracking and YOLO for obstacles. Determines lane change, relative velocity and time to collision
Stars: ✭ 95 (+331.82%)
Mutual labels:  autonomous-driving, lane-detection, lane-tracking
highway-path-planning
My path-planning pipeline to navigate a car safely around a virtual highway with other traffic.
Stars: ✭ 39 (+77.27%)
Mutual labels:  motion-planning, autonomous-driving
YOLOP
You Only Look Once for Panoptic Driving Perception. (https://arxiv.org/abs/2108.11250)
Stars: ✭ 1,228 (+5481.82%)
Mutual labels:  autonomous-driving, lane-detection
LaneandYolovehicle-DetectionLinux
Lane departure and YOLO object detection in C++ on Linux
Stars: ✭ 16 (-27.27%)
Mutual labels:  lane-detection, lane-tracking
lane-detection
Lane detection MATLAB code for Kalman Filter book chapter: Lane Detection
Stars: ✭ 21 (-4.55%)
Mutual labels:  autonomous-driving, lane-detection
MotionPlanner
Motion Planner for Self Driving Cars
Stars: ✭ 129 (+486.36%)
Mutual labels:  motion-planning, autonomous-driving
Autonomousvehiclepaper
Rapid digest of papers related to autonomous driving (in Chinese)
Stars: ✭ 406 (+1745.45%)
Mutual labels:  motion-planning, autonomous-driving
Hybrid A Star Annotation
Code annotations for a Hybrid A* path planner (in Chinese)
Stars: ✭ 188 (+754.55%)
Mutual labels:  motion-planning
OpenMaterial
3D model exchange format with physical material properties for virtual development, test and validation of automated driving.
Stars: ✭ 23 (+4.55%)
Mutual labels:  autonomous-driving
Gpmp2
Gaussian Process Motion Planner 2
Stars: ✭ 161 (+631.82%)
Mutual labels:  motion-planning
Rrt Algorithms
n-dimensional RRT, RRT* (RRT-Star)
Stars: ✭ 195 (+786.36%)
Mutual labels:  motion-planning
dwl
The Dynamic Whole-body Locomotion library (DWL)
Stars: ✭ 70 (+218.18%)
Mutual labels:  motion-planning
Xpp
Visualization of Motions for Legged Robots in ros-rviz
Stars: ✭ 177 (+704.55%)
Mutual labels:  motion-planning
Robotics-Resources
List of commonly used robotics libraries and packages
Stars: ✭ 71 (+222.73%)
Mutual labels:  motion-planning
Am traj
Alternating Minimization Based Trajectory Generation for Quadrotor Aggressive Flight
Stars: ✭ 142 (+545.45%)
Mutual labels:  motion-planning
Aikido
Artificial Intelligence for Kinematics, Dynamics, and Optimization
Stars: ✭ 133 (+504.55%)
Mutual labels:  motion-planning
lqRRT
Kinodynamic RRT implementation
Stars: ✭ 76 (+245.45%)
Mutual labels:  motion-planning
dreyeve
[TPAMI 2018] Predicting the Driver’s Focus of Attention: the DR(eye)VE Project. A deep neural network learnt to reproduce the human driver focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (+300%)
Mutual labels:  autonomous-driving
Pathfinder
Cross-Platform, Multi-Use Motion Profiling and Trajectory Generation
Stars: ✭ 227 (+931.82%)
Mutual labels:  motion-planning

Virtual Lane Boundary Generation for Human-Compatible Autonomous Driving: A Tight Coupling between Perception and Planning

License

Our code is released under a GPLv3 license. This code aims to recognize, generate, and track virtual lane boundaries to help navigate autonomous vehicles in urban scenarios. It has been tested on Ubuntu 16.04 LTS and should work on 16.04 or newer versions. We extend our previous work in:

@INPROCEEDINGS{li2019virtual,
  title={Virtual Lane Boundary Generation for Human-Compatible Autonomous Driving: A Tight Coupling between Perception and Planning},
  author={Li, Binbin and Song, Dezhen and Ramchandani, Ankit and Cheng, Hsin-Min and Wang, Di and Xu, Yiliang and Chen, Baifan},
  booktitle={2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={3733--3739},
  year={2019},
  organization={IEEE}
}

@inproceedings{li2018lane,
  title={Lane marking quality assessment for autonomous driving},
  author={Li, Binbin and Song, Dezhen and Li, Haifeng and Pike, Adam and Carlson, Paul},
  booktitle={2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  pages={1--9},
  year={2018},
  organization={IEEE}
}

If you use our code in academic work, please cite the papers above.

Clone

git clone --recurse-submodules git@github.com:bli-tamu/LDRT.git
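
If the clone was made without --recurse-submodules, the third-party components under 3rdParty/ (such as ICNet, used in the build steps below) appear to be submodules and can still be fetched afterwards; this is plain git, not a project-specific script:

git submodule update --init --recursive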

Dependencies

You can use ./install-dependencies.sh for convenience, or do it manually as follows:

sudo apt-get install --no-install-recommends    \
  build-essential                               \
  cmake                                         \
  git                                           \
  wget                                          \
  libatlas-base-dev   `#For Caffe`              \
  libboost-all-dev                              \
  libgflags-dev                                 \
  libgoogle-glog-dev                            \
  libhdf5-serial-dev                            \
  libleveldb-dev                                \
  liblmdb-dev                                   \
  libopencv-dev                                 \
  libprotobuf-dev                               \
  libsnappy-dev                                 \
  protobuf-compiler                             \
  python3-pip                                   \
  python3-dev                                   \
  libmatio-dev        `#MAT File I/O`           \
  libpugixml-dev      `#XML processing`         \
  libalglib-dev       `#Numerical analysis`     \
  libgsl-dev          `#GNU scientific library` \
  libopenblas-dev     `#Linear algebra library` \
  libeigen3-dev                                 \
  libpcap-dev         `#Packet capture library`

pip3 install numpy matplotlib pandas scipy pybind11

Build Instructions

You can use ./build.sh for a default build, or do it manually as follows:

  • ICNet

Copy the example configuration file and apply the patch.

    cp 3rdParty/ICNet/PSPNet/Makefile.config.example 3rdParty/ICNet/PSPNet/Makefile.config
    patch -p0 -i 3rdParty/patches/ICNet.patch
    

Modify the configuration file before compiling if needed.

    cd 3rdParty/ICNet/PSPNet
    make
    
  • This project uses CMake (http://www.cmake.org), a cross-platform build system.

    mkdir build && cd build
    cmake ..
    make
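
    If an optimized build is desired, the standard CMake build-type flag can be passed instead of the plain cmake .. above (these are generic CMake/make options, not project-specific ones):

    cmake -DCMAKE_BUILD_TYPE=Release ..
    make -j"$(nproc)"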
    

External components

  • Install VINS-Fusion to generate the visual odometry file vio.txt from the dataset, and put it under the demo/data/datasetName folder.
  • Download GPS waypoints from Google Maps, use the script matlab/process_KITTIdata.m to generate the prior map, and save it as demo/data/datasetName/priorMap.txt. The expected layout is sketched below.
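
    A rough sketch of the expected layout (using the 2011_09_26_drive_0056_sync sequence from the demo below; the source paths are placeholders, not paths produced by this project):

    DATASET=2011_09_26_drive_0056_sync
    mkdir -p demo/data/${DATASET}
    # visual odometry produced by VINS-Fusion (source path is a placeholder)
    cp /path/to/vins-output/vio.txt demo/data/${DATASET}/vio.txt
    # prior map generated by matlab/process_KITTIdata.m (source path is a placeholder)
    cp /path/to/priorMap.txt demo/data/${DATASET}/priorMap.txt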

Prepare to run a demo

  1. We use the KITTI dataset to demonstrate our code. prepDemo.sh is a script that downloads and prepares the demo. For example, ./prepDemo.sh 2011_09_26_drive_0056_sync will automatically prepare the PSPNet models, the 2011_09_26_drive_0056_sync dataset, and its calibration files.
  2. Run the dataset through VINS-Fusion and place the resulting vio.txt in demo/data/datasetName. A quick check of the expected inputs is sketched below.
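
    An optional sanity check before launching the demo (our own suggestion, not part of the project's scripts) that the required inputs are in place:

    DATASET=2011_09_26_drive_0056_sync
    for f in vio.txt priorMap.txt; do
      [ -f "demo/data/${DATASET}/${f}" ] || echo "missing: demo/data/${DATASET}/${f}"
    done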

Run a demo

  • Run with runDemo.sh; for example, ./runDemo.sh 2011_09_26_drive_0056_sync will run the code and keep a log file in demo/data/2011_09_26_drive_0056_sync/.
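
    To follow progress while the demo runs, the log can be tailed; the exact log filename is not documented here, so the wildcard below is an assumption:

    tail -f demo/data/2011_09_26_drive_0056_sync/*.log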

Contact

  1. Binbin Li [email protected]
  2. Ankit Ramchandani [email protected]
  3. Di Wang [email protected]
  4. Aaron Kingery [email protected]
  5. Aaron Angert [email protected]
  6. Dezhen Song [email protected]