
woven-planet / l5kit

Licence: other
L5Kit - https://level-5.global/

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives to or similar to l5kit

autonomous-delivery-robot
Repository for Autonomous Delivery Robot project of IvLabs, VNIT
Stars: ✭ 65 (-90.48%)
Mutual labels:  planning, autonomous-vehicles
BtcDet
Behind the Curtain: Learning Occluded Shapes for 3D Object Detection
Stars: ✭ 104 (-84.77%)
Mutual labels:  autonomous-vehicles, self-driving
DnaWeaver
A route planner for DNA assembly
Stars: ✭ 20 (-97.07%)
Mutual labels:  planning
zenoh-flow
zenoh-flow aims to provide a zenoh-based data-flow programming framework for computations that span from the cloud to the device.
Stars: ✭ 27 (-96.05%)
Mutual labels:  autonomous-vehicles
SecondaryAwesomeCollection
A curated collection of awesome deep-learning-related resource lists; contributions are welcome.
Stars: ✭ 75 (-89.02%)
Mutual labels:  autonomous-vehicles
Auto-Birds-Eye
Bird's eye/Top Down view generation and mapping with deep learning.
Stars: ✭ 129 (-81.11%)
Mutual labels:  autonomous-vehicles
copilot
Lane and obstacle detection for active assistance during driving. Uses a windowed sweep for lane detection and a combination of object tracking and YOLO for obstacles. Determines lane changes, relative velocity and time to collision.
Stars: ✭ 95 (-86.09%)
Mutual labels:  autonomous-vehicles
Model-Predictive-Control
This project uses Model Predictive Control (MPC) to drive a car in a game simulator. The server provides reference waypoints (the yellow line in the demo video) via WebSocket, and MPC computes steering and throttle commands to drive the car. The solution must be robust to 100 ms latency, since such latency might be encountered in real-world applications (see the latency-compensation sketch after this list).
Stars: ✭ 93 (-86.38%)
Mutual labels:  autonomous-vehicles
OpenHDMap
An open HD map production process for autonomous car simulation
Stars: ✭ 152 (-77.75%)
Mutual labels:  autonomous-vehicles
Monocular-Vehicle-Localization
Estimating the orientation and relative dimensions of vehicles by producing a 3D bounding box
Stars: ✭ 28 (-95.9%)
Mutual labels:  autonomous-vehicles
buzzmobile
An autonomous parade float/vehicle
Stars: ✭ 18 (-97.36%)
Mutual labels:  autonomous-vehicles
language-planner
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (-87.7%)
Mutual labels:  planning
Robotics-Planning-Dynamics-and-Control
RPDC: This contains all my MATLAB code for Robotics, Planning, Dynamics and Control. The implementations model various kinds of manipulators and mobile robots for position control, trajectory planning and path planning problems.
Stars: ✭ 171 (-74.96%)
Mutual labels:  planning
AdvancedLaneLines
Lane identification for camera-based systems.
Stars: ✭ 61 (-91.07%)
Mutual labels:  autonomous-vehicles
carla-colab
How to run the CARLA simulator on Colab
Stars: ✭ 81 (-88.14%)
Mutual labels:  self-driving
mapus
A map tool with real-time collaboration 🗺️
Stars: ✭ 2,687 (+293.41%)
Mutual labels:  planning
scrum-planning-poker
Please feel free to try it and give feedback by searching for Scrum敏捷估算 (Scrum Agile Estimation) in WeChat mini programs.
Stars: ✭ 30 (-95.61%)
Mutual labels:  planning
PyLidar3
PyLidar3 is a Python 3 package for getting data from lidar devices from various manufacturers.
Stars: ✭ 35 (-94.88%)
Mutual labels:  autonomous-vehicles
VPGNet for lane
Vanishing Point Guided Network for lane detection, with post processing
Stars: ✭ 33 (-95.17%)
Mutual labels:  self-driving
jpp
Joint Perception and Planning For Efficient Obstacle Avoidance Using Stereo Vision
Stars: ✭ 42 (-93.85%)
Mutual labels:  planning
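
On the Model-Predictive-Control entry above: one common way to make an MPC controller robust to the 100 ms actuation latency it mentions is to propagate the measured state forward by the latency with a kinematic bicycle model before solving. A minimal Python sketch; the helper name, the LF value and the sign conventions are assumptions for illustration, not code from that project:

```python
import math

# Hypothetical latency-compensation helper: advance the current state by the actuation
# latency under constant controls, then hand the predicted state to the MPC solver.
LF = 2.67        # front-axle-to-centre-of-gravity distance in metres (assumed value)
LATENCY = 0.1    # 100 ms latency from the project description

def predict_state(x, y, psi, v, steering, throttle, dt=LATENCY, lf=LF):
    """Return (x, y, psi, v) after dt seconds of the kinematic bicycle model."""
    x_next = x + v * math.cos(psi) * dt
    y_next = y + v * math.sin(psi) * dt
    psi_next = psi + (v / lf) * steering * dt
    v_next = v + throttle * dt
    return x_next, y_next, psi_next, v_next
```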

Welcome to L5Kit. L5Kit is a Python library for developing and training learned prediction, planning and simulation models for autonomous driving applications.
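
As a quick orientation, the sketch below shows a typical way to load the dataset and build training samples with L5Kit. The data folder and config path are assumptions: point them at your copy of the Lyft Level 5 prediction dataset and one of the example configs shipped with the repo.

```python
import os

from l5kit.configs import load_config_data
from l5kit.data import ChunkedDataset, LocalDataManager
from l5kit.dataset import EgoDataset
from l5kit.rasterization import build_rasterizer

# Assumed paths: set L5KIT_DATA_FOLDER to the downloaded dataset and pick an example config.
os.environ["L5KIT_DATA_FOLDER"] = "/path/to/l5kit_data"
dm = LocalDataManager(None)
cfg = load_config_data("./agent_motion_config.yaml")

# Open the zarr dataset named by the config's train_data_loader key.
zarr_dataset = ChunkedDataset(dm.require(cfg["train_data_loader"]["key"])).open()

# The rasterizer renders each frame as a bird's-eye-view image; EgoDataset wraps the
# zarr data into (image, target trajectory) samples for training a planning model.
rasterizer = build_rasterizer(cfg, dm)
dataset = EgoDataset(cfg, zarr_dataset, rasterizer)

sample = dataset[0]
print(sample["image"].shape, sample["target_positions"].shape)
```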

Click here for documentation
