GitPlanet
Top 3 triton-inference-server open source projects
onnxruntime backend — ✭ 40
The Triton backend for the ONNX Runtime.
Languages: C++, Python, CMake
Tags: backend, inference, triton-inference-server, onnx-runtime
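To use a backend like this, Triton expects each model repository entry to carry a `config.pbtxt` that names the backend and declares the model's tensors. The following is a minimal sketch; the model name and the tensor names, types, and shapes (`input`, `output`, `3x224x224`, `1000`) are hypothetical placeholders for illustration, not values from this project.

```
name: "my_onnx_model"          # hypothetical model name
backend: "onnxruntime"         # routes inference to the ONNX Runtime backend
max_batch_size: 8

input [
  {
    name: "input"              # placeholder tensor name
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output"             # placeholder tensor name
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```

The `backend` field is what ties the model to this project: Triton loads the corresponding backend shared library and hands it the ONNX model file placed alongside the config.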
yolov4-triton-tensorrt — ✭ 224
This repository deploys YOLOv4 as an optimized TensorRT engine to Triton Inference Server.
Languages: C++, Python, CUDA, CMake
Tags: docker, deep-learning, object-detection, tensorrt, yolov4, triton-inference-server, yolov4-tiny
isaac ros dnn inference — ✭ 67
Hardware-accelerated DNN model inference ROS2 packages using NVIDIA Triton/TensorRT, for both Jetson and x86_64 platforms with a CUDA-capable GPU.
Languages: Python, C++, CMake
Tags: ai, deep-learning, gpu, dnn, ros, nvidia, triton, deeplearning, tao, jetson, ros2, tensorrt, triton-inference-server, tensorrt-inference, ros2-humble
Showing 1–3 of 3 triton-inference-server projects