
Top 3 triton-inference-server open source projects

yolov4-triton-tensorrt
This repository deploys YOLOv4 as an optimized TensorRT engine to Triton Inference Server
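Serving a TensorRT engine like this from Triton usually means querying it over HTTP or gRPC with the tritonclient library. Below is a minimal client sketch; the model name "yolov4", the tensor names "input" and "detections", and the 1x3x608x608 shape are assumptions for illustration, not the repository's actual configuration.

    import numpy as np
    import tritonclient.http as httpclient

    # Connect to a Triton server assumed to be listening on the default HTTP port.
    client = httpclient.InferenceServerClient(url="localhost:8000")

    # Build a dummy input tensor; name, shape, and dtype here are assumptions.
    infer_input = httpclient.InferInput("input", [1, 3, 608, 608], "FP32")
    infer_input.set_data_from_numpy(np.zeros((1, 3, 608, 608), dtype=np.float32))

    # Run inference against the model registered in Triton's model repository.
    result = client.infer(model_name="yolov4", inputs=[infer_input])
    detections = result.as_numpy("detections")  # output tensor name is an assumption
    print(detections.shape)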
isaac ros dnn inference
ROS 2 packages for hardware-accelerated DNN model inference using NVIDIA Triton/TensorRT, supporting both Jetson and x86_64 platforms with a CUDA-capable GPU
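Both projects ultimately serve TensorRT engines through Triton, which loads models from a model repository where each model carries a config.pbtxt. A minimal sketch of such a config follows, assuming a serialized TensorRT plan and FP32 tensors named "input" and "detections"; the real names, dims, and batch size come from the exported engine.

    # model_repository/yolov4/config.pbtxt  (paths, names, and dims are assumptions)
    name: "yolov4"
    platform: "tensorrt_plan"
    max_batch_size: 1
    input [
      {
        name: "input"
        data_type: TYPE_FP32
        dims: [ 3, 608, 608 ]
      }
    ]
    output [
      {
        name: "detections"
        data_type: TYPE_FP32
        dims: [ -1, 6 ]
      }
    ]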