
tensorflow / tensorrt

License: Apache-2.0
TensorFlow/TensorRT integration


Documentation for TensorRT in TensorFlow (TF-TRT)

The documentation on how to accelerate inference in TensorFlow with TensorRT (TF-TRT) is available here: https://docs.nvidia.com/deeplearning/dgx/tf-trt-user-guide/index.html

Examples for TensorRT in TensorFlow (TF-TRT)

This repository contains a number of examples that show how to use TF-TRT. TF-TRT is a part of TensorFlow that optimizes TensorFlow graphs using TensorRT. We have used these examples to verify the accuracy and performance of TF-TRT. For more information, see Verified Models.

Examples

Using TensorRT in TensorFlow (TF-TRT)

This module provides the necessary bindings and introduces the TRTEngineOp operator, which wraps a subgraph in a TensorRT engine. The module is under active development.
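As a sketch of that workflow (not the project's exact example code), the snippet below converts a SavedModel with `TrtGraphConverterV2` so that supported subgraphs are replaced by TRTEngineOp nodes. The directory arguments are placeholders, and a TensorFlow build with TensorRT support is assumed; the import is guarded so the sketch degrades gracefully without one.

```python
# Hedged sketch of the TF-TRT conversion workflow. Directory arguments
# are placeholders supplied by the caller; a TensorFlow build with
# TensorRT support is assumed for the conversion branch to be useful.
import importlib.util

def convert_with_tftrt(saved_model_dir, output_dir):
    """Replace TensorRT-compatible subgraphs with TRTEngineOp nodes."""
    if importlib.util.find_spec("tensorflow") is None:
        # Guard so the sketch can be run without TensorFlow installed.
        return "tensorflow-not-installed"
    from tensorflow.python.compiler.tensorrt import trt_convert as trt
    converter = trt.TrtGraphConverterV2(input_saved_model_dir=saved_model_dir)
    converter.convert()         # wraps supported subgraphs in TRTEngineOp
    converter.save(output_dir)  # writes the converted SavedModel
    return "converted"
```

A converted model is then loaded and served like any other SavedModel; the TRTEngineOp nodes build their TensorRT engines lazily, at load or first-inference time.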

Installing TF-TRT

Currently, TensorFlow nightly builds include TF-TRT by default, which means you don't need to install TF-TRT separately. You can pull the latest TF containers from Docker Hub or install the latest TF pip package to get access to the latest TF-TRT.
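For example, either of the following gives access to a recent TF-TRT build (the container tag is one common choice; NVIDIA's NGC TensorFlow containers are an alternative):

```shell
# Pull a recent TensorFlow GPU container (includes TF-TRT)
docker pull tensorflow/tensorflow:latest-gpu

# Or install the latest TensorFlow pip package into the current environment
pip install tensorflow
```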

If you want to use TF-TRT on the NVIDIA Jetson platform, you can find the download links for the relevant TensorFlow pip packages here: https://docs.nvidia.com/deeplearning/dgx/index.html#installing-frameworks-for-jetson

Installing TensorRT

In order to make use of TF-TRT, you will need a local installation of TensorRT from the NVIDIA Developer website. Installation instructions for compatibility with TensorFlow are provided in the TensorFlow GPU support guide.

Documentation

The TF-TRT documentation gives an overview of the supported functionality, provides tutorials and verified models, and explains best practices alongside troubleshooting guides.

Tests

TF-TRT includes both Python tests and C++ unit tests. Most of the Python tests are located in the test directory and can be executed using bazel test or directly with the Python command. Most of the C++ unit tests verify the conversion functions that convert each TensorFlow op to a number of TensorRT layers.
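As an illustration of the two invocation styles (the exact target paths are an assumption and may differ across TensorFlow versions):

```shell
# Run the TF-TRT Python tests through Bazel (target path assumed)
bazel test //tensorflow/python/compiler/tensorrt/test/...

# Or run a single test file directly with Python
python tensorflow/python/compiler/tensorrt/test/base_test.py
```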

Compilation

In order to compile the module, you need a local TensorRT installation (libnvinfer.so and the respective include files). During the configuration step, TensorRT should be enabled and its installation path should be set. If TensorRT was installed through a package manager (deb, rpm), the configure script should find the necessary components on the system automatically. If it was installed from a tar package, the user has to set the path to the installation location during configuration.

bazel build --config=cuda --config=opt //tensorflow/tools/pip_package:build_pip_package
bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/
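The second command writes a wheel into /tmp/, which can then be installed into the current environment; the exact wheel filename varies by version and platform, so the glob below is illustrative:

```shell
# Install the freshly built TensorFlow wheel (filename varies)
pip install /tmp/tensorflow-*.whl
```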

License

Apache License 2.0
