
NervanaSystems / ngraph-onnx

License: Apache-2.0
nGraph™ Backend for ONNX

Programming Languages

python
groovy
shell
Dockerfile

Projects that are alternatives of or similar to ngraph-onnx

onnx learn
No description or website provided.
Stars: ✭ 74 (+76.19%)
Mutual labels:  onnx
ai-serving
Serving AI/ML models in the open standard formats PMML and ONNX with both HTTP (REST API) and gRPC endpoints
Stars: ✭ 122 (+190.48%)
Mutual labels:  onnx
gluon2pytorch
Gluon to PyTorch deep neural network model converter
Stars: ✭ 72 (+71.43%)
Mutual labels:  onnx
Peppa-Facial-Landmark-PyTorch
Facial Landmark Detection based on PyTorch
Stars: ✭ 172 (+309.52%)
Mutual labels:  onnx
onnx2caffe
PyTorch to Caffe model converter via ONNX
Stars: ✭ 341 (+711.9%)
Mutual labels:  onnx
ONNX.jl
Read ONNX graphs in Julia
Stars: ✭ 112 (+166.67%)
Mutual labels:  onnx
optimizer
Actively maintained ONNX Optimizer
Stars: ✭ 383 (+811.9%)
Mutual labels:  onnx
YOLOv4MLNet
Use the YOLO v4 and v5 (ONNX) models for object detection in C# using ML.NET
Stars: ✭ 61 (+45.24%)
Mutual labels:  onnx
kinference
Running ONNX models in vanilla Kotlin
Stars: ✭ 70 (+66.67%)
Mutual labels:  onnx
mtomo
Multiple types of NN model optimization environments, with direct access to the host PC's GUI and camera to verify operation. Supports Intel iHD GPUs (iGPU) and NVIDIA GPUs (dGPU).
Stars: ✭ 24 (-42.86%)
Mutual labels:  onnx
torch-model-compression
An automated toolset for analyzing and modifying the structure of PyTorch models, including a model compression algorithm library with automatic model structure analysis
Stars: ✭ 126 (+200%)
Mutual labels:  onnx
PSGAN-NCNN
PSGAN running with ncnn ⚡ Imitation Makeup / Makeup Transfer ⚡
Stars: ✭ 140 (+233.33%)
Mutual labels:  onnx
pytorch-android
[EXPERIMENTAL] Demo of using PyTorch 1.0 inside an Android app. Test with your own deep neural network such as ResNet18/SqueezeNet/MobileNet v2 and a phone camera.
Stars: ✭ 105 (+150%)
Mutual labels:  onnx
deepvac
PyTorch Project Specification.
Stars: ✭ 507 (+1107.14%)
Mutual labels:  onnx
onnxruntime-rs
Rust wrapper for Microsoft's ONNX Runtime (version 1.8)
Stars: ✭ 149 (+254.76%)
Mutual labels:  onnx
arcface retinaface mxnet2onnx
Convert ArcFace and RetinaFace models from MXNet to ONNX.
Stars: ✭ 53 (+26.19%)
Mutual labels:  onnx
InsightFace-REST
InsightFace REST API for easy deployment of face recognition services with TensorRT in Docker.
Stars: ✭ 308 (+633.33%)
Mutual labels:  onnx
yolov5 tensorrt int8 tools
TensorRT INT8 quantization of YOLOv5 ONNX models
Stars: ✭ 105 (+150%)
Mutual labels:  onnx
tractjs
Run ONNX and TensorFlow inference in the browser.
Stars: ✭ 67 (+59.52%)
Mutual labels:  onnx
AnimeGANv2-ONNX-Sample
ONNX inference sample in Python for "PyTorch Implementation of AnimeGANv2"
Stars: ✭ 54 (+28.57%)
Mutual labels:  onnx

ngraph-onnx

nGraph Backend for ONNX.

This repository contains tools to run ONNX models using the Intel nGraph library as a backend.

Installation

Follow our build instructions to install nGraph-ONNX from source.

Usage example

Importing an ONNX model

You can download models from the ONNX model zoo. For example ResNet-50:

$ wget https://s3.amazonaws.com/download.onnx/models/opset_8/resnet50.tar.gz
$ tar -xzvf resnet50.tar.gz

Use the following Python commands to convert the downloaded model to an nGraph model:

# Import ONNX and load an ONNX file from disk
>>> import onnx
>>> onnx_protobuf = onnx.load('resnet50/model.onnx')

# Convert ONNX model to an ngraph model
>>> from ngraph_onnx.onnx_importer.importer import import_onnx_model
>>> ng_function = import_onnx_model(onnx_protobuf)

# The importer returns an nGraph Function representing the model:
>>> print(ng_function)
<Function: 'resnet50' ([1, 1000])>

This creates an nGraph Function object, which can be used to execute a computation on a chosen backend.

Running a computation

After importing an ONNX model, you will have an nGraph Function object. Now you can create an nGraph Runtime backend and use it to compile your Function to a backend-specific Computation object. Finally, you can execute your model by calling the created Computation object with input data.

# Using an ngraph runtime (CPU backend) create a callable computation object
>>> import ngraph as ng
>>> runtime = ng.runtime(backend_name='CPU')
>>> resnet_on_cpu = runtime.computation(ng_function)

# Load an image (or create a mock as in this example)
>>> import numpy as np
>>> picture = np.ones([1, 3, 224, 224], dtype=np.float32)

# Run computation on the picture:
>>> resnet_on_cpu(picture)
[array([[2.16105007e-04, 5.58412226e-04, 9.70510227e-05, 5.76671446e-05,
         7.45318757e-05, 4.80892748e-04, 5.67404088e-04, 9.48728994e-05,
         ...