jiazhihao / Taso

Licence: apache-2.0
The Tensor Algebra SuperOptimizer for Deep Learning

Projects that are alternatives of or similar to Taso

Pose Residual Network
Code for the Pose Residual Network introduced in 'MultiPoseNet: Fast Multi-Person Pose Estimation using Pose Residual Network (ECCV 2018)' paper
Stars: ✭ 337 (-13.81%)
Mutual labels:  deep-neural-networks
Action Recognition Visual Attention
Action recognition using soft attention based deep recurrent neural networks
Stars: ✭ 350 (-10.49%)
Mutual labels:  deep-neural-networks
Mobilenetv2.pytorch
72.8% MobileNetV2 1.0 model on ImageNet and a spectrum of pre-trained MobileNetV2 models
Stars: ✭ 369 (-5.63%)
Mutual labels:  deep-neural-networks
Tensorflow Open nsfw
Tensorflow Implementation of Yahoo's Open NSFW Model
Stars: ✭ 338 (-13.55%)
Mutual labels:  deep-neural-networks
Real Time Gesrec
Real-time Hand Gesture Recognition with PyTorch on EgoGesture, NvGesture, Jester, Kinetics and UCF101
Stars: ✭ 339 (-13.3%)
Mutual labels:  deep-neural-networks
Predictive Maintenance Using Lstm
Example of Multiple Multivariate Time Series Prediction with LSTM Recurrent Neural Networks in Python with Keras.
Stars: ✭ 352 (-9.97%)
Mutual labels:  deep-neural-networks
Deepcut
A Thai word tokenization library using Deep Neural Network
Stars: ✭ 330 (-15.6%)
Mutual labels:  deep-neural-networks
First Steps Towards Deep Learning
This is an open sourced book on deep learning.
Stars: ✭ 376 (-3.84%)
Mutual labels:  deep-neural-networks
Curl
CURL: Contrastive Unsupervised Representation Learning for Sample-Efficient Reinforcement Learning
Stars: ✭ 346 (-11.51%)
Mutual labels:  deep-neural-networks
Easy Deep Learning With Keras
Keras tutorial for beginners (using TF backend)
Stars: ✭ 367 (-6.14%)
Mutual labels:  deep-neural-networks
Caffe
Caffe for Sparse and Low-rank Deep Neural Networks
Stars: ✭ 339 (-13.3%)
Mutual labels:  deep-neural-networks
Fire Detection Cnn
real-time fire detection in video imagery using a convolutional neural network (deep learning) - from our ICIP 2018 paper (Dunnings / Breckon) + ICMLA 2019 paper (Samarth / Bhowmik / Breckon)
Stars: ✭ 340 (-13.04%)
Mutual labels:  deep-neural-networks
Openchem
OpenChem: Deep Learning toolkit for Computational Chemistry and Drug Design Research
Stars: ✭ 356 (-8.95%)
Mutual labels:  deep-neural-networks
Vergeml
Machine Learning Environment - alpha version
Stars: ✭ 338 (-13.55%)
Mutual labels:  deep-neural-networks
U Net
U-Net: Convolutional Networks for Biomedical Image Segmentation
Stars: ✭ 374 (-4.35%)
Mutual labels:  deep-neural-networks
Keras Mmoe
A Keras implementation of "Modeling Task Relationships in Multi-task Learning with Multi-gate Mixture-of-Experts" (KDD 2018)
Stars: ✭ 332 (-15.09%)
Mutual labels:  deep-neural-networks
Magnet
Deep Learning Projects that Build Themselves
Stars: ✭ 351 (-10.23%)
Mutual labels:  deep-neural-networks
Flow Forecast
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
Stars: ✭ 368 (-5.88%)
Mutual labels:  deep-neural-networks
Rmdl
RMDL: Random Multimodel Deep Learning for Classification
Stars: ✭ 375 (-4.09%)
Mutual labels:  deep-neural-networks
Openrec
OpenRec is an open-source and modular library for neural network-inspired recommendation algorithms
Stars: ✭ 360 (-7.93%)
Mutual labels:  deep-neural-networks

TASO: The Tensor Algebra SuperOptimizer for Deep Learning

TASO optimizes the computation graphs of DNN models using automatically generated and verified graph transformations. For an arbitrary DNN model, TASO uses the auto-generated graph transformations to build a large search space of potential computation graphs that are equivalent to the original DNN model. TASO employs a cost-based search algorithm to explore the space, and automatically discovers highly optimized computation graphs. TASO outperforms the graph optimizers in existing deep learning frameworks by up to 3x.

End-to-end inference performance comparison on an NVIDIA V100 GPU.
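In very rough terms, the cost-based search described above can be pictured as a best-first exploration of rewritten graphs, always expanding the cheapest candidate found so far. The sketch below is purely illustrative — toy "graphs" as strings and an invented length-based cost — and is not TASO's actual implementation:

```python
import heapq

def best_first_optimize(graph, substitutions, cost, budget=1000):
    """Best-first search over graphs reachable via substitutions,
    returning the cheapest equivalent graph found within the budget."""
    best = graph
    frontier = [(cost(graph), graph)]
    seen = {graph}
    for _ in range(budget):
        if not frontier:
            break
        c, g = heapq.heappop(frontier)
        if c < cost(best):
            best = g
        for apply_sub in substitutions:
            for g2 in apply_sub(g):  # every way this rewrite matches g
                if g2 not in seen:
                    seen.add(g2)
                    heapq.heappush(frontier, (cost(g2), g2))
    return best

def rewrite(pattern, replacement):
    """A toy substitution: each match of pattern yields one rewritten graph.
    (Real substitutions are verified to preserve semantics.)"""
    def apply_sub(s):
        for i in range(len(s)):
            if s.startswith(pattern, i):
                yield s[:i] + replacement + s[i + len(pattern):]
    return apply_sub

# Fusing the operator pair "ab" into a cheaper fused op "c"
result = best_first_optimize("abab", [rewrite("ab", "c")], cost=len)
# result == "cc"
```

The real search differs in important ways (graph-structured state, a measured cost model, and pruning), but the overall shape — generate rewrites, score candidates, keep the best — is the same.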

Install TASO

See the instructions to install TASO from source. We also provide prebuilt Docker images with all dependencies pre-installed.

Use TASO

TASO can directly optimize any pre-trained DNN models in ONNX, TensorFlow, and PyTorch graph formats. TASO also provides a Python interface for optimizing arbitrary DNN architectures. TASO supports exporting the optimized computation graphs to ONNX, which can be directly used as inputs by most existing deep learning frameworks.

Optimize ONNX Models

TASO can directly optimize pre-trained ONNX models in just a few lines of Python code. The following code snippet shows how to load a pre-trained DNN model from ONNX, optimize it, and save the optimized model into an ONNX file.

import taso
import onnx

# Load a pre-trained ONNX model
old_model = taso.load_onnx("/path/to/load/onnx/model")
# Optimize the computation graph
taso_graph = taso.optimize(old_model)
# Export the optimized graph back to ONNX and save it
new_model = taso.export_onnx(taso_graph)
onnx.save(new_model, "/path/to/save/new/onnx/model")

The optimized model has the same accuracy as the original and can be directly used by existing deep learning frameworks. Some original and TASO-optimized ONNX files are available in the onnx folder.

Optimize TensorFlow Models

TASO can optimize TensorFlow models by first converting them to ONNX using tf2onnx.

  • First, install tf2onnx from PyPI (or from source):

    pip install -U tf2onnx

  • Second, convert a TensorFlow saved model to ONNX:

    python -m tf2onnx.convert \
           --saved-model /path/to/tensorflow/saved/model \
           --output /path/to/onnx/model/file

Optimize PyTorch Models

PyTorch has built-in ONNX support through the torch.onnx package, so PyTorch models can be exported to ONNX and then optimized by TASO like any other ONNX model.

Optimize Arbitrary DNN Models using the Python Interface

TASO can also optimize arbitrary DNN architectures using the TASO Python interface. The following code snippet builds the left-most DNN graph depicted in the figure. TASO automatically performs a series of non-trivial transformations, and eventually discovers the right-most DNN graph, which is 1.3x faster on a V100 GPU. More DNN examples are available in the examples folder.

import taso
import onnx

# Build the DNN model: two convolution branches summed, then ReLU
graph = taso.new_graph()
input = graph.new_input(dims=(1,128,56,56))
w1 = graph.new_weight(dims=(128,128,3,3))
w2 = graph.new_weight(dims=(128,128,1,1))
w3 = graph.new_weight(dims=(128,128,3,3))
# Left branch: 3x3 conv (+ReLU) followed by a second 3x3 conv
left = graph.conv2d(input=input, weight=w1, strides=(1,1), padding="SAME", activation="RELU")
left = graph.conv2d(input=left, weight=w3, strides=(1,1), padding="SAME")
# Right branch: 1x1 conv (+ReLU)
right = graph.conv2d(input=input, weight=w2, strides=(1,1), padding="SAME", activation="RELU")
output = graph.add(left, right)
output = graph.relu(output)

# Optimize the DNN model and export the result to ONNX
new_graph = taso.optimize(graph)
onnx_model = taso.export_onnx(new_graph)
onnx.save(onnx_model, "/path/to/save/new/onnx/model")

Publication

Zhihao Jia, Oded Padon, James Thomas, Todd Warszawski, Matei Zaharia, and Alex Aiken. TASO: Optimizing Deep Learning Computation with Automatic Generation of Graph Substitutions. In Proceedings of the 27th ACM Symposium on Operating Systems Principles (SOSP), 2019.