MTlab / Onnx2caffe
License: MIT
PyTorch to Caffe converter via ONNX
Stars: ✭ 282
Programming language: Python
Convert PyTorch to Caffe by ONNX
This tool converts a PyTorch model to a Caffe model by way of ONNX.
It is intended for inference only.
Dependencies
- caffe (with Python support)
- pytorch 0.4 (optional; only needed if you want to export ONNX models from PyTorch)
- onnx

We recommend using protobuf 2.6.1 and installing onnx from source:

```shell
git clone --recursive https://github.com/onnx/onnx.git
cd onnx
python setup.py install
```
How to use
Run test.py first to make sure everything has been installed correctly.

To convert an ONNX model to Caffe:

```shell
python convertCaffe.py ./model/MobileNetV2.onnx ./model/MobileNetV2.prototxt ./model/MobileNetV2.caffemodel
```
Currently supported operations
- Conv
- ConvTranspose
- BatchNormalization
- MaxPool
- AveragePool
- Relu
- Sigmoid
- Dropout
- Gemm (InnerProduct only)
- Add
- Mul
- Reshape
- Upsample
- Concat
- Flatten
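Before running a conversion, it can be useful to check whether a graph only uses the operations above. A small helper sketch (not part of this repo); the `SUPPORTED` set is copied from the list above, and in practice you would collect the op types with `[n.op_type for n in onnx.load(path).graph.node]`:

```python
# Hypothetical pre-flight check: which ONNX op types in a graph fall
# outside the converter's supported set?
SUPPORTED = {
    "Conv", "ConvTranspose", "BatchNormalization", "MaxPool",
    "AveragePool", "Relu", "Sigmoid", "Dropout", "Gemm",
    "Add", "Mul", "Reshape", "Upsample", "Concat", "Flatten",
}

def unsupported_ops(op_types):
    """Return the distinct op types the converter cannot handle."""
    return sorted(set(op_types) - SUPPORTED)

print(unsupported_ops(["Conv", "Relu", "Softmax", "Gemm"]))  # ['Softmax']
```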
TODO List
- [ ] support all ONNX operations (which is impossible)
- [ ] merge BatchNormalization into Convolution
- [ ] merge Scale into Convolution
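The batch-norm merging TODO rests on a standard identity: for each output channel c, BN(conv(x)) equals a single convolution with rescaled weights and a shifted bias, w'_c = w_c · γ_c/√(var_c + ε) and b'_c = (b_c − mean_c) · γ_c/√(var_c + ε) + β_c. A numpy-only sketch verifying this, using a 1×1 convolution written as a matmul for simplicity:

```python
# Sketch of folding BatchNormalization into the preceding convolution,
# demonstrated on a 1x1 conv expressed as a matrix multiply (numpy only).
import numpy as np

rng = np.random.default_rng(0)
out_ch, in_ch, eps = 4, 3, 1e-5

w = rng.standard_normal((out_ch, in_ch))      # conv weights
b = rng.standard_normal(out_ch)               # conv bias
gamma = rng.standard_normal(out_ch)           # BN scale
beta = rng.standard_normal(out_ch)            # BN shift
mean = rng.standard_normal(out_ch)            # BN running mean
var = rng.random(out_ch) + 0.1                # BN running variance

x = rng.standard_normal(in_ch)

# Reference path: convolution followed by batch normalization.
y_ref = gamma * ((w @ x + b) - mean) / np.sqrt(var + eps) + beta

# Folded path: one convolution with rescaled weights and shifted bias.
scale = gamma / np.sqrt(var + eps)
w_folded = w * scale[:, None]
b_folded = (b - mean) * scale + beta
y_folded = w_folded @ x + b_folded

assert np.allclose(y_ref, y_folded)
```

The same per-channel rescaling also covers the "merge Scale into Convolution" item, since a Scale layer is the γ/β part of this identity with mean 0 and variance 1.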