Top 162 inference open source projects

Neuropod
A uniform interface to run deep learning models from multiple frameworks
Variational gradient matching for dynamical systems
Sample code for the NIPS paper "Scalable Variational Inference for Dynamical Systems"
Turbotransformers
A fast and user-friendly runtime for Transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
Xnnpack
High-efficiency floating-point neural network inference operators for mobile, server, and Web
Multi Model Server
Multi Model Server is a tool for serving neural network models for inference
Awesome Ml Demos With Ios
Challenge projects for running inference with machine learning models on iOS
Ultra Light Fast Generic Face Detector 1mb
💎 1 MB lightweight face detection model
Lua Language Server
A Lua language server implemented in Lua
Awesome Emdl
Embedded and mobile deep learning research resources
Torchlayers
Shape and dimension inference (Keras-like) for PyTorch layers and neural networks
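The core arithmetic behind this kind of Keras-like shape inference can be sketched in a few lines. The following is an illustrative stdlib-only sketch of the standard 2-D convolution output-size formula, not torchlayers' actual implementation:

```python
# Hedged sketch: the output-size arithmetic a shape-inferring layer
# needs in order to defer input dimensions until the first forward pass.
# This mirrors the standard Conv2d formula; names here are illustrative.

def conv2d_out_size(size, kernel, stride=1, padding=0, dilation=1):
    """Output spatial size along one dimension of a 2-D convolution."""
    return (size + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

# A 224x224 image through a 3x3 conv with stride 2 and padding 1:
h = conv2d_out_size(224, kernel=3, stride=2, padding=1)
print(h)  # 112
```

A shape-inferring wrapper applies this formula per spatial dimension at first call, so users never specify input channels or sizes up front.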
Lightseq
LightSeq: A High Performance Inference Library for Sequence Processing and Generation
Io Ts
Runtime type system for IO decoding/encoding
Causaldiscoverytoolbox
Package for causal inference in graphs and in pairwise settings. Includes tools for graph structure recovery and dependency analysis.
Model server
A scalable inference server for models optimized with OpenVINO™
Tensorflow Cmake
TensorFlow examples in C, C++, Go, and Python, built with CMake and FindTensorFlow.cmake instead of Bazel
Gpu Rest Engine
A REST API for Caffe using Docker and Go
Cubert
Fast BERT inference implemented directly on NVIDIA CUDA/cuBLAS and Intel MKL
Gfocal
Generalized Focal Loss: Learning Qualified and Distributed Bounding Boxes for Dense Object Detection, NeurIPS 2020
Cppflow
Run TensorFlow models in C++ without installation and without Bazel
Snips Nlu Rs
Rust implementation of Snips NLU
Identywaf
Blind WAF identification tool
Filetype.py
Small, dependency-free, fast Python package to infer binary file types by checking magic number signatures
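The magic-number technique itself is simple enough to sketch with the standard library alone. The signature table below lists a few well-known file headers; this is an illustration of the approach, not filetype.py's actual API:

```python
# Minimal sketch of magic-number file-type inference (stdlib only).
# Each key is a byte signature that a file of that type begins with.
from typing import Optional

MAGIC_NUMBERS = {
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"\xff\xd8\xff": "image/jpeg",
    b"GIF87a": "image/gif",
    b"GIF89a": "image/gif",
    b"%PDF-": "application/pdf",
    b"PK\x03\x04": "application/zip",
}

def infer_type(data: bytes) -> Optional[str]:
    """Return the MIME type whose signature prefixes `data`, if any."""
    for signature, mime in MAGIC_NUMBERS.items():
        if data.startswith(signature):
            return mime
    return None

print(infer_type(b"\x89PNG\r\n\x1a\n" + b"\x00" * 16))  # image/png
```

Real libraries keep a much larger table and handle signatures at non-zero offsets, but the lookup idea is the same.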
Mmdetection To Tensorrt
Convert MMDetection models to TensorRT, with support for FP16, INT8, batched input, dynamic shapes, etc.
Chaidnn
HLS-based deep neural network accelerator library for Xilinx UltraScale+ MPSoCs
Tnn
TNN: a uniform deep learning inference framework for mobile, desktop, and server, developed by Tencent Youtu Lab and Guangying Lab. TNN is distinguished by its cross-platform capability, high performance, model compression, and code pruning. Based on ncnn and Rapidnet, TNN further strengthens the support and …
Adversarial Robustness Toolbox
Adversarial Robustness Toolbox (ART) - Python Library for Machine Learning Security - Evasion, Poisoning, Extraction, Inference - Red and Blue Teams
Amazon Sagemaker Examples
Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
smfsb
Documentation, models and code relating to the 3rd edition of the textbook Stochastic Modelling for Systems Biology
typeql
TypeQL: the query language of TypeDB - a strongly-typed database
hoice
An ICE-based predicate synthesizer for Horn clauses.
tf-cpp-pose-estimation
TensorFlow C++ examples for Visual Studio, featuring pose estimation and various techniques for using the TensorFlow C++ interface
causaldag
Python package for the creation, manipulation, and learning of Causal DAGs
FATE-Serving
A scalable, high-performance serving system for federated learning models
JOCI
Ordinal Common-sense Inference
serving-runtime
Exposes a serialized machine learning model through an HTTP API.
caffe
This fork of BVLC/Caffe adds support for the Cambricon deep learning processor and improves the framework's performance when running on the Machine Learning Unit (MLU).
forestError
A Unified Framework for Random Forest Prediction Error Estimation
InferenceHelper
C++ Helper Class for Deep Learning Inference Frameworks: TensorFlow Lite, TensorRT, OpenCV, OpenVINO, ncnn, MNN, SNPE, Arm NN, NNabla, ONNX Runtime, LibTorch, TensorFlow
pyinfer
Pyinfer is a model-agnostic tool for ML developers and researchers to benchmark inference statistics for machine learning models or functions.
arboreto
A scalable Python-based framework for gene regulatory network inference using tree-based ensemble regressors.
optimum
🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools
model analyzer
Triton Model Analyzer is a CLI tool that helps you understand the compute and memory requirements of models served by Triton Inference Server.
inferelator
Task-based gene regulatory network inference using single-cell or bulk gene expression data conditioned on a prior network.