
cdpierse / pyinfer

License: Apache-2.0
Pyinfer is a model-agnostic tool for ML developers and researchers to benchmark inference statistics for machine learning models or functions.


Projects that are alternatives to or similar to pyinfer

nn-Meter
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
Stars: ✭ 211 (+1010.53%)
Mutual labels:  inference
sagemaker-sparkml-serving-container
This code is used to build & run a Docker container for performing predictions against a Spark ML Pipeline.
Stars: ✭ 44 (+131.58%)
Mutual labels:  inference
FAST-Pathology
⚡ Open-source software for deep learning-based digital pathology
Stars: ✭ 54 (+184.21%)
Mutual labels:  inference
fast-fomm-mobile
Compressing the First Order Motion Model for Image Animation to enable real-time inference on mobile devices
Stars: ✭ 25 (+31.58%)
Mutual labels:  inference
woodwork
Woodwork is a Python library that provides robust methods for managing and communicating data typing information.
Stars: ✭ 97 (+410.53%)
Mutual labels:  inference
concurrent-video-analytic-pipeline-optimization-sample-l
Create a concurrent video analysis pipeline featuring multistream face and human pose detection, vehicle attribute detection, and the ability to encode multiple videos to local storage in a single stream.
Stars: ✭ 39 (+105.26%)
Mutual labels:  inference
modelbox
A high-performance, extensible, easy-to-use framework for AI applications. Provides AI application developers with a unified, high-performance, easy-to-use programming framework for rapidly building cross-device, edge, and cloud AI industry applications on top of full-stack AI services.
Stars: ✭ 48 (+152.63%)
Mutual labels:  inference
optimum
🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools
Stars: ✭ 567 (+2884.21%)
Mutual labels:  inference
safety-gear-detector-python
Observe workers as they pass in front of a camera to determine if they have adequate safety protection.
Stars: ✭ 54 (+184.21%)
Mutual labels:  inference
BMW-IntelOpenVINO-Detection-Inference-API
This is a repository for a no-code object detection inference API using OpenVINO. It is supported on both Windows and Linux operating systems.
Stars: ✭ 66 (+247.37%)
Mutual labels:  inference
ReactiveMP.jl
Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (+205.26%)
Mutual labels:  inference
typedb
TypeDB: a strongly-typed database
Stars: ✭ 3,152 (+16489.47%)
Mutual labels:  inference
BMW-IntelOpenVINO-Segmentation-Inference-API
This is a repository for a semantic segmentation inference API using the OpenVINO toolkit
Stars: ✭ 31 (+63.16%)
Mutual labels:  inference
onnxruntime-rs
Rust wrapper for Microsoft's ONNX Runtime (version 1.8)
Stars: ✭ 149 (+684.21%)
Mutual labels:  inference
inferelator
Task-based gene regulatory network inference using single-cell or bulk gene expression data conditioned on a prior network.
Stars: ✭ 24 (+26.32%)
Mutual labels:  inference
motor-defect-detector-python
Predict performance issues with manufacturing equipment motors. Perform local or cloud analytics of the issues found, and then display the data on a user interface to determine when failures might arise.
Stars: ✭ 24 (+26.32%)
Mutual labels:  inference
vuex-context
Write fully type inferred Vuex modules
Stars: ✭ 11 (-42.11%)
Mutual labels:  inference
arboreto
A scalable python-based framework for gene regulatory network inference using tree-based ensemble regressors.
Stars: ✭ 33 (+73.68%)
Mutual labels:  inference
model analyzer
Triton Model Analyzer is a CLI tool for understanding the compute and memory requirements of Triton Inference Server models.
Stars: ✭ 144 (+657.89%)
Mutual labels:  inference
spark-ml-serving
Spark ML Lib serving library
Stars: ✭ 49 (+157.89%)
Mutual labels:  inference

Pyinfer logo



Installation

pip install pyinfer

Overview

Inference Report

InferenceReport is for reporting inference statistics on a single model artifact. To create a valid report, pass it a callable model function or method, valid input(s), and either n_iterations or n_seconds to determine the report's run duration. Check out the docs for more information on the optional parameters that can be passed.

Pyinfer Example Usage

Multi Inference Report

MultiInferenceReport is for reporting inference statistics on a list of model artifacts. To create a valid multi report, pass it a list of callable model functions or methods, a list of valid inputs, and either n_iterations or n_seconds to determine the report's run duration. Check out the docs for more information on the optional parameters that can be passed.

Pyinfer Example Usage

Example Outputs

Table Report

Pyinfer Table Report

Run Plot

Pyinfer Report Plot

Stats Currently Included

  • Success Rate - Number of successful inferences completed within a specified time threshold.
  • Failures - Number of inferences that exceeded the specified time threshold.
  • Time Taken - Total time taken to run all inferences.
  • Inferences Per Second - Estimate of how many inferences per second the selected model can perform.
  • Max Run - The maximum time taken to perform an inference for a given run.
  • Min Run - The minimum time taken to perform an inference for a given run.
  • Std - The standard deviation of the run times.
  • Mean - The mean run time.
  • Median - The median run time.
  • IQR - The interquartile range of the runs.
  • Cores Logical - The number of logical cores on the host machine.
  • Cores Physical - The number of physical cores on the host machine.
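For reference, most of the per-run statistics listed above can be computed from a list of run times with the standard library alone. The run times below are made up for illustration; this is not pyinfer's implementation, just a sketch of what each stat means.

```python
import os
import statistics

# Hypothetical run times (seconds) for a single benchmarked callable.
runs = [0.012, 0.011, 0.015, 0.013, 0.050, 0.012]

time_taken = sum(runs)                          # Time Taken
inferences_per_second = len(runs) / time_taken  # Inferences Per Second
max_run, min_run = max(runs), min(runs)         # Max Run / Min Run
mean = statistics.mean(runs)                    # Mean
median = statistics.median(runs)                # Median
std = statistics.stdev(runs)                    # Std

# IQR: spread between the 75th and 25th percentiles of run times.
q1, _, q3 = statistics.quantiles(runs, n=4)
iqr = q3 - q1

# Logical core count; physical core count needs e.g. psutil.cpu_count(logical=False).
cores_logical = os.cpu_count()
```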

Planned Future Stats

  • Model Size - Information about the size of the model in bytes.
  • GPU Stat Support - Information about whether a GPU is available and whether it is being utilized.