License: Apache-2.0



GluonCV Classification GPU/CPU Inference API

This is a repository for an image classification inference API using the GluonCV framework.

The inference REST API works on CPU and GPU, and is supported on Windows and Linux operating systems.

Models trained using our GluonCV classification training repository can be deployed in this API. Several models can be loaded and used at the same time.


Prerequisites

  • OS:
    • Windows or Linux
  • Docker

Check for prerequisites

To check if you have docker-ce installed:

docker --version

Install prerequisites

-Ubuntu

Use the following command to install Docker on Ubuntu:

chmod +x install_prerequisites.sh && source install_prerequisites.sh

-Windows 10

To install Docker on Windows, please follow the link.

P.S: For Windows users, open the Docker Desktop menu by clicking the Docker icon in the notifications area. Select Settings, then the Advanced tab, to adjust the resources available to the Docker engine.

Build The Docker Image

In order to build the project run the following command from the project's root directory:

docker build -t gluoncv_classification -f {CPU or GPU}/dockerfile .

Behind a proxy

docker build --build-arg http_proxy='' --build-arg https_proxy='' -t gluoncv_classification -f ./{CPU or GPU}/dockerfile .

Run the docker container

To run the API, go to the API's directory and run the following:

-Using Linux-based docker:

  • CPU:
sudo docker run -itv $(pwd)/models:/models -p 4343:4343 gluoncv_classification
  • GPU:
sudo nvidia-docker run -itv $(pwd)/models:/models -p 4343:4343 gluoncv_classification

-Using Windows-based docker:

For Windows, inference is only supported on CPU

  • CPU:
docker run -itv ${PWD}/models:/models -p 4343:4343 gluoncv_classification

The API file will run automatically, and the service will listen for HTTP requests on the chosen port.
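Once the container is up, you can verify that the service is listening on the chosen port before sending requests. Below is a minimal, self-contained Python sketch; the helper name and defaults are illustrative, not part of the API:

```python
import socket

def wait_for_api(host: str = "localhost", port: int = 4343, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("API reachable:", wait_for_api())
```

You can poll this in a loop (with a short sleep) right after `docker run` if your own tooling needs to wait for the container to finish starting.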

API Endpoints

To see all available endpoints, open your favorite browser and navigate to:

http://localhost:4343/docs

The 'predict_batch' endpoint is not shown in Swagger, because the list-of-files input is not yet supported there.

P.S: If you are using custom endpoints such as /load, /detect, and /get_labels, always call the /load endpoint first, then /detect or /get_labels

Endpoints summary

/load (GET)

Loads all available models and returns each model with its hashed value. Loaded models are cached and are not loaded again

/detect (POST)

Performs inference on the specified model and image, and returns the predicted class

/get_labels (POST)

Returns all of the specified model's labels with their hashed values

/models (GET)

Lists all available models

/models/{model_name}/load (GET)

Loads the specified model. Loaded models are cached and are not loaded again

/models/{model_name}/predict (POST)

Performs inference on the specified model and image, and returns the predicted class

/models/{model_name}/labels (GET)

Returns all of the specified model's labels

/models/{model_name}/config (GET)

Returns the specified model's configuration
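The load-then-predict flow above can be sketched as a small Python client. This is an illustrative, standard-library-only sketch, not part of the repository; the exact request format for image upload (raw bytes vs. multipart form data) depends on the API implementation, so treat the `predict` helper as an assumption to adapt:

```python
import json
import urllib.request

BASE_URL = "http://localhost:4343"  # adjust to your host and port

def endpoint(path: str) -> str:
    """Build a full endpoint URL from a path such as '/models'."""
    return f"{BASE_URL}{path}"

def get_json(path: str):
    """Perform a GET request and decode the JSON response."""
    with urllib.request.urlopen(endpoint(path)) as resp:
        return json.load(resp)

def predict(model_name: str, image_path: str):
    """POST an image to /models/{model_name}/predict (assumes raw image bytes)."""
    with open(image_path, "rb") as f:
        req = urllib.request.Request(
            endpoint(f"/models/{model_name}/predict"),
            data=f.read(),
            method="POST",
        )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (with the container running):
#   get_json("/models")                      - list available models
#   get_json("/models/my_model/load")        - load first ...
#   predict("my_model", "test.jpg")          - ... then predict
```

Note that the model must be loaded before prediction, mirroring the /load-first rule described above.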

Model structure

The folder "models" contains one subfolder for each model to be loaded. Inside each subfolder there should be a:

  • classes.txt file: contains the names of the classes, separated by ','

  • -0000.params file : contains the model's parameters

  • -symbol.json file : contains the model's architecture

  • Config.json file (a JSON file containing information about the model)

       {
           "cpu": true,
           "max_number_of_predictions": 3,
           "minimum_confidence": 0.6,
           "inference_engine_name": "classification"
       }

    P.S:

    • Make sure to set cpu to {true/false} based on the image used {CPU/GPU}
    • You can change the confidence and prediction values while the API is running
    • The API only returns predictions with a confidence higher than the "minimum_confidence" value; a higher "minimum_confidence" filters the response down to the more confident predictions
    • The "max_number_of_predictions" value specifies the maximum number of classes returned and analyzed in the API response
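To catch mistakes in a model folder before starting the container, a Config.json can be sanity-checked with a few lines of Python. This validator is a sketch for illustration only (the function and its rules are assumptions, not the API's own validation):

```python
import json

# Keys expected in Config.json, per the example above
REQUIRED_KEYS = {"cpu", "max_number_of_predictions",
                 "minimum_confidence", "inference_engine_name"}

def validate_config(config: dict) -> list:
    """Return a list of problems found in a Config.json dict (empty list = OK)."""
    problems = [f"missing key: {k}" for k in REQUIRED_KEYS - config.keys()]
    if not isinstance(config.get("cpu"), bool):
        problems.append("'cpu' must be true or false")
    mc = config.get("minimum_confidence")
    if not (isinstance(mc, (int, float)) and 0.0 <= mc <= 1.0):
        problems.append("'minimum_confidence' must be a number between 0 and 1")
    return problems

if __name__ == "__main__":
    with open("models/my_model/Config.json") as f:  # hypothetical model folder
        print(validate_config(json.load(f)))
```

An empty list means the file has the expected keys; any strings returned describe what to fix.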

Acknowledgements

Roy Anwar, Beirut, Lebanon

Fouad Chaccour, Beirut, Lebanon
