
Hydrospheredata / spark-ml-serving

License: Apache-2.0
Spark ML Lib serving library

Programming Languages

scala, shell

Projects that are alternatives of or similar to spark-ml-serving

sagemaker-sparkml-serving-container
This code is used to build & run a Docker container for performing predictions against a Spark ML Pipeline.
Stars: ✭ 44 (-10.2%)
Mutual labels:  inference, serving
Delta
DELTA is a deep learning based natural language and speech processing platform.
Stars: ✭ 1,479 (+2918.37%)
Mutual labels:  inference, serving
serving-runtime
Exposes a serialized machine learning model through an HTTP API.
Stars: ✭ 15 (-69.39%)
Mutual labels:  inference, serving
Ml Model Ci
MLModelCI is a complete MLOps platform for managing, converting, profiling, and deploying MLaaS (Machine Learning-as-a-Service), bridging the gap between current ML training and serving systems.
Stars: ✭ 122 (+148.98%)
Mutual labels:  inference, serving
Model server
A scalable inference server for models optimized with OpenVINO™
Stars: ✭ 431 (+779.59%)
Mutual labels:  inference, serving
Tensorflow template application
TensorFlow template application for deep learning
Stars: ✭ 1,851 (+3677.55%)
Mutual labels:  inference, serving
pia
📚 🔬 PIA - Protein Inference Algorithms
Stars: ✭ 19 (-61.22%)
Mutual labels:  inference
Barracuda-PoseNet-Tutorial
This tutorial series provides step-by-step instructions for how to perform human pose estimation in Unity with the Barracuda inference library.
Stars: ✭ 53 (+8.16%)
Mutual labels:  inference
infer
🔮 Use TensorFlow models in Go to evaluate Images (and more soon!)
Stars: ✭ 65 (+32.65%)
Mutual labels:  inference
RECCON
This repository contains the dataset and the PyTorch implementations of the models from the paper Recognizing Emotion Cause in Conversations.
Stars: ✭ 126 (+157.14%)
Mutual labels:  inference
concurrent-video-analytic-pipeline-optimization-sample-l
Create a concurrent video analysis pipeline featuring multistream face and human pose detection, vehicle attribute detection, and the ability to encode multiple videos to local storage in a single stream.
Stars: ✭ 39 (-20.41%)
Mutual labels:  inference
travelling-salesman
Rules for Kiwi.com travelling salesman competition
Stars: ✭ 14 (-71.43%)
Mutual labels:  scoring
fast-fomm-mobile
Compressing First Order Motion Model for Image Animation to enable its real-time inference on mobile devices
Stars: ✭ 25 (-48.98%)
Mutual labels:  inference
modelbox
A high-performance, highly extensible, easy-to-use framework for AI applications. It provides AI developers with a unified programming framework for quickly building cross-device, edge, and cloud AI applications on top of a full AI service stack.
Stars: ✭ 48 (-2.04%)
Mutual labels:  inference
typedb
TypeDB: a strongly-typed database
Stars: ✭ 3,152 (+6332.65%)
Mutual labels:  inference
ai-serving
Serving AI/ML models in the open standard formats PMML and ONNX with both HTTP (REST API) and gRPC endpoints
Stars: ✭ 122 (+148.98%)
Mutual labels:  inference
go-topics
Latent Dirichlet Allocation
Stars: ✭ 23 (-53.06%)
Mutual labels:  inference
onnxruntime-rs
Rust wrapper for Microsoft's ONNX Runtime (version 1.8)
Stars: ✭ 149 (+204.08%)
Mutual labels:  inference
safety-gear-detector-python
Observe workers as they pass in front of a camera to determine if they have adequate safety protection.
Stars: ✭ 54 (+10.2%)
Mutual labels:  inference
nn-Meter
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
Stars: ✭ 211 (+330.61%)
Mutual labels:  inference


Spark-ml-serving

Contextless ML implementation of Spark ML.

Proposal

To serve small ML pipelines there is no need to create a SparkContext or use cluster-related features. This project provides its own implementations of the ML Transformers; some of them call context-independent Spark methods.

Structure

Instead of using DataFrames, we implemented a simple LocalData class to get rid of the SparkContext. All Transformers are rewritten to accept LocalData.
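
To make the data model concrete, here is a minimal sketch of what LocalData and LocalDataColumn look like conceptually. The shapes are inferred from the usage example below; the column accessor and the LocalTransformer trait are hypothetical simplifications added for illustration, not necessarily the library's exact API.

// Illustrative sketch of the contextless data model (names simplified).
case class LocalDataColumn[T](name: String, data: Seq[T])

case class LocalData(columns: List[LocalDataColumn[_]]) {
  // Hypothetical helper: look up a column by name.
  def column(name: String): Option[LocalDataColumn[_]] =
    columns.find(_.name == name)
}

// A contextless transformer maps LocalData to LocalData,
// with no SparkContext or DataFrame involved.
trait LocalTransformer {
  def transform(data: LocalData): LocalData
}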

How to use

  1. Import this project as a dependency:
scalaVersion := "2.11.8"
// The artifact name depends on the Spark version you used for model training:
// spark 2.0.x
libraryDependencies ++= Seq(
  "io.hydrosphere" %% "spark-ml-serving-2_0" % "0.3.0",
  "org.apache.spark" %% "spark-mllib" % "2.0.2"
)
// spark 2.1.x
libraryDependencies ++= Seq(
  "io.hydrosphere" %% "spark-ml-serving-2_1" % "0.3.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.2"
)
// spark 2.2.x
libraryDependencies ++= Seq(
  "io.hydrosphere" %% "spark-ml-serving-2_2" % "0.3.0",
  "org.apache.spark" %% "spark-mllib" % "2.2.0"
)
  2. Use it:
import io.hydrosphere.spark_ml_serving._
import LocalPipelineModel._

// ....
val model = LocalPipelineModel.load("PATH_TO_MODEL") // Load
val columns = List(LocalDataColumn("text", Seq("Hello!")))
val localData = LocalData(columns)
val result = model.transform(localData) // Transformed result

More examples for different ML models can be found in the tests.
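
For completeness, here is a hedged sketch of the training side that produces a model the serving snippet above can load. It uses plain Spark ML (Tokenizer + HashingTF + LogisticRegression) with a regular SparkSession; the "text" column matches the LocalDataColumn used above, and PATH_TO_MODEL is the same placeholder. Treat it as an assumption-laden example, not part of this library.

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
import org.apache.spark.sql.SparkSession

// Training still happens with a regular SparkSession; only serving is contextless.
val spark = SparkSession.builder().master("local[*]").appName("train").getOrCreate()
import spark.implicits._

val training = Seq(
  ("Hello!", 1.0),
  ("buy cheap pills now", 0.0)
).toDF("text", "label")

val pipeline = new Pipeline().setStages(Array(
  new Tokenizer().setInputCol("text").setOutputCol("words"),
  new HashingTF().setInputCol("words").setOutputCol("features"),
  new LogisticRegression().setMaxIter(10)
))

// Persist the fitted PipelineModel so LocalPipelineModel.load("PATH_TO_MODEL") can pick it up.
pipeline.fit(training).write.overwrite().save("PATH_TO_MODEL")
spark.stop()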
