lunardog / Kelner

License: MIT
Serve your models

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Kelner

Charla
An IRC Server / Daemon written in Python using the circuits Application Framework.
Stars: ✭ 8 (-61.9%)
Mutual labels:  server
Clusterws
💥 Lightweight, fast and powerful framework for building scalable WebSocket applications in Node.js
Stars: ✭ 868 (+4033.33%)
Mutual labels:  server
Start Server And Test
Starts server, waits for URL, then runs test command; when the tests end, shuts down server
Stars: ✭ 879 (+4085.71%)
Mutual labels:  server
Puma
A Ruby/Rack web server built for parallelism
Stars: ✭ 6,924 (+32871.43%)
Mutual labels:  server
Blink Java
Simplified pure Java http server
Stars: ✭ 10 (-52.38%)
Mutual labels:  server
Webpack Dev Server
Serves a webpack app. Updates the browser on changes. Documentation https://webpack.js.org/configuration/dev-server/.
Stars: ✭ 7,250 (+34423.81%)
Mutual labels:  server
Litecloud
User management system for the server (Home Cloud).
Stars: ✭ 26 (+23.81%)
Mutual labels:  server
Boilerplate Nodejs Swagger
A Node.js RESTful API application boilerplate with TypeScript, Docker, Koa, Swagger, Jest, and CircleCI
Stars: ✭ 21 (+0%)
Mutual labels:  server
Deprecated
🚀 Framework for building universal web apps and static websites in Vue.js (beta)
Stars: ✭ 858 (+3985.71%)
Mutual labels:  server
Firenet
Deprecated master server for creating MMO games based on CRYENGINE
Stars: ✭ 14 (-33.33%)
Mutual labels:  server
Crony
Cron scheduler as a service 🕠
Stars: ✭ 9 (-57.14%)
Mutual labels:  server
Express
Swift Express is a simple, yet unopinionated web application server written in Swift
Stars: ✭ 855 (+3971.43%)
Mutual labels:  server
C10k Server
A toy asynchronous server, written in C++14 (WIP)
Stars: ✭ 14 (-33.33%)
Mutual labels:  server
Ansible Role Docker
Ansible Role - Docker
Stars: ✭ 845 (+3923.81%)
Mutual labels:  server
Connect2ssh
Manage SSH and SSHFS connections via the command line using BASH!
Stars: ✭ 15 (-28.57%)
Mutual labels:  server
Jerrymouse
A scalable Java servlet container based on reactor
Stars: ✭ 27 (+28.57%)
Mutual labels:  server
Clientserverproject
A client-server (C-S) template made up of three programs: a server-side program, a client-side program, and a shared component. It implements basic account management, version control, software upgrades, announcement management, broadcast messaging, shared file upload/download, and batch file transfer; see the demo for how to use it. One goal of the project is to provide a basic C-S framework for small and medium-sized systems, with three seamlessly integrated client modes (WinForms, WPF, and ASP.NET MVC), making it convenient for companies to build on and for individuals to learn from. The networking component also supports reading and writing data from Mitsubishi and Siemens PLCs; see the Readme for details.
Stars: ✭ 873 (+4057.14%)
Mutual labels:  server
Steppy Toolkit
Curated set of transformers that make your work with steppy faster and more effective 🔭
Stars: ✭ 21 (+0%)
Mutual labels:  tensorflow-models
Aidp
Weibo Ad Infrastructure Data Processor: a Kafka consumer with an embedded Lua scripting language for its data processing framework
Stars: ✭ 20 (-4.76%)
Mutual labels:  server
Tinytcpserver
A small TCP server running under Mono or .NET (4.0) that provides hooks for handling data exchange with clients. Behaviour/protocol/reaction can be specified via a custom C# script.
Stars: ✭ 14 (-33.33%)
Mutual labels:  server

Kelner

Build Status

Kelner logo

Ridiculously simple model serving.

  1. Get an exported model (download or train and save)
  2. kelnerd -m SAVED_MODEL_FILE
  3. There is no step 3; your model is served

Quickstart

Install kelner

    $ pip install kelner

Download a Tensorflow ProtoBuf file

    $ wget https://storage.googleapis.com/download.tensorflow.org/models/inception_dec_2015.zip
    $ unzip inception_dec_2015.zip
        Archive:  inception_dec_2015.zip
        inflating: imagenet_comp_graph_label_strings.txt
        inflating: LICENSE
        inflating: tensorflow_inception_graph.pb

Run the server

    $ kelnerd -m tensorflow_inception_graph.pb --engine tensorflow --input-node ExpandDims --output-node softmax

Send a request to the model:

    $ curl --data-binary "@dog.jpg" localhost:61453 -X POST -H "Content-Type: image/jpeg"

The response should be a JSON-encoded array of floating point numbers.
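
If you would rather call the endpoint from Python than with curl, a minimal sketch using the requests library might look like this (port 61453 is the default used above; dog.jpg is any local JPEG, and the exact nesting of the returned array may depend on the model):

    import requests

    # POST the raw JPEG bytes to the running kelnerd instance
    with open("dog.jpg", "rb") as image_file:
        response = requests.post(
            "http://localhost:61453",
            data=image_file.read(),
            headers={"Content-Type": "image/jpeg"},
        )

    # The body is a JSON-encoded array of floating point numbers
    scores = response.json()
    print(scores[:5])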

For a fancy client (not really necessary, but useful), you can use the kelner command.

This is how you get the top 5 labels from the server you ran above (note the --top 5 part):

    $ kelner classify dog.jpg --imagenet-labels --top 5
    boxer: 0.973630
    Saint Bernard: 0.001821
    bull mastiff: 0.000624
    Boston bull: 0.000486
    Greater Swiss Mountain dog: 0.000377
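
Under the hood, all the client does is pair those scores with the label file unpacked from inception_dec_2015.zip above. A hedged sketch of doing the same thing by hand, reusing the scores list from the previous example and assuming one label per line in imagenet_comp_graph_label_strings.txt:

    # Pair the raw scores with the ImageNet labels and print the top 5
    with open("imagenet_comp_graph_label_strings.txt") as label_file:
        labels = [line.strip() for line in label_file]

    top5 = sorted(zip(labels, scores), key=lambda pair: pair[1], reverse=True)[:5]
    for label, score in top5:
        print("%s: %f" % (label, score))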

Use kelner in code

If you need to, you can also use kelner in your code.

Let's create an example model:

    import keras

    # Build a tiny example model: a 2-dimensional input, a Dense(3)
    # hidden layer and a single-output Dense layer
    l1 = keras.layers.Input((2,))
    l2 = keras.layers.Dense(3)(l1)
    l3 = keras.layers.Dense(1)(l2)
    model = keras.models.Model(inputs=l1, outputs=l3)

    # Save it in HDF5 format so kelner can load it
    model.save("saved_model.h5")

Now load the model and serve it with kelner:

    import kelner

    # Load the saved Keras model (the Keras engine is the default)
    loaded_model = kelner.model.load("saved_model.h5")

    # Serve the model over HTTP on port 8080
    kelner.serve(loaded_model, port=8080)
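
Once kelner.serve is running, you can query the model the same way as the command-line server. A minimal sketch, assuming the default JSON protocol and that the request body is a nested list matching the model's (2,) input shape (kelner.serve blocks, so run the client from a separate process):

    import requests

    # The example model maps a vector of 2 floats to a single float
    response = requests.post("http://localhost:8080", json=[[0.5, -1.2]])
    print(response.json())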

FAQ

Who is this for?

Machine learning researchers who don't want to deal with building a web server for every model they export.

Kelner loads a saved Keras or Tensorflow model and starts an HTTP server that pipes the POST request body to the model and returns a JSON-encoded model response.

How is it different from Tensorflow Serving?

  1. Kelner is ridiculously simple to install and run
  2. Kelner also works with saved Keras models
  3. Kelner works with one model per installation
  4. Kelner doesn't do model versioning
  5. Kelner is JSON over HTTP while tf-serving is ProtoBuf over gRPC
  6. Kelner's protocol is simple (see the sketch after this list):
    • GET returns the model's input and output specs as JSON
    • POST expects JSON or an image file and returns the JSON-encoded result of model inference
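
For example, the spec endpoint of the Inception server started earlier could be inspected like this (a hedged sketch; the exact fields of the returned JSON are not documented here):

    import requests

    # GET returns the model's input and output specs as JSON
    print(requests.get("http://localhost:61453").json())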