
sshane / Konverter

License: MIT
Convert simple Keras models to pure Python 🐍+ NumPy

Programming Languages

  • Python
  • Makefile

Projects that are alternatives of or similar to Konverter

tf-faster-rcnn
TensorFlow 2 Faster-RCNN implementation from scratch, supporting batch processing with MobileNetV2 and VGG16 backbones
Stars: ✭ 88 (+158.82%)
Mutual labels:  keras-tensorflow
keras-yolo3-facedetection
Real-time face detection model using YOLOv3 with Keras
Stars: ✭ 13 (-61.76%)
Mutual labels:  keras-tensorflow
Deep-Learning-Course
Deep Learning with TensorFlow and Keras
Stars: ✭ 13 (-61.76%)
Mutual labels:  keras-tensorflow
digit-recognizer-live
Recognize Digits using Deep Neural Networks in Google Chrome live!
Stars: ✭ 29 (-14.71%)
Mutual labels:  keras-tensorflow
Deep-Quality-Value-Family
Official implementation of the paper "Approximating two value functions instead of one: towards characterizing a new family of Deep Reinforcement Learning Algorithms": https://arxiv.org/abs/1909.01779 To appear at the next NeurIPS2019 DRL-Workshop
Stars: ✭ 12 (-64.71%)
Mutual labels:  keras-tensorflow
GestureAI
RNN (Recurrent Neural Network) model that recognizes hand gestures drawing 5 figures.
Stars: ✭ 20 (-41.18%)
Mutual labels:  keras-tensorflow
keras tfrecord
Extending Keras to support tfrecord dataset
Stars: ✭ 61 (+79.41%)
Mutual labels:  keras-tensorflow
MAX-Audio-Classifier
Identify sounds in short audio clips
Stars: ✭ 115 (+238.24%)
Mutual labels:  keras-tensorflow
One-Shot-Learning
Matching Networks Tensorflow 2 Implementation for few-shot AD diagnosis
Stars: ✭ 22 (-35.29%)
Mutual labels:  keras-tensorflow
Bebop-Autonomy-Vision
An autonomous, vision-based Bebop drone.
Stars: ✭ 24 (-29.41%)
Mutual labels:  keras-tensorflow
G-SimCLR
This is the code base for paper "G-SimCLR : Self-Supervised Contrastive Learning with Guided Projection via Pseudo Labelling" by Souradip Chakraborty, Aritra Roy Gosthipaty and Sayak Paul.
Stars: ✭ 69 (+102.94%)
Mutual labels:  keras-tensorflow
dl-relu
Deep Learning using Rectified Linear Units (ReLU)
Stars: ✭ 20 (-41.18%)
Mutual labels:  keras-tensorflow
Rus-SpeechRecognition-LSTM-CTC-VoxForge
Russian speech recognition using TensorFlow, trained on the VoxForge dataset
Stars: ✭ 50 (+47.06%)
Mutual labels:  keras-tensorflow
fall-detection-two-stream-cnn
Real-time fall detection using two-stream convolutional neural net (CNN) with Motion History Image (MHI)
Stars: ✭ 49 (+44.12%)
Mutual labels:  keras-tensorflow
Detection-of-Small-Flying-Objects-in-UAV-Videos
Code for paper "Detection of Flying Honeybees in UAV Videos"
Stars: ✭ 47 (+38.24%)
Mutual labels:  keras-tensorflow
Real-Time-Violence-Detection-in-Video-
No description or website provided.
Stars: ✭ 54 (+58.82%)
Mutual labels:  keras-tensorflow
keras-complex
Keras-Tensorflow implementation of complex-valued convolutional neural networks
Stars: ✭ 96 (+182.35%)
Mutual labels:  keras-tensorflow
extra keras datasets
📃🎉 Additional datasets for tensorflow.keras
Stars: ✭ 20 (-41.18%)
Mutual labels:  keras-tensorflow
fashion-parser
Fashion item segmentation with deep learning
Stars: ✭ 22 (-35.29%)
Mutual labels:  keras-tensorflow
stocktwits-sentiment
Stocktwits market sentiment analysis in Python with Keras and TensorFlow.
Stars: ✭ 23 (-32.35%)
Mutual labels:  keras-tensorflow

Konverter

Convert your Keras models into pure Python 🐍+ NumPy.

The goal of this tool is to provide a quick and easy way to execute Keras models on machines or setups where utilizing TensorFlow/Keras is impossible. Specifically, in my case, to replace SNPE (Snapdragon Neural Processing Engine) for inference on phones with Python.

Supported Keras Model Attributes

  • Models:
    • Sequential
  • Layers:
    • Dense
    • Dropout
      • Will be ignored during inference (SNPE 1.19 does NOT support dropout with Keras!)
    • SimpleRNN
      • Batch predictions do not currently work correctly.
    • GRU
      • Important: The current GRU support is based on GRU v3 in tf.keras 2.1.0. It will not work correctly with older versions of TensorFlow if not using implementation=2 (example).
      • Batch prediction untested
    • BatchNormalization
      • Works with all supported layers
  • Activations:
    • ReLU
    • LeakyReLU (supports custom alphas)
    • Sigmoid
    • Softmax
    • Tanh
    • Linear/None
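
For reference, here is a minimal sketch of a tf.keras model built only from the attributes listed above. Layer sizes and the save path are arbitrary placeholders, not anything required by Konverter:

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu', input_shape=(3,)),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),  # ignored by Konverter at inference time
    tf.keras.layers.Dense(16, activation='tanh'),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')
model.save('examples/test_model.h5')  # the .h5 file is what Konverter takes as input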

Roadmap 🛣

The project's to-do list can be found here.

Features 🤖

  • Super quick conversion of your models. Takes less than a second. 🐱‍👤
  • Usually reduces the size of Keras models by about 69.37%. 👌
  • In some cases, prediction is quicker than Keras or SNPE (dense models). 🏎
    • RNNs: Since we lose the GPU when using NumPy, predictions may be slower
  • Stores the weights and biases of your model in a separate compressed NumPy file. 👇
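
The compressed weights file mentioned in the last bullet is a standard .npz archive, so you can peek inside it with NumPy if you're curious. The path below is assumed from the examples later in this README:

import numpy as np

wb = np.load('examples/test_model_weights.npz', allow_pickle=True)  # object arrays may need allow_pickle=True
print(wb.files)  # names of the arrays stored by Konverter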

Benchmarks 📈

Benchmarks can be found in BENCHMARKS.md.

Installation & Usage 🌍

Install Konverter using pip:

pip install keras-konverter

Konverting using the CLI: 🖥

konverter examples/test_model.h5 examples/test_model.py (py suffix is optional)

Type konverter to get all possible arguments and flags!

  • Arguments 💢:
    • input_model: Either the location of your tf.keras .h5 model, or a preloaded Sequential model if using Konverter from Python. This is required
    • output_file: Optional file path for your output model, along with the weights file. Default is same name, same directory
  • Flags 🎌:
    • --indent, -i: How many spaces to use for indentation, default is 2
    • --silent, -s: Whether you want Konverter to silently Konvert
    • --no-watermark, -nw: Removes the watermark prepended to the output model file
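
Putting the flags above together, a full invocation might look something like this (paths are placeholders, and the exact syntax for passing a value to --indent is an assumption):

konverter examples/test_model.h5 examples/test_model --indent 2 --silent --no-watermark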

Konverting programmatically: 🤖

All parameters with defaults: konverter.konvert(input_model, output_file=None, indent=2, silent=False, no_watermark=False, tf_verbose=False)

>>> import konverter
>>> konverter.konvert('examples/test_model.h5', output_file='examples/test_model')

Note: The model file will be saved as f'{output_file}.py' and the weights will be saved as f'{output_file}_weights.npz' in the same directory. Make sure to change the path inside the model wrapper if you move the files after Konversion.
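
As noted in the arguments above, input_model can also be a preloaded Sequential model instead of a file path. A sketch of that flow, with the model loaded via tf.keras purely for illustration:

>>> import tensorflow as tf
>>> import konverter
>>> model = tf.keras.models.load_model('examples/test_model.h5')
>>> konverter.konvert(model, output_file='examples/test_model', silent=True)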


That's it! If your model is supported (check Supported Keras Model Attributes), then your newly converted Konverter model should be ready to go.

To predict: Import your model wrapper and run the predict() function. Always double check that the outputs closely match your Keras model's; automatic verification will come soon. For the integrity of the predictions, always make sure your input is an np.float32 array.

import numpy as np
from examples.test_model import predict
predict([np.random.rand(3).astype(np.float32)])
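
Until automatic verification arrives, a quick manual sanity check could look something like this (a sketch using the example model paths and input shape from above):

import numpy as np
import tensorflow as tf
from examples.test_model import predict  # the Konverted wrapper from above

keras_model = tf.keras.models.load_model('examples/test_model.h5')
sample = np.random.rand(3).astype(np.float32)
keras_out = keras_model.predict(np.array([sample]))
konverter_out = predict([sample])
print(np.allclose(keras_out, konverter_out, atol=1e-5))  # should print True if the Konversion is faithful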

See limitations and issues.

Demo

Dependencies

Thanks to @apiad you can now use Poetry to install all the needed dependencies for this tool! However, the requirements are a pretty short list:

  • It seems most versions of TensorFlow that include Keras work perfectly fine; versions 1.14 through 2.2 have been tested with GitHub Actions and no issues have occurred. (Make sure you use implementation 2/v3 with GRU layers if not on TF 2.x.)
    • Important: You must create your models with tf.keras currently (not keras)
  • Python >= 3.6 (for the glorious f-strings!)

To install all needed dependencies, simply cd into the base directory of Konverter, and run:

poetry install --no-dev

If you would like to use this version of Konverter (not the one from pip), you may also need to run poetry shell afterwards to enter Poetry's virtualenv. If you go down this path, make sure to remove --no-dev so TensorFlow installs in the venv!

Current Limitations and Issues 😬

  • Dimensionality of input data:

    When working with models using softmax, the dimensionality of the input data matters. For example, predicting on the same data with different input dimensionality sometimes results in different outputs:

    >>> model.predict([[1, 3, 5]])  # keras model, correct output
    array([[14.792273, 15.59787 , 15.543163]])
    >>> predict([[1, 3, 5]])  # Konverted model, wrong output
    array([[11.97839948, 18.09931636, 15.48014805]])
    >>> predict([1, 3, 5])  # And correct output
    array([14.79227209, 15.59786987, 15.54316282])

    If trying a batch prediction with classes and softmax, the model fails completely:

    >>> predict([[0.5], [0.5]])
    array([[0.5, 0.5, 0.5, 0.5], [0.5, 0.5, 0.5, 0.5]])

    Always double check that predictions are working correctly before deploying the model.

  • Batch prediction with SimpleRNN (and possibly all RNN) layers

    Currently, the converted model has no way of determining whether you're feeding a single prediction or a batch of predictions, and it will fail to give the correct output in certain cases (more likely with recurrent layers and softmax dense output layers). Support will be added soon.
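
    Until batch handling is added, one workaround is to feed samples one at a time and collect the outputs yourself. A sketch, where the wrapper import and input shape are placeholders based on the examples above:

    import numpy as np
    from examples.test_model import predict  # generated wrapper, path assumed

    samples = np.random.rand(8, 3).astype(np.float32)  # 8 independent samples
    outputs = [predict([s]) for s in samples]  # one call per sample instead of one batched call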
