
hpi-xnor / ios-image-classification

License: MIT
No description or website provided.

Programming Languages

C++
36643 projects - #6 most used programming language

Projects that are alternatives to or similar to ios-image-classification

Binary-Neural-Networks
A binary neural network (BNN) implementation that achieves nearly state-of-the-art results while significantly reducing memory usage and total training time.
Stars: ✭ 55 (+358.33%)
Mutual labels:  binary-neural-networks
binary-nets
PyTorch implementation of binary neural networks
Stars: ✭ 39 (+225%)
Mutual labels:  binary-neural-networks
S2-BNN
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021)
Stars: ✭ 53 (+341.67%)
Mutual labels:  binary-neural-networks

MXNet on the iPhone

This is an example project running a binarized neural network on iOS. It can classify the live camera feed.

Requirements

  • iOS device
  • a model (a binarized version pre-trained on ImageNet is included)

Usage

Use Xcode to build the app and try it on your phone!

What it does

Under the hood, the app needs the amalgamated MXNet source, the C predict API headers, and a trained and saved MXNet model; some additional build settings must also be set in Xcode.
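
For orientation, here is a minimal sketch of how an app drives MXNet through the C predict API declared in c_predict_api.h. The input shape, the class count, and the placeholder model contents are illustrative assumptions, not this project's exact code:

```cpp
#include <vector>
#include "c_predict_api.h"  // MXNet's lightweight C prediction interface

int classify_frame() {
    // In the real app these buffers hold the bundled model files:
    // the network JSON and the binary parameter blob.
    const char* symbol_json = "...";   // contents of the *-symbol.json file
    std::vector<char> param_bytes;     // contents of the *.params file

    // One input named "data" with shape 1x3x224x224 (NCHW),
    // a common ImageNet input size (an assumption here).
    const char* input_keys[] = { "data" };
    const mx_uint shape_indptr[] = { 0, 4 };
    const mx_uint shape_data[] = { 1, 3, 224, 224 };

    PredictorHandle pred = nullptr;
    MXPredCreate(symbol_json,
                 param_bytes.data(), static_cast<int>(param_bytes.size()),
                 1 /* dev_type: CPU */, 0 /* dev_id */,
                 1, input_keys, shape_indptr, shape_data, &pred);

    // Feed one preprocessed camera frame and run the forward pass.
    std::vector<mx_float> image(1 * 3 * 224 * 224);
    MXPredSetInput(pred, "data", image.data(),
                   static_cast<mx_uint>(image.size()));
    MXPredForward(pred);

    // Read back the softmax probabilities (1000 ImageNet classes assumed).
    std::vector<mx_float> probs(1000);
    MXPredGetOutput(pred, 0, probs.data(),
                    static_cast<mx_uint>(probs.size()));

    MXPredFree(pred);
    return 0;
}
```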

The script amalgamate_mxnet.sh amalgamates MXNet and applies the changes necessary for iOS, as described in the MXNet amalgamation README.

The Xcode project already contains the preprocessor settings required to build MXNet for iOS:

  • "MXNET_PREDICT_ONLY=1"
  • "MXNET_USE_OPENCV=0"
  • "MSHADOW_USE_CUDA=0"
  • "MSHADOW_USE_SSE=0"
  • "BINARY_WORD_32=1" (set to 32bit for ARM7 devices)
  • "BINARY_WORD_64=0"

Pre-trained models are included in the project.
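
To hand those models to the predictor, the symbol JSON and the parameter blob first have to be read into memory (on iOS the paths would come from the app bundle). A small hedged helper, with error handling omitted:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Slurp a whole file (e.g. the bundled *.params blob) into memory.
static std::vector<char> readAll(const std::string& path) {
    std::ifstream in(path, std::ios::binary | std::ios::ate);
    std::vector<char> buf(static_cast<std::size_t>(in.tellg()));
    in.seekg(0);
    in.read(buf.data(), static_cast<std::streamsize>(buf.size()));
    return buf;
}
```

The resulting buffer (and a null-terminated copy of the symbol JSON) is what MXPredCreate in the sketch above expects.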
