
quic / sense-iOS

License: MIT
Enhance your iOS app with the ability to see and interact with humans using the RGB camera.

Programming Languages

Swift, Ruby

Projects that are alternatives to or similar to sense-iOS

Coreml Training
Source code for my blog post series "On-device training with Core ML"
Stars: ✭ 77 (+305.26%)
Mutual labels:  edge-computing, coreml
motor-defect-detector-python
Predict performance issues with manufacturing equipment motors. Perform local or cloud analytics of the issues found, and then display the data on a user interface to determine when failures might arise.
Stars: ✭ 24 (+26.32%)
Mutual labels:  edge-computing
EdgeSim
Simulate a realistic environment for edge computing and edge caching experiments.
Stars: ✭ 53 (+178.95%)
Mutual labels:  edge-computing
zenoh-c
A zenoh client library written in C, targeting microcontrollers.
Stars: ✭ 28 (+47.37%)
Mutual labels:  edge-computing
faas-sim
A framework for trace-driven simulation of serverless Function-as-a-Service platforms
Stars: ✭ 33 (+73.68%)
Mutual labels:  edge-computing
gesto
Set up drag and pinch events in any browser.
Stars: ✭ 47 (+147.37%)
Mutual labels:  gesture-recognition
MicrosoftCloudWorkshop-Asia
Microsoft Cloud Workshop Asia for Intelligent Cloud / Intelligent Edge
Stars: ✭ 20 (+5.26%)
Mutual labels:  edge-computing
nn-Meter
A DNN inference latency prediction toolkit for accurately modeling and predicting the latency on diverse edge devices.
Stars: ✭ 211 (+1010.53%)
Mutual labels:  edge-computing
Remote-Appliance-Control-using-Face-Gestures
Developed a pipeline to remotely control appliances using minimal face gestures and neck movements.
Stars: ✭ 14 (-26.32%)
Mutual labels:  gesture-recognition
ambianic-edge
The core runtime engine for Ambianic Edge devices.
Stars: ✭ 98 (+415.79%)
Mutual labels:  edge-computing
BootFinder
Boot Finder demonstrates the power of using on-device machine learning models to delight users in new and innovative ways. It's private too! Because this model runs on-device, customer photos never leave the phone!
Stars: ✭ 34 (+78.95%)
Mutual labels:  coreml
intruder-detector-python
Build an application that alerts you when someone enters a restricted area. Learn how to use models for multiclass object detection.
Stars: ✭ 16 (-15.79%)
Mutual labels:  edge-computing
AdvantEDGE
AdvantEDGE, Mobile Edge Emulation Platform
Stars: ✭ 36 (+89.47%)
Mutual labels:  edge-computing
YOLOv3-CoreML
YOLOv3 for iOS implemented using CoreML.
Stars: ✭ 166 (+773.68%)
Mutual labels:  coreml
iOS-CoreML-Inceptionv3
Real-time Object Recognition using Apple's CoreML 2.0 and Vision API
Stars: ✭ 46 (+142.11%)
Mutual labels:  coreml
SentimentVisionDemo
🌅 iOS11 demo application for visual sentiment prediction.
Stars: ✭ 34 (+78.95%)
Mutual labels:  coreml
MLEdgeDeploy
Automatic Over the Air Deployment of Improved Machine Learning Models to IoT Devices for Edge Processing
Stars: ✭ 26 (+36.84%)
Mutual labels:  coreml
object-size-detector-python
Monitor mechanical bolts as they move down a conveyor belt. When a bolt of an irregular size is detected, this solution emits an alert.
Stars: ✭ 26 (+36.84%)
Mutual labels:  edge-computing
rpi-urban-mobility-tracker
The easiest way to count pedestrians, cyclists, and vehicles on edge computing devices or live video feeds.
Stars: ✭ 75 (+294.74%)
Mutual labels:  edge-computing
ML-MCU
Code for the IoT Journal paper titled 'ML-MCU: A Framework to Train ML Classifiers on MCU-based IoT Edge Devices'
Stars: ✭ 28 (+47.37%)
Mutual labels:  edge-computing

State-of-the-art Real-time Action Recognition for iOS


Website | Blogpost | Getting Started | Deploy Your Own Classifier | Datasets | SDK License



This repository contains the iOS version of sense, which allows you to build an iOS demo app running the PyTorch models after converting them to TensorFlow Lite using the provided script.

You can convert and deploy the existing gesture detection model as is, or use the transfer learning script in sense to train your own custom classifier on top of it. More models will be supported soon.

The model uses an EfficientNet backbone and was confirmed to run smoothly on iOS devices with an A11 chip or newer (e.g. iPhone 8 or later); it may also work on devices with an A10 chip (e.g. iPad 6/7, iPhone 7).


Requirements and Installation

The following steps will help you install the necessary components and get up and running with your project.

Step 1: Clone this repository

To begin, clone this repository to a local directory of your choice:

git clone https://github.com/TwentyBN/sense-iOS.git

Step 2: Clone and install the sense repository

You will also need to clone sense (we will use it to convert PyTorch models to TensorFlow Lite):

git clone https://github.com/TwentyBN/sense.git
cd sense

Next, follow the instructions for sense to install its dependencies.

Step 3: Download our pre-trained models

You will need to download our pre-trained models to build the demo application. Once again, please follow the instructions in sense to access them (you will have to create an account and agree to our terms and conditions).

Step 4: Install the pods

This project relies on CocoaPods to install TensorFlow Lite. If you don't have CocoaPods installed on your Mac, you can install it using Homebrew:

brew install cocoapods

You then need to install the pods by running the following command:

# If you are in sense-iOS root directory:
pod install
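
After the install completes, the TensorFlow Lite Swift module becomes available to the app target. As a quick sanity check, here is a minimal sketch (illustrative code, not part of this repo; it assumes the Podfile pulls in the TensorFlowLiteSwift pod):

import TensorFlowLite  // module provided by the TensorFlowLiteSwift pod (assumed)

// Build an interpreter for a .tflite file on disk. The path is a placeholder
// here; the actual model file is added in the Getting Started steps below.
func makeInterpreter(modelPath: String) throws -> Interpreter {
    let interpreter = try Interpreter(modelPath: modelPath)
    try interpreter.allocateTensors()  // reserve buffers for input/output tensors
    return interpreter
}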

Getting Started

This section will explain how you can deploy our pre-trained models, or your own custom model, to an iOS application.

Step 1: Convert a PyTorch model to TensorFlow Lite

The iOS demo requires a TensorFlow Lite version of our model checkpoint, which you can produce using the conversion script provided in sense. For our pre-trained gesture control model, run:

python tools/conversion/convert_to_tflite.py --backbone=efficientnet --classifier=efficient_net_gesture_control --output_name=model

You should now have the following TensorFlow Lite file: sense/resources/model_conversion/model.tflite.

Step 2: Move the converted model to the correct location

The TensorFlow Lite file created in the last step can now be moved from sense to the following location in sense-iOS: sense-iOS/sense-iOS/model.tflite

# If you are in sense
mv ./resources/model_conversion/model.tflite ../sense-iOS/sense-iOS/model.tflite
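
Once moved, the file ships inside the app bundle. The following sketch (illustrative only, not code from this repo, and assuming the TensorFlowLiteSwift pod from the installation steps) loads the bundled model and runs one pass on zeroed input, a quick way to confirm the conversion produced a loadable model:

import Foundation
import TensorFlowLite

// Smoke test: load the bundled model.tflite and run one inference on
// an all-zero input buffer of the correct byte size.
func smokeTestModel() throws {
    guard let path = Bundle.main.path(forResource: "model", ofType: "tflite") else {
        fatalError("model.tflite is missing from the app bundle")
    }
    let interpreter = try Interpreter(modelPath: path)
    try interpreter.allocateTensors()

    let input = try interpreter.input(at: 0)
    try interpreter.copy(Data(count: input.data.count), toInputAt: 0)
    try interpreter.invoke()

    let output = try interpreter.output(at: 0)
    print("Output shape:", output.shape.dimensions)
}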

Step 3: Build the project

You can now open the iOS project in Xcode (use the workspace file generated by CocoaPods rather than the project file) and build it to your device. Have fun!


Deploy your own classifier

Using our transfer learning script, it is possible to fine-tune our model to your own classification tasks. If you do so, you'll have to reflect the new outputs in the following file in the iOS project:

sense-iOS/sensenet_labels.json

By default, the dictionary in sensenet_labels.json contains the labels our model was trained on for the gesture control task. Replace these with the contents of the label2int.json file produced during training.
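
For example, assuming sensenet_labels.json holds a flat label-to-index dictionary in the same format as label2int.json (check the file shipped with this repo), the app could decode it with a hypothetical helper like this (not code from this repo):

import Foundation

// Decode sensenet_labels.json from the app bundle, assumed to map label
// strings to integer class indices, mirroring label2int.json.
func loadLabels() throws -> [String: Int] {
    guard let url = Bundle.main.url(forResource: "sensenet_labels", withExtension: "json") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let data = try Data(contentsOf: url)
    return try JSONDecoder().decode([String: Int].self, from: data)
}

// Usage: invert the mapping so a model output index resolves to a label.
do {
    let label2int = try loadLabels()
    let int2label = Dictionary(uniqueKeysWithValues: label2int.map { ($1, $0) })
    print(int2label[0] ?? "unknown label")
} catch {
    print("Failed to load labels:", error)
}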


Citation

We now have a blogpost you can cite:

@misc{sense2020blogpost,
    author = {Guillaume Berger and Antoine Mercier and Florian Letsch and Cornelius Boehm and 
              Sunny Panchal and Nahua Kang and Mark Todorovich and Ingo Bax and Roland Memisevic},
    title = {Towards situated visual AI via end-to-end learning on video clips},
    howpublished = {\url{https://medium.com/twentybn/towards-situated-visual-ai-via-end-to-end-learning-on-video-clips-2832bd9d519f}},
    note = {online; accessed 23 October 2020},
    year = {2020},
}

License

The code is copyright (c) 2020 Twenty Billion Neurons GmbH under an MIT License. See the file LICENSE for details. Note that this license only covers the source code of this repo. Pre-trained weights come with a separate license, available here.
