exelban / myo-armband-nn

License: GPL-3.0
Gesture recognition using myo armband via neural network (tensorflow library).

Programming Languages

python

Projects that are alternatives of or similar to myo-armband-nn

pyomyo
PyoMyo - Python Opensource Myo armband library
Stars: ✭ 61 (+84.85%)
Mutual labels:  myo, myo-armband
myo-ruby
Connect to Myo armband in Ruby
Stars: ✭ 14 (-57.58%)
Mutual labels:  myo
GestureControlledCamera2D
A Camera2D node controlled through gestures. It's also an example of how to use the Godot Touch Input Manager.
Stars: ✭ 39 (+18.18%)
Mutual labels:  gesture
HandTrackingGestureRecorder
Unity script to record any gesture with your own hands
Stars: ✭ 74 (+124.24%)
Mutual labels:  gesture
GIMLeT
GIMLeT – Gestural Interaction Machine Learning Toolkit
Stars: ✭ 33 (+0%)
Mutual labels:  gesture
simple gesture detector
Easy to use, reliable and lightweight gesture detector for Flutter apps, exposing simple API for basic gestures
Stars: ✭ 26 (-21.21%)
Mutual labels:  gesture
Swipycell
Easy to use UITableViewCell implementing swiping to trigger actions.
Stars: ✭ 230 (+596.97%)
Mutual labels:  gesture
mapbox-gestures-android
The Mapbox Gestures for Android library makes it easy to detect and handle user gestures on an Android device.
Stars: ✭ 25 (-24.24%)
Mutual labels:  gesture
react-native-gesture-flip-card
A card flipping animation component using gesture for react-native.
Stars: ✭ 93 (+181.82%)
Mutual labels:  gesture
Sensitive
Special way to work with gestures in iOS
Stars: ✭ 549 (+1563.64%)
Mutual labels:  gesture
fusuma-plugin-tap
Tap and Hold gestures plugin for Fusuma
Stars: ✭ 16 (-51.52%)
Mutual labels:  gesture
myo-android-2048
The game 2048, playable with a Myo
Stars: ✭ 32 (-3.03%)
Mutual labels:  myo
gestures
A library for normalized events and gesture for desktop and mobile.
Stars: ✭ 31 (-6.06%)
Mutual labels:  gesture
react-native-pinchable
Instagram like pinch to zoom for React Native
Stars: ✭ 187 (+466.67%)
Mutual labels:  gesture
gestures-android
Gesture recognizers for Android
Stars: ✭ 18 (-45.45%)
Mutual labels:  gesture
Transferee
A magical framework that helps you make a seamless transition from a thumbnail view to the original view.
Stars: ✭ 2,697 (+8072.73%)
Mutual labels:  gesture
Intel-Realsense-Hand-Toolkit-Unity
Intel Realsense Toolkit for Hand tracking and Gestural Recognition on Unity3D
Stars: ✭ 72 (+118.18%)
Mutual labels:  gesture
liveGestureDemo
Imitation of Inke-style dual-screen live streaming, with OpenCV portrait detection and face stickers.
Stars: ✭ 26 (-21.21%)
Mutual labels:  gesture
iMoney
iMoney finance project
Stars: ✭ 55 (+66.67%)
Mutual labels:  gesture
Gestures-Samples
Project Prague Code Samples
Stars: ✭ 98 (+196.97%)
Mutual labels:  gesture

Archived project. No maintenance.

This project is no longer maintained and has been archived, because Myo production and sales officially ended on Oct 12, 2018. Feel free to fork and make your own changes if needed.

Thanks to everyone for their valuable feedback.

myo-armband-nn

Gesture recognition using myo armband via neural network (tensorflow library).

Requirements

Library      Version
Python       ^3.5
Tensorflow   ^1.1.0
Numpy        ^1.12.0
sklearn      ^0.18.1
myo-python   ^0.2.2

Collecting data

You can use your own scripts for collecting EMG data from the Myo armband, but you need to push a 64-value array with data from each sensor.
By default, myo-python returns an 8-value array from the 8 sensors. Each output is a 2-value array: [datetime, [EMG DATA]].
A 64-value array is simply 8 consecutive outputs from the armband flattened into one dimension, so you just need to collect 8 readings for a gesture from the armband (if you read data 10 times per second, that is not a problem).
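
A minimal numpy sketch of that flattening, assuming 8 consecutive [datetime, [EMG DATA]] readings; the readings here are synthetic stand-ins, not real armband output:

```python
import numpy as np

# Eight consecutive myo-python readings of the form [datetime, [8 EMG values]].
# Synthetic stand-in data for illustration only.
readings = [["2017-01-01T00:00:00", list(np.random.randint(-128, 128, 8))]
            for _ in range(8)]

emg_only = [emg for _, emg in readings]   # drop the timestamps
sample = np.array(emg_only).reshape(-1)   # flatten 8 x 8 into one 64-value array
assert sample.shape == (64,)
```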

The repo contains a dataset that I collected from the Myo armband. The dataset contains only 5 gestures:

👍 - Ok    (1)
✊️ - Fist  (2)
✌️ - Like  (3)
🤘 - Rock  (4)
🖖 - Spock (5)

Training network

python3 train.py

75k iterations take about 20 min on a GTX 960 or 2 h on an i3-6100.

Accuracy after ~75k iterations: 98.75%

Loss after ~75k iterations: 1.28

Prediction

Prediction on data from MYO armband

python3 predict.py

You must have the Myo SDK installed. The script will return a number (0-5) which represents the gesture (0 - relaxed arm).
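
As a rough illustration of what such a prediction step looks like with a TensorFlow 1.x checkpoint, here is a sketch; the checkpoint path and the tensor names ("x:0", "output:0") are assumptions for illustration and are not necessarily the ones used by predict.py:

```python
import numpy as np
import tensorflow as tf

CKPT = "checkpoints/model.ckpt"  # hypothetical checkpoint path

# One 64-value EMG window (synthetic stand-in for real armband data).
sample = np.random.randint(-128, 128, size=(1, 64)).astype(np.float32)

with tf.Session() as sess:
    saver = tf.train.import_meta_graph(CKPT + ".meta")  # rebuild the saved graph
    saver.restore(sess, CKPT)                           # load the trained weights
    graph = tf.get_default_graph()
    x = graph.get_tensor_by_name("x:0")                 # assumed input placeholder name
    output = graph.get_tensor_by_name("output:0")       # assumed output tensor name
    gesture = int(np.argmax(sess.run(output, feed_dict={x: sample})))
    print(gesture)  # 0 - relaxed arm, 1-5 - gestures listed above
```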

Prediction on training dataset

python3 predict_train_dataset.py

Example output:

Accuracy on Test-Set: 98.27% (19235 / 19573)
[2438    5    9    6    4   20] (0) Relax
[   4 2652   45    1    3    9] (1) Ok
[   8   44 4989    1    1    9] (2) Fist
[   8    2    2 4152   28   13] (3) Like
[   2    5    6   27 1839    1] (4) Rock
[  14   22   13   21    5 3165] (5) Spock
 (0) (1) (2) (3) (4) (5)

I know that making predictions on the training dataset is wrong, but I didn't have time to make a testing dataset.
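
For reference, an accuracy figure and confusion matrix like the ones above can be computed with sklearn (already in the requirements); the label arrays below are placeholders for the real dataset and the network's predictions:

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix

# Placeholder labels: in practice y_true comes from the dataset and
# y_pred from the network's predictions on the same samples.
y_true = np.random.randint(0, 6, size=19573)
y_pred = y_true.copy()

print("Accuracy on Test-Set: {:.2%}".format(accuracy_score(y_true, y_pred)))
print(confusion_matrix(y_true, y_pred))  # rows: true class, columns: predicted class
```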

Model

Fully connected 1 (528 neurons)
ReLu
Fully connected 2 (786 neurons)
ReLu
Fully connected 3 (1248 neurons)
ReLu
Dropout
Softmax_linear
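
A minimal TensorFlow 1.x sketch of the layer stack listed above; the input size (64), number of classes (6), and variable/tensor names are assumptions for illustration and may differ from the repo's actual code:

```python
import tensorflow as tf

# Assumed shapes: 64 EMG values in, 6 classes out (relax + 5 gestures).
x = tf.placeholder(tf.float32, [None, 64], name="x")
keep_prob = tf.placeholder(tf.float32, name="keep_prob")  # dropout keep probability

fc1 = tf.layers.dense(x, 528, activation=tf.nn.relu, name="fc1")
fc2 = tf.layers.dense(fc1, 786, activation=tf.nn.relu, name="fc2")
fc3 = tf.layers.dense(fc2, 1248, activation=tf.nn.relu, name="fc3")
drop = tf.nn.dropout(fc3, keep_prob=keep_prob)
logits = tf.layers.dense(drop, 6, activation=None, name="softmax_linear")
output = tf.nn.softmax(logits, name="output")  # class probabilities
```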

License

GNU General Public License v3.0
