
bm371613 / gest

License: GPL-3.0
Hand gestures as an input device

Programming Languages

Python

Projects that are alternatives to or similar to gest

gesture-recognition-for-human-robot-interaction
Gesture Recognition For Human-Robot Interaction with modelling, training, analysing and recognising gestures based on computer vision and machine learning techniques. This work was done at Distributed Artificial Intelligence Lab (DAI Labor), Berlin.
Stars: ✭ 62 (+67.57%)
Mutual labels:  gesture-recognition, hand-gestures
Intel-Realsense-Hand-Toolkit-Unity
Intel Realsense Toolkit for Hand tracking and Gestural Recognition on Unity3D
Stars: ✭ 72 (+94.59%)
Mutual labels:  gesture-recognition, hand-gestures
mGesf
A sensor fusion approach to the recognition of microgestures.
Stars: ✭ 21 (-43.24%)
Mutual labels:  gesture-recognition
Kinect-Vision
A computer-vision-based gesture detection system that automatically detects the number of fingers shown as a hand gesture and enables you to control simple button-pressing games using your hand gestures.
Stars: ✭ 47 (+27.03%)
Mutual labels:  gesture-recognition
Remote-Appliance-Control-using-Face-Gestures
Developed a pipeline to remotely control appliances using minimal face gestures and neck movements.
Stars: ✭ 14 (-62.16%)
Mutual labels:  gesture-recognition
mmwave-gesture-recognition
Basic Gesture Recognition Using mmWave Sensor - TI AWR1642
Stars: ✭ 32 (-13.51%)
Mutual labels:  gesture-recognition
mapbox-gestures-android
The Mapbox Gestures for Android library makes it easy to detect and handle user gestures on an Android device.
Stars: ✭ 25 (-32.43%)
Mutual labels:  gesture-recognition
gestop
A tool to navigate the desktop with hand gestures. Builds on mediapipe.
Stars: ✭ 20 (-45.95%)
Mutual labels:  hand-gestures
GIMLeT
GIMLeT – Gestural Interaction Machine Learning Toolkit
Stars: ✭ 33 (-10.81%)
Mutual labels:  gesture-recognition
gesto
You can set up drag and pinch events in any browser.
Stars: ✭ 47 (+27.03%)
Mutual labels:  gesture-recognition
TriangleGAN
TriangleGAN, ACM MM 2019.
Stars: ✭ 28 (-24.32%)
Mutual labels:  gesture-recognition
Openpose-based-GUI-for-Realtime-Pose-Estimate-and-Action-Recognition
GUI based on the Python API of OpenPose on Windows, using CUDA 10 and cuDNN 7. Supports body, hand, and face keypoint estimation and data saving. Real-time gesture recognition is realized through a two-layer neural network based on the skeletons collected from the GUI.
Stars: ✭ 69 (+86.49%)
Mutual labels:  gesture-recognition
sign-language
Android application which uses feature extraction algorithms and machine learning (SVM) to recognise and translate static sign language gestures.
Stars: ✭ 35 (-5.41%)
Mutual labels:  gesture-recognition
btt
Low-level macOS management in JavaScript via BetterTouchTool
Stars: ✭ 92 (+148.65%)
Mutual labels:  gesture-recognition
Magic-Leap-Gesture-IoT-Example
Control lights in the physical world from the augmented world using hand gestures. Using Magic Leap Hand Poses (Gestures) and PubNub.
Stars: ✭ 18 (-51.35%)
Mutual labels:  gesture-recognition
spockpy
✊ ✋ ✌️ ☝️ 🖖 A Python hand gesture recognition library for Kinetic User Interface (KUI).
Stars: ✭ 50 (+35.14%)
Mutual labels:  hand-gestures
sense-iOS
Enhance your iOS app with the ability to see and interact with humans using the RGB camera.
Stars: ✭ 19 (-48.65%)
Mutual labels:  gesture-recognition
Gesture-Recognition
Recognize gestures using a simple webcam.
Stars: ✭ 27 (-27.03%)
Mutual labels:  gesture-recognition
Touchegg
Linux multi-touch gesture recognizer
Stars: ✭ 2,241 (+5956.76%)
Mutual labels:  gesture-recognition
san
The official PyTorch implementation of "Context Matters: Self-Attention for Sign Language Recognition"
Stars: ✭ 17 (-54.05%)
Mutual labels:  gesture-recognition

gest

Hand gestures as an input device

[image: example]

Why

For health-related reasons, I had to stop using a mouse and a keyboard. Talon allowed me to type with my voice and move the cursor with my eyes. This project was started to complement that setup with hand gestures.

Development status

The project is in an early stage of development. I use it on a daily basis, so it should be good enough for some users.

What is implemented:

  • pinching gesture recognition, in one hand orientation
  • heatmap output, separate for the left and right hand, indicating the position of the pinched point
  • demo for testing recognition models
  • example script for simulating mouse clicks and scrolling
  • scripts for producing and reviewing training data

Bias

The gesture recognition model was trained on images of my hands, taken with my hardware in my working environment, so it is probably heavily biased. I hope that people who want to use it, but are held back by recognition quality, will capture some images of their hands with the included tooling and donate them to the project, so that over time it works well for everyone.

Installation

Use Python 3.6, 3.7, or 3.8 and, in a virtual environment, run

pip install gest

If you clone this repository, you can install the exact versions of the required libraries that I use with Poetry:

poetry install
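
To confirm that the package is importable in the active environment, a quick check along these lines can help (a minimal sketch; it only verifies the install, not the model or camera):

import importlib.util

# Report whether the gest package can be found in the current environment.
if importlib.util.find_spec("gest") is None:
    print("gest is not installed in this environment")
else:
    print("gest is installed")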

Walkthrough

Demo

First check how the included model works for you. Run

python -m gest.demo

and see if it recognizes your gestures as shown here:

[image: demo]

If you have multiple cameras, you can pick one, for example:

python -m gest.demo --camera 2

Camera numbers are not necessarily consecutive. Two cameras may be accessible as 0 and 2. This option is supported by other commands as well.
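
If you are not sure which camera numbers exist on your machine, a small probe like the one below lists the indices that OpenCV can open (a sketch, not part of gest; it assumes the opencv-python package is available):

import cv2  # opencv-python

# Try the first few camera indices and report which ones can be opened.
for index in range(5):
    capture = cv2.VideoCapture(index)
    if capture.isOpened():
        print(f"camera {index} is available")
    capture.release()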

Example script

In the presentation at the top, I am running

python -m gest.examples.two_handed_scroll_and_click

It only acts when it detects both hands pinching, and then chooses an action based on their relative position (a sketch of this logic follows the list):

  • double clicks if you cross your hands
  • scrolls up or down if your hands pinch at different heights
  • left clicks if your hands (almost) touch
  • right clicks if your hands are at the same height but not close horizontally (this action is delayed by a fraction of a second to prevent accidental use)
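
The decision logic corresponds roughly to the sketch below. This is not the actual example script: the normalized pinch positions, the thresholds, and the pyautogui calls standing in for the mouse actions are all illustrative assumptions, and which arrangement counts as "crossed" depends on whether the camera image is mirrored.

import pyautogui  # assumed here for the mouse actions; the real script may differ

def act(left, right, touch_threshold=0.1, height_threshold=0.05):
    # left and right are assumed to be normalized (x, y) pinch positions,
    # with x growing to the right in a mirrored view.
    lx, ly = left
    rx, ry = right
    if lx > rx:                            # hands crossed
        pyautogui.doubleClick()
    elif abs(ly - ry) > height_threshold:  # pinching at different heights
        pyautogui.scroll(120 if ly > ry else -120)  # direction is illustrative
    elif abs(lx - rx) < touch_threshold:   # hands (almost) touch
        pyautogui.click()
    else:                                  # same height, horizontally apart
        pyautogui.rightClick()             # the real script delays this action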

Controlling CPU load

For everyday use, you don't want to dedicate too many resources to gesture recognition. You can limit the CPU load by setting OMP_NUM_THREADS, as in

OMP_NUM_THREADS=2 python -m gest.examples.two_handed_scroll_and_click

Try different values to find a balance between responsiveness and CPU load.
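
The same limit can be applied from inside a custom script, as long as the variable is set before the libraries that read it are imported (a sketch; "2" is only an example value):

import os

# Must run before importing the numeric libraries that honour OMP_NUM_THREADS.
os.environ.setdefault("OMP_NUM_THREADS", "2")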

Custom scripts

The demo and example scripts serve two additional purposes: they can be used as templates for custom scripts, and they define the public API for the purposes of semantic versioning.
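
One convenient way to use the demo as a template is to locate its source in the installed package and copy it (a sketch; it only prints the module path, without running the demo):

import importlib.util

# Find where gest.demo lives so its source can be copied as a starting point.
spec = importlib.util.find_spec("gest.demo")
print(spec.origin if spec else "gest.demo not found")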

Training data annotation

Capturing

python -m gest.annotation.capture --countdown 5 data_directory

will help you create annotated images. Once you start automatic annotation (press a to start or stop), it will ask you to pinch a given point with your left or right hand, or not to pinch at all ("background").

You will have 5 seconds before the image is captured (as set by the --countdown option).

You will also see the last annotated image for quick review. It can be deleted with d.

Reviewing

python -m gest.annotation.review --time 1 data_directory closed_pinch_left

will let you review all images annotated as a left-hand pinch in data_directory, showing each one for 1 second once you start automatic advancing with a (press a again to stop). Otherwise, you can go to the next or previous image with n or p. Delete incorrectly annotated images with d.

You should also review closed_pinch_right and background.
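
To step through all three categories in one go, the review command can simply be run in a loop (a small convenience sketch around the command shown above):

import subprocess
import sys

# Review each annotation category in turn, using the same Python interpreter.
for category in ("closed_pinch_left", "closed_pinch_right", "background"):
    subprocess.run(
        [sys.executable, "-m", "gest.annotation.review",
         "--time", "1", "data_directory", category],
        check=True,
    )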

Annotation guidelines

It makes sense to annotate realistic training data that the model performs poorly on, for example when

  • it mistakenly detects a pinch when you pick up the phone,
  • it doesn't detect pinching when you wear a skin-colored shirt.

If it performs poorly overall, it's good to capture images in many short sessions, with different lighting, clothes, backgrounds, and camera angles.

The point, though, isn't to look for tricky cases or to stretch the definition of the pinching gesture to include a different hand orientation (e.g. with the pinching fingers pointing towards the camera).

Donating annotated data

Contact me at [email protected].
