
andypotato / Fingerpose

License: MIT
Finger pose classifier for hand landmarks detected by TensorFlow.js handpose model

Programming Languages

javascript
184084 projects - #8 most used programming language

Projects that are alternatives of or similar to Fingerpose

Poseosc
πŸ“ΉπŸ€Έβ€β™‚οΈπŸ€Ύβ€β™€οΈπŸ€Ί PoseNet + OSC: send realtime human pose estimation data to your apps
Stars: ✭ 40 (-60.78%)
Mutual labels:  pose-estimation
Fight detection
Real time Fight Detection Based on 2D Pose Estimation and RNN Action Recognition
Stars: ✭ 65 (-36.27%)
Mutual labels:  pose-estimation
Ios Openpose
OpenPose Example App
Stars: ✭ 85 (-16.67%)
Mutual labels:  pose-estimation
Pose Tensorflow
Human Pose estimation with TensorFlow framework
Stars: ✭ 1,042 (+921.57%)
Mutual labels:  pose-estimation
Supergluepretrainednetwork
SuperGlue: Learning Feature Matching with Graph Neural Networks (CVPR 2020, Oral)
Stars: ✭ 1,122 (+1000%)
Mutual labels:  pose-estimation
Stag
STag: A Stable Fiducial Marker System
Stars: ✭ 75 (-26.47%)
Mutual labels:  pose-estimation
Hyperpose
HyperPose: A Collection of Real-time Human Pose Estimation
Stars: ✭ 961 (+842.16%)
Mutual labels:  pose-estimation
Pytorch pose proposal networks
Pytorch implementation of pose proposal networks
Stars: ✭ 93 (-8.82%)
Mutual labels:  pose-estimation
Margipose
Stars: ✭ 64 (-37.25%)
Mutual labels:  pose-estimation
Ros Openpose
CMU's OpenPose for ROS
Stars: ✭ 81 (-20.59%)
Mutual labels:  pose-estimation
Carposedemo
Real-time Mobile Car Pose Estimation with CoreML
Stars: ✭ 49 (-51.96%)
Mutual labels:  pose-estimation
Wisppn
test example of paper, Can WiFi Estimate Person Pose?
Stars: ✭ 55 (-46.08%)
Mutual labels:  pose-estimation
Convolutional Pose Machines Pytorch
PyTorch version of Convolutional Pose Machines
Stars: ✭ 77 (-24.51%)
Mutual labels:  pose-estimation
Oat
Real-time position tracker for behavioral research
Stars: ✭ 45 (-55.88%)
Mutual labels:  pose-estimation
Dataset utilities
NVIDIA Dataset Utilities (NVDU)
Stars: ✭ 90 (-11.76%)
Mutual labels:  pose-estimation
Hierarchical Localization
Visual localization made easy with hloc
Stars: ✭ 997 (+877.45%)
Mutual labels:  pose-estimation
Adversarial Pose Pytorch
A PyTorch implementation of adversarial pose estimation for multi-person
Stars: ✭ 67 (-34.31%)
Mutual labels:  pose-estimation
Eao Slam
[IROS 2020] EAO-SLAM: Monocular Semi-Dense Object SLAM Based on Ensemble Data Association
Stars: ✭ 95 (-6.86%)
Mutual labels:  pose-estimation
Awesome Computer Vision
Awesome Resources for Advanced Computer Vision Topics
Stars: ✭ 92 (-9.8%)
Mutual labels:  pose-estimation
Pose Estimation tutorials
Tools and tutorials of pose estimation and deep learning
Stars: ✭ 79 (-22.55%)
Mutual labels:  pose-estimation

fingerpose

Finger pose classifier for hand landmarks detected by TensorFlow.js' handpose model. It can detect hand gestures like "Victory" ✌️ or "Thumbs Up" πŸ‘ inside a webcam source picture. You can define additional hand gestures using gesture descriptions.

"Thumbs up" and "Victory" gestures detected

How it works

Gesture detection works in three steps:

  1. Detect the hand landmarks inside the video picture
  2. Estimate the direction and curl of each individual finger
  3. Compare the result to a set of gesture descriptions

Step (1) is performed by TensorFlow's "handpose" model; steps (2) and (3) are handled by this library.

Installation

Install the module via NPM:

npm i --save fingerpose
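If you use a module bundler instead of script tags, you can then import the library directly (a minimal sketch; the script-tag examples below access the same API through the global fp):

import * as fp from 'fingerpose';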

Usage

A fully working example can be found inside the dist folder. The basic steps are outlined below:

Include "handpose" and this library

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow-models/handpose"></script>
<script src="fingerpose.js" type="text/javascript"></script>

Configure the gesture recognizer with known gestures

// add "✌🏻" and "πŸ‘" as sample gestures
const GE = new fp.GestureEstimator([
    fp.Gestures.VictoryGesture,
    fp.Gestures.ThumbsUpGesture
]);

Use "handpose" to estimate the landmarks

const model = await handpose.load();
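// the second parameter flips the landmarks horizontally,
// which is useful for mirrored (selfie-view) webcam video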
const predictions = await model.estimateHands(video, true);

Estimate the gestures

// using a minimum confidence of 7.5 (out of 10);
// estimateHands() returns an array with one entry per detected hand
const estimatedGestures = GE.estimate(predictions[0].landmarks, 7.5);

The result is an object containing possible gestures and their confidence, for example:

{
    poseData: [ ... ],
    gestures: [
        { name: 'thumbs_up', confidence: 9.25 },
        { ... }
    ]
}
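In most applications you will simply act on the gesture with the highest confidence, for example:

// pick the single best match from the result above
if (estimatedGestures.gestures.length > 0) {
    const best = estimatedGestures.gestures.reduce(
        (a, b) => (a.confidence > b.confidence ? a : b)
    );
    console.log(best.name, best.confidence);
}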

In addition you receive the poseData array, including the raw curl and direction information for each finger. This is useful for debugging purposes as it can help you understand how an individual finger's curl / direction is estimated by the library.

// example for raw pose data
poseData: [
    ['Thumb', 'No Curl', 'Vertical Up'],
    ['Index', 'Half Curl', 'Diagonal Up Right'],
    ...
]
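Putting all of the above together, a continuous detection loop could look like the following sketch (an illustration only, assuming a <video> element with id "video" that is already playing the webcam stream):

async function main() {

    const video = document.querySelector('#video');
    const model = await handpose.load();

    // configure the recognizer with the sample gestures
    const GE = new fp.GestureEstimator([
        fp.Gestures.VictoryGesture,
        fp.Gestures.ThumbsUpGesture
    ]);

    async function detect() {
        const predictions = await model.estimateHands(video, true);
        if (predictions.length > 0) {
            const result = GE.estimate(predictions[0].landmarks, 7.5);
            if (result.gestures.length > 0) {
                console.log(result.gestures);
            }
        }
        // estimate again on the next animation frame
        requestAnimationFrame(detect);
    }

    detect();
}

main();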

Define your own gestures

You can create any number of hand gestures for this library to recognize. To see how a gesture is described, have a look at the included sample gestures Victory and Thumbs Up.

A gesture is defined by describing the expected curl and direction of each individual finger. For example, a "Thumbs Up" gesture is defined by the stretched-out thumb pointing up while all other fingers are curled and point to the left or right πŸ‘.

To describe gestures, you can use the provided Finger Description Constants:

Finger   Name
0        Finger.Thumb
1        Finger.Index
2        Finger.Middle
3        Finger.Ring
4        Finger.Pinky

Probably no further explanation is required for finger names... πŸ‘‹

Curl   Name
0      FingerCurl.NoCurl
1      FingerCurl.HalfCurl
2      FingerCurl.FullCurl

You can refer to the images below for an example of how the index finger is curled (no curl, half curl, full curl):

[Images: No curl | Half curl | Full curl]

Direction   Name
0           Vertical Up πŸ‘†
1           Vertical Down πŸ‘‡
2           Horizontal Left πŸ‘ˆ
3           Horizontal Right πŸ‘‰
4           Diagonal Up Right ↗️
5           Diagonal Up Left ↖️
6           Diagonal Down Right β†˜οΈ
7           Diagonal Down Left ↙️

Example: Thumbs down gesture description πŸ‘Ž

First create a new GestureDescription object:

const thumbsDownGesture = new fp.GestureDescription('thumbs_down');

Expect the thumb to be stretched out and pointing down:

thumbsDownGesture.addCurl(fp.Finger.Thumb, fp.FingerCurl.NoCurl, 1.0);
thumbsDownGesture.addDirection(fp.Finger.Thumb, fp.FingerDirection.VerticalDown, 1.0);
thumbsDownGesture.addDirection(fp.Finger.Thumb, fp.FingerDirection.DiagonalDownLeft, 0.5);
thumbsDownGesture.addDirection(fp.Finger.Thumb, fp.FingerDirection.DiagonalDownRight, 0.5);

This defines that a thumb pointing straight down will result in the highest confidence (1.0) for this gesture. If the thumb is angled diagonally down left / right, we still accept it, but with a lower confidence (0.5).

All other fingers are expected to be fully curled. For this gesture it doesn't matter in which direction the curled fingers are pointing, therefore only the curl description is added.

// do this for all other fingers
thumbsDownGesture.addCurl(fp.Finger.Index, fp.FingerCurl.FullCurl, 1.0);
thumbsDownGesture.addCurl(fp.Finger.Middle, fp.FingerCurl.FullCurl, 1.0);
thumbsDownGesture.addCurl(fp.Finger.Ring, fp.FingerCurl.FullCurl, 1.0);
thumbsDownGesture.addCurl(fp.Finger.Pinky, fp.FingerCurl.FullCurl, 1.0);
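To make the new gesture available, pass it to the GestureEstimator together with any other gestures you want to detect:

const GE = new fp.GestureEstimator([
    fp.Gestures.VictoryGesture,
    fp.Gestures.ThumbsUpGesture,
    thumbsDownGesture
]);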

The meaning of confidence

"Confidence" is a number between 0 and 10 which describes how accurate a given combination of finger curl / positions matches a predefined gesture. You should design your gestures so a perfect match will result in a confidence of "10".

Tips to improve detection

  • The most stable detection is achieved when you can require confidences of 8 or higher.
  • Many poses do not require fingers pointing in a specific direction but are defined by curls only. In these cases, simply do not add direction constraints to your pose (see the sketch after this list).
  • You can also have individual fingers reduce the confidence, which means "this finger should absolutely not appear in this way".
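As an example for the curls-only tip above, here is a sketch of a hypothetical "fist" gesture (not included in the library, used purely for illustration) that adds no direction constraints at all:

const fistGesture = new fp.GestureDescription('fist');

// all four fingers fully curled; the direction they point in is irrelevant
for (const finger of [fp.Finger.Index, fp.Finger.Middle, fp.Finger.Ring, fp.Finger.Pinky]) {
    fistGesture.addCurl(finger, fp.FingerCurl.FullCurl, 1.0);
}

// accept the thumb either half or fully curled
fistGesture.addCurl(fp.Finger.Thumb, fp.FingerCurl.HalfCurl, 1.0);
fistGesture.addCurl(fp.Finger.Thumb, fp.FingerCurl.FullCurl, 1.0);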

Debugging your gestures

Log the raw pose data returned by GestureEstimator::estimate() to the console to understand the detected curls / directions for each finger. This way you can verify whether your assumed curls / directions match what the estimator actually sees.
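For example, console.table() prints the raw pose data in an easy-to-read layout in the browser console:

const result = GE.estimate(predictions[0].landmarks, 7.5);

// one row per finger: name, estimated curl, estimated direction
console.table(result.poseData);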

Known issues / limitations

  • Currently only one hand at a time is supported. This is a limitation of the underlying handpose model and may or may not change in the future.
  • The handpose model has issues detecting a single stretched-out finger (for example the index finger). It will occasionally not detect a finger going from "curled" to "not curled" or vice versa.

Credits

The hand gesture recognition module is based on the amazing work by Prasad Pai. This module is more or less a straight JavaScript port of his FingerPoseEstimate Python module.
