MahmudulAlam / Fingertip-Mixed-Reality

License: MIT
Affine transformation of a virtual 3D object using a finger gesture-based interactive system in a virtual environment.

Programming Languages

  • Python
  • C#
  • ShaderLab

Projects that are alternatives of or similar to Fingertip-Mixed-Reality

WebVRExamples
yonet.github.io/webvrexamples/examples/cubes.html
Stars: ✭ 19 (-9.52%)
Mutual labels:  virtual-reality, mixed-reality
Realityui
A Swift Package for creating familiar UI Elements and animations in a RealityKit rendered Augmented Reality or Virtual Reality scene.
Stars: ✭ 275 (+1209.52%)
Mutual labels:  virtual-reality, mixed-reality
MixedRealityResources
Mixed Reality related resources
Stars: ✭ 190 (+804.76%)
Mutual labels:  virtual-reality, mixed-reality
awesome-3d
Awesome list of 3D resources. AR/MR/VR is the future, and 3D model is the basics of all of them.
Stars: ✭ 42 (+100%)
Mutual labels:  virtual-reality, mixed-reality
Apertusvr
Virtual Reality Software Library
Stars: ✭ 112 (+433.33%)
Mutual labels:  virtual-reality, mixed-reality
Ar Vrcourse
Introductory tutorials for VR, AR, and MR development (in Chinese)
Stars: ✭ 298 (+1319.05%)
Mutual labels:  virtual-reality, mixed-reality
MoonMotion
Moon Motion Toolkit - Free and open source toolkit for VR locomotion
Stars: ✭ 38 (+80.95%)
Mutual labels:  virtual-reality, mixed-reality
Stereokit
An easy-to-use mixed reality library for building HoloLens and VR applications with C# and OpenXR!
Stars: ✭ 195 (+828.57%)
Mutual labels:  virtual-reality, mixed-reality
Realitymixer
Mixed Reality app for iOS
Stars: ✭ 520 (+2376.19%)
Mutual labels:  virtual-reality, mixed-reality
Mapssdk Unity
This repository contains samples, documentation, and supporting scripts for Maps SDK, a Microsoft Garage project.
Stars: ✭ 307 (+1361.9%)
Mutual labels:  virtual-reality, mixed-reality
Holoviveobserver
Shared Reality: Observe a VR session from the same room using a HoloLens!
Stars: ✭ 126 (+500%)
Mutual labels:  virtual-reality, mixed-reality
UnityPlugin
Ultraleap SDK for Unity.
Stars: ✭ 447 (+2028.57%)
Mutual labels:  virtual-reality, mixed-reality
Hand-Detection-Finger-Counting
Detect Hand and count number of fingers using Convex Hull algorithm in OpenCV lib in Python
Stars: ✭ 21 (+0%)
Mutual labels:  hand-detection
SpatialAlignment
Helpful components for aligning and keeping virtual objects aligned with the physical world.
Stars: ✭ 29 (+38.1%)
Mutual labels:  mixed-reality
zed-unreal-examples
Stereolabs ZED - UE4 Examples
Stars: ✭ 75 (+257.14%)
Mutual labels:  mixed-reality
HadesVR
The "DIY" SteamVR compatible VR setup made for tinkerers.
Stars: ✭ 88 (+319.05%)
Mutual labels:  virtual-reality
cybernetic-landscapes
No description or website provided.
Stars: ✭ 27 (+28.57%)
Mutual labels:  virtual-reality
PassthroughMeasure
Use your Oculus Quest with Passthrough as a tape measure
Stars: ✭ 43 (+104.76%)
Mutual labels:  mixed-reality
hobo vr
SteamVR driver prototyping tool
Stars: ✭ 44 (+109.52%)
Mutual labels:  virtual-reality
VisualProfiler-Unity
The Visual Profiler provides a drop in solution for viewing your mixed reality Unity application's frame rate, scene complexity, and memory usage.
Stars: ✭ 120 (+471.43%)
Mutual labels:  mixed-reality

Affine Transformation of Virtual Object

A convolutional neural network (CNN) based thumb and index fingertip detection system is presented here for seamless interaction with a virtual 3D object in a virtual environment. First, a two-stage CNN detects the hand and the fingertips; then, using the fingertip positions, the scale, rotation, translation, and, in general, the affine transformation of the virtual object is performed.
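To illustrate the second stage, here is a minimal sketch of how scale, rotation, and translation can be recovered from 2D thumb and index fingertip positions. The function name and the reference-pose convention are assumptions for illustration, not the project's actual code.

```python
import numpy as np

def affine_from_fingertips(thumb, index, ref_thumb, ref_index):
    """Estimate scale, rotation, and translation from current vs. reference
    2D fingertip positions (hypothetical helper, not the original codebase)."""
    v = np.asarray(index, float) - np.asarray(thumb, float)        # current thumb->index vector
    v0 = np.asarray(ref_index, float) - np.asarray(ref_thumb, float)  # reference vector
    scale = np.linalg.norm(v) / np.linalg.norm(v0)                 # relative finger spread
    angle = np.arctan2(v[1], v[0]) - np.arctan2(v0[1], v0[0])      # rotation between the two vectors
    mid = (np.asarray(thumb, float) + np.asarray(index, float)) / 2.0
    mid0 = (np.asarray(ref_thumb, float) + np.asarray(ref_index, float)) / 2.0
    translation = mid - mid0                                       # midpoint displacement
    return scale, angle, translation
```

Doubling the thumb-to-index distance, for example, yields a scale of 2.0 with no rotation, which is then applied to the virtual object on the client side.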

Update

This is version 2.0, which includes a more generalized affine transformation of virtual objects in the virtual environment, with additional experimentation and analysis. Previous versions included only the geometric transformation of a virtual 3D object with respect to a finger gesture. To get the previous version, visit here.


Paper

The paper on the affine transformation of the virtual 3D object was published in Virtual Reality & Intelligent Hardware (Elsevier) in 2020. For more details, please read the paper. The paper on the geometric transformation of the virtual object (v1.0) has also been published; for more details, please read that paper. If you use the code or data from the project, please cite the following papers:

Paper

Affine transformation of virtual 3D object using 2D localization of fingertips 🔗

@article{alam2020affine,
  title={Affine transformation of virtual 3D object using 2D localization of fingertips},
  author={Alam, Mohammad Mahmudul and Rahman, SM Mahbubur},
  journal={Virtual Reality \& Intelligent Hardware},
  volume={2},
  number={6},
  pages={534--555},
  year={2020},
  publisher={Elsevier}
}

Paper

Detection and Tracking of Fingertips for Geometric Transformation of Objects in Virtual Environment 🔗

@inproceedings{alam2019detection,
  title={Detection and Tracking of Fingertips for Geometric Transformation of Objects in Virtual Environment},
  author={Alam, Mohammad Mahmudul and Rahman, SM Mahbubur},
  booktitle={2019 IEEE/ACS 16th International Conference on Computer Systems and Applications (AICCSA)},
  address = {Abu Dhabi, United Arab Emirates},
  pages={1--8},
  year={2019},
  organization={IEEE}
}

System Overview

Here is the real-time demo of the scale, rotation, translation, and overall affine transformation of the virtual object using finger interaction.

Dataset

To train the hand and fingertip detection model, two different datasets are used. One is a self-made, publicly released dataset called the TI1K Dataset, which contains 1000 images with annotations of hand and fingertip positions; the other is the SCUT-Ego-Gesture Dataset.

Requirements

  • TensorFlow-GPU==1.15.0
  • OpenCV==4.2.0
  • Cython==0.29.2
  • ImgAug==0.2.6
  • Weights: download the trained weights files for both the hand and fingertip detection models and put the weights folder in the working directory.
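The pinned versions above can be installed in one step; this is a hypothetical setup command, and the PyPI package names (notably opencv-python for OpenCV) are assumptions. Note that TensorFlow-GPU 1.15.0 requires Python 3.7 or earlier.

```shell
# Assumed package names for the pinned versions listed above
pip install tensorflow-gpu==1.15.0 opencv-python==4.2.0.32 Cython==0.29.2 imgaug==0.2.6
```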

Downloads

Experimental Setup

The experimental setup has a server side and a client side. Fingertip detection and tracking, and all other machine-learning components, are programmed on the server side in Python. On the client side, the virtual environment is created in Unity along with the Vuforia software development kit (SDK). To locate and track a virtual object using the webcam, Vuforia needs marker assistance. For that purpose, a marker is designed to work as an image target. The marker/ folder contains the PDF of the designed marker. To use the system, print a copy of the marker.
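The server/client split described above can be sketched as a small TCP loop that receives webcam frames from the Unity client and hands them to the detector. The port, the 4-byte length-prefix framing, and the handler name are assumptions for illustration, not the project's actual protocol.

```python
import socket
import struct

def serve(host="127.0.0.1", port=5000, handle_frame=lambda data: None):
    """Receive length-prefixed image frames from a single client (a sketch,
    assuming a 4-byte big-endian size header before each JPEG payload)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()              # wait for the Unity client to connect
        with conn:
            while True:
                header = conn.recv(4)       # read the assumed 4-byte length prefix
                if len(header) < 4:
                    break                   # client closed the connection
                size = struct.unpack(">I", header)[0]
                data = b""
                while len(data) < size:     # read the full frame payload
                    chunk = conn.recv(size - len(data))
                    if not chunk:
                        return
                    data += chunk
                handle_frame(data)          # e.g. decode and run fingertip detection
```

In the real system the handler would decode the JPEG bytes (e.g. with OpenCV) and send the estimated transformation parameters back to Unity.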

How to Use

First, to run the server side, run 'server_track.py'. It will wait until the client side (Unity) starts sending images to the server.

directory > python server_track.py

Open the 'Unity Affine Transformation' environment in Unity and hit the play button. Make sure a webcam is connected.

Bring your hand in front of the webcam and interact with the virtual object using your finger gesture.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].