
Real-time Facial Performance Capture with iPhone X

When Apple announced the iPhone X and Animoji, the first thought we had was: can we use this to animate arbitrary 3D characters? That is, not just the Animoji designed by Apple.

It turns out that not only is this possible, it's pretty easy: the ARKit face APIs are powerful enough to produce useful animation, and we can capture it in real time.

Face Demo

We're going to cover the following:

  • Apple's blend shapes
  • Building the blend shapes on our model
  • Our iOS app to transmit the blend shapes over UDP
  • The Unity extension to receive the data from our iOS app
  • Source code for both projects

The result is the ability to stream blend shape parameters live from your iPhone X into Unity to control your animation rig.

Hasn't this been done?

There was a recent article showing the 3D output from the iPhone X front camera module: the raw vertex data captured by the phone and brought into Houdini (3D animation software). What we wanted, however, was to get the facial motion data itself and re-target it to an arbitrary 3D model.

So, for example, you would be able to perform your in-game character's lip sync and facial expressions just by holding your iPhone X up to your face. Or maybe you could animate a character for a TV series.

Current automated facial animation techniques analyse voice data for phonemes (e.g., ee, oo, ah) and map those sounds to 3D model blend shapes. We figured the iPhone X could produce more dynamic facial expressions, including brow movement, blinking, nose flaring, and eyelid movement.

Retargeting Facial Motion to a Mesh Using iPhone X

It turns out that ARKit not only gives you the raw vertex data computed from your face, it gives you a set of blend shape values. Blend shape values are just numbers between 0.0 and 1.0 that tell you how much ARKit thinks a certain muscle in your face is moving.

So, for example, the Jaw Open blend shape would be 0.0 when your jaw is closed, and 1.0 when your jaw is open. Any value in-between would indicate a partially open jaw.
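
For reference, reading one of those coefficients looks roughly like this (a minimal sketch, not the app's actual code; the class name is made up and it assumes a session already running ARFaceTrackingConfiguration):

```swift
import ARKit

// Minimal sketch: assumes an ARSession already running ARFaceTrackingConfiguration.
class FaceTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Each coefficient is 0.0...1.0: how strongly ARKit thinks
        // that part of the face is currently activated.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        print("jawOpen:", jawOpen)   // 0.0 = closed, 1.0 = fully open
    }
}
```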

This is really powerful because, if you are a 3D artist, not only can you map Apple's blend shapes to your 3D character, you can design an animation rig around the various values. For example, maybe you have a cartoon fox with pointy ears: when you detect a frown, you could automatically turn the ears downwards (in fact, Apple does this with their own Animoji).
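
As a rough illustration of that kind of rig-level mapping (the ear node is hypothetical, and a real rig would smooth the values over time):

```swift
import ARKit
import SceneKit

// Hypothetical example: drive a cartoon ear node from the brow blend shapes.
func updateEar(from face: ARFaceAnchor, earNode: SCNNode) {
    let browLeft  = face.blendShapes[.browDownLeft]?.floatValue ?? 0
    let browRight = face.blendShapes[.browDownRight]?.floatValue ?? 0
    let frown = (browLeft + browRight) / 2        // 0 = neutral, 1 = full frown

    // Tilt the ear down by up to ~45 degrees as the frown strengthens.
    earNode.eulerAngles.x = -Float.pi / 4 * frown
}
```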

Making the Morph Targets

The most labour-intensive part is mimicking Apple's morph targets on your custom 3D mesh.

There are a lot of blend shapes.

Apple ARKit Blend Shapes

In total there are 51 blend shapes including things like eyeBlinkLeft, eyeWideRight, mouthFunnel, mouthLowerDownLeft and so on. Most of these are symmetrical in that they have left and right equivalents.
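
If you want to see the full set on a live face, you can simply dump the dictionary ARKit hands you (a throwaway helper, not part of the project):

```swift
import ARKit

// Log every blend shape ARKit reports for a face anchor, sorted by name
// so the left/right pairs sit next to each other.
func dumpBlendShapes(for face: ARFaceAnchor) {
    for (location, value) in face.blendShapes.sorted(by: { $0.key.rawValue < $1.key.rawValue }) {
        print(location.rawValue, value.floatValue)
    }
}
```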

Here are the blend shapes we made for our sample model. These are fairly basic and were made quickly so we could test the validity of the idea. Your own custom models could have much nicer, more intricate blend shapes.

Morph Targets

How does it work?

The demo consists of two parts: the iOS app and the Unity extension host.

iOS App

You can get it here: github.com/johnjcsmith/iPhoneMoCap

The iOS app streams the blend shapes Apple provides in ARFaceAnchor.blendShapes to the Unity host through a UDP socket, essentially emitting a stream of messages, each with 50 blend shapes in the format 'blend-shape-name:blend-shape-value'.
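
The app's actual networking lives in the repo above (note the pod install step later on). As a sketch of the idea, here is what sending a frame's worth of values over UDP could look like using Apple's Network framework (iOS 12+); the host, port, and newline separator are assumptions, not the project's real wire format:

```swift
import ARKit
import Network

// Placeholder endpoint: the real app discovers the Unity host on the network.
let connection = NWConnection(host: "192.168.1.10", port: 4545, using: .udp)
connection.start(queue: .global())

func send(blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
    // One datagram per frame: "name:value" pairs, one per line.
    let message = blendShapes
        .map { "\($0.key.rawValue):\($0.value.floatValue)" }
        .joined(separator: "\n")
    connection.send(content: message.data(using: .utf8),
                    completion: .contentProcessed { _ in })
}
```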

Live Demo

There are lots of performance improvements to be made here but it works for the purpose of a demo.

Unity Extension Host

You can get it here: github.com/johnjcsmith/iPhoneMoCapUnity

Inside the Unity host we have an extension which opens a UDP socket to listen for the iPhone's messages. When it receives a message, it applies each blend shape value to the corresponding blend shape on the rig.

The Unity extension targets a SkinnedMeshRenderer with the name blendShapeTarget, which is the mesh whose blend shapes are driven by the incoming values.
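
The extension itself is C# running inside Unity, but the receiving end of the protocol is simple enough to sketch here in Swift for consistency with the rest of this post (the port and line separator are the same assumptions as above):

```swift
import Network

// Illustration only: the real receiver is a C# extension inside Unity.
let listener = try! NWListener(using: .udp, on: 4545)

func receiveLoop(on connection: NWConnection) {
    connection.receiveMessage { data, _, _, _ in
        if let data = data, let text = String(data: data, encoding: .utf8) {
            for line in text.split(separator: "\n") {
                let parts = line.split(separator: ":")
                guard parts.count == 2, let value = Float(String(parts[1])) else { continue }
                // In Unity this is where you'd look up the blend shape index on
                // the blendShapeTarget mesh and set its weight on the renderer.
                print(String(parts[0]), value)
            }
        }
        receiveLoop(on: connection)   // keep listening for the next frame
    }
}

listener.newConnectionHandler = { connection in
    connection.start(queue: .main)
    receiveLoop(on: connection)
}
listener.start(queue: .main)
```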

How to run the project

  • Clone and open the Unity project (github.com/johnjcsmith/iPhoneMoCapUnity).
  • Run the Unity project's scene
  • In the menu bar select iPhoneMoCap -> MeshPreview
  • Enable Mesh preview
  • Grab the iOS app project (github.com/johnjcsmith/iPhoneMoCap)
  • Make sure your iPhone X is connected to the same Wi-Fi network, then build and run the application (don't forget to run pod install first).
  • The app should discover the Unity host and begin streaming the motion data.