googlesamples / Arcore Depth Lab

Licence: other
ARCore Depth Lab is a set of Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. (UIST 2020)

Projects that are alternatives of or similar to Arcore Depth Lab

Webxr Polyfill
A polyfill and example code for building augmented reality (AR) and virtual reality (VR) applications using WebXR.
Stars: ✭ 227 (-44.23%)
Mutual labels:  ar, arcore
Awesome Arcore
A curated list of awesome ARCore projects and resources. Feel free to contribute!
Stars: ✭ 106 (-73.96%)
Mutual labels:  mobile, arcore
Arcore
ARCore Course
Stars: ✭ 148 (-63.64%)
Mutual labels:  ar, arcore
Norman Ar
Decorate your world with AR animations.
Stars: ✭ 122 (-70.02%)
Mutual labels:  ar, arcore
stardust-SDK
Stardust SDK and sample app for Unity
Stars: ✭ 23 (-94.35%)
Mutual labels:  ar, arcore
Ar Drawing Java
A simple AR drawing experiment built in Java using ARCore.
Stars: ✭ 387 (-4.91%)
Mutual labels:  ar, arcore
Processing Android
Processing mode and core library to create Android apps with Processing
Stars: ✭ 643 (+57.99%)
Mutual labels:  mobile, ar
Arkit Cardboard Vr
ARKit + GVR to make VR and mixed-reality 6DoF AR for iPhone
Stars: ✭ 132 (-67.57%)
Mutual labels:  mobile, ar
google-ar-asset-converter
Sceneform SDK command to generate SFB files
Stars: ✭ 83 (-79.61%)
Mutual labels:  ar, arcore
Arcoreinsideouttrackinggearvr
Inside Out Positional Tracking (6DoF) for GearVR/Cardboard/Daydream using ARCore v1.6.0
Stars: ✭ 150 (-63.14%)
Mutual labels:  mobile, arcore
Jeelizar
JavaScript object detection lightweight library for augmented reality (WebXR demos included). It uses convolutional neural networks running on the GPU with WebGL.
Stars: ✭ 296 (-27.27%)
Mutual labels:  ar, arcore
sceneform-android
Sceneform Maintained is an ARCore Android SDK with Google Filament as the 3D engine. This is the continuation of the archived Sceneform.
Stars: ✭ 303 (-25.55%)
Mutual labels:  ar, arcore
Pokedex Ar
🦄 Android Pokedex-AR using ARCore, Sceneform, Hilt, Coroutines, Flow, Jetpack (Room, ViewModel, LiveData) based on MVVM architecture.
Stars: ✭ 347 (-14.74%)
Mutual labels:  ar, arcore
Arcore Location
Allows items to be placed within the AR world with real-world GPS coordinates using ARCore.
Stars: ✭ 399 (-1.97%)
Mutual labels:  arcore
Fluttergrocery Shoppingappui
🍔😋 Grocery Shopping App template UI kit in Flutter
Stars: ✭ 388 (-4.67%)
Mutual labels:  mobile
Nimbus Eth1
Nimbus: an Ethereum 1.0 and 2.0 Client for Resource-Restricted Devices
Stars: ✭ 386 (-5.16%)
Mutual labels:  mobile
Reactjs101
Learn ReactJS from Scratch (ReactJS 101) is a Chinese-language introductory book designed to make React approachable for beginners, progressing step by step through the ReactJS ecosystem (Flux, Redux, React Router, ImmutableJS, React Native, Relay/GraphQL, etc.).
Stars: ✭ 4,004 (+883.78%)
Mutual labels:  mobile
Framework7 Template Vue Webpack
Deprecated! Framework7 Vue Webpack starter app template with hot-reload & css extraction
Stars: ✭ 399 (-1.97%)
Mutual labels:  mobile
Cordova Plugman
Apache Cordova Plugman
Stars: ✭ 379 (-6.88%)
Mutual labels:  mobile
Mosdepth
fast BAM/CRAM depth calculation for WGS, exome, or targeted sequencing
Stars: ✭ 376 (-7.62%)
Mutual labels:  depth

ARCore Depth Lab - Depth API Samples for Unity

Copyright 2020 Google LLC. All rights reserved.

Depth Lab is a set of ARCore Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering. Some of these features have been used in this Depth API overview video.

ARCore Depth API is enabled on a subset of ARCore-certified Android devices. iOS devices (iPhone, iPad) are not supported. Find the list of devices with Depth API support (marked with Supports Depth API) here: https://developers.google.com/ar/discover/supported-devices. See the ARCore developer documentation for more information.

Download the pre-built ARCore Depth Lab app from the Google Play Store today.

Sample features

The sample scenes demonstrate three different ways to access depth (a minimal lookup sketch follows the list):

  1. Localized depth: Sample single depth values at certain texture coordinates (CPU).
  • Character locomotion on uneven terrain
  • Collision checking for AR object placement
  • Laser beam reflections
  • Oriented 3D reticles
  • Rain and snow particle collision
  2. Surface depth: Create a connected mesh representation of the depth data (CPU/GPU).
  • AR shadow receiver
  • Paint splat
  • Physics simulation
  • Surface retexturing
  3. Dense depth: Process depth data at every screen pixel (GPU).
  • AR fog
  • Occlusions
  • Depth-of-field blur
  • Environment relighting
  • False-color depth map
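
As a concrete example of the first, localized category, the sketch below samples a single depth value on the CPU under a touch point. It is a minimal sketch assuming this repo's DepthSource helper (described later in this README); the exact signature of GetVertexInWorldSpaceFromScreenXY may differ.

using UnityEngine;

// Minimal sketch of a localized (CPU) depth lookup: find the physical
// 3D point under a touched pixel. DepthSource and its lookup function
// are this repo's helpers; the exact signature may differ.
public class DepthProbe : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount == 0) return;
        Vector2 touch = Input.GetTouch(0).position;

        // World-space vertex on the physical surface under the touch.
        Vector3 worldPoint = DepthSource.GetVertexInWorldSpaceFromScreenXY(
            (int)touch.x, (int)touch.y);
        Debug.Log("Depth hit at " + worldPoint);
    }
}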

Unity project setup

These samples target Unity 2018.4.24f1 and require ARCore SDK for Unity v1.18.0 or newer. Download and import arcore-unity-sdk-1.18.0.unitypackage or newer into the sample project. Close and reopen the project and reimport all demo shaders to resolve any dependency issues in the Unity editor. This project only builds with the Build Platform Android. Instant Preview is not enabled for Depth API yet. Build the project to an Android device instead of using the Play button in the Unity editor.

In Unity 2019 or newer you may see code errors associated with SpatialTracking, NetworkBehaviour, and scripts in Assets/GoogleARCore/*. In Unity > Window > Package Manager, add the following packages to resolve the issue (a manifest sketch follows the list):

  • Multiplayer HLAPI (com.unity.multiplayer-hlapi)
  • XR Legacy Input Helpers (com.unity.xr.legacyinputhelpers)
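
Alternatively, the same two dependencies can be declared by hand in Packages/manifest.json. The version numbers below are illustrative only; use whichever versions your Unity installation resolves:

{
  "dependencies": {
    "com.unity.multiplayer-hlapi": "1.0.8",
    "com.unity.xr.legacyinputhelpers": "2.1.10"
  }
}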

Building samples

Individual scenes can be built and run by enabling just that scene, e.g. FogEffect to try out the depth-aware fog filter.

We also provide a demo user interface that allows users to seamlessly switch between examples. Please make sure to set the Build Platform to Android and verify that the main DemoCarousel scene is the first enabled scene in the Scenes In Build list under Build Settings. Enable all scenes that are part of the demo user interface.

Assets/ARRealismDemos/DemoCarousel/Scenes/DemoCarousel.unity
Assets/ARRealismDemos/OrientedReticle/Scenes/OrientedReticle.unity
Assets/ARRealismDemos/DepthEffects/Scenes/DepthEffects.unity
Assets/ARRealismDemos/MaterialWrap/Scenes/MaterialWrap.unity
Assets/ARRealismDemos/Splat/Scenes/OrientedSplat.unity
Assets/ARRealismDemos/Collider/Scenes/Collider.unity
Assets/ARRealismDemos/LaserBeam/Scenes/LaserBeam.unity
Assets/ARRealismDemos/AvatarLocomotion/Scenes/AvatarLocomotion.unity
Assets/ARRealismDemos/Relighting/Scenes/PointsRelighting.unity
Assets/ARRealismDemos/DepthEffects/Scenes/FogEffect.unity
Assets/ARRealismDemos/SnowParticles/Scenes/ArCoreSnowParticles.unity
Assets/ARRealismDemos/RainParticles/Scenes/RainParticlesScene.unity
Assets/ARRealismDemos/DepthEffects/Scenes/DepthOfFieldEffect.unity
Assets/ARRealismDemos/Water/Scenes/Water.unity
Assets/ARRealismDemos/CollisionDetection/Scenes/CollisionAwareObjectPlacement.unity
Assets/ARRealismDemos/PointCloud/Scenes/PointCloud.unity
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/ScreenSpaceDepthMesh.unity
Assets/ARRealismDemos/ScreenSpaceDepthMesh/Scenes/StereoPhoto.unity

Upcoming breaking change affecting 32-bit-only apps

The project is set up to use the IL2CPP scripting backend instead of Mono to build an ARM64 app. You may be prompted to locate the Android NDK folder. You can download the NDK by navigating to Unity > Preferences > External Tools > NDK and clicking the Download button.

In August 2020, Google Play Services for AR (ARCore) will remove support for 32-bit-only ARCore-enabled apps running on 64-bit devices. Support for 32-bit apps running on 32-bit devices is unaffected.

If you have published a 32-bit-only (armeabi-v7a) version of your ARCore-enabled app without publishing a corresponding 64-bit (arm64-v8a) version, you must update your app to include 64-bit native libraries before August 2020. 32-bit-only ARCore-enabled apps that are not updated by this time may crash when attempting to start an augmented reality (AR) session.

To learn more about this breaking change, and for instructions on how to update your app, see https://developers.google.com/ar/64bit.

Sample project structure

The main sample assets are placed inside the Assets/ARRealismDemos folder. Each subfolder contains sample features or helper components.

AvatarLocomotion

The AR character in this scene follows user-set waypoints while staying close to the surface of an uneven terrain. This scene uses raycasting and depth lookups on the CPU to calculate a 3D point on the surface of the terrain.
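
A minimal sketch of this idea, assuming the repo's DepthSource helper (its lookup function is named in the LaserBeam section below; the exact signature may differ):

using UnityEngine;

// Sketch: keep a character on uneven terrain by sampling the depth map
// under its screen position each frame. DepthSource is this repo's
// helper class; the lookup signature is assumed.
public class TerrainFollower : MonoBehaviour
{
    public Camera arCamera;

    void Update()
    {
        // Project the avatar into screen space, then look up the
        // physical surface point under that pixel.
        Vector3 screen = arCamera.WorldToScreenPoint(transform.position);
        if (screen.z <= 0f) return;  // avatar is behind the camera

        Vector3 ground = DepthSource.GetVertexInWorldSpaceFromScreenXY(
            (int)screen.x, (int)screen.y);

        // Snap the avatar's height to the terrain surface.
        transform.position = new Vector3(
            transform.position.x, ground.y, transform.position.z);
    }
}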

Collider

This physics simulation playground uses screen-space depth meshes to enable collisions between Unity's rigid-body objects and the physical environment.

When the user presses an on-screen button, a Mesh object is procedurally generated from the latest depth map and used to update the sharedMesh parameter of the MeshCollider object. A randomly selected rigid-body primitive is then thrown into the environment.
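
A hedged sketch of that flow; BuildMeshFromDepth() is a placeholder standing in for the project's actual depth-mesh generation:

using UnityEngine;

// Sketch of the collider update described above: a mesh built from the
// latest depth map is assigned to MeshCollider.sharedMesh, then a
// random rigid-body primitive is thrown into the environment.
public class EnvironmentCollider : MonoBehaviour
{
    public MeshCollider environmentCollider;
    public Rigidbody[] primitives;  // spheres, cubes, capsules, ...

    // Hook this up to the on-screen button.
    public void OnButtonPressed()
    {
        environmentCollider.sharedMesh = BuildMeshFromDepth();

        Rigidbody body = Instantiate(
            primitives[Random.Range(0, primitives.Length)],
            Camera.main.transform.position, Quaternion.identity);
        body.AddForce(Camera.main.transform.forward * 5f, ForceMode.Impulse);
    }

    Mesh BuildMeshFromDepth()
    {
        // Placeholder: see the ScreenSpaceDepthMesh sample for the
        // actual grid generation and depth reprojection.
        return new Mesh();
    }
}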

CollisionDetection

This AR object placement scene uses depth lookups on the CPU to test collisions between the vertices of virtual objects and the physical environment.
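
A minimal sketch of that vertex test; DepthSource.GetDepthFromScreenXY is a hypothetical lookup standing in for the repo's CPU depth access:

using UnityEngine;

// Sketch: a virtual object collides with the environment when any of
// its vertices lies behind the physical surface seen by the camera.
// GetDepthFromScreenXY is hypothetical; it stands for a CPU lookup of
// the metric environment depth (meters) at a pixel.
public static class PlacementCollisionTest
{
    public static bool Collides(MeshFilter meshFilter, Camera arCamera)
    {
        foreach (Vector3 local in meshFilter.sharedMesh.vertices)
        {
            Vector3 world = meshFilter.transform.TransformPoint(local);
            Vector3 screen = arCamera.WorldToScreenPoint(world);
            if (screen.z <= 0f) continue;  // vertex behind the camera

            float environmentDepth = DepthSource.GetDepthFromScreenXY(
                (int)screen.x, (int)screen.y);  // hypothetical helper
            if (screen.z > environmentDepth)
                return true;  // vertex is inside physical geometry
        }
        return false;
    }
}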

Common

This folder contains scripts and prefabs that are shared between the feature samples. For more details, see the Helper Classes section below.

DemoCarousel

This folder contains the main scene, which provides a carousel user interface. This scene allows the user to seamlessly switch between different features. A scene can be selected by directly touching a preview thumbnail or dragging the carousel UI to the desired position.

DepthEffects

This folder contains three dense depth shader processing examples.

The DepthEffects scene contains a fragment-shader effect that can transition from the AR camera view to a false-color depth map. Warm colors indicate closer regions in the depth map, while cold colors indicate regions farther away.
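
For intuition, here is the same mapping sketched on the CPU (the real effect runs in a fragment shader); the maximum range constant is illustrative:

using UnityEngine;

// CPU sketch of the warm-to-cold false-color mapping: normalize depth
// against an assumed maximum range, then blend red (near) to blue (far).
public static class FalseColorDepth
{
    const float MaxDepthMeters = 8f;  // illustrative clipping range

    public static Color DepthToColor(float depthMeters)
    {
        float t = Mathf.Clamp01(depthMeters / MaxDepthMeters);
        return Color.Lerp(Color.red, Color.blue, t);
    }
}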

The DepthOfFieldEffect scene contains a simulated Bokeh fragment-shader effect. This blurs the regions of the AR view that are not at the user-defined focus distance. The focus anchor is set in the physical environment by touching the screen. The focus anchor is a 3D point that is locked to the environment and always in focus.

The FogEffect scene contains a fragment-shader effect that adds a virtual fog layer to the physical environment. Close objects remain more visible than objects farther away. A slider controls the density of the fog.

LaserBeam

This laser reflection scene allows the user to shoot a slowly moving laser beam by touching anywhere on the screen.

This uses (sketched below):

  • The DepthSource.GetVertexInWorldSpaceFromScreenXY(..) function to look up a raycast 3D point.
  • The ComputeNormalMapFromDepthWeightedMeanGradient(..) function to look up the surface normal for a given 2D screen position.
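
A minimal sketch combining the two calls; both names are taken from the description above, but their argument and return types are assumed:

using UnityEngine;

// Sketch of the reflection step: find the hit point and surface normal
// under a touch, then mirror the beam direction about that normal.
// Both helper names come from this README; their signatures are assumed.
public class LaserReflection : MonoBehaviour
{
    Vector3 laserDirection = Vector3.forward;

    void Update()
    {
        if (Input.touchCount == 0) return;
        Vector2 touch = Input.GetTouch(0).position;

        Vector3 hitPoint = DepthSource.GetVertexInWorldSpaceFromScreenXY(
            (int)touch.x, (int)touch.y);
        Vector3 normal = ComputeNormalMapFromDepthWeightedMeanGradient(touch);

        // Mirror the incoming beam about the estimated surface normal.
        laserDirection = Vector3.Reflect(laserDirection.normalized, normal);
        transform.position = hitPoint;
    }

    // Placeholder standing in for the repo's normal estimation.
    Vector3 ComputeNormalMapFromDepthWeightedMeanGradient(Vector2 screenPos)
    {
        return Vector3.up;
    }
}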

MaterialWrap

This experience allows the user to change the material of real-world surfaces through touch, using depth meshes.

OrientedReticle

This sample uses depth hit testing to obtain the 3D position and surface normal of the point hit by a screen-space raycast.
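
A minimal orientation sketch, assuming the same depth lookup as above and a hypothetical normal-estimation helper:

using UnityEngine;

// Sketch: place a reticle on the physical surface under the screen
// center and align its up axis with the estimated surface normal.
public class OrientedReticleSketch : MonoBehaviour
{
    void Update()
    {
        Vector2 center = new Vector2(Screen.width * 0.5f, Screen.height * 0.5f);
        Vector3 position = DepthSource.GetVertexInWorldSpaceFromScreenXY(
            (int)center.x, (int)center.y);
        Vector3 normal = EstimateNormal(center);  // hypothetical helper

        transform.position = position;
        transform.rotation = Quaternion.FromToRotation(Vector3.up, normal);
    }

    Vector3 EstimateNormal(Vector2 screenPos)
    {
        return Vector3.up;  // placeholder for depth-gradient estimation
    }
}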

PointCloud

This sample computes a point cloud on the CPU using the depth array. Press the Update button to compute a point cloud based on the latest depth data.
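
The core of this is standard pinhole back-projection. A sketch, assuming a 16-bit depth array in millimeters and camera intrinsics (fx, fy, cx, cy) such as those exposed by the DepthSource helper; all parameter names here are illustrative:

using UnityEngine;
using System.Collections.Generic;

// Sketch of CPU point-cloud generation: back-project every depth pixel
// through the pinhole intrinsics into a camera-space 3D point.
public static class DepthPointCloud
{
    public static List<Vector3> Unproject(
        short[] depthMm, int width, int height,
        float fx, float fy, float cx, float cy)
    {
        var points = new List<Vector3>();
        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                float z = depthMm[y * width + x] * 0.001f;  // mm -> meters
                if (z <= 0f) continue;  // no depth estimate at this pixel

                // Standard pinhole back-projection into camera space.
                points.Add(new Vector3((x - cx) * z / fx,
                                       (y - cy) * z / fy, z));
            }
        }
        return points;
    }
}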

RainParticles

This sample uses the GPU depth texture to compute collisions between rain particles and the physical environment.

Relighting

This sample uses the GPU depth texture to computationally re-light the physical environment through the AR camera. Areas of the physical environment close to the artificial light sources are lit, while areas farther away are darkened.

ScreenSpaceDepthMesh

This sample uses depth meshes. A template mesh containing a regular grid of triangles is created once on the CPU. The GPU shader then displaces each vertex of the grid based on the reprojection of the depth values provided by the GPU depth texture. Press Freeze to take a snapshot of the mesh and press Unfreeze to revert to the live updating mesh.
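
A sketch of the one-time template grid (the depth-based vertex displacement happens later in the GPU shader and is not shown); the grid resolution is illustrative:

using UnityEngine;

// Sketch of the template mesh: a regular grid of triangles in
// normalized [0, 1] x [0, 1] coordinates, two triangles per cell.
public static class GridMesh
{
    public static Mesh Create(int cols, int rows)
    {
        var vertices = new Vector3[(cols + 1) * (rows + 1)];
        for (int y = 0; y <= rows; y++)
            for (int x = 0; x <= cols; x++)
                vertices[y * (cols + 1) + x] =
                    new Vector3((float)x / cols, (float)y / rows, 0f);

        var triangles = new int[cols * rows * 6];
        int t = 0;
        for (int y = 0; y < rows; y++)
        {
            for (int x = 0; x < cols; x++)
            {
                int i = y * (cols + 1) + x;  // lower-left corner of the cell
                triangles[t++] = i;
                triangles[t++] = i + cols + 1;
                triangles[t++] = i + 1;
                triangles[t++] = i + 1;
                triangles[t++] = i + cols + 1;
                triangles[t++] = i + cols + 2;
            }
        }

        var mesh = new Mesh { vertices = vertices, triangles = triangles };
        mesh.RecalculateBounds();
        return mesh;
    }
}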

StereoPhoto

This sample uses depth meshes and ScreenSpaceDepthMesh. After freezing the mesh, we cache the camera's current projection and view matrices, move the camera along a small circle, and projection-map the cached camera image onto the depth mesh. Press Capture to create the animated 3D photo and press Preview to return to camera preview mode.
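
A sketch of the circular camera motion, assuming the pose cached at freeze time is available as a Transform; the radius and speed values are illustrative:

using UnityEngine;

// Sketch: orbit the rendering camera on a small circle around the pose
// cached when the mesh was frozen, producing the parallax animation.
public class StereoPhotoOrbit : MonoBehaviour
{
    public Transform cachedPose;  // camera pose captured at freeze time
    public float radius = 0.02f;  // meters; illustrative
    public float speed = 2f;      // radians per second; illustrative

    void Update()
    {
        float angle = Time.time * speed;
        // Circle in the cached camera's right/up plane.
        Vector3 offset = cachedPose.right * (Mathf.Cos(angle) * radius)
                       + cachedPose.up * (Mathf.Sin(angle) * radius);
        transform.SetPositionAndRotation(
            cachedPose.position + offset, cachedPose.rotation);
    }
}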

SnowParticles

This sample uses the GPU depth texture to compute collisions between snow particles and the physical environment, as well as the orientation of each snowflake.

Splat

This sample uses the Oriented Reticle and the depth mesh to place a surface-aligned texture decal within the physical environment.

Water

This sample uses a modified GPU occlusion shader to create a flooding effect with artificial water in the physical environment.

Developing your own ARCore Depth-enabled Unity experiences

Please make sure that the Unity scene is properly set up to run ARCore, and provide depth data by attaching the ARCoreSession component to the appropriate session configuration. See the example provided in the ARCore SDK for Unity package for how to correctly set up an ARCore Depth-enabled Unity scene.

Please follow the steps below to utilize the depth utilities provided in this ARCore Depth Lab sample package:

  1. Attach at least one DepthTarget component to the scene. This makes sure that the DepthSource class provides depth data to the scene.

  2. A DepthSource component can be explicitly placed within the scene. Otherwise an instance will be created automatically. A few parameters can be customized in the editor when DepthSource is explicitly placed in the scene.

  3. When the DepthTarget script is attached to a GameObject with a MeshRenderer component, the depth texture is directly set on the material of that MeshRenderer. A minimal setup sketch follows this list.
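
A minimal setup sketch for these steps, using only the classes named above:

using UnityEngine;

// Minimal setup sketch: adding a DepthTarget subscribes this GameObject
// to depth updates; DepthSource is created automatically if none was
// explicitly placed in the scene. If this GameObject has a MeshRenderer,
// the depth texture is assigned to its material (step 3).
public class DepthSetup : MonoBehaviour
{
    void Start()
    {
        // Step 1: at least one DepthTarget must exist in the scene.
        gameObject.AddComponent<DepthTarget>();

        // Steps 2-3 require no code here: DepthSource spawns itself and
        // feeds depth data to every DepthTarget.
    }
}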

Helper classes

DepthSource

A singleton instance of this class contains references to the CPU array and GPU texture of the depth map, the camera intrinsics, and many other depth-lookup and coordinate-transformation utilities. This class acts as a high-level wrapper for the MotionStereoDepthDataSource class.

DepthTarget

Each GameObject containing a DepthTarget becomes a subscriber to the GPU depth data. DepthSource will automatically update the depth data for each DepthTarget. At least one instance of DepthTarget has to be present in the scene in order for DepthSource to provide depth data.

MotionStereoDepthDataSource

This class contains low-level operations and access to the depth data. It should only be used by advanced developers.

User privacy requirements

You must prominently disclose the use of Google Play Services for AR (ARCore) and how it collects and processes data in your application. This information must be easily accessible to end users. You can do this by adding the following text on your main menu or notice screen: "This application runs on Google Play Services for AR (ARCore), which is provided by Google LLC and governed by the Google Privacy Policy".

Related Publication

Please refer to https://augmentedperception.github.io/depthlab/ for our paper published in ACM UIST 2020: "DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality".

References

If you use ARCore Depth Lab in your research, please reference it as:

@inproceedings{Du2020DepthLab,
  title = {{DepthLab: Real-time 3D Interaction with Depth Maps for Mobile Augmented Reality}},
  author = {Du, Ruofei and Turner, Eric and Dzitsiuk, Maksym and Prasso, Luca and Duarte, Ivo and Dourgarian, Jason and Afonso, Joao and Pascoal, Jose and Gladstone, Josh and Cruces, Nuno and Izadi, Shahram and Kowdle, Adarsh and Tsotsos, Konstantine and Kim, David},
  booktitle = {Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology},
  year = {2020},
  publisher = {ACM},
  numpages = {15},
  series = {UIST},
}

or

Ruofei Du, Eric Turner, Maksym Dzitsiuk, Luca Prasso, Ivo Duarte, Jason Dourgarian, Joao Afonso, Jose Pascoal, Josh Gladstone, Nuno Cruces, Shahram Izadi, Adarsh Kowdle, Konstantine Tsotsos, and David Kim. 2020. DepthLab: Real-Time 3D Interaction With Depth Maps for Mobile Augmented Reality. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (UIST). ACM, 15 pages.

Additional information

You may use this software under the Apache 2.0 License.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].