
zalo / WarpedCAVE

Licence: other
Cheap arbitrary-surface projection mapping for 3D rendering.

Programming Languages

C#
GLSL

Projects that are alternatives to, or similar to, WarpedCAVE

sketch-mapper
A reimagining of SurfaceMapperGUI for Processing
Stars: ✭ 33 (-15.38%)
Mutual labels:  projection-mapping
badMapper
Simple Projection Mapping tool inspired by http://www.madmapper.com/
Stars: ✭ 18 (-53.85%)
Mutual labels:  projection-mapping
VR-reversal
A player for converting 3D video to 2D, with optional saving of head-tracking data and rendering out of 2D copies.
Stars: ✭ 128 (+228.21%)
Mutual labels:  projection-mapping
3d-earth
🌏🌎🌍 3D Earth with Mapbox GL, D3.js and Three.js
Stars: ✭ 68 (+74.36%)
Mutual labels:  projection-mapping
SurfaceMapperGUI
A simple projection mapping interface using Processing's SurfaceMapper and ControlP5 libraries.
Stars: ✭ 63 (+61.54%)
Mutual labels:  projection-mapping
skymapper
Mapping astronomical survey data on the sky, handsomely
Stars: ✭ 35 (-10.26%)
Mutual labels:  projection-mapping

WarpedCAVE

http://paulbourke.net/dome/mirrorbox/

An experiment extending Paul Bourke's mirrored-sphere projection technique to 3D in Unity, for building low-cost, single-projector, single-computer immersive CAVEs on arbitrary surfaces. After some initial pre-computation, this technique renders perspective-corrected 3D imagery on arbitrary surfaces using a very cheap vertex shader and an on-axis rendering camera (enabling a variety of graphical effects that traditional CAVEs are not normally capable of).
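
At a high level, the rig can be thought of as an eye camera that renders the tracked viewer's perspective into a render texture, a pre-distorted plane mesh that displays that texture through its warped UVs, and an on-axis output camera that sends the plane to the projector. The sketch below is only an illustration of that arrangement; the component and field names are assumptions, not the repository's actual code.

```csharp
using UnityEngine;

// A minimal sketch of the camera arrangement described above. All component and
// field names here are illustrative assumptions, not the repository's actual code.
public class WarpedCaveRig : MonoBehaviour {
    public Camera eyeCamera;             // tracks the viewer's head and renders the 3D scene
    public Camera projectorCamera;       // on-axis camera whose output is sent to the projector
    public MeshRenderer distortionPlane; // pre-distorted mesh that resamples the eye's view

    RenderTexture eyeView;

    void Start() {
        // The eye camera draws the scene into an off-screen texture...
        eyeView = new RenderTexture(2048, 2048, 24);
        eyeCamera.targetTexture = eyeView;

        // ...and the distortion plane displays that texture through its warped UVs,
        // so the projector camera only ever renders one simple mesh, on-axis.
        distortionPlane.material.mainTexture = eyeView;
        projectorCamera.cullingMask = 1 << distortionPlane.gameObject.layer;
    }
}
```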

Includes a guided calibration routine for assembling the projection geometry (by probing various surfaces with the bottom tip of a Vive controller to ascertain their shape). The automated routine calculates the projector's projection matrix, three flat walls, and the position and radius of the mirrored-sphere reflector (though each step can easily be extended to arbitrary geometry).
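
For instance, the sphere's center and radius can be recovered in closed form from four probed points on its surface. The helper below is a hypothetical sketch of that step only; the repository's guided routine may use more samples or a different fit.

```csharp
using UnityEngine;

// Hypothetical helper, not taken from the repository: recovers the mirrored sphere's
// center and radius from four probed points on its surface.
public static class SphereFromProbes {
    public static Vector3 Fit(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3, out float radius) {
        // Subtracting |p_i - c|^2 = r^2 from |p_0 - c|^2 = r^2 leaves a linear
        // equation in the unknown center c:  (p_i - p0) . c = (|p_i|^2 - |p0|^2) / 2
        Vector3 a1 = p1 - p0, a2 = p2 - p0, a3 = p3 - p0;
        float b1 = (p1.sqrMagnitude - p0.sqrMagnitude) * 0.5f;
        float b2 = (p2.sqrMagnitude - p0.sqrMagnitude) * 0.5f;
        float b3 = (p3.sqrMagnitude - p0.sqrMagnitude) * 0.5f;

        // Solve the 3x3 system with Cramer's rule written in terms of cross products.
        float det = Vector3.Dot(a1, Vector3.Cross(a2, a3)); // near zero if the probes are coplanar
        Vector3 center = (b1 * Vector3.Cross(a2, a3)
                        + b2 * Vector3.Cross(a3, a1)
                        + b3 * Vector3.Cross(a1, a2)) / det;

        radius = (p0 - center).magnitude;
        return center;
    }
}
```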

How the pre-distortion works:

  1. For each vertex in the distortion plane, trace a ray from the projector to the projection surface (simulating the reflection off the mirrored sphere in between).

  2. Record the hit position and bake it into the vertex color of the distortion plane mesh.

  3. In the distortion mesh, reproject that world position back into the viewport space of the eye's render-texture camera.

  4. Set the UV of that vertex in the distortion plane to the viewport-space coordinate from step 3.
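
A rough CPU-side C# sketch of those four steps follows. In the project itself the per-frame reprojection (steps 3 and 4) is described as running in a cheap vertex shader; the class, field, and method names here are illustrative assumptions rather than the repository's code.

```csharp
using UnityEngine;

// Rough CPU-side sketch of the four pre-distortion steps above (illustrative only).
public class DistortionBaker : MonoBehaviour {
    public Transform projector;          // projector position/orientation from calibration
    public Vector3 sphereCenter;         // mirrored sphere from calibration
    public float sphereRadius;
    public Plane[] walls;                // the three calibrated wall planes
    public MeshFilter distortionPlane;   // mesh whose vertices tile the projector's image
    public Camera eyeCamera;             // renders the scene into the eye render texture

    Vector3[] bakedHits;                 // step 2: one world-space hit per vertex

    // Steps 1-2: run once after calibration.
    public void Bake() {
        Mesh mesh = distortionPlane.mesh;
        Vector3[] verts = mesh.vertices;
        bakedHits = new Vector3[verts.Length];
        Color[] colors = new Color[verts.Length];

        for (int i = 0; i < verts.Length; i++) {
            Vector3 worldVert = distortionPlane.transform.TransformPoint(verts[i]);
            Ray ray = new Ray(projector.position, (worldVert - projector.position).normalized);

            // Reflect the projector ray off the mirrored sphere, then intersect the walls.
            if (RaySphere(ray, sphereCenter, sphereRadius, out Vector3 mirrorHit)) {
                Vector3 normal = (mirrorHit - sphereCenter).normalized;
                ray = new Ray(mirrorHit, Vector3.Reflect(ray.direction, normal));
            }
            bakedHits[i] = NearestWallHit(ray);

            // Bake the hit into the vertex color, as described above (a real implementation
            // needs a float-capable channel or normalization to survive the vertex buffer).
            colors[i] = new Color(bakedHits[i].x, bakedHits[i].y, bakedHits[i].z);
        }
        mesh.colors = colors;
    }

    // Steps 3-4: per frame (the eye moves), turn each baked hit into an eye-viewport UV.
    void LateUpdate() {
        if (bakedHits == null) return;
        Mesh mesh = distortionPlane.mesh;
        Vector2[] uvs = new Vector2[bakedHits.Length];
        for (int i = 0; i < bakedHits.Length; i++) {
            Vector3 vp = eyeCamera.WorldToViewportPoint(bakedHits[i]);
            uvs[i] = new Vector2(vp.x, vp.y);
        }
        mesh.uv = uvs;
    }

    static bool RaySphere(Ray ray, Vector3 center, float radius, out Vector3 hit) {
        Vector3 oc = ray.origin - center;
        float b = Vector3.Dot(oc, ray.direction);
        float c = oc.sqrMagnitude - radius * radius;
        float disc = b * b - c;
        hit = Vector3.zero;
        if (disc < 0f) return false;
        float t = -b - Mathf.Sqrt(disc);  // nearest intersection in front of the ray origin
        if (t < 0f) return false;
        hit = ray.GetPoint(t);
        return true;
    }

    Vector3 NearestWallHit(Ray ray) {
        float best = float.MaxValue;
        Vector3 hit = ray.GetPoint(1f);   // fallback if no wall is hit
        foreach (Plane wall in walls) {
            if (wall.Raycast(ray, out float t) && t < best) { best = t; hit = ray.GetPoint(t); }
        }
        return hit;
    }
}
```

Because the wall hits are baked once, the only work that depends on the viewer is the per-frame reprojection, which is what keeps the runtime cost down to the cheap per-vertex transform mentioned above.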

This example scene is set up so that the "eye camera" looks through the ring on the right Vive controller.

Here are Before/After comparison screenshots:

Video (shows 3D effect):

3D Warped CAVE Demo

(note the wavy distortions in the grid pattern introduced by the budget spherical mirror; primary surface reflectors would not have this issue)
