
google / Neural Light Transport

Licence: apache-2.0
Code and Data Release for Neural Light Transport (NLT)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Neural Light Transport

Solarsys
Realistic Solar System simulation with three.js
Stars: ✭ 49 (-50%)
Mutual labels:  computer-graphics
Imath
Imath is a C++ and python library of 2D and 3D vector, matrix, and math operations for computer graphics
Stars: ✭ 70 (-28.57%)
Mutual labels:  computer-graphics
Shoebot
Easy vector graphics with Python
Stars: ✭ 88 (-10.2%)
Mutual labels:  computer-graphics
Gloom
A minimalistic boilerplate for OpenGL with C++.
Stars: ✭ 54 (-44.9%)
Mutual labels:  computer-graphics
Graphics Algorithm
Code for 3D graphics algorithms, including software rendering, ray tracing, PBR, and more.
Stars: ✭ 67 (-31.63%)
Mutual labels:  computer-graphics
Bru 9
Aesthetic Engine 2
Stars: ✭ 74 (-24.49%)
Mutual labels:  computer-graphics
Pifu
This repository contains the code for the paper "PIFu: Pixel-Aligned Implicit Function for High-Resolution Clothed Human Digitization"
Stars: ✭ 1,021 (+941.84%)
Mutual labels:  computer-graphics
Spectral Clara Lux Tracer
A physically based ray tracer with support for multiple shading models and Color Rendering Index (CRI) evaluation. Project developed for my master's thesis at the University of Milano-Bicocca.
Stars: ✭ 91 (-7.14%)
Mutual labels:  computer-graphics
Metatrace
Stars: ✭ 67 (-31.63%)
Mutual labels:  computer-graphics
Opengl Renderer
Modern OpenGL renderer written in C++17
Stars: ✭ 85 (-13.27%)
Mutual labels:  computer-graphics
Bfm to flame
Convert from Basel Face Model (BFM) to the FLAME head model
Stars: ✭ 55 (-43.88%)
Mutual labels:  computer-graphics
Pix2pix
Image-to-image translation with conditional adversarial nets
Stars: ✭ 8,765 (+8843.88%)
Mutual labels:  computer-graphics
Seam Erasure
Seamlessly erase seams from your favorite 3D models.
Stars: ✭ 80 (-18.37%)
Mutual labels:  computer-graphics
Graphics Snippets
Shading techniques and GLSL snippets
Stars: ✭ 53 (-45.92%)
Mutual labels:  computer-graphics
Miyuki Renderer
Experimental Physically Based Renderer
Stars: ✭ 89 (-9.18%)
Mutual labels:  computer-graphics
Meshcnn
Convolutional Neural Network for 3D meshes in PyTorch
Stars: ✭ 1,032 (+953.06%)
Mutual labels:  computer-graphics
Ptahrenderer
A small software graphics renderer
Stars: ✭ 71 (-27.55%)
Mutual labels:  computer-graphics
Tiny3d
A Small OpenGL Based Renderer
Stars: ✭ 94 (-4.08%)
Mutual labels:  computer-graphics
Cubbyflow V1
Voxel-based fluid simulation engine for computer games
Stars: ✭ 90 (-8.16%)
Mutual labels:  computer-graphics
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-16.33%)
Mutual labels:  computer-graphics

Neural Light Transport (NLT)

ACM Transactions on Graphics 2021

[Original Resolution] [arXiv] [Publisher] [Video] [Project] [BibTeX]

[Teaser image]

This is the authors' code release for:

Neural Light Transport for Relighting and View Synthesis
Xiuming Zhang, Sean Fanello, Yun-Ta Tsai, Tiancheng Sun, Tianfan Xue, Rohit Pandey, Sergio Orts-Escolano, Philip Davidson, Christoph Rhemann, Paul Debevec, Jonathan T. Barron, Ravi Ramamoorthi, William T. Freeman
TOG 2021

in which we show how to train neural networks to perform simultaneous relighting and view synthesis, exhibiting complex light transport effects (such as specularity, subsurface scattering, and global illumination):

[Animations: Dragon (specular) and Dragon (subsurface scattering)]

This repository contains our rendered data, the code that rendered those data, and TensorFlow 2 (eager) code for training and testing NLT models.

If you use this repository or find it useful, please cite the paper (BibTeX).

This is not an officially supported Google product.

Before You Start...

Relighting Only?

The UV texture space formulation is most useful when views vary. If you are doing relighting from a fixed viewpoint, you can simply skip mapping between the camera and UV spaces. That is, you can just treat the camera-space ("normal-looking") images as UV-unwrapped ones. Intuitively, this is equivalent to using an identity mapping as UV (un)wrapping.
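
Concretely, here is a minimal sketch (illustrative only, not the repository's API) of what that identity (un)wrapping amounts to:

    # Minimal, illustrative sketch: with a fixed viewpoint, the UV
    # (un)wrapping step degenerates into a no-op, so camera-space images
    # can be treated as if they were already UV-unwrapped.
    import numpy as np

    def identity_unwrap(cam_img: np.ndarray) -> np.ndarray:
        """Treat the camera-space image as if it were already in UV space."""
        return cam_img  # no resampling needed when the viewpoint is fixed

    def identity_wrap(uv_img: np.ndarray) -> np.ndarray:
        """Map the 'UV-space' output back to camera space (also a no-op)."""
        return uv_img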

Relighting or View Synthesis (Not Simultaneously)?

If you do not care about simultaneous relighting and view synthesis, you can simply use a "slice" of the released data. For instance, if you are doing just view synthesis, then you can fix lighting by training on just the multi-view data under that lighting condition.

If you are rendering your own scene (see the data generation folder), use a single JSON path with no wildcard to fix the view or light.
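
As an illustration only (the paths and patterns below are hypothetical; the real file layout is documented in the data generation folder), "a single JSON path with no wildcard" means pinning down one light and/or one view instead of sweeping over all of them:

    # Hypothetical paths for illustration; see the data generation folder
    # for the actual file layout and configuration.
    import glob

    every_pair = glob.glob('scenes/dragon/light???_cam???.json')   # all light-view pairs
    fixed_light = glob.glob('scenes/dragon/light005_cam???.json')  # one light, all views (view synthesis only)
    fixed_both = ['scenes/dragon/light005_cam010.json']            # no wildcard: one light and one view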

Data

We provide both our rendered data and the scripts, so that you can either just use our data or render your own Blender scenes.

Download Metadata

See "Downloads -> Metadata" of the project page.

Download Our Rendered Data

See "Downloads -> Rendered Data" of the project page.

(Optional) Render Your Own Data

Blender 2.78c is used for scene modeling and rendering. The code was tested on Ubuntu 16.04.6 and 18.04.3 LTS, but should work with other reasonable OS versions.

See the data generation folder and its own README.
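
For orientation, headless rendering with Blender generally boils down to invoking Blender in background mode with a scene file and a driver script. The sketch below uses hypothetical file names; the actual entry points are described in the data generation README:

    # Rough sketch of driving Blender 2.78c headlessly. The .blend file,
    # driver script, and arguments after '--' are hypothetical placeholders.
    import subprocess

    subprocess.run([
        'blender', '--background', 'scenes/dragon.blend',  # hypothetical scene
        '--python', 'render.py',                           # hypothetical driver script
        '--', '--outdir', 'out/dragon',                    # args after '--' go to the script
    ], check=True)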

Model Training and Testing

We use TensorFlow 2 (eager execution) for neural network training and testing. The code was tested on Ubuntu 16.04.6 LTS, but should work with other reasonable TensorFlow or OS versions.

See the training and testing folder and its own README.
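
For readers new to TensorFlow 2, a generic eager-mode training step looks like the sketch below. This is not the NLT training loop (the actual models and losses live in the training and testing folder), just the gradient-tape pattern the code builds on:

    # Generic TensorFlow 2 eager-mode training step (illustrative only).
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu'),
        tf.keras.layers.Dense(3),
    ])  # placeholder network, not the NLT architecture
    optimizer = tf.keras.optimizers.Adam(1e-4)

    def train_step(inputs, targets):
        with tf.GradientTape() as tape:
            preds = model(inputs, training=True)
            loss = tf.reduce_mean(tf.square(preds - targets))  # simple L2 loss
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        return loss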

Pre-Trained Models

See "Downloads -> Pre-Trained Models" of the project page.

Issues or Questions?

If the issue is code-related, please open an issue here.

For questions, please also consider opening an issue, as it may benefit future readers. Otherwise, email Xiuming Zhang.

Changelog

  • 01/07/2021: See the 01/07/2021 commits; v3 of the paper and video (TOG camera-ready version).
  • 08/20/2020: Updated the paper and the video.
  • 08/07/2020: Initial release.