
HannesStark / SMPL-NeRF

Licence: other
Embed human pose information into neural radiance fields (NeRF) to render images of humans in desired poses 🏃 from novel views

Programming Languages

  • Python
  • C++
  • Cuda
  • POV-Ray SDL

Projects that are alternatives of or similar to SMPL-NeRF

improved-nerfmm
Unofficial & improved implementation of NeRF--: Neural Radiance Fields Without Known Camera Parameters
Stars: ✭ 172 (+493.1%)
Mutual labels:  nerf, neural-radiance-fields
New-View-Synthesis
Collecting papers about new view synthesis
Stars: ✭ 437 (+1406.9%)
Mutual labels:  nerf, neural-radiance-fields
Nerf
Code release for NeRF (Neural Radiance Fields)
Stars: ✭ 4,062 (+13906.9%)
Mutual labels:  nerf, neural-radiance-fields
DSNeRF
Code release for DS-NeRF (Depth-supervised Neural Radiance Fields)
Stars: ✭ 343 (+1082.76%)
Mutual labels:  nerf, neural-radiance-fields
ICON
ICON: Implicit Clothed humans Obtained from Normals (CVPR 2022)
Stars: ✭ 641 (+2110.34%)
Mutual labels:  smpl
torch-ngp
A PyTorch CUDA extension implementation of instant-ngp (SDF and NeRF), with a GUI.
Stars: ✭ 1,317 (+4441.38%)
Mutual labels:  nerf
instant-ngp
Instant neural graphics primitives: lightning fast NeRF and more
Stars: ✭ 1,863 (+6324.14%)
Mutual labels:  nerf
tiny-cuda-nn
Lightning fast & tiny C++/CUDA neural network framework
Stars: ✭ 908 (+3031.03%)
Mutual labels:  nerf
gnerf
[ICCV 2021 Oral] Jointly estimates camera poses and neural radiance fields when the cameras are initialized at random poses in complex scenarios (outside-in scenes, even with little texture or intense noise)
Stars: ✭ 152 (+424.14%)
Mutual labels:  nerf
PyMAF
[ICCV 2021 Oral] PyMAF: 3D Human Pose and Shape Regression with Pyramidal Mesh Alignment Feedback Loop
Stars: ✭ 333 (+1048.28%)
Mutual labels:  smpl
CIPS-3D
3D-aware GANs based on NeRF (arXiv).
Stars: ✭ 562 (+1837.93%)
Mutual labels:  nerf
hypernerf
Code for "HyperNeRF: A Higher-Dimensional Representation for Topologically Varying Neural Radiance Fields"
Stars: ✭ 735 (+2434.48%)
Mutual labels:  nerf
MEVA
Official implementation of the ACCV 2020 paper "3D Human Motion Estimation via Motion Compression and Refinement" (identical repo to https://github.com/KlabCMU/MEVA, kept in sync)
Stars: ✭ 93 (+220.69%)
Mutual labels:  smpl
NeuRay
[CVPR 2022] Neural Rays for Occlusion-aware Image-based Rendering
Stars: ✭ 291 (+903.45%)
Mutual labels:  nerf
HybrIK
Official code of "HybrIK: A Hybrid Analytical-Neural Inverse Kinematics Solution for 3D Human Pose and Shape Estimation", CVPR 2021
Stars: ✭ 395 (+1262.07%)
Mutual labels:  smpl
mvsnerf
[ICCV 2021] A novel neural rendering approach that efficiently reconstructs geometric and neural radiance fields for view synthesis.
Stars: ✭ 533 (+1737.93%)
Mutual labels:  nerf
cv-arxiv-daily
🎓 Automatically updates CV papers daily using GitHub Actions (updated every 12 hours)
Stars: ✭ 216 (+644.83%)
Mutual labels:  nerf
neurecon
Multi-view 3D reconstruction using neural rendering. Unofficial implementation of UNISURF, VolSDF, NeuS and more.
Stars: ✭ 697 (+2303.45%)
Mutual labels:  nerf
MISE
Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+637.93%)
Mutual labels:  nerf
Nerf-Gun-Call-of-Duty-Warzone-Controller
DIY Call of Duty Warzone controller built using a Nerf gun powered by the Raspberry Pi 4.
Stars: ✭ 18 (-37.93%)
Mutual labels:  nerf

PyTorch Neural Radiance Fields (NeRF) with SMPL embedding

Video | Paper

This repository is a PyTorch implementation of NeRF, which can be trained on images of a scene and then used to render novel views. To run the vanilla NeRF, launch the train file with model_type=nerf. Additional model types embed an SMPL model of a human so that its pose can be controlled in addition to the viewpoint.
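As background on what such a model learns: NeRF feeds each 3D sample point (and view direction) through a sinusoidal positional encoding before the MLP, which is what lets the network represent high-frequency detail. A minimal sketch of that encoding, following the Mildenhall et al. paper cited below; the function name and defaults are illustrative, not taken from this repository's code:

```python
import math

def positional_encoding(x, num_freqs=10, include_input=True):
    """Map each coordinate to [sin(2^k * pi * x), cos(2^k * pi * x)] features,
    as in the NeRF paper (Mildenhall et al., 2020)."""
    out = list(x) if include_input else []
    for k in range(num_freqs):
        freq = (2.0 ** k) * math.pi
        for xi in x:
            out.append(math.sin(freq * xi))
            out.append(math.cos(freq * xi))
    return out

# A 3D point becomes a 3 + 3*2*10 = 63-dimensional feature vector.
feat = positional_encoding([0.1, -0.5, 0.3])
print(len(feat))  # 63
```

The MLP then maps this 63-dimensional vector (plus an encoded view direction) to density and color.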

Quickstart

  • Create a 128x128 synthetic dataset of humans with different arm angles for the model type smpl_nerf.
python create_dataset.py --dataset=smpl_nerf --save_dir=data --resolution=128 --start_angle=0 --end_angle=1 --number_steps=1 --human_number_steps=10 --multi_human_pose=1 --human_start_angle=0 --human_end_angle=60
  • Install torchsearchsorted.
cd torchsearchsorted
pip install .
  • Run the train file.
python train.py --experiment_name=SMPLNeRF --model_type=smpl_nerf --dataset_dir=data --batchsize=64 --batchsize_val=64 --num_epochs=100 --netdepth=8 --run_fine=1 --netdepth_fine=8
  • Start tensorboard.
tensorboard --logdir=logs/summaries --port=6006

Navigate to localhost:6006 in your browser and watch the model train.
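The angle flags in the dataset-creation command sweep the camera (start_angle, end_angle, number_steps) and the human arm pose (human_start_angle, human_end_angle, human_number_steps) over evenly spaced steps. A small illustrative sketch of that interpolation; this is a hypothetical helper, not the repository's actual sampling code:

```python
def angle_steps(start, end, n):
    """Evenly spaced angles from start to end (inclusive), n steps.
    Conceptually mirrors the --start_angle/--end_angle/--number_steps flags."""
    if n == 1:
        return [start]
    return [start + i * (end - start) / (n - 1) for i in range(n)]

# Quickstart values: one camera step, ten human arm angles from 0 to 60 degrees.
camera_angles = angle_steps(0, 1, 1)    # [0]
human_angles = angle_steps(0, 60, 10)   # roughly 0.0, 6.67, ..., 60.0
```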

Requirements

  • PyTorch >=1.4
  • matplotlib
  • numpy
  • imageio
  • configargparse
  • torchsearchsorted
  • smplx

Creating synthetic datasets requires

  • pyrender
  • trimesh

Setting up the Baseline

  • Clone the Pix2Pix repo (TODO: install dependencies):
mkdir baseline/
cd baseline/
git clone https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix.git
cd ..
  • Create a dataset of (RGB, depth) pairs:
python create_dataset.py --dataset=pix2pix --save_dir=baseline/pytorch-CycleGAN-and-pix2pix/datasets/smpl --resolution=128 --start_angle=-90 --end_angle=90 --number_steps=10
  • Train Pix2Pix on the dataset (set --name for the experiment; set --gpu_ids=-1 for CPU):
cd baseline/pytorch-CycleGAN-and-pix2pix/
python train.py --gpu_ids=0 --model=pix2pix --dataroot=datasets/smpl --name=SMPL_pix2pix --direction=BtoA --save_epoch_freq=50
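For orientation, pix2pix's aligned-dataset format stores each training sample as one image with the two domains concatenated side by side, and with --direction=BtoA the model learns the B-to-A mapping (here presumably depth to RGB). A toy sketch of the side-by-side concatenation using nested lists as stand-in images; the actual create_dataset.py may lay files out differently:

```python
def concat_ab(img_a, img_b):
    """Concatenate two equal-height images horizontally, row by row,
    producing the side-by-side AB layout pix2pix's aligned dataset expects."""
    assert len(img_a) == len(img_b), "images must have equal height"
    return [row_a + row_b for row_a, row_b in zip(img_a, img_b)]

rgb = [[1, 2], [3, 4]]      # stand-in 2x2 RGB image
depth = [[9, 9], [8, 8]]    # stand-in 2x2 depth map
ab = concat_ab(rgb, depth)  # 2x4: [[1, 2, 9, 9], [3, 4, 8, 8]]
```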

Model Types

  • nerf: the vanilla NeRF
  • smpl_nerf: NeRF with an embedded SMPL model, so the human pose can be controlled in addition to the viewpoint (used in the Quickstart above)
  • image_wise_dynamic: rays are not shuffled between images during processing, so the warp for every ray is computed only once
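The image_wise_dynamic behaviour described above can be sketched as caching one warp per source image instead of recomputing it per ray. This is a hypothetical illustration of the batching idea, not the repository's actual implementation:

```python
def process_rays_image_wise(rays_by_image, compute_warp, apply_warp):
    """Process rays grouped by source image so each image's warp
    (e.g. an SMPL-driven deformation) is computed exactly once,
    instead of once per ray as with fully shuffled batches."""
    results = []
    for image_id, rays in rays_by_image.items():
        warp = compute_warp(image_id)  # one warp per image
        results.extend(apply_warp(warp, ray) for ray in rays)
    return results

# Usage: count how often the (stub) warp is computed.
calls = []
rays = {"img0": [1, 2, 3], "img1": [4, 5]}
out = process_rays_image_wise(
    rays,
    compute_warp=lambda i: calls.append(i) or i,  # stub warp = image id
    apply_warp=lambda w, r: (w, r),
)
print(len(calls))  # 2: one warp per image, not one per ray
```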

NeRF is from the paper:

Ben Mildenhall, Pratul P. Srinivasan, Matthew Tancik, Jonathan T. Barron, Ravi Ramamoorthi, and Ren Ng (2020). NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis.
