
nghorbani / moshpp

Licence: other
Motion and Shape Capture from Sparse Markers

Programming Languages

python
139335 projects - #7 most used programming language
C++
36643 projects - #6 most used programming language
cython
566 projects

Projects that are alternatives to or similar to moshpp

VRMocap
A SteamVR powered mocap solution for Unreal Engine
Stars: ✭ 88 (+2.33%)
Mutual labels:  mocap
blockbuster
The Machinima Studio mod
Stars: ✭ 108 (+25.58%)
Mutual labels:  mocap
vicon
Code for working with the Vicon tracking system
Stars: ✭ 20 (-76.74%)
Mutual labels:  vicon
vicon2gt
Vicon-IMU fusion for groundtruth trajectory generation.
Stars: ✭ 41 (-52.33%)
Mutual labels:  vicon
biomechanics dataset
Information of public available data sets for biomechanics.
Stars: ✭ 31 (-63.95%)
Mutual labels:  mocap
ManosOsc
(Eyebeam #13 of 13) Output OSC, MIDI, and After Effects/Maya animation scripts from the Leap Motion controller.
Stars: ✭ 53 (-38.37%)
Mutual labels:  mocap
gaitutils
Extract and visualize gait data
Stars: ✭ 28 (-67.44%)
Mutual labels:  vicon
QTM-Connect-For-Unreal
Unreal plugin for real-time streaming from Qualisys Track Manager
Stars: ✭ 18 (-79.07%)
Mutual labels:  mocap
Azure-Kinect-Python
Python 3 bindings for the Azure Kinect SDK
Stars: ✭ 36 (-58.14%)
Mutual labels:  mocap

MoSh++

This repository contains the official chumpy-based implementation of MoSh++, the mocap body solver used to create AMASS:

AMASS: Archive of Motion Capture as Surface Shapes
Naureen Mahmood, Nima Ghorbani, Nikolaus F. Troje, Gerard Pons-Moll, Michael J. Black
Full paper | Video | Project website | Poster

Description

This repository holds the code for MoSh++, introduced in the AMASS paper (ICCV 2019). MoSh++ is the upgraded version of MoSh (SIGGRAPH Asia 2014). Given a labeled marker-based motion capture (mocap) c3d file and the correspondences of the marker labels to locations on the body, MoSh++ returns model parameters for every frame of the mocap sequence. The current MoSh++ code works with SMPL-family body models.
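As a rough sketch of the inputs involved (not of the moshpp API itself), the snippet below reads labeled marker trajectories from a c3d file with the third-party ezc3d package and pairs them with a marker-label-to-body-location mapping. The file name and the mapping are purely illustrative, and the actual marker-layout configuration used by MoSh++ is covered in the SOMA tutorials.

import ezc3d
import numpy as np

# Illustrative input only: the kind of data MoSh++ consumes, not its actual API.
c3d = ezc3d.c3d('subject_walking.c3d')  # hypothetical mocap file

# Marker trajectories: ezc3d returns (4, n_markers, n_frames);
# keep xyz and reorder to (n_frames, n_markers, 3).
points = c3d['data']['points']
markers = np.transpose(points[:3], (2, 1, 0))

# Marker labels as recorded by the mocap system (e.g. Vicon).
labels = c3d['parameters']['POINT']['LABELS']['value']

# Hypothetical label -> body-location correspondences; in MoSh++ this mapping
# comes from a marker-layout configuration.
marker_layout = {'LFHD': 'head_left_front', 'RFHD': 'head_right_front'}

print(f'{markers.shape[0]} frames, {len(labels)} markers, e.g. {labels[:3]}')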

Installation

This repository requires Python 3.7 and chumpy, a CPU-based auto-differentiation package. It is meant to be used together with SOMA, the mocap auto-labeling package, so please install MoSh++ inside the conda environment of SOMA. Clone the moshpp repository and run the following from its root directory:

sudo apt install libtbb-dev

pip install -r requirements.txt

cd src/moshpp/scan2mesh
sudo apt install libeigen3-dev
pip install -r requirements.txt
cd mesh_distance
make

cd ../../../..
python setup.py install
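As a quick sanity check of the build (the environment name soma below is only an assumption about how the SOMA environment was created; use the name from your own SOMA installation):

conda activate soma
python -c "import moshpp; print(moshpp.__file__)"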

Tutorials

This repository is a complementary package to SOMA, an automatic mocap solver. Please refer to the SOMA repository for tutorials and use cases.

Citation

Please cite the following paper if you use this code directly or indirectly in your research/projects:

@inproceedings{AMASS:2019,
  title={AMASS: Archive of Motion Capture as Surface Shapes},
  author={Mahmood, Naureen and Ghorbani, Nima and Troje, Nikolaus F. and Pons-Moll, Gerard and Black, Michael J.},
  booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
  year={2019},
  month = {Oct},
  url = {https://amass.is.tue.mpg.de},
  month_numeric = {10}
}

Please also consider citing the original MoSh paper by Loper et al., SIGGRAPH Asia 2014:

   @article{Loper:SIGASIA:2014,
     title = {{MoSh}: Motion and Shape Capture from Sparse Markers},
     author = {Loper, Matthew M. and Mahmood, Naureen and Black, Michael J.},
     address = {New York, NY, USA},
     publisher = {ACM},
     month = nov,
     number = {6},
     volume = {33},
     pages = {220:1--220:13},
     journal = {ACM Transactions on Graphics, (Proc. SIGGRAPH Asia)},
     url = {http://doi.acm.org/10.1145/2661229.2661273},
     year = {2014},
     doi = {10.1145/2661229.2661273}
   }

License

Software Copyright License for non-commercial scientific research purposes. Please read carefully the terms and conditions and any accompanying documentation before you download and/or use the MoSh++ data, software, scripts, and animations (the "Data & Software"). By downloading and/or using the Data & Software (including downloading, cloning, installing, and any other use of this repository), you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Data & Software. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

The software is compiled using CGAL sources following the license in CGAL_LICENSE.pdf

Contact

The code in this repository was developed by Nima Ghorbani, Naureen Mahmood, and Matthew Loper while at the Max Planck Institute for Intelligent Systems, Tübingen, Germany.

If you have any questions you can contact us at [email protected].

For commercial licensing, contact [email protected]
