
ahmedosman / star


ECCV 2020 - Official code repository for the paper: STAR - A Sparse Trained Articulated Human Body Regressor

Home Page: https://star.is.tue.mpg.de

License: Other

Python 100.00%
Topics: smpl, smplx, human body, pose-estimation, graphics-3d, eccv2020, 3d-models, graphics

star's Introduction

STAR: Sparse Trained Articulated Human Body Regressor

[Project Page] [Paper] [Supp. Mat.]

Table of Contents

  • License
  • Description
  • Content
  • Installation
  • Usage
  • SMPL Comparison
  • Citation
  • Acknowledgments
  • Contact

License

Software Copyright License for non-commercial scientific research purposes. Please read the LICENSE file and any accompanying documentation carefully before you download and/or use the STAR model and software (the "Data & Software"). By downloading and/or using the Data & Software (including downloading, cloning, installing, and any other use of the corresponding GitHub repository), you acknowledge that you have read these terms and conditions in the LICENSE file, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Data & Software. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

Description

STAR - A Sparse Trained Articulated Human Body Regressor - is a generative 3D human body model designed to be a drop-in replacement for the widely used SMPL model. STAR is trained on a large dataset of 14,000 human subjects, with a learned set of sparse and spatially local pose corrective blend shapes. In the figure below, a single joint movement only influences a sparse set of the model vertices. The mesh vertices in gray are not affected by the joint movement. In contrast, for SMPL, bending the left elbow causes a bulge in the right elbow.
STAR is publicly available with the full 300 principal-component shape space for research purposes from our website https://star.is.tue.mpg.de/

For more details, please see our ECCV paper STAR: Sparse Trained Articulated Human Body Regressor.

Content

This repository contains the model loaders for the following auto-differentiation frameworks:

  • PyTorch.
  • TensorFlow 2.0.
  • Chumpy.

Code tested with Python 3.6.9, CUDA 10.1, cuDNN 7.6.5, PyTorch 1.6.0, TensorFlow 2.3, and Chumpy 0.69 on Ubuntu 18.04.

Installation

Install

We recommend doing the following in a Python 3 virtual environment.

  1. Clone the repository:
git clone git@github.com:ahmedosman/STAR.git
  2. Install your favorite framework:
    Chumpy
pip install chumpy==0.69
pip install opencv-python

PyTorch

pip install torch==1.6.0

TensorFlow

pip install tensorflow-gpu==2.3
  3. Download the models from our website https://star.is.tue.mpg.de/

  4. Update the model paths in the config.py file.

path_male_star = '/mypath/male/model.npz'
path_female_star = '/mypath/female/model.npz'
path_neutral_star = '/mypath/neutral/model.npz'
  5. Install with pip:
pip install .
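
Once installed, a quick sanity check (a minimal sketch; it assumes the PyTorch loader, that config.py already points at a downloaded model, and a CUDA-capable GPU, since the loader allocates CUDA tensors):

from star.pytorch.star import STAR

# Build the neutral model with the default 10 betas; this fails loudly if the
# paths in config.py are wrong or the model files are missing.
star = STAR(gender='neutral', num_betas=10)
print(star.v_template.shape)  # expected: torch.Size([6890, 3])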

Usage

Under demos/* there are scripts demonstrating how to load and use the model in all frameworks.

    $PATH_TO_REPO/
    ├── demos
    │   ├── compare_frameworks.py # Unit test constructing the model in all three frameworks and comparing the outputs
    │   ├── load_chumpy.py        # A script demonstrating loading the model in Chumpy
    │   ├── load_tf.py            # A script demonstrating loading the model in TensorFlow
    │   ├── load_torch.py         # A script demonstrating loading the model in PyTorch
    │   ├── profile_tf.py         # A script profiling the STAR graph as a function of batch size in TensorFlow
    │   └── profile_torch.py      # A script profiling the STAR graph as a function of batch size in PyTorch
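
For orientation, here is a minimal PyTorch usage sketch (hedged; it is assembled from the call patterns that appear in the demos and the issue threads below, and it assumes a CUDA device because the loader allocates CUDA tensors):

import numpy as np
import torch
from torch.autograd import Variable
from star.pytorch.star import STAR

batch_size = 1
star = STAR(gender='female', num_betas=10)

# A zero pose (72 = 24 joints x 3 axis-angle values) and zero betas give the mean body in the rest pose.
poses = Variable(torch.cuda.FloatTensor(np.zeros((batch_size, 72))), requires_grad=True)
betas = Variable(torch.cuda.FloatTensor(np.zeros((batch_size, 10))), requires_grad=True)
trans = Variable(torch.cuda.FloatTensor(np.zeros((batch_size, 3))), requires_grad=True)

d = star(poses, betas, trans)  # posed vertices, shape (batch_size, 6890, 3)
print(d.shape, d.f.shape)      # the triangle faces are attached to the output as d.f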

SMPL Comparison

STAR is designed to be a drop-in replacement for SMPL: like SMPL, it is parameterised by pose and shape, and it has the same template resolution and kinematic tree.

STAR Kinematic Tree

Citation

If you find this Model & Software useful in your research, we kindly ask you to cite:

@inproceedings{STAR:2020,
      author = {Osman, Ahmed A A and Bolkart, Timo and Black, Michael J.},
      title = {{STAR}: A Sparse Trained Articulated Human Body Regressor},
      booktitle = {European Conference on Computer Vision (ECCV)},
      pages = {598--613},
      year = {2020},
      url = {https://star.is.tue.mpg.de}
}

Acknowledgments

We thank Naureen M. Mahmood, Talha Zaman, Nikos Athanasiou, Joachim Tesch, Muhammed Kocabas, Nikos Kolotouros and Vassilis Choutas for the discussions; Sai Kumar Dwivedi, Lea Muller, Amir Ahmad and Nitin Saini for proofreading the script; Mason Landry for the video voice-over; and Benjamin Pellkofer for IT support.

Contact

For questions, please contact [email protected].

For commercial licensing (and all related questions for business applications), please contact [email protected].

star's People

Contributors

ahmedosman, michaeljblack, neoglez


star's Issues

Question about input data format of "convert_smpl_to_star.py"

Thanks for your amazing work!
I want to convert SMPL vertices to STAR, but it fails.
I think I may have mistaken the format of the input data. The input data I use is SMPL vertices from the people_snapshot dataset, with shape (6890, 3).
I ran convert_smpl_to_star.py, and the output is:

/project/macaoyuan/STAR/convertors/losses.py:78: UserWarning: The Default optimization parameters (MAX_ITER_EDGES,MAX_ITER_VERTS) were tested on batch size 32 or smaller batches
  'The Default optimization parameters (MAX_ITER_EDGES,MAX_ITER_VERTS) were tested on batch size 32 or smaller batches')
Loading the SMPL Meshes and 
STAGE 1/2 - Fitting the Model on Edges Objective
Traceback (most recent call last):
  File "./convertors/convert_smpl_to_star.py", line 56, in <module>
    np_poses , np_betas , np_trans , star_verts = convert_smpl_2_star(smpl,**opt_parms)
  File "/project/macaoyuan/STAR/convertors/losses.py", line 98, in convert_smpl_2_star
    d = star(poses, betas, trans)
  File "/home/macaoyuan/.local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/macaoyuan/.conda/envs/neuralbody/lib/python3.7/site-packages/star/pytorch/star.py", line 139, in forward
    v = v + trans[:,None,:]
RuntimeError: CUDA out of memory. Tried to allocate 544.00 MiB (GPU 0; 15.78 GiB total capacity; 13.47 GiB already allocated; 194.00 MiB free; 14.25 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I think there is something wrong with my input data. Can you explain the expected input format for the conversion script?
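
A hedged guess based on the warning in the log: the converter's default optimization settings are only tested on batches of at most 32 meshes, so a long sequence is best fed in chunks. A sketch under assumptions (the (N, 6890, 3) input layout and the convert_smpl_2_star signature are inferred from the traceback above, not verified):

import numpy as np

def convert_in_chunks(smpl_verts, convert_smpl_2_star, opt_parms, chunk=32):
    # Hypothetical chunked driver: smpl_verts is assumed to be an array of
    # SMPL vertices with an explicit batch axis, shape (N, 6890, 3).
    outputs = []
    for start in range(0, len(smpl_verts), chunk):
        batch = smpl_verts[start:start + chunk]
        np_poses, np_betas, np_trans, star_verts = convert_smpl_2_star(batch, **opt_parms)
        outputs.append((np_poses, np_betas, np_trans, star_verts))
    # Concatenate each of the four outputs along the batch axis.
    return [np.concatenate(parts, axis=0) for parts in zip(*outputs)]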

Number of betas is hard-coded to 10

The number of betas is hard-coded to 10 in

shapedirs = self.shapedirs.view(-1, 10)[None, :].expand(batch_size, -1, -1)

This is a problem because if you want to use the full published 300 shape blend shapes, torch throws the error

invalid argument 6: wrong matrix size at...
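
A hedged patch sketch (hypothetical, not the author's fix): derive the reshape width from the number of betas the model was built with instead of the literal 10, assuming STAR.__init__ stores its num_betas argument as self.num_betas.

# Hypothetical one-line patch inside STAR.forward:
shapedirs = self.shapedirs.view(-1, self.num_betas)[None, :].expand(batch_size, -1, -1)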

Database of STAR pose/shape parameters

Hi,

Thanks for your awesome work. :)
I would like to train an ML task using the STAR model. Do you know which is the biggest / most diverse set of pose and shape parameters for the STAR model available online?

Thanks a lot for your help,
Thibault

Input poses are not affecting the body model

Hi Ahmed,

Thank you for putting the effort into the STAR model, it is a great project.

I was trying to transition from SMPL to STAR; however, I realized that the input pose does not affect the body model. So I went back to one of your examples and noticed you did a test with random poses (code line), and I still got the same result when using your torch function at this line.

Do you know if there is an issue with the input poses, or am I doing something wrong? Deformation with the betas works as expected.

Here is the test code I'm using:

import numpy as np
import trimesh
import torch
import os
from star.pytorch import star
from torch.autograd import Variable

# Set the device
if torch.cuda.is_available():
    device = torch.device("cuda:0")
else:
    device = torch.device("cpu")
    print("WARNING: CPU only, this will be slow!")

betas = np.array([
            np.array([ 0.25176191, -3.0, 0.46747496, 0.89178988,
                      2.20098416, 0.26102114, -3.07428093, 0.55708514,
                      -3.94442258, -2.88552087])])

num_betas=10
batch_size=1
bodymodel = star.STAR(gender='male',num_betas=num_betas)

pose_np  = np.random.normal(0,2,(1,72))
# print('poses: ', pose_np)

# Random poses
poses = torch.cuda.FloatTensor(pose_np)
poses = Variable(poses,requires_grad=True)
betas = torch.cuda.FloatTensor(betas)
betas = Variable(betas,requires_grad=True)
trans = torch.cuda.FloatTensor(np.zeros((batch_size,3)))
trans = Variable(trans,requires_grad=True)
model = bodymodel(poses, betas,trans)
shaped = model.v_shaped

vertices = shaped.detach().cpu().numpy().squeeze()
faces = model.f
joints = model.J_transformed

mesh_tm = trimesh.Trimesh(vertices, faces, process=False) 
mesh_tm.show()

Thank you!

Pedro.
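
For reference, a hedged sketch of what likely resolves this: the forward output itself carries the posed vertices, while v_shaped is the shape-blended template before the pose is applied (the same access pattern appears in the conversion snippet further down this page).

# Hypothetical fix for the snippet above: read posed vertices from the
# forward output instead of v_shaped (the unposed, shape-blended template).
model = bodymodel(poses, betas, trans)
vertices = model.detach().cpu().numpy().squeeze()  # posed vertices
faces = model.f
mesh_tm = trimesh.Trimesh(vertices, faces, process=False)
mesh_tm.show()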

Are arms unnaturally long in STAR (SMPL)?

If I remember correctly, in the original CAESAR data the hands are scanned as fists. I guess the registration process ignored the fit of the hands in order to create a statistical shape model that includes hands with fingers spread apart.

I am trying to register the STAR model to a surface from the Virtual Population dataset, segmented from magnetic resonance images. I get reasonable fits, except for the hands. The arms seem to remain too long.

I measured the length in the mean model (female) and found the arms are 10% longer than those of Ella.

Has anybody else observed this?

How to get a mesh in STAR format?

I am trying to make a STAR mesh from a human lidar scan but am not sure how to go about this. The mesh I am using is incomplete, as shown, and is a .glb file. Is there a way to do this conversion? Thanks


How to texture the STAR model in UV space?

Hi

I tried to texture the STAR model using the same code as for SMPL; I thought that with the same number of vertices (6890) it would work. However, the results look very scrambled.

Were the vertices re-ordered or changed when retraining STAR? I thought that only the shape and pose spaces are different, but that the mesh vertices are the same. Is there a similar example of how to use UV texturing for the STAR model?

[image: bad_uv]

thanks a lot
Nikolay

How to get the standard 14 common 3D joint locations from STAR verts

If I use the STAR model to replace the SMPL model when training a 3D human pose estimation task (such as VIBE or SPIN), I need 14 3D joint locations which are not included in the STAR kinematic tree. I think the original J_regressor in the model can't accomplish this.
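
A common workaround (a sketch under assumptions, not something shipped with STAR): apply an external joint regressor, such as the extra regressor SPIN/VIBE use for their keypoints, to the STAR vertices. This relies on STAR keeping SMPL's 6890-vertex template, as the README's drop-in claim suggests; J_regressor_extra below is a hypothetical (K, 6890) matrix.

import numpy as np

def regress_extra_joints(verts, J_regressor_extra):
    # verts: (batch, 6890, 3) STAR vertices; J_regressor_extra: (K, 6890).
    # Each output joint is a fixed weighted average of mesh vertices.
    return np.einsum('kv,bvc->bkc', J_regressor_extra, verts)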

quat_feat implementation

@ahmedosman
in the quat_feat function, in the line
quat = torch.cat([v_sin * normalized,v_cos-1], dim=1)

is the -1 intentional? The comment says it should return a normalized quaternion but this -1 seems to make it non-normalized.
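
A small check of what the -1 likely achieves (a sketch, not a confirmed answer): with the real part shifted by -1, the zero axis-angle (rest pose) maps to an all-zero feature, so the pose corrective blend shapes contribute nothing at the rest pose, much like the R - I term in SMPL's pose blend shapes. The output is therefore a pose feature rather than a unit quaternion.

import torch

theta = torch.zeros(1, 3)                      # zero axis-angle rotation (rest pose)
angle = torch.norm(theta, dim=1, keepdim=True)
axis = theta / (angle + 1e-8)
feat = torch.cat([torch.sin(angle / 2) * axis,
                  torch.cos(angle / 2) - 1], dim=1)
print(feat)                                    # tensor([[0., 0., 0., 0.]])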

The error with the output mesh?

In compare_frameworks.py, I want to inspect the resulting mesh. I wrote OBJ results from tf_star, d, and model.r respectively; all of these results are wrong, but the OBJ result from v_shaped is right. What is wrong?

Parameters/ Buffers formats

Hello,
I wanted to know if there is documentation on the formats the buffer attributes of the STAR class use (e.g. how are the 'faces' stored? or the 'kintree_table', which I guess gives the positions of the joints? or even the vertices parameter verts). I want to fit the model to a point cloud of a body using optimization.

I also wanted to ask why the variable used to fill the buffers is called smpl_model, even though it comes from the config file of a STAR model.

thank you

FBX model for STAR

Hi,
I want to use the STAR model in 3D software (e.g. Unity, Blender, and Maya) via an FBX model.
I guess it may be possible to produce an FBX model from the .pkl and .npy files.
It would be great if we could use an FBX model for STAR.

Thanks for the great work!

human model orientation

Thanks for your great work! When I compare the STAR model with SMPL and SMPL-X, I find that the STAR model doesn't have a 'global_orient' parameter like SMPL and SMPL-X do. Is there a way to rotate the STAR model like SMPL?
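
A hedged pointer: in the SMPL/STAR convention, the 72-D pose packs 24 joints x 3 axis-angle values, and the first three entries are the root (pelvis) rotation, i.e. the counterpart of SMPL's global_orient; trans handles the global translation. A minimal sketch:

import numpy as np

poses = np.zeros((1, 72), dtype=np.float32)
# Rotate the whole body by 90 degrees about the (vertical) Y axis by setting
# the root joint's axis-angle rotation.
poses[:, :3] = [0.0, np.pi / 2, 0.0]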

Question about training shape and pose parameters

Very nice work and thanks for sharing your code. After reading your paper, I have a question regarding the training of the STAR model. The original SMPL paper first trains the pose corrective deformation functions and uses these to "normalize" the pose of the 3D scans before learning the shape PCA space. However, since the pose corrective deformations for STAR also depend on the beta_2 shape parameter, I wonder what the order of training was. Did you estimate beta_2 from the BMI annotations (as you show that BMI and beta_2 are correlated) and first train the pose correctives? If not, did you first train the shape space, and how did you do the pose normalization? Or do you do joint/iterative training?

[QUESTION] Is it necessary to convert SMPL parameters to STAR?

Hello, thanks for the wonderful work!
I have a task that regresses SMPL parameters from images, so the annotations are SMPL ground truth; I use the SMPL module to get human mesh vertices and then regress 3D joint locations from the mesh. If I want to replace the SMPL module with STAR, do I need to run the conversion script to convert the SMPL annotations to STAR and use the new annotations as supervision? Is the conversion necessary? Since STAR and SMPL both use a 10-dimensional beta and a 72-dimensional pose, I fed the original SMPL parameters into the STAR model and got a similar human mesh; the STAR shape looks more accurate, but the poses look similar and I cannot distinguish the pose difference just by looking.

How to correctly apply Joint angle?

This question could also apply to SMPL.
I'm not an expert on the human skeleton, and I got stuck applying existing joint-angle annotations from other human pose datasets (e.g. Human3.6M) to STAR.
When all-zero angles are applied to the model, it returns the T-pose.
From my understanding, joints 1, 2 and 3 have joint 0 (pelvis) as their parent.
So, when we measure the axis-angle using the vector from the position of joint 0 to joint 1 (in T-pose), for example, the angle would not be zero.
Is there some kind of reference vector, such that we measure the angle between the reference vector and our pose vector in axis-angle format?
Sorry for the odd terms; they are just made up by myself.
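
A hedged clarification with a tiny sketch: in the usual SMPL/STAR convention, each 3-vector in the 72-D pose is an axis-angle rotation of a joint relative to its parent's frame in the rest (T) pose, not an angle between bone vectors; a zero vector therefore means "keep the rest-pose orientation". The world rotation of a joint is obtained by accumulating these local rotations down the kinematic tree (kintree_table gives the parents).

import numpy as np
from scipy.spatial.transform import Rotation as R

axis_angle = np.array([0.0, 0.0, np.pi / 4])      # bend one joint 45 degrees about its local Z axis
R_local = R.from_rotvec(axis_angle).as_matrix()   # local rotation relative to the parent frame
# world_rotation(joint) = world_rotation(parent) @ R_local, accumulated from the root.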

How to get the pose from given 3D joint locations (24, 3)?

Thanks for your great work with excellent performance!
I am now trying to use your trained model (e.g. female.npz) and given 3D joint locations (24, 3) to rebuild a mesh.
But I notice that the model only takes a pose input. I know that each pose can be computed from the two vectors (before and after). My question is: what exactly is the rest pose? To compute each pose, it seems we must know the configuration when the pose is 0 (T-pose).
I would be most grateful if you could answer my questions despite your busy schedule!

Can SMPL's betas be used directly in STAR?

I have used the same beta parameters in SMPL and STAR, but the results are totally different. Am I using them the wrong way?

import joblib
import numpy as np
import torch
import trimesh
from tqdm import tqdm
from torch.autograd import Variable
from star.pytorch.star import STAR

smpl_para = joblib.load("20200901_zhh_y_t.pkl")
device = torch.device('cuda')
star = STAR(gender='neutral', num_betas=10)  # assumed construction; not shown in the original snippet

for i in tqdm(range(len(smpl_para[1]['frame_ids']))):
    pose = smpl_para[1]['pose'][i]
    pose = pose[np.newaxis, :]
    beta = smpl_para[1]['betas'][i]
    beta = beta[np.newaxis, :]

    poses = Variable(torch.cuda.FloatTensor(pose), requires_grad=True)
    betas = Variable(torch.cuda.FloatTensor(beta), requires_grad=True)
    trans = Variable(torch.cuda.FloatTensor(np.zeros((1, 3))), requires_grad=True)
    d = star(poses, betas, trans)

    f = "%06d" % i

    vertices = d.detach().cpu().numpy().squeeze()
    faces = d.f
    out_mesh = trimesh.Trimesh(vertices, faces)
    out_mesh.export("star_obj/" + f + ".obj")

STAR pose prior needed

Hi
First, thanks for sharing this code; it is a very cool project.

I have a use case where I am optimizing the pose and shape parameters w.r.t. some animation loss function, but the model takes on a very unnatural appearance. Do you have a function or data that gives a statistical prior on reasonable pose/shape parameters?

(e.g. as SMPL has: https://github.com/vchoutas/smplify-x/blob/master/smplifyx/prior.py)

Or at the very least, what is the numerical interval of the pose and shape parameters?
Are the shape parameters actually angles in radians?

Or would it be a good idea to convert the old SMPL data into STAR format and calculate the prior myself?

thanks, any help will be appreciated

Running without GPU

I'm working on a machine without a GPU, and it would be really nice to be able to use STAR without CUDA for development. From my understanding, most of the code imports torch.cuda, so when a GPU isn't available it throws an error. Would it be possible to disable that?
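
For what it's worth, a minimal sketch of the device-agnostic pattern this would require (hypothetical; the current loader hard-codes torch.cuda.FloatTensor when registering its buffers, so those lines would have to change):

import numpy as np
import torch

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Placeholder standing in for the dict loaded from the .npz model file.
star_model = {'weights': np.zeros((6890, 24), dtype=np.float32)}

# Register buffers as plain CPU tensors, e.g. inside STAR.__init__:
#   self.register_buffer('weights', torch.as_tensor(star_model['weights'], dtype=torch.float32))
weights = torch.as_tensor(star_model['weights'], dtype=torch.float32)
# Buffers then follow the module: star = STAR(...).to(device)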

Changing Betas does not change shape of models

Hi,
When I edit the betas of a model, it does not result in a change to the shape of the generated model.
Is this the correct way to change the body shape?

I am using the Chumpy version of STAR.

Converter script, "convert_smpl_to_star.py", doesn't work properly

Thank you for sharing your great work.

I'm trying to convert from SMPL to STAR format with the AMASS dataset.
However, your converter script doesn't work properly.
https://github.com/ahmedosman/STAR/blob/master/convertors/convert_smpl_to_star.py

I got an error as follows.

ValueError                                Traceback (most recent call last)
<ipython-input-44-c9f32fe20734> in <module>
----> 1 np_poses , np_betas , np_trans , star_verts = convert_smpl_2_star(smpl,**opt_parms)
      2 results = {'poses':np_poses,'betas':np_betas,'trans':np_trans,'star_verts':star_verts}

C:\workspace/local/STAR/star/convertors\losses.py in convert_smpl_2_star(smpl, MAX_ITER_EDGES, MAX_ITER_VERTS, NUM_BETAS, GENDER)
     78             'The Default optimization parameters (MAX_ITER_EDGES,MAX_ITER_VERTS) were tested on batch size 32 or smaller batches')
     79 
---> 80     star = STAR(gender=GENDER)
     81     global_pose = torch.cuda.FloatTensor(np.zeros((batch_size, 3)))
     82     global_pose = Variable(global_pose, requires_grad=True)

C:\workspace/local/STAR/star\pytorch\star.py in __init__(self, gender, num_betas)
     64         self.register_buffer('weights', torch.cuda.FloatTensor(star_model['weights']))
     65         # Model pose corrective blend shapes
---> 66         self.register_buffer('posedirs', torch.cuda.FloatTensor(star_model['posedirs'].reshape((-1,93))))
     67         # Mean Shape
     68         self.register_buffer('v_template', torch.cuda.FloatTensor(star_model['v_template']))

ValueError: cannot reshape array of size 9487530 into shape (93)

According to the traceback, star_model["posedirs"] cannot be reshaped to width 93.

In addition, I checked the shape in the latest version (1.1) of the STAR model downloaded from your project page.

star_model = np.load(path_female_star, allow_pickle=True)
print(star_model['posedirs'].shape)

(6890, 3, 459)

I think that your converter script doesn't work properly with the latest model (1.1).
Did you change the number of pose blend shapes from 93 to 459?

Could you please tell me how to convert from SMPL to STAR format with AMASS dataset?

How can SMPL or STAR capture displacements of the body, like hair?

[image: screenshot] As shown in this figure, when I want to get a canonical shape for such a sequence, how should I deal with displacements like the hair shown in the example? As we can see from the last frame of the example, both the ground truth and STAR show the hair displacement; how does STAR learn it?

Kinematic Tree Plotting

Hello @neoglez

Thank you for the work you posted. It really has an edge over SMPL.

I saw the kinematic tree of STAR. I was wondering how we can plot those keypoints on the 3D model.
I have the necessary parameters: j_regressor, betas, thetas, camera_params, 3d_model, joints.

However, I am not quite sure how to plot the keypoints on the model. Could you help?
Is there a specific function in STAR which does this?

Thank you in advance. I look forward to hearing from you.
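
A hedged sketch of one way to do this (not a built-in STAR function): overlay the regressed joints on the mesh with trimesh, assuming d is the PyTorch forward output, which carries the faces as d.f and the posed joints as d.J_transformed (both attributes appear in the snippets above).

import numpy as np
import trimesh

def show_mesh_with_joints(d):
    # d: output of the STAR forward pass (vertices with .f and .J_transformed attached).
    vertices = d.detach().cpu().numpy().squeeze()
    joints = d.J_transformed.detach().cpu().numpy().squeeze()
    colors = np.tile([255, 0, 0, 255], (joints.shape[0], 1))  # red joint markers
    scene = trimesh.Scene()
    scene.add_geometry(trimesh.Trimesh(vertices, d.f, process=False))
    scene.add_geometry(trimesh.PointCloud(joints, colors=colors))
    scene.show()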
