
Comments (12)


Kebii avatar Kebii commented on June 17, 2024

I'm running into the same problem.


eanson023 avatar eanson023 commented on June 17, 2024

Hi Ying156209, I have a question for you: how do you render in Blender and export an EPS file to achieve the effect shown in the paper?


EricGuo5513 avatar EricGuo5513 commented on June 17, 2024

Hi Ying156209, I have a question for you: how do you render in Blender and export an EPS file to achieve the effect shown in the paper?

Hi, to visualize in Blender, TEMOS provides a good introduction: https://github.com/Mathux/TEMOS


EricGuo5513 avatar EricGuo5513 commented on June 17, 2024

Hi, I'm trying to visualize the joint rotation representation, since converting from XYZ positions to joint rotations takes time, and the result doesn't look right. Here is my script. I have checked all the joint indices and made sure the BVH construction is correct.

import sys

import numpy as np
import torch

from common.quaternion import *
from paramUtil import *
from rotation_conversions import *

mean = np.load('./HumanML3D/Mean.npy')
std = np.load('./HumanML3D/Std.npy')
ref2 = np.load('./HumanML3D/new_joint_vecs/012314.npy')

def recover_rot(data):
    # data: [..., seq_len, 263/251] for HumanML3D/KIT respectively
    joints_num = 22 if data.shape[-1] == 263 else 21
    data = torch.Tensor(data)
    # recover the root orientation (quaternion) and root position from the features
    r_rot_quat, r_pos = recover_root_rot_pos(data)
    # pad the root translation to 6 values so it can be stacked with the 6D rotations
    r_pos_pad = torch.cat([r_pos, torch.zeros_like(r_pos)], dim=-1).unsqueeze(-2)
    r_rot_cont6d = quaternion_to_cont6d(r_rot_quat)
    # the 6D rotations start after the root features (1+2+1) and the local joint positions
    start_indx = 1 + 2 + 1 + (joints_num - 1) * 3
    end_indx = start_indx + (joints_num - 1) * 6
    cont6d_params = data[..., start_indx:end_indx]
    cont6d_params = torch.cat([r_rot_cont6d, cont6d_params], dim=-1)
    cont6d_params = cont6d_params.view(-1, joints_num, 6)   # (frames, joints, 6)
    cont6d_params = torch.cat([cont6d_params, r_pos_pad], dim=-2)  # append translation row
    return cont6d_params
   
def feats2rots(features):   
    features = features * std + mean
    return recover_rot(features)

rot6d_all = feats2rots(ref2).numpy()
rot6d_all = rot6d_all.reshape(-1, 23, 6)   # 22 joint rotations + 1 translation row
rot6d_all_trans = rot6d_all[:, -1, :3]     # root translation
rot6d_all_rot = rot6d_all[:, :-1]          # per-joint 6D rotations

matrix = rotation_6d_to_matrix(torch.Tensor(rot6d_all_rot))
euler = matrix_to_euler_angles(matrix, "XYZ")
euler = euler / np.pi * 180                # radians -> degrees
np.save('euler_rot_gt.npy', euler)
np.save('trans_rot_gt.npy', rot6d_all_trans)

The functions rotation_6d_to_matrix and matrix_to_euler_angles are from https://github.com/Mathux/ACTOR/blob/master/src/utils/rotation_conversions.py. I got skeleton results like the attached image (with or without the mean/std normalization makes little difference).

Have you tried to visualize the rotations? I need your help. Thanks so much!

Hello, unfortunately I haven't tried to visualize the rotations directly; I usually transform them to XYZ coordinates. Also, in generation we do not use the generated rotations; they only play the role of regularization. For your case, I can offer some comments (points 2 and 3 are sketched in code after this list):

  1. The new_joint_vecs are not normalized, so you don't need to de-normalize them.
  2. For the 6D-to-matrix conversion, you could use cont6d_to_matrix in quaternion.py; I'm not sure whether this makes a difference.
  3. You could try different matrix_to_euler_angles conventions; for Blender, I'm not sure whether it expects extrinsic or intrinsic angles.
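
A minimal sketch of points 2 and 3, assuming the arrays from the script above; cont6d_to_matrix is the function named in quaternion.py, and matrix_to_euler_angles is the ACTOR helper already used in the script (the exact import paths may differ in your checkout):

import math

import torch
from common.quaternion import cont6d_to_matrix
from rotation_conversions import matrix_to_euler_angles

# Point 2: convert the 6D rotations with the repo's own converter instead of ACTOR's.
matrix = cont6d_to_matrix(torch.Tensor(rot6d_all_rot))   # (frames, joints, 3, 3)

# Point 3: try several Euler conventions; an axis-order or intrinsic/extrinsic
# mismatch with Blender is a common cause of twisted skeletons.
for order in ("XYZ", "ZYX", "ZXY"):
    euler_deg = matrix_to_euler_angles(matrix, order) * 180.0 / math.pi
    print(order, euler_deg[0, 0])  # inspect the root joint of the first frame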


XinandYu avatar XinandYu commented on June 17, 2024

Hi, can I recover the joint rotations from the positions? I tried the inverse_kinematics_np function from the Skeleton class, but it doesn't seem to work properly.
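
For reference, a rough sketch of the call pattern, assuming the Skeleton setup used in HumanML3D's processing scripts; t2m_raw_offsets, t2m_kinematic_chain, and the face-joint indices [2, 1, 17, 16] are assumptions to verify against your checkout:

import numpy as np
import torch
from common.skeleton import Skeleton
from paramUtil import t2m_raw_offsets, t2m_kinematic_chain

joints_xyz = np.load('./HumanML3D/new_joints/012314.npy')  # (frames, 22, 3) positions
skel = Skeleton(torch.from_numpy(t2m_raw_offsets), t2m_kinematic_chain, "cpu")
face_joint_idx = [2, 1, 17, 16]  # assumed: (r_hip, l_hip, r_shoulder, l_shoulder)
quat_params = skel.inverse_kinematics_np(joints_xyz, face_joint_idx, smooth_forward=True)

Note that, per the discussion further down, the quaternions this IK returns follow HumanML3D's own FK convention rather than the BVH one, which may be why a direct export looks wrong.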


XinandYu avatar XinandYu commented on June 17, 2024

(quoting the script, results, and reply from the exchange above)

And I got the same result as in that picture. I'm wondering whether it's a limitation of the algorithm or whether I'm using it the wrong way. Looking forward to your reply.


wyhuai avatar wyhuai commented on June 17, 2024

I've hit the same problem; it seems the rotations in the 263-dim vector cannot yield a correct visualization result. Did you solve the problem?


Ssstirm avatar Ssstirm commented on June 17, 2024

I've hit the same problem; it seems the rotations in the 263-dim vector cannot yield a correct visualization result. Did you solve the problem?

Not yet. I'm still trying to figure out what's going on in the code. Assuming there is no bug, in my view the reason might be that:

  1. It calculates the local rotation, while the input should be the world rotation.
  2. The analytic method might inherently suffer from such poor results.


EricGuo5513 avatar EricGuo5513 commented on June 17, 2024

Hi, I've received a lot of comments that our current rotation representation seems incompatible with other 3D software such as Blender, and I think I understand the reason. In the IK/FK code in skeleton.py, for the i-th bone we calculate the rotation of that bone itself, while in BVH we should actually use the rotation of its parent instead. Therefore, at line 91 you could try using the parent bone instead of the bone itself; I am not sure if it works. Here are our FK and the BVH FK side by side, where you can see the difference in how the global positions are obtained:
Our FK:

for i in range(1, len(chain)):
    # R accumulates along the chain and already includes bone i's own rotation
    R = qmul(R, quat_params[:, chain[i]])
    offset_vec = offsets[:, chain[i]]
    joints[:, chain[i]] = qrot(R, offset_vec) + joints[:, chain[i-1]]

BVH FK:

for i in range(1, len(self.parents)):
    global_quats[:, i] = qmul(global_quats[:, self.parents[i]], local_quats[:, i])
    # here the offset is rotated by the PARENT's global rotation, not bone i's
    global_pos[:, i] = qrot(global_quats[:, self.parents[i]], offsets[:, i]) + global_pos[:, self.parents[i]]

Hope this helps. I don't have time to validate this idea, but if anyone figures it out, in this or any other way, I would greatly appreciate it if you could let me know. If it doesn't work, I know the recent work ReMoDiffuse managed to use the rotation representation in their demo; you may refer to them.

BTW: I have updated the quaternion/Euler/cont6d conversion functions in quaternion.py, which should be safe to use.
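
To make the one-joint shift concrete, here is a self-contained toy comparison of the two FK conventions above on a three-joint chain. It is a sketch with minimal local qmul/qrot helpers (quaternions as (w, x, y, z)), not the repo's code:

import numpy as np

def qmul(a, b):
    # Hamilton product of two quaternions (w, x, y, z)
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qrot(q, v):
    # rotate vector v by unit quaternion q: q * (0, v) * conj(q)
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, qv), q_conj)[1:]

identity = np.array([1.0, 0.0, 0.0, 0.0])
rot90z = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])  # 90 deg about z

quats = [identity, rot90z, identity]                  # per-joint rotations
offsets = [np.zeros(3), np.array([0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0])]
parents = [-1, 0, 1]                                  # simple chain 0 -> 1 -> 2

# HumanML3D-style FK: offset_i is rotated by the chain product INCLUDING quats[i]
R, pos_h = quats[0], [np.zeros(3)]
for i in range(1, 3):
    R = qmul(R, quats[i])
    pos_h.append(qrot(R, offsets[i]) + pos_h[i - 1])

# BVH-style FK: offset_i is rotated by the PARENT's global rotation
G, pos_b = [quats[0]], [np.zeros(3)]
for i in range(1, 3):
    pos_b.append(qrot(G[parents[i]], offsets[i]) + pos_b[parents[i]])
    G.append(qmul(G[parents[i]], quats[i]))

print(np.round(pos_h, 3))  # [0 0 0], [-1 0 0], [-2 0 0]: the bend acts at joint 1
print(np.round(pos_b, 3))  # [0 0 0], [0 1 0], [-1 1 0]: the bend acts one joint later

Feeding identical quaternions through both loops yields different poses, which is exactly the off-by-one-joint mismatch described above.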


yufu-liu avatar yufu-liu commented on June 17, 2024

Hi, thanks for creating this useful dataset, and for the amazing work on text2motion!

I would like to ask a question about the 263-dim vector. I know its shape is (#frames, 263) and that it contains local velocities, rotations, rotational velocity, foot contacts, etc. However, I don't know the index range of each component. May I have a detailed description of the layout?
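
For reference, the index arithmetic in recover_rot earlier in this thread implies the layout below. This is a sketch: the sizes follow the code, but the component names are descriptive labels rather than identifiers from the repo.

def feature_slices(joints_num=22):
    # joints_num = 22 gives the 263-dim HumanML3D vector, 21 the 251-dim KIT one
    sizes = [
        ("root_rot_velocity",    1),                     # root rotation velocity about y
        ("root_linear_velocity", 2),                     # root velocity on the xz-plane
        ("root_height",          1),
        ("local_positions",      (joints_num - 1) * 3),  # root-relative joint positions
        ("rotations_6d",         (joints_num - 1) * 6),  # continuous 6D joint rotations
        ("local_velocities",     joints_num * 3),
        ("foot_contacts",        4),
    ]
    slices, start = {}, 0
    for name, size in sizes:
        slices[name] = (start, start + size)
        start += size
    return slices, start  # total: 263 for 22 joints, 251 for 21

print(feature_slices(22))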


mingdianliu avatar mingdianliu commented on June 17, 2024

Hi Eric,

You mentioned that the rotation data only plays the role of regularization. Could you explain this claim a little? Thanks.

