
Comments (8)

liminchen commented on July 24, 2024

Hi Chung Min,

Thanks a lot for your interest in our project! Your tests are very interesting, and there are certainly some bugs in the normal-force calculation. We will look into it.
Note that across different resolutions, a higher resolution should give more accurate numbers. In fact, if you count only the vertical component, it should be identical to the weight; but if you take the magnitude of the total force, the summed barrier approximation can introduce some tangential errors, which will vanish as the resolution goes higher. In addition, using an analytical plane won't have tangential errors.
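
For a quick toy check of the point above (made-up numbers, not simulation output): summing the per-contact force magnitudes picks up tangential components, while summing only the vertical components recovers the weight exactly.

import numpy as np

# two hypothetical per-contact force vectors supporting a 10 N weight
forces = np.array([[ 0.3, 5.0, -0.1],
                   [-0.2, 5.0,  0.4]])
print(forces[:, 1].sum())                    # vertical components only -> 10.0
print(np.linalg.norm(forces, axis=1).sum())  # sum of magnitudes -> slightly above 10.0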


zfergus commented on July 24, 2024

The save_friction_data function saves the normal force magnitudes that are used internally for computing the friction potential. This means the division by h^2 is cancelled out. If you want to compute the actual normal force value, you should divide by h^2, where h is the timestep size. This gets you pretty close (i.e., 100.856 / 0.1^2 = 10,085.6).
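
For concreteness, a minimal sketch of that conversion, using the numbers above:

# divide the saved normal force magnitude by h^2 to recover the actual normal force
h = 0.1                        # timestep size
saved_magnitude = 100.856      # value written by save_friction_data
actual_normal_force = saved_magnitude / h**2
print(actual_normal_force)     # ~10085.6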

I will have to think more about why the density of the pad affects the normal force, but I think Minchen is right to try to refine the mesh and see if this improves the accuracy when the density is smaller.


chungmin99 commented on July 24, 2024

Thank you so much for the quick response - I just realized that in my post, I made a typo; I intended to say Young's modulus (1e10), not density. Sorry about the miscommunication.


When the mesh is very refined (using mat225x225t40.msh), I find that the normal force is calculated to be ~9900 for a variety of E values (tested 1e5, 1e6, 1e10), which is now approximately the expected value. In fact, even just using the 40x40 mesh appears to give pretty good results with the cube!

However, if I swap out the cube with a sphere (sphere1K), the results change. All properties and scaling are kept the same as in the first post, except for the pad mesh resolution and Young's modulus, as stated below.

Using mat40x40 as the "pad" mesh resolution:

  • with E=1e6, the sum of all normal force magnitudes is calculated to be 317;
  • with E=1e10, it is calculated to be 5441.

(The expected upwards force is around 5136.)

I do realize that I should account for only the upwards components to retrieve the actual normal force applied by the sphere, but the sum of magnitudes should overestimate the normal force (since it also adds up the forces along the x- and z-directions). Therefore, the force-sum with E=1e10 makes sense (an overestimate), but the sum with E=1e6 doesn't (a severe underestimate).

Do you possibly have any insight as to why this might be happening?

Edit: the contact area decreased compared to the cube, so this may simply be an indication that a higher mesh resolution is required; however, I expect the softer pad (smaller E) to have a larger contact area than the stiffer pad (larger E), or at least a similar number of "localized forces" to be present. Yet there is a large discrepancy between the two normal forces.

Edit2: Using mat225x225t40, E=1e6 for the pad settings, the sphere-contact returns a force-sum of ~292. Based on this result, it seems that the higher mesh resolution did not help the contact force estimation.


liminchen commented on July 24, 2024

At the static state, the following system of equations holds:

[equation image: the static force-balance system in x]

where x is the stacked nodal positions, and the above equation has 3n rows, where n is the number of nodes.
For each contact pair, we have

[equation image (1): the per-contact-pair relation involving \lambda_k]

So I'm not sure whether the sum of all \lambda_k/(\Delta t^2) should equal the sum of the gravity at all the nodes. In other words, we may want to compute the net contact force for an object more carefully: for example, sum up the contact force vectors on all nodes and then take the norm of that sum. Maybe this is equal to the gravity of the object, which is also the norm of the sum of the gravity vectors at all the nodes.


chungmin99 commented on July 24, 2024

Based on your recommendation ("summing up the contact force vector on all nodes and then calculate the norm of this sum"), I wrote up a simple python-based contact-force calculator as shown below:

import os
import sys

import numpy as np

# pymesh.load_matrix reads dmat files; for details see
# https://pymesh.readthedocs.io/en/latest/api_misc.html
from pymesh import load_matrix

basedir = sys.argv[1]  # e.g., 'sphere1K-mat40x40_null_NH_BE_interiorPoint_20210824145722t8/'

# fric_4_7_* files correspond to the last saved friction-data files for the simulation saved in basedir
lambdas = load_matrix(os.path.join(basedir, 'fric_4_7_lambda.dmat'))
bases = load_matrix(os.path.join(basedir, 'fric_4_7_bases.dmat'))

# bases describe the tangent plane, use cross product to retrieve normal to plane
contact_normals = np.zeros((lambdas.shape[0], 3))
for i in range(lambdas.shape[0]):
    contact_normals[i] = np.cross(
        bases[3*i:3*(i+1), 0],
        bases[3*i:3*(i+1), 1],
    )
    # adjust contact normal to point upwards
    if contact_normals[i][1] < 0:
        contact_normals[i] *= -1

# lambdas may be loaded as an (N, 1) column; reshape so it broadcasts against the (N, 3) normals
contact_forces = lambdas.reshape(-1, 1) * contact_normals
norm_of_sum = np.linalg.norm(np.sum(contact_forces, axis=0))
upwards_force = np.sum(contact_forces, axis=0)[1]
print(norm_of_sum, upwards_force)

However, I am still getting numbers similar to my previous reply (the config files are kept the same, with timestep = 1, so lambda and lambda/h^2 are identical).
norm_of_sum and upwards_force values end up quite similar, so I'm writing the norm_of_sum here:

  • mat40x40, E=1e10Pa: 5441
  • mat40x40, E=1e6Pa: 317
  • mat225x225, E=1e6Pa: 287

I also re-ran the calculator without the lines if contact_normals[i][1] < 0: ... (i.e., without flipping the contact normals upwards); then the values are:

  • mat40x40, E=1e10Pa: 5431
  • mat40x40, E=1e6Pa: 128.7 (upwards_force is -128.47)
  • mat225x225, E=1e6Pa: 76

Also, a side question, but is IPC deterministic? So far, when I re-run scenes, the contact forces appear to stay the same.


liminchen commented on July 24, 2024

So I see this is

\sum_k \lambda_k n_k

where n_k is the contact normal of stencil k.
However, this is not what I suggested: within each stencil, the contact force acting on each node can have a different direction, not necessarily along n_k.
For what I suggested, I think you can get the barrier energy gradient (which stores the negated contact force times dt^2 on each DOF), then sum up each DOF's 3D vector and take the negation to obtain the contact force. Remember to take the sum only among the DOFs of a single object.
For the barrier energy gradient vector (of length 3n, where n is the total number of nodes), lines 3456-3513 of Optimizer.cpp can give it to you. The DOFs of the objects you added first in config.txt will appear at the front of the 3n vector.


chungmin99 commented on July 24, 2024

Based on your feedback from last time, I believe that you were referring to the gradient variable, which includes the deformation energy gradient term (shown below)

switch (animConfig.timeIntegrationType) {
case TIT_BE: {
    energyTerms[0]->computeGradient(data, redoSVD, svd, F,
        dtSq * energyParams[0], gradient_ET[0], projectDBC);
    gradient = gradient_ET[0];
    for (int eI = 1; eI < energyTerms.size(); eI++) {
        energyTerms[eI]->computeGradient(data, redoSVD, svd, F,
            dtSq * energyParams[eI], gradient_ET[eI], projectDBC);
        gradient += gradient_ET[eI];
    }
    break;
}
case TIT_NM: {
    energyTerms[0]->computeGradient(data, redoSVD, svd, F, dtSq * beta_NM * energyParams[0], gradient_ET[0], projectDBC);
    gradient = gradient_ET[0];
    for (int eI = 1; eI < energyTerms.size(); eI++) {
        energyTerms[eI]->computeGradient(data, redoSVD, svd, F, dtSq * beta_NM * energyParams[eI], gradient_ET[eI], projectDBC);
        gradient += gradient_ET[eI];
    }
    break;
}
}
and the effects of considering the constraints (shown below):
Eigen::VectorXd constraintVal;
for (int coI = 0; coI < animConfig.collisionObjects.size(); ++coI) {
    int startCI = constraintVal.size();
    animConfig.collisionObjects[coI]->evaluateConstraints(data, activeSet[coI], constraintVal);
    for (int cI = startCI; cI < constraintVal.size(); ++cI) {
        compute_g_b(constraintVal[cI], dHat, constraintVal[cI]);
    }
    animConfig.collisionObjects[coI]->leftMultiplyConstraintJacobianT(data, activeSet[coI],
        constraintVal.segment(startCI, activeSet[coI].size()), gradient, kappa);
    // friction
    if (activeSet_lastH[coI].size() && fricDHat > 0.0 && animConfig.collisionObjects[coI]->friction > 0.0) {
        animConfig.collisionObjects[coI]->augmentFrictionGradient(data.V, result.V_prev, activeSet_lastH[coI],
            lambda_lastH[coI], gradient, fricDHat, 1.0);
    }
}
for (int coI = 0; coI < animConfig.meshCollisionObjects.size(); ++coI) {
    int startCI = constraintVal.size();
    animConfig.meshCollisionObjects[coI]->evaluateConstraints(data, MMActiveSet[coI], constraintVal);
    for (int cI = startCI; cI < constraintVal.size(); ++cI) {
        compute_g_b(constraintVal[cI], dHat, constraintVal[cI]);
    }
    animConfig.meshCollisionObjects[coI]->leftMultiplyConstraintJacobianT(data, MMActiveSet[coI],
        constraintVal.segment(startCI, MMActiveSet[coI].size()), gradient, kappa);
    animConfig.meshCollisionObjects[coI]->augmentParaEEGradient(data,
        paraEEMMCVIDSet[coI], paraEEeIeJSet[coI], gradient, dHat, kappa);
}
if (animConfig.isSelfCollision) {
    int startCI = constraintVal.size();
    SelfCollisionHandler<dim>::evaluateConstraints(data, MMActiveSet.back(), constraintVal);
    for (int cI = startCI; cI < constraintVal.size(); ++cI) {
        compute_g_b(constraintVal[cI], dHat, constraintVal[cI]);
    }
    SelfCollisionHandler<dim>::leftMultiplyConstraintJacobianT(data, MMActiveSet.back(),
        constraintVal.segment(startCI, MMActiveSet.back().size()), gradient, kappa);
    SelfCollisionHandler<dim>::augmentParaEEGradient(data,
        paraEEMMCVIDSet.back(), paraEEeIeJSet.back(), gradient, dHat, kappa);
    if (MMActiveSet_lastH.back().size() && fricDHat > 0.0 && animConfig.selfFric > 0.0) {
        SelfCollisionHandler<dim>::augmentFrictionGradient(data.V, result.V_prev, MMActiveSet_lastH.back(),
            MMLambda_lastH.back(), MMDistCoord.back(), MMTanBasis.back(), gradient, fricDHat, animConfig.selfFric);
To check the value of gradient, I exported this vector alongside the friction data at
saveFrictionData(data);
as total_gradient in the json file ("total" because it incorporates the friction and deformation effects, unlike the pre-existing gradient vector, which stores only the friction gradient data).

Reading this, I ran the following lines to retrieve the gradient sum for the first-object DOFs:

>>> import json
>>> import numpy as np
>>> # results is the sphere-mat40x40 interaction with mat40x40's E set to 1e6
>>> results = 'sphere1K-mat40x40_null_NH_BE_interiorPoint_20210905212324t8'
>>> with open(results + '/friction_data_12.json') as f:  # _12 is the last written friction_data file
...     mat_E_1e6 = json.load(f)
...
>>> mat_E_1e6.keys()
dict_keys(['mmcvids', 'dhat_squared', 'barrier_stiffness', 'epsv_times_h_squared', 'mu', 'normal_force_magnitudes', 'closest_point_coordinates', 'tangent_bases', 'energy', 'gradient', 'total_gradient', 'hessian_triplets', 'V_start', 'V_lagged', 'V_end'])
>>> np.array(mat_E_1e6['total_gradient']).shape
(14880,)
>>> np.array(mat_E_1e6['V_start']).shape
(4960, 3)
>>> 4960 * 3
14880

The sphere mesh (1760 nodes) was placed before the mat40x40 mesh (3200 nodes), and since the barrier energy gradient vector covers all the vertices, I summed up the 3D DOF vectors for the first 1760 vertices, as shown below:

>>> # mat_E_1e10 is created similarly to mat_E_1e6, but with E=1e10
>>> np.array(mat_E_1e6['total_gradient']).reshape((-1, 3))[:1760].sum(axis=0)
array([-3.46058609e+00,  4.75051200e+03, -5.11991339e+00])
>>> np.array(mat_E_1e10['total_gradient']).reshape((-1, 3))[:1760].sum(axis=0)
array([-8.04705607e-02, -2.69475376e+02,  1.13115857e-01])

I found that:

  1. the outputs for the two Young's modulus values still do not match, and
  2. neither matches the expected value (around 5136).


liminchen commented on July 24, 2024

I think you only want the contact part of the gradient, and look only at the sphere. The total_gradient should be nearly 0 everywhere as the scene becomes static.
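
A minimal sketch of that suggestion, assuming the contact (barrier) part of the gradient were exported on its own under a hypothetical contact_gradient key next to total_gradient (it is not in the saved json by default):

import json

import numpy as np

# load the exported data (same file as in the previous comment)
with open('friction_data_12.json') as f:
    data = json.load(f)

# hypothetical key: the barrier-only gradient, reshaped to one 3D vector per node
g_contact = np.array(data['contact_gradient']).reshape(-1, 3)

dt = 1.0  # timestep used in the runs above
# the sphere's 1760 nodes come first; negate and divide by dt^2 to get the net contact force
net_contact_force = -g_contact[:1760].sum(axis=0) / dt**2
print(net_contact_force, np.linalg.norm(net_contact_force))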

