
ifmatch's Introduction

IFMatch

PyTorch implementation of our ECCV 2022 paper Implicit field supervision for robust non-rigid shape matching

Setup

  1. Set up the conda environment from ifmatch_env.yml as conda env create -f ifmatch_env.yml and activate it.

  2. Install pytorch-meta as cd pytorch-meta && python setup.py install.
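
A quick sanity check of the environment, as a minimal sketch (it only assumes that pytorch-meta installs under its usual package name, torchmeta):

import torch
import torchmeta  # provided by the pytorch-meta install from step 2

# Confirm that the core dependencies load and report whether a GPU is visible.
print(torch.__version__, torch.cuda.is_available())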

Dataset and Pre-Processing

  1. We provide the datasets and their variants used in our paper here.

  2. Once the datasets have been downloaded, pre-processing proceeds in two stages:

    1. Sampling SDF: to sample points with their SDF values, we follow the DeepSDF scheme as given here. Place all the resulting npz files into (say) /path/to/npz.

    2. Sampling the surface with normals: for this, we use the mesh-to-sdf package. Run data_process.py, providing the path to the ply files and to the npz files from the previous step. Run with the --help option to see the other required parameters.

  3. Once the pre-processing is done, your data directory should contain three directories: free_space_pts containing the SDF samples, surface_pts_n_normal containing the surface points along with their normals, and vertex_pts_n_normal containing the vertex points (see the inspection sketch after this list).

  4. Step 2 is repeated for the training and test datasets alike.

  5. We provide pre-processed samples consisting of test-set shapes from the FAUST-Remesh dataset here: https://nuage.lix.polytechnique.fr/index.php/s/gb8D3KHBeb7zqNL.
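
The pre-processed files can be inspected with a minimal sketch such as the one below; it assumes the DeepSDF-style npz layout ('pos'/'neg' arrays of (x, y, z, sdf) rows) and the 'p' array in the surface .mat files referenced in the issues further down, and the shape name is a placeholder:

import numpy as np
from scipy.io import loadmat

shape = 'tr_reg_000'  # hypothetical shape name

# DeepSDF-style free-space samples: rows of (x, y, z, sdf), split by SDF sign.
sdf = np.load('free_space_pts/%s.npz' % shape)
print(sdf['pos'].shape, sdf['neg'].shape)

# Surface samples: the 'p' array holds the sampled points (first three columns) plus normals.
surf = loadmat('surface_pts_n_normal/%s.mat' % shape)
print(surf['p'].shape)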

Training

To train, run the following, replacing the parameters appropriately:

python train.py --config configs/train/<dataset>.yml --split split/train/dataset.txt --exp_name <my_exp>

Evaluation

Our evaluation is two-staged: first we find the optimal latent vector (MAP), then we solve for the point-to-point (P2P) map between shapes; a brief usage sketch follows the two steps below.

  1. To run the MAP step,
python evaluate.py --config configs/eval/<dataset.yml>
  2. To obtain the P2P map,
python run_matching.py --config configs/eval/<dataset.yml> --latent_opt
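
A P2P map can be represented as an index array assigning each source vertex a matched target vertex; the sketch below shows how such a map would be applied (the array and file names are hypothetical placeholders, not the exact output format of run_matching.py):

import numpy as np

p2p = np.load('p2p_source_to_target.npy')      # hypothetical: p2p[i] = target vertex matched to source vertex i
target_verts = np.load('target_vertices.npy')  # hypothetical: (M, 3) target vertex positions

# Transfer the target vertex positions onto the source shape through the map.
transferred = target_verts[p2p]
print(transferred.shape)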

Pre-trained Models

We perform three distinct trainings in total for the results reported in the paper. The respective models can be downloaded from here.

Citation

If you find our work useful, please cite the arXiv version below. (To be updated soon...)

@misc{sundararaman2022implicit,
    title={Implicit field supervision for robust non-rigid shape matching},
    author={Ramana Sundararaman and Gautam Pai and Maks Ovsjanikov},
    year={2022},
    eprint={2203.07694},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}

Acknowledgements

We thank the authors of DIF-Net and SIREN for graciously open-sourcing their code.

ifmatch's People

Contributors

sentient07


ifmatch's Issues

Dataset problem

Thank you for open-sourcing the code.

In the README, you provide the raw data link and data-processing tutorials. However, some of the C++ libraries have newer versions now, and there are difficulties replicating the DeepSDF pre-processing.

Also, part of the test dataset you provided is missing.

I would appreciate it if you could provide the processed dataset!

PointCloudMulti in dataset

Hi Ramana,

Thanks for your work. It's really cool.

I have a question about the dataset. I noticed that the length of PointCloud_with_FreePoints is max_points, i.e. 200000. When shuffling items in PointCloudMulti, does this mean that each instance is sampled 200000 times, but with different random indices for the on_surface points, free points, etc.?

Looking forward to hearing from you.

Pre-processing the template the same way as the train dataset results in the wrong data structure and causes a KeyError in _get_template_info

I followed the Dataset and Pre-Processing part to pre-process the train dataset and the template, but I found that the surface_pts_n_normal result for the template does not satisfy what _get_template_info needs.
Here is some code from _get_template_info:
def _get_template_info(self):
    ...
    templ_surf_pth = './data/template/surface_pts_n_normal/%s.mat' % self.template_name
    templ_surf_mat = loadmat(templ_surf_pth)
    templ_surf_pc = templ_surf_mat['p'][:, :3]
    templ_trig_id = np.squeeze(templ_surf_mat['trig_id'])
    templ_baryc = templ_surf_mat['bary_coord']
    ...

It seems that my templ_surf_mat has no keys named "trig_id" or "bary_coord":
File "<ipython-input-2-5376759be172>", line 1, in <module> templ_surf_mat['trig_id'] KeyError: 'trig_id'
File "<ipython-input-3-27547a3a9825>", line 1, in <module> templ_surf_mat['bary_coord'] KeyError: 'bary_coord'
The data structure of my templ_surf_mat is:
{'__header__': b'MATLAB 5.0 MAT-file Platform: posix, Created on: Mon Oct 10 11:19:59 2022', '__version__': '1.0', '__globals__': [], 'p': array([[]])}

I wonder how to preprocess the template data correctly?

visualization code

Thank you for your work. I noticed that the regression result in this paper is a deformation, and there is a distance error between it and the ground truth, yet the final visualization looks the same as the ground-truth point positions. Could you please release the visualization code?

Template shape setting

Thanks for providing the template file.

Now I have run into another problem. According to the paper, there is only one human template for all the different human datasets and one animal template for the animal datasets, right? Then, how do you get the ground-truth correspondences between the template and all training shapes? Is there any pre-processing code?

I think it is related to the 'trig_id' and 'bary_coord' parameters, but how can I obtain them?

Evaluation setting--template info.

Hi there,

Your work is excellent. I like it very much.

Now I am trying to run the evaluation code with the pre-trained model and the test-set shapes (FAUST-Remesh dataset) you provided. But it seems that I need the template shape; I think it should be the one you used for training, right? Could you provide it?

I randomly set the template shape to 'tr_reg_080', but it does not work, since the corresponding surface-point file does not have 'trig_id' and 'bary_coord'.

File "/IFMatch/code/dataset.py", line 220, in _get_template_info templ_trig_id = np.squeeze(templ_surf_mat['trig_id']) KeyError: 'trig_id'
