
GarmentNets

This repository contains the source code for the paper GarmentNets: Category-Level Pose Estimation for Garments via Canonical Space Shape Completion, which was accepted to ICCV 2021. Project page: https://garmentnets.cs.columbia.edu/

Overview

GarmentNets estimates the category-level pose of a garment from a partial observation by completing its shape in a per-category canonical space. The pipeline has two stages (detailed under Pretrained Models below): a PointNet++ network that canonicalizes the observed points, followed by a network that predicts the winding number field and a warp field back to the observed configuration.

Cite this work

@inproceedings{chi2021garmentnets,
  title={GarmentNets: Category-Level Pose Estimation for Garments via Canonical Space Shape Completion},
  author={Chi, Cheng and Song, Shuran},
  booktitle={The IEEE International Conference on Computer Vision (ICCV)},
  year={2021}
}

Datasets

  1. GarmentNets Dataset (GarmentNets training and evaluation)

  2. GarmentNets Simulation Dataset (raw Blender simulation data used to generate the GarmentNets Dataset)

  3. CLOTH3D Dataset (cloth meshes in a canonical pose)

The GarmentNets Dataset contains point clouds before and after the gripping simulation with point-to-point correspondence, as well as the winding number field (a $128^3$ volume).

The GarmentNets Simulation Dataset contains the raw vertices, RGB-D images, and per-pixel UV coordinates from Blender simulation and rendering of the CLOTH3D dataset. Each cloth instance in CLOTH3D is simulated 21 times with different random gripping points.

Both datasets are stored using Zarr format.
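
Because the data is plain Zarr, the hierarchy can be browsed directly with the zarr Python package before touching any model code. A minimal sketch, assuming the store has been extracted to data/garmentnets_dataset.zarr (the actual group and array names come from the printed tree, not from this snippet):

import zarr

# Open the store read-only; adjust the path to wherever you extracted the dataset.
root = zarr.open('data/garmentnets_dataset.zarr', mode='r')

# Print the group/array hierarchy to see how instances and fields are organized.
print(root.tree())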

🆕 Sample Datasets

By popular demand, we have added a small subset of our datasets containing 25 instances of the Tshirt category. We are also working on alternative ways to host the complete datasets for researchers outside of the United States.

  1. GarmentNets Dataset Sample

  2. GarmentNets Simulation Dataset Sample

Pretrained Models

GarmentNets Pretrained Models

GarmentNets is trained in two stages:

  1. PointNet++ canonicalization network
  2. Winding number field and warp field prediction network

The checkpoints for 2 stages x 6 categories (12 in total) are all included. For evaluation, the checkpoints in the garmentnets_checkpoints/pipeline_checkpoints directory should be used.
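
Assuming these .ckpt files are standard PyTorch Lightning checkpoints (pytorch-lightning appears in the dependency list below), they can be inspected without instantiating any model class. A minimal sketch using the Dress pipeline checkpoint referenced in the evaluation example further down:

import torch

# Lightning checkpoints are ordinary torch pickles; load on CPU so no GPU is needed.
ckpt = torch.load(
    'data/garmentnets_checkpoints/pipeline_checkpoints/Dress_pipeline.ckpt',
    map_location='cpu')

# Typical Lightning keys include 'state_dict', 'hyper_parameters' and 'epoch'.
print(list(ckpt.keys()))
print(len(ckpt['state_dict']), 'tensors in state_dict')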

Usage

Installation

A conda environment.yml for python=3.9, pytorch=1.9.0, cudatoolkit=11.1 is provided.

conda env create --file environment.yml

Alternatively, you can directly execute the following commands:

conda install pytorch torchvision cudatoolkit=11.1 pytorch-geometric pytorch-scatter wandb pytorch-lightning igl hydra-core scipy scikit-image matplotlib zarr numcodecs tqdm dask numba -c pytorch -c nvidia -c rusty1s -c conda-forge

pip install potpourri3d==0.0.4
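
Either way, a quick import check confirms that the GPU build of PyTorch and the geometric extensions resolved correctly (this sketch only exercises packages named in the commands above):

python -c "import torch, torch_geometric, pytorch_lightning; print(torch.__version__, torch.cuda.is_available())"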

Evaluation

Assume the project directory is ~/dev/garmentnets, the GarmentNets Dataset has been extracted to <PROJECT_ROOT>/data/garmentnets_dataset.zarr, and the GarmentNets Pretrained Models have been extracted to <PROJECT_ROOT>/data/garmentnets_checkpoints.

Generate the prediction Zarr with:

(garmentnets)$ python predict.py datamodule.zarr_path=<PROJECT_ROOT>/data/garmentnets_dataset.zarr/Dress main.checkpoint_path=<PROJECT_ROOT>/data/garmentnets_checkpoints/pipeline_checkpoints/Dress_pipeline.ckpt

Note that the dataset zarr_path and the checkpoint_path must belong to the same category (Dress in this case).

Hydra should automatically create a run directory such as <PROJECT_ROOT>/outputs/2021-07-31/01-43-33. To generate evaluation metrics, execute:

(garmentnets)$ python eval.py main.prediction_output_dir=<PROJECT_ROOT>/outputs/2021-07-31/01-43-33

The files all_metrics_agg.csv and summary.json should appear in the Hydra-generated directory for this run.
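
Assuming summary.json is plain JSON and all_metrics_agg.csv a regular CSV file, both can be inspected with the Python standard library; the run directory below is the example path from above:

import csv, json
from pathlib import Path

# Replace with your own Hydra run directory.
run_dir = Path('outputs/2021-07-31/01-43-33')

# Print the aggregate summary, then count the rows and columns of the metrics table.
print(json.loads((run_dir / 'summary.json').read_text()))

with open(run_dir / 'all_metrics_agg.csv') as f:
    rows = list(csv.DictReader(f))
print(len(rows), 'rows; columns:', list(rows[0].keys()) if rows else [])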

Training

As mentioned above, GarmentNets is trained in two stages. On a single Nvidia RTX 2080 Ti, training stage 1 takes roughly a week, while training stage 2 can usually be done overnight.

To retrain stage 2 with a pre-trained stage 1 checkpoint:

(garmentnets)$ python train_pipeline.py datamodule.zarr_path=<PROJECT_ROOT>/data/garmentnets_dataset.zarr pointnet2_model.checkpoint_path=<PROJECT_ROOT>/data/garmentnets_checkpoints/pointnet2_checkpoints/Dress_pointnet2.ckpt

To train stage 1 from scratch:

(garmentnets)$ python train_pointnet2.py datamodule.zarr_path=<PROJECT_ROOT>/data/garmentnets_dataset.zarr
