
Habitat-Matterport 3D Dataset (HM3D)

The Habitat-Matterport 3D Research Dataset is the largest-ever dataset of 3D indoor spaces. It consists of 1,000 high-resolution 3D scans (or digital twins) of building-scale residential, commercial, and civic spaces generated from real-world environments.

HM3D is free and available here for academic, non-commercial research. Researchers can use it with FAIR’s Habitat simulator to train embodied agents, such as home robots and AI assistants, at scale.


This repository contains the code and instructions to reproduce experiments from our NeurIPS 2021 paper. If you use the HM3D dataset or the experimental code in your research, please cite the HM3D paper.

@inproceedings{ramakrishnan2021hm3d,
  title={Habitat-Matterport 3D Dataset ({HM}3D): 1000 Large-scale 3D Environments for Embodied {AI}},
  author={Santhosh Kumar Ramakrishnan and Aaron Gokaslan and Erik Wijmans and Oleksandr Maksymets and Alexander Clegg and John M Turner and Eric Undersander and Wojciech Galuba and Andrew Westbury and Angel X Chang and Manolis Savva and Yili Zhao and Dhruv Batra},
  booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
  year={2021},
  url={https://openreview.net/forum?id=-v4OuqNs5P}
}

Please check out our website for details on downloading and visualizing the HM3D dataset.

Installation instructions

We provide a common set of instructions to set up the environment for running all our experiments.

  1. Clone the HM3D GitHub repository and add it to PYTHONPATH.

    git clone https://github.com/facebookresearch/habitat-matterport3d-dataset.git
    cd habitat-matterport3d-dataset
    export PYTHONPATH=$PYTHONPATH:$PWD
    
  2. Create a conda environment and activate it.

    conda create -n hm3d python=3.8.3
    conda activate hm3d
    
  3. Install habitat-sim using conda.

    conda install habitat-sim headless -c conda-forge -c aihabitat
    

    See habitat-sim's installation instructions for more details.

  4. Install trimesh with soft dependencies.

    pip install "trimesh[easy]==3.9.1"
    
  5. Install remaining requirements from pip.

    pip install -r requirements.txt
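
After these steps, you can run a quick sanity check (a minimal sketch; it only verifies that the core packages import inside the activated hm3d environment):

    # Minimal import check -- run inside the activated "hm3d" conda environment.
    import habitat_sim
    import trimesh

    print("habitat-sim loaded from:", habitat_sim.__file__)
    print("trimesh version:", trimesh.__version__)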
    

Downloading datasets

In our paper, we benchmarked HM3D against prior indoor scene datasets such as Gibson, MP3D, RoboThor, Replica, and ScanNet.

  • Download each dataset based on these instructions from habitat-sim. In the case of RoboThor, convert the raw scan assets to GLB using assimp.

    assimp export <SOURCE SCAN FILE> <GLB FILE PATH>
    
  • Once the datasets are downloaded and processed, create environment variables pointing to the corresponding scene paths (a quick sanity check follows below).

    export GIBSON_ROOT=<PATH TO GIBSON glbs>
    export MP3D_ROOT=<PATH TO MP3D glbs>
    export ROBOTHOR_ROOT=<PATH TO ROBOTHOR glbs>
    export HM3D_ROOT=<PATH TO HM3D glbs>
    export REPLICA_ROOT=<PATH TO REPLICA plys>
    export SCANNET_ROOT=<PATH TO SCANNET glbs>
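
Once the variables are set, a rough sanity check looks like the following (a minimal sketch; it assumes the exports above and only verifies that each root is set and that a sample scene loads with trimesh):

    import glob
    import os

    import trimesh

    # Verify each dataset root is set and that one of its GLB scenes loads.
    for var in ["GIBSON_ROOT", "MP3D_ROOT", "ROBOTHOR_ROOT", "HM3D_ROOT"]:
        root = os.environ.get(var)
        if root is None:
            print(f"{var} is not set")
            continue
        scenes = glob.glob(os.path.join(root, "**", "*.glb"), recursive=True)
        print(f"{var}: {len(scenes)} GLB files")
        if scenes:
            scene = trimesh.load(scenes[0])  # GLB files load as a trimesh.Scene
            print(f"  example: {os.path.basename(scenes[0])}, {len(scene.geometry)} geometries")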
    

Running experiments

We provide the code for reproducing the results from our paper in different directories.

  • scale_comparison contains the code for comparing the scale of HM3D with other datasets (Tab. 1 in the paper).
  • quality_comparison contains the code for comparing the reconstruction completeness and visual fidelity of HM3D with other datasets (Fig. 4 and Tab. 5 in the paper).
  • pointnav_comparison contains the configs and instructions to train and evaluate PointNav agents on HM3D and other datasets (Tab. 2 and Fig. 7 in the paper).

We further provide README files within each directory with instructions for running the corresponding experiments.

Acknowledgements

We thank all the volunteers who contributed to the dataset curation effort: Harsh Agrawal, Sashank Gondala, Rishabh Jain, Shawn Jiang, Yash Kant, Noah Maestre, Yongsen Mao, Abhinav Moudgil, Sonia Raychaudhuri, Ayush Shrivastava, Andrew Szot, Joanne Truong, Madhawa Vidanapathirana, Joel Ye. We thank our collaborators at Matterport for their contributions to the dataset: Conway Chen, Victor Schwartz, Nicole Rogers, Sachal Dhillon, Raghu Munaswamy, Mark Anderson.

License

The code in this repository is MIT licensed. See the LICENSE file for details. The trained models are considered data derived from the corresponding scene datasets.


habitat-matterport3d-dataset's Issues

Using the dataset in NVIDIA Isaac Sim

Hi, thanks for providing this dataset for research. I was wondering if anyone has managed to import this dataset into NVIDIA Isaac Sim. The GLB files are directly importable into Isaac, but I am not sure whether it is amenable to spawning robots, generating navigation meshes, getting semantic camera outputs, etc. Any pointers would be really helpful!

Unable to get HM3D house information

Hi,

I would like to retrieve room information such as scene.levels, scene.regions, aabb.center, aabb.sizes, etc. However, the results are empty:

print(sim.semantic_scene.levels)
print(sim.semantic_scene.regions[0].aabb.center)
print(sim.semantic_scene.regions[0].aabb.sizes)

output:
[]
[0. 0. 0.]
[-inf -inf -inf]

I suspect that the semantics were not loaded correctly. I have the following file structure:

ls
hm3d_annotated_basis.scene_dataset_config.json
minival/

ls /minival
00800-TEEsavR23oF/ 00806-tQ5s4ShP627/
00801-HaxA7YrQdEC/ 00807-rsggHU7g7dh/
00802-wcojb4TFT35/ 00808-y9hTuugGdiq/
00803-k1cupFYWXJ6/ 00809-Qpor2mEya8F/
00804-BHXhpBwSMLh/ hm3d_annotated_minival_basis.scene_dataset_config.json
00805-SUHsP6z2gcJ/ hm3d_minival_basis.scene_dataset_config.json

ls /minival/00800-TEEsavR23oF/
TEEsavR23oF.basis.glb TEEsavR23oF.basis.navmesh
TEEsavR23oF.semantic.glb TEEsavR23oF.semantic.txt

I have enabled the semantic sensor and set the scene_dataset_config_file configuration variable in habitat-sim.

May I know how to get the house information?
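
For reference, here is a minimal sketch of how the simulator can be pointed at the annotated scene dataset config (paths follow the directory listing above and are illustrative; whether regions and levels are populated depends on the annotations shipped with your copy of the dataset):

    import habitat_sim

    # Illustrative configuration only; paths follow the directory listing above.
    backend_cfg = habitat_sim.SimulatorConfiguration()
    backend_cfg.scene_id = "minival/00800-TEEsavR23oF/TEEsavR23oF.basis.glb"
    backend_cfg.scene_dataset_config_file = "hm3d_annotated_basis.scene_dataset_config.json"
    backend_cfg.load_semantic_mesh = True

    agent_cfg = habitat_sim.agent.AgentConfiguration()
    sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

    # Object-level annotations, if loaded, appear under semantic_scene.objects;
    # regions and levels may still be empty if the release does not include them.
    print(len(sim.semantic_scene.objects))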

Topdown photo

Hi There,

Does the dataset have a top-down view photo for each floor in all environments?

Or is there a way to use the Habitat simulator to produce top-down photos?

Thanks!
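
One way to get a top-down view from the simulator is via the navmesh (a minimal sketch assuming habitat-sim's PathFinder.get_topdown_view API; this yields a navigability map rather than an RGB photo, and the scene path is a placeholder):

    import habitat_sim

    backend_cfg = habitat_sim.SimulatorConfiguration()
    backend_cfg.scene_id = "<PATH TO AN HM3D .basis.glb>"  # placeholder path

    agent_cfg = habitat_sim.agent.AgentConfiguration()
    sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

    # Boolean grid of navigable cells at the given height (in meters), one query per floor.
    topdown = sim.pathfinder.get_topdown_view(meters_per_pixel=0.05, height=0.0)
    print(topdown.shape)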

Floorplan annotations

Thanks for your amazing work! I was wondering whether there are 2D floorplan annotations for the scenes (or do you have any plans to release this kind of annotation in the near future)?

Best,
Yuanwen
