EdgeDepth

This is the reference PyTorch implementation for training and testing depth estimation models using the method described in

The Edge of Depth: Explicit Constraints between Segmentation and Depth

Shengjie Zhu, Garrick Brazil and Xiaoming Liu

CVPR 2020

⚙️ Setup

  1. Compile the morphing operation:

    We implement a customized morphing operation that is used in our evaluation and training code. You can still train and evaluate without it, at some cost in performance. To enable it, do the following:

    1. Make sure your system's CUDA toolkit version matches the CUDA version your PyTorch build was compiled with (a quick check is sketched after these steps).

    2. Type:

    cd bnmorph
    python setup.py install
    cd ..

    You should be able to compile it successfully if you can compile the CUDA extension example from the PyTorch custom C++/CUDA extensions tutorial.
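
    A minimal sketch (not from the original repo) for verifying the CUDA match before compiling; the `bnmorph` import name after installation is an assumption:

    import subprocess
    import torch

    # CUDA version PyTorch was built with, e.g. "10.2"
    print("PyTorch built with CUDA:", torch.version.cuda)

    # CUDA toolkit version of the local nvcc compiler (requires nvcc on PATH)
    out = subprocess.run(["nvcc", "--version"], capture_output=True, text=True).stdout
    print(out.strip().splitlines()[-1])

    try:
        import bnmorph  # assumed module name after `python setup.py install`
        print("Morphing extension imported successfully")
    except ImportError as err:
        print("Morphing extension not available:", err)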

  2. Prepare KITTI data: We use the KITTI raw dataset as well as the predicted semantic labels from this paper.

    1. Download the KITTI raw data (a sketch for extracting the downloaded archives follows this list):
    wget -i splits/kitti_archives_to_download.txt -P kitti_data/
    2. Use this link to download the precomputed semantic labels.
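
    A minimal sketch (an assumption, not part of the original instructions) for extracting the downloaded KITTI archives in place:

    import zipfile
    from pathlib import Path

    kitti_root = Path("kitti_data")  # target directory used by the wget command above
    for archive in sorted(kitti_root.glob("*.zip")):
        print("extracting", archive.name)
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(kitti_root)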

⏳ Training

Training code will be released soon.

📊 Evaluation

  1. The pretrained model is available here.

  2. Precompute the ground-truth depth maps (a loading sanity check is sketched after this step):

    python export_gt_depth.py --data_path [Your Kitti Raw Data Address] --split eigen
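
    A minimal sanity-check sketch; the output path and array key below are assumptions about what export_gt_depth.py writes, so adjust them to whatever the script actually reports:

    import numpy as np

    # assumed output location and key; not confirmed by the README
    gt_depths = np.load("splits/eigen/gt_depths.npz", allow_pickle=True)["data"]
    print(len(gt_depths), "ground-truth depth maps exported")
    print("first map shape:", gt_depths[0].shape)
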
  3. To evaluate without morphing, use:

    python evaluate_depth.py --split eigen --dataset kitti --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] --eval_stereo \
     --num_layers 50 --post_process

    To evaluate with morphing, use:

    python evaluate_depth.py --split eigen --dataset kitti --data_path [Your Kitti Raw Data Address] --load_weights_folder [Your Model Address] --eval_stereo \
     --num_layers 50 --post_process --bnMorphLoss --load_semantics --seman_path [Your Predicted Semantic Label Address]
  4. You should get performance similar to the "Ours" entries in the table below (the two reported metrics are sketched after the table):

    Method        Uses LiDAR ground truth?   Morphed?   KITTI abs. rel. error   delta < 1.25
    BTS           Yes                        No         0.091                   0.904
    Depth Hints   No                         No         0.096                   0.890
    Ours          No                         No         0.091                   0.898
    Ours          No                         Yes        0.090                   0.899
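
    A minimal sketch of the two metrics reported above, assuming the standard KITTI/Eigen depth-evaluation definitions (not code copied from this repo):

    import numpy as np

    def abs_rel(gt, pred):
        # mean absolute relative error: |gt - pred| / gt
        return np.mean(np.abs(gt - pred) / gt)

    def delta_under_125(gt, pred):
        # fraction of pixels where max(gt/pred, pred/gt) < 1.25
        ratio = np.maximum(gt / pred, pred / gt)
        return np.mean(ratio < 1.25)

    # toy example on valid (gt > 0) depths, in meters
    gt = np.array([2.0, 5.0, 10.0, 20.0])
    pred = np.array([2.2, 4.8, 11.0, 25.0])
    print(abs_rel(gt, pred), delta_under_125(gt, pred))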

Acknowledgment

Much of our code base comes from Monodepth2 and Depth Hints.
