
Code for Paper: Self-Supervised LiDAR Scene Flow and Motion Segmentation

Stefan Baur*, David Emmerichs*, Frank Moosmann, Peter Pinggera, Björn Ommer and Andreas Geiger (*: equal contribution)

[Paper] [Supplementary Material] [Project Page]

Teaser Image

PRESENTATION VIDEOS

[Watch the video](https://youtu.be/z34H43BOEVw)

INSTALLATION

This repository has been developed and tested for

  • Ubuntu 18.04
  • CUDA 10.2
  • CUDNN 7.6.5

First of all, clone the repository:

git clone --recursive

Download the datasets KITTI Raw, nuScenes, and KITTI Scene Flow. Then, create a folder with symlinks to the datasets you would like to use (referred to as INPUT_DATADIR in the following).

mkdir -p /path/to/my_datasets
cd /path/to/my_datasets
ln -s /path/to/nuscenes/dataset nuscenes
ln -s /path/to/kitti_raw kitti_raw
ln -s /path/to/kitti_scene_flow_data kitti_sf_data

Also, create a destination folder for the training logs (OUTPUT_DATADIR):

mkdir -p /path/to/output_destination

Change into the base directory of the repo and set INPUT_DATADIR (path to your datasets with the symlinks) and OUTPUT_DATADIR (path to logs) in scripts/set_env_variables.bash:

cd path/to/this/repo
vim scripts/set_env_variables.bash

Create the virtual environment, build the TensorFlow custom ops, and set the necessary environment variables: SRC_DIR, CFG_DIR, INPUT_DATADIR, OUTPUT_DATADIR.

# in root folder of this repository execute
python3 -m venv .venv --prompt usfl
source .venv/bin/activate
pip3 install -U setuptools pip
pip3 install -r requirements.txt


bash scripts/build_all.bash # builds TF custom ops
source scripts/set_env_variables.bash # sets environment variables

Also, you need to install the nuscenes-devkit.

Running experiments

The entry point for training is unsup_flow/cli.py. Experiments are configured via unsup_flow/config/config.yml using incremental configurations. The default configuration is loaded for every experiment.

# default experiment
python unsup_flow/cli.py --prod

Incremental config changes can be applied using the -c flag. For example, to run a training with the RAFT flow backbone on nuScenes, without confidence weights for static aggregation:

python unsup_flow/cli.py --prod -c nuscenes sota_us sota_net no_stataggr_weight

Starting from the default config, this recursively applies the changes specified in the config file under the sections nuscenes, sota_us, sota_net, and no_stataggr_weight, in that order.
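
Conceptually, this incremental scheme is a recursive merge of config sections onto the default config. The following sketch is an illustration only, not the repository's actual implementation, and the config contents shown are invented:

import copy

def merge_into(base: dict, delta: dict) -> dict:
    """Recursively apply the changes in `delta` on top of the config `base`."""
    result = copy.deepcopy(base)
    for key, value in delta.items():
        if isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = merge_into(result[key], value)  # descend into nested dicts
        else:
            result[key] = value  # leaf values overwrite the defaults
    return result

# hypothetical excerpts of the default config and one incremental section
default_cfg = {"data": {"name": "kitti_raw"}, "learning_rate": {"initial": 0.0001}}
sections = {"nuscenes": {"data": {"name": "nuscenes"}}}

# `-c nuscenes ...` applies the named sections on top of the default, in order
cfg = default_cfg
for name in ["nuscenes"]:
    cfg = merge_into(cfg, sections[name])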

Individual changes to the config can also be made using the -kv flag, for example to change the initial learning rate from 0.0001 (default) to 0.005:

python unsup_flow/cli.py --prod -c nuscenes -kv learning_rate initial 0.005

❗ NOTE: -kv takes a list of strings (e.g. learning_rate initial) and then a value (e.g. 0.005). The list of strings specifies the location of the config value in the hierarchy of dictionaries!
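
In other words, the string list is a path into the nested config dict. Conceptually (a hypothetical helper for illustration, not the repository's actual code):

def set_config_value(cfg: dict, key_path: list, value) -> None:
    """Walk the nested config dict along key_path and overwrite the leaf value."""
    node = cfg
    for key in key_path[:-1]:
        node = node[key]
    node[key_path[-1]] = value

cfg = {"learning_rate": {"initial": 0.0001}}
# `-kv learning_rate initial 0.005` corresponds to:
set_config_value(cfg, ["learning_rate", "initial"], 0.005)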

Using your own dataset

Go to usfl_io/io_tools.py:1009 and implement a function that returns a TF Dataset yielding dicts of the following structure:

dict:
    name []
    pcl_t0 [N0, 3]
    ref_t0 [N0] (opt.)
    semantics_t0 [N0] (opt.)
    pcl_t1 [N1, 3]
    ref_t1 [N1] (opt.)
    semantics_t1 [N1] (opt.)
    odom_t0_t1 [4, 4] (opt.)
    flow_gt_t0_t1 dict (opt.):
        flow [N0, 3]
        annotation_mask [N0]
        nn_interpolated_mask [N0]
        exact_gt_mask [N0]
        ego_flow_mask [N0] (opt.)
    flow_gt_t1_t0 dict (opt.):
        flow [N1, 3]
        annotation_mask [N1]
        nn_interpolated_mask [N1]
        exact_gt_mask [N1]
        ego_flow_mask [N1] (opt.)

You can use one of the existing datasets for inspiration. Most of the above data is optional (opt.) and will be used to generate informative metrics.

Coordinate system: the positive z-axis points opposite to the direction of gravity, x points forward, y to the left (right-handed coordinate system). The origin should be roughly 1.7 m above ground, at the center of the recording sensor.
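
A minimal sketch of such a dataset function, assuming TensorFlow 2.4+ (for output_signature) and using random points as stand-ins for real data; only the required fields name, pcl_t0 and pcl_t1 plus the optional odometry are included, and everything besides the dict keys is illustrative:

import numpy as np
import tensorflow as tf

def my_dataset() -> tf.data.Dataset:
    """Return a TF Dataset yielding dicts in the expected format (random data here)."""

    def generator():
        for idx in range(100):  # replace with iteration over your frame pairs
            pcl_t0 = np.random.randn(4096, 3).astype(np.float32)  # real LiDAR points at t0
            pcl_t1 = np.random.randn(4096, 3).astype(np.float32)  # real LiDAR points at t1
            yield {
                "name": "my_dataset_%06d" % idx,
                "pcl_t0": pcl_t0,  # [N0, 3], sensor-centered, z up, x forward, y left
                "pcl_t1": pcl_t1,  # [N1, 3]
                "odom_t0_t1": np.eye(4, dtype=np.float32),  # optional ego-motion t0 -> t1
            }

    output_signature = {
        "name": tf.TensorSpec(shape=(), dtype=tf.string),
        "pcl_t0": tf.TensorSpec(shape=(None, 3), dtype=tf.float32),
        "pcl_t1": tf.TensorSpec(shape=(None, 3), dtype=tf.float32),
        "odom_t0_t1": tf.TensorSpec(shape=(4, 4), dtype=tf.float32),
    }
    return tf.data.Dataset.from_generator(generator, output_signature=output_signature)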

Note: for the supported datasets, use the dataset creation scripts in unsup_flow/datasets/, where you can find a create.py script for each dataset. Run each script once to convert the respective dataset into our format.

CITE

@InProceedings{Baur_2021_ICCV,
   author    = {Baur, Stefan Andreas and Emmerichs, David Josef and Moosmann, Frank and Pinggera, Peter and Ommer, Bj\"orn and Geiger, Andreas},
   title     = {SLIM: Self-Supervised LiDAR Scene Flow and Motion Segmentation},
   booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
   month     = {October},
   year      = {2021},
   pages     = {13126-13136}
}

LEGAL NOTICE AND CONTRIBUTIONS

The code has been tested solely for our own use cases, which might differ from yours. Notice: before putting the program to productive use, please take all necessary precautions, e.g. testing and verifying the program with regard to your specific use.

Code of Conduct · Contribution Guidelines

