
An Optimization-based Motion and Grasp Planner

Home Page: https://sites.google.com/view/omg-planner

License: MIT License

robotics motion-planning grasping trajectory-optimization chomp


OMG-Planner

[webpage, paper]


Installation

git clone https://github.com/liruiw/OMG-Planner.git --recursive
  1. Setup: Ubuntu 16.04 or above, CUDA 10.0 or above

  2. Install Anaconda and create a virtual env for Python 3 or Python 2

    conda create --name omg python=3.6.9  # or python=2.7.15 for Python 2
    conda activate omg
    pip install -r requirements.txt
    
  3. Install ycb_render

    cd ycb_render
    python setup.py develop
  4. Install the submodule Sophus. Verify that the submodule was downloaded correctly.

    cd Sophus
    mkdir build
    cd build
    cmake .. -Wno-error=deprecated-declarations -Wno-deprecated-declarations
    make -j8
    sudo make install
  5. Install Eigen from the GitHub source code here

  6. Compile the new CUDA layers (under layers) that we introduce.

    cd layers
    python setup.py install
  7. Install the submodule PyKDL

    cd orocos_kinematics_dynamics
    cd sip-4.19.3
    python configure.py
    make -j8; sudo make install
    
    export ROS_PYTHON_VERSION=3
    cd ../orocos_kdl
    mkdir build; cd build;
    cmake ..
    make -j8; sudo make install
    
    cd ../../python_orocos_kdl
    mkdir build; cd build;
    cmake ..  -DPYTHON_VERSION=3.6.9 -DPYTHON_EXECUTABLE=~/anaconda2/envs/omg/bin/python3.6
    make -j8;  cp PyKDL.so ~/anaconda2/envs/omg/lib/python3.6/site-packages/

Docker Setup

Install Docker and NVIDIA Docker.

Modify docker_build.py and docker_run.py to your needs.

  1. Build the image:

    python docker/docker_build.py
  2. Run on local machines:

    python docker/docker_run.py

Common Usages

  1. Run ./download_data.sh to download the data (around 600 MB).
  2. Run the planner to grasp objects.

    python -m omg.core -v -f demo_scene_0
    python -m omg.core -v -f demo_scene_1
  3. Run the planner from point-cloud inputs.

    python -m omg.core -v -f demo_scene_0 -p
    python -m omg.core -v -f demo_scene_1 -p
  4. Run the planner in kitchen scenes with interfaces.

    python -m real_world.trial -s script.txt -v -f kitchen0
    python -m real_world.trial -s script2.txt -v -f kitchen1
  5. Run the planner in kitchen scenes with mouse clicks.

    python -m real_world.trial_mouse -v -f kitchen0
    python -m real_world.trial_mouse -v -f kitchen1
  6. Loop through the 100 generated scenes and write videos.

    python -m omg.core -exp -w

PyBullet Experiments and Demonstrations

  1. Install PyBullet: pip install pybullet gym (build with eglRender for faster rendering).

  2. Run planning in the PyBullet simulator.

    python -m bullet.panda_scene -v -f demo_scene_2
    python -m bullet.panda_scene -v -f demo_scene_3
  3. Run planning in the PyBullet simulator for the kitchen scene.

    python -m bullet.panda_kitchen_scene -v -f kitchen0
    python -m bullet.panda_kitchen_scene -v -f kitchen1 -s script2.txt
  4. Loop through the 100 generated scenes and write videos.

    python -m bullet.panda_scene -exp -w
  5. Generate demonstration data in data/demonstrations.

    python -m bullet.gen_data -w
  6. Visualize saved data.

    python -m bullet.vis_data -o img

Process New Shapes

  1. To process new shapes, you will need SDFGen, blenderpy, and VHACD.

  2. Generate the related files for your own mesh (an .obj file in data/objects/).

    python -m real_world.process_shape -a -f box_box000
  3. GraspIt! can be used to generate grasps with this ROS package and the Panda gripper. Save the resulting poses as numpy or json files so they can be used in the OMG Planner. Alternatively, one can use DexNet or direct physics simulation in PyBullet.
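The step above says grasp poses can be saved as numpy or json files for the planner to consume. A minimal sketch of doing so is below; note that the file names, the `{"poses": ...}` json key, and the pose layout (one 4x4 homogeneous transform per grasp) are assumptions for illustration — check the files under data/grasps for the exact format OMG-Planner expects.

```python
import json
import os
import tempfile
import numpy as np

# Hypothetical grasp set: each pose is a 4x4 homogeneous transform
# (rotation + translation) of the gripper relative to the object.
grasp_poses = np.stack([np.eye(4) for _ in range(3)])  # shape (3, 4, 4)

out_dir = tempfile.mkdtemp()

# Save in numpy format.
npy_path = os.path.join(out_dir, "box_box000_grasps.npy")
np.save(npy_path, grasp_poses)

# Save in json format (lists instead of arrays).
json_path = os.path.join(out_dir, "box_box000_grasps.json")
with open(json_path, "w") as f:
    json.dump({"poses": grasp_poses.tolist()}, f)

loaded = np.load(npy_path)
```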

File Structure

├── ...
├── OMG
|   |── data
|   |   |── grasps              # grasps of the objects
|   |   |── objects             # object meshes, sdf, urdf, etc.
|   |   |── robots              # robot meshes, urdf, etc.
|   |   |── demonstrations      # saved images of trajectories
|   |   └── scenes              # table-top planning scenes
|   |── bullet
|   |   |── panda_scene         # tabletop grasping environment
|   |   |── panda_kitchen_scene # pick-and-place environment for the cabinet scene
|   |   |── panda_gripper       # bullet franka panda model with gripper
|   |   |── gen_data            # generate and save trajectories
|   |   └── vis_data            # visualize saved data
|   |── layers                  # faster SDF queries with CUDA
|   |── omg                     # core algorithm code
|   |   |── core                # planning scene and object/env definitions
|   |   |── config              # config for planning scenes and planner
|   |   |── planner             # high-level OMG planner logic
|   |   |── cost                # different cost functions and gradients
|   |   |── online_learner      # goal selection mechanism
|   |   |── optimizer           # chomp and chomp-project updates
|   |   └── ...
|   |── real_world              # real-world related code
|   |   |── trial               # cabinet environment with an interface
|   |   |── process_shape       # generate required files from an obj
|   |   └── ...
|   |── ycb_render              # rendering code
|   |   |── robotPose           # panda-specific robot kinematics
|   |   └── ...
|   └── ...
└── ...

Note

  1. The environment can use either known object poses with database target grasps or predicted grasps from other grasp detectors. It can use known object SDFs for obstacle scenes, or perceived point clouds as approximations.
  2. Parameters for tuning can be found in omg/config.py.
  3. Example manipulation pipelines for kitchen scenes are provided.
  4. Please use the GitHub issue tracker to report bugs. For other questions, please contact Lirui Wang.
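Note 1 mentions using perceived point clouds as approximations of object SDFs for obstacle avoidance. The idea can be sketched with plain NumPy as a nearest-point (unsigned) distance query; this is only an illustration, not the repository's CUDA implementation under layers, and the sphere point cloud is made up for the example.

```python
import numpy as np

def point_cloud_distance(query, cloud):
    """Approximate an unsigned distance field from a point cloud:
    the distance from each query point to its nearest cloud point."""
    # (Q, 1, 3) - (1, N, 3) -> (Q, N) pairwise distances, then min over N.
    diff = query[:, None, :] - cloud[None, :, :]
    return np.linalg.norm(diff, axis=-1).min(axis=-1)

# Synthetic "perceived" point cloud: points sampled on a unit sphere.
rng = np.random.default_rng(0)
pts = rng.normal(size=(2000, 3))
cloud = pts / np.linalg.norm(pts, axis=1, keepdims=True)

query = np.array([[0.0, 0.0, 0.0],   # sphere center: distance ~1.0 to surface
                  [2.0, 0.0, 0.0]])  # outside point: distance ~1.0 to surface
d = point_cloud_distance(query, cloud)
```

A real planner cost would also need a sign (inside vs. outside) and gradients, which is what the CUDA layers provide; this sketch only shows the distance-query idea.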

Citation

If you find OMG-Planner useful in your research, please consider citing:

@inproceedings{wang2020manipulation,
  title={Manipulation Trajectory Optimization with Online Grasp Synthesis and Selection},
  author={Lirui Wang and Yu Xiang and Dieter Fox},
  booktitle={Robotics: Science and Systems (RSS)},
  year={2020}
}

License

The OMG Planner is licensed under the MIT License.
