
DexMV: Imitation Learning for Dexterous Manipulation from Human Videos

DexMV: Imitation Learning for Dexterous Manipulation from Human Videos, Yuzhe Qin*, Yueh-Hua Wu*, Shaowei Liu, Hanwen Jiang, Ruihan Yang, Yang Fu, Xiaolong Wang, ECCV 2022.

DexMV is a novel system and pipeline for learning dexterous manipulation from human demonstration videos. This repo contains the simulated environment and retargeting code for DexMV. The learning part for DexMV is maintained in dexmv-learn.

Pretrained models are provided so you can try our environments without training. Demonstrations generated by the DexMV pipeline are also provided if you want to train from scratch.

DexMV Teaser

Bibtex

@misc{qin2021dexmv,
      title={DexMV: Imitation Learning for Dexterous Manipulation
      from Human Videos},
      author={Qin, Yuzhe and Wu, Yueh-Hua and Liu, Shaowei and Jiang, Hanwen
      and Yang, Ruihan and Fu, Yang and Wang, Xiaolong},
      year={2021},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Installation

  1. Install MuJoCo:

Install MuJoCo from http://www.mujoco.org/ and put your MuJoCo license key in the installation directory. If you already have MuJoCo on your computer, skip this step. Note that we use MuJoCo 2.0 for our experiments.

  2. Install Python dependencies. It is recommended to use conda to manage Python packages; create a conda env with all the dependencies:
# Download the code from this repo for the simulated environment, retargeting and examples
git clone https://github.com/yzqin/dexmv-sim
export DEXMV_PACKAGE_PATH=`pwd`
cd dexmv-sim
conda env create -f environment.yml
conda activate dexmv
pip install -e .

# Download the code from the dexmv-learn repo for the RL/imitation algorithms.
# dexmv-learn is necessary to test the pretrained models or to train from scratch, either with RL or with imitation learning.
cd $DEXMV_PACKAGE_PATH
git clone https://github.com/yzqin/dexmv-learn
cd dexmv-learn
pip install -e .
cd mjrl
pip install -e .
  3. The file structure is as follows:

dexmv-sim/hand_imitation/: environments, kinematics retargeting, and demonstration generation

dexmv-sim/hand_imitation/hand_imitation/kinematics: kinematics retargeting and demonstration generation

examples/: example code to try DexMV

dexmv-learn/: training and inference for RL/imitation learning algorithms

Additional docs other than this README: Demonstration Generation, Environment

Visualize Pretrained Policy

Relocate

You may need to set up the MuJoCo-related environment variables, e.g. LD_LIBRARY_PATH, before running the code.
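For reference, a typical mujoco-py setup for MuJoCo 2.0 looks like the following; the exact paths depend on where you installed MuJoCo, so adjust them to your machine:

```shell
# Typical mujoco-py environment for MuJoCo 2.0 (paths are the common
# defaults, not something this repo requires verbatim).
export MUJOCO_PY_MUJOCO_PATH="$HOME/.mujoco/mujoco200"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$HOME/.mujoco/mujoco200/bin"
```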

cd $DEXMV_PACKAGE_PATH/examples
python visualize_policy.py --env_name=relocate --object_name=tomato_soup_can # for relocate task

You can also try other values of object_name: mug, sugar_box, large_clamp, mustard_bottle, potted_meat_can, foam_brick

If it does not work for you, please check the GitHub issues for more discussion: openai/mujoco-py#268

Pour and Place Inside

python visualize_policy.py --env_name=pour # run this in the `examples` directory
python visualize_policy.py --env_name=place_inside

Troubleshooting

If you encounter the following error:

ERROR: GLEW initalization error: Missing GL version

This is an issue with the MuJoCo OpenGL renderer. A quick fix is to change the dynamic library loading order:

export LD_PRELOAD=/usr/lib/x86_64-linux-gnu/libGLEW.so

Train from scratch

Download processed demonstrations

First, download the demonstrations from Google Drive and place all the .pkl files in the dexmv-sim/demonstrations directory.

Training config

We use a config system to specify training parameters, including the task, the object, and the imitation learning algorithm (or RL only). For convenience, we provide several config files.
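As an illustration only, such a config might look like the sketch below; the field names here are hypothetical, and the authoritative schema is whatever the provided files in examples/configs (e.g. dapg-mug-example.yaml) actually contain:

```yaml
# Hypothetical config sketch -- field names are illustrative, not the
# repo's actual schema; see examples/configs for the real files.
env_name: relocate
object_name: mug
algorithm: dapg
demo_file: demonstrations/relocate-mug.pkl
seed: 100
```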

Training

For example, to train the relocate mug task using DAPG with demonstrations:

# Download processed demonstrations first before training
python train.py --cfg configs/dapg-mug-example.yaml  # run this in the `examples` directory

Similarly, the examples/configs directory contains config files for other tasks and algorithms that you can use for training.

Hand Motion Retargeting

In this repo, we provide a minimal example of retargeting a human hand pose to a robot hand. The pose data for an example trajectory is also provided for convenience.

The following command takes the human hand pose estimation results and retargets them to a robot joint position sequence. The resulting sequence will be saved as example_retargeting.pkl:

python retarget_human_hand.py --hand_dir=./retargeting_source/relocate_mustard_example_seq/hand_pose --output_file=example_retargeting.pkl
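The output is a standard Python pickle, so you can inspect it with the `pickle` module. The sketch below is self-contained (it writes a small dummy file first), and the key name `robot_qpos` is an assumption for illustration; the actual keys stored by retarget_human_hand.py may differ:

```python
import pickle

# Dummy stand-in for example_retargeting.pkl; the real file is produced by
# retarget_human_hand.py, and its exact structure is an assumption here.
dummy_result = {"robot_qpos": [[0.0] * 30, [0.1] * 30]}  # per-frame joint positions

with open("dummy_retargeting.pkl", "wb") as f:
    pickle.dump(dummy_result, f)

# Load the file back and inspect what is stored.
with open("dummy_retargeting.pkl", "rb") as f:
    result = pickle.load(f)

print(sorted(result.keys()))      # which arrays are stored
print(len(result["robot_qpos"]))  # number of frames in the trajectory
```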

To visualize the retargeted robot trajectory along with the object, you can run the following:

python visualize_retargeting.py --retargeting_result=example_retargeting.pkl --object_dir=./retargeting_source/relocate_mustard_example_seq/object_pose

Note: demonstration generation is more than hand motion retargeting. It also involves trajectory generation, time alignment, inverse dynamics, and hindsight to match the state and action spaces defined by the RL environment. Please check demo_generation.md.

Environment

If you want to use our simulated environments for your own research, check env.md.

Demonstration Generation

For more details about demonstration generation, check demo_generation.md.

Acknowledgements

Some files in this repository are modified from the code of robosuite, soil, dapg, dm_control, and py-min-jerk. We gratefully thank the authors of these amazing projects.

We also thank Jiashun Wang for insightful discussions on the connection between human and robot hands, and the DexPilot authors for their detailed explanation of their method.
