
spur's Introduction

ROS package suite for SPUR, a mobile omni-directional base robot with an extensible arm developed at Okada Lab, Tamagawa University (玉川大学), a RoboCup@Home contender.

See the video on http://wiki.ros.org/spur.

Simplified installation instructions in Japanese are available here.

apt-gettable binary (DEB) files are generated on the ROS build farm maintained by OSRF. The following is the status of the build jobs.

| ROS Distro | Source deb | Development Branch (Travis) | Development Branch (ros.org) | Release Branch | binarydeb (Precise AMD64) | Documentation (ros.org) |
| --- | --- | --- | --- | --- | --- | --- |
| Indigo | buildstatus_sourcedeb | buildstatus_devel_travis | buildstatus_devel_ros.org | buildstatus_release | buildstatus_binarydeb_amd64 | buildstatus_doc |

Development job: job_devel-indigo-spur

The following is assumed to be already installed:

__(May 9, 2015) Binary installation is indeed recommended; however, due to ongoing work, installing from source is currently required. Once tork-a#16 is resolved, this limitation will be gone.__

The following set of commands installs both ROS Indigo and the spur ROS package for your convenience. For complete instructions on installing ROS, see its wiki.

Ubuntu$ sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu trusty main" > /etc/apt/sources.list.d/ros-latest.list'
Ubuntu$ wget https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -O - | sudo apt-key add -
Ubuntu$ sudo apt-get update && sudo apt-get install ros-indigo-desktop-full ros-indigo-spur
Ubuntu$ sudo rosdep init && rosdep update

Ubuntu$ echo "### For ROS setting" >> ~/.bashrc
Ubuntu$ echo "source /opt/ros/indigo/setup.bash" >> ~/.bashrc
Ubuntu$ source ~/.bashrc

Installing from source is recommended only for development purposes. The directory ~/catkin_ws/ is used as the source directory in these instructions.

  1. Set up a catkin workspace and download the SPUR ROS package.
$ mkdir -p ~/catkin_ws/src && cd ~/catkin_ws/src && catkin_init_workspace
$ git clone https://github.com/tork-a/spur.git

1-1. Install the joystick driver (temporarily required for joystick users).

Joystick operation for omni-directional robots in ROS is still under development. For now (April 2015), install the joystick ROS driver from source. This step will no longer be required in the future (related).

$ cd ~/catkin_ws/src
$ git clone https://github.com/130s/teleop_twist_joy.git && cd teleop_twist_joy && git checkout add/omnidir

1-2. Install the scan matcher tools (temporarily required until the deb release)

$ cd ~/catkin_ws/src
$ git clone https://github.com/ccny-ros-pkg/scan_tools.git && cd scan_tools && git checkout b5efb32268911cada4bf5144af3578a5561dcfef -b 20150711
  2. Install the dependent libraries and build the sources.
$ cd ~/catkin_ws
$ rosdep install --from-paths src --ignore-src -r -y
$ catkin_make install && source install/setup.bash

As is always the case with robots, you should first test in the simulator and only then run the real robot.

$ roslaunch spur_gazebo spur_world.launch    # Simulation
$ roslaunch spur_bringup minimal.launch      # Real robot
$ roslaunch spur_gazebo spur_world.launch visualize_laser:=true

Image: Laser range visualized in RViz and Gazebo
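
To confirm that laser scans are actually being published, check the scan topic from another terminal. This is a quick sanity check; the topic name /scan is an assumption here, so use rostopic list to find the actual name on your setup.

$ rostopic list | grep -i scan    # find the laser scan topic
$ rostopic hz /scan               # /scan is assumed; verify that scans arrive at a steady rate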

  1. Make sure your joystick device is paired with your computer.
  2. Then run:
$ roslaunch spur_bringup joy_teleop.launch
$ roslaunch spur_bringup joy_teleop.launch joy_port:=/dev/input/js1    # If joy is found on a different port
  3. To operate with a PS3-Elecom joystick

The following notes were confirmed with a PS3-Elecom joystick (sorry, only Japanese web sites are available).

  • Press the "Mode" button twice to enable analog input. You may also need to keep pressing button 9 during operation.
  • Use the left axis of the joystick for linear motion (x-y) and the right axis for angular motion.
To operate with the keyboard:

$ roslaunch spur_bringup kb_teleop.launch

Run RViz and gmapping along with the robot's controller.

term-1a$ roslaunch spur_bringup minimal.launch                            # Real robot
term-1b$ roslaunch spur_gazebo spur_world.launch visualize_laser:=true    # Simulation
term-2$ roslaunch spur_description rviz.launch
term-3$ roslaunch spur_2dnav gmapping.launch

After launching the above, follow other existing tutorials (e.g. the one for TurtleBot).
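
Once the map looks complete in RViz, you can save it with the standard map_server tool while gmapping is still running. This is a sketch; the output name ~/mysweethome is arbitrary.

term-4$ rosrun map_server map_saver -f ~/mysweethome    # writes mysweethome.yaml and mysweethome.pgm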

AMCL mode can be invoked with a single command, since it is assumed to be the most frequently used operation. Just note:

  • Simulation mode uses the Willow Garage map by default
  • Real-robot mode requires a map file as an argument

The following single launch command runs move_base along with the other services (the same ones used when you created a map).

term-1-sim$ roslaunch spur_2dnav amcl.launch sim:=true
term-1-real$ roslaunch spur_2dnav amcl.launch map_file:=%PATH_MAPFILE%

(Ex.)
term-1-real$ roslaunch spur_2dnav amcl.launch map_file:=`rospack find spur_2dnav`/launch/mysweethome.yaml

Then you can start navigating the robot by setting a goal in RViz. Simply:

  1. Set the robot's current pose on the map using the 2D Pose Estimate button (at the top of the RViz pane).
  2. Then set the goal pose in the RViz pane using 2D Nav Goal.

Image: The robot's 2D pose is matched using the 2D Pose Estimate feature in RViz.

Then follow existing tutorials (e.g. Using rviz with the Navigation Stack).
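
If you want to send a navigation goal without RViz, you can also publish a geometry_msgs/PoseStamped on move_base's shortcut topic from the command line. This is a minimal sketch: the coordinates are placeholders, and it assumes move_base runs in the global namespace with a frame named map.

$ rostopic pub -1 /move_base_simple/goal geometry_msgs/PoseStamped \
    '{header: {frame_id: "map"}, pose: {position: {x: 1.0, y: 0.5}, orientation: {w: 1.0}}}'    # publish once and exit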

If you wish to give the initial pose programmatically, publish geometry_msgs/PoseWithCovarianceStamped (see this QA for more info).
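
For example, from the command line (a sketch: /initialpose is the topic amcl subscribes to by default, the coordinates are placeholders, and the all-zero default covariance is fine for a quick test):

$ rostopic pub -1 /initialpose geometry_msgs/PoseWithCovarianceStamped \
    '{header: {frame_id: "map"}, pose: {pose: {position: {x: 1.0, y: 0.5}, orientation: {w: 1.0}}}}'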

The robot's base stops when no velocity message (linear and angular velocities represented by geometry_msgs/Twist) has been received for a certain period of time (3.0 seconds by default). You can change this with a command-line option when you launch the robot's controller.

$ roslaunch spur_gazebo spur_world.launch sec_idle:=1.0           # Simulation
$ roslaunch spur_controller spur_controller.launch sec_idle:=1.0  # Real robot
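
To see this watchdog in action, publish geometry_msgs/Twist at a steady rate and then stop; the base should halt sec_idle seconds after the last message. This is a sketch; the topic name /cmd_vel is an assumption, so check rostopic list for the actual one on your robot.

$ rostopic pub -r 10 /cmd_vel geometry_msgs/Twist \
    '{linear: {x: 0.1, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'    # Ctrl-C stops publishing; the base stops shortly after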

spur_controller_configuration.yaml defines each motor's configuration. You should not modify this file directly (you can, but it is not recommended). Instead:

  1. Modify spur_controller_configuration_gen.sh as you like.
  2. Then run it; this generates the aforementioned spur_controller_configuration.yaml.
$ ./spur_controller_configuration_gen.sh
