
Comments (8)

ThomasTimm avatar ThomasTimm commented on June 12, 2024

I haven't previously heard of servoj overshooting; on the contrary, the problem is usually the opposite: it doesn't quite arrive at the target position. In my testing the robot was off by about 0.3 degrees after executing a trajectory.

Unfortunately we can't start moving the robot around after it supposedly completed the trajectory (that is, if somebody ordered a 4 sec trajectory, the robot should not stop after 4 secs, realize that it didn't quite reach the goal, and then start moving again), as this could be quite disastrous for sensors or actuators intended to be activated at the end of the trajectory. So a movej is not a viable solution.

I am actually letting the servoj thread on the controller run 0.1 sec after the trajectory is completed (https://github.com/ThomasTimm/ur_modern_driver/blob/master/src/ur_driver.cpp#L216 ) (with the control loop running on the controller at 125 Hz, this equals 12 "extra" calls to servoj), but apparently this is not enough for servoj to converge.
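The count of "extra" servoj calls follows directly from the hold time and the control rate; a quick sanity check in Python (the 0.1 s hold and 125 Hz rate are the values from the comment above):

```python
# The driver keeps the servoj loop alive for a short time after the
# trajectory nominally ends; at a fixed control rate this translates
# into a fixed number of additional servoj invocations.
CONTROL_RATE_HZ = 125    # UR controller loop rate
EXTRA_HOLD_TIME_S = 0.1  # post-trajectory hold time used by the driver

extra_calls = int(CONTROL_RATE_HZ * EXTRA_HOLD_TIME_S)
print(extra_calls)  # 12
```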

You can try to limit the overshoot with ros_control by properly tuning the PID parameters, I'm sure there's room for improving them.

As far as I can see on your video, you are only doing very simple motion (no obstacle avoidance, just plain trajectories from A to B) without being too time-sensitive (it's not a problem if it takes a couple of hundred milliseconds before the robot starts moving). You could therefore send movej (or movel) to the robot directly via the urscript interface that the driver also exposes.
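A minimal sketch of what sending a movej over the script interface could look like. The joint values, acceleration, and velocity below are illustrative placeholders, and the topic name in the comment assumes the URScript topic exposed by ur_modern_driver:

```python
# Build a movej command in URScript: joint angles in radians,
# a = max joint acceleration (rad/s^2), v = max joint speed (rad/s).
# All numeric values here are placeholders, not recommendations.

def make_movej(joints, a=1.4, v=1.05):
    """Return a URScript movej command string for the given joint targets."""
    joint_str = ", ".join("%.4f" % q for q in joints)
    return "movej([%s], a=%.3f, v=%.3f)\n" % (joint_str, a, v)

cmd = make_movej([0.0, -1.57, 1.57, 0.0, 0.0, 0.0])
print(cmd)
# With ROS available, this string could then be published as a
# std_msgs/String on the driver's URScript topic, e.g.:
#   pub = rospy.Publisher('ur_driver/URScript', String, queue_size=1)
#   pub.publish(cmd)
```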

This, and ros_control, are likely the only possible ways to get high goal-accuracy with a Universal Robots arm. I would therefore like to close this issue, if that is okay with you.

from ur_modern_driver.

andreaskoepf avatar andreaskoepf commented on June 12, 2024

Sorry, I am personally not 100% sure whether the joint-trajectory action server of the ur_modern_driver shows over- or undershooting ... what I can definitely say is that it does not reach the goal position with the maximum possible accuracy; e.g. ros_control shows some 'swinging' at the end but finally converges to the desired end-pose with a smaller remaining error. I also noted a very strange effect: if the velocity scaling slider in the UR teach-panel is set to a lower value (e.g. 10%), the accuracy increases significantly. I was speculating that there could be overshooting because servoj has this look-ahead parameter (which you set to the smallest possible value of 0.03 s), and I believe that ros_control in position mode is effectively very close to your own built-in trajectory-action handler.

Regarding the PID of ros_control: I noticed that you have gains for the position-based trajectory controller in ur5_controllers.yaml. The position trajectory controller only forwards the desired position, i.e. it is open-loop without any PID code inside it (see hardware_interface_adapter.h), of course only as long as the PositionJointInterface is used.
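In sketch form, the open-loop forwarding behavior described above looks like this. This is a hypothetical Python illustration, not the actual C++ adapter from ros_control:

```python
# Sketch of an open-loop "forwarding" adapter: in position mode the
# trajectory controller simply writes the sampled desired position to
# the joint command handle. There is no PID loop, so any gains in the
# controller yaml are never consulted.
class ForwardingPositionAdapter:
    def __init__(self):
        self.command = None  # stands in for the joint command handle

    def update_command(self, desired_position, position_error):
        # position_error is ignored: pure open-loop pass-through
        self.command = desired_position

adapter = ForwardingPositionAdapter()
adapter.update_command(desired_position=1.25, position_error=0.3)
print(adapter.command)  # 1.25, regardless of the reported error
```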

Regarding the video and obstacle avoidance: You are right that the same operation in this simple setting could have been done pretty easily without MoveIt! (with much simpler and more direct paths). But since we are working on an adaptive programming system (ROSVita) for more complicated scenarios, we generate every move using actual path planning (currently MoveIt! most of the time). We are aware that the trajectories generated by MoveIt! are far from optimal, and we have somebody on our team who will work on improving the situation over the next months.


andreaskoepf avatar andreaskoepf commented on June 12, 2024

I plotted some trajectories as they are generated by MoveIt! currently, e.g. see the following plot of the trajectory for joint 3:
[plot: MoveIt!-generated trajectory for joint 3]
Unfortunately the trajectory planner seems to be very basic and the end of the trajectory is NOT a smooth deceleration - I guess with smoother trajectories the problem will become less noticeable. Since we have a camera mounted near the TCP and we measure the robot position optically we are quite sensitive to even small deviations from the target pose.
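One way to see whether a sampled trajectory ends with a smooth deceleration is to finite-difference the joint positions near the end. A small sketch with synthetic data (the timestep and values are illustrative, not taken from the plot):

```python
# Backward-difference velocity estimates for the tail of a trajectory.
# A smooth stop should show the velocity tapering gradually toward zero;
# an abrupt stop shows it cut off in a single step.
def end_velocities(positions, dt, n=4):
    """Return the last n backward-difference velocity estimates."""
    vels = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    return vels[-n:]

dt = 0.008  # 125 Hz sample period
# Synthetic ramp that stops abruptly instead of decelerating smoothly:
abrupt = [0.01 * i for i in range(5)] + [0.04] * 3
print(end_velocities(abrupt, dt))  # velocity falls from ~1.25 to 0.0 in one step
```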

One note regarding:

Unfortunately we can't start moving the robot around after it supposedly completed the trajectory (that is, if somebody ordered a 4 sec trajectory, the robot should not stop after 4 secs, realize that it didn't quite reach the goal, and then start moving again), as this could be quite disastrous for sensors or actuators intended to be activated at the end of the trajectory

I believe the convergence that is enforced by the ros_controllers trajectory-controller also takes place after the trajectory action already finished (e.g. after processing the last trajectory point). Exactly for the reason you mention (e.g. taking snapshots with camera/depth sensor) I had to include wait-statements in our test-scripts which ensure that sensor-reading is not done before actually reaching the destination pose.
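The wait-statements mentioned above amount to a convergence check: poll the joint positions and only trigger the sensor once every joint is within tolerance of the target. A sketch under stated assumptions, where get_joint_positions is a hypothetical stand-in for reading the robot's joint state (e.g. from /joint_states) and the tolerance is illustrative:

```python
import time

JOINT_TOLERANCE_RAD = 0.001  # illustrative tolerance, roughly 0.06 degrees

def wait_until_converged(get_joint_positions, target, tol=JOINT_TOLERANCE_RAD,
                         timeout=2.0, poll_period=0.008):
    """Block until all joints are within tol of target, or timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        current = get_joint_positions()
        if all(abs(c - t) <= tol for c, t in zip(current, target)):
            return True
        time.sleep(poll_period)
    return False  # did not converge in time

# Example with a fake reader that is already at the target:
target = [0.0, -1.57, 1.57, 0.0, 0.0, 0.0]
ok = wait_until_converged(lambda: list(target), target)
print(ok)  # True
```

In a test script, sensor reads (camera snapshots, depth captures) would only run after this returns True.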


ThomasTimm avatar ThomasTimm commented on June 12, 2024

I believe the convergence that is enforced by the ros_controllers trajectory-controller also takes place after the trajectory action already finished (e.g. after processing the last trajectory point).

Yes, the controller continuously controls the robot and tries to minimize the error. That is, as you've concluded yourself, why the arm "swings" back and forth. It shouldn't do that when using the position-based controller (especially as it just passes the command forward, as you informed me it does). Are you sure you are using the pos_based_pos_traj_controller? (Try executing rosservice call /controller_manager/list_controllers {}; it should list which controllers are actually running.)


ThomasTimm avatar ThomasTimm commented on June 12, 2024

You answered my question in the other issue; I guess you are using the position-based interface. This swinging back and forth is thus very strange.
Could you check the controller output to see if that swinging is also visible there? Because then it would suggest a problem with the sampler in ros_control.
I've never seen an overshoot with servoj; it moves the robot to the instructed position and then stops motion with very high acceleration.


miguelprada avatar miguelprada commented on June 12, 2024

Opened a new issue to discuss the overshoot when using position based ros_control controllers, since this thread refers specifically to the internal, non-ros_control-based, interface.


andreaskoepf avatar andreaskoepf commented on June 12, 2024

@ThomasTimm just for reference, here is my output for rosservice call /controller_manager/list_controllers {}:

controller: 
  - 
    name: joint_state_controller
    state: running
    type: joint_state_controller/JointStateController
    hardware_interface: hardware_interface::JointStateInterface
    resources: []
  - 
    name: vel_based_pos_traj_controller
    state: stopped
    type: velocity_controllers/JointTrajectoryController
    hardware_interface: hardware_interface::VelocityJointInterface
    resources: ['elbow_joint', 'shoulder_lift_joint', 'shoulder_pan_joint', 'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']
  - 
    name: pos_based_pos_traj_controller
    state: running
    type: position_controllers/JointTrajectoryController
    hardware_interface: hardware_interface::PositionJointInterface
    resources: ['elbow_joint', 'shoulder_lift_joint', 'shoulder_pan_joint', 'wrist_1_joint', 'wrist_2_joint', 'wrist_3_joint']

A colleague of mine will post a trace of the controller output near the end of the trajectory later today.


ThomasTimm avatar ThomasTimm commented on June 12, 2024

Closing this as it was solved in #47.

