
Comments (17)

parthnatekar commented on July 3, 2024

Thanks for the help; I am able to match the IK and observed pose now.

As a follow-up: if I wanted to apply a transform such that the end-effector frame matched up with the camera frame, i.e. +z of the end-effector aligns with +z of the camera, +y of the end-effector aligns with +y of the camera, etc., how would I do that?

I essentially want the robot to move naturally with respect to a right-handed system aligned with the camera. When I give a sinusoidal input along the y-axis, the robot should look like it moves up and down, not at an angle, as shown in the video below.

Robot Pose without Phantom
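Something like the following is what I have in mind (a rough sketch on my end, assuming T_base_cam, the pose of the camera frame expressed in the PSM base frame, is available from somewhere, and using PyKDL for the frame math):

import PyKDL

def camera_to_base(T_base_cam, T_cam_target):
    # Re-express a target pose given in the camera frame in the PSM base
    # frame, so that +y in the camera frame really means "up" on screen.
    return T_base_cam * T_cam_target

# A pure +y offset in the camera frame:
T_cam_target = PyKDL.Frame(PyKDL.Rotation.Identity(), PyKDL.Vector(0.0, 0.05, 0.0))
# T_base_target = camera_to_base(T_base_cam, T_cam_target)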


parthnatekar commented on July 3, 2024

Hi, just a follow-up on this. Even when I don't send any target joint angles (i.e. leave them at their initialized values), the measured joint state seems to differ from the initialized values and keeps fluctuating, even though my inverse-kinematics solution is stable, correct, and not fluctuating. Any idea what is happening?

TL;DR: the measured joint state differs from my IK solutions; please help.


adnanmunawar commented on July 3, 2024

Hello,

When you run the launch_crtk_interface.py script prior to running crtk_ros_based_control.py, are you keeping the flag that sets random joint errors? The flag is set at this line.

If you are disabling the setting of random joint errors, what interface are you using to read the joint angles? The one provided by the launch_crtk_interface script?


parthnatekar commented on July 3, 2024

Hi,

The problem in my second comment is solved.

I'm using launch_crtk_interface to read measured joint angles from the simulator.

My first issue still persists. The end-effector should move in a uni-dimensional fashion when I apply the command servo_cp_1_msg.transform.translation.y = 0.2 * math.cos(rospy.Time.now().to_sec()), but it moves in an arc, i.e. it stops following the IK solution after a point. I'm wondering if any limits or rigid-body constraints are being enforced on the simulator side. I can also see an enforce_limits function in PsmIK.py, but that is not what is causing this issue.
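For reference, my command loop looks roughly like this (a minimal sketch; the topic name here is a placeholder for whatever launch_crtk_interface.py exposes on my setup):

import math
import rospy
from geometry_msgs.msg import TransformStamped

rospy.init_node('psm_sine_test')
# Placeholder topic name; substitute the servo_cp topic advertised by
# launch_crtk_interface.py.
pub = rospy.Publisher('/CRTK/psm1/servo_cp', TransformStamped, queue_size=1)

servo_cp_1_msg = TransformStamped()
# x and z stay at their initial values; only y is driven sinusoidally.
rate = rospy.Rate(120)
while not rospy.is_shutdown():
    servo_cp_1_msg.transform.translation.y = 0.2 * math.cos(rospy.Time.now().to_sec())
    pub.publish(servo_cp_1_msg)
    rate.sleep()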

I notice you have changed TransformStamped to PoseStamped. I tried making these changes locally but it breaks; I need to pull the latest commit.


adnanmunawar commented on July 3, 2024

The end-effector should move in a uni-dimensional fashion when I apply the command servo_cp_1_msg.transform.translation.y = 0.2 * math.cos(rospy.Time.now().to_sec()), but it moves in an arc, i.e. it stops following the IK solution after a point.

What are the values of x and z in your translation command? Is it possible that the IK solution at some points is outside the reachable workspace for some joints in the AMBF PSM model? Those joint limits are set in the respective PSM's ADF file (for example, here). A video or graph of your commanded vs. actual joint values would be helpful.
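(As an aside, a joint-limit clamp of this kind conceptually looks like the sketch below; the actual enforce_limits implementation in PsmIK.py and the limit values in the ADF file are the authoritative references.)

def clamp_to_limits(q, limits):
    # q: list of joint values from the IK solution.
    # limits: list of (low, high) tuples, e.g. taken from the ADF file.
    return [min(max(qi, lo), hi) for qi, (lo, hi) in zip(q, limits)]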


parthnatekar commented on July 3, 2024

Hi, I'm not changing x and z; they continue to be at their original values.

Is it possible that the IK solution at some points is outside the reachable workspace for some joints in the AMBF PSM model?

I think this is what might be happening. So the joint limits enforced by the enforce_limits function in PsmIK.py are not the same as the AMBF PSM model's limits? Anyway, here are the videos (for some reason the GitHub embedded videos are not working, so I am uploading them to Drive):

IK vs Actual Measured

Robot Pose

Comparison of one joint angle


adnanmunawar commented on July 3, 2024

Thanks for posting the videos. Just by looking at the middle video (the one with the AMBF simulation) I can see that the PSM is colliding with the environment and thus unable to attain the commanded joint positions.

While at some instants you can visually see the PSM colliding with the Pink Phantom, there is also an invisible (hidden) collision box surrounding the Pink Phantom, which the PSM starts colliding with. This was added to prevent the needle and the thread from falling to the floor in case they were knocked off. You can see that hidden collision box in this picture:

Screenshot from 2022-09-20 16-50-08

To check if this is indeed the problem, can you try loading the scene without the Phantom, hidden box, etc.? I believe the Phantom's launch indices are 14 and 15, so you may run without them as follows:

./ambf_simulator --launch_file <surgical_robotics_challenge>/launch.yaml -l 0,1,3,4 -p 120 -t 1 --override_max_comm_freq 120

Then check if the disparity between the commanded and the measured joint positions remains.
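To log the measured joint positions for that comparison, something like this minimal sketch should work (the topic name is an assumption based on the CRTK naming convention used by launch_crtk_interface.py):

import rospy
from sensor_msgs.msg import JointState

def on_measured_js(msg):
    # Print the measured joint positions so they can be compared against
    # the commanded IK solution.
    rospy.loginfo('measured: %s', ['%.4f' % p for p in msg.position])

rospy.init_node('js_logger')
rospy.Subscriber('/CRTK/psm1/measured_js', JointState, on_measured_js)
rospy.spin()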


parthnatekar commented on July 3, 2024

Hi,

The explanation makes sense; however, can you please check the command arguments? This one seems to be doing something else!


adnanmunawar commented on July 3, 2024

Ahh, can you add index number 16? The ADF at index 16 defines a kinematic camera frame, which is otherwise defined in the ADF file at index 15.

The command should look like:

./ambf_simulator --launch_file <surgical_robotics_challenge>/launch.yaml -l 0,1,3,4,16 -p 120 -t 1 --override_max_comm_freq 120

You will need to replace <surgical_robotics_challenge> in the command above with the actual location where you cloned surgical_robotics_challenge. ./ambf_simulator in the command above assumes that you are in the directory where the ambf_simulator executable is located. Alternatively, you can add this alias to your .bashrc file to run the simulator from anywhere in the terminal:

alias ambf_simulator=~/ambf/bin/lin-x86_64/ambf_simulator

Just change the ~/ambf/bin/lin-x86_64/ambf_simulator path as necessary, depending on where ambf_simulator is on your system.


parthnatekar commented on July 3, 2024

I figured it out, so closing this for now.


parthnatekar commented on July 3, 2024

Hi, reopening because I have a question on the last issue:

I want to find the transform between the CameraFrame and the PSM arm robot base frame.

I can see ADF files with the position and orientation of, for example, PSM1 and the CameraFrame. However, the orientation of the CameraFrame I get here does not match the orientation of the CameraFrame that I print while the simulator is running.

How can I access the correct coordinate frames of the camera and the PSM arm base in the 'World' frame so that I can find the transform between them?


adnanmunawar commented on July 3, 2024

I can see ADF files with the position and orientation of, for example, PSM1 and the CameraFrame. However, the orientation of the CameraFrame I get here does not match the orientation of the CameraFrame that I print while the simulator is running.

How are you getting the orientation of the CameraFrame other than from the ADF file? This is what I just tested using the Python Client, and it seems to be the same as the ADF file.

Screenshot from 2022-10-11 13-39-23

Screenshot from 2022-10-11 13-33-07

The CameraFrame is a kinematic frame, not the camera itself. The actual cameras are parented to it so that we can move just this CameraFrame and the child cameras (there are multiple) move along with it.

Check the definition of the actual cameras and notice the "parent: CameraFrame" line:

cameraL:
  namespace: cameras/
  name: cameraL
  location: {x: -0.02, y: 0.0, z: -0.5}
  look at: {x: 0.0, y: 0.0, z: -1.0}
  up: {x: 0.0, y: 1.0, z: 0.0}
  clipping plane: {near: 0.01, far: 10.0}
  field view angle: 1.2
  monitor: 1
  parent: BODY CameraFrame # <-- Parent defined here; thus the location, look at, and up are relative to the parent.
  # preprocessing shaders:
  #   path: ../../ambf_shaders/preprocessing
  #   vertex: shader.vs
  #   fragment: shader.fs
  # publish image: True
  publish image interval: 5
  publish image resolution: {width: 1920, height: 1080}
  # publish depth: True
  # publish depth resolution: {width: 640, height: 480}
  # multipass: True

You can get the coordinate frame of the camera in the world frame by following the script (image) I showed above. Alternatively, follow this line from this script to get the pose of the PSM base in the CameraFrame and vice versa: https://github.com/collaborative-robotics/surgical_robotics_challenge/blob/master/scripts/surgical_robotics_challenge/teleoperation/mtm_multi_psm_control.py#L86
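For example, with the AMBF Python Client (a minimal sketch; the object names below are assumptions based on the scene shown in the screenshots):

from ambf_client import Client

c = Client('pose_query')
c.connect()
cam = c.get_obj_handle('CameraFrame')
base = c.get_obj_handle('psm1/baselink')
# Position and fixed-RPY orientation reported in the World frame.
print(cam.get_pos(), cam.get_rpy())
print(base.get_pos(), base.get_rpy())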


parthnatekar commented on July 3, 2024

Hi, thanks for your reply. A couple of quick questions:

  1. When I send a position to the robot using crtk_ros_based_control.py, is this the position of the PSM end-effector with respect to the PSM base frame, or with respect to the world frame?
  2. What is the Euler angle convention used?


adnanmunawar commented on July 3, 2024

Hi Parth,
Here are the answers to your questions:

When I send a position to the robot using crtk_ros_based_control.py, is this the position of the PSM end-effector with respect to the PSM base frame, or with respect to the world frame?

The pose commands in crtk_ros_based_control.py, and the servo_cp method in psm_arm.py, are applied w.r.t. the PSM's base frame and NOT the world frame.

What is the Euler angle convention used?

Convention for what, specifically? In the ADF files, we use RPY (Roll Pitch Yaw), which is the fixed-angle convention (equivalent to the extrinsic Euler convention) in the order XYZ. Extrinsic Euler XYZ with rotations in the order (a, b, c) is equivalent to intrinsic Euler ZYX with rotations in the order (c, b, a). Or:

Fixed RPY (a, b, c) = Extrinsic Euler XYZ (a, b, c) = Intrinsic Euler ZYX (c, b, a)

Not all software packages specify whether they use the extrinsic or intrinsic Euler convention. Most commonly, if they don't qualify the convention, they use intrinsic Euler, which is also known as rotation w.r.t. a moving frame.
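You can sanity-check this equivalence numerically, e.g. with SciPy (in SciPy's from_euler, lowercase axis strings mean extrinsic/fixed axes and uppercase mean intrinsic/moving axes):

import numpy as np
from scipy.spatial.transform import Rotation as R

a, b, c = 0.3, -0.7, 1.1  # roll, pitch, yaw
R_fixed_xyz = R.from_euler('xyz', [a, b, c])   # extrinsic XYZ (a, b, c)
R_moving_zyx = R.from_euler('ZYX', [c, b, a])  # intrinsic ZYX (c, b, a)
assert np.allclose(R_fixed_xyz.as_matrix(), R_moving_zyx.as_matrix())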


parthnatekar commented on July 3, 2024

Hi,

Thanks for your reply.

Can you point me to the documentation you referred to on the coordinate frames for the PSM arm?


adnanmunawar commented on July 3, 2024

There is the dVRK Manual found here (https://research.intusurg.com/index.php/DVRK:Documentation:Main); you need to have an account to log in and view the document. Or you can also refer here. This figure only shows the base and tip frames, but they are identical to the dVRK Manual and to what we use for the actual PSM. Finally, you can open any PSM ADF file in Blender to view the individual link frames, which should also match the actual PSM. Note that you will need to install the blender_ambf addon to load the ADF files.


adnanmunawar commented on July 3, 2024

Closing this as resolved

