
mobile tests (ethzasl_sensor_fusion issue #8, open, 32 comments)

ethz-asl opened this issue:
mobile tests

Comments (32)

simonlynen commented:

Hi Aswin,

  1. Yes, you need to specify the camera-IMU attitude (and roughly the position) in the file pose_sensor_fix.yaml, expressed in the body (IMU) frame (see the sketch after this list).
  2. Such a large influence of the sensor_fusion package on the PTAM frame-rate sounds strange to me, unless you are processing this on a single-core machine. Given that you use the ODROID, there should be sufficient processing power for the filter to run alongside PTAM. With our setup we use IMU data at rates from 100 Hz to 2 kHz, and PTAM runs at 30 Hz.
  3. How the filter movement compares to the PTAM movement depends on the scale that is estimated. It looks to me as if you initialized the scale with a value of 30; however, you have to provide a value which relates the initial stereo-pair distance of PTAM to a metric value.
  4. Publishing of the pose stops if there is no update; however, the propagation continues in the background.
  5. As I wrote before, the computational cost should not be that high. The filter outputs the attitude at a higher rate, which will be crucial for control.
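As an illustration of point 1 (using the parameter names that appear in the config snippets later in this thread; the numbers here are only placeholders, not calibration values):

# pose_sensor_fix.yaml: camera pose expressed in the IMU (body) frame
init/q_ci/w: 1.0   # attitude quaternion (w, x, y, z)
init/q_ci/x: 0.0
init/q_ci/y: 0.0
init/q_ci/z: 0.0
init/p_ci/x: 0.0   # camera position in metres
init/p_ci/y: 0.0
init/p_ci/z: 0.0

The scale itself is not set in this file; it is provided at initialization time (via the dynamic reconfigure GUI / "set_height", as discussed later in this thread).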

Simon

simonlynen commented:

Aswin,

Something else: it would probably be a good idea to get raw IMU measurements, as Markus stated in the mail yesterday. Without gravity it will be hard to estimate the attitude correctly.

Simon

aswinthomas commented:

Hi Simon,
As far as I know, the Xsens MTi sends out acceleration including gravity. Does a bias_z of 10 mean that there is no gravity?

I will update the attitude in the config file and try again.
So if I understand correctly, if I do not need attitude information (i.e. I only need the 3-DoF position from PTAM), I don't need to use the filter? Is it possible to get velocity information from PTAM?

Pardon my lack of knowledge of the subject.

Aswin

simonlynen commented:

Hi Aswin,

Yes, as Markus wrote, you must use an IMU which provides measurements including gravity; otherwise two things are problematic:

  1. The attitude w.r.t. the world frame is no longer observable, because the PTAM attitude will drift over longer distances travelled.
  2. To calculate the gravity compensation, the IMU uses the gyro readings; however, basic EKF theory does not allow these two measurements to be correlated.
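To make the gravity argument concrete: the accelerometer model behind this kind of filter is, as a sketch, a_m = C^T (a_w - g_w) + b_a + noise, with g_w = (0, 0, -9.81) in a z-up world frame, so a level robot at rest measures a_m of roughly (0, 0, +9.81). If the driver already subtracts gravity ("free acceleration"), exactly the term that makes roll and pitch observable is gone.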

If you want only position information, the filter can still provide an estimate of the scale, thus giving you access to metric measurements.

PTAM returns absolute values with respect to the first frame of the stereo-init pair, so you can calculate velocities. However, if you want to use this information for control, you will need a higher rate of smoothed position estimates, which the filter can provide.

If, however, you use e.g. an AR.Drone, which stabilizes itself, you can also use PTAM directly for position control in a non-metric frame.

Simon

stephanweiss commented:

Hi Aswin,

If I may join the conversation, I would like to ask you to provide us with the following:

  • plot/bag-file of the IMU raw accelerometer measurements when the robot is "flat" on the ground.
  • system setup: is your camera pointing down, up, to the front? What robot are you using?
  • values you enter in the config file for the 6DoF transformation between cam and IMU
  • screenshot of the dynamic reconfigure GUI showing all parameter values of the sensor_fusion node

Besides that, you can

  • try what Simon and Markus suggested, including setting the cam-IMU attitude in the config file
  • use "set_height" (instead of "init_filter") to initialize the framework when the robot is at a given metric height above the tracked features. This reduces the risk of initializing a wrong scale factor.
  • verify that the time-stamps are correctly set for all your data. If you have different processors, make sure they are time-synchronized (using chrony, TICSync, etc.); see the sketch after this list.
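For the time-stamp check, a minimal sketch in Python (the bag file name and topic names are placeholders; adjust them to your setup) that prints the offset between the recording time and each message's header stamp:

import rosbag

bag = rosbag.Bag('flight.bag')  # placeholder file name
for topic, msg, t in bag.read_messages(topics=['/imu/data', '/vslam/pose']):
    # a large or drifting offset hints at unsynchronized clocks or wrong header stamps
    print(topic, (t - msg.header.stamp).to_sec())
bag.close()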

Best
Stephan



aswinthomas commented:

Hi Stephan,
Please give me a few days to reply, since I am not in the lab. My system setup is shown below:
https://dl.dropbox.com/u/8948006/setup.jpg -- side view IMU+camera
https://dl.dropbox.com/u/8948006/setup2.jpg -- top view IMU+camera

Cheers

aswinthomas commented:

Hi Stephan,
Thank you for the suggestion on set_height; it works.
Acceleration plot when stationary: https://dl.dropbox.com/u/8948006/accplot.png
Does this look right?

Screenshot of dynamic reconfigure: https://dl.dropbox.com/u/8948006/dynamic.png

parameters in config file:
init/q_ci/w: 1.0
init/q_ci/x: 0.0
init/q_ci/y: 0.0
init/q_ci/z: 0.0

init/p_ci/x: 0.0
init/p_ci/y: -10.0
init/p_ci/z: -5.0

Thank you
Aswin

stephanweiss commented:

Aswin,

Thanks for the plots and the data. The acc plot and reconfigure parameters look mostly good. Looking at the rest, it surprises me that the filter works at all, given the following:

  • I saw that you have the value for "set height" set to 1. This is only valid if you initialize your framework at 1 m above the tracked features.
  • The q_ic parameters are most probably wrong. Based on the picture you sent me, you have a 180° rotation between cam and IMU such that the z-axis of the cam looks down, and presumably some yaw. This leads to a quaternion q(w, x, y, z) = [0, x, y, 0], where the values x, y depend on the amount of yaw between the cam and IMU (see the small computation after this list).
  • p_ic is probably wrong too; remember that these are metric values. Your cam is unlikely to be 10 m away from the IMU, given the picture you sent me. Also, remember that p_ic represents the position of the cam center in the IMU frame.
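As a sketch of how such a quaternion can be computed (Python with tf.transformations; the 30° yaw is only an example value, not a calibration result, and the multiplication order depends on the exact q_ci convention, but either order yields the [0, x, y, 0] form):

import math
from tf import transformations as tfs

yaw = math.radians(30.0)                                # example cam-IMU yaw offset, not a calibration value
q_flip = tfs.quaternion_about_axis(math.pi, (1, 0, 0))  # 180 deg about x: camera z ends up pointing down
q_yaw = tfs.quaternion_about_axis(yaw, (0, 0, 1))
q = tfs.quaternion_multiply(q_yaw, q_flip)              # careful: tf uses (x, y, z, w) ordering
print('q_ci (w, x, y, z) =', q[3], q[0], q[1], q[2])    # -> (0, cos(yaw/2), sin(yaw/2), 0)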

Let me know if the filter works correctly after these corrections.

Best
Stephan



aswinthomas commented:

Hi Stephan,

  1. The filter was initialized 1 m from the ground.
  2. Regarding the q_ci parameters: if the camera is looking down, isn't the z-axis of PTAM also facing down? The IMU z-axis is also facing down. There could be a 180-degree yaw difference in the x and y axes.
  3. Thanks for the suggestion on the p_ci values; I have corrected them.

At present I can obtain good values for velocity. The position, however, seems wrong (it shows 50 m in x and 100 m in y). I guess this is expected. My basic requirement was velocity (in m/s from ssf_core) and camera-frame position (from PTAM). I hope this will be enough to control a helicopter.

Thank you for your support.
Cheers

icoderaven commented:

Hi Stephan,

Sorry for hijacking this thread, but on a related note: can I use a front-facing camera with PTAM? I could not find any information regarding this in the tutorials or the issues. What modifications would I be required to make if I were to use such a configuration?

I have already set the IMU->camera transformation, with the following parameters in pose_sensor_fix

init/q_ci/w: -0.5
init/q_ci/x: 0.5
init/q_ci/y: 0.5
init/q_ci/z: 0.5

init/p_ci/x: 0.10
init/p_ci/y: 0.0
init/p_ci/z: 0.08

(My IMU has a local NED frame, with X pointing forwards, and Z downwards, and the camera is located facing forward along the IMU's positive X axis)

Specifically, I'm having trouble with determining the scale parameter. The pose estimate provided by the filter rapidly jumps to 10^38, and I am beginning to wonder if it's because the framework implicitly assumes that the camera is facing downwards.

My overall aim is to obtain satisfactory position estimates over short periods of time (a few seconds)

Thanks a lot!

p.s. I've also attached a rosbag for your perusal.
https://dl.dropboxusercontent.com/u/37626302/2013-04-16-01-07-33.bag
And here's the screenshot (I'm looking at a sofa right in front of a wall and the drone is 2.06 meters from the wall)
[Screenshot from 2013-04-16 01:08:59]

stephanweiss commented:

Hi

Yes, you can use a front-facing camera with our framework (and PTAM). The view direction of the camera is irrelevant for our algorithms as long as you initialize the states near the true values.

That is:

  • the camera-IMU transformation as you described should do it
  • you also need to adjust the initial values of the vision-world drift state q_wv in the code. If you face a vertical wall this is most likely a 90-degree turn in roll. Note that you can choose the initial yaw, since this is an unobservable state.

The high values in the pose may come from missing update signals (i.e. missing camera pose estimates). Please verify the rosgraph, so that all topics are linked correctly and camera readings continuously "arrive" in the filter framework.
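For the 90-degree roll mentioned above, the corresponding quaternion in (w, x, y, z) order is simply q_wv = (cos 45°, ±sin 45°, 0, 0), i.e. roughly (0.7071, ±0.7071, 0, 0); which sign applies depends on which way the camera is rolled with respect to the world, so if the filter diverges right after initialization it is worth trying the other sign.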

I am currently on travel, hence the short answer. I will be back on Friday.

Best
Stephan



markusachtelik commented:

You should also rotate your IMU coordinate system to the convention we use (somewhat ENU): x forward, y left, z up. Otherwise, gravity gets subtracted into the wrong direction during prediction…
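A minimal sketch of such a conversion (Python/rospy; the node and topic names are placeholders, and only the raw accelerometer and gyro fields are remapped, which is what the prediction step consumes):

#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Imu

def cb(msg):
    out = Imu()
    out.header = msg.header
    # NED (x fwd, y right, z down) -> x fwd, y left, z up: flip the y and z axes
    out.linear_acceleration.x = msg.linear_acceleration.x
    out.linear_acceleration.y = -msg.linear_acceleration.y
    out.linear_acceleration.z = -msg.linear_acceleration.z
    out.angular_velocity.x = msg.angular_velocity.x
    out.angular_velocity.y = -msg.angular_velocity.y
    out.angular_velocity.z = -msg.angular_velocity.z
    pub.publish(out)

rospy.init_node('imu_ned_to_enu')        # node and topic names are placeholders
pub = rospy.Publisher('imu_enu', Imu)    # newer ROS versions also want a queue_size argument
rospy.Subscriber('imu_ned', Imu, cb)
rospy.spin()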

stephanweiss commented:

Yes, that's a good point. So either you change your IMU coordinate system or you change the gravity vector defined in the [sensor]_measurements.h file.

Best
Stephan



icoderaven commented:

Aha. That explains it. Thanks! I'll change the coordinate systems and try. Do you think this coordinate convention should be specified on the ROS page?

markusachtelik commented:

We're using the standard ROS conventions described here: http://www.ros.org/wiki/geometry/CoordinateFrameConventions , but it would indeed make sense to put that link on the page.

stephanweiss commented:

Yes, I will update the tutorials as soon as I am back. I am unsure whether the IMU coordinate system is the main cause of your issues, since I assume you initialized all corresponding attitudes using this coordinate frame (so it should be consistent). Usually, a "wrong" g vector just results in a bias estimate of about 18 in z and some other minor inconsistencies. I will have a closer look at your bag file and the rotations on Friday. In the meantime, please also check your vision-world initialization and the correct linking of the topics.

Best
Stephan



icoderaven commented:

Hi Stephan, Markus

I've tried changing my camera's orientation to point downwards so as to coincide with the IMU's NED frame, and even tried flipping the signs on the accelerations in the IMU topic (so that, e.g., my z coordinate showed positive 9.8-ish values), but to no avail.

When I track both the /vslam/pose and the /imu/data topics, rviz shows me that they are both oriented in a NED frame and their rotations correspond as well, so it should not be a quaternion issue either. (I set the quaternion rotation in my PTAM config file to 0, 0, 0, 1.)

Also, the pose estimate finally seemed to be working well, but quite often, just when it appears to converge, the prediction step suddenly starts throwing NaN errors. A very good example of such a situation is towards the end of the attached rosbag.

Any further inputs?

Thanks a lot for your help!

p.s. Rosbag https://dl.dropboxusercontent.com/u/37626302/2013-04-21-02-53-48.bag

stephanweiss commented:

Hi,

sorry for the very late reply. Things are a bit busy at the moment.

As far as I remember, the conflict only occurs with old compilers (gcc 4.3, I think).
Concerning the rest, note that the quaternion init is (w, x, y, z) and not (x, y, z, w). I am going through your bag file now.

Best
Stephan


(Quoting icoderaven's earlier message:) I was going through the source code directory and just saw the IMPORTANT file in ssf_core that mentioned that the ROS PCL libraries conflict with EKF updates. That might be causing the problem. Has that issue been resolved yet?



icoderaven commented:

Hi Stephan

Thanks for replying! I did make progress on this: I think I managed to get the filter working (i.e. the x, y and z coordinates look about right). And yes, I noticed that the quaternion was a [w x y z].
[screenshot: it_works]
However, when I try plotting them using the pose output topic, I see very odd behaviour. Specifically, at every PTAM update the pose jumps to where it should be, and then drifts rapidly (see the next figure; I was moving the quadrotor in circles, and the pose output drifts rapidly outwards until the next PTAM update, when it jumps to another point). That seems to suggest that the pose output is different from the state vector in the stateout message.
[screenshot: huh]

I saw that the IMU in your data was in the NWU frame, and the camera looking down was in the NED frame, so the appropriate quaternion transformation was a [0 1 0 0] quaternion. Since my 3DM-GX3-25 outputs IMU data in the NED frame, I flipped the accelerometer values and rotated my orientation matrix with the above-mentioned rotation transform (A diag(1, -1, -1)). I paused work on this since I started focusing on getting PTAM working well first. I shall look into it again in a few days.

As an additional side note, I couldn't get the ssf_update node running on the PandaBoard; at first look it seemed like another ARM-architecture issue within ROS.

Thanks for responding!

simonlynen commented:

Hi icoderaven,

What is the error you get with ssf_updates on ARM? Could you check whether it is a bus error by attaching gdb?

In case you get a bus error, please use the fix described here:

https://code.ros.org/trac/ros/attachment/ticket/2883/serialization_memcpy_instead_of_reinterpret.diff
The file is in ros_underlay/roscpp_core/roscpp_serialization/include/ros/serialization.h

Simon

icoderaven commented:

Hi Simon
Yes, I had the bus error earlier with PTAM, and I have already made that patch. This was another bus error coming from elsewhere in serialization, if memory serves me correctly. I'll look into it and let you know where it was.

stephanweiss commented:

Hi,

I just saw your reply after having typed the message below; I post it anyway as a reference for others, as a small how-to for debugging.
Also, can we move the ARM issue to another thread/issue with an appropriate title?

Some remarks on your bag file:
You never seem to have motion for longer than 10-20 seconds. Also, in the acceleration data there is at least one significant spike. During the debug phase, try to do the following:

  • init PTAM
  • align the helicopter with the PTAM map (for a front-looking camera, try to align it e.g. such that helicopter x points along PTAM z, or something like this). This does not need to be very exact; we just want the visual measurement to be excited in one single axis when we move the heli along one single body-frame axis.
  • do a pure (as pure as manually possible) sinusoidal translation in helicopter x, starting the sinusoid in positive helicopter x. Amplitude about 0.5 m at roughly 1 Hz, at roughly 1.5 m distance to the features.
  • repeat this for y and z (of course, one of these motions will change the distance to the features)
  • do a pure (as pure as manually possible) rotation around one axis at a time (+/- 10 deg at about 0.5 Hz, at about 1.5 m distance to the features)

Record this data (IMU and PTAM; IMU at 100 Hz is sufficient, PTAM as fast as possible). You can check the cam-IMU rotation by analyzing the IMU angular velocities and taking the derivative of the PTAM attitude. With this same data you can also verify the time delay between the two signals, the true framerate, spikes, etc. If the error does not show itself by analyzing this data, please send the bag file.
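For the time-delay check, a small sketch (Python/NumPy; it assumes you have already extracted timestamp arrays and |angular rate| arrays for the IMU and for the differentiated PTAM attitude, and the sign convention of the returned lag should be sanity-checked on data with a known offset):

import numpy as np

def estimate_delay(t_imu, w_imu, t_cam, w_cam, dt=0.005):
    # resample the |angular rate| of both streams onto a common time grid
    t0, t1 = max(t_imu[0], t_cam[0]), min(t_imu[-1], t_cam[-1])
    t = np.arange(t0, t1, dt)
    a = np.interp(t, t_imu, w_imu)
    b = np.interp(t, t_cam, w_cam)
    a, b = a - a.mean(), b - b.mean()
    c = np.correlate(a, b, mode='full')
    lag = np.argmax(c) - (len(t) - 1)
    return lag * dt   # approximate time offset between the IMU and PTAM streams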

I hope this helps, please let me know
Best
Stephan



stephanweiss commented:

In your latest image I see "fuzzy tracking" in one of the consoles. Is your map robust and well initialized and stable under motion?

Do you get continuous measurements on /vslam/pose? At what frequency?
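A quick way to answer the frequency question on the live system (the topic name is the one used earlier in this thread) is to run rostopic hz /vslam/pose, and the same command on the IMU topic: it prints the measured rate and its jitter, and irregular output there usually points at PTAM dropping frames or at topics not being connected.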

Best
Stephan



icoderaven commented:

Hi Stephan
Thanks a lot for the standard debugging procedure - it's really useful! I'll record a bag with this data soon.
In the meantime, yes, I do have a stable map (I might have reset PTAM in the screenshot earlier), and I get PTAM updates at approximately 15 Hz.
Right now I've shifted the camera to look down such that its x-axis points forward along the front (x) of the quadrotor.

icoderaven commented:

Hi Stephan
So I did create a bag and compared the angular velocity output of the PTAM pose with that provided by the IMU. I see a very close match about the Y and Z axes, but about the X axis the PTAM angular velocity estimate goes bonkers for some odd reason. I am using the quaternion-to-Euler transformation in tf to convert the quaternion to Euler angles, and then taking the difference between two consecutive poses and dividing it by their time difference. Please see the attached plots: the one on the left is the calculated angular velocity, while the one on the right shows the IMU message values.
[plot: fusion_plots]
[plot: fusion_debug]

Is this normal? Also, here's the bag file https://dl.dropboxusercontent.com/u/37626302/test.bag

Thanks a lot, yet again!

stephanweiss commented:

Hi,

From the plots it is hard to tell whether everything matches well; please make an error plot, or at least an overlay. You can compare the quaternions directly; this way you avoid potential issues when converting to Euler angles.
From the bag file I have the feeling that there are some hiccups in the pose estimate and that the measurements do not arrive regularly. This could be because the map is not robust at every frame, or because of time-sync issues. Is all your data time-synced?
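One way to compare the quaternions directly (a sketch in Python with tf.transformations; it assumes the PTAM pose quaternion maps body to world and is given in tf's (x, y, z, w) order, so swap the multiplication order if your convention is the inverse):

import numpy as np
from tf import transformations as tfs

def body_rate_from_quats(q0, q1, dt):
    # relative rotation between two consecutive PTAM orientations, tf (x, y, z, w) order
    dq = tfs.quaternion_multiply(tfs.quaternion_inverse(q0), q1)
    if dq[3] < 0:                           # enforce the short rotation
        dq = -np.asarray(dq)
    return 2.0 * np.asarray(dq[:3]) / dt    # approximate body angular velocity for small dt

This can be overlaid directly on the IMU angular_velocity to check the cam-IMU rotation and spot spikes, without going through Euler angles at all.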

Best
Stephan

devmax commented:

You mention that your IMU assumes a z-axis upward, in which case the initial acceleration in the z-direction would be -9.8. However, instead of adding 9.8 to the initial value, you subtract 9.8, which would make the observed z -19.6 in case of zero motion...

Is there something I'm missing?

stephanweiss commented:

The IMU assumes a world z-axis pointing upwards. That is, you have to apply a force of [mass]*9.81 to have the MAV in hover mode (simply speaking, the acceleration part is what the IMU is measuring).
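A worked example, under the same specific-force model sketched earlier in this thread: with a z-up world and g_w = (0, 0, -9.81), a stationary, level IMU measures a_m = C^T (a_w - g_w) = (0, 0, +9.81), not -9.81; during propagation the filter then recovers a_w = C a_m + g_w = 0, so nothing is double-counted. If the MAV additionally accelerates to the left (positive y in the x-forward / y-left convention) at 1 m/s^2, the measured y component is +1 m/s^2, which is the sign Markus confirms below.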

Best
Stephan



devmax commented:

I see, well just to be sure then,

  1. assuming the x-forward, y-left coordinate system you use: if the drone travels to the left, should the acceleration on y be positive or negative?

markusachtelik commented:

It should be positive — hope that doesn't cause too much confusion ;)

devmax commented:

Ahh, so my ROS driver publishes faulty signs then. I suppose that is my problem, or one of them at least :)

Thank you!

icoderaven commented:

Yeah. For instance, my IMU reports accelerations in the X-forward, Y-right, Z-down BODY frame, but the angular orientations in the NED frame (or, more precisely, as a rotation matrix that, when multiplied by a vector in the earth-fixed frame, gives you the vector in the BODY frame, i.e. Vl = M*Ve). As you can imagine, it gets pretty confusing trying to figure out the right way to match what is on the Pelican, also because I don't know how, or in what frame, the rotations are provided by the Pelican IMU.
