
Comments (11)

Rennschnitzl commented on August 27, 2024

Maybe the rs::intrinsics you get from the device do not match the actual intrinsics of the cameras.

The deproject function does some basic projective geometry plus undistortion with the Brown-Conrady model.
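For reference, the deprojection math is only a few lines. Here is a simplified Python sketch of what a deproject step with a Brown-Conrady-style distortion model looks like; the intrinsic values and coefficients below are illustrative, not taken from a real device, and the exact distortion handling in librealsense differs per model:

```python
# Hedged sketch of pixel -> 3-D point deprojection with Brown-Conrady-style
# distortion terms. All numeric parameters here are made up for illustration.

def deproject(u, v, depth, fx, fy, ppx, ppy, coeffs=(0, 0, 0, 0, 0)):
    """Map a pixel (u, v) with a depth value (meters) to a 3-D point."""
    # Normalize by the pinhole intrinsics: focal lengths and principal point.
    x = (u - ppx) / fx
    y = (v - ppy) / fy
    # Apply Brown-Conrady radial (k1, k2, k3) and tangential (p1, p2) terms.
    k1, k2, p1, p2, k3 = coeffs
    r2 = x * x + y * y
    f = 1 + k1 * r2 + k2 * r2 * r2 + k3 * r2 ** 3
    ux = x * f + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    uy = y * f + 2 * p2 * x * y + p1 * (r2 + 2 * y * y)
    # Scale the normalized ray by the measured depth.
    return (depth * ux, depth * uy, depth)

# A pixel at the principal point always deprojects straight ahead:
print(deproject(320, 240, 1.0, 600.0, 600.0, 320.0, 240.0))  # (0.0, 0.0, 1.0)
```

The point of the sketch: if `fx`, `fy`, `ppx`, `ppy`, or the coefficients are wrong, every deprojected point is wrong in a spatially varying way, which is exactly what a bent point cloud looks like.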

I had a similar problem when using the unofficial library from https://github.com/teknotus/depthview, where I had to correct the distances myself, since you have to do some calculations based on the camera intrinsics. Being lazy while calibrating gave me the same effect.
The base streams still have lens distortion, so you can use them for calibration.

You could try the point cloud sample (number 3) and see if the effect shows up there too. My camera seems fine, but I put my own intrinsic values in instead of taking them from the camera.

tl;dr: my bet is on bad calibration.

[Screenshots attached: 2016-01-23 18:30:48, 18:30:54, 18:31:18]

from librealsense.

teknotus commented on August 27, 2024

On the F200 the camera interfaces are claimed by the Video4Linux drivers and are usually put in a video permissions group, but the calibration is handled by another interface which shows up as raw USB access, with default permissions a normal user wouldn't have. There are udev rules in /config that might fix this by making the USB endpoint read/write rather than the default of no access at all. I also noticed when reading through the calibration code that there is a TODO for supporting resolutions other than 640x480.
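A udev rule of the kind described looks roughly like this. This is a sketch modeled on the rules shipped in librealsense's config directory; the product ID shown is assumed to be the F200's, so verify the actual IDs with `lsusb` before relying on it:

```
# Grant all users read/write access to the camera's raw USB interface.
# idVendor 8086 = Intel; idProduct 0a66 is assumed here for the F200 -- check lsusb.
SUBSYSTEM=="usb", ATTRS{idVendor}=="8086", ATTRS{idProduct}=="0a66", MODE:="0666"
```

Installing it typically means copying the file into `/etc/udev/rules.d/` and replugging the camera so the rule is applied.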

Wollimayer commented on August 27, 2024

I did some more tests with different cameras. Some stored calibrations seem to be nearly perfect. However, others needed some manual calibration with a checkerboard.
The undistorted IR stream seems to be correct now, but there are still problems with the depth information.

The bending effect at the corners becomes visible when using the whole depth image (not just the part that is also visible to the color camera). This might be a limitation of the depth image.

I measured the distances computed by different cameras within the color rectangle. Some of them differ by 2 cm (constant over the area).

As this effect occurs depending on the camera used, I'm wondering whether the orientation of the projector causes differences in the depth image. As far as I know, the camera and the projected image need to be calibrated (probably by Intel during production); depending on the forces applied to the circuit board during assembly, the relation between camera and projector might change, which finally leads to slight changes in the measured distances.

The question is how to deal with these effects.

Rennschnitzl commented on August 27, 2024

The distance measured depends on the sensor's temperature. There is a model for the difference based on the temperature measured by the sensor itself. Please make sure both cameras are running at roughly the same temperature, either by letting both cool down (around an hour for mine) or by letting them run for a while before measuring (mine was stable after around 45 minutes, as a rule of thumb).

The relation of the RGB and IR sensors (the extrinsics you can actually calibrate) is not relevant for the distance measurement, only for mapping the color onto the depth.
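To make that point concrete: an extrinsic between two sensors is just a rotation plus a translation applied to 3-D points, so it remaps points between sensor frames without ever changing a measured distance. A minimal sketch, with a made-up 25 mm sensor offset (librealsense stores the rotation column-major; row-major is used here for readability):

```python
# Sketch of a depth->color extrinsic transform: rotate then translate a point.
# The identity rotation and 25 mm baseline below are illustrative assumptions.

def transform_point(R, t, p):
    """Apply rotation matrix R (row-major 3x3) and translation t to point p."""
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))

R_identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = (0.025, 0.0, 0.0)  # assumed 25 mm horizontal offset between the sensors

# The point moves sideways in the color frame, but its distance along the
# original ray was already fixed by the depth measurement:
print(transform_point(R_identity, t, (0.0, 0.0, 1.0)))  # (0.025, 0.0, 1.0)
```

This is why recalibrating the RGB-to-IR extrinsic cannot fix wrong distances; it only changes where depth pixels land in the color image.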

I'm not sure if the orientation of the IR sensor with respect to the laser projector makes a difference. If that is the case, you'd have to measure the offset and add it manually, I guess.

ddiakopoulos commented on August 27, 2024

@Rennschnitzl Correct in your statement about temperature and performance, although there is host-side correction for this which updates the ASIC directly.

@teknotus You reminded us that we need to clean up a few TODOs in the code -- namely that they aren't entirely accurate (non-640x480 modes should work on F200 now).

The underlying issue at hand is that one of the cameras documented in this thread probably left the factory without being properly calibrated (or was physically dropped, bent, or warped). There's little corrective action we can take in librealsense -- the per-camera calibration stored in device memory should yield correct intrinsics in all cases with no manual intervention needed by developers (such as checkerboard recalibration). The best we can offer is that you can try emailing [email protected] for a replacement F200. Hope that helps!

teknotus commented on August 27, 2024

ddiakopoulos commented on August 27, 2024

@teknotus Correct, there are some minor adjustments made on the host side to correct for thermal drift, but in fact these updated intrinsics are not given back to users of the library. We double- and triple-checked with our internal hardware team: the recomputed coefficients are used to stabilize the ASIC algorithm and have minimal effect on the projection (e.g. shifting the principal point by a thousandth of a pixel), which doesn't explain the massive discrepancies documented in the images above. This is only a problem for the F200; on the SR300 all compensation is now done in hardware.

Wollimayer commented on August 27, 2024

@ddiakopoulos I tested another camera: same effect, but not as strong. I'm wondering: is the intrinsic calibration performed by Intel or by Creative? During the assembly process at Creative, the camera might be exposed to (minimal) pressure.

A bent circuit board leads to a shift (alpha) of the projected patterns. The camera calculates the depth information based on the location of each pattern in the image. As the pattern position shifts, the camera assumes a wrong distance (the crossing between the camera ray and the expected ray for a given pattern).

[Sketch attached: img_0022]

You may correct me if I'm wrong, but I believe this leads to the bending effect.
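The argument above can be put in numbers with a toy triangulation model. Structured-light depth behaves like stereo, roughly z = f * b / d, with focal length f in pixels, camera-projector baseline b, and disparity d (the pattern's pixel offset). All values below are illustrative assumptions, not real F200 parameters:

```python
import math

# Toy triangulation: depth z = f * b / d. A projector tilt of angle alpha
# shifts every pattern by roughly f * tan(alpha) pixels, which triangulation
# then misreads as a depth change. Parameters are made up for illustration.

def depth_from_disparity(f, b, d):
    return f * b / d

f, b = 600.0, 0.025          # 600 px focal length, 25 mm baseline (assumed)
true_depth = 1.0             # object actually at 1 m
d = f * b / true_depth       # ideal disparity: 15 px

alpha = math.radians(0.05)   # a twentieth of a degree of board bending
shift = f * math.tan(alpha)  # ~0.52 px of pattern shift
biased = depth_from_disparity(f, b, d + shift)
print(f"disparity shift {shift:.2f} px -> depth {biased:.3f} m instead of 1 m")
```

With these assumed numbers, a tilt of only 0.05 degrees already produces an error of a few centimeters at 1 m, which is the same order of magnitude as the 2 cm differences reported above.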

In order to use this technology for scientific projects, it would be useful to have a method to correct this effect. (I'm unsure whether the SR300 solves this problem entirely, as it seems to use the same technology.)

Otherwise, is there any chance to get the raw RealSense chips (calibrated by Intel)?

leonidk commented on August 27, 2024

@Wollimayer

First, I think a basic check is to try the RealSense SDK and see if you're still having issues. If so, there's an actual hardware problem and you should contact Intel support for warranty/support information. If the RSSDK fixes it, then we have an actual librealsense bug.

Second, I think we should check if this bending is within the expected tolerances for shipped F200 units. @ddiakopoulos

Third, I think this may just be unfortunate, expected behavior. I don't believe we currently offer a method of re-calibrating the device in either the RSSDK or librealsense. The cameras are factory-calibrated, but there are challenges to maintaining calibration, and certain cameras may fall out of calibration for some of the following reasons:

  1. Mechanical impulses causing intrinsics or extrinsics changes, such as bending of the stiffener or shifting of the lens. The cameras are designed with a stiffener, but it's possible that mechanical impulses (e.g. dropping) might knock the device out of calibration.
  2. Un-modeled temperature changes. We currently have a real-time loop running a temperature correction model that adjusts intrinsics. Subjecting the device to strong or non-uniform heat sources might cause this loop to fail.
  3. Reflection & multipath. If you have a highly reflective or specular environment, self-reflections in the scene might cause ambiguities.
  4. Un-modeled pressure changes. I won't talk about details, but the F200 can be sensitive to strong ambient pressure changes. There are sensors and logic to compensate for this but you never know.
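Point 2 above can be sketched as a toy version of such a correction loop: scale an intrinsic linearly with the temperature delta from the calibration temperature. The coefficient and calibration temperature here are hypothetical; the real model is Intel-internal and not exposed by the library:

```python
# Toy thermal-drift correction for a focal length. Both constants are made up
# for illustration; they are NOT the real F200 correction model.

CALIB_TEMP_C = 35.0          # assumed temperature at factory calibration
DRIFT_PER_DEG = -1.5e-4      # hypothetical fractional drift per degree C

def corrected_focal(fx_calib, current_temp_c):
    """Linearly adjust the calibrated focal length for the current temperature."""
    return fx_calib * (1.0 + DRIFT_PER_DEG * (current_temp_c - CALIB_TEMP_C))

print(corrected_focal(600.0, 35.0))  # 600.0 at the calibration temperature
print(corrected_focal(600.0, 45.0))  # slightly shorter when the device is warm
```

A loop like this only works if the temperature gradient matches the model's assumptions, which is why a strong or uneven external heat source (point 2) can defeat it.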

teknotus commented on August 27, 2024
  1. Un-modeled pressure changes. I won't talk about details, but our
    device can be sensitive to strong ambient pressure changes. There's sensors
    and logic to compensate for this but you never know.

Does this mean it has a barometer? I don't think I know enough to use that
information for correcting an image, but the raw data from a barometer
would be useful for unrelated applications.

ddiakopoulos commented on August 27, 2024

The ambient pressure change compensation is hard-coded into the chip -- there is no barometer (a colleague has informed me that there is one but it's hardwired into the chip with no software controls) and no compensation should be required on the part of the camera user.
