
Comments (9)

brucejk avatar brucejk commented on September 26, 2024

Hi,

Thank you for using our repo! Regarding your title, which corners are you referring to? Are they camera corners or LiDAR vertices? For the camera corners, you need to click on the image and find the coordinates. For the LiDAR vertices, this package will optimize them for you; setting opts.optimizeAllCorners = 1 will do the job. Please also follow the instructions here when you calibrate your system.

Please let me know if you have other problems!

from extrinsic_lidar_camera_calibration.

wenboDong0917 avatar wenboDong0917 commented on September 26, 2024

Do we need the ALL_LiDAR_vertices folder when we use our own dataset? And how can I create a dataset like the one you provided?

from extrinsic_lidar_camera_calibration.

brucejk avatar brucejk commented on September 26, 2024

Hi,

Please follow the instructions here to collect your datasets. You don't need the ALL_LiDAR_vertices folder for your own datasets at first; the software will create one and save the LiDAR vertices for you. Please let me know if you have other questions!

from extrinsic_lidar_camera_calibration.

wenboDong0917 avatar wenboDong0917 commented on September 26, 2024

OK! Thanks again. I find that the .mat files have various types, and I would appreciate it if you could tell me how to create .mat files such as 'full-pc-.mat' and 'velodyne_points-EECS3--2019-09-06-06-19.mat', because I don't know what data is in those .mat files. I have already created the 'big/med/small/-.mat' files by using bag2mat.py.

from extrinsic_lidar_camera_calibration.

brucejk avatar brucejk commented on September 26, 2024

Hi,

That's great that you have already used bag2mat.py to convert the data! Please take a look at getBagData.m. There are two types of data:

I) TestData is for testing/visualization and does not contain calibration targets. Take TestData(1) for example:
TestData(1).bagfile = "EECS3.bag"; --> The bagfile you collected for a testing scene.
TestData(1).pc_file = "velodyne_points-EECS3--2019-09-06-06-19.mat"; --> The full point cloud extracted from the bagfile using bag2mat.py.

II) BagData is for training and validation and does contain calibration targets. You need to know how many calibration targets are in the scene and the size of each target. For each target, we need the LiDAR returns on the target; in other words, a patch of the LiDAR point cloud on the target. This package will use the patch to estimate the LiDAR vertices for you. We also need the corner coordinates on the image, in the order top-left-right-bottom.
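The 3x4 corner matrices in the BagData examples below stack the four pixel coordinates (ordered top-left-right-bottom) as homogeneous columns. A minimal Python sketch of building one; the helper name is illustrative, not part of the repo:

```python
import numpy as np

def make_corner_matrix(corners_xy):
    """Stack four (x, y) image corners, ordered top-left-right-bottom,
    into the 3x4 homogeneous matrix used in getBagData.m."""
    pts = np.asarray(corners_xy, dtype=float)
    assert pts.shape == (4, 2), "expected four (x, y) corners"
    # Row 0: x coordinates, row 1: y coordinates, row 2: homogeneous ones.
    return np.vstack([pts.T, np.ones(4)])

# First-target corners from the BagData(2) example:
corners = make_corner_matrix([(340, 236), (263, 313), (406, 341), (316, 417)])
print(corners.shape)  # (3, 4)
```

Each column is one corner in homogeneous image coordinates, which is why every entry of the third row is 1.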

P.S. If you only need to do the calibration once, using bag2mat.py might be faster; if you plan to do it many times, it is recommended to use the LiDARTag package to extract the LiDAR returns.

Take BagData(2) for example: it contains two calibration targets and the bagfile collected for the scene. You will have the following information:

Scene information

BagData(2).bagfile = "lab2-closer.bag"; --> The bagfile you collected for a calibration/validation scene.
BagData(2).num_tag = 2; --> How many calibration targets are in the scene.
BagData(2).lidar_full_scan = "velodyne_points-lab2-full-pc--2019-09-05-23-20.mat"; --> The full point cloud scan of the scene, extracted by bag2mat.py or the LiDARTag package.

First target

BagData(2).lidar_target(1).pc_file = 'velodyne_points-lab2-closer-big--2019-09-05-21-51.mat'; --> LiDAR returns on the first calibration target. You can use the LiDARTag package to extract the LiDAR returns.
BagData(2).lidar_target(1).tag_size = 0.8051; --> The size of the first calibration target.
BagData(2).camera_target(1).corners = [340, 263, 406, 316; 236, 313, 341, 417; 1, 1, 1, 1]; --> Image coordinates of the first calibration target.

Second target

BagData(2).lidar_target(2).pc_file = 'velodyne_points-lab2-closer-small--2019-09-05-21-53.mat'; --> LiDAR returns on the second calibration target. You can use the LiDARTag package to extract the LiDAR returns.
BagData(2).lidar_target(2).tag_size = 0.158; --> The size of the second calibration target.
BagData(2).camera_target(2).corners = [197, 153, 220, 176; 250, 273, 292, 315; 1, 1, 1, 1]; --> Image coordinates of the second calibration target.
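If you are unsure what variables a given .mat file holds (e.g. one produced by bag2mat.py), SciPy can list them. A self-contained sketch; the stand-in file and variable name are assumptions for illustration, with a real bag2mat.py output you would simply call loadmat on that path:

```python
import numpy as np
from scipy.io import savemat, loadmat

# Write a small stand-in file so this sketch runs on its own; replace
# "example.mat" with your bag2mat.py output to inspect the real data.
savemat("example.mat", {"point_cloud": np.zeros((4, 100, 3))})

data = loadmat("example.mat")
# Keys not starting with "__" are the stored variables; print names and shapes.
for name, value in data.items():
    if not name.startswith("__"):
        print(name, getattr(value, "shape", type(value)))
```

Comparing the variable names and array shapes this way is a quick check that your extracted files match the format the package expects.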

from extrinsic_lidar_camera_calibration.

brucejk avatar brucejk commented on September 26, 2024

Hi,

I am going to close this issue. Please feel free to reopen it if you encounter related issues.

from extrinsic_lidar_camera_calibration.

wenboDong0917 avatar wenboDong0917 commented on September 26, 2024

I'm sorry for the late reply; I was busy with other things the past few days. I used the bag2mat.py you provided to extract the .mat files, but they are not exactly the same as the ones you provided: most of them match, and only a few differ. I don't understand the reason; I would be very grateful if you could help me.
[two screenshots comparing the extracted .mat files]

from extrinsic_lidar_camera_calibration.

brucejk avatar brucejk commented on September 26, 2024

Hi,

That depends on how you extracted the point cloud. If you include more points in the patch of the point cloud, the result will be different.

from extrinsic_lidar_camera_calibration.
