
ankitdhall / lidar_camera_calibration


ROS package to find a rigid-body transformation between a LiDAR and a camera for "LiDAR-Camera Calibration using 3D-3D Point correspondences"

Home Page: http://arxiv.org/abs/1705.09785

License: GNU General Public License v3.0

CMake 19.07% C++ 40.63% C 16.74% Makefile 23.56%
camera lidar ros calibration velodyne aruco-markers point-cloud lidar-camera-calibration data-fusion camera-calibration

lidar_camera_calibration's People

Contributors

ankitdhall, karnikram, mananabanana, ruichen-v, xinwf, xmfcx


lidar_camera_calibration's Issues

Cannot detect aruco markers

Hi, thanks for sharing your work. I'm trying to calibrate a Velodyne LiDAR with a Kinect2 camera, but I cannot detect the ArUco markers, which causes the program to get stuck. I tried running the aruco_mapping node on its own, and it doesn't work either.

Why can't the markers be detected? I don't know what's wrong. Please help!
[screenshot attached]

Few questions for setup

Setup: Velodyne and Point Grey Cameras in Stereo setup.

  1. Is there an ideal distance between the camera/LiDAR and the markers? Right now my markers are detected only when they are close (3-4 ft away).

  2. There seem to be different values for the projection matrix in conf/config_file.txt

611.651245 0.0        642.388357 0.0
0.0        688.443726 365.971718 0.0
0.0        0.0        1.0        0.0

and the corresponding data/zed_left_uurmi.ini in aruco_mapping.

projection
6.8230856098675258e+02 0 6.2940856933593750e+02 0 
0 6.8230856098675258e+02 3.8939595031738281e+02 0 
0 0 1 0

Is it intentional or is there something more to this?

Thanks!

The calibration result seems wrong in scale

My camera and LiDAR are placed as follows:
[photo of the LiDAR and camera placement]
But I get this calibration result:

After 100 iterations

Average translation is:
0.095076
-0.649966
0.048822
Average rotation is:
0.993107 -0.0859981 0.079641
0.0558907 0.944694 0.323156
-0.103027 -0.316477 0.942989
Average transformation is:
0.993107 -0.0859981 0.079641 0.095076
0.0558907 0.944694 0.323156 -0.649966
-0.103027 -0.316477 0.942989 0.048822
0 0 0 1
Final rotation is:
0.0804318 -0.993111 0.0852074
0.3232 -0.0548811 -0.944738
0.942906 0.103526 0.31656
Final ypr is:
1.32689
-1.23125
0.316071
Average RMSE is: 0.0731869
RMSE on average transformation is: 0.0755466

The rotation seems right, but the translation is obviously wrong; the translation between the camera and the LiDAR definitely can't be as large as 0.65 m. What is wrong with my usage? I edited config_file.txt and marker_coordinates.txt in lidar_camera_calibration, and zed_left_uurmi.ini and aruco_mapping.launch in aruco_mapping. These are my configuration files:

config_file.txt
marker_coordinates.txt
zed_left_uurmi.txt
aruco_mapping.txt

I'll be thankful for any suggestions to correct my calibration.

calibration process stuck

I followed the instructions and am running the calibration toolbox on Ubuntu 14.04, but the process is stuck at 'Reading CameraInfo from configuration file' and does not continue. I'm not sure whether it's a problem with my settings or whether my laptop is not powerful enough.
By the way, the velodyne and camera nodes are running on another machine; I access them via the ROS master. Thanks!
[screenshot attached]

Stuck after finding both markers

I was able to build the code successfully and followed some of the solutions from other issues. At this point, the program is able to find both ArUco markers in the image, but for some reason it is just stuck there and does not go past it; it keeps printing the locations of both markers to the screen continuously. The output from the program is shown below. One thing that concerns me is the Failed to find match for field 'intensity'. line. I see that an intensity filter is applied to the point cloud, but I'm just running the ROS Velodyne package's VLP16_points.launch file, and in RViz I can see the intensity field under /velodyne_points.

Any help is much appreciated!

NODES
  /
    aruco_mapping (aruco_mapping/aruco_mapping)
    find_transform (lidar_camera_calibration/find_transform)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[aruco_mapping-1]: started with pid [24294]
process[find_transform-2]: started with pid [24296]
[ INFO] [1505776473.556181936]: Calibration file path: /home/shubangsridhar/shub_src/extrinsic_calibration/src/lidar_camera_calibration/conf/ueye-serialnum-4102778863.ini
[ INFO] [1505776473.556269936]: Number of markers: 2
[ INFO] [1505776473.556295936]: Marker Size: 0.226
[ INFO] [1505776473.556308936]: Type of space: plane
[ INFO] [1505776473.556347936]: ROI allowed: 0
[ INFO] [1505776473.556365936]: ROI x-coor: 0
[ INFO] [1505776473.556379936]: ROI y-coor: 0
[ INFO] [1505776473.556398936]: ROI width: 32710
[ INFO] [1505776473.556418936]: ROI height: 4273583
[ INFO] [1505776473.559131936]: Calibration data loaded successfully
Size of the frame: 1600 x 1200
Limits: -4 to 4
Limits: -4 to 4
Limits: -4 to 0
Number of markers: 2
Intensity threshold (between 0.0 and 1.0): 0.02
useCameraInfo: 1
Projection matrix: 
[805.45886, 0, 800.26105, 0;
  0, 805.26483, 603.10297, 0;
  0, 0, 1, 0]
MAX_ITERS: 1
[ INFO] [1505776473.575178936]: Reading CameraInfo from topic
done1
25=(310.637,608.945) (218.48,681.984) (128.23,605.684) (227.578,534.446) Txyz=-1.23591 0.010846 1.72566 Rxyz=-0.379636 1.37063 -1.90313 
582=(899.297,610.282) (838.419,675.12) (771.255,613.907) (832.576,552.011) Txyz=0.0875873 0.0225216 2.00668 Rxyz=-0.804321 1.5226 -2.06702 
[ INFO] [1505776483.425556935]: First marker with ID: 25 detected
[ WARN] [1505776483.429582935]: TF to MSG: Quaternion Not Properly Normalized
[ INFO] [1505776483.491359935]: Camera info received at 1.50578e+09
[ INFO] [1505776483.491411935]: Velodyne scan received at 1.50578e+09
[ INFO] [1505776483.491425936]: marker_6dof received at 1.50578e+09
Failed to find match for field 'intensity'.
25 -1.23591 0.010846 1.72566 -0.379636 1.37063 -1.90313 582 0.0875873 0.0225216 2.00668 -0.804321 1.5226 -2.06702 
[ INFO] [1505776483.493985935]: iteration number: 0

---------Moving on to next marker--------
582=(899.328,610.288) (838.491,675.034) (771.252,613.848) (832.693,552.059) Txyz=0.0877038 0.0224342 2.0062 Rxyz=-0.80573 1.51786 -2.07075 
25=(310.461,608.854) (218.511,681.82) (128.524,605.7) (227.515,534.588) Txyz=-1.23841 0.0108164 1.72915 Rxyz=-0.381727 1.37277 -1.90387 
582=(899.179,610.352) (838.482,675.026) (771.325,613.815) (832.659,552.14) Txyz=0.0877794 0.0225476 2.00906 Rxyz=-0.800976 1.51713 -2.06865 
25=(310.415,608.931) (218.611,681.7) (128.44,605.721) (227.527,534.674) Txyz=-1.24015 0.0108746 1.73166 Rxyz=-0.374344 1.36729 -1.90307 
582=(899.201,610.354) (838.438,674.998) (771.341,613.83) (832.609,551.993) Txyz=0.0877532 0.0224742 2.00929 Rxyz=-0.799631 1.52126 -2.06486 
25=(310.468,608.963) (218.554,681.845) (128.369,605.734) (227.551,534.594) Txyz=-1.23849 0.0109205 1.72927 Rxyz=-0.377402 1.36981 -1.90236 
582=(899.207,610.279) (838.416,675.121) (771.273,613.924) (832.494,551.981) Txyz=0.0875637 0.0225483 2.00808 Rxyz=-0.800433 1.52453 -2.06427 
25=(310.473,608.933) (218.484,681.905) (128.375,605.759) (227.554,534.503) Txyz=-1.23726 0.0108846 1.72747 Rxyz=-0.381968 1.37321 -1.90287 
582=(899.122,610.443) (838.41,675.137) (771.231,613.829) (832.629,552.245) Txyz=0.0875896 0.0227273 2.00819 Rxyz=-0.800913 1.51456 -2.0699 
25=(310.554,608.924) (218.456,681.872) (128.316,605.737) (227.542,534.537) Txyz=-1.2372 0.0108513 1.72742 Rxyz=-0.379603 1.36957 -1.9054 
582=(898.964,610.388) (838.418,675.058) (771.61,613.883) (832.442,552.419) Txyz=0.0879597 0.0229165 2.01592 Rxyz=-0.796683 1.51967 -2.06589 
25=(310.473,608.964) (218.481,681.855) (128.292,605.76) (227.53,534.484) Txyz=-1.23754 0.0108705 1.72781 Rxyz=-0.379375 1.3714 -1.90231 
582=(898.918,610.433) (838.426,675.075) (771.504,613.949) (832.482,552.386) Txyz=0.0878678 0.0229718 2.0153 Rxyz=-0.796319 1.52035 -2.06511 
25=(310.593,608.907) (218.481,681.826) (128.269,605.749) (227.561,534.504) Txyz=-1.23702 0.010798 1.72721 Rxyz=-0.378176 1.36781 -1.90667 
582=(898.723,610.276) (838.547,675.061) (771.739,614.017) (832.356,552.432) Txyz=0.0881584 0.0230551 2.0211 Rxyz=-0.788952 1.52711 -2.05927 
25=(310.568,608.95) (218.42,681.895) (128.269,605.755) (227.593,534.474) Txyz=-1.23647 0.0108444 1.72637 Rxyz=-0.38093 1.37005 -1.90567

[screenshot attached]

Calibration Issues...

Hi,

I am really struggling to get good results. Any help would be much appreciated. @ankitdhall and @karnikram, here are some typical results from the calibration:

Distance between the corners:
0.525242
0.501325
0.519984
0.509987
Num points is:8
Number of points: 8
  2.35791  -0.184031  -0.138072
-0.185836   0.504375 -0.0946709
-0.135297 -0.0896248  0.0302841
Rotation matrix:
  0.999998  0.00105429 -0.00192951
-0.00107425    0.999946   -0.010374
0.00191847    0.010376    0.999944
Rotation in Euler angles:
179.952
-179.903
-179.419
Translation:
0.00544582
 0.0184382
-0.00412293
RMSE: 0.00636279
Rigid-body transformation:
  0.999998  0.00105429 -0.00192951  0.00544582
-0.00107425    0.999946   -0.010374   0.0184382
0.00191847    0.010376    0.999944 -0.00412293
         0           0           0           1
--------------------------------------------------------------------
After 81 iterations
--------------------------------------------------------------------
Average translation is:
 0.0271066
 0.0151644
-0.00466677
Average rotation is:
  0.999998  0.00105429 -0.00192951
-0.00107425    0.999946   -0.010374
0.00191847    0.010376    0.999944
Average transformation is:
  0.999998  0.00105429 -0.00192951   0.0271066
-0.00107425    0.999946   -0.010374   0.0151644
0.00191847    0.010376    0.999944 -0.00466677
         0           0           0           1
Average RMSE is: 0.0141243
RMSE on average transformation is: 0.0228186

The camera and LiDAR are separated by about 40 cm horizontally, but the translation values are at least an order of magnitude smaller than that. I double-checked the intrinsic calibration values and the other config files. Here they are:

Config_file.txt

640 480
-4.5 4.5
-4.0 4.0
-4.0 4.0
0.005
2
0
635.519348 0.0        306.008153 0.0
0.0        634.956848 249.757176 0.0
0.0        0.0        1.0        0.0
100

marker_coordinates.txt

2
50.0
50.0
3.8
0.5
20.6
50
51.6
3.5
0.3
20.6

find_transform.launch

<?xml version="1.0"?>
<launch>
 <!-- <param name="/use_sim_time" value="true"/> -->
 

 <!-- ArUco mapping -->
<node pkg="aruco_mapping" type="aruco_mapping" name="aruco_mapping" output="screen">
   <!--<remap from="/image_raw" to="/frontNear/left/image_raw"/>-->

   <param name="calibration_file" type="string" value="$(find aruco_mapping)/data/liveCam.ini" />
   <param name="num_of_markers" type="int" value="2" />
   <param name="marker_size" type="double" value="0.206"/>
   <param name="space_type" type="string" value="plane" />
   <param name="roi_allowed" type="bool" value="false" />


 </node>


 <rosparam command="load" file="$(find lidar_camera_calibration)/conf/lidar_camera_calibration.yaml" />
 <node pkg="lidar_camera_calibration" type="find_transform" name="find_transform" output="screen">
 </node>

 <node pkg="lidar_input" type="lidar_input_node" name="lidar_input_node" output="screen">
 </node>

 <node pkg="camera_input" type="camera_input_node" name="camera_input_node" output="screen">
 </node>
</launch>

Thank you for all your help so far; I really appreciate it. If you have any insight into why this might not be working, please let me know as soon as possible.

Thanks!

find_transform.launch doesn't continue after printing "Reading CameraInfo from topic"

I am running the launch file on a rosbag that I had collected. After a while, it outputs "Messages of type 0 arrived out of order (will print only once)". The window with the ArUco markers, where we have to select the corners of the board, doesn't even pop up.
I am able to echo the topics and see the LiDAR and camera data in RViz as well.

I am using ROS Indigo on Ubuntu 14.04.
config_file.txt
marker_coordinates.txt
The hardware I am using is a VLP-16 and a Point Grey Chameleon 3.
[screenshot attached]

Coordinate system of rotation and translation?

Can you tell me the coordinate system of the translation/LiDAR? Thanks for all the help, by the way! I have gotten the package to run to completion. Unfortunately, I don't think the translation results are accurate. Any suggestions?

No Point cloud

I am not able to get any point cloud from the LiDAR when find_transform is launched, although I am able to visualize the LiDAR in RViz.

The filtered point cloud was not displayed

I use rosbag play to publish the topics needed. The markers are detected by the aruco_mapping package, as shown in the mono8 window, but the filtered point cloud window is not displayed, so I could not mark the line segments. Is anything wrong with how I publish my rosbag? How should I publish the topics?
Thank you!

Points.txt number of points

The README says that the file points.txt should contain a number at the beginning of the file that corresponds to num_of_sensors*(num_of_markers*points_per_board). I looked at the code, and it seems that this number should just be (num_of_markers*points_per_board). For the example targets in the README, the number of points would then be 8 rather than the suggested 16. Is this correct?
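For example, with 2 markers and 4 corner points per board, num_of_markers*points_per_board = 2 × 4 = 8, while num_of_sensors*(num_of_markers*points_per_board) = 2 × 8 = 16. The points.txt posted later in this thread starts with 8 on its first line and is followed by 16 coordinate rows (which would be 8 per sensor), so the smaller number seems consistent with what the code expects, though I may be misreading it.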

Met Poor Result..

I ran with the correct intrinsic camera parameters, marker configuration file, and fixed launch files, but the result is quite poor (RMSE ≈ 0.37).

[screenshot of the result]

It seems there is some error in how we are running this package, but I can't find it...

I would really appreciate it if anyone could help with our calibration.
Below are our configuration files, launch file, and the rosbag file from our experiment.

config_file.zip
lidar_camera_calibration.zip
marker_coordinates.zip
find_transform.zip
zed_left_uurmi.zip
https://drive.google.com/open?id=0B_ZpibQh0KW6NjYwOTJKUEp1aUU
(rosbag file)

Kabsch or ICP

In your paper, you mention results for the ICP (iterative approximation) and Kabsch (closed-form solution) algorithms.

  1. Why does ICP outperform Kabsch in the RMSE? Is the error in Kabsch caused by averaging the rotation matrices?

  2. Does the code use the ICP or the Kabsch algorithm? Is there a way of switching between the two?

Thanks!
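For context, here is a minimal Eigen sketch of the closed-form Kabsch step asked about above (an illustration of the general algorithm, not the package's exact code): given corresponding 3D corner points from the two sensors, it recovers R and t from an SVD of the cross-covariance matrix.

#include <Eigen/Dense>

// Estimate R, t such that camera_i ≈ R * lidar_i + t (Kabsch, i.e. Umeyama without scale).
void kabsch(const Eigen::Matrix3Xd& lidar, const Eigen::Matrix3Xd& camera,
            Eigen::Matrix3d& R, Eigen::Vector3d& t)
{
    // Center both point sets on their centroids.
    const Eigen::Vector3d mu_l = lidar.rowwise().mean();
    const Eigen::Vector3d mu_c = camera.rowwise().mean();
    const Eigen::Matrix3Xd L = lidar.colwise() - mu_l;
    const Eigen::Matrix3Xd C = camera.colwise() - mu_c;

    // Cross-covariance and its SVD.
    const Eigen::Matrix3d H = L * C.transpose();
    Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);

    // Correct for a possible reflection (determinant -1).
    Eigen::Matrix3d D = Eigen::Matrix3d::Identity();
    D(2, 2) = (svd.matrixV() * svd.matrixU().transpose()).determinant() > 0 ? 1.0 : -1.0;

    R = svd.matrixV() * D * svd.matrixU().transpose();  // rotation LiDAR -> camera
    t = mu_c - R * mu_l;                                 // translation LiDAR -> camera
}

On the first question: averaging rotation matrices over several runs does not, in general, yield a valid rotation matrix, which could plausibly contribute to the RMSE difference, though that is only a guess.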

marker_6dof.h: No such file or directory

[ 21%] Linking CXX shared library /home/shivam/catkin_ws/devel/lib/libaruco.so
[ 21%] Built target aruco
/home/shivam/catkin_ws/src/lidar_camera_calibration/src/find_velodyne_points.cpp:34:50: fatal error: lidar_camera_calibration/marker_6dof.h: No such file or directory
compilation terminated.
lidar_camera_calibration/CMakeFiles/find_transform.dir/build.make:62: recipe for target 'lidar_camera_calibration/CMakeFiles/find_transform.dir/src/find_velodyne_points.o' failed
make[2]: *** [lidar_camera_calibration/CMakeFiles/find_transform.dir/src/find_velodyne_points.o] Error 1
make[2]: *** Waiting for unfinished jobs....
CMakeFiles/Makefile2:2999: recipe for target 'lidar_camera_calibration/CMakeFiles/find_transform.dir/all' failed
make[1]: *** [lidar_camera_calibration/CMakeFiles/find_transform.dir/all] Error 2
Makefile:138: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j8 -l8" failed

Fatal error: Assertion failed

Hello,
the calibration process died after 74 iterations. The displayed error is shown below.
Are you familiar with this error, and could you please explain what it means and how it is caused?
[FATAL] [1511892092.578293398]: ASSERTION FAILED
file = .../src/lidar_camera_calibration/include/lidar_camera_calibration/Find_RT.h
line = 277
cond = marker_info.size()/7 == num_of_marker_in_config
Thanks !

Lidar_RGB_calibration

Hey, if I want to calibrate an RGB camera with the Velodyne VLP-16, how should I go about it?

Would it be convenient to leave me a contact so we can discuss?

lidar_camera_calibration_rt topic has no publisher

I think I have found the reason why it is hanging, but I don't know how to fix it. The callback function in find_velodyne_points.cpp subscribes to the topic "lidar_camera_calibration_rt", but this topic has no publisher when I run $ rostopic info /lidar_camera_calibration_rt. I am assuming the topic /aruco_poses should publish to it, but that topic currently publishes a completely different type (aruco_mapping/ArucoMarker). Any thoughts?

How to set initial rotation?

Hello, I want to ask how to set the initial rotation between the camera and the LiDAR.
First of all, is the rotation from camera to LiDAR, or from LiDAR to camera?
Second, if the x-axis of the LiDAR is pointing right, does this mean that rot_x is 0 because the camera's x-axis is also pointing right?
@karnikram
@ankitdhall
Thanks

flipped camera points

We have a testing setup as shown in the image and got some results from it. I evaluated the result using the MATLAB code provided, but it seems the camera points in the result are flipped. Our Velodyne on the car is pointing downwards by around 6 degrees, and the camera is pointing down by around 15 degrees. I subscribed to the already-rectified image topic. What could be the reason for this result?
averaged transformation T = [ 0.907442 -0.377166 0.185188 -0.656001
-0.372771 -0.926022 -0.0593794 -0.212374
0.193884 -0.0151491 -0.980908 3.65862
0 0 0 1
];
[evaluation plot and setup photo attached]

catkin_make issues

Hi,

For ROS Kinetic, it seems that most tutorials describe a very simple procedure: copy the packages into catkin_ws/src and run catkin_make,

as shown here
https://github.com/eYSIP-2017/eYSIP-2017_Navigation-in-Indoor-Environments-using-drone/wiki/Setup-ArUco-Mapping-package
and
https://github.com/warp1337/ros_aruco

Unfortunately, after long hours of trying, this still leads to a missing configuration file for "aruco_msgs" or "aruco".

Following
#49
and installing aruco, aruco-ros, aruco-msgs, and aruco-mapping from apt, i.e. sudo apt install ros-kinetic-aruco, works,

but once these packages are installed from apt, compiling another library leads to issues.
We are trying to compile
https://github.com/ankitdhall/lidar_camera_calibration
and it leads to a missing configuration file for "lidar_camera_calibration" when running catkin_make, yet the build works fine without that folder in the src folder.

Anyone have a similar experience?

We tried building the non-aruco libraries from source with catkin_make, and it was successful. Our feeling is that something goes wrong with the aruco builds: we can't build them from source, only after installing them from apt.

Thanks,

Distortion matrix

Does the lidar-camera-calibration package take the camera's intrinsic distortion matrix as input?

aruco_mapping takes the distortion matrix as input. I wonder whether an imprecision factor is introduced by not including the distortion matrix in lidar_camera_calibration.
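A minimal sketch of one possible workaround (my assumption, not something the package documents): undistort the camera images with OpenCV before they enter the pipeline, so that the ideal pinhole projection matrix in config_file.txt stays valid. The intrinsics below reuse the fx/cx values quoted earlier in this thread; the distortion coefficients are hypothetical placeholders.

#include <opencv2/opencv.hpp>

cv::Mat undistortFrame(const cv::Mat& raw)
{
    // Camera matrix (fx, fy, cx, cy taken from the config_file.txt quoted earlier).
    const cv::Mat K = (cv::Mat_<double>(3, 3) << 611.651245, 0.0, 642.388357,
                                                 0.0, 688.443726, 365.971718,
                                                 0.0, 0.0, 1.0);
    // Hypothetical distortion coefficients (k1, k2, p1, p2, k3) -- substitute your own.
    const cv::Mat D = (cv::Mat_<double>(1, 5) << -0.2, 0.05, 0.0, 0.0, 0.0);

    cv::Mat rectified;
    cv::undistort(raw, rectified, K, D);  // remove lens distortion from the image
    return rectified;
}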

Directory structure for installation.

In your instructions, you say:

Clone this repository to your machine.
Put the cloned repository, dependencies/aruco_ros and dependencies/aruco_mapping folders in path/to/your/ros/workspace/src and run the following commands,

So I ran the following commands:

# Move two dependencies folders under the src
cd ~
git clone https://github.com/ankitdhall/lidar_camera_calibration.git
cd lidar_camera_calibration/
cp -r dependencies/ ./../catkin_ws/src/

# Move entire git folder under the src 
cd ~
cp -r lidar_camera_calibration/ ./catkin_ws/src/

Then I ran catkin_make -DCATKIN_WHITELIST_PACKAGES="aruco_ros;aruco_mapping" in ~/catkin_ws/.

Is this correct?

Overestimating the size of targets

Have you found that the algorithm slightly overestimates the side lengths of the targets, by about 1 cm or so? It's causing the calibration to be off.

Use Average transformation or Final Rotation matrix?

In order to directly map raw LiDAR points to the camera frame, should I use the average transformation matrix, or should I append the translation vector to the final rotation matrix and use that instead?
Thanks
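For reference, a minimal Eigen sketch of how the printed 4x4 average transformation would be applied to a raw LiDAR point (assuming it maps LiDAR coordinates into the camera frame; the direction is my assumption, not confirmed by the package):

#include <Eigen/Dense>

// Map a LiDAR point into the camera frame with the 4x4 rigid-body transformation
// (rotation in the top-left 3x3 block, translation in the last column).
Eigen::Vector3d lidarToCamera(const Eigen::Matrix4d& T_cam_lidar,
                              const Eigen::Vector3d& p_lidar)
{
    const Eigen::Vector4d p_h(p_lidar.x(), p_lidar.y(), p_lidar.z(), 1.0);  // homogeneous point
    const Eigen::Vector4d p_cam = T_cam_lidar * p_h;                        // rotate and translate
    return p_cam.head<3>();
}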

Build Issues

I had the following issues when building your code and I thought it might be useful for you to know:

  • Problem: fatal error: pcap.h: No such file or directory, compilation terminated. Solution: sudo apt-get install libpcap0.8-dev

  • Problem: In Corners.cpp, ros::package could not be found. Solution: insert the following line: #include <ros/package.h>

Cloud window is not visible

Hi,
Thanks for providing this module for lidar-camera calibration.
The issue I have is that when I roslaunch this module, the cloud and polygon windows don't show up.
I'm not using a Velodyne LiDAR but another LiDAR which has 40 rings, as opposed to the 16 rings that you use.
Is there something specific about the Velodyne LiDAR that the code assumes?

Please let me know.

Thanks,
Sumanth

fx and cx in config_file.txt

Hello,
could you please explain the meaning and values of the projection matrix in config_file.txt:
fx 0 cx 0
0 fy cy 0
0 0 1 0

611.651245 0.0 642.388357 0.0
0.0 688.443726 365.971718 0.0
0.0 0.0 1.0 0.0
Thanks!
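For reference, these entries have the layout of a standard 3x4 pinhole projection matrix, where fx, fy are the focal lengths in pixels and cx, cy the principal point (this is the usual convention; I have not verified it against the package's code). A minimal sketch of how such a matrix maps a 3D point in the camera frame to pixel coordinates:

#include <Eigen/Dense>

// Project a 3D point P = (X, Y, Z), expressed in the camera frame, to pixel coordinates.
Eigen::Vector2d project(double fx, double fy, double cx, double cy,
                        const Eigen::Vector3d& P)
{
    const double u = fx * P.x() / P.z() + cx;  // horizontal pixel coordinate
    const double v = fy * P.y() / P.z() + cy;  // vertical pixel coordinate
    return Eigen::Vector2d(u, v);
}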

Why can I detect the edges of the two rectangular boards, but the 'polygon' window is still in an invalid state?

I am also stuck with the program constantly refreshing the TF of the two markers. I can see points in the "cloud" window, but the "polygon" window displays nothing. I paste my log information below and am looking forward to any reply, thanks!

[screenshot attached]
SUMMARY

PARAMETERS

  • /aruco_mapping/calibration_file: /home/molmc-chy/v...
  • /aruco_mapping/marker_size: 0.18
  • /aruco_mapping/num_of_markers: 2
  • /aruco_mapping/roi_allowed: False
  • /aruco_mapping/roi_height: 480
  • /aruco_mapping/roi_width: 752
  • /aruco_mapping/roi_x: 0
  • /aruco_mapping/roi_y: 0
  • /aruco_mapping/space_type: plane
  • /lidar_camera_calibration/camera_frame_topic: /camera/image_raw
  • /lidar_camera_calibration/camera_info_topic: /camera/camera_info
  • /lidar_camera_calibration/velodyne_topic: /velodyne_points
  • /rosdistro: indigo
  • /rosversion: 1.11.21

NODES
/
aruco_mapping (aruco_mapping/aruco_mapping)
find_transform (lidar_camera_calibration/find_transform)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[aruco_mapping-1]: started with pid [9424]
process[find_transform-2]: started with pid [9425]
[ INFO] [1513839635.890962802]: Calibration file path: /home/molmc-chy/velodyne_ws/src/aruco_mapping/data/ptgrey.ini
[ INFO] [1513839635.891108217]: Number of markers: 2
[ INFO] [1513839635.891158264]: Marker Size: 0.18
[ INFO] [1513839635.891188261]: Type of space: plane
[ INFO] [1513839635.891222934]: ROI allowed: 0
[ INFO] [1513839635.891251538]: ROI x-coor: 0
[ INFO] [1513839635.891282672]: ROI y-coor: 0
[ INFO] [1513839635.891312241]: ROI width: 32665
[ INFO] [1513839635.891342511]: ROI height: 4272766
[ INFO] [1513839635.893254134]: Calibration data loaded successfully
Size of the frame: 752 x 480
Limits: -1 to 1.5
Limits: -1 to 1
Limits: 0 to 2.3
Number of markers: 2
Intensity threshold (between 0.0 and 1.0): 0.0004
useCameraInfo: 0
Projection matrix:
[722.22699, 0, 396.83157, 0;
0, 724.25177, 245.65433, 0;
0, 0, 1, 0]
MAX_ITERS: 100
[ INFO] [1513839635.928687068]: Reading CameraInfo from configuration file
66=(136.628,271.375) (100.5,319.508) (49.3703,284.37) (86.3264,235.886) Txyz=-0.91036 0.0955786 2.16889 Rxyz=-0.441642 1.79648 -1.78922
77=(403.858,266.719) (359.723,305.623) (320.718,261.157) (365.832,222.415) Txyz=-0.104221 0.0555665 2.22151 Rxyz=-0.630651 1.64157 -1.6905
[ INFO] [1513839636.272981332]: First marker with ID: 66 detected
[ WARN] [1513839636.277726860]: TF to MSG: Quaternion Not Properly Normalized
66=(136.723,271.402) (100.399,319.546) (49.3407,284.313) (86.2181,235.797) Txyz=-0.908639 0.0953415 2.1645 Rxyz=-0.44495 1.80124 -1.78419
77=(403.909,266.643) (359.656,305.696) (320.753,261.148) (365.746,222.378) Txyz=-0.104228 0.0554753 2.22053 Rxyz=-0.634499 1.63693 -1.6988
[ INFO] [1513839636.387703679]: Velodyne scan received at 1.51384e+09
[ INFO] [1513839636.387768020]: marker_6dof received at 1.51384e+09

Initial Rot 0.000796274 -0.999999 -0.000796274
0 0.000796274 -1
1 0.000796274 6.34053e-07
66 -0.91036 0.0955786 2.16889 -0.441642 1.79648 -1.78922 77 -0.104221 0.0555665 2.22151 -0.630651 1.64157 -1.6905
[ INFO] [1513839636.391943552]: iteration number: 0

---------Moving on to next marker--------
66=(136.766,271.302) (100.316,319.534) (49.3688,284.327) (86.2015,235.876) Txyz=-0.909787 0.0954205 2.16722 Rxyz=-0.441765 1.79419 -1.79198
77=(403.852,266.67) (359.709,305.584) (320.792,261.157) (365.715,222.271) Txyz=-0.104259 0.0553625 2.22151 Rxyz=-0.630496 1.63518 -1.69853
66=(136.598,271.351) (100.45,319.472) (49.4021,284.353) (86.3635,235.782) Txyz=-0.910094 0.0953978 2.16819 Rxyz=-0.446826 1.79635 -1.79403
77=(403.905,266.596) (359.696,305.592) (320.723,261.163) (365.805,222.383) Txyz=-0.104179 0.0554385 2.22051 Rxyz=-0.627335 1.64939 -1.6817
66=(136.666,271.409) (100.374,319.469) (49.3761,284.363) (86.2708,235.826) Txyz=-0.910144 0.0954931 2.16811 Rxyz=-0.443119 1.79708 -1.78829
77=(403.794,266.645) (359.525,305.734) (320.872,261.134) (365.687,222.247) Txyz=-0.104203 0.0553382 2.2188 Rxyz=-0.614036 1.61741 -1.70352
66=(136.612,271.359) (100.438,319.458) (49.407,284.363) (86.324,235.876) Txyz=-0.910964 0.0955674 2.17024 Rxyz=-0.442622 1.79515 -1.7918
77=(403.818,266.673) (359.722,305.599) (320.775,261.138) (365.78,222.356) Txyz=-0.104262 0.0554456 2.22226 Rxyz=-0.628077 1.63286 -1.69834
66=(136.723,271.406) (100.434,319.426) (49.4147,284.36) (86.3251,235.868) Txyz=-0.910746 0.0955728 2.16997 Rxyz=-0.438658 1.79505 -1.78661
77=(403.828,266.642) (359.678,305.608) (320.782,261.121) (365.769,222.334) Txyz=-0.104261 0.055387 2.22166 Rxyz=-0.627092 1.6328 -1.69756
66=(136.729,271.341) (100.553,319.447) (49.526,284.382) (86.3537,235.758) Txyz=-0.909343 0.0953369 2.16702 Rxyz=-0.446929 1.80043 -1.7902
77=(403.936,266.628) (359.617,305.587) (320.764,261.054) (365.798,222.433) Txyz=-0.104355 0.0554368 2.22316 Rxyz=-0.644325 1.64833 -1.69197
66=(136.815,271.371) (100.532,319.432) (49.4913,284.398) (86.4279,235.801) Txyz=-0.909968 0.0954647 2.16877 Rxyz=-0.439982 1.79382 -1.79069
77=(403.882,266.616) (359.656,305.628) (320.807,261.104) (365.783,222.364) Txyz=-0.104229 0.0553983 2.22183 Rxyz=-0.63101 1.63568 -1.69703
66=(136.788,271.348) (100.529,319.442) (49.5416,284.419) (86.3848,235.854) Txyz=-0.910355 0.0955456 2.16963 Rxyz=-0.441614 1.795 -1.79218
77=(403.867,266.593) (359.677,305.699) (320.839,261.138) (365.731,222.415) Txyz=-0.104268 0.055483 2.22259 Rxyz=-0.632272 1.628 -1.70697
66=(136.791,271.344) (100.528,319.444) (49.626,284.319) (86.4911,235.815) Txyz=-0.910125 0.0954218 2.16938 Rxyz=-0.447536 1.79623 -1.79413
77=(403.825,266.688) (359.698,305.662) (320.783,261.141) (365.748,222.356) Txyz=-0.104261 0.0554683 2.22167 Rxyz=-0.62955 1.62726 -1.705

Met poor performance

The data in points.txt is as follows:
8
-0.56811 -0.473328 1.79659
-0.213204 -0.134026 1.7816
-0.543555 0.218787 1.78348
-0.915342 -0.132713 1.8172
0.145415 -0.450564 1.79215
0.56126 -0.191263 1.78309
0.289001 0.229009 1.78225
-0.113636 -0.0443698 1.79478
-0.485469 -0.327932 1.47909
-0.144692 0.0051091 1.53701
-0.471128 0.350685 1.47055
-0.811904 0.0176443 1.41263
0.124684 -0.314695 1.50046
0.524897 -0.0503131 1.51868
0.260503 0.350273 1.51351
-0.139709 0.0858913 1.49529
It seems that the points' coordinates from the camera are quite different from those of the LiDAR.
At the same time, my calibration result does not seem correct, because my camera is right next to the LiDAR.
(
Average transformation is:
0.996523 0.00304858 -0.0832566 0.176845
-0.00350322 0.99998 -0.0053152 0.124334
0.0832387 0.00558839 0.996514 -0.28119
0 0 0 1
)
Can you help me figure out what may be wrong, and how to make the result more accurate?

velodyne_msgs error

CMake Warning at /opt/ros/kinetic/share/catkin/cmake/catkinConfig.cmake:76 (find_package):
Could not find a package configuration file provided by "velodyne_msgs"
with any of the following names:

velodyne_msgsConfig.cmake
velodyne_msgs-config.cmake

Add the installation prefix of "velodyne_msgs" to CMAKE_PREFIX_PATH or set
"velodyne_msgs_DIR" to a directory containing one of the above files. If
"velodyne_msgs" provides a separate development package or SDK, be sure it
has been installed.
Call Stack (most recent call first):
lidar_camera_calibration/CMakeLists.txt:12 (find_package)

-- Could not find the required component 'velodyne_msgs'. The following CMake error indicates that you either need to install the package with the same name or change your environment so that it can be found.
CMake Error at /opt/ros/kinetic/share/catkin/cmake/catkinConfig.cmake:83 (find_package):
Could not find a package configuration file provided by "velodyne_msgs"
with any of the following names:

velodyne_msgsConfig.cmake
velodyne_msgs-config.cmake

Add the installation prefix of "velodyne_msgs" to CMAKE_PREFIX_PATH or set
"velodyne_msgs_DIR" to a directory containing one of the above files. If
"velodyne_msgs" provides a separate development package or SDK, be sure it
has been installed.
Call Stack (most recent call first):
lidar_camera_calibration/CMakeLists.txt:12 (find_package)

-- Configuring incomplete, errors occurred!
See also "/home/shivam/catkin_ws/build/CMakeFiles/CMakeOutput.log".
See also "/home/shivam/catkin_ws/build/CMakeFiles/CMakeError.log".
Makefile:290: recipe for target 'cmake_check_build_system' failed
make: *** [cmake_check_build_system] Error 1
Invoking "make cmake_check_build_system" failed

lidar input node

Do you happen to have two nodes/code that publish the lidar and camera data to the topics:

camera_frame_topic: /frontNear/left/image_raw
velodyne_topic: /velodyne_points

Thanks for all the help and the quick responses!

Sample Data

I was trying to get this running but I don't have any test data to work with at the moment. Would you mind uploading some please?

setup requirements

I just started looking for some good calibration toolboxes. I'm wondering whether there are any requirements for the ArUco markers, and where I should look for the best choice of marker. Also, does this toolbox work for both stereo and monocular cameras? Does it work for multiple cameras (say, more than 2)? Thanks!

how to make 100 iterations?

Hi ankitdhall, may I ask how I can modify your code to run 100 iterations? I got a very poor result with only one iteration, and according to your paper it seems the accuracy can be improved a lot by increasing the number of iterations.
Thanks in advance.
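For what it's worth, my reading of the sample configs posted in this thread (not confirmed by the maintainers) is that the last line of conf/config_file.txt is the iteration count, echoed in the console output as MAX_ITERS. The config from the "Calibration Issues" report above, for example, ends like this:

635.519348 0.0        306.008153 0.0
0.0        634.956848 249.757176 0.0
0.0        0.0        1.0        0.0
100

Logs elsewhere in this thread print MAX_ITERS: 100 or MAX_ITERS: 1, which appears to track that final number, so changing it to 100 may be all that is needed.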

how to understand calibration result

Hi, I tried this calibration toolbox twice and got two different answers for the same camera-Velodyne setup. Our Velodyne is located ~5 cm above the camera, and the camera is pointing 15 degrees downwards. Here are the results we got.
Second test:
After 100 iterations

Average translation is:
-0.220729
-0.273677
4.82629
Average rotation is:
0.905049 0.419485 0.0701358
0.40611 -0.901351 0.15047
0.126337 -0.107699 -0.986124
Average transformation is:
0.905049 0.419485 0.0701358 -0.220729
0.40611 -0.901351 0.15047 -0.273677
0.126337 -0.107699 -0.986124 4.82629
0 0 0 1
Average RMSE is: 0.325415
RMSE on average transformation is: 0.331159

First test:
After 100 iterations

Average translation is:
0.422593
0.831942
3.90745
Average rotation is:
0.900951 0.36973 -0.227127
0.22701 -0.847701 -0.479447
-0.369801 0.380398 -0.84767
Average transformation is:
0.900951 0.36973 -0.227127 0.422593
0.22701 -0.847701 -0.479447 0.831942
-0.369801 0.380398 -0.84767 3.90745
0 0 0 1
Average RMSE is: 0.346004
RMSE on average transformation is: 0.353328

Is there any documentation on how to read these results? I visualized the rotation matrices with ROS tf, but neither of them seems correct. Do you have any suggestions for what I should do to get good results? Will this toolbox work for a sensor setup like ours? By the way, our camera has a 100-degree FoV and the Velodyne has 16 beams. I really appreciate your help!
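As a sanity check (my own helper, not part of the package), the printed 3x3 rotation matrix can be converted to yaw/pitch/roll with Eigen and compared against the physical mounting, e.g. a camera pitched roughly 15 degrees down:

#include <cmath>
#include <iostream>
#include <Eigen/Dense>

int main()
{
    Eigen::Matrix3d R;
    R << 0.905049,  0.419485,  0.0701358,   // "Average rotation" from the second test above
         0.40611,  -0.901351,  0.15047,
         0.126337, -0.107699, -0.986124;

    // ZYX (yaw about Z, pitch about Y, roll about X) Euler angles, in degrees.
    const Eigen::Vector3d ypr = R.eulerAngles(2, 1, 0) * 180.0 / M_PI;
    std::cout << "yaw/pitch/roll [deg]: " << ypr.transpose() << std::endl;
    return 0;
}

Note that eulerAngles() constrains its first angle to [0, pi], so an equivalent angle triplet may look different from what you expect; this is only meant as a rough plausibility check.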

How to use average translation and YPR angles with ROS from LIDAR and Camera data fusion

Hi,
I ran the package and got the results shown below.
[screenshot of the calibration results]

I would like to visualize the LiDAR point clouds in the ZED camera frame, so I do a static coordinate transformation with this command:

rosrun tf static_transform_publisher -0.09122 -0.04545 -0.0432 1.48563 -1.5026 0.13215 zed_left_camera velodyne 100

The first three parameters are x, y, z for the translation, and the last three are the yaw, pitch, roll angles.

However, the result does not look right, as shown below.

[screenshot attached]
What am I doing wrong? Is there another way to use the 4x4 transformation matrix in ROS?
Below are my files and rosbag:
zed_left_uurmi.ini
config_file.txt

marker_coordinates.txt
Rosbag : https://drive.google.com/open?id=1HKqOUeIacSARwwUcCV6CU83fjTWr3OwI
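One alternative I would try (a sketch under the assumption that the estimated transform maps points from the velodyne frame into the zed_left_camera frame; the frame names are just taken from the command above): broadcast the full 4x4 matrix as a static TF from C++, which avoids hand-converting it to yaw/pitch/roll and getting the Euler convention or argument order wrong.

#include <ros/ros.h>
#include <tf2_ros/static_transform_broadcaster.h>
#include <geometry_msgs/TransformStamped.h>
#include <Eigen/Dense>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "lidar_camera_static_tf");
    ros::NodeHandle nh;

    Eigen::Matrix4d T = Eigen::Matrix4d::Identity();  // paste the averaged 4x4 transformation here

    const Eigen::Matrix3d R = T.topLeftCorner<3, 3>();
    const Eigen::Quaterniond q(R);                    // rotation as a quaternion

    geometry_msgs::TransformStamped msg;
    msg.header.stamp = ros::Time::now();
    msg.header.frame_id = "zed_left_camera";  // parent frame (assumption, matching the command above)
    msg.child_frame_id  = "velodyne";         // child frame (assumption)
    msg.transform.translation.x = T(0, 3);
    msg.transform.translation.y = T(1, 3);
    msg.transform.translation.z = T(2, 3);
    msg.transform.rotation.x = q.x();
    msg.transform.rotation.y = q.y();
    msg.transform.rotation.z = q.z();
    msg.transform.rotation.w = q.w();

    tf2_ros::StaticTransformBroadcaster broadcaster;
    broadcaster.sendTransform(msg);
    ros::spin();
    return 0;
}

If the point cloud still ends up in the wrong place, the transform may be defined in the opposite direction; inverting T (or swapping the frames) is the first thing I would check.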

Update: here is my setup.
[setup photo attached]

Does it work with other lidar types?

I tried using another LiDAR device and updated the velodyne topic in lidar_camera_calibration (https://github.com/ankitdhall/lidar_camera_calibration/blob/master/conf/lidar_camera_calibration.yaml#L4). I am getting the following error:

[ INFO] [1522777695.370229735]: First marker with ID: 10 detected
[ WARN] [1522777695.382916640]: TF to MSG: Quaternion Not Properly Normalized
[ INFO] [1522777695.432211706]: Velodyne scan received at 1.52278e+09
[ INFO] [1522777695.432235629]: marker_6dof received at 1.52278e+09
Failed to find match for field 'ring'.

Initial Rot 0.000796274    -0.999999 -0.000796274
           0  0.000796274           -1
           1  0.000796274  6.34053e-07
10=(1107.69,1012.23) (1024.24,1102.09) (935.85,1021.21) (1020.12,931.427) Txyz=0.111969 0.849601 2.85558 Rxyz=-0.620609 1.73495 -1.76494 
20=(1498.92,1023.35) (1407.57,1114.76) (1310.73,1023.5) (1400.71,932.817) Txyz=0.816052 0.817916 2.70597 Rxyz=-0.78969 1.8706 -1.74944 
10=(1107.74,1012.18) (1024.28,1102.05) (935.82,1021.17) (1020.11,931.481) Txyz=0.111959 0.849404 2.85501 Rxyz=-0.620971 1.73789 -1.76304 
20=(1498.92,1023.39) (1407.6,1114.73) (1310.82,1023.57) (1400.71,932.755) Txyz=0.816282 0.818081 2.70652 Rxyz=-0.78935 1.86932 -1.7504 
[find_transform-2] process has died [pid 18752, exit code -11, cmd /.../devel/lib/lidar_camera_calibration/find_transform __name:=find_transform __log:=/cyngn_log/9a3596ec-3763-11e8-bf5e-1c1b0d9bb6f8/find_transform-2.log].
log file: /log/9a3596ec-3763-11e8-bf5e-1c1b0d9bb6f8/find_transform-2*.log

I am able to calibrate when using a Velodyne LiDAR. The only change I made that causes this error is using a different LiDAR and a different topic name. Both publish PointCloud2 messages.
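For context, a minimal sketch of the Velodyne-style point layout that the Failed to find match for field 'ring'. message hints at (my assumption, modelled on velodyne_pointcloud's PointXYZIR type, not code from this package): a PointCloud2 whose fields do not include a per-point ring index cannot be converted into such a type.

#define PCL_NO_PRECOMPILE
#include <cstdint>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>

// A Velodyne-like point: XYZ, intensity, and the laser ring (row) index.
struct PointXYZIR
{
    PCL_ADD_POINT4D;                 // float x, y, z (plus padding)
    float intensity;                 // laser return intensity
    std::uint16_t ring;              // ring index, 0 .. (number of lasers - 1)
    EIGEN_MAKE_ALIGNED_OPERATOR_NEW
} EIGEN_ALIGN16;

POINT_CLOUD_REGISTER_POINT_STRUCT(PointXYZIR,
    (float, x, x)(float, y, y)(float, z, z)
    (float, intensity, intensity)
    (std::uint16_t, ring, ring))

Whether this package strictly requires the ring field I cannot tell from the error alone, but the crash right after that message makes the missing field the first thing I would rule out.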

No response - no windows - no errors

Hello, I'm trying to run the calibration but no window pops up. The node launches, but the last info message is Reading CameraInfo from configuration file, and then nothing happens.

Using ROS Indigo with OpenCV 3.4.1

Any clues?

Here is the output:

Size of the frame: 1024 x 512
Limits: -0.56 to 0.5
Limits: 2.7 to 3.2
Limits: -0.99 to -0.43
Number of markers: 2
Intensity threshold (between 0.0 and 1.0): 0.0004
useCameraInfo: 0
Projection matrix:
[551.29785, -0.27212, 501.52975, 0;
0, 551.16797, 248.17503, 0;
0, 0, 1, 0]
MAX_ITERS: 10
[ INFO] [1528900343.889889061]: Reading CameraInfo from configuration file

Do you analyse scale also?

Hi.

When using a VLP-16 and a Logitech C930 positioned close to each other, objects appear with different sizes in the two sensors. Also, when calculating the scale from the transformation matrix or projecting the RGB pixels onto the point cloud, the scale difference is visible, as seen in the figure with the two ArUco cardboards.

[image attached]

Is it correct that scale information is not considered, or is the problem probably somewhere else?

When I run the program, I've been stuck in this position

Thank you for all the work you have done!
When I run the program, I get stuck at this point:

exbot@robot:~/seg_ws$ roslaunch lidar_camera_calibration find_transform.launch
... logging to /home/exbot/.ros/log/1389abea-d635-11e7-978d-b8975ac33022/roslaunch-robot-19991.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://robot:39300/

SUMMARY

PARAMETERS

  • /aruco_mapping/calibration_file: /home/exbot/seg_w...
  • /aruco_mapping/marker_size: 0.207
  • /aruco_mapping/num_of_markers: 2
  • /aruco_mapping/roi_allowed: False
  • /aruco_mapping/space_type: plane
  • /lidar_camera_calibration/camera_frame_topic: /camera/rgb/image...
  • /lidar_camera_calibration/camera_info_topic: /camera/rgb/camer...
  • /lidar_camera_calibration/velodyne_topic: /velodyne_points
  • /rosdistro: indigo
  • /rosversion: 1.11.21

NODES
/
aruco_mapping (aruco_mapping/aruco_mapping)
find_transform (lidar_camera_calibration/find_transform)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[aruco_mapping-1]: started with pid [20009]
process[find_transform-2]: started with pid [20010]
[ INFO] [1512098234.389068869]: Calibration file path: /home/exbot/seg_ws/src/aruco_mapping/data/zed_left_uurmi.ini
[ INFO] [1512098234.389137636]: Number of markers: 2
[ INFO] [1512098234.389169334]: Marker Size: 0.207
[ INFO] [1512098234.389186190]: Type of space: plane
[ INFO] [1512098234.389215743]: ROI allowed: 0
[ INFO] [1512098234.389235967]: ROI x-coor: 0
[ INFO] [1512098234.389251696]: ROI y-coor: 0
[ INFO] [1512098234.389265392]: ROI width: 32576
[ INFO] [1512098234.389278377]: ROI height: 4273583
[ INFO] [1512098234.390848820]: Calibration data loaded successfully
Size of the frame: 640 x 480
Limits: -1.5 to 1.5
Limits: -1.5 to 1.5
Limits: 0 to 3
Number of markers: 2
Intensity threshold (between 0.0 and 1.0): 0.01
useCameraInfo: 0
Projection matrix:
[541.53827, 0, 309.19846, 0;
0, 541.88391, 234.94804, 0;
0, 0, 1, 0]
MAX_ITERS: 100
[ INFO] [1512098234.446563774]: Reading CameraInfo from configuration file
100=(240.526,224.147) (280.594,264.826) (240.282,304.936) (201.141,263.979) Txyz=-0.248869 0.106907 1.96283 Rxyz=-0.619671 -1.69843 1.76172
180=(376.31,345.552) (337.306,305.144) (377.607,265.207) (416.413,306.024) Txyz=0.247798 0.257583 1.98183 Rxyz=-1.46249 0.537823 -0.655972
[ INFO] [1512098238.688796955]: First marker with ID: 100 detected
[ WARN] [1512098238.691812183]: TF to MSG: Quaternion Not Properly Normalized
100=(240.47,224.216) (280.5,264.74) (240.568,304.809) (201.005,264.215) Txyz=-0.24944 0.107319 1.96808 Rxyz=-0.636058 -1.71376 1.75153
180=(376.51,345.975) (337.219,305.229) (377.612,265.202) (416.801,306.303) Txyz=0.247346 0.257336 1.9751 Rxyz=-1.48229 0.544808 -0.652409
100=(240.562,224.118) (280.593,264.826) (240.25,304.968) (200.947,264.038) Txyz=-0.248917 0.106914 1.96229 Rxyz=-0.631458 -1.703 1.76491
180=(376.744,346.239) (337.357,305.162) (377.665,265.115) (416.894,306.355) Txyz=0.247381 0.256815 1.97104 Rxyz=-1.50036 0.526474 -0.67013
100=(240.655,224.12) (280.512,264.907) (240.344,304.874) (201.089,264.175) Txyz=-0.249104 0.107202 1.96539 Rxyz=-0.627884 -1.70095 1.76401
180=(376.921,346.28) (337.231,305.378) (377.628,265.109) (417.063,306.319) Txyz=0.247126 0.256677 1.96811 Rxyz=-1.49488 0.53991 -0.662613
100=(240.656,224.044) (280.517,264.902) (240.285,304.933) (201.053,263.942) Txyz=-0.248732 0.106777 1.96154 Rxyz=-0.618724 -1.70001 1.76226
180=(376.567,345.822) (337.283,305.167) (377.644,265.17) (416.95,306.326) Txyz=0.247888 0.257515 1.9779 Rxyz=-1.48503 0.558162 -0.640999
100=(240.553,224.223) (280.53,264.854) (240.194,305.09) (200.858,264.014) Txyz=-0.249112 0.107109 1.96236 Rxyz=-0.634358 -1.70198 1.7685

It looks like nothing new is happening. Why?

cross_sensor_calibration/launch does not exist...

Hi,

Really cool ROS package! I like the approach using AR tags. I built the package but I am having issues running it. The directory cross_sensor_calibration/launch does not exist. I was wondering if you could add this or clarify where the new launch file is. Please note that I am fairly new to ROS so I am not an expert in launching ROS nodes.

Thanks!

Should launch `aruco_mapping.launch` after "done1" msg.?

I changed the config_file.txt, lidar_camera_calibration.yaml, and aruco_mapping.launch files and executed roslaunch lidar_camera_calibration find_transform.launch --screen.

[screenshot attached]

However, it gets stuck after the "done1" message.

Nothing else happens, so I launched roslaunch aruco_mapping.launch manually in another terminal.

Is this correct?

How can I evaluate my calibration results?

I saw that you said you got a result with an error of about 1-2 cm. How did you measure the true position so that you could compare it with your calibration result?
