
carlosmccosta / dynamic_robot_localization


Point cloud registration pipeline for robot localization and 3D perception

License: BSD 3-Clause "New" or "Revised" License

CMake 1.01% C++ 97.34% Shell 1.56% C 0.09%
robot-localization localization robotics mapping pcl lidar pointcloud-registration tracking 3d-perception icp

dynamic_robot_localization's People

Contributors

carlosmccosta


dynamic_robot_localization's Issues

catkin_make error

Hi Carlos Miguel,
I have a problem with catkin_make; below is the result:
In file included from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:10:0:
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp: In instantiation of ‘void dynamic_robot_localization::PlaneSegmentation<PointT>::filter(const typename pcl::PointCloud<PointT>::Ptr&, typename pcl::PointCloud<PointT>::Ptr&) [with PointT = pcl::PointXYZRGBNormal; typename pcl::PointCloud<PointT>::Ptr = boost::shared_ptr<pcl::PointCloud<pcl::PointXYZRGBNormal> >]’:
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:19:1: required from here
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:124:157: error: call of overloaded ‘transformPointCloudWithNormals(pcl::PointCloud<pcl::PointXYZRGBNormal>&, pcl::PointCloud<pcl::PointXYZRGBNormal>&, const Eigen::internal::inverse_impl<Eigen::Matrix<float, 4, 4> >)’ is ambiguous
pcl::transformPointCloudWithNormals(*convex_hull_for_projected_plane_inliers, *convex_hull_for_projected_plane_inliers, centroid_matrix.inverse());
^
/home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:124:157: note: candidates are:
In file included from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/plane_segmentation.h:26:0,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:9,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:10:
/home/zcb/catkin_ws_drl/devel_release/include/pcl-1.8/pcl/common/transforms.h:145:3: note: void pcl::transformPointCloudWithNormals(const pcl::PointCloud<PointT>&, pcl::PointCloud<PointT>&, const Eigen::Affine3f&, bool) [with PointT = pcl::PointXYZRGBNormal; Eigen::Affine3f = Eigen::Transform<float, 3, 2>]
transformPointCloudWithNormals (const pcl::PointCloud<PointT> &cloud_in,
^
In file included from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/plane_segmentation.h:26:0,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/include/dynamic_robot_localization/cloud_filters/impl/plane_segmentation.hpp:9,
from /home/zcb/drl/src/dynamic_robot_localization-kinetic-devel/src/cloud_filters/plane_segmentation.cpp:10:
/home/zcb/catkin_ws_drl/devel_release/include/pcl-1.8/pcl/common/transforms.h:318:3: note: void pcl::transformPointCloudWithNormals(const pcl::PointCloud<PointT>&, pcl::PointCloud<PointT>&, const Eigen::Matrix4f&, bool) [with PointT = pcl::PointXYZRGBNormal; Eigen::Matrix4f = Eigen::Matrix<float, 4, 4>]
transformPointCloudWithNormals (const pcl::PointCloud<PointT> &cloud_in,
^
make[2]: *** [dynamic_robot_localization-kinetic-devel/CMakeFiles/drl_cloud_filters.dir/src/cloud_filters/plane_segmentation.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
make[1]: *** [dynamic_robot_localization-kinetic-devel/CMakeFiles/drl_cloud_filters.dir/all] Error 2
make: *** [all] Error 2
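The compiler reports an ambiguity because centroid_matrix.inverse() is a lazy Eigen expression that converts equally well to both the Eigen::Affine3f and Eigen::Matrix4f overloads of transformPointCloudWithNormals. A minimal sketch of one possible workaround (an assumption, not necessarily the fix applied in the repository): evaluate the inverse into a concrete Eigen::Matrix4f before the call.

```cpp
#include <Eigen/Core>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/common/transforms.h>

// Sketch of a possible workaround: evaluating the lazy inverse expression into a
// concrete Eigen::Matrix4f makes the Matrix4f overload the unique exact match.
void transformHullBack(pcl::PointCloud<pcl::PointXYZRGBNormal>& hull,
                       const Eigen::Matrix4f& centroid_matrix) {
    Eigen::Matrix4f centroid_matrix_inverse = centroid_matrix.inverse();  // force evaluation
    pcl::transformPointCloudWithNormals(hull, hull, centroid_matrix_inverse);
}
```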

Failed build dynamic_robot_localization

Greetings Carlos,

After cloning your repository and executing install.bash, I get the following error at the end:

Errors << dynamic_robot_localization:make /home/warehouse/catkin_ws_drl/logs_release/dynamic_robot_localization/build.make.002.log
make[2]: *** No rule to make target 'CMakeFiles/drl_common.dir/build'. Stop.
make[1]: *** [CMakeFiles/drl_common.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
make: *** [all] Error 2
cd /home/warehouse/catkin_ws_drl/build_release/dynamic_robot_localization; catkin build --get-env dynamic_robot_localization | catkin env -si /usr/bin/make --jobserver-fds=6,7 -j; cd -
...............................................................................
Failed << dynamic_robot_localization:make [ Exited with code 2 ]
Failed <<< dynamic_robot_localization [ 2.0 seconds ]

[build] Successful packages:

[Successful] PCL
[Successful] laserscan_to_pointcloud
[Successful] mesh_to_pointcloud
[ Ignored] octomap_mapping
[Successful] octomap_server
[Successful] pcl_conversions
[Successful] pcl_msgs
[Successful] pcl_ros
[ Ignored] perception_pcl
[Successful] pose_to_tf_publisher

[build] Failed packages:

[ Failed] dynamic_robot_localization

[build] Summary: 8 of 9 packages succeeded.
[build] Ignored: 2 packages were skipped or are blacklisted.
[build] Warnings: None.
[build] Abandoned: None.
[build] Failed: 1 packages failed.
[build] Runtime: 20.5 seconds total.

I'm running it on Kinetic, Ubuntu 16.04.
Any ideas of what might be causing the problem?

Thanks,
Juan Mesa

Robot jumps sometimes | slam

Hi Carlos,

Sometimes during localization, the robot jumps very far away. Is that because I need to restrict the box search further to avoid that?

Also, is it possible to update the map during navigation? I have a very dynamic obstacle, and I think I need to refresh my map continuously.

build with ros-noetic

Hi Carlos,

Thank you for providing this good package, but I have a problem when building the package with ros-noetic.
I have installed your customized PCL library, but I still get this error. Do you have any idea?

[ 30%] Building CXX object dynamic_robot_localization/CMakeFiles/drl_common.dir/src/common/registration_visualizer.cpp.o
In file included from /usr/include/boost/detail/endian.hpp:9,
                 from /usr/local/include/pcl-1.8/pcl/PCLPointCloud2.h:11,
                 from /usr/local/include/pcl-1.8/pcl/pcl_base.h:55,
                 from /usr/local/include/pcl-1.8/pcl/registration/registration.h:45,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/registration_visualizer.h:41,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:39,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/usr/include/boost/predef/detail/endian_compat.h:11:161: note: #pragma message: The use of BOOST_*_ENDIAN and BOOST_BYTE_ORDER is deprecated. Please include <boost/predef/other/endian.h> and use BOOST_ENDIAN_*_BYTE instead
   11 | ORDER is deprecated. Please include <boost/predef/other/endian.h> and use BOOST_ENDIAN_*_BYTE instead")
      |                                                                                                       ^

In file included from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp: In instantiation of ‘bool dynamic_robot_localization::RegistrationVisualizer<PointSource, PointTarget>::setRegistration(pcl::Registration<PointSource, PointTarget>&) [with PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal]’:
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:52:1:   required from here
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: error: no matching function for call to ‘pcl::Registration<pcl::PointXYZRGBNormal, pcl::PointXYZRGBNormal, float>::registerVisualizationCallback(std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>&)’
   70 |  registration.registerVisualizationCallback(this->update_visualizer_);
      |  ^~~~~~~~~~~~
In file included from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/registration_visualizer.h:41,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:39,
                 from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/usr/local/include/pcl-1.8/pcl/registration/registration.h:380:7: note: candidate: ‘template<class FunctionSignature> bool pcl::Registration<PointSource, PointTarget, Scalar>::registerVisualizationCallback(boost::function<FunctionSignature>&) [with FunctionSignature = FunctionSignature; PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal; Scalar = float]’
  380 |       registerVisualizationCallback (boost::function<FunctionSignature> &visualizerCallback)
      |       ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/usr/local/include/pcl-1.8/pcl/registration/registration.h:380:7: note:   template argument deduction/substitution failed:
In file included from /home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/home/rentao/wheel/develop/tiradar/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: note: std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>’ is not derived from ‘boost::function<Signature>’
   70 |  registration.registerVisualizationCallback(this->update_visualizer_);
      |  ^~~~~~~~~~~~
make[2]: *** [dynamic_robot_localization/CMakeFiles/drl_common.dir/build.make:193: dynamic_robot_localization/CMakeFiles/drl_common.dir/src/common/registration_visualizer.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:7231: dynamic_robot_localization/CMakeFiles/drl_common.dir/all] Error 2
make: *** [Makefile:141: all] Error 2
Invoking "make -j8 -l8" failed

Mismatch between 6DoF tracking achieved locally and the reference video with same data set

Hi,
I am trying to reproduce 6DoF tracking as in Video 6: Free fly test with Kinect in the ETHZ RGB-D dataset using the 6 DoF localization system.
After configuring the workspace and installing all packages, I executed the following commands from the terminal.
$ source ~/catkin_ws_drl/devel_release/setup.bash
$ cd /catkin_ws_drl/src/dynamic_robot_localization_tests/launch/environments/asl/bags
$ roslaunch ethzasl_kinect_dataset_high_complexity_slow_fly_movement.launch

Once RViz came up, I switched to the terminal and pressed the space bar.
Now, tracking gets visualized in RViz.
However, it does not look like the reference video (https://www.youtube.com/watch?v=UslDiUkm7wE).

The output I got is shared here: https://youtu.be/8FZZhAnhFvs
What could be the reason for this mismatch?

To make the reference map visible in RViz, I set reference_pointcloud_update_mode to "FullIntegration" for the unless="$(arg use_slam)" case in the ethzasl_kinect_dataset.launch file.

I am also getting a set of warnings and errors in the terminal, as below.
[ WARN] [1615551179.209201725, 1298112281.916861620]: No transform between frames map_ground_truth_corrected and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551179.213807044, 1298112281.916861620]: No transform between frames map and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551179.222639773, 1298112281.916861620]: No transform between frames map_drl and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.
[ WARN] [1615551179.231428189, 1298112281.916861620]: No transform between frames odom and base_link available after 1298112281.916862 seconds of waiting. This warning only prints once.

[ WARN] [1615551190.198542742, 1298112287.954772391]: Discarded cloud because localization couldn't be calculated
[ WARN] [1615551190.198679600, 1298112287.954772391]: Failed point cloud processing with error [PointCloudFilteringFailed]
[ WARN] [1615551191.853380178, 1298112289.606053195]: Failed point cloud processing with error [PointCloudFilteringFailed]
[ WARN] [1615551191.886966564, 1298112289.636235797]: Discarded cloud because localization couldn't be calculated
[ WARN] [1615551191.886997556, 1298112289.636235797]: Failed point cloud processing with error [PointCloudFilteringFailed]
[ERROR] [1615551191.919287026, 1298112289.666394334]: Lost tracking!

In initial attempts, I got warnings related to the euclidean_transformation_validator parameters, like:
[ WARN] [1615551455.183423693, 1298112283.032384128]: EuclideanTransformationValidator rejected new pose at time 1298112283.032384128 due to max_outliers_percentage ->
correction translation: 3.12924 | (max_transformation_distance: 4)
correction rotation: 2.03959 | (max_transformation_angle: 0.1)
new pose diff translation: 3.12924 | (max_new_pose_diff_distance: 3.9)
new pose diff rotation: 2.03959 | (max_new_pose_diff_angle: 3.57)
root_mean_square_error: 0.0333355 | (max_root_mean_square_error: 0.05)
root_mean_square_error_reference_pointcloud: 0 | (max_root_mean_square_error_reference_pointcloud: -1)
outliers_percentage: 0.925824 | (max_outliers_percentage: 0.6)
outliers_percentage_reference_pointcloud: 0 | (max_outliers_percentage_reference_pointcloud: -1)
inliers_angular_distribution: 2 | (min_inliers_angular_distribution: 0.125)
outliers_angular_distribution: -2 | (max_outliers_angular_distribution: 0.875)
[pcl::IterativeClosestPointWithNormals::computeTransformation] Not enough correspondences found. Relax your threshold parameters.

To fix this, I modified values in ../dynamic_robot_localization/yaml/configs/pose_tracking/cluttered_environments_dynamic_large_map_3d.yaml as below:
transformation_validators:
  euclidean_transformation_validator:
    max_transformation_angle: 3.20
    max_transformation_distance: 4.0
    max_new_pose_diff_angle: 3.57
    max_new_pose_diff_distance: 3.90
    max_root_mean_square_error: 0.05
    max_outliers_percentage: 1.00

Please guide me to resolve this.

My configuration is:

  • dynamic_robot_localization: melodic-devel
  • dynamic_robot_localization_tests: hydro-devel
  • Ubuntu 18.04
  • Intel® Core™ i7-10810U CPU @ 1.10GHz × 12
  • Quadro P520/PCIe/SSE2 Graphics
  • 16 GB RAM
  • 512 GB HDD

How do you get the correct TF?

Whenever you find the TF of the lidar frame using ICP:
icp.getFinalTransformation(), getting the matrix and converting it to a ROS TF, how do you then proceed to publish the map-to-odom TF? Is it like this?

  1. final_transformation_as_tf = final_transformation_as_tf (the one obtained from the matrix) * map_to_base_tf2 (from listening to the TF);
  2. Then getting the relative TF: relative_tf2 = sensor_to_base_TF.inverseTimes(odom_to_base_TF)
  3. Finally final_transformation_as_tf = final_transformation_as_tf * relative_tf2;

Is that correct?
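For reference, a minimal sketch of the usual frame composition with tf2 (all names are illustrative assumptions; this does not claim to be the package's internal implementation): with the lidar pose in the map frame coming from ICP, the map -> odom correction is obtained by composing map -> base with the inverse of odom -> base.

```cpp
#include <tf2/LinearMath/Transform.h>

// Sketch: compute the map -> odom transform from an ICP result expressed as the
// pose of the lidar in the map frame (map_from_laser). Names are illustrative.
tf2::Transform computeMapToOdom(const tf2::Transform& map_from_laser,    // from icp.getFinalTransformation()
                                const tf2::Transform& laser_from_base,   // sensor mounting (TF lookup)
                                const tf2::Transform& odom_from_base) {  // wheel odometry (TF lookup)
    // map -> base = (map -> laser) composed with (laser -> base)
    tf2::Transform map_from_base = map_from_laser * laser_from_base;
    // map -> odom = (map -> base) composed with the inverse of (odom -> base)
    return map_from_base * odom_from_base.inverse();
}
```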

Localization failure | Generation of deb files

Hi,

I decided to try out the default configuration of the guardian localization example. I am only using the front lidar and therefore edited the launch file a bit (just the lidar area).

First, the localization was not able to find the correspondence points, and I thought this was due to the global localization problem. After I provided the initial pose, I get:

[ INFO] [1569164427.212248087]: Received initial pose at time [1569164427.180871518]: 
	Translation -> [ x: -1.34668 | y: -0.0939072 | z: 0 ]
	Rotation -> [ qx: 0 | qy: 0 | qz: 0.055193 | qw: 0.998476 ]
[pcl::RandomSampleConsensus::computeModel] No samples could be selected!
[pcl::SampleConsensusModel::getSamples] Can not select 0 unique points out of 2!
[pcl::RandomSampleConsensus::computeModel] No samples could be selected!
[pcl::IterativeClosestPoint::computeTransformation] Not enough correspondences found. Relax your threshold parameters.
[ WARN] [1569164428.699073609]: Discarded cloud because localization couldn't be calculated
[ WARN] [1569164428.699200539]: Failed point cloud processing with error [FailedInitialPoseEstimation]
[ WARN] [1569164429.367911121]: Pose to TF publisher reached timeout for last valid pose
	TF translation -> [ x: -1.34668 | y: -0.0939072 | z: 0 ]
	TF orientation -> [ qx: 0 | qy: 0 | qz: 0.055193 | qw: 0.998476 ]
	Current time: 1569164429.367611740
	Last pose time: 1569164427.180871518
	Last pose arrival time: 1569164427.268565237
[ WARN] [1569164430.468016846]: Pose to TF publisher reached timeout for last valid pose
	TF translation -> [ x: -1.34668 | y: -0.0939072 | z: 0 ]
	TF orientation -> [ qx: 0 | qy: 0 | qz: 0.055193 | qw: 0.998476 ]
	Current time: 1569164430.467624702
	Last pose time: 1569164427.180871518
	Last pose arrival time: 1569164427.268565237
[ WARN] [1569164430.611402094]: The /scan observation buffer has not been updated for 1.31 seconds, and it should be updated every 1.20 seconds.
[pcl::IterativeClosestPoint2D::computeTransformation] Not enough correspondences found. Relax your threshold parameters.
[ WARN] [1569164430.830316015]: EuclideanTransformationValidator rejected new pose at time 1569164430.830252524 due to root_mean_square_error -> 
	 correction translation: 0 | (max_transformation_distance: 0.5)
	 correction rotation: 0 | (max_transformation_angle_: 0.7)
	 new pose diff translation: 0 | (max_new_pose_diff_distance_: 0.9)
	 new pose diff rotation: 0 | (max_new_pose_diff_angle_: 1.1)
	 root_mean_square_error: 1.79769e+308 | (max_root_mean_square_error_: 0.1)
	 root_mean_square_error_reference_pointcloud: 0 | (max_root_mean_square_error_reference_pointcloud_: -1)
	 outliers_percentage: 1 | (max_outliers_percentage_: 0.85)
	 outliers_percentage_reference_pointcloud: 0 | (max_outliers_percentage_reference_pointcloud_: -1)
	 inliers_angular_distribution: 2 | (min_inliers_angular_distribution_: 0.07)
	 outliers_angular_distribution: -2 | (max_outliers_angular_distribution_: 0.875)
[ WARN] [1569164430.830735712]: Discarded cloud because localization couldn't be calculated
[ WARN] [1569164430.830845610]: Failed point cloud processing with error [FailedPoseEstimation]

Is that really the parameter choice?

Also, I have noticed that the point cloud produced by laserscan_to_pointcloud (/guardian/lasers) works fine, but no node subscribes to it, so I am not sure of its purpose. The other point cloud produced by the drl node (ambient_pointcloud) works fine.

Here is another warning:

[ WARN] [1569224041.915014225]: EuclideanTransformationValidator rejected new pose at time 1569224041.914963026 due to max_new_pose_diff_angle -> 
	 correction translation: 0.0208703 | (max_transformation_distance: 0.5)
	 correction rotation: 0.00947034 | (max_transformation_angle_: 0.7)
	 new pose diff translation: 0.0208703 | (max_new_pose_diff_distance_: 0.9)
	 new pose diff rotation: 0.00947034 | (max_new_pose_diff_angle_: 1.1)
	 root_mean_square_error: 0.0307908 | (max_root_mean_square_error_: 0.1)
	 root_mean_square_error_reference_pointcloud: 0 | (max_root_mean_square_error_reference_pointcloud_: -1)
	 outliers_percentage: 0.348315 | (max_outliers_percentage_: 0.85)
	 outliers_percentage_reference_pointcloud: 0 | (max_outliers_percentage_reference_pointcloud_: -1)
	 inliers_angular_distribution: 2 | (min_inliers_angular_distribution_: 0.07)
	 outliers_angular_distribution: -2 | (max_outliers_angular_distribution_: 0.875)

I am not sure why this warning appears, as max_new_pose_diff_angle is higher than the new pose diff rotation of 0.00947034.

drl build linking error

I have installed:
ros-kinetic-laserscan-to-pointcloud_1.0.0-0xenial_amd64.deb
ros-kinetic-pcl_1.8.1-0xenial_amd64.deb
ros-kinetic-pose-to-tf-publisher_1.0.0-0xenial_amd64.deb

Then I build drl and get the following error:
[ 98%] Linking CXX executable devel/lib/dynamic_robot_localization/drl_localization_node
devel/lib/libdrl_localization.so: undefined reference to ‘pcl::search::Search<pcl::PointXYZRGBNormal>::getName() const’
collect2: error: ld returned 1 exit status

What can I do to fix this problem?

ROS2 Support

Hello, Carlos

I am very surprised by your package's performance.

I want to use this package on ROS2 as well.

Is there any way to use it?

Or do you have plans to support ROS2 for this package?

Pointclouds get dropped

Thank you for your great work with this package; it looks really promising!
I recorded a 3D point cloud map of our environment beforehand, and now I'd like to localize our robot in said map.
Therefore I wanted to use the dynamic_robot_localization_system_6dof.launch launch file. But when starting it, after the map is loaded and the global pose estimate is published, all the new point clouds get dropped, because:

[ WARN] [1660297010.079226327]: Dropping pointcloud because tf between base_link and odom isn't available
Failed to find match for field 'rgb'.
Failed to find match for field 'normal_x'.
Failed to find match for field 'normal_y'.
Failed to find match for field 'normal_z'.
Failed to find match for field 'curvature'.

Is that a problem because of the point cloud type I use? I am using a Robosense lidar and can choose between the XYZIRT and XYZI types, so there is no option for rgb... Am I still able to use this package somehow?

Selecting RoI

Hi,

If I have predefined static objects at specific locations of the occupancy map, is it possible to select a region of interest for the ICP? So instead of trying to match the whole visible map, it would only try to match the selected RoI of the map.

How to start

Hello
I'm trying to use this package, but it's a little bit overwhelming!
Let's suppose I have a rosbag with point cloud and IMU data; what would be the first steps to generate a map and use it for navigation? Thank you

Lidar in pointcloud localization

Hi, I would like to perform localization of a lidar that is moving in a space that has also been recorded as a PCL point cloud. I checked the big config file and I haven't seen any option for map upload. Where can I find a good launch file for this? Maybe it would be better to use the dynamic_robot_localization_tests package?
Regards

process died

Hi Carlos,

I am implementing this project with a Stage simulation environment. But I found that when I launched all the nodes, the system would get stuck and die after a few minutes.

Here are the messages in the terminal.
Screenshot from 2019-11-04 13-31-20

I hope you can offer your help; it would be appreciated.

**[URGENT]** What is the result of the 3dof localization?

Hello @carlosmccosta, thank you for the open source.

I checked out some of the issues and the readme, but I am not very clear about the 3dof localization.
I had a few questions.
I have a bag file attached here, and I changed the launch file to this and also the yaml file; then I ran the following commands:

1. roscore
2. roslaunch dynamic_robot_localization dynamic_robot_localization_system.launch
3. rosrun map_server map_server map_default.yaml
4. rosbag play --clock expt6.bag
5. rviz

I have attached the map yaml and pgm here. I am clueless as to what the output should be: I have the ground-truth pose of the robot, so which topic should I compare it to (when I run the above commands) so that I know how good the 3dof localization is?
What should be the expected RViz output?
Let me know if any more information is required.
Any help/suggestion is appreciated. Waiting for your early reply.
Thank you

Compile the project once

Compiling on ARM takes a lot of time, especially PCL! What is the best way to compile PCL once and for all and save its binaries and libraries so I can simply transfer them to another ARM machine? I tried to simply build PCL independently and then install its binaries to some folder and use that, but I get an error at:

-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
CMake Error at /opt/ros/melodic/share/dynamic_reconfigure/cmake/dynamic_reconfigure-macros.cmake:80 (add_custom_target):
  add_custom_target cannot create target "pcl_ros_gencfg" because another
  target with the same name already exists.  The existing target is a custom
  target created in source directory "/home/abyl/right-wheel/src/laser_icp".
  See documentation for policy CMP0002 for more details.
Call Stack (most recent call first):
  perception_pcl/pcl_ros/CMakeLists.txt:70 (generate_dynamic_reconfigure_options)

2D Lidar and AMCL

As far as I understood, this localization uses NDT-MCL. How does it compare with a traditional AMCL package?

Also, as the package seems to focus on 3D lidars, does it make sense to use this package with only a 2D lidar?

And what is the reason for using PCL instead of laser scan data for 2D measurements?
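For context, a minimal sketch of how a 2D LaserScan is commonly converted into a point cloud message in ROS, using the generic laser_geometry utility (the repository ships its own laserscan_to_pointcloud package for this purpose, so this is only an illustration of the general idea, not that package's implementation):

```cpp
#include <laser_geometry/laser_geometry.h>
#include <sensor_msgs/LaserScan.h>
#include <sensor_msgs/PointCloud2.h>

// Project a 2D laser scan into a PointCloud2 message that PCL-based registration can consume.
sensor_msgs::PointCloud2 scanToCloud(const sensor_msgs::LaserScan& scan) {
    laser_geometry::LaserProjection projector;
    sensor_msgs::PointCloud2 cloud;
    projector.projectLaser(scan, cloud);
    return cloud;
}
```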

2-D global (3 DoF) localization questions

Hi Carlos!

First of all, thanks for sharing your research work, which is quite fascinating.

I've successfully installed from source and recently I have been looking through your code. I am trying to apply your algorithm to my own 2-D laser data to accomplish 3 DoF (position_x, position_y, angle_yaw) localization tasks. However, I have a few questions which I hope you could clarify for me:

  1. Since I'm trying to implement the 2-D global (3 DoF) localization, I'm wondering whether I should use guardian_localization.launch or dynamic_robot_localization_system.launch?
  2. I've built maps of the environment with the 2-D SLAM algorithms Hector-SLAM and Google Cartographer, obtaining occupancy grid maps (OGMs) of type (.pgm | .yaml) and (.pbstream), respectively. Can I use the obtained OGMs as the reference maps in the localization tasks, or do I have to transform the OGMs into point cloud maps first? (One common conversion is sketched after this list.)
  3. Could you detail the steps to implement the 2-D global (3 DoF) localization on my own 2-D laser data? I checked the Usage section but still have no idea how to do this.
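Regarding question 2, a minimal sketch of one common way to turn an occupancy grid into a planar point cloud (the helper name and the occupancy threshold are illustrative assumptions; this does not claim to be how the package itself loads reference maps):

```cpp
#include <nav_msgs/OccupancyGrid.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

// Turn occupied cells of a 2D occupancy grid into a z = 0 point cloud.
// Note: the grid origin orientation is ignored here for simplicity.
pcl::PointCloud<pcl::PointXYZ>::Ptr occupancyGridToCloud(const nav_msgs::OccupancyGrid& grid,
                                                         int8_t occupied_threshold = 65) {
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>());
    for (unsigned int y = 0; y < grid.info.height; ++y) {
        for (unsigned int x = 0; x < grid.info.width; ++x) {
            const int8_t cell = grid.data[y * grid.info.width + x];
            if (cell >= occupied_threshold) {  // occupied cell -> one point at its center
                pcl::PointXYZ point;
                point.x = grid.info.origin.position.x + (x + 0.5f) * grid.info.resolution;
                point.y = grid.info.origin.position.y + (y + 0.5f) * grid.info.resolution;
                point.z = 0.0f;
                cloud->push_back(point);
            }
        }
    }
    return cloud;
}
```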

By the way, I wrongly posted issue #17; you can delete it. Sorry about this.

Thanks a lot and looking forward to your early reply!

Best Regards,
William

the angle unit

Hi Carlos,

I am configuring the yaml file; can I know what the unit of "max_new_pose_diff_angle" is?

Is it degrees (out of 360°) or radians (out of 2π)?

Best regards
Sun Rentao

error: registration.registerVisualizationCallback(this->update_visualizer_);

error: no matching function for call to ‘pcl::Registration<pcl::PointXYZRGBNormal, pcl::PointXYZRGBNormal, float>::registerVisualizationCallback(std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>&)’
registration.registerVisualizationCallback(this->update_visualizer_);

Using sparse and noisy PointCloud2

Hi,

I am using mmWave radars from TI (IWR6843AOP) and wonder which method from the DRL package would be best suited for performing localization.

Suppose that I have made a PCL map of the environment (which is also sparse and noisy, as the map will be made with the same radars) and want to localize; would you suggest the ICP method or NDT?

Node doesn't start due to invalid time?

Hi,

I'm trying to use your package for a SLAM project, but I'm running into a weird issue when launching the localization node.
The drl_localization_node starts up but remains stuck in a loop, printing the message "Waiting for valid time...", which corresponds to this loop:

Screenshot from 2021-03-10 23-24-45

Have you ever run into this problem before? If so, could you be so kind as to provide some information to help solve it?

Thanks!

Issues installing on Kinetic

Running on Ubuntu 15.10, Linux version 4.2.0-42-generic.

1) I have cloned the repo in ~/
2) Ran bash install.sh

There are a couple of warnings about deprecated functions, then the error shown below:
screen shot 2017-03-16 at 5 09 17 pm

Can you please help me with the install?

Thank you

Initial pose and odometry reuse

Hi,

In order to use this package in a real production environment, I have a couple of questions:

  1. When the initial pose is given, the package still seems to try to compute the localization. Is it possible for the package to trust the initial pose and just use it as a reliable starting position, without needing to initialize the localization?
  2. If the odometry of the robot is very accurate and the environment is dynamic, is it possible to keep publishing the TF (map -> odom) whenever DRL loses the track?

P.S.: I know that I can comment out the yaml file for the initial pose, but when it loses the track, the TF is not published.

Laser to PCL conversion

Hi Carlos,

Sorry for asking so many questions, but I just find your library fascinating, and I have been learning from and reading your code since last week. Also, thank you for taking the time to answer my questions; I really appreciate it :)

The question I have is about the datatypes you are using for the ICP algorithm.
From your code (correct me if I am wrong):

  1. You put the reference cloud in a struct called CorrespondenceTable.
  2. Any laser input is converted to a PCLPointCloud2 object, from here:
    http://docs.pointclouds.org/1.7.0/_p_c_l_point_cloud2_8h_source.html
  3. When it is time to converge, you translate the reference CorrespondenceTable object into a PCLPointCloud2 and do the ICP convergence.
  4. What happened to pcl::PointXYZ as the datatype for the ICP algorithm, as in the tutorial here (a minimal version is sketched after this list)? http://pointclouds.org/documentation/tutorials/iterative_closest_point.php
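For reference (question 4), a minimal sketch of the plain PCL ICP usage from the linked tutorial, with pcl::PointXYZ as the point type; this is the generic PCL API, not the package's own pipeline, which, judging by the build logs in other issues, works with pcl::PointXYZRGBNormal clouds:

```cpp
#include <Eigen/Core>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/registration/icp.h>

int main() {
    pcl::PointCloud<pcl::PointXYZ>::Ptr source(new pcl::PointCloud<pcl::PointXYZ>());
    pcl::PointCloud<pcl::PointXYZ>::Ptr target(new pcl::PointCloud<pcl::PointXYZ>());
    // ... fill source and target with points before running ICP ...

    pcl::IterativeClosestPoint<pcl::PointXYZ, pcl::PointXYZ> icp;
    icp.setInputSource(source);
    icp.setInputTarget(target);

    pcl::PointCloud<pcl::PointXYZ> aligned;
    icp.align(aligned);

    if (icp.hasConverged()) {
        // Transformation that maps points from the source frame into the target frame.
        Eigen::Matrix4f transform = icp.getFinalTransformation();
        (void)transform;
    }
    return 0;
}
```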

Turn on/off the tracking

Hi,

I want to combine AMCL and ICP to get a more accurate approach localization. My idea is to use AMCL and turn off tracking_2d from DRL while navigating, and then, when the goal distance is less than an X threshold, turn off AMCL and turn on DRL for the final approach.

Is there a simple way to turn off DRL completely (tracking_2d) and turn it back on with an initial position?

cmake error when compiling pcl

Hi, thanks for your work.

I encountered an error when building the PCL part; it seems to be related to the CMake version? There is a compiled-from-source pcl-1.9.1 on my system with which everything works well. I am running ROS Kinetic on 16.04. Do you have any idea what might be the problem?
Screenshot from 2019-11-04 22-52-37

implementation for hash storage of map

Hi Carlos,

As the code for your library is quite substantial, can you please point to the location where you use a hash table to store the map (reference cloud) against which you perform ICP and get faster convergence than a k-d tree implementation?

Also, I saw your version of ICP in the PCL fork, but how different is the PCL ICP implementation from this one (if we use the k-d tree implementation)?
https://github.com/AndreaCensi/csm

HSV to RGB

Dear Carlos,

Thank you for your work on this package.
I pulled the newest noetic branch, and I get this error:
In file included from /home/genius/ros_ws/src/dynamic_robot_localization/src/outlier_detectors/euclidean_outlier_detector.cpp:10:
/home/genius/ros_ws/src/dynamic_robot_localization/include/dynamic_robot_localization/outlier_detectors/impl/euclidean_outlier_detector.hpp: In member function ‘virtual size_t dynamic_robot_localization::EuclideanOutlierDetector<PointT>::detectOutliers(typename pcl::search::KdTree<PointT>::Ptr, const pcl::PointCloud<PointT>&, typename pcl::PointCloud<PointT>::Ptr&, typename pcl::PointCloud<PointT>::Ptr&, double&)’:
/home/genius/ros_ws/src/dynamic_robot_localization/include/dynamic_robot_localization/outlier_detectors/impl/euclidean_outlier_detector.hpp:158:11: error: ‘HSVtoRGB’ is not a member of ‘pcl’
  158 |   pcl::HSVtoRGB(hue, 1.0f, 1.0f, point.r, point.g, point.b);
      |        ^~~~~~~~
I installed with your install script.
I also noticed you pushed a new update related to HSV recently; do you know how this happens?

Thank you
Best regards
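If the installed PCL predates an HSVtoRGB helper, a minimal stand-in sketch of the standard HSV-to-RGB conversion is shown below (illustrative only; it is an assumption that a local helper like this could replace the failing pcl::HSVtoRGB call, and it is not the package's actual fix):

```cpp
#include <cmath>
#include <cstdint>

// Standard HSV -> RGB conversion: h in [0, 360), s and v in [0, 1].
void hsvToRgb(float h, float s, float v, std::uint8_t& r, std::uint8_t& g, std::uint8_t& b) {
    const float c = v * s;
    const float hp = h / 60.0f;
    const float x = c * (1.0f - std::fabs(std::fmod(hp, 2.0f) - 1.0f));
    float rf = 0.0f, gf = 0.0f, bf = 0.0f;
    if      (hp < 1.0f) { rf = c; gf = x; }
    else if (hp < 2.0f) { rf = x; gf = c; }
    else if (hp < 3.0f) { gf = c; bf = x; }
    else if (hp < 4.0f) { gf = x; bf = c; }
    else if (hp < 5.0f) { rf = x; bf = c; }
    else                { rf = c; bf = x; }
    const float m = v - c;
    r = static_cast<std::uint8_t>((rf + m) * 255.0f);
    g = static_cast<std::uint8_t>((gf + m) * 255.0f);
    b = static_cast<std::uint8_t>((bf + m) * 255.0f);
}
```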

compile errors

Hello Carlos Mccosta :)
When I compile from the source, some errors occurred:

Errors << dynamic_robot_localization:make /home/curry/catkin_ws_drl/logs_release/dynamic_robot_localization/build.make.000.log
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/cloud_viewer.cpp:10:0:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/cloud_viewer.hpp: In instantiation of ‘void dynamic_robot_localization::CloudViewer<PointT>::setupConfigurationFromParameterServer(ros::NodeHandlePtr&, ros::NodeHandlePtr&, const string&) [with PointT = pcl::PointXYZRGBNormal; ros::NodeHandlePtr = boost::shared_ptr<ros::NodeHandle>; std::__cxx11::string = std::__cxx11::basic_string<char>]’:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/cloud_viewer.cpp:19:1: required from here
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/cloud_viewer.hpp:33:85: warning: unused parameter ‘node_handle’ [-Wunused-parameter]
void CloudViewer<PointT>::setupConfigurationFromParameterServer(ros::NodeHandlePtr& node_handle, ros::NodeHandlePtr& private_node_handle, const std::string& configuration_namespace) {
^~~~~~~~~~~
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/pointcloud_conversions.h:32:0,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/pointcloud_conversions.hpp:9,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/pointcloud_conversions.cpp:10:
/usr/include/pcl-1.8/pcl/visualization/cloud_viewer.h:202:14: warning: ‘template<class> class std::auto_ptr’ is deprecated [-Wdeprecated-declarations]
std::auto_ptr<CloudViewer_impl> impl_;
^~~~~~~~
In file included from /usr/include/c++/7/memory:80:0,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/pointcloud_conversions.h:12,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/pointcloud_conversions.hpp:9,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/pointcloud_conversions.cpp:10:
/usr/include/c++/7/bits/unique_ptr.h:51:28: note: declared here
template<typename> class auto_ptr;
^~~~~~~~
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:0:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp: In instantiation of ‘bool dynamic_robot_localization::RegistrationVisualizer<PointSource, PointTarget>::setRegistration(pcl::Registration<PointSource, PointTarget>&) [with PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal]’:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:52:1: required from here
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: error: no matching function for call to ‘pcl::Registration<pcl::PointXYZRGBNormal, pcl::PointXYZRGBNormal, float>::registerVisualizationCallback(std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>&)’
registration.registerVisualizationCallback(this->update_visualizer_);
^~~~~~~~~~~~
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/registration_visualizer.h:41:0,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:39,
from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:
/usr/include/pcl-1.8/pcl/registration/registration.h:375:7: note: candidate: ‘template<class FunctionSignature> bool pcl::Registration<PointSource, PointTarget, Scalar>::registerVisualizationCallback(boost::function<FunctionSignature>&) [with FunctionSignature = FunctionSignature; PointSource = pcl::PointXYZRGBNormal; PointTarget = pcl::PointXYZRGBNormal; Scalar = float]’
registerVisualizationCallback (boost::function<FunctionSignature> &visualizerCallback)
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/usr/include/pcl-1.8/pcl/registration/registration.h:375:7: note: template argument deduction/substitution failed:
In file included from /home/curry/catkin_ws_drl/src/dynamic_robot_localization/src/common/registration_visualizer.cpp:43:0:
/home/curry/catkin_ws_drl/src/dynamic_robot_localization/include/dynamic_robot_localization/common/impl/registration_visualizer.hpp:70:2: note: ‘std::function<void(const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&, const pcl::PointCloud<pcl::PointXYZRGBNormal>&, const std::vector<int>&)>’ is not derived from ‘boost::function<Signature>’
registration.registerVisualizationCallback(this->update_visualizer_);

I don't know the reason; could you please help me solve it? Thanks again.
