
rpng / mins


An efficient and robust multisensor-aided inertial navigation system with online calibration that is capable of fusing IMU, camera, LiDAR, GPS/GNSS, and wheel sensors. Use cases: VINS/VIO, GPS-INS, LINS/LIO, multi-sensor fusion for localization and mapping (SLAM). This repository also provides multi-sensor simulation and data.

License: GNU General Public License v3.0

CMake 1.07% C++ 98.21% Shell 0.72%

mins's Introduction

MINS


An efficient, robust, and tightly-coupled Multisensor-aided Inertial Navigation System (MINS) that can flexibly fuse all five sensing modalities (IMU, wheel encoders, camera, GNSS, and LiDAR) in a filtering fashion by overcoming the hurdles of computational complexity, sensor asynchronicity, and intra-sensor calibration.

Exemplary use cases of MINS:

  • VINS (mono, stereo, multi-cam)
  • GPS-IMU (single, multiple)
  • LiDAR-IMU (single, multiple)
  • wheel-IMU
  • Camera-GPS-LiDAR-wheel-IMU, and other combinations.


Key Features

  • Inertial (IMU)-based multi-sensor fusion, including wheel odometry and arbitrary numbers of cameras, LiDARs, and GNSS receivers (plus VICON or loop closure), for localization.
  • Online calibration of all onboard sensors (see exemplary results).
  • Consistent high-order on-manifold state interpolation (improved from our prior work MIMC-VINS) and a dynamic cloning strategy for lightweight estimation.
  • Multi-sensor simulation toolbox for IMU, camera, LiDAR, GNSS, and wheel, enhanced from our prior work (OpenVINS).
  • Evaluation toolbox for consistency, accuracy, and timing analysis.
  • Very detailed options for each sensor, enabling general multi-sensor applications.

Dependency

MINS is tested on Ubuntu 18.04 and 20.04 and only requires the corresponding ROS distribution (Melodic or Noetic).

  • The default Eigen version is 3.3.7 (Noetic) or lower. If a newer Eigen is installed, compilation can fail due to the third-party LiDAR library (libpointmatcher); a minimal way to check the installed version is sketched below.
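
If in doubt, you can verify which Eigen the compiler actually picks up by printing Eigen's version macros. A minimal sketch, assuming Eigen is installed under the usual system include path (e.g., /usr/include/eigen3):

// check_eigen_version.cpp -- print the Eigen version seen by the compiler
// build: g++ -I/usr/include/eigen3 check_eigen_version.cpp -o check_eigen_version
#include <Eigen/Core>
#include <iostream>

int main() {
  // ROS Noetic ships Eigen 3.3.7; a newer system Eigen may break the libpointmatcher build.
  std::cout << EIGEN_WORLD_VERSION << "." << EIGEN_MAJOR_VERSION << "."
            << EIGEN_MINOR_VERSION << std::endl;
  return 0;
}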

Build and Source

mkdir -p $MINS_WORKSPACE/catkin_ws/src/ && cd $MINS_WORKSPACE/catkin_ws/src/
git clone https://github.com/rpng/MINS
cd .. && catkin build
source devel/setup.bash

Run Examples

Simulation

roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true


Real-World Dataset

Directly reading the ros bag file

roslaunch mins rosbag.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag


Here are the rosbag files and ground truths we used in the evaluation. Specifically, we used kaist2bag to convert all sensor readings to rosbag files. All rights belong to the KAIST Urban Dataset.

Rosbag GT (txt) GT (csv) Rosbag GT (txt) GT (csv)
urban18.bag urban18.txt urban18.csv urban19.bag urban19.txt urban19.csv
urban20.bag urban20.txt urban20.csv urban21.bag urban21.txt urban21.csv
urban22.bag urban22.txt urban22.csv urban23.bag urban23.txt urban23.csv
urban24.bag urban24.txt urban24.csv urban25.bag urban25.txt urban25.csv
urban26.bag urban26.txt urban26.csv urban27.bag urban27.txt urban27.csv
urban28.bag urban28.txt urban28.csv urban29.bag urban29.txt urban29.csv
urban30.bag urban30.txt urban30.csv urban31.bag urban31.txt urban31.csv
urban32.bag urban32.txt urban32.csv urban33.bag urban33.txt urban33.csv
urban34.bag urban34.txt urban34.csv urban35.bag urban35.txt urban35.csv
urban36.bag urban36.txt urban36.csv urban37.bag urban37.txt urban37.csv
urban38.bag urban38.txt urban38.csv urban39.bag urban39.txt urban39.csv

Subscribing to the ros messages

roslaunch mins subscribe.launch config:=euroc_mav rosbag:=V1_03_difficult.bag bag_start_time:=0


RViz

rviz -d mins/launch/display.rviz

Acknowledgements

This project was built on top of third-party libraries, which are included in the thirdparty folder.

Credit / Licensing

This code was written by the Robot Perception and Navigation Group (RPNG) at the University of Delaware. If you have any issues with the code, please open an issue on our GitHub page with relevant implementation details and references. For researchers who have leveraged or compared against this work, please cite the following:

The publication reference will be updated soon.

@article{Lee2023arxiv,
    title        = {MINS: Efficient and Robust Multisensor-aided Inertial Navigation System},
    author       = {Woosik Lee and Patrick Geneva and Chuchu Chen and Guoquan Huang},
    year         = 2023,
    journal      = {arXiv preprint arXiv:2309.15390},
    url          = {https://github.com/rpng/MINS},
}

The codebase and documentation are licensed under the GNU General Public License v3 (GPL-3). You must preserve the copyright and license notices in your derivative work and make the complete source code with modifications available under the same license (this is not legal advice).

mins's People

Contributors

lnexenl, woosiklee2510


mins's Issues

Lidar Undistortion

Hi, I am currently running MINS on some private datasets where a car drives in a figure eight, which requires high-precision LiDAR undistortion.

However, I find that the LiDAR undistortion function requires v_angles and h_angles to undistort the point cloud, and it also assumes that the time differences between consecutive points are the same.
success = state->get_interpolated_pose(lidar_inL->time + dt - (total - i) * op->raw_point_dt, RGtoIi, pIiinG);

I think the LiDAR undistortion could be made more general by reading each point's timestamp directly from the ROS message. Is it possible to introduce this as a new feature in future work?
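
To make the request concrete, here is a rough sketch of the kind of change I have in mind (the per-point offset array and surrounding variables are hypothetical; only state->get_interpolated_pose is taken from the existing code, so this fragment is not runnable as-is):

// Hypothetical sketch: use each point's own timestamp parsed from the ROS message
// instead of assuming a constant op->raw_point_dt between consecutive points.
for (size_t i = 0; i < pts.size(); i++) {
  Matrix3d RGtoIi;
  Vector3d pIiinG;
  double t_i = lidar_inL->time + point_time_offset[i];  // per-point offset from the msg
  if (!state->get_interpolated_pose(t_i, RGtoIi, pIiinG))
    continue;  // no interpolated pose available at this timestamp; skip the point
  // ... transform pts[i] with (RGtoIi, pIiinG) into the common scan frame here ...
}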

rviz configuration settings for simulation.launch, rosbag.launch and subscribe.launch

@WoosikLee2510 @goldbattle @ghuangud @yangyulin @huaizheng @saimouli

After launching simulation.launch, rosbag.launch, or subscribe.launch, I am unable to get a proper visualization in RViz; setting up the TF coordinates (i.e., the Fixed Frame setting in RViz) is confusing, tricky, and time consuming. Could you please provide an RViz configuration for any one of the above launch files so that I can roslaunch it together with the appropriate RViz settings? I would greatly appreciate it.

Initialization need platform to move and sometimes failed to init (drift)

Hi MINS maintainers,

First of all, thank you for the great work and for sharing it with the community. I would like to apply MINS to a mobile robot application using a stereo camera + IMU (Intel D435i), but there are some issues I would like to ask about:

  1. Is there any way to initialize the system while staying in one place? I assume the VIO needs the platform to move to recover scale (with a monocular camera), but when using stereo cameras, could the initialization work without moving?
  2. Sometimes the initialization fails and the pose drifts. How can I prevent drifting?
  3. I am aware that the CPU usage of MINS is much higher than that of OpenVINS (on my PC, 150% CPU compared to 50% for OpenVINS). What causes the higher CPU usage?

Questions about LiDAR noise and consistency

Hi, thank you for open-sourcing this comprehensive sensor fusion system. I have some questions regarding the LiDARs.

In the LiDAR odometry part, you use a direct scan-to-submap registration as the residual, which is similar to FAST-LIO2. I am quite curious about the map_noise parameter used in simulation. If I understand correctly, this parameter is used to whiten the point-to-plane residual of the neighbor points found on the local map. However, the neighbor points can only be selected if they all pass a plane sanity check, i.e., their distance to the fitted plane must be smaller than plane_max_p2pd, and this is a hard constraint. At the same time, map_noise (the uncertainty we claim the residual has) is 5x plane_max_p2pd (the actual uncertainty bound), so it looks like a manual dilation of the point-to-plane residual uncertainty, which should(?) lead to a smaller NEES (conservative).
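
For concreteness, this is how I currently read the gating and whitening; a standalone toy example with made-up values (variable names are mine, not from MINS):

// Illustrative only: every map neighbor must pass the hard plane_max_p2pd check on the
// fitted plane, and the scan point's point-to-plane residual is then whitened by map_noise.
#include <Eigen/Core>
#include <cmath>
#include <iostream>
#include <vector>

int main() {
  double plane_max_p2pd = 0.1, map_noise = 0.5;  // values from the simulation config
  Eigen::Vector3d n(0, 0, 1);                    // fitted plane normal (unit length)
  double d = 0.0;                                // plane offset: n.dot(x) + d = 0 on the plane
  std::vector<Eigen::Vector3d> neighbors;
  neighbors.push_back(Eigen::Vector3d(1, 0, 0.02));
  neighbors.push_back(Eigen::Vector3d(0, 1, -0.03));
  neighbors.push_back(Eigen::Vector3d(1, 1, 0.01));

  // Hard sanity check: all neighbors must lie within plane_max_p2pd of the fitted plane.
  bool plane_ok = true;
  for (const Eigen::Vector3d &p : neighbors)
    plane_ok = plane_ok && std::abs(n.dot(p) + d) <= plane_max_p2pd;

  if (plane_ok) {
    Eigen::Vector3d scan_point(0.5, 0.5, 0.04);  // LiDAR point being registered
    double residual = n.dot(scan_point) + d;     // signed point-to-plane distance
    double whitened = residual / map_noise;      // larger map_noise => smaller weighted residual
    std::cout << "residual " << residual << ", whitened " << whitened << std::endl;
  }
  return 0;
}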

So, my questions are the following:

  1. Why is the map_noise parameter much larger in simulation than in real experiments, given that they use the same sanity check threshold plane_max_p2pd, and the simulation has a much smaller LiDAR point measurement noise raw_noise?
  2. It seems that the three simulation scenes mentioned in the paper are used for three different purposes, and UD small is used for consistency verification. I'm wondering how consistent the system is on longer or more complex datasets (e.g., "C" shaped corridor)? Because UD small lasts only 60 seconds, and the whole scene is visible to the LiDARs throughout the experiment (i.e., local map never decays).
  3. How consistent is the system under different LiDAR noise levels, e.g., raw_noise at 2 cm, 3 cm, or 10 cm as used in the real-world experiments? How should the other parameters be changed when raw_noise changes?

The table below is excerpted from the configuration files of MINS:

Parameter | KAIST / KAIST LC | KAIST L | Simulation
raw_downsample_size | 2.0 | 1.0 | 0.3
raw_noise | 0.1 | 0.1 | 0.01
map_downsample_size | 0.5 | 0.3 | 0.3
map_noise | 0.1 | 0.1 | 0.5
plane_max_p2pd | 0.1 | 0.1 | 0.1
map_decay_time | 30 | 9999 | 120
map_decay_dist | 30 | 100 | 100

I hope you can clear up my questions. Thank you!

[KAIST] Process died when use_imu_res is disabled.

Hi, Woosik. Really appreciate this great work!

I am currently running the KAIST urban30 dataset. Everything works fine when using subscribe.launch with the default kaist/kaist_LC config.

However, when I try to run with polynomial estimated residuals (mins/config/kaist/kaist_LC/config_estimator.yaml -> use_imu_res: false), the [mins_subscribe-2] process dies on startup.

(screenshot of the crash attached)

Could you take a look at this problem? Much appreciated.

process[mins simulation0-1]: started with pid [7471]

Excuse me, I am running "roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true". What is the cause of the error below, and how can I fix it?

(screenshot of the error attached)

I am also using "roslaunch mins serial.launch config:=kaist/kaist_LC path_gt:=urban30.txt path_bag:=urban30.bag" to run a rosbag built from my own KAIST data. Can the bag contain only the following topics? Which topics does your bag contain?

(screenshot listing the bag topics attached)

Thank you very much.

how to implement with own hardware platform ?

Hi,
Thanks for the great work. I am wondering if I can use the MINS algorithm with hardware like an Ouster LiDAR, a stereo camera, etc.? Is there any detailed documentation to follow?
Thanks in advance!

[Question] The formula in UpdaterWheel::preintegration_3D

I don't understand why it is p0_dot = R_Gto0.transpose() * v_hat; and not p0_dot = R_Gto0 * v_hat;

  // k1 ================
  Vector4d dq_0 = {0, 0, 0, 1};
  Vector4d q0_dot = 0.5 * Omega(w_hat) * dq_0;
  Matrix3d R_Gto0 = quat_2_Rot(quat_multiply(dq_0, q_local));
  Vector3d p0_dot = R_Gto0.transpose() * v_hat;

Is it because the quaternion is in the JPL style?
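
For reference, my current reading (not an official answer): R_Gto0 = quat_2_Rot(quat_multiply(dq_0, q_local)) follows the JPL global-to-local convention, i.e., it rotates global-frame vectors into the local frame {0}, while v_hat is the velocity expressed in the local frame. The position derivative lives in the global frame, so the inverse rotation is needed: p0_dot = R_0toG * v_hat = R_Gto0.transpose() * v_hat.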

Can MINS work on a drone?

Hi!

I read the paper of MINS, and I see the experiments were done with ground vehicles. I would assume it is usable on a drone, but I wanted to confirm this from a practical point of view. Would there be specific assumptions for MINS to be used on, e.g., a multi-rotor UAV?

Thanks.

Question about the feature representation

Dear MINS developer,

First of all, thanks for sharing your awesome work. According to the report Visual-Inertial Odometry on Resource-Constrained Systems, which compares different feature representations, the performance of the AHP and IDP parameterizations is significantly better than that of the XYZ parameterization.

Currently, MINS only supports GLOBAL_3D or GLOBAL_FULL_INVERSE_DEPTH, which I suppose corresponds to the XYZ parameterization in that report. So the question is: why does MINS only support the global representations? Would it be possible to integrate all the feature representations from OpenVINS into MINS to improve tracking stability? Thanks in advance.

Tw, Tg, Ta, R_IMUtoGYRO fail to appear

In OpenVINS, the IMU parameters include transformation matrices (T) for the angular velocity and acceleration. But in MINS these parameters are not in the yaml file; don't you need to worry about that?

Random crash of program

Hi, I find that MINS really works well, but sometimes it crashes my system, causing my PC to reboot (no matter where I execute the program: inside or outside a Docker container). Have you ever encountered such a problem? I am curious whether it is a common problem or only happens on my PC. I guess it might be caused by invalid memory access.

[Question] Question about the Jacobian `Phi_tr` in UpdaterWheel::preintegration_3D

in this line:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(R_3D.transpose() * (new_p - p_3D));

I think new_p is equal to (p_3D + R_3D.transpose()*v*dt), so the line becomes:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(R_3D.transpose() * (R_3D.transpose()*v*dt));

But it seems that the final result should be:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(v*dt);

So I think the original line should be the following code:

Phi_tr.block(3, 0, 3, 3) = -R_3D.transpose() * skew_x(R_3D * (new_p - p_3D));

Just remove the .transpose()

IMU Coordinate and Camera coordinate

I want to use my own dataset to test the MINS system. After checking the OpenVINS project, I am really confused about the IMU coordinate frame and the camera coordinate frame. Could you please tell me which IMU coordinate convention this system uses?

What I understand:

camera:
[
Z: forward
X: left
Y: down
]

IMU:
[
X: ?
Y: ?
Z: ?
]

What I need to do is calculate the camera-IMU rotation matrix, but right now I don't know how to define the IMU coordinate frame.
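
In the meantime, my plan is to build the camera-to-IMU rotation column by column, each column being the corresponding camera axis expressed in the IMU frame. A self-contained sketch with purely hypothetical axis assignments (not the actual convention of this system):

#include <Eigen/Core>
#include <iostream>

int main() {
  // Hypothetical example: camera x-axis along IMU -y, camera y-axis along IMU -z,
  // camera z-axis along IMU +x. Then v_I = R_CtoI * v_C for a vector v_C in camera coordinates.
  Eigen::Matrix3d R_CtoI;
  R_CtoI.col(0) = Eigen::Vector3d(0, -1, 0);  // camera x-axis expressed in the IMU frame
  R_CtoI.col(1) = Eigen::Vector3d(0, 0, -1);  // camera y-axis expressed in the IMU frame
  R_CtoI.col(2) = Eigen::Vector3d(1, 0, 0);   // camera z-axis expressed in the IMU frame
  std::cout << R_CtoI << std::endl;
  return 0;
}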

double free or corruption (out)

roslaunch mins simulation.launch cam_enabled:=true lidar_enabled:=true
When I run the above command, the error shown below appears:
double free or corruption (out)

Formula does not make sense

in

bool I_Initializer::initialization(Matrix<double, 17, 1> &imustate) {

...

  Vector3d z_axis = a_avg_2to1 / a_avg_2to1.norm();

  // Create an x_axis
  Vector3d e_1(1, 0, 0);

  // Make x_axis perpendicular to z
  Vector3d x_axis = e_1 - z_axis * z_axis.transpose() * e_1;
  x_axis = x_axis / x_axis.norm();

...

Why would x_axis be perpendicular to z_axis? x_axis.dot(z_axis) is not equal to zero.
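
For completeness, expanding the projection: z_axis.dot(x_axis_unnormalized) = z^T e_1 - (z^T z)(z^T e_1), which equals zero only if z_axis is exactly unit norm (one Gram-Schmidt step). Am I misreading the normalization of a_avg_2to1?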

Project Consulting

Thank you for your great work. What I want to ask is: what is the relationship between this project and MIMC-VINS?
