PAAD

This repository contains the PyTorch implementation of PAAD for the following paper:

Proactive Anomaly Detection for Robot Navigation with Multi-Sensor Fusion

Tianchen Ji, Arun Narenthiran Sivakumar, Girish Chowdhary, Katie Driggs-Campbell

published in IEEE Robotics and Automation Letters (RA-L), 2022

PAAD fuses the camera image, LiDAR scan, and planned path to predict the probability of future failure for robot navigation. The code was tested on Ubuntu 20.04 with Python 3.8 and PyTorch 1.8.1.

[paper] [video] [dataset]

Abstract

Despite the rapid advancement of navigation algorithms, mobile robots often produce anomalous behaviors that can lead to navigation failures. We propose a proactive anomaly detection network (PAAD) for robot navigation in unstructured and uncertain environments. PAAD predicts the probability of future failure based on the planned motions from the predictive controller and the current observation from the perception module. Multi-sensor signals are fused effectively to provide robust anomaly detection in the presence of sensor occlusion as seen in field environments. Our experiments on field robot data demonstrate superior failure identification performance compared to previous methods and show that our model can capture anomalous behaviors in real time while maintaining a low false detection rate in cluttered fields.
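For intuition, the sketch below shows the general shape of such a multi-sensor fusion network in PyTorch: three encoders (image, LiDAR scan, planned trajectory) whose features are concatenated and mapped to per-step failure probabilities. This is an illustrative sketch only; the actual PAAD architecture is defined in nets, and all layer sizes, module names, and the number of waypoints below are assumptions.

# Illustrative only: a generic camera/LiDAR/trajectory fusion network.
# The real PAAD architecture lives in nets/; all sizes and names here are assumptions.
import torch
import torch.nn as nn

class ToyFusionNet(nn.Module):
    def __init__(self, horizon=10, n_waypoints=10):
        super().__init__()
        # image branch: front-view RGB image of shape (3, 240, 320)
        self.img_enc = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=4), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        # LiDAR branch: 1081 range readings (270° at 0.25° resolution)
        self.lidar_enc = nn.Sequential(nn.Linear(1081, 128), nn.ReLU())
        # trajectory branch: (x, y) waypoints of the planned path in bird's-eye view
        self.traj_enc = nn.Sequential(nn.Linear(2 * n_waypoints, 64), nn.ReLU())
        # fused features -> failure probability at each of the next `horizon` steps
        self.head = nn.Sequential(nn.Linear(32 + 128 + 64, 64), nn.ReLU(),
                                  nn.Linear(64, horizon))

    def forward(self, image, scan, traj):
        # image: (B, 3, 240, 320), scan: (B, 1081), traj: (B, n_waypoints, 2)
        z = torch.cat([self.img_enc(image),
                       self.lidar_enc(scan),
                       self.traj_enc(traj.flatten(1))], dim=1)
        return torch.sigmoid(self.head(z))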

Description of the code

More detailed comments can be found in the code. Here are some general descriptions:

  • nets: Contains network architectures for PAAD.

  • train.py and test.py: Train and test PAAD on the dataset, respectively (a rough usage sketch follows this list).

  • custom_dataset.py: Loads the dataset from CSV files and image folders.

  • dataset_vis.py: Visualizes a datapoint in the dataset. The annotated planned trajectory is projected onto the front-view image, and the 2D LiDAR scan around the robot is plotted.

  • utils.py: Contains the code for loss functions and metrics for quantitative results.

  • rosnode: Contains a ROS node that performs proactive anomaly detection in real time using PAAD.
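As a rough sketch of how these pieces fit together, a training epoch looks roughly like the following. The dataset, model, and loss names are placeholders rather than the actual identifiers used in custom_dataset.py, nets, and utils.py; see train.py for the real training loop.

# Hypothetical glue code; the actual entry points are train.py and test.py,
# and the class/function names below are placeholders, not this repository's identifiers.
import torch
from torch.utils.data import DataLoader

def train_one_epoch(model, dataset, optimizer, loss_fn, device="cpu"):
    loader = DataLoader(dataset, batch_size=64, shuffle=True)
    model.train()
    for image, scan, traj, label in loader:          # field order is an assumption
        image, scan, traj, label = (t.to(device) for t in (image, scan, traj, label))
        pred = model(image, scan, traj)              # per-step failure probabilities
        loss = loss_fn(pred, label)                  # e.g. a loss function from utils.py
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()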

Dataset

Both the offline dataset and the rosbags used for the real-time test in the paper are available.

Each datapoint in the dataset consists of the following information (a conceptual layout is sketched in code after the list):

  • image: a front-view image of size 320 × 240.
  • LiDAR scan: a range reading of size 1081, which covers a 270° range with 0.25° angular resolution.
  • x coordinates of the planned trajectory in bird's-eye view.
  • y coordinates of the planned trajectory in bird's-eye view.
  • label: a vector of size 10, which indicates whether the robot fails the navigation task at each of the next 10 time steps.
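For concreteness, one datapoint can be pictured as the following collection of arrays. This is a conceptual sketch only: the on-disk format is CSV files plus image folders, and the field names and the number of waypoints below are assumptions.

# Conceptual layout of a single datapoint (not the actual CSV column names).
import numpy as np

datapoint = {
    "image":  np.zeros((240, 320, 3), dtype=np.uint8),   # front-view RGB image (H x W x C)
    "scan":   np.zeros(1081, dtype=np.float32),          # 270° LiDAR ranges at 0.25° resolution
    "traj_x": np.zeros(10, dtype=np.float32),            # planned x waypoints in bird's-eye view
    "traj_y": np.zeros(10, dtype=np.float32),            # (number of waypoints is an assumption)
    "label":  np.zeros(10, dtype=np.float32),            # failure indicator for the next 10 steps
}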

[Figure: the reference frame in which the planned trajectories are defined]

[Figure: sample datapoints from the dataset]

The training set consists of 29292 datapoints and contains 2258 anomalous behaviors collected over 5 days, while the test set consists of 6869 datapoints and contains 689 anomalous behaviors from data collected on 2 additional days.

The 6 rosbags used for the real-time test were collected on additional days and contain all the perception signals required by PAAD. The relevant rostopics can be found in the sample code provided in rosnode.

ROS

A minimal ROS workspace for running the real-time test of PAAD could look like this:

workspace_folder/         -- WORKSPACE
  src/
    CMakeLists.txt
    fpn_msgs/             -- catkin package for custom msgs
      CMakeLists.txt
      package.xml
      msg/
        Terrabot.msg
    proactive_ad/         -- catkin package for PAAD
      CMakeLists.txt
      package.xml
      src/
        ad_paad.py
        nets/
          ...

Please follow the ROS tutorials to create the ROS packages and custom messages.

After the workspace is ready, PAAD can be tested in real time by executing the following steps:

  1. Run catkin_make at the root of the workspace and source devel/setup.bash.
  2. Run python3 ad_paad.py.
  3. Play a bag file of your choice.

The RGB images and LiDAR point clouds can be visualized in real time through rviz.
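For orientation, a minimal skeleton of what a node like ad_paad.py might look like is sketched below. The topic names, message types, and callback structure are illustrative assumptions, not the actual interface; see the code in rosnode for the real implementation.

#!/usr/bin/env python3
# Illustrative skeleton of a real-time anomaly-detection node.
# Topic names and message types are assumptions; see rosnode/ for the actual code.
import rospy
from sensor_msgs.msg import Image, LaserScan
from std_msgs.msg import Float32MultiArray

class AnomalyDetector:
    def __init__(self):
        # self.model = ...  # load the trained PAAD weights here
        self.image, self.scan = None, None
        self.pub = rospy.Publisher("/paad/failure_prob", Float32MultiArray, queue_size=1)
        rospy.Subscriber("/camera/image_raw", Image, self.image_cb, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.scan_cb, queue_size=1)

    def image_cb(self, msg):
        self.image = msg

    def scan_cb(self, msg):
        self.scan = msg
        self.predict()

    def predict(self):
        if self.image is None or self.scan is None:
            return
        # Run PAAD on the latest image, scan, and planned trajectory, then publish
        # the predicted failure probabilities for the next 10 time steps.
        probs = Float32MultiArray(data=[0.0] * 10)  # placeholder output
        self.pub.publish(probs)

if __name__ == "__main__":
    rospy.init_node("ad_paad")
    AnomalyDetector()
    rospy.spin()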

Citation

If you find the code or the dataset useful, please cite our paper:

@article{ji2022proactive,
  title={Proactive Anomaly Detection for Robot Navigation With Multi-Sensor Fusion},
  author={Ji, Tianchen and Sivakumar, Arun Narenthiran and Chowdhary, Girish and Driggs-Campbell, Katherine},
  journal={IEEE Robotics and Automation Letters},
  year={2022},
  volume={7},
  number={2},
  pages={4975-4982},
  doi={10.1109/LRA.2022.3153989}}

Contact

Should you have any questions or comments on the code, please feel free to open an issue or contact the author at [email protected].
