Name: Robot Perception and Learning Lab - Legged Robotics at UCL
Type: Organization
Bio: The Robot Perception and Learning (RPL) Lab researches limbed robots (e.g., legged robots) that can function in challenging environments.
Twitter: rpl_as_ucl
Location: Department of Computer Science, University College London (UCL)
Blog: https://rpl-as-ucl.github.io
Robot Perception and Learning Lab - Legged Robotics at UCL's Projects
Description of the RPL-CS-UCL organization
Local Navigation Planner for Legged Robots
ASFM: Augmented Social Force Model for Legged Robot Social Navigation
Leveraging system development and robot deployment for ground-based autonomous navigation and exploration.
[RSS 2023] Diffusion Policy: Visuomotor Policy Learning via Action Diffusion
DiPPeR Project Webpage
DiPPeST webpage
DMMP: Diffusion Model Motion Primitives for Proactive Assistance in Teleoperation Tasks project website
Implementation of Dreamer v3 in PyTorch.
Elevation Mapping on GPU.
Fast, Attemptable Route Planner for Navigation in Known and Unknown Environments
ROS integration for Franka Emika research robots
Wrappers, tools and additional APIs for using ROS with Gazebo
ROS metapackages with footstep planning and localization for humanoid robots
Hydra ROS Interface
iPlanner: Imperative Path Planning. An end-to-end learning planning framework using a novel unsupervised imperative learning approach
Isaac Gym Reinforcement Learning Environments
Livox device driver under ROS, supporting the Mid-40, Mid-70, Tele-15, Horizon, and Avia lidars.
A Sphinx-based centralized documentation repository for MoveIt
Broadcasts the odometry frame for RB-KAIROS and MPPL
This repository contains code for implementing the neural network on Unitree GO1.
The Panda robot is the flagship MoveIt integration robot
Panda Franka Emika Simulation (Gazebo)
Robotnik Car common packages