Code: Multi-robot Coordination for Autonomous Navigation in Partially Unknown Environments
This multi-agent, centralized coordination framework performs state estimation, mapping and path planning for a small team of robots (UAVs in the original paper presenting this method). In contrast to existing approaches, we relax the common assumptions of perfect knowledge of the robots' poses and of the availability of pre-built maps.
Assuming that each robot is equipped with a monocular camera, an IMU, a GPS receiver and a depth sensor, the framework co-localizes the robots in a common, geo-referenced frame and builds a joint, globally consistent map of the environment by fusing the experiences of the individual agents. This map is then used by a global planner that coordinates the robots' motions by assigning priorities to them while they navigate towards their respective user-selected goal positions. While the computationally expensive components of the pipeline can be offloaded to a potentially powerful server, only keyframe-based visual-inertial odometry and local obstacle avoidance run onboard each robot.
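To illustrate the priority-based coordination idea, the following is a minimal, self-contained C++ sketch, not the released code: robots are planned sequentially in priority order, and each lower-priority robot treats the already-planned paths of higher-priority robots as reservations to keep clear of. All names (Waypoint, PlanSingleRobot, the 1 m separation threshold) are illustrative assumptions, and the straight-line planner stands in for the actual global planner operating on the joint map.

// Illustrative sketch of priority-ordered multi-robot planning.
// NOT the released implementation; all names and thresholds are made up.

#include <algorithm>
#include <cmath>
#include <iostream>
#include <vector>

struct Waypoint {
  double x, y, z;
};

using Path = std::vector<Waypoint>;

struct Robot {
  int id;
  int priority;  // lower value = higher priority (assumption)
  Waypoint start;
  Waypoint goal;
};

// Stand-in for a global planner query on the joint map: a straight-line
// path discretized into waypoints, nudged away from reserved paths.
Path PlanSingleRobot(const Robot& robot, const std::vector<Path>& reserved) {
  const int kSteps = 10;
  Path path;
  for (int i = 0; i <= kSteps; ++i) {
    const double t = static_cast<double>(i) / kSteps;
    Waypoint wp{robot.start.x + t * (robot.goal.x - robot.start.x),
                robot.start.y + t * (robot.goal.y - robot.start.y),
                robot.start.z + t * (robot.goal.z - robot.start.z)};
    // A real planner would re-plan around the reserved paths; here we only
    // add crude vertical separation when a waypoint comes too close to one.
    for (const Path& other : reserved) {
      for (const Waypoint& o : other) {
        const double d =
            std::hypot(std::hypot(wp.x - o.x, wp.y - o.y), wp.z - o.z);
        if (d < 1.0) wp.z += 1.0;  // illustrative threshold, not from the paper
      }
    }
    path.push_back(wp);
  }
  return path;
}

int main() {
  std::vector<Robot> team = {{0, 0, {0, 0, 1}, {10, 0, 1}},
                             {1, 1, {10, 0, 1}, {0, 0, 1}}};

  // Plan in priority order; each planned path is "reserved" for later robots.
  std::sort(team.begin(), team.end(),
            [](const Robot& a, const Robot& b) { return a.priority < b.priority; });

  std::vector<Path> reserved;
  for (const Robot& robot : team) {
    Path path = PlanSingleRobot(robot, reserved);
    std::cout << "Robot " << robot.id << " planned " << path.size()
              << " waypoints\n";
    reserved.push_back(path);
  }
  return 0;
}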
The software for the pipeline is publicly available and can be accessed via this link. It is released as a series of ROS-based C++ packages.
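Since the release consists of ROS-based C++ packages, the roscpp sketch below shows how the agent-server split described above could be structured on the agent side: onboard VIO keyframe poses are relayed to the server for joint mapping, and the server's globally planned path is received back for local execution. The topic names, message types and node name here are assumptions for illustration only, not the interface of the released packages.

// Illustrative onboard-agent node for an agent-server architecture.
// Topic names and message choices are assumptions, not the released interface.

#include <geometry_msgs/PoseStamped.h>
#include <nav_msgs/Path.h>
#include <ros/ros.h>

class OnboardAgentNode {
 public:
  explicit OnboardAgentNode(ros::NodeHandle& nh) {
    // Forward onboard VIO keyframe poses to the (remote) server.
    keyframe_pub_ = nh.advertise<geometry_msgs::PoseStamped>(
        "server/keyframe_pose", 10);
    // Receive the globally planned, prioritized path back from the server.
    path_sub_ = nh.subscribe("server/global_path", 1,
                             &OnboardAgentNode::PathCallback, this);
    // Keyframe poses produced by the onboard visual-inertial odometry.
    vio_sub_ = nh.subscribe("vio/keyframe_pose", 10,
                            &OnboardAgentNode::KeyframeCallback, this);
  }

 private:
  void KeyframeCallback(const geometry_msgs::PoseStamped::ConstPtr& msg) {
    keyframe_pub_.publish(*msg);  // relay to the server for joint mapping
  }

  void PathCallback(const nav_msgs::Path::ConstPtr& msg) {
    // A real agent would feed this path into local obstacle avoidance;
    // here we only report how many waypoints were received.
    ROS_INFO("Received global path with %zu waypoints", msg->poses.size());
  }

  ros::Publisher keyframe_pub_;
  ros::Subscriber path_sub_;
  ros::Subscriber vio_sub_;
};

int main(int argc, char** argv) {
  ros::init(argc, argv, "onboard_agent_sketch");
  ros::NodeHandle nh;
  OnboardAgentNode node(nh);
  ros::spin();
  return 0;
}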
Users of this software are asked to cite the following article, where it was introduced:
Luca Bartolomei, Marco Karrer and Margarita Chli, "Multi-robot Coordination with Agent-Server Architecture for Autonomous Navigation in Partially Unknown Environments", in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020. (DOI) (Research Collection)
The video below shows example experiments presented in this work, while the video presentation of this work for IROS 2020 is available here.