Radar odometry is the task of estimating the trajectory of the radar sensor, e.g. as presented in https://arxiv.org/abs/2105.01457. A well-established performance metric was presented by Geiger et al. (2012) - "Are we ready for autonomous driving? The KITTI vision benchmark suite".
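The KITTI metric mentioned above scores odometry by the relative pose error over subsequences of the trajectory. As a minimal sketch (not the official KITTI devkit; function names and the toy poses are illustrative), the translational error for one subsequence can be computed from 4x4 homogeneous pose matrices like this:

```python
# Hedged sketch of a KITTI-style relative translational error for one
# subsequence, given 4x4 homogeneous poses. Names are illustrative,
# not the official devkit API.
import numpy as np

def relative_translation_error(gt_a, gt_b, est_a, est_b):
    """Translational error as a fraction of distance travelled between
    two pose pairs, following the relative-pose-error idea of the KITTI
    odometry benchmark (Geiger et al., 2012)."""
    # Relative motion according to ground truth and estimate.
    rel_gt = np.linalg.inv(gt_a) @ gt_b
    rel_est = np.linalg.inv(est_a) @ est_b
    # Pose discrepancy between the two relative motions.
    err = np.linalg.inv(rel_gt) @ rel_est
    dist = np.linalg.norm(rel_gt[:3, 3])  # ground-truth distance travelled
    return np.linalg.norm(err[:3, 3]) / max(dist, 1e-9)

def make_pose(x, y):
    """Toy helper: pure-translation pose in the x-y plane."""
    T = np.eye(4)
    T[0, 3], T[1, 3] = x, y
    return T

# Ground truth moves 100 m in x; the estimate overshoots by 2 m -> 2% error.
e = relative_translation_error(make_pose(0, 0), make_pose(100, 0),
                               make_pose(0, 0), make_pose(102, 0))
print(round(e, 3))  # 0.02
```

The full benchmark averages such errors over many subsequences of fixed lengths (100 m to 800 m) and also reports a rotational component.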
(Image credit: Papersgraph)
These leaderboards are used to track progress in radar odometry.
Use these libraries to find radar odometry models and implementations.
A self-supervised framework for full mapping and localisation with radar in urban environments; the approach is sensor-agnostic and can be applied to most modalities.
This paper presents an accurate, highly efficient, learning-free method for large-scale radar odometry estimation: a simple filtering technique keeps the strongest returns, producing a clean radar data representation, and surface normals are reconstructed from it for efficient and accurate scan matching.
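The "keep the strongest returns" filtering step above can be sketched as follows. This is a hedged illustration, assuming a polar radar scan laid out as an azimuth-by-range intensity matrix; the value of k and the exact data layout used in the paper may differ.

```python
# Hedged sketch of k-strongest filtering on a polar radar scan
# (azimuth x range intensity matrix). The layout and k are assumptions.
import numpy as np

def k_strongest(scan, k=3):
    """Keep the k highest-intensity range bins in each azimuth row
    and zero out everything else."""
    out = np.zeros_like(scan)
    # Column indices of the k strongest returns in each azimuth row.
    idx = np.argpartition(scan, -k, axis=1)[:, -k:]
    rows = np.arange(scan.shape[0])[:, None]
    out[rows, idx] = scan[rows, idx]
    return out

rng = np.random.default_rng(0)
scan = rng.random((4, 16))          # toy scan: 4 azimuths x 16 range bins
filtered = k_strongest(scan, k=3)
print((filtered > 0).sum(axis=1))   # each azimuth row keeps exactly 3 returns
```

Filtering per azimuth rather than globally keeps the representation spatially balanced, which helps downstream scan matching.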
This paper proposes a novel motion and visual perception approach, dubbed MVP, that unifies the two sensor modalities for large-scale, target-driven navigation tasks. MVP learns faster, and is more accurate and more robust to both extreme environmental changes and poor GPS data, than corresponding vision-only navigation methods.
Simulation results demonstrate that the proposed SPEBT scheme provides precise pose estimation and accurate beam tracking while reducing the beam-training overhead to less than 5% on average.