LiDAR-Camera Fusion Methods for Long-Distance Rail Transit Perception

Abstract

In autonomous driving systems, sensor-based environmental perception is paramount. However, in long-distance perception for rail transit, the extrinsic calibration of LiDAR and telephoto cameras is hindered by sparse point clouds and intrinsic parameter inaccuracies. To address these challenges, we propose a novel calibration board design and a corresponding joint extrinsic calibration method. Inspired by engineering positioning principles, the calibration board extends the traditional checkerboard with integrated circular positioning holes. By coupling spatial re-projection constraints with geometric feature alignment, the proposed approach markedly improves feature point extraction and the accuracy of 2D–3D correspondences. Experimental results demonstrate that the method substantially enhances both calibration accuracy and efficiency, providing a solid technical foundation for environmental perception in rail transit.
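The re-projection constraint mentioned above can be illustrated with a minimal sketch: given 2D–3D correspondences (e.g. checkerboard corners or circular-hole centers in the LiDAR frame and their detected image positions), a candidate extrinsic pose (R, t) is scored by the pixel error between projected and observed points. All values below (intrinsics K, the sample points, the identity extrinsics) are hypothetical placeholders, not the paper's data; the paper's actual optimization and feature extraction are not reproduced here.

```python
import numpy as np

def project_points(pts_3d, R, t, K):
    """Project 3D points (LiDAR frame) into the image with a pinhole model.
    pts_3d: (N, 3); R: (3, 3) rotation; t: (3,) translation; K: (3, 3) intrinsics."""
    cam = pts_3d @ R.T + t           # transform into the camera frame
    uv = cam @ K.T                   # apply camera intrinsics
    return uv[:, :2] / uv[:, 2:3]    # perspective divide -> pixel coordinates

def reprojection_rmse(pts_3d, pts_2d, R, t, K):
    """Root-mean-square pixel error between projected and observed 2D points.
    This is the quantity a re-projection-constrained calibration minimizes."""
    diff = project_points(pts_3d, R, t, K) - pts_2d
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

# Hypothetical setup: simple intrinsics, identity extrinsics, two board points.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
pts_3d = np.array([[0.1, 0.0, 5.0],
                   [0.0, 0.1, 5.0]])
pts_2d = project_points(pts_3d, R, t, K)  # perfect observations for this pose
```

In a real pipeline, (R, t) would be refined (e.g. by nonlinear least squares) to drive this residual toward zero over all detected board features.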