3D Rock and Pothole Detection in the Desert for Wild Navigation

Robust rock and pothole detection is essential for autonomous exploration in challenging desert environments. In such barren terrain, accurate traversability estimation for wild navigation depends on reliably detecting rocks and potholes. However, their diverse shapes, scales, and occlusions make detection challenging. Current single-sensor approaches (e.g., camera-only or LiDAR-only methods) struggle to obtain accurate locations and landmarks for traversability estimation: varying illumination and ground conditions degrade camera imagery, while the sparsity of LiDAR point clouds limits geometric detail. Additionally, the undulating desert ground often introduces sensor calibration errors, which significantly impair detection accuracy. To this end, we present DyFTNet, a LiDAR-Camera fusion-based 3D rock and pothole detection method that fuses features from both modalities by dynamically correcting the point cloud projection errors induced by desert surface irregularities. Furthermore, we construct a multi-modality 3D detection dataset in the desert environment, named the U2Ground dataset, consisting of 5534 rocks and 2926 potholes across 1463 pairs of LiDAR-Camera frames. We evaluate DyFTNet on U2Ground against state-of-the-art methods and obtain superior performance. The code and dataset will be available at https://github.com/Yyb-XJTU/U2Ground-dataset.
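To make the projection-error issue concrete, the minimal sketch below (not taken from the paper) uses a standard pinhole LiDAR-to-camera projection and shows how a small extrinsic perturbation, such as the pitch change a vehicle experiences on undulating ground, shifts the projected pixel of a distant point. The intrinsics K, extrinsics (R, t), the example point, and the helper project_lidar_to_image are all illustrative assumptions, not values or code from DyFTNet.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project Nx3 LiDAR points to pixel coordinates with a pinhole model.

    R, t map LiDAR coordinates into the camera frame; K holds the intrinsics.
    """
    pts_cam = points_lidar @ R.T + t          # LiDAR frame -> camera frame
    pts_img = pts_cam @ K.T                   # apply camera intrinsics
    return pts_img[:, :2] / pts_img[:, 2:3]   # perspective division -> (u, v)

# Nominal calibration (illustrative values only).
K = np.array([[720.0,   0.0, 640.0],
              [  0.0, 720.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, -0.1, 0.0])

# A rock surface point roughly 10 m ahead of the vehicle.
point = np.array([[0.5, 0.2, 10.0]])

# Simulate a 2-degree pitch error caused by uneven desert terrain.
pitch = np.deg2rad(2.0)
R_err = np.array([[1.0, 0.0,            0.0           ],
                  [0.0, np.cos(pitch), -np.sin(pitch)],
                  [0.0, np.sin(pitch),  np.cos(pitch)]]) @ R

uv_nominal = project_lidar_to_image(point, K, R, t)
uv_perturbed = project_lidar_to_image(point, K, R_err, t)
print("pixel shift (px):", np.linalg.norm(uv_perturbed - uv_nominal))
```

Under these assumed parameters, the 2-degree pitch error displaces the projected point by roughly 25 pixels at 10 m, which illustrates why a fusion method operating in the wild must compensate for terrain-induced projection drift rather than rely on a fixed calibration.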