
Reconfigurable Voxels: A New Representation for LiDAR-Based Point Clouds

Paper PDF Supplemental

Authors

Tai Wang (The Chinese University of Hong Kong)*; Xinge Zhu (The Chinese University of Hong Kong); Dahua Lin (The Chinese University of Hong Kong)

Interactive Session

2020-11-16, 12:30 - 13:00 PST | PheedLoop Session

Abstract

LiDAR is an important method for autonomous driving systems to sense the environment. The point clouds obtained by LiDAR typically exhibit a sparse and irregular distribution, posing great challenges to the detection of 3D objects, especially those that are small and distant. To tackle this difficulty, we propose Reconfigurable Voxels, a new approach to constructing representations from 3D point clouds. Specifically, we devise a biased random walk scheme that adaptively covers each neighborhood with a fixed number of voxels based on the local spatial distribution and produces a representation by integrating the points in the chosen neighbors. We find empirically that this approach effectively improves the stability of voxel features, especially in sparse regions. Experimental results on multiple benchmarks, including nuScenes, Lyft, and KITTI, show that this new representation can markedly improve detection performance for small and distant objects, without incurring noticeable overhead cost.
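The abstract does not spell out the biased random walk, so the sketch below is only one plausible reading, not the paper's implementation: starting from a seed voxel, a few short walks are taken over the voxel grid, with each step biased toward neighboring voxels that contain more points, and the points in all visited voxels are then pooled into one feature. The function name, the `voxel_points` dictionary, and every parameter value here are hypothetical.

```python
import numpy as np

def reconfigure_voxel(center_idx, voxel_points, grid_shape,
                      num_walks=4, num_steps=3, rng=None):
    """Hypothetical sketch of a biased random walk over a voxel grid.

    `voxel_points` is assumed to map a voxel index tuple (x, y, z) to the
    array of points falling in that voxel. Returns the set of visited voxel
    indices whose points would be aggregated into the seed voxel's feature.
    """
    rng = rng or np.random.default_rng()
    # 6-connected neighborhood offsets
    offsets = np.array([[1, 0, 0], [-1, 0, 0],
                        [0, 1, 0], [0, -1, 0],
                        [0, 0, 1], [0, 0, -1]])
    chosen = {tuple(center_idx)}
    for _ in range(num_walks):
        cur = np.asarray(center_idx)
        for _ in range(num_steps):
            cand = cur + offsets
            # keep candidate voxels that lie inside the grid
            valid = np.all((cand >= 0) & (cand < np.asarray(grid_shape)), axis=1)
            cand = cand[valid]
            if len(cand) == 0:
                break
            # bias the step toward denser neighbors (more points per voxel)
            counts = np.array(
                [len(voxel_points.get(tuple(c), ())) for c in cand], dtype=float)
            probs = (counts + 1e-3) / (counts + 1e-3).sum()
            cur = cand[rng.choice(len(cand), p=probs)]
        chosen.add(tuple(cur))
    return chosen
```

In sparse regions this lets a mostly empty seed voxel borrow points from nearby occupied voxels, which is consistent with the abstract's claim of more stable voxel features there; the exact transition probabilities and pooling used in the paper may differ.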

Video

Reviews and Rebuttal


Conference on Robot Learning 2020