Fan Rui, Ozgunalp Umar, Hosking Brett, Liu Ming, Pitas Ioannis
IEEE Trans Image Process. 2019 Aug 22. doi: 10.1109/TIP.2019.2933750.
Pothole detection is one of the most important tasks for road maintenance. Computer vision approaches are generally based on either 2D road image analysis or 3D road surface modeling. However, these two categories have typically been used independently, and pothole detection accuracy remains far from satisfactory. Therefore, in this paper, we present a robust pothole detection algorithm that is both accurate and computationally efficient. A dense disparity map is first transformed to better distinguish between damaged and undamaged road areas. To make this disparity transformation more efficient, golden section search and dynamic programming are utilized to estimate the transformation parameters. Otsu's thresholding method is then used to extract potential undamaged road areas from the transformed disparity map. The disparities in the extracted areas are modeled as a quadratic surface using least squares fitting. To improve the robustness of the surface modeling, the surface normal is also integrated into the fitting process, and random sample consensus (RANSAC) is utilized to reduce the effects of outliers. Potholes are then detected accurately by comparing the actual disparity map with the modeled one. Finally, the point clouds of the detected potholes are extracted from the reconstructed 3D road surface. The experimental results show that the successful detection accuracy of the proposed system is approximately 98.7% and the overall pixel-level accuracy is approximately 99.6%.
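The detection stage described in the abstract is compact enough to sketch. The code below is a minimal illustration of those steps, not the authors' implementation: it assumes the (already transformed) disparity map arrives as a 2-D NumPy array, it omits the golden-section-search/dynamic-programming transformation and the surface-normal term, and the parameters `n_iters`, `inlier_tol`, and `residual_tol` are illustrative values that do not come from the paper.

```python
# Hedged sketch of the abstract's modeling/detection steps (not the paper's
# reference code): Otsu thresholding to pick candidate undamaged road pixels,
# RANSAC-guarded least-squares fitting of a quadratic disparity surface, and
# thresholding of the residual between actual and modeled disparity maps.
import numpy as np


def otsu_threshold(values, bins=256):
    """Return the Otsu threshold of a 1-D array of disparity values."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    p = hist.astype(float) / hist.sum()
    w0 = np.cumsum(p)                      # class-0 probability
    w1 = 1.0 - w0                          # class-1 probability
    m = np.cumsum(p * centers)             # class-0 mean times w0
    mu_t = m[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - m) ** 2 / (w0 * w1)  # between-class variance
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]


def fit_quadratic_surface(u, v, d):
    """Least-squares fit of d = a0 + a1*u + a2*v + a3*u^2 + a4*u*v + a5*v^2."""
    A = np.column_stack([np.ones_like(u), u, v, u * u, u * v, v * v])
    coeffs, *_ = np.linalg.lstsq(A, d, rcond=None)
    return coeffs


def eval_surface(coeffs, u, v):
    A = np.column_stack([np.ones_like(u), u, v, u * u, u * v, v * v])
    return A @ coeffs


def ransac_surface(u, v, d, n_iters=200, inlier_tol=1.0, seed=0):
    """RANSAC wrapper around the quadratic fit to suppress outliers."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        idx = rng.choice(len(d), size=6, replace=False)  # minimal sample
        coeffs = fit_quadratic_surface(u[idx], v[idx], d[idx])
        residual = np.abs(eval_surface(coeffs, u, v) - d)
        inliers = residual < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit on the consensus set
    return fit_quadratic_surface(u[best_inliers], v[best_inliers],
                                 d[best_inliers])


def detect_potholes(disparity, residual_tol=2.0):
    """Label pixels whose disparity falls well below the modeled surface."""
    v, u = np.mgrid[0:disparity.shape[0], 0:disparity.shape[1]]
    u = u.ravel().astype(float)
    v = v.ravel().astype(float)
    d = disparity.ravel().astype(float)
    # Candidate undamaged road pixels: disparities above the Otsu threshold
    # (pothole pixels lie farther from the camera, i.e. at lower disparity).
    road = d >= otsu_threshold(d)
    coeffs = ransac_surface(u[road], v[road], d[road])
    residual = eval_surface(coeffs, u, v) - d          # modeled minus actual
    return (residual > residual_tol).reshape(disparity.shape)
```

The quadratic model matches the abstract's "quadratic surface" description; the residual threshold that separates pothole pixels from road pixels would in practice be tuned per dataset rather than fixed as above.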