Lin Hao, Mullins Darragh, Molloy Dara, Ward Enda, Collins Fiachra, Denny Patrick, Glavin Martin, Deegan Brian, Jones Edward
School of Engineering, University of Galway, University Road, H91 TK33 Galway, Ireland.
Ryan Institute, University of Galway, University Road, H91 TK33 Galway, Ireland.
Sensors (Basel). 2024 Aug 8;24(16):5135. doi: 10.3390/s24165135.
Camera-based object detection is integral to advanced driver assistance systems (ADAS) and autonomous vehicle research, and RGB cameras remain indispensable for their spatial resolution and color information. This study investigates exposure time optimization for such cameras, considering image quality in dynamic ADAS scenarios. Exposure time, the period during which the camera sensor is exposed to light, directly influences the amount of information captured. In dynamic scenes, such as those encountered in typical driving scenarios, optimizing exposure time becomes challenging due to the inherent trade-off between Signal-to-Noise Ratio (SNR) and motion blur: extending exposure time to maximize information capture increases SNR, but it also increases the risk of motion blur and overexposure, a trade-off that is particularly acute in low-light conditions, where objects may not be fully illuminated. The study introduces a comprehensive methodology for exposure time optimization under various lighting conditions, examining its impact on image quality and computer vision performance. Traditional image quality metrics show a poor correlation with computer vision performance, highlighting the need for newer metrics that demonstrate improved correlation. The research presented in this paper offers guidance on enhancing single-exposure camera-based systems for automotive applications. By addressing the balance between exposure time, image quality, and computer vision performance, the findings provide a road map for optimizing camera settings for ADAS and autonomous driving technologies, contributing to safety and performance advancements in the automotive landscape.
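As a rough illustration of the trade-off the abstract describes (not the paper's actual model), the following Python sketch uses a first-order shot-noise-limited SNR estimate and a simple geometric motion-blur estimate; all parameters (photon rate, read noise, object speed, distance, focal length, pixel pitch) are hypothetical placeholders.

```python
# Illustrative sketch only: first-order exposure-time trade-off between
# SNR (grows roughly with sqrt(exposure) under shot noise) and motion
# blur (grows linearly with exposure for a moving object).
import math


def snr_shot_noise(photon_rate_e_per_s: float, exposure_s: float,
                   read_noise_e: float = 3.0) -> float:
    """SNR for a shot-noise-limited pixel with additive read noise."""
    signal = photon_rate_e_per_s * exposure_s          # collected electrons
    noise = math.sqrt(signal + read_noise_e ** 2)      # shot + read noise
    return signal / noise


def motion_blur_px(object_speed_mps: float, distance_m: float,
                   focal_length_mm: float, pixel_pitch_um: float,
                   exposure_s: float) -> float:
    """Blur extent in pixels for an object moving laterally across the frame."""
    # Lateral image displacement ~ f * (object displacement / distance)
    image_motion_mm = focal_length_mm * (object_speed_mps * exposure_s) / distance_m
    return image_motion_mm * 1000.0 / pixel_pitch_um


if __name__ == "__main__":
    # Hypothetical low-light scene: dim target, pedestrian crossing at 1.5 m/s.
    for t_ms in (1, 5, 10, 20, 40):
        t = t_ms / 1000.0
        snr = snr_shot_noise(photon_rate_e_per_s=2000.0, exposure_s=t)
        blur = motion_blur_px(object_speed_mps=1.5, distance_m=20.0,
                              focal_length_mm=6.0, pixel_pitch_um=3.0,
                              exposure_s=t)
        print(f"exposure {t_ms:3d} ms  SNR ~{snr:5.1f}  blur ~{blur:4.1f} px")
```

Under these assumed numbers, SNR improves sub-linearly while blur grows linearly with exposure time, which is the tension the study's optimization methodology addresses.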