Levi Dan, Gispan Liran, Giladi Niv, Fetaya Ethan
General Motors Israel, Herzliya 4672515, Israel.
Faculty of Computer Science, Technion, Haifa 3200003, Israel.
Sensors (Basel). 2022 Jul 25;22(15):5540. doi: 10.3390/s22155540.
Predicting not only the target but also an accurate measure of uncertainty is important for many machine learning applications, and in particular, safety-critical ones. In this work, we study the calibration of uncertainty prediction for regression tasks, which often arise in real-world systems. We show that the existing definition for the calibration of regression uncertainty has severe limitations in distinguishing informative from non-informative uncertainty predictions. We propose a new definition that escapes this caveat and an evaluation method using a simple histogram-based approach. Our method clusters examples with similar uncertainty prediction and compares the prediction with the empirical uncertainty on these examples. We also propose a simple, scaling-based calibration method that performs as well as much more complex ones. We show results on both a synthetic, controlled problem and on the object detection bounding-box regression task using the COCO and KITTI datasets.
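The histogram-based evaluation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`binned_calibration`, `fit_scale`) and the choice of equal-count bins are assumptions. The idea is to sort examples by predicted standard deviation, group them into bins of similar predicted uncertainty, and compare the mean predicted variance in each bin with the empirical mean squared error on that bin; a simple scaling recalibration then fits a single factor so that scaled variances match the overall empirical error.

```python
import numpy as np

def binned_calibration(y_true, y_pred, sigma_pred, n_bins=10):
    """Group examples into bins of similar predicted uncertainty and
    compare predicted variance with the empirical squared error per bin.
    Returns an (n_bins, 2) array of (mean predicted var, empirical MSE)."""
    order = np.argsort(sigma_pred)           # sort by predicted std
    bins = np.array_split(order, n_bins)     # equal-count bins (an assumption)
    rows = []
    for idx in bins:
        pred_var = np.mean(sigma_pred[idx] ** 2)             # mean predicted variance
        emp_var = np.mean((y_true[idx] - y_pred[idx]) ** 2)  # empirical MSE in the bin
        rows.append((pred_var, emp_var))
    return np.array(rows)

def fit_scale(y_true, y_pred, sigma_pred):
    """One possible scaling-based recalibration: a single factor s such that
    (s * sigma_pred)^2 matches the overall empirical squared error on average."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2) / np.mean(sigma_pred ** 2))
```

For a well-calibrated predictor, the two columns returned by `binned_calibration` should agree bin by bin, and the fitted scale should be close to 1; a constant, non-informative uncertainty output can still pass the overall scaling check, which is why the per-bin comparison matters.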