Ono Shuji
Fujifilm Corporation, Kaisei 250-8577, Ashigara-kami, Kanagawa, Japan.
J Imaging. 2025 Mar 21;11(4):93. doi: 10.3390/jimaging11040093.
In this study, we propose a color-based multispectral approach using four selected wavelengths (453, 556, 668, and 708 nm) from the visible to near-infrared range to separate clothing from the background. Our goal is to develop a human detection camera that supports real-time processing, particularly under daytime conditions and for common fabrics. While conventional deep learning methods can detect humans accurately, they often require large computational resources and struggle with partially occluded objects. In contrast, we treat clothing detection as a proxy for human detection and construct a lightweight machine learning model (multi-layer perceptron) based on these four wavelengths. Without relying on full spectral data, this method achieves an accuracy of 0.95, precision of 0.97, recall of 0.93, and an F1-score of 0.95. Because our color-driven detection relies on pixel-wise spectral reflectance rather than spatial patterns, it remains computationally efficient. A simple four-band camera configuration could thus facilitate real-time human detection. Potential applications include pedestrian detection in autonomous driving, security surveillance, and disaster victim searches.
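The abstract describes a pixel-wise classification scheme: a lightweight multi-layer perceptron labels each pixel as clothing or background from its reflectance at the four selected bands (453, 556, 668, and 708 nm). The sketch below illustrates that idea only; the data, labels, network size, and training settings are placeholder assumptions, not the authors' published configuration.

```python
# Illustrative sketch of per-pixel, four-band clothing-vs-background classification
# with a small MLP (scikit-learn). All data and hyperparameters are assumptions.

import numpy as np
from sklearn.neural_network import MLPClassifier

# X: per-pixel reflectance samples, one column per band (453, 556, 668, 708 nm)
# y: 1 = clothing, 0 = background (in practice, labels come from annotated captures)
rng = np.random.default_rng(0)
X = rng.random((1000, 4))            # placeholder reflectance data
y = (X[:, 3] > X[:, 2]).astype(int)  # placeholder labels for illustration only

# A small network keeps inference cheap enough for per-pixel, real-time use.
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(X, y)

# At run time, a 4-band frame of shape (H, W, 4) is flattened to (H*W, 4),
# classified pixel by pixel, and reshaped back into a detection mask.
frame = rng.random((120, 160, 4))
mask = clf.predict(frame.reshape(-1, 4)).reshape(120, 160)
```

Because each pixel is classified independently from its spectral signature rather than from spatial patterns, the per-frame cost scales linearly with pixel count and needs no large convolutional backbone.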