


An empirical cross-validation of denoising filters for PRNU extraction.

Affiliation

PDPM Indian Institute of Information Technology, Design & Manufacturing Jabalpur, MP 482005, India.

Publication

Forensic Sci Int. 2018 Nov;292:110-114. doi: 10.1016/j.forsciint.2018.09.017. Epub 2018 Sep 26.

DOI: 10.1016/j.forsciint.2018.09.017
PMID: 30292935
Abstract

The present work is an empirical study of the importance of widely used PRNU-extraction de-noising filters at different stages of the source camera identification procedure. The work is unique in that it offers guidance on choosing an appropriate de-noising filter at PRNU-extraction time, both for forming a digital camera's unique identification pattern and for extracting the noise residue of a query image. Based on empirical observations, we also suggest the best values of σ (noise variance) for each of these two steps. The study was performed to determine which part (camera unique identification pattern, noise residue, enhancement methods, or the value of σ) most strongly dominates the performance of source camera identification.

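The pipeline the abstract describes — de-noise an image, subtract to get a noise residue, average residues to build a camera fingerprint, then correlate a query image's residue against that fingerprint — can be sketched as follows. This is a minimal illustration, not the paper's implementation: a Gaussian filter stands in for the wavelet-based de-noising filters the paper cross-validates, and the function names are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter


def noise_residue(img, sigma=2.0):
    """Noise residue = image minus its de-noised version.

    A Gaussian filter is used here as a stand-in for the de-noising
    filters compared in the paper; sigma controls smoothing strength.
    """
    img = np.asarray(img, dtype=np.float64)
    return img - gaussian_filter(img, sigma=sigma)


def camera_fingerprint(images, sigma=2.0):
    """Average the residues of many images from one camera: scene
    content cancels out and the sensor's PRNU pattern remains."""
    return np.mean([noise_residue(im, sigma) for im in images], axis=0)


def match_score(query_img, fingerprint, sigma=2.0):
    """Normalized cross-correlation between the query image's residue
    and a candidate camera's fingerprint (high score = likely match)."""
    r = noise_residue(query_img, sigma)
    r = r - r.mean()
    f = fingerprint - fingerprint.mean()
    return float((r * f).sum() / (np.linalg.norm(r) * np.linalg.norm(f)))
```

The paper's question is which of these pieces (the filter used for the fingerprint, the filter used for the query residue, the σ value at each stage, and any enhancement applied) contributes most to identification accuracy.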

Similar Articles

1. An empirical cross-validation of denoising filters for PRNU extraction.
   Forensic Sci Int. 2018 Nov;292:110-114. doi: 10.1016/j.forsciint.2018.09.017. Epub 2018 Sep 26.
2. Image features dependant correlation-weighting function for efficient PRNU based source camera identification.
   Forensic Sci Int. 2018 Apr;285:111-120. doi: 10.1016/j.forsciint.2018.02.005. Epub 2018 Feb 15.
3. Improved photo response non-uniformity (PRNU) based source camera identification.
   Forensic Sci Int. 2013 Mar 10;226(1-3):132-41. doi: 10.1016/j.forsciint.2012.12.018. Epub 2013 Jan 9.
4. Beyond PRNU: Learning Robust Device-Specific Fingerprint for Source Camera Identification.
   Sensors (Basel). 2022 Oct 17;22(20):7871. doi: 10.3390/s22207871.
5. A Stress Test for Robustness of Photo Response Nonuniformity (Camera Sensor Fingerprint) Identification on Smartphones.
   Sensors (Basel). 2023 Mar 25;23(7):3462. doi: 10.3390/s23073462.
6. Source Camera Identification with a Robust Device Fingerprint: Evolution from Image-Based to Video-Based Approaches.
   Sensors (Basel). 2023 Aug 24;23(17):7385. doi: 10.3390/s23177385.
7. Forensic use of photo response non-uniformity of imaging sensors and a counter method.
   Opt Express. 2014 Jan 13;22(1):470-82. doi: 10.1364/OE.22.000470.
8. Factors that Influence PRNU-Based Camera-Identification via Videos.
   J Imaging. 2021 Jan 13;7(1):8. doi: 10.3390/jimaging7010008.
9. Camera-identification and common-source identification: The correlation values of mismatches.
   Forensic Sci Int. 2019 Aug;301:46-54. doi: 10.1016/j.forsciint.2019.05.008. Epub 2019 May 10.
10. Source-anchored, trace-anchored, and general match score-based likelihood ratios for camera device identification.
   J Forensic Sci. 2022 May;67(3):975-988. doi: 10.1111/1556-4029.14991. Epub 2022 Feb 6.

Cited By

1. Interpol review of imaging and video 2016-2019.
   Forensic Sci Int Synerg. 2020 May 30;2:540-562. doi: 10.1016/j.fsisyn.2020.01.017. eCollection 2020.