


Attention to the Variation of Probabilistic Events: Information Processing with Message Importance Measure.

Authors

She Rui, Liu Shanyun, Fan Pingyi

Affiliation

Beijing National Research Center for Information Science and Technology, Department of Electronic Engineering, Tsinghua University, Beijing 100084, China.

Publication

Entropy (Basel). 2019 Apr 26;21(5):439. doi: 10.3390/e21050439.

DOI: 10.3390/e21050439
PMID: 33267153
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7514929/
Abstract

Different probabilities of events attract different attention in many scenarios such as anomaly detection and security systems. To characterize the events' importance from a probabilistic perspective, the message importance measure (MIM) is proposed as a kind of semantics analysis tool. Similar to Shannon entropy, the MIM has its special function in information representation, in which the parameter of MIM plays a vital role. Actually, the parameter dominates the properties of MIM, based on which the MIM has three work regions where this measure can be used flexibly for different goals. When the parameter is positive but not large enough, the MIM not only provides a new viewpoint for information processing but also has some similarities with Shannon entropy in the information compression and transmission. In this regard, this paper first constructs a system model with message importance measure and proposes the message importance loss to enrich the information processing strategies. Moreover, the message importance loss capacity is proposed to measure the information importance harvest in a transmission. Furthermore, the message importance distortion function is discussed to give an upper bound of information compression based on the MIM. Additionally, the bitrate transmission constrained by the message importance loss is investigated to broaden the scope for Shannon information theory.
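The abstract contrasts the MIM with Shannon entropy: both summarize a probability distribution, but the MIM's parameter exponentially amplifies the contribution of small-probability (rare) events. A minimal sketch of this contrast, assuming the MIM takes the form L(P, ϖ) = log Σᵢ pᵢ·exp(ϖ(1 − pᵢ)) with importance coefficient ϖ, as introduced in the authors' earlier MIM work (the functional form here is an assumption, not stated in this abstract):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats: H(P) = -sum_i p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def mim(p, w=1.0):
    """Message importance measure, assumed form:
    L(P, w) = log( sum_i p_i * exp(w * (1 - p_i)) ).
    Rare events (small p_i) get exponentially larger weights,
    so for large w the MIM is dominated by minority events."""
    return math.log(sum(pi * math.exp(w * (1 - pi)) for pi in p))

uniform = [0.25] * 4                  # no rare events
skewed = [0.97, 0.01, 0.01, 0.01]     # three rare events

# Shannon entropy ranks the uniform distribution highest,
# while for a large importance coefficient the MIM ranks the
# skewed distribution higher, reflecting its rare events.
print(shannon_entropy(uniform), shannon_entropy(skewed))
print(mim(uniform, w=15.0), mim(skewed, w=15.0))
# For the uniform case, mim(uniform, w) = 0.75 * w exactly.
```

This illustrates the "work regions" idea in the abstract: the behavior of the measure is dominated by the parameter choice, and only for suitably small positive ϖ does the MIM behave analogously to Shannon entropy.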


Similar Articles

1. Attention to the Variation of Probabilistic Events: Information Processing with Message Importance Measure.
   Entropy (Basel). 2019 Apr 26;21(5):439. doi: 10.3390/e21050439.
2. Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data.
   Entropy (Basel). 2018 May 24;20(6):401. doi: 10.3390/e20060401.
3. Matching Users' Preference under Target Revenue Constraints in Data Recommendation Systems.
   Entropy (Basel). 2019 Feb 21;21(2):205. doi: 10.3390/e21020205.
4. Information theory and the ethylene genetic network.
   Plant Signal Behav. 2011 Oct;6(10):1483-98. doi: 10.4161/psb.6.10.16424. Epub 2011 Oct 1.
5. Storage Space Allocation Strategy for Digital Data with Message Importance.
   Entropy (Basel). 2020 May 25;22(5):591. doi: 10.3390/e22050591.
6. A psychophysical theory of Shannon entropy.
   Neuro Endocrinol Lett. 2013;34(7):615-7.
7. Globally Variance-Constrained Sparse Representation and Its Application in Image Set Coding.
   IEEE Trans Image Process. 2018 Aug;27(8):3753-3765. doi: 10.1109/TIP.2018.2823546.
8. Elucidating the link between binding statistics and Shannon information in biological networks.
   J Chem Phys. 2024 Sep 28;161(12). doi: 10.1063/5.0226904.
9. Rényi entropy measure of noise-aided information transmission in a binary channel.
   Phys Rev E Stat Nonlin Soft Matter Phys. 2010 May;81(5 Pt 1):051112. doi: 10.1103/PhysRevE.81.051112. Epub 2010 May 12.
10. An Information Entropy-Based Modeling Method for the Measurement System.
   Entropy (Basel). 2019 Jul 15;21(7):691. doi: 10.3390/e21070691.

Cited By

1. Jeffreys Divergence and Generalized Fisher Information Measures on Fokker-Planck Space-Time Random Field.
   Entropy (Basel). 2023 Oct 13;25(10):1445. doi: 10.3390/e25101445.
2. Robust, practical and comprehensive analysis of soft compression image coding algorithms for big data.
   Sci Rep. 2023 Feb 2;13(1):1958. doi: 10.1038/s41598-023-29068-z.
3. Information Theoretic Measures and Their Applications.

References

1. Category Theory for Autonomous and Networked Dynamical Systems.
   Entropy (Basel). 2019 Mar 20;21(3):302. doi: 10.3390/e21030302.
2. Is the Voronoi Entropy a True Entropy? Comments on "Entropy, Shannon's Measure of Information and Boltzmann's H-Theorem", 2017, , 48.
   Entropy (Basel). 2019 Mar 6;21(3):251. doi: 10.3390/e21030251.
3. Matching Users' Preference under Target Revenue Constraints in Data Recommendation Systems.
   Entropy (Basel). 2020 Dec 7;22(12):1382. doi: 10.3390/e22121382.
4. Anomaly Detection for Individual Sequences with Applications in Identifying Malicious Tools.
   Entropy (Basel). 2020 Jun 12;22(6):649. doi: 10.3390/e22060649.
5. Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data.
   Entropy (Basel). 2018 May 24;20(6):401. doi: 10.3390/e20060401.
6. Optimization of CNN through Novel Training Strategy for Visual Classification Problems.
   Entropy (Basel). 2018 Apr 17;20(4):290. doi: 10.3390/e20040290.
7. KL Divergence-Based Fuzzy Cluster Ensemble for Image Segmentation.
   Entropy (Basel). 2018 Apr 12;20(4):273. doi: 10.3390/e20040273.
8. A Feature Extraction Method Using Improved Multi-Scale Entropy for Rolling Bearing Fault Diagnosis.
   Entropy (Basel). 2018 Mar 21;20(4):212. doi: 10.3390/e20040212.
9. A measure of the concentration of rare events.
   Sci Rep. 2016 Aug 31;6:32369. doi: 10.1038/srep32369.