Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data

Author Information

She Rui, Liu Shanyun, Fan Pingyi

Affiliation

Department of Electronic Engineering, Tsinghua University, Beijing 30332, China.

Publication

Entropy (Basel). 2018 May 24;20(6):401. doi: 10.3390/e20060401.

DOI: 10.3390/e20060401
PMID: 33265491
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7512920/
Abstract

Information transfer that characterizes information feature variation can have a crucial impact on big data analytics and processing. In fact, a measure of information transfer can reflect system change statistically through the variable distributions, similarly to Kullback-Leibler (KL) divergence and Rényi divergence. Furthermore, to some degree, small-probability events may carry the most important part of the total message in an information transfer of big data. Therefore, it is significant to propose an information transfer measure with respect to message importance from the viewpoint of small-probability events. In this paper, we present the message importance transfer measure (MITM) and analyze its performance and applications in three aspects. First, we discuss the robustness of the MITM by using it to measure information distance. Then, we derive a message importance transfer capacity based on the MITM and give an upper bound for the information transfer process with disturbance. Finally, we apply the MITM to the queue-length selection problem, which is fundamental to caching operations in mobile edge computing.
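The abstract positions the MITM alongside KL and Rényi divergence, with extra weight placed on small-probability events. As a rough illustration of that contrast (not the paper's actual MITM, which is not reproduced here), the sketch below computes discrete KL divergence next to an exponential-weighting statistic of the form log Σ pᵢ·exp(ϖ(1−pᵢ)) used in the related message importance measure (MIM) literature; the importance coefficient ϖ (here `w`) is an assumed free parameter.

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q).

    Conventional term-by-term sum p_i * log(p_i / q_i), skipping
    zero-probability terms of p.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def message_importance(p, w=1.0):
    """MIM-style statistic: log(sum_i p_i * exp(w * (1 - p_i))).

    The exponential weighting amplifies the contribution of
    small-probability events as w grows; w is the (assumed)
    importance coefficient.
    """
    return math.log(sum(pi * math.exp(w * (1.0 - pi)) for pi in p))

# A skewed distribution diverges from the uniform one under KL,
# and for a large importance coefficient its rare event dominates
# the importance statistic.
uniform = [0.5, 0.5]
skewed = [0.9, 0.1]
print(kl_divergence(skewed, uniform))          # strictly positive
print(message_importance(skewed, w=10.0))      # exceeds the uniform case
print(message_importance(uniform, w=10.0))
```

With `w = 10`, the `0.1` event contributes `0.1 * e^9 ≈ 810` to the sum, dwarfing the likely event; with small `w` the statistic stays close to uniform behavior, which is the small-probability emphasis the abstract describes.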


Figures 1-7:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/110a3b62a498/entropy-20-00401-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/a8ef011ec1af/entropy-20-00401-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/06d9477f633a/entropy-20-00401-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/50bb65453519/entropy-20-00401-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/758112f8aff2/entropy-20-00401-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/e055fadbf678/entropy-20-00401-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3e10/7512920/e996136f9d20/entropy-20-00401-g007.jpg

Similar Articles

1. Recognizing Information Feature Variation: Message Importance Transfer Measure and Its Applications in Big Data.
Entropy (Basel). 2018 May 24;20(6):401. doi: 10.3390/e20060401.
2. Attention to the Variation of Probabilistic Events: Information Processing with Message Importance Measure.
Entropy (Basel). 2019 Apr 26;21(5):439. doi: 10.3390/e21050439.
3. Optimum Achievable Rates in Two Random Number Generation Problems with f-Divergences Using Smooth Rényi Entropy.
Entropy (Basel). 2024 Sep 6;26(9):766. doi: 10.3390/e26090766.
4. Revisiting Chernoff Information with Likelihood Ratio Exponential Families.
Entropy (Basel). 2022 Oct 1;24(10):1400. doi: 10.3390/e24101400.
5. Towards an Optimized Distributed Message Queue System for AIoT Edge Computing: A Reinforcement Learning Approach.
Sensors (Basel). 2023 Jun 8;23(12):5447. doi: 10.3390/s23125447.
6. Bayesian similarity searching in high-dimensional descriptor spaces combined with Kullback-Leibler descriptor divergence analysis.
J Chem Inf Model. 2008 Feb;48(2):247-55. doi: 10.1021/ci700333t. Epub 2008 Jan 30.
7. Computation of Kullback-Leibler Divergence in Bayesian Networks.
Entropy (Basel). 2021 Aug 28;23(9):1122. doi: 10.3390/e23091122.
8. Minimising the Kullback-Leibler Divergence for Model Selection in Distributed Nonlinear Systems.
Entropy (Basel). 2018 Jan 23;20(2):51. doi: 10.3390/e20020051.
9. Exact Expressions for Kullback-Leibler Divergence for Multivariate and Matrix-Variate Distributions.
Entropy (Basel). 2024 Aug 4;26(8):663. doi: 10.3390/e26080663.
10. Principles of Bayesian Inference Using General Divergence Criteria.
Entropy (Basel). 2018 Jun 6;20(6):442. doi: 10.3390/e20060442.

Cited By

1. Jeffreys Divergence and Generalized Fisher Information Measures on Fokker-Planck Space-Time Random Field.
Entropy (Basel). 2023 Oct 13;25(10):1445. doi: 10.3390/e25101445.
2. Entropy Analysis of a Flexible Markovian Queue with Server Breakdowns.
Entropy (Basel). 2020 Sep 3;22(9):979. doi: 10.3390/e22090979.
3. Storage Space Allocation Strategy for Digital Data with Message Importance.

References

1. Optimization of CNN through Novel Training Strategy for Visual Classification Problems.
Entropy (Basel). 2018 Apr 17;20(4):290. doi: 10.3390/e20040290.
2. KL Divergence-Based Fuzzy Cluster Ensemble for Image Segmentation.
Entropy (Basel). 2018 Apr 12;20(4):273. doi: 10.3390/e20040273.
3. A Feature Extraction Method Using Improved Multi-Scale Entropy for Rolling Bearing Fault Diagnosis.
Entropy (Basel). 2020 May 25;22(5):591. doi: 10.3390/e22050591.
4. Attention to the Variation of Probabilistic Events: Information Processing with Message Importance Measure.
Entropy (Basel). 2019 Apr 26;21(5):439. doi: 10.3390/e21050439.
5. Matching Users' Preference under Target Revenue Constraints in Data Recommendation Systems.
Entropy (Basel). 2019 Feb 21;21(2):205. doi: 10.3390/e21020205.
6.
Entropy (Basel). 2018 Mar 21;20(4):212. doi: 10.3390/e20040212.
7. Information transfer between dynamical system components.
Phys Rev Lett. 2005 Dec 9;95(24):244101. doi: 10.1103/PhysRevLett.95.244101. Epub 2005 Dec 8.
8. Measuring information transfer.
Phys Rev Lett. 2000 Jul 10;85(2):461-4. doi: 10.1103/PhysRevLett.85.461.