

Guessing with Distributed Encoders.

Authors

Bracher Annina, Lapidoth Amos, Pfister Christoph

Affiliations

P&C Solutions, Swiss Re, 8022 Zurich, Switzerland.

Signal and Information Processing Laboratory, ETH Zurich, 8092 Zurich, Switzerland.

Publication

Entropy (Basel). 2019 Mar 19;21(3):298. doi: 10.3390/e21030298.

DOI: 10.3390/e21030298
PMID: 33267013
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7514780/
Abstract

Two correlated sources emit a pair of sequences, each of which is observed by a different encoder. Each encoder produces a rate-limited description of the sequence it observes, and the two descriptions are presented to a guessing device that repeatedly produces sequence pairs until correct. The number of guesses until correct is random, and it is required that it have a moment (of some prespecified order) that tends to one as the length of the sequences tends to infinity. The description rate pairs that allow this are characterized in terms of the Rényi entropy and the Arimoto-Rényi conditional entropy of the joint law of the sources. This solves the guessing analog of the Slepian-Wolf distributed source-coding problem. The achievability is based on random binning, which is analyzed using a technique by Rosenthal.
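The single-observer (centralized) version of the guessing problem described above can be illustrated with a toy sketch: the optimal guesser tries outcomes in order of decreasing probability, and Arikan's classical bound relates the ρ-th moment of the guess count to the Rényi entropy of order 1/(1+ρ). The pmf, the value of `rho`, and all variable names below are illustrative assumptions, not the paper's distributed scheme:

```python
import math

# Toy joint pmf of a source pair (X, Y) over a tiny alphabet
# (illustrative values, not from the paper).
pmf = {
    ('a', 'a'): 0.4,
    ('a', 'b'): 0.3,
    ('b', 'a'): 0.2,
    ('b', 'b'): 0.1,
}

rho = 1.0  # order of the guessing moment E[G^rho]

# Optimal centralized guessing: try pairs in order of decreasing
# probability; the guess count G of an outcome is its rank (1-based).
ordered = sorted(pmf.items(), key=lambda kv: -kv[1])
moment = sum(p * (rank + 1) ** rho for rank, (_, p) in enumerate(ordered))

# Renyi entropy of order alpha = 1/(1+rho), in nats. Arikan's bound
# lower-bounds E[G^rho] in terms of exp(rho * H_alpha), up to a
# polylogarithmic factor in the alphabet size.
alpha = 1.0 / (1.0 + rho)
renyi = math.log(sum(p ** alpha for p in pmf.values())) / (1.0 - alpha)

print(f"E[G^rho]            = {moment:.4f}")
print(f"exp(rho * H_alpha)  = {math.exp(rho * renyi):.4f}")
```

The distributed setting analyzed in the paper replaces the single observer with two rate-limited encoders; the sketch only shows the quantities (guessing moments and Rényi entropies) that its rate region is stated in terms of.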

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3b78/7514780/08f1ce952b9b/entropy-21-00298-g001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3b78/7514780/fa122d5e1689/entropy-21-00298-g002.jpg

Similar Articles

1. Guessing with Distributed Encoders.
Entropy (Basel). 2019 Mar 19;21(3):298. doi: 10.3390/e21030298.
2. Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression.
Entropy (Basel). 2018 Nov 22;20(12):896. doi: 10.3390/e20120896.
3. Are Guessing, Source Coding and Tasks Partitioning Birds of A Feather?
Entropy (Basel). 2022 Nov 19;24(11):1695. doi: 10.3390/e24111695.
4. Trade-offs between Error Exponents and Excess-Rate Exponents of Typical Slepian-Wolf Codes.
Entropy (Basel). 2021 Feb 24;23(3):265. doi: 10.3390/e23030265.
5. Conditional Rényi Divergences and Horse Betting.
Entropy (Basel). 2020 Mar 11;22(3):316. doi: 10.3390/e22030316.
6. Optimum Achievable Rates in Two Random Number Generation Problems with -Divergences Using Smooth Rényi Entropy.
Entropy (Basel). 2024 Sep 6;26(9):766. doi: 10.3390/e26090766.
7. Source Symbol Purging-Based Distributed Conditional Arithmetic Coding.
Entropy (Basel). 2021 Jul 30;23(8):983. doi: 10.3390/e23080983.
8. Channel-Supermodular Entropies: Order Theory and an Application to Query Anonymization.
Entropy (Basel). 2021 Dec 25;24(1):39. doi: 10.3390/e24010039.
9. Low Complexity Estimation Method of Rényi Entropy for Ergodic Sources.
Entropy (Basel). 2018 Aug 31;20(9):657. doi: 10.3390/e20090657.
10. Coding of correlated sources with prescribed distortion by separated encoders.
Proc Natl Acad Sci U S A. 1980 Oct;77(10):5618-9. doi: 10.1073/pnas.77.10.5618.

Cited By

1. Conditional Rényi Divergences and Horse Betting.
Entropy (Basel). 2020 Mar 11;22(3):316. doi: 10.3390/e22030316.

References

1. Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression.
Entropy (Basel). 2018 Nov 22;20(12):896. doi: 10.3390/e20120896.