Jacobs R A
Department of Brain and Cognitive Sciences, University of Rochester, NY 14627, USA.
Neural Comput. 1995 Sep;7(5):867-88. doi: 10.1162/neco.1995.7.5.867.
This article reviews statistical techniques for combining multiple probability distributions. The framework is that of a decision maker who consults several experts regarding some events. The experts express their opinions in the form of probability distributions. The decision maker must aggregate the experts' distributions into a single distribution that can be used for decision making. Two classes of aggregation methods are reviewed. When using a supra-Bayesian procedure, the decision maker treats the expert opinions as data that may be combined with the decision maker's own prior distribution via Bayes' rule. When using a linear opinion pool, the decision maker forms a linear combination of the expert opinions. The major feature that makes the aggregation of expert opinions difficult is the high correlation or dependence that typically occurs among these opinions. A theme of this paper is the need for training procedures that result in experts with relatively independent opinions, or for aggregation methods that implicitly or explicitly model the dependence among the experts. Analyses are presented showing that m dependent experts are worth the same as k independent experts, where k ≤ m. In some cases, an exact value for k can be given; in other cases, lower and upper bounds can be placed on k.
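The linear opinion pool described in the abstract can be sketched in a few lines: each expert reports a probability distribution over the same events, and the decision maker forms a convex combination of them. The specific distributions and weights below are hypothetical illustrations, not values from the paper.

```python
import numpy as np

# Hypothetical expert opinions: each row is one expert's probability
# distribution over the same three events.
experts = np.array([
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.5, 0.3, 0.2],
])

# Weights the decision maker assigns to the experts (nonnegative, sum to 1).
weights = np.array([0.5, 0.3, 0.2])

# Linear opinion pool: a weighted (convex) combination of the experts'
# distributions. Because the weights sum to 1 and each row sums to 1,
# the result is itself a valid probability distribution.
pooled = weights @ experts
print(pooled)
```

Note that a convex combination always yields a proper distribution, but it cannot express opinions more extreme than the most extreme expert; this is one motivation for the supra-Bayesian alternative, in which the expert reports are treated as likelihood evidence and combined with the decision maker's prior via Bayes' rule.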