Mads Ry Vogel Jørgensen, Helle Svendsen, Mette Stokkebro Schmøkel, Jacob Overgaard, Bo Brummerstedt Iversen
Center for Materials Crystallography, Department of Chemistry and iNANO, Aarhus University, Aarhus C, Denmark.
Acta Crystallogr A. 2012 Mar;68(Pt 2):301-3; discussion 304. doi: 10.1107/S0108767312003066. Epub 2012 Feb 16.
Recently Henn & Meindl [Acta Cryst. (2010), A66, 676-684] examined the significance of Bragg diffraction data through the descriptor W = I^(1/2)/σ(I). In the Poisson limit for the intensity errors W equals unity, but any kind of data processing (background subtraction, integration, scaling, absorption correction, Lorentz and polarization correction, etc.) introduces additional error on top of any remaining systematic errors, and thus the significance of processed Bragg diffraction data is expected to fall below the Poisson limit (W(Bragg) < 1). Curiously, Henn & Meindl observed W(Bragg) values larger than one for several data sets. In the present study this is shown to be an artefact of neglecting a data scale factor applied to the standard uncertainties, and corrected values of W(Bragg) for Bragg data on an absolute scale are presented, all of which are smaller than unity. Furthermore, the error-estimation models employed by two commonly used data-processing programs {SADABS (Bruker AXS Inc., Madison, Wisconsin, USA) and SORTAV [Blessing (1997). J. Appl. Cryst. 30, 421-426]} are examined. It is shown that the empirical error model in SADABS very significantly lowers the significance of the Bragg data and also produces very strange error distributions, as observed by Henn & Meindl. On the other hand, error estimation based on the variance of a population of abundant intensity data, as used in SORTAV, provides reasonable error estimates that are only slightly less significant than the raw data. Given that modern area detectors make the measurement of highly redundant data relatively straightforward, it is concluded that the latter is the best approach to data processing.