Wienk K J, Marx J J, Beynen A C
Department of Laboratory Animal Science, Utrecht University, The Netherlands.
Eur J Nutr. 1999 Apr;38(2):51-75. doi: 10.1007/s003940050046.
This review gives a broad overview of historical and current methods for assessing iron bioavailability. These methods can be divided into iron solubility studies, iron absorption studies, endpoint measures, and arithmetic models; the pros and cons of each are discussed. First, studies on in vitro and in vivo iron solubility are described. The disadvantages of solubility studies include the impossibility of measuring the absorption or incorporation of iron; furthermore, only the solubility of nonheme iron, not heme iron, can be studied. Second, we focus on iron absorption studies (using native iron, radioiron, or stable iron isotopes), in which balance techniques, whole-body counting, or postabsorption plasma iron measurements can be applied. In vitro determination of iron absorption using intestinal loops or cell lines is also discussed in this part. For absorption studies using animals, duodenal loops, gut sacs, or Caco-2 cells, the difficulty of extrapolating the results to the human situation seems to be the major drawback. Chemical balance in man is a good, but laborious and expensive, way to study iron absorption. Whole-body counting has the disadvantages of radiation exposure and reliance on a single meal. Measurement of the plasma iron response does not seem to be of great value in determining nutritional iron bioavailability. The next part deals with endpoint measures; according to the definition of iron bioavailability, these methods give the best estimate of it. In animals, the hemoglobin-repletion bioassay is most often used, whereas most studies in humans monitor the fate of radioisotopes or stable isotopes of iron in blood. Repletion bioassays using rats or other animals are of limited use because the accuracy of extrapolation to man is unknown.
The use of the rat as a model for iron bioavailability appears to be empirically based, and there are many reasons to consider the rat an obsolete model in this respect. The double-isotope technique is probably the best predictor of iron bioavailability in humans; its disadvantages are the single-meal basis and, where radioisotopes are used, the exposure to radiation. Finally, some arithmetic models are described. These models are based on data from iron bioavailability studies and can predict the bioavailability of iron from a meal.