Department of Oncology, Helsinki University Central Hospital, POB-180, 00029 HUS, Finland.
Phys Med Biol. 2011 Apr 7;56(7):2119-30. doi: 10.1088/0031-9155/56/7/014. Epub 2011 Mar 9.
We estimated cost/benefit ratios for different quality control (QC) programs of radiation output measurements of medical linear accelerators. The cost/benefit ratio of a QC program (a combination of output measurement time interval and measurement action level) was defined as workload divided by achievable dose accuracy. Dose accuracy was assumed to be inversely proportional to the 99% confidence limit of shifts of total treatment doses, and workload to be inversely proportional to the output measurement time interval. Our previously reported method was used to estimate the distribution of shifts of total treatment doses caused by changes in accelerator radiation output (Gy/MU). The confidence limits of the dose shifts were estimated for different QC programs and for different levels of output measurement reproducibility. The output shifts used in the estimations had been observed for four linear accelerators over 5 years. We observed that the cost/benefit ratio increases markedly when the output measurement time interval is shorter than 1 month, and that the ratio depends strongly on the action levels and the reproducibility of the QC measurements; improving these factors can improve the cost/benefit ratio severalfold. The most cost-effective output measurement time interval for achieving 99% confidence limits of ±2%, ±2.5% or ±3% for the dose shifts ranged from 0.25 month to 6 months, depending on the factors above and the intended accuracy level. It is several times more cost-effective to increase dose accuracy by lowering the action levels of the QC measurements and by improving their reproducibility than by simply shortening the output measurement time interval. Methods that improve the utilization and interpretation of QC measurement results play a key role in further optimizing cost/benefit ratios in dosimetric QC.
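To illustrate the cost/benefit definition used above, the following is a minimal toy Monte Carlo sketch, not the authors' published estimation method or data: output deviation is modeled as a daily random walk, checked at a given interval with a finite measurement reproducibility, and recalibrated whenever the measured deviation exceeds the action level. All numerical values (drift magnitude, reproducibility, action level, course length) are illustrative assumptions; the cost/benefit ratio is taken as workload (∝ 1/interval) divided by accuracy (∝ 1/CL99), i.e. proportional to CL99/interval.

```python
# Toy sketch of the cost/benefit idea: workload / accuracy ~ CL99 / interval.
# Drift model and all parameter values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def confidence_limit_99(interval_days, action_level_pct, sigma_meas_pct,
                        drift_sigma_pct=0.05, course_days=35, n_courses=20000):
    """99% confidence limit (in %) of the total treatment-dose shift for a
    random-walk output drift checked every `interval_days` days."""
    output = np.zeros(n_courses)   # output deviation from nominal, in %
    total = np.zeros(n_courses)    # accumulated daily deviations
    for day in range(course_days):
        output += rng.normal(0.0, drift_sigma_pct, n_courses)   # slow drift
        if day % interval_days == 0:
            # QC measurement with finite reproducibility
            measured = output + rng.normal(0.0, sigma_meas_pct, n_courses)
            # recalibrate outputs found outside the action level
            output[np.abs(measured) > action_level_pct] = 0.0
        total += output
    shifts = total / course_days   # mean deviation ~ shift of total course dose
    return np.percentile(np.abs(shifts), 99)

for interval in (7, 14, 30, 90):   # output-check interval in days
    cl = confidence_limit_99(interval, action_level_pct=1.0, sigma_meas_pct=0.3)
    ratio = cl / (interval / 30.0)  # cost/benefit ~ CL99 / interval (months)
    print(f"interval {interval:3d} d: CL99 = {cl:4.2f} %, cost/benefit ~ {ratio:4.2f}")
```

Under these toy assumptions the same qualitative behaviour appears: shortening the interval below about a month raises the workload term much faster than it tightens the confidence limit, whereas tightening the action level or the measurement reproducibility improves the confidence limit at no extra workload.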