Division of Population Sciences and Division of Inpatient Oncology, Dana-Farber Cancer Institute.
J Natl Compr Canc Netw. 2022 Apr;20(4):335-341.e17. doi: 10.6004/jnccn.2021.7047.
Intermittent shortages of chemotherapeutics used to treat curable malignancies are a worldwide problem that increases patient mortality. Although multiple strategies have been proposed for managing these shortages (eg, prioritizing patients by age, by the scarce treatment's efficacy per unit volume, or by the efficacy difference versus alternative treatments), critical clinical dilemmas arise when selecting a management strategy and understanding its impact.
We developed a model to compare the impact of different allocation strategies on overall survival during intermittent chemotherapy shortages and tested it using vincristine, which was recently scarce for 9 months in the United States. Demographic and treatment data were abstracted from 1,689 previously treated patients in our tertiary-care system; alternative regimens were abstracted from the NCCN Clinical Practice Guidelines in Oncology for each disease, and survival probabilities from the studies cited therein. Modeled survival was validated using SEER data. Nine-month shortages were modeled for all possible supply levels. Pairwise differences in 3-year survival and risk reductions were calculated for each strategy compared with standard practice (first-come, first-served) for each 50-mg supply increment, as were supply thresholds above which each strategy maintained survival similar to scenarios without shortages.
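The allocation logic being compared can be illustrated with a minimal sketch. This is not the authors' model; it is a hypothetical greedy allocator in which each strategy ranks patients by a priority key (or preserves arrival order for first-come, first-served), dispenses the scarce drug while supply lasts, and assigns the alternative regimen to the rest. The `Patient` fields, the product used to combine the two priority criteria, and the toy cohort are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    # Hypothetical fields for illustration only
    dose_mg: float        # total scarce-drug requirement over the course (mg)
    survival_with: float  # modeled 3-year survival with the scarce drug
    survival_alt: float   # modeled 3-year survival with the best alternative

def efficacy_per_volume(p: Patient) -> float:
    """Survival benefit per mg of scarce drug consumed."""
    return p.survival_with / p.dose_mg

def alternative_gap(p: Patient) -> float:
    """How much worse the best alternative regimen is."""
    return p.survival_with - p.survival_alt

def allocate(patients, supply_mg, key=None):
    """Greedy allocation: rank by `key` (arrival order if None, i.e. FCFS),
    treat with the scarce drug while supply suffices, else assign alternative."""
    order = patients if key is None else sorted(patients, key=key, reverse=True)
    treated, alt = [], []
    for p in order:
        if p.dose_mg <= supply_mg:
            supply_mg -= p.dose_mg
            treated.append(p)
        else:
            alt.append(p)
    return treated, alt

def expected_survival(treated, alt) -> float:
    """Cohort-level expected 3-year survival under an allocation."""
    n = len(treated) + len(alt)
    return (sum(p.survival_with for p in treated) +
            sum(p.survival_alt for p in alt)) / n
```

Sweeping `supply_mg` over increments and comparing `expected_survival` for each ranking key against the FCFS baseline mirrors, in spirit, the pairwise survival comparisons described above.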
A strategy prioritizing by higher vincristine efficacy per volume and greater alternative treatment efficacy difference performed best, improving survival significantly (P<.01) across 86.5% of possible shortages (relative risk reduction, 8.3%; 99% CI, 8.0-8.5) compared with standard practice. This strategy also maintained survival rates similar to a model without shortages until supply fell below 72.2% of the amount required to treat all patients, compared with 94.3% for standard practice.
During modeled vincristine shortages, prioritizing patients by higher efficacy per volume and alternative treatment efficacy difference significantly improved survival over standard practice. This approach can help optimize allocation as intermittent chemotherapy shortages continue to arise.