Melbourne Brain Centre, 245 Burgundy Street, Heidelberg, Victoria, 3084, Australia.
Stroke. 2012 Apr;43(4):1025-31. doi: 10.1161/STROKEAHA.111.635888. Epub 2012 Feb 16.
In combination with diffusion-weighted imaging, perfusion-weighted imaging parameters are hypothesized to detect tissue at risk of infarction in patients with acute stroke. Recent studies have suggested that in addition to perfusion deficits, vascular flow parameters indicating bolus delay and/or dispersion may also contain important predictive information. This work investigates the infarct risk associated with delay/dispersion using multiparametric predictor models.
Predictor models were developed using specific combinations of perfusion parameters calculated with global arterial input function deconvolution (where perfusion is biased by dispersion), local arterial input function deconvolution (where perfusion has minimal dispersion bias), and parameters approximating bolus delay/dispersion. We also compared predictor models formed using summary parameters (which primarily reflect delay/dispersion). The models were trained on 15 patients with acute stroke imaged at 3 to 6 hours after onset.
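The arterial input function deconvolution described above is commonly implemented as a truncated-SVD inversion of the bolus-tracking convolution model, in which the tissue concentration curve is the AIF convolved with a scaled residue function. The sketch below is a minimal, illustrative version of that standard technique, not the paper's actual pipeline; the function name, regularization threshold, and synthetic curves are all assumptions for demonstration.

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, lambda_rel=0.1):
    """Estimate k(t) = CBF * R(t) from tissue = dt * (aif convolved with k),
    discarding singular values below a fraction of the largest (regularization)."""
    n = len(aif)
    # Lower-triangular convolution matrix: A[i, j] = dt * aif[i - j]
    A = np.zeros((n, n))
    for i in range(n):
        A[i, :i + 1] = aif[i::-1]
    A *= dt
    U, S, Vt = np.linalg.svd(A)
    S_inv = np.where(S > lambda_rel * S[0], 1.0 / S, 0.0)
    return Vt.T @ (S_inv * (U.T @ tissue))

# Synthetic, noiseless example (all values illustrative)
dt = 1.0
t = np.arange(60) * dt
aif = np.exp(-((t - 10.0) / 3.0) ** 2)            # idealized bolus passage
cbf_true, mtt = 0.6, 4.0
residue = cbf_true * np.exp(-t / mtt)             # exponential residue model
tissue = dt * np.convolve(aif, residue)[:len(t)]  # forward convolution model

k = svd_deconvolve(aif, tissue, dt)
cbf_est = k.max()           # CBF is taken as the peak of the deconvolved residue
tmax = t[np.argmax(k)]      # Tmax: a delay-sensitive summary parameter
```

The time-to-maximum of the deconvolved residue (Tmax) illustrates the kind of delay-sensitive summary parameter referred to above; a global AIF measured far from the tissue inflates such delay/dispersion effects, whereas a local AIF minimizes them.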
The global arterial input function models performed significantly better than their local arterial input function counterparts. Furthermore, in a paired comparison, the models including the delay/dispersion parameter performed significantly better than those without. There was no significant difference between the best deconvolution model and the best summary parameter model.
Delay and dispersion information is important for achieving accurate infarct prediction in the acute time window.