

Computational study of noise in a large signal transduction network.

Author Information

Department of Mathematics, Tampere University of Technology, P.O. Box 553, 33101 Tampere, Finland.

Publication Information

BMC Bioinformatics. 2011 Jun 21;12:252. doi: 10.1186/1471-2105-12-252.

Abstract

BACKGROUND

Biochemical systems are inherently noisy because discrete reaction events occur at random. Although noise is often perceived as a disturbing factor, the system might actually benefit from it. To understand the role of noise better, its properties must be studied quantitatively. Computational analysis and modeling play an essential role in this demanding endeavor.

RESULTS

We implemented a large nonlinear signal transduction network combining the protein kinase C, mitogen-activated protein kinase, phospholipase A2, and phospholipase C β isoform networks. We simulated the network in 300 different cellular volumes using the exact Gillespie stochastic simulation algorithm and analyzed the results in both the time and frequency domains. To complete the simulations in a reasonable time, we used modern parallel computing techniques. The analysis revealed that the time and frequency domain characteristics depend on the system volume. The simulation results also indicated that there are several kinds of noise processes in the network, all of them representing different types of low-frequency fluctuations. In the simulations, the noise power decreased at all frequencies as the system volume was increased.
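The abstract does not reproduce the reaction network itself, so the following is only a minimal sketch of the Gillespie direct method on a hypothetical one-species birth-death process, with the system volume entering the zeroth-order propensity. The species, rate constants, and volumes are illustrative assumptions, not the PKC/MAPK/PLA2/PLCβ model used in the paper.

```python
# Minimal sketch of the Gillespie direct method (SSA) on a birth-death toy model.
# Production fires at rate k1*omega (scales with volume), degradation at rate k2*X.
# All species and rate constants here are illustrative assumptions.
import numpy as np

def gillespie_birth_death(k1=10.0, k2=1.0, omega=1.0, t_end=100.0, seed=0):
    rng = np.random.default_rng(seed)
    t, x = 0.0, 0
    times, counts = [t], [x]
    while t < t_end:
        a1 = k1 * omega                      # zeroth-order production propensity
        a2 = k2 * x                          # first-order degradation propensity
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)       # waiting time to the next reaction
        x += 1 if rng.random() < a1 / a0 else -1   # pick which reaction fires
        times.append(t)
        counts.append(x)
    return np.array(times), np.array(counts)

if __name__ == "__main__":
    # Mimic simulating the same model in several volumes: relative fluctuations
    # (CV) shrink as omega grows. For brevity the statistics below are not
    # time-weighted by the holding times between events.
    for omega in (1.0, 10.0, 100.0):
        t, x = gillespie_birth_death(omega=omega)
        conc = x / omega                     # concentration = copy number / volume
        print(f"omega={omega:6.1f}  mean={conc.mean():.2f}  CV={conc.std()/conc.mean():.3f}")
```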

CONCLUSIONS

We concluded that basic frequency domain techniques can be applied to the analysis of simulation results produced by the Gillespie stochastic simulation algorithm. This approach is suited not only to the study of fluctuations but also to the study of pure noise processes. Noise seems to play an important role in biochemical systems, and its properties can be studied numerically by simulating the reacting system in different cellular volumes. Parallel computing techniques make it possible to run massive simulations in hundreds of volumes and, as a result, accurate statistics can be obtained from computational studies.
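As a rough illustration of how basic frequency domain techniques can be applied to SSA output, the sketch below resamples an event-driven trajectory onto a uniform time grid (zero-order hold) and estimates its power spectrum with Welch's method. The resampling step, grid spacing, and segment length are assumptions; the abstract does not describe the authors' exact analysis pipeline.

```python
# Hedged sketch: power spectral density of an SSA trajectory.
# SSA output is event-driven (irregular timestamps), so one common approach is
# zero-order-hold resampling onto a uniform grid followed by Welch's method.
import numpy as np
from scipy.signal import welch

def ssa_psd(times, counts, dt=0.01, nperseg=4096):
    """Resample an SSA trajectory to a uniform grid and estimate its PSD."""
    grid = np.arange(times[0], times[-1], dt)
    # Zero-order hold: the copy number stays constant between reaction events.
    idx = np.searchsorted(times, grid, side="right") - 1
    signal = counts[idx].astype(float)
    signal -= signal.mean()                  # keep only the fluctuations
    freqs, psd = welch(signal, fs=1.0 / dt, nperseg=nperseg)
    return freqs, psd

# Example use with the birth-death sketch above (illustrative only):
#   t, x = gillespie_birth_death(omega=10.0, t_end=1000.0)
#   f, p = ssa_psd(t, x)
# Increasing omega should lower the PSD at all frequencies, consistent with the
# volume dependence of noise power reported in the results.
```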


Article figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d50b/3142227/b62d88014183/1471-2105-12-252-1.jpg
