On the Minimum Description Length Complexity of Multinomial Processing Tree Models.

Author Information

Wu Hao, Myung Jay I, Batchelder William H

Affiliations

The Ohio State University.

Publication Information

J Math Psychol. 2010 Jun;54(3):291-303. doi: 10.1016/j.jmp.2010.02.001.

Abstract

Multinomial processing tree (MPT) modeling is a statistical methodology that has been widely and successfully applied to measure hypothesized latent cognitive processes in selected experimental paradigms. This paper concerns the model complexity of MPT models. Complexity is a key and necessary concept to consider in the evaluation and selection of quantitative models. A complex model with many parameters often overfits data, capturing random noise over and above the underlying regularities, and should therefore be appropriately penalized. Multiple studies have established that, in addition to the number of parameters, a model's functional form, that is, the way its parameters are combined in the model equation, can also have significant effects on complexity. Given that MPT models vary greatly in their functional forms (tree structures and parameter/category assignments), it is of interest to evaluate how these forms affect complexity. Addressing this issue from the minimum description length (MDL) viewpoint, we prove a series of propositions about the various ways in which functional form contributes to the complexity of MPT models. Computational issues of complexity are also discussed.
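
For readers wanting the penalty in symbols: the MDL complexity discussed here is commonly approximated by the Fisher information approximation (FIA) to the normalized maximum likelihood code length, C_FIA = (k/2) ln(n/2π) + ln ∫_Θ √det I(θ) dθ, where k is the number of free parameters, n the sample size, and I(θ) the per-trial Fisher information matrix. Below is a minimal sketch, assuming this FIA form, of how the penalty can be estimated for a small MPT model by Monte Carlo integration; the tree p_example and the helper names fisher_info and fia_complexity are hypothetical illustrations, not code or notation from the paper.

```python
import numpy as np

def fisher_info(p_fn, theta, eps=1e-5):
    """Per-trial Fisher information for a multinomial model:
    I_ab = sum_j (dp_j/dtheta_a)(dp_j/dtheta_b) / p_j."""
    p = p_fn(theta)
    J = np.zeros((len(p), len(theta)))      # Jacobian dp_j / dtheta_a
    for a in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[a] += eps
        tm[a] -= eps
        J[:, a] = (p_fn(tp) - p_fn(tm)) / (2 * eps)   # central differences
    return (J / p[:, None]).T @ J

def fia_complexity(p_fn, k, n, draws=20_000, seed=0):
    """Monte Carlo estimate of the FIA penalty:
    (k/2) ln(n/2pi) + ln of the integral of sqrt(det I) over (0,1)^k."""
    rng = np.random.default_rng(seed)
    lo, hi = 1e-3, 1.0 - 1e-3       # stay off the boundary, where some p_j -> 0
    thetas = rng.uniform(lo, hi, size=(draws, k))
    vals = [np.sqrt(max(np.linalg.det(fisher_info(p_fn, t)), 0.0))
            for t in thetas]
    integral = np.mean(vals) * (hi - lo) ** k   # sample mean times box volume
    return 0.5 * k * np.log(n / (2 * np.pi)) + np.log(integral)

# Hypothetical two-parameter tree (an illustration, not from the paper):
# the root branches on c, the upper branch then branches on r,
# yielding three observable response categories.
def p_example(theta):
    c, r = theta
    return np.array([c * r, c * (1 - r), 1.0 - c])

print(fia_complexity(p_example, k=2, n=100))    # penalty in nats
```

As a check on this sketch, a single-parameter binomial tree has I(θ) = 1/(θ(1−θ)), whose square-root integral over (0, 1) is π, so the penalty reduces to (1/2) ln(n/2π) + ln π; the estimator recovers this value to Monte Carlo accuracy.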

Similar Articles

1. On the Minimum Description Length Complexity of Multinomial Processing Tree Models. J Math Psychol. 2010 Jun;54(3):291-303. doi: 10.1016/j.jmp.2010.02.001.
2. Minimum description length model selection of multinomial processing tree models. Psychon Bull Rev. 2010 Jun;17(3):275-86. doi: 10.3758/PBR.17.3.275.
3. On aggregation invariance of multinomial processing tree models. Behav Res Methods. 2024 Dec;56(8):8677-8694. doi: 10.3758/s13428-024-02497-y. Epub 2024 Oct 14.
4. Extending multinomial processing tree models to measure the relative speed of cognitive processes. Psychon Bull Rev. 2016 Oct;23(5):1440-1465. doi: 10.3758/s13423-016-1025-6.
5. Theoretical and empirical review of multinomial process tree modeling. Psychon Bull Rev. 1999 Mar;6(1):57-86. doi: 10.3758/bf03210812.
6. TreeBUGS: An R package for hierarchical multinomial-processing-tree modeling. Behav Res Methods. 2018 Feb;50(1):264-284. doi: 10.3758/s13428-017-0869-7.
7. Using recursive partitioning to account for parameter heterogeneity in multinomial processing tree models. Behav Res Methods. 2018 Jun;50(3):1217-1233. doi: 10.3758/s13428-017-0937-z.
8. How to develop, test, and extend multinomial processing tree models: A tutorial. Psychol Methods. 2023 Jul 27. doi: 10.1037/met0000561.
9. MPTinR: analysis of multinomial processing tree models in R. Behav Res Methods. 2013 Jun;45(2):560-75. doi: 10.3758/s13428-012-0259-0.

Cited By

1. On aggregation invariance of multinomial processing tree models. Behav Res Methods. 2024 Dec;56(8):8677-8694. doi: 10.3758/s13428-024-02497-y. Epub 2024 Oct 14.
2. Does context recollection depend on the base-rate of contextual features? Cogn Process. 2024 Feb;25(1):9-35. doi: 10.1007/s10339-023-01153-1. Epub 2023 Sep 11.
3. The effects of divided attention at encoding on specific and gist-based associative episodic memory. Mem Cognit. 2022 Jan;50(1):59-76. doi: 10.3758/s13421-021-01196-9. Epub 2021 Jun 21.
4. Computing Bayes factors for evidence-accumulation models using Warp-III bridge sampling. Behav Res Methods. 2020 Apr;52(2):918-937. doi: 10.3758/s13428-019-01290-6.
5. Testing Interactions in Multinomial Processing Tree Models. Front Psychol. 2019 Nov 1;10:2364. doi: 10.3389/fpsyg.2019.02364. eCollection 2019.
6. TreeBUGS: An R package for hierarchical multinomial-processing-tree modeling. Behav Res Methods. 2018 Feb;50(1):264-284. doi: 10.3758/s13428-017-0869-7.
7. Assumptions behind scoring source versus item memory: Effects of age, hippocampal lesions and mild memory problems. Cortex. 2017 Jun;91:297-315. doi: 10.1016/j.cortex.2017.01.001. Epub 2017 Jan 12.
8. Turn around to have a look? Spatial referencing in dorsal vs. frontal settings in cross-linguistic comparison. Front Psychol. 2015 Sep 2;6:1283. doi: 10.3389/fpsyg.2015.01283. eCollection 2015.

References

1. Minimum description length model selection of multinomial processing tree models. Psychon Bull Rev. 2010 Jun;17(3):275-86. doi: 10.3758/PBR.17.3.275.
2. Does response scaling cause the generalized context model to mimic a prototype model? Psychon Bull Rev. 2007 Dec;14(6):1043-50. doi: 10.3758/bf03193089.
3. Common and distinctive features in stimulus similarity: a modified version of the contrast model. Psychon Bull Rev. 2004 Dec;11(6):961-74. doi: 10.3758/bf03196728.
4. A note on the applied use of MDL approximations. Neural Comput. 2004 Sep;16(9):1763-8. doi: 10.1162/0899766041336378.
5. When a good fit can be bad. Trends Cogn Sci. 2002 Oct 1;6(10):421-425. doi: 10.1016/s1364-6613(02)01964-2.
6. Theoretical and empirical review of multinomial process tree modeling. Psychon Bull Rev. 1999 Mar;6(1):57-86. doi: 10.3758/bf03210812.
7. Toward a method of selecting among computational models of cognition. Psychol Rev. 2002 Jul;109(3):472-91. doi: 10.1037/0033-295x.109.3.472.
8. On the Complexity of Additive Clustering Models. J Math Psychol. 2001 Feb;45(1):131-148. doi: 10.1006/jmps.1999.1299.
9. The Importance of Complexity in Model Selection. J Math Psychol. 2000 Mar;44(1):190-204. doi: 10.1006/jmps.1999.1283.
10. Model Selection Based on Minimum Description Length. J Math Psychol. 2000 Mar;44(1):133-152. doi: 10.1006/jmps.1999.1280.
