

Optimal Randomness for Stochastic Configuration Network (SCN) with Heavy-Tailed Distributions

Authors

Niu Haoyu, Wei Jiamin, Chen YangQuan

Affiliations

Electrical Engineering and Computer Science Department, University of California, Merced, CA 95340, USA.

School of Telecommunications Engineering, Xidian University, No.2, Taibai Road, Xi'an 710071, Shaanxi, China.

Publication

Entropy (Basel). 2020 Dec 31;23(1):56. doi: 10.3390/e23010056.

DOI: 10.3390/e23010056
PMID: 33396383
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7823536/
Abstract

Stochastic Configuration Network (SCN) has a powerful capability for regression and classification analysis. Traditionally, it is quite challenging to determine an appropriate architecture for a neural network such that the trained model achieves excellent performance in both learning and generalization. Compared with known randomized learning algorithms for single-hidden-layer feed-forward neural networks, such as Randomized Radial Basis Function (RBF) Networks and Random Vector Functional-link (RVFL), the SCN randomly assigns the input weights and biases of the hidden nodes under a supervisory mechanism. Since the parameters in the hidden layers are randomly generated from a uniform distribution, hypothetically, there is an optimal randomness. Heavy-tailed distributions have shown optimal randomness for finding targets in an unknown environment. Therefore, in this research, the authors used heavy-tailed distributions to randomly initialize weights and biases to see whether the new SCN models can achieve better performance than the original SCN. Heavy-tailed distributions, such as the Lévy distribution, Cauchy distribution, and Weibull distribution, were used. Since some mixed distributions show heavy-tailed properties, the mixed Gaussian and Laplace distributions were also studied in this research work. Experimental results showed improved performance for SCN with heavy-tailed distributions. For the regression model, SCN-Lévy, SCN-Mixture, SCN-Cauchy, and SCN-Weibull used fewer hidden nodes to achieve performance similar to SCN. For the classification model, SCN-Mixture, SCN-Lévy, and SCN-Cauchy achieved higher test accuracies of 91.5%, 91.7%, and 92.4%, respectively, all higher than the test accuracy of the original SCN.
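The abstract's core idea — drawing the hidden-layer input weights and biases from a heavy-tailed distribution instead of the usual uniform one — can be sketched as follows. This is an illustrative sketch, not the authors' code: the Weibull shape parameter (1.5) and the 50/50 Gaussian–Laplace mixture weights are assumptions, and the SCN supervisory inequality that accepts or rejects each candidate node is omitted.

```python
import numpy as np

def init_hidden_params(n_in, n_nodes, dist="uniform", scale=1.0, rng=None):
    """Sample SCN hidden-node input weights W (n_in x n_nodes) and biases b
    (n_nodes,) from the chosen distribution. Illustrative sketch only."""
    if rng is None:
        rng = np.random.default_rng(0)
    shape_w, shape_b = (n_in, n_nodes), (n_nodes,)
    if dist == "cauchy":
        W, b = rng.standard_cauchy(shape_w), rng.standard_cauchy(shape_b)
    elif dist == "weibull":
        # shape parameter 1.5 is an assumed example value
        W, b = rng.weibull(1.5, shape_w), rng.weibull(1.5, shape_b)
    elif dist == "levy":
        # standard Lévy variate via 1 / Z**2 for Z ~ N(0, 1)
        W = 1.0 / rng.standard_normal(shape_w) ** 2
        b = 1.0 / rng.standard_normal(shape_b) ** 2
    elif dist == "mixture":
        # Gaussian-Laplace mixture (heavy-tailed); 50/50 split is assumed
        mask_w = rng.random(shape_w) < 0.5
        W = np.where(mask_w, rng.standard_normal(shape_w),
                     rng.laplace(size=shape_w))
        mask_b = rng.random(shape_b) < 0.5
        b = np.where(mask_b, rng.standard_normal(shape_b),
                     rng.laplace(size=shape_b))
    else:
        # baseline SCN: uniform on [-1, 1]
        W = rng.uniform(-1.0, 1.0, shape_w)
        b = rng.uniform(-1.0, 1.0, shape_b)
    return scale * W, scale * b
```

In the full SCN algorithm, each candidate node drawn this way would still have to pass the supervisory inequality before being added to the network; only the sampling distribution changes between the SCN variants compared in the paper.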


Figures (g001-g010a):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/b9c3f5ec69ca/entropy-23-00056-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/f19d23d63033/entropy-23-00056-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/51608311c244/entropy-23-00056-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/9965a2113bc9/entropy-23-00056-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/f9ef02451deb/entropy-23-00056-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/3dfca14a51ec/entropy-23-00056-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/aea9f4e4cee2/entropy-23-00056-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/edca1df49cfc/entropy-23-00056-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/b6e0c9660652/entropy-23-00056-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/971d/7823536/ab5aef9b9dbe/entropy-23-00056-g010a.jpg

Similar Articles

1. Optimal Randomness for Stochastic Configuration Network (SCN) with Heavy-Tailed Distributions.
Entropy (Basel). 2020 Dec 31;23(1):56. doi: 10.3390/e23010056.
2. Bidirectional stochastic configuration network for regression problems.
Neural Netw. 2021 Aug;140:237-246. doi: 10.1016/j.neunet.2021.03.016. Epub 2021 Mar 18.
3. Stochastic Configuration Networks: Fundamentals and Algorithms.
IEEE Trans Cybern. 2017 Oct;47(10):3466-3479. doi: 10.1109/TCYB.2017.2734043. Epub 2017 Aug 21.
4. An unsupervised parameter learning model for RVFL neural network.
Neural Netw. 2019 Apr;112:85-97. doi: 10.1016/j.neunet.2019.01.007. Epub 2019 Jan 28.
5. COVID-19 X-ray images classification based on enhanced fractional-order cuckoo search optimizer using heavy-tailed distributions.
Appl Soft Comput. 2021 Mar;101:107052. doi: 10.1016/j.asoc.2020.107052. Epub 2020 Dec 24.
6. Modeling and statistical analysis of non-Gaussian random fields with heavy-tailed distributions.
Phys Rev E. 2017 Apr;95(4-1):042114. doi: 10.1103/PhysRevE.95.042114. Epub 2017 Apr 7.
7. FPGA-Based Implementation of Stochastic Configuration Networks for Regression Prediction.
Sensors (Basel). 2020 Jul 28;20(15):4191. doi: 10.3390/s20154191.
8. A regularized stochastic configuration network based on weighted mean of vectors for regression.
PeerJ Comput Sci. 2023 May 30;9:e1382. doi: 10.7717/peerj-cs.1382. eCollection 2023.
9. Why Do Big Data and Machine Learning Entail the Fractional Dynamics?
Entropy (Basel). 2021 Feb 28;23(3):297. doi: 10.3390/e23030297.
10. Stochastic configuration network ensembles with selective base models.
Neural Netw. 2021 May;137:106-118. doi: 10.1016/j.neunet.2021.01.011. Epub 2021 Jan 30.

Cited By

1. Whether the Support Region of Three-Bit Uniform Quantizer Has a Strong Impact on Post-Training Quantization for MNIST Dataset?
Entropy (Basel). 2021 Dec 20;23(12):1699. doi: 10.3390/e23121699.

References

1. Ensemble Stochastic Configuration Networks for Estimating Prediction Intervals: A Simultaneous Robust Training Algorithm and Its Application.
IEEE Trans Neural Netw Learn Syst. 2020 Dec;31(12):5426-5440. doi: 10.1109/TNNLS.2020.2967816. Epub 2020 Nov 30.
2. 2-D Stochastic Configuration Networks for Image Data Analytics.
IEEE Trans Cybern. 2021 Jan;51(1):359-372. doi: 10.1109/TCYB.2019.2925883. Epub 2020 Dec 22.
3. Stochastic Configuration Networks: Fundamentals and Algorithms.
IEEE Trans Cybern. 2017 Oct;47(10):3466-3479. doi: 10.1109/TCYB.2017.2734043. Epub 2017 Aug 21.