
Neural Network Used for the Fusion of Predictions Obtained by the K-Nearest Neighbors Algorithm Based on Independent Data Sources.

Authors

Przybyła-Kasperek Małgorzata, Marfo Kwabena Frimpong

Affiliation

Institute of Computer Science, Faculty of Science and Technology, University of Silesia in Katowice, Bȩdzińska 39, 41-200 Sosnowiec, Poland.

Publication

Entropy (Basel). 2021 Nov 25;23(12):1568. doi: 10.3390/e23121568.

DOI: 10.3390/e23121568
PMID: 34945874
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8700412/
Abstract

The article concerns the problem of classification based on independent data sets (local decision tables). The aim of the paper is to propose a classification model for dispersed data using a modified k-nearest neighbors algorithm and a neural network. A neural network, more specifically a multilayer perceptron, is used to combine the prediction results obtained based on local tables. Prediction results are stored at the measurement level and generated using a modified k-nearest neighbors algorithm. The task of the neural network is to combine these results and provide a common prediction. In the article, various structures of neural networks (different numbers of neurons in the hidden layer) are studied, and the results are compared with those generated by other fusion methods, such as majority voting, the Borda count method, the sum rule, the method based on decision templates, and the method based on the theory of evidence. Based on the obtained results, it was found that the neural network always generates unambiguous decisions, which is a great advantage, as most of the other fusion methods generate ties. Moreover, if only unambiguous results are considered, the neural network gives much better results than the other fusion methods. If ambiguity is allowed, some fusion methods are slightly better, but only because they may generate several candidate decisions for the test object.
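The fusion scheme described in the abstract can be sketched as follows: each independent local table trains its own k-NN classifier, the per-class support fractions (the measurement level) are collected from every table, and a fusion rule combines them. This is an illustrative sketch, not the authors' implementation; all data, names, and parameters are assumptions, and only the simple sum-rule and majority-voting baselines mentioned in the abstract are computed here (an MLP such as scikit-learn's `MLPClassifier` would be trained on the concatenated support vectors).

```python
import numpy as np

rng = np.random.default_rng(0)
n_classes = 3

def knn_supports(X_train, y_train, x, k=5):
    """Measurement-level output of k-NN: the fraction of the k nearest
    neighbours of x that belong to each class."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    return np.bincount(nearest, minlength=n_classes) / k

# Three independent local decision tables (dispersed data), each with
# its own objects and attributes (here: 60 objects, 4 attributes).
local_tables = [(rng.normal(size=(60, 4)), rng.integers(0, n_classes, 60))
                for _ in range(3)]

# For one test object, collect the support vector from every local k-NN
# (one local view of the object per table).
test_views = [rng.normal(size=4) for _ in range(3)]
supports = np.stack([knn_supports(X, y, x)
                     for (X, y), x in zip(local_tables, test_views)])

# Sum rule: add the support vectors, pick the class with the largest total.
sum_rule_decision = supports.sum(axis=0).argmax()

# Majority voting: each local classifier votes for its top class.
votes = supports.argmax(axis=1)
majority_decision = np.bincount(votes, minlength=n_classes).argmax()
```

An MLP fusion, as studied in the paper, would replace the fixed rules above with a trained mapping from the flattened support matrix to a class, which is why it can always break ties that leave the sum rule or majority voting ambiguous.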


Similar Articles

1
Neural Network Used for the Fusion of Predictions Obtained by the K-Nearest Neighbors Algorithm Based on Independent Data Sources.
Entropy (Basel). 2021 Nov 25;23(12):1568. doi: 10.3390/e23121568.
2
Study on the Use of Artificially Generated Objects in the Process of Training MLP Neural Networks Based on Dispersed Data.
Entropy (Basel). 2023 Apr 24;25(5):703. doi: 10.3390/e25050703.
3
A Fast Exact k-Nearest Neighbors Algorithm for High Dimensional Search Using k-Means Clustering and Triangle Inequality.
Proc Int Jt Conf Neural Netw. 2012 Feb 8;43(6):2351-2358. doi: 10.1016/j.patcog.2010.01.003.
4
Optimizing neural networks for medical data sets: A case study on neonatal apnea prediction.
Artif Intell Med. 2019 Jul;98:59-76. doi: 10.1016/j.artmed.2019.07.008. Epub 2019 Jul 25.
5
New Classification Method for Independent Data Sources Using Pawlak Conflict Model and Decision Trees.
Entropy (Basel). 2022 Nov 4;24(11):1604. doi: 10.3390/e24111604.
6
Improving Neural-Network Classifiers Using Nearest Neighbor Partitioning.
IEEE Trans Neural Netw Learn Syst. 2017 Oct;28(10):2255-2267. doi: 10.1109/TNNLS.2016.2580570. Epub 2016 Jun 30.
7
Application of K-nearest neighbors algorithm on breast cancer diagnosis problem.
Proc AMIA Symp. 2000:759-63.
8
Orthodontic Treatment Planning based on Artificial Neural Networks.
Sci Rep. 2019 Feb 14;9(1):2037. doi: 10.1038/s41598-018-38439-w.
9
A Comparative Analysis of Machine/Deep Learning Models for Parking Space Availability Prediction.
Sensors (Basel). 2020 Jan 6;20(1):322. doi: 10.3390/s20010322.
10
Cleft prediction before birth using deep neural network.
Health Informatics J. 2020 Dec;26(4):2568-2585. doi: 10.1177/1460458220911789. Epub 2020 Apr 14.

Cited By

1
A multi-layer perceptron neural network for varied conditional attributes in tabular dispersed data.
PLoS One. 2024 Dec 2;19(12):e0311041. doi: 10.1371/journal.pone.0311041. eCollection 2024.
2
Study on the Use of Artificially Generated Objects in the Process of Training MLP Neural Networks Based on Dispersed Data.
Entropy (Basel). 2023 Apr 24;25(5):703. doi: 10.3390/e25050703.
3
2022 Multiple-country Monkeypox Outbreak and Its Importation Risk into China: An Assessment Based on the Risk Matrix Method.
Biomed Environ Sci. 2022 Oct 20;35(10):878-887. doi: 10.3967/bes2022.115.

References

1
Probabilistic Predictions with Federated Learning.
Entropy (Basel). 2020 Dec 30;23(1):41. doi: 10.3390/e23010041.
2
Averaging Is Probably Not the Optimum Way of Aggregating Parameters in Federated Learning.
Entropy (Basel). 2020 Mar 11;22(3):314. doi: 10.3390/e22030314.
4
Deep Learning Artificial Intelligence to Predict the Need for Tracheostomy in Patients of Deep Neck Infection Based on Clinical and Computed Tomography Findings-Preliminary Data and a Pilot Study.
Diagnostics (Basel). 2022 Aug 12;12(8):1943. doi: 10.3390/diagnostics12081943.