Azad Mohammad, Chikalov Igor, Hussain Shahid, Moshkov Mikhail, Zielosko Beata
Department of Computer Science, College of Computer and Information Sciences, Jouf University, Sakaka 72441, Saudi Arabia.
Intel Corporation, 5000 W Chandler Blvd, Chandler, AZ 85226, USA.
Entropy (Basel). 2021 Dec 7;23(12):1641. doi: 10.3390/e23121641.
Conventional decision trees use queries, each of which is based on a single attribute. In this study, we also examine decision trees that can additionally use queries based on hypotheses. This kind of query is similar to the equivalence queries considered in exact learning. Earlier, we designed dynamic programming algorithms for computing the minimum depth and the minimum number of internal nodes in decision trees with hypotheses. The modification of these algorithms considered in the present paper allows us to build decision trees with hypotheses that are optimal with respect to the depth or to the number of internal nodes. We compare the length and coverage of decision rules extracted from optimal decision trees with hypotheses with those of decision rules extracted from optimal conventional decision trees, in order to choose the ones that are preferable as a tool for the representation of information. To this end, we conduct computer experiments on various decision tables from the UCI Machine Learning Repository. In addition, we consider decision tables for randomly generated Boolean functions. The collected results show that, in many cases, the decision rules derived from decision trees with hypotheses are better than the rules extracted from conventional decision trees.
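The following is a minimal sketch, not the authors' implementation, of the classical dynamic programming recursion for the minimum depth of a conventional decision tree over a decision table; the algorithms referred to in the abstract extend this kind of recursion to trees that may also ask hypothesis (equivalence-like) queries. The toy table ROWS and the names N_ATTRS and min_depth are illustrative assumptions, not taken from the paper.

```python
from functools import lru_cache

# Hypothetical toy decision table: each row is (attribute values, decision).
ROWS = (
    ((0, 0, 1), 0),
    ((0, 1, 1), 1),
    ((1, 0, 0), 1),
    ((1, 1, 0), 0),
)
N_ATTRS = 3


def min_depth(row_ids=frozenset(range(len(ROWS)))):
    """Minimum depth of a conventional decision tree for the given subtable."""
    return _min_depth(frozenset(row_ids))


@lru_cache(maxsize=None)
def _min_depth(row_ids):
    decisions = {ROWS[i][1] for i in row_ids}
    if len(decisions) <= 1:          # degenerate subtable: no query needed
        return 0
    best = float("inf")
    for a in range(N_ATTRS):         # try each attribute query
        # Partition the rows of the subtable by the value of attribute a.
        groups = {}
        for i in row_ids:
            groups.setdefault(ROWS[i][0][a], set()).add(i)
        if len(groups) < 2:
            continue                 # attribute gives no information here
        # Depth if we query a first: 1 plus the worst branch.
        depth = 1 + max(_min_depth(frozenset(g)) for g in groups.values())
        best = min(best, depth)
    return best


if __name__ == "__main__":
    print("minimum depth:", min_depth())  # 2 for this toy table
```

The recursion memoizes subtables (sets of row indices): a subtable whose rows share one decision needs depth 0; otherwise the depth is one plus the worst branch of the best attribute query. In the paper's setting, a hypothesis query would add a further branching option at each step, and rules are then read off the root-to-leaf paths of the resulting optimal trees.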