Zhang Zhen, Li Yibing, Jin Shanshan, Zhang Zhaoyue, Wang Hui, Qi Lin, Zhou Ruolin
College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China.
College of Air Traffic Management, Civil Aviation University of China, Tianjin 300300, China.
Entropy (Basel). 2018 Mar 16;20(3):198. doi: 10.3390/e20030198.
In this paper, signal recognition theory and algorithms based on information entropy and ensemble learning are proposed. We extract 16 kinds of entropy features from 9 types of modulated signals. Several types of information entropy are used, including Rényi entropy and energy entropy based on the S Transform and the Generalized S Transform. We use three feature selection algorithms, namely sequential forward selection (SFS), sequential forward floating selection (SFFS) and RELIEF-F, to select the optimal feature subset from the 16 entropy features. We use five classifiers, namely k-nearest neighbor (KNN), support vector machine (SVM), AdaBoost, Gradient Boosting Decision Tree (GBDT) and eXtreme Gradient Boosting (XGBoost), to classify the original feature set and the feature subsets selected by the different feature selection algorithms. The simulation results show that the feature subsets selected by the SFS and SFFS algorithms are the best, with a 48% increase in recognition rate over the original feature set when using the KNN classifier and a 34% increase when using the SVM classifier. For the other three classifiers, the original feature set achieves the best recognition performance. The XGBoost classifier has the best recognition performance: the overall recognition rate is 97.74%, and the recognition rate reaches 82% when the signal-to-noise ratio (SNR) is -10 dB.
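Below is a minimal sketch, not the authors' code, of the pipeline the abstract describes: entropy features extracted from modulated signals, sequential forward selection (SFS), and classification. The signal models, the use of simple spectral Shannon/Rényi entropies in place of the paper's 16 features (including S-Transform based ones), and scikit-learn's SequentialFeatureSelector and GradientBoostingClassifier standing in for the paper's SFS and GBDT implementations are all illustrative assumptions.

```python
# Illustrative sketch: toy modulated signals -> spectral entropy features
# -> sequential forward selection (SFS) with a KNN wrapper -> classification.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_signal(mod, n=1024, snr_db=0):
    """Toy 2ASK/2FSK/2PSK baseband signals plus AWGN (assumption, for illustration only)."""
    bits = rng.integers(0, 2, n // 8).repeat(8)
    t = np.arange(n) / n
    if mod == "2ask":
        s = bits * np.cos(2 * np.pi * 64 * t)
    elif mod == "2fsk":
        s = np.cos(2 * np.pi * np.where(bits, 96, 48) * t)
    else:  # "2psk"
        s = np.cos(2 * np.pi * 64 * t + np.pi * bits)
    noise = rng.standard_normal(n)
    noise *= np.sqrt(s.var() / (10 ** (snr_db / 10))) / noise.std()
    return s + noise

def entropy_features(x, alphas=(0.5, 2.0, 3.0)):
    """Shannon and Rényi entropies of the normalized power spectrum
    (a simplification of the paper's 16 entropy features)."""
    p = np.abs(np.fft.rfft(x)) ** 2
    p /= p.sum()
    p = p[p > 0]
    shannon = -np.sum(p * np.log(p))
    renyi = [np.log(np.sum(p ** a)) / (1 - a) for a in alphas]
    return [shannon, *renyi]

# Build a labeled feature matrix over several SNRs.
mods = ["2ask", "2fsk", "2psk"]
X, y = [], []
for label, mod in enumerate(mods):
    for snr in (-10, -5, 0, 5, 10):
        for _ in range(40):
            X.append(entropy_features(make_signal(mod, snr_db=snr)))
            y.append(label)
X, y = np.array(X), np.array(y)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# SFS with a KNN wrapper, one of the selection strategies named in the abstract.
sfs = SequentialFeatureSelector(KNeighborsClassifier(n_neighbors=5),
                                n_features_to_select=2, direction="forward", cv=3)
sfs.fit(Xtr, ytr)
Xtr_s, Xte_s = sfs.transform(Xtr), sfs.transform(Xte)

knn = KNeighborsClassifier(n_neighbors=5).fit(Xtr_s, ytr)   # KNN on the selected subset
gbdt = GradientBoostingClassifier().fit(Xtr, ytr)           # boosting on the full feature set
print("KNN + SFS accuracy :", knn.score(Xte_s, yte))
print("GBDT (all features):", gbdt.score(Xte, yte))
```

This mirrors the abstract's finding at a structural level: wrapper-style feature selection mainly benefits distance-based classifiers such as KNN, while boosting methods can be run directly on the full entropy feature set.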