《计算机应用研究》|Application Research of Computers

Optimization algorithm for neural network based on RUSBoost and correlation coefficient

Authors Yin Huarong, Chen Li, Zhang Yongxin, Chen Dandan
Affiliation School of Information Science & Technology, Northwest University, Xi'an 710127, China
Article ID 1001-3695(2018)09-2592-05
DOI 10.3969/j.issn.1001-3695.2018.09.007
Abstract To address the low classification accuracy of a single neural network and the long training time of the RUSBoost algorithm when it is used to improve NN classifier accuracy, this paper proposes a classification optimization algorithm that combines RUSBoost with the Pearson product-moment correlation coefficient. First, the RUSBoost algorithm generates m training sets; then, the Pearson product-moment coefficient measures the degree of correlation between the attributes of each training set and redundant attributes are removed, producing the target training sets; finally, the new sub-training sets are used to train neural network classifiers, and the classifier with the highest accuracy is selected as the final classification model. Four benchmark data sets were used to verify the effectiveness of the proposed algorithm. The experimental results show that, compared with traditional algorithms, the proposed algorithm improves accuracy by up to 8.26% and reduces training time by up to 62.27%.
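The two data-level steps described in the abstract, RUSBoost-style random undersampling of the majority class to build m balanced training sets, and Pearson product-moment screening to discard redundant attributes, can be illustrated compactly. The sketch below is not the authors' implementation; it is a minimal plain-Python illustration, with hypothetical function names (`rus_subsets`, `drop_redundant`), an assumed binary 0/1 labelling, and an assumed correlation threshold of 0.95.

```python
import random
from statistics import mean

def pearson(x, y):
    """Pearson product-moment correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def drop_redundant(X, threshold=0.95):
    """Return indices of attributes to keep: an attribute is dropped when it is
    highly correlated (|r| >= threshold) with an attribute already kept."""
    kept = []
    for j in range(len(X[0])):
        col_j = [row[j] for row in X]
        if all(abs(pearson(col_j, [row[k] for row in X])) < threshold for k in kept):
            kept.append(j)
    return kept

def rus_subsets(y, m, seed=0):
    """RUSBoost-style step: build m balanced index sets by randomly
    undersampling the majority class (label 0) to the minority size (label 1)."""
    rng = random.Random(seed)
    minority = [i for i, label in enumerate(y) if label == 1]
    majority = [i for i, label in enumerate(y) if label == 0]
    return [minority + rng.sample(majority, len(minority)) for _ in range(m)]

# Toy data: the third attribute is exactly twice the first, hence redundant.
X = [[1, 5, 2], [2, 3, 4], [3, 8, 6], [4, 1, 8], [5, 4, 10], [6, 7, 12]]
y = [1, 0, 0, 1, 0, 0]
print(drop_redundant(X))    # → [0, 1]: column 2 is dropped as redundant
print(rus_subsets(y, m=3))  # three balanced index sets, each of size 4
```

In the full pipeline, each balanced subset would be reduced to the kept attributes and used to train one neural network classifier, and the classifier with the highest accuracy would be retained as the final model.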
Keywords neural network; RUSBoost; correlation coefficient; ensemble learning
Funding National Natural Science Foundation of China (61502219); China Postdoctoral Science Foundation (2015M582697)
Article URL http://www.arocmag.com/article/01-2018-09-007.html
Received 2017-05-08
Revised 2017-06-21
Pages 2592-2596
CLC number TP391
Document code A