Comparison of diverse ensemble neural network for large data classification

Bibliographic Details
Format: Restricted Document
_version_ 1860797411249618944
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072
date 2015-12-30 12:07:15
format Restricted Document
id 12608
institution UniSZA
internalnotes [1] Bottou, L. and Cun, Y. L. 2004. Large Scale Online Learning, Advances in neural information processing systems, Vol.16, 217-224. [2] Cervantes, J., Li, X., Yu, W., and Li, K. 2008. Support Vector Machine Classification for Large Data Sets via Minimum Enclosing Ball Clustering, Neurocomputing, Vol.71, 611-619. [3] Tulunay, Y., Tulunay, E., and Senalp, E. T. 2004. The Neural Network Technique - General Exposition, Advances in Space Research, 983-987. [4] Anthony, M. and Bartlett, P. L. 2009. Neural network learning: Theoretical foundations: Cambridge University Press. [5] Windeatt, T. 2008. Ensemble MLP Classifier Design, in Computational Intelligence Paradigms. vol. 137, L. Jain, M. Sato-Ilic, M. Virvou, G. Tsihrintzis, V. Balas, and C. Abeynayake, Eds., ed: Springer Berlin Heidelberg, pp. 133-147. [6] Torres-Sospedra, J., Hernández-Espinosa, C., and Fernández-Redondo, M. 2011. Introducing Reordering Algorithms to Classic Well-Known Ensembles to Improve Their Performance, in Neural Information Processing. vol. 7063, B.-L. Lu, L. Zhang, and J. Kwok, Eds., ed: Springer Berlin Heidelberg, pp. 572-579. [7] Javadi, M., Ebrahimpour, R., Sajedin, A., Faridi, S., and Zakernejad, S. 2011. Improving ECG Classification Accuracy using an Ensemble of Neural Network Modules, PLoS ONE, Vol.6, [8] Salkhordeh H., M., Vahedian, A., and Sadoghi Y., H. 2012. Making Diversity Enhancement Based on Multiple Classifier System by Weight Tuning, Neural Processing Letters, Vol.35, 61-80. [9] Ceamanosa, X., Waske, B., Benediktsson, J. A., Chanussot, J., Fauvele, M., and Sveinsson, J. R. 2010. A Classifier Ensemble Based on Fusion of Support Vector Machines for Classifying Hyperspectral Data, International Journal of Image and Data Fusion, Vol.1, 293–307. [10] Polikar, R. 2012. Ensemble Learning, in Ensemble Machine Learning: Methods and Applications, C. Zhang and Y. Ma, Eds., ed: Springer US, pp. 1-34. [11] Ulaş, A., Semerci, M., Yıldız, O. T., and Alpaydın, E. 2009.
Incremental construction of classifier and discriminant ensembles, Information Sciences, Vol.179, 1298-1318. [12] Feng, L., Yao, Y., Jin, B., and Chen, Q. 2012. An improved incremental learning model for network data stream classification problems, Journal of Convergence Information Technology, Vol.7, [13] Kotsiantis, S. 2011. An incremental ensemble of classifiers, Artificial Intelligence Review, Vol.36, 249-266. [14] Jain, L., Sato-Ilic, M., Virvou, M., Tsihrintzis, G., Balas, V., Abeynayake, C., and Windeatt, T. 2008. Ensemble MLP Classifier Design, in Computational Intelligence Paradigms. vol. 137, T. Windeatt, Ed., ed: Springer Berlin Heidelberg, pp. 133-147. [15] Hanafizadeh, P., Parvin, E. S., Asadolahi, P., and Gholami, N. 2008. Ensemble Strategies to Build Neural Network to Facilitate Decision Making, Journal of Industrial Engineering International, Vol.4, 32-38. [16] Yu, L., Wang, S., and Lai, K. K. 2008. Credit risk assessment with a multistage neural network ensemble learning approach, Expert Systems with Applications, Vol.34, 1434-1444. [17] Huang, F., Xie, G., and Xiao, R. 2009. Research on Ensemble Learning, International Conference on Artificial Intelligence and Computational Intelligence, pp. 249-252. [18] Wang, S. and Yao, X. 2013. Relationships between diversity of classification ensembles and single-class performance measures, IEEE Transactions on Knowledge and Data Engineering, Vol.25, 206-219. [19] Fernández, C., Valle, C., Saravia, F., and Allende, H. 2012. Behavior Analysis of Neural Network Ensemble Algorithm on A Virtual Machine Cluster, Neural Computing & Applications, Vol.21, 535-542. [20] Minaei-Bidgoli, B., Parvin, H., Alinejad-Rokny, H., Alizadeh, H., and Punch, W. F. 2014. Effects of Resampling Method and Adaptation on Clustering Ensemble Efficacy, Artificial Intelligence Review, Vol.41, 1-22. [21] Kuncheva, L. and Rodríguez, J. 2014.
A Weighted Voting Framework for Classifiers Ensembles, Knowledge and Information Systems, Vol.38, 259-275. [22] Ciresan, D. C., Meier, U., and Schmidhuber, J. 2012. Multi-column Deep Neural Networks for Image Classification, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3642-3649. [23] Alizadeh, H., Behrouz, M.-B., and Parvin, H. 2014. To Improve the Quality of Cluster Ensembles by Selecting a Subset of Base Clusters, Journal of Experimental & Theoretical Artificial Intelligence, Vol.26, 127-150. [24] Zhou, Z.-H. and Li, N. 2010. Multi-information Ensemble Diversity, in Multiple Classifier Systems. vol. 5997, N. E. Gayar, J. Kittler, and F. Roli, Eds., ed Cairo: Springer Berlin Heidelberg, pp. 134-144. [25] Woźniak, M., Graña, M., and Corchado, E. 2014. A survey of multiple classifier systems as hybrid systems, Information Fusion, Vol.16, 3-17. [26] Yang, L. 2011. Classifiers selection for ensemble learning based on accuracy and diversity, Procedia Engineering, Vol.15, 4266-4270. [27] Yang, Y., Xu, Y., and Zhu, Q.-x. 2010. A neural network ensemble method with new definition of diversity based on output error curve. in Proceedings of Intelligent Computing and Integrated Systems (ICISS), 2010 International Conference on, 587-590. [28] Breiman, L. 1996. Bagging predictors, Machine Learning, Vol.24, 123-140. [29] Mohamad, M., Saman, M. Y. M., and Hitam, M. S. 2014. The Use of Output Combiners in Enhancing the Performance of Large Data for ANNs, IAENG International Journal of Computer Science, Vol.41, 38-47. [30] Li, G., Shi, J., and Zhou, J. 2011.
Bayesian adaptive combination of short-term wind speed forecasts from neural network models, Renewable Energy, Vol.36, 352-359. [31] Re, M. and Valentini, G. 2010. Simple ensemble methods are competitive with state-of-the-art data integration methods for gene function prediction, Journal of Machine Learning Research, Vol.8, 98-111. [32] Ahmad, Z. and Zhang, J. 2009. Selective combination of multiple neural networks for improving model prediction in nonlinear systems modelling through forward selection and backward elimination, Neurocomputing, Vol.72, 1198-1204.
originalfilename 6915-01-FH02-FIK-15-04679.jpg
person norman
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12608
spelling 12608 https://intelek.unisza.edu.my/intelek/pages/view.php?ref=12608 https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072 Restricted Document Article Journal image/jpeg 1423x771 6915-01-FH02-FIK-15-04679.jpg UniSZA Private Access Comparison of diverse ensemble neural network for large data classification International Journal of Advances in Soft Computing and its Applications 7 3 International Center for Scientific Research and Studies 67-84
spellingShingle Comparison of diverse ensemble neural network for large data classification
summary In large dataset classification, the number of attributes commonly evolves over time, and many dynamic learning strategies have been proposed to cope with this, such as the ensemble network and the incremental neural network. An ensemble network is a learning paradigm in which many neural networks are jointly used to solve a problem. The relationship between the ensemble and its component neural networks is analyzed in the context of classification within an integrated framework. This analysis reveals that it may be better to use many neural networks rather than a single incremental neural network. Most ensemble approaches use entirely different classifiers for prediction; an appropriate neural network can then be selected from the set of available ensemble members. Thus, a Distributed Reordering Technique (DRT) is proposed. DRT is an enhanced algorithm based on distributed randomization over different neural networks. Weights are randomly assigned to the networks and allowed to evolve, so that they characterize each neural network's fitness in contributing to a better result. The integrated ensemble framework is supported by the selection of neural networks based on the outputs and weights that make up the ensemble. The experimental study shows that, compared with ensemble approaches such as Bagging, DRT can generate a neural network with enhanced performance and stronger generalization ability. Furthermore, the use of DRT for neural network classifiers is practical and relevant to large-scale classification systems, and can be applied to data of different large dimensions in future work.
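The restricted document does not expose the DRT algorithm itself, so the following is only an illustrative sketch of the general scheme the summary describes: a pool of diverse base learners built on resampled data, combination weights sampled at random, and the best-scoring weight assignment retained for a weighted soft vote. Every name, the synthetic data, and the use of tiny logistic-regression "networks" in place of real neural networks are assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a large labelled dataset: 600 points, 5 attributes.
X = rng.normal(size=(600, 5))
y = (X @ rng.normal(size=5) + 0.3 * rng.normal(size=600) > 0).astype(int)
X_tr, y_tr, X_va, y_va = X[:400], y[:400], X[400:], y[400:]

def train_member(X, y, epochs=200, lr=0.1):
    """Train one tiny logistic-regression 'network' by full-batch gradient descent."""
    w = rng.normal(scale=0.1, size=X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))     # sigmoid outputs
        w -= lr * X.T @ (p - y) / len(y)       # gradient of the log loss
    return w

# Diverse ensemble members: each trained on a bootstrap resample (Bagging-style),
# the baseline the summary compares DRT against.
members = [train_member(X_tr[idx], y_tr[idx])
           for idx in (rng.integers(0, len(X_tr), size=len(X_tr)) for _ in range(7))]

def ensemble_predict(weights, X):
    """Weighted soft vote: weighted mean of member probabilities, thresholded at 0.5."""
    probs = np.array([1.0 / (1.0 + np.exp(-(X @ w))) for w in members])
    return (weights @ probs > 0.5 * weights.sum()).astype(int)

# Randomly sample member weights and keep the best-scoring assignment on a
# validation split -- a crude stand-in for the evolved weight assignment and
# member selection that the summary attributes to DRT.
best_w, best_acc = None, -1.0
for _ in range(200):
    cand = rng.random(len(members))
    acc = float((ensemble_predict(cand, X_va) == y_va).mean())
    if acc > best_acc:
        best_w, best_acc = cand, acc
```

The key design point this sketch illustrates is that diversity comes from the resampled members while the combination weights are tuned separately against held-out data, so a poor member can be effectively silenced by a near-zero weight without retraining anything.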
title Comparison of diverse ensemble neural network for large data classification
title_full Comparison of diverse ensemble neural network for large data classification
title_fullStr Comparison of diverse ensemble neural network for large data classification
title_full_unstemmed Comparison of diverse ensemble neural network for large data classification
title_short Comparison of diverse ensemble neural network for large data classification
title_sort comparison of diverse ensemble neural network for large data classification