The Use of Output Combiners in Enhancing the Performance of Large Data for ANNs

Bibliographic Details
Format: Restricted Document
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072
date 2014-07-09 10:14:59
format Restricted Document
id 10727
institution UniSZA
originalfilename 4838-01-FH02-FIK-14-00846.jpg
person UniSZA
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=10727
spelling Article Journal. IAENG International Journal of Computer Science, vol. 41, no. 1, pp. 38-47. Private Access. Source image: 4838-01-FH02-FIK-14-00846.jpg (image/jpeg, 1427 x 793 px, 96 x 96 dpi).
summary Deriving classification information from large databases presents several challenges. Current methods for classifying large datasets suffer from long computational times and high complexity. In addition, most methods can handle only selected features of the data, while some can handle only categorical or only numerical attributes. This paper proposes a large-data solution that defines a strategy for classifying large data with local Artificial Neural Network (ANN) processors. A combination technique for reordered ANNs is proposed to model the combination of multiple ANNs as part of the framework. Repeated experiments with different techniques on the MNIST dataset show good classification performance and a reduction in errors, underlining the value of an output combiner as part of a large-data solution.
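The abstract describes combining the outputs of multiple ANNs as the core of the proposed framework. The paper's actual combiner and reordering scheme are not reproduced in this record; the sketch below is only a minimal, hypothetical illustration of an output combiner that averages (optionally with weights) the class-probability outputs of several independently trained networks before taking the final decision. The function name and toy data are assumptions for illustration, not the authors' implementation.

import numpy as np

def combine_outputs(member_probs, weights=None):
    # member_probs: list of (n_samples, n_classes) arrays, one per trained network.
    # weights: optional per-member weights; None gives a simple average.
    stacked = np.stack(member_probs)                    # (n_members, n_samples, n_classes)
    if weights is None:
        weights = np.ones(len(member_probs))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                   # normalise so the result stays a distribution
    combined = np.tensordot(weights, stacked, axes=1)   # weighted average over members
    return combined.argmax(axis=1)                      # final class decision per sample

# Toy usage: three hypothetical members, two samples, three classes.
m1 = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
m2 = np.array([[0.6, 0.3, 0.1], [0.2, 0.6, 0.2]])
m3 = np.array([[0.5, 0.4, 0.1], [0.1, 0.7, 0.2]])
print(combine_outputs([m1, m2, m3]))                    # -> [0 1]

A weighted average of probability outputs is only one of several common combination rules; majority voting over the members' hard decisions is an equally simple alternative under the same interface.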
title The Use of Output Combiners in Enhancing the Performance of Large Data for ANNs