A new soft set based pruning algorithm for ensemble method

Bibliographic Details
Format: Restricted Document
_version_ 1860797545827008512
building INTELEK Repository
collection Online Access
collectionurl https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072
date 2016-07-14 08:59:57
format Restricted Document
id 13181
institution UniSZA
internalnotes
[1] Dietterich, T. G., "Ensemble methods in machine learning," in Multiple Classifier Systems, Springer Berlin Heidelberg, 2000, pp. 1-15.
[2] Breiman, L., "Bagging predictors," Machine Learning, 24(2), 1996, pp. 123-140.
[3] Freund, Y., & Schapire, R. E., "Experiments with a new boosting algorithm," in ICML, Vol. 96, July 1996, pp. 148-156.
[4] Breiman, L., "Stacked regressions," Machine Learning, 24(1), 1996, pp. 49-64.
[5] Wang, H., Fan, W., Yu, P. S., & Han, J., "Mining concept-drifting data streams using ensemble classifiers," in Proceedings of the Ninth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, August 2003, pp. 226-235.
[6] Rodriguez, J. J., Kuncheva, L. I., & Alonso, C. J., "Rotation forest: A new classifier ensemble method," IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(10), 2006, pp. 1619-1630.
[7] Caruana, R., Niculescu-Mizil, A., Crew, G., & Ksikes, A., "Ensemble selection from libraries of models," in Proceedings of the Twenty-First International Conference on Machine Learning, ACM, July 2004, p. 18.
[8] Tsoumakas, G., Partalas, I., & Vlahavas, I., "A taxonomy and short review of ensemble selection," in Workshop on Supervised and Unsupervised Ensemble Methods and Their Applications, July 2008.
[9] Partalas, I., Tsoumakas, G., Katakis, I., & Vlahavas, I., "Ensemble pruning using reinforcement learning," in Advances in Artificial Intelligence, Springer Berlin Heidelberg, 2006, pp. 301-310.
[10] Martínez-Muñoz, G., Hernández-Lobato, D., & Suárez, A., "An analysis of ensemble pruning techniques based on ordered aggregation," IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(2), 2009, pp. 245-259.
[11] Caruana, R., Munson, A., & Niculescu-Mizil, A., "Getting the most out of ensemble selection," in Sixth International Conference on Data Mining (ICDM'06), IEEE, 2006, pp. 828-833.
[12] Cruz, R. M., Sabourin, R., Cavalcanti, G. D., & Ren, T. I., "META-DES: A dynamic ensemble selection framework using meta-learning," Pattern Recognition, 48(5), 2015, pp. 1925-1935.
[13] Taghavi, Z. S., & Sajedi, H., "Ensemble pruning based on oblivious Chained Tabu Searches," International Journal of Hybrid Intelligent Systems, 12(3), 2016, pp. 131-143.
[14] Fürnkranz, J., & Widmer, G., "Incremental reduced error pruning," in Proceedings of the 11th International Conference on Machine Learning (ML-94), 1994, pp. 70-77.
[15] Margineantu, D. D., & Dietterich, T. G., "Pruning adaptive boosting," in ICML, Vol. 97, July 1997, pp. 211-218.
[16] Schapire, R. E., & Singer, Y., "BoosTexter: A boosting-based system for text categorization," Machine Learning, 39(2), 2000, pp. 135-168.
[17] Strehl, A., & Ghosh, J., "Cluster ensembles -- a knowledge reuse framework for combining multiple partitions," The Journal of Machine Learning Research, 3, 2003, pp. 583-617.
[18] Topchy, A., Jain, A. K., & Punch, W., "Clustering ensembles: Models of consensus and weak partitions," IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(12), 2005, pp. 1866-1881.
[19] Bakker, B., & Heskes, T., "Clustering ensembles of neural network models," Neural Networks, 16(2), 2003, pp. 261-269.
[20] Zhang, Y., Burer, S., & Street, W. N., "Ensemble pruning via semi-definite programming," The Journal of Machine Learning Research, 7, 2006, pp. 1315-1338.
[21] Chen, H., Tino, P., & Yao, X., "A probabilistic ensemble pruning algorithm," in Sixth IEEE International Conference on Data Mining Workshops (ICDM Workshops 2006), IEEE, 2006, pp. 878-882.
[22] Molodtsov, D., "Soft set theory -- first results," Computers & Mathematics with Applications, 37(4), 1999, pp. 19-31.
[23] Maji, P. K., Biswas, R., & Roy, A., "Soft set theory," Computers & Mathematics with Applications, 45(4), 2003, pp. 555-562.
[24] Herawan, T., & Deris, M. M., "A direct proof of every rough set is a soft set," in 2009 Third Asia International Conference on Modelling & Simulation, IEEE, May 2009, pp. 119-124.
[25] Skowron, A., & Rauszer, C., "The discernibility matrices and functions in information systems," in Intelligent Decision Support, Springer Netherlands, 1992, pp. 331-362.
[26] Kong, Z., Gao, L., Wang, L., & Li, S., "The normal parameter reduction of soft sets and its algorithm," Computers & Mathematics with Applications, 56(12), 2008, pp. 3029-3037.
[27] Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., & Witten, I. H., "The WEKA data mining software: An update," ACM SIGKDD Explorations Newsletter, 11(1), 2009, pp. 10-18.
originalfilename 7489-01-FH02-FIK-16-06166.jpg
person norman
recordtype oai_dc
resourceurl https://intelek.unisza.edu.my/intelek/pages/view.php?ref=13181
spelling 13181 https://intelek.unisza.edu.my/intelek/pages/view.php?ref=13181 https://intelek.unisza.edu.my/intelek/pages/search.php?search=!collection407072 Restricted Document Article Journal image/jpeg 1425x775 norman 2016-07-14 08:59:57 7489-01-FH02-FIK-16-06166.jpg UniSZA Private Access A new soft set based pruning algorithm for ensemble method Journal of Theoretical and Applied Information Technology, Vol. 88, No. 3, Asian Research Publishing Network, pp. 384-391
summary Ensemble methods have been introduced as a useful and effective way to improve classification performance. Although they can produce high classification accuracy, ensemble methods suffer significantly from the large number of base classifiers they maintain. This problem can be mitigated by pruning some of the classifiers in the ensemble repository, yet only a few studies have focused on ensemble pruning algorithms. This paper therefore aims to increase classification accuracy while minimizing the number of ensemble classifiers by constructing a new ensemble pruning method (SSPM) based on dimensionality reduction in soft set theory. Ensemble pruning reduces the number of predictive models in order to improve both efficiency and predictive performance, and soft set theory has proven to be an effective mathematical tool for dimension reduction. We therefore propose a novel soft set based method that prunes classifiers from a heterogeneous ensemble committee and selects the best subset of component classifiers prior to the combination process. The results show that the proposed method not only reduces the number of ensemble members but also produces the highest prediction accuracy.
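The summary above describes SSPM only at a high level, and the full text is restricted, so the Python sketch below is not the authors' algorithm but a minimal illustration of the kind of mechanism it builds on: treat each base classifier as a parameter of a Boolean-valued soft set over validation instances (entry 1 if the classifier labels the instance correctly), then apply a reduction in the spirit of Kong et al.'s normal parameter reduction [26], removing the largest set of classifiers whose combined correctness counts are uniform across instances and therefore do not change the ranking induced by the soft set. All function names, the brute-force search, and the toy table are hypothetical choices made for this sketch.

```python
from itertools import combinations
import numpy as np

def soft_set_table(classifiers, X_val, y_val):
    """Boolean soft set over validation data: rows are objects
    (instances), columns are parameters (classifiers), and an entry
    is 1 iff that classifier predicts that instance correctly.
    Assumes scikit-learn-style objects with a .predict method."""
    return np.column_stack(
        [clf.predict(X_val) == y_val for clf in classifiers]
    ).astype(int)

def normal_parameter_reduction(table):
    """Return the largest set of columns whose per-row sums are equal
    for every row.  Such columns add the same amount to every object's
    choice value, so (in the sense of Kong et al.) they can be dropped
    without changing the ranking the soft set induces.  Brute force,
    so only suitable for small ensembles."""
    n_cols = table.shape[1]
    for size in range(n_cols - 1, 0, -1):        # prefer the largest removable set
        for cols in combinations(range(n_cols), size):
            sums = table[:, list(cols)].sum(axis=1)
            if np.all(sums == sums[0]):          # uniform contribution to every row
                return list(cols)
    return []                                    # nothing removable

# Toy 4-instance x 5-classifier correctness table (no real classifiers needed).
T = np.array([[1, 1, 1, 1, 0],
              [1, 0, 1, 0, 1],
              [0, 1, 1, 0, 1],
              [1, 1, 1, 1, 0]])
removable = normal_parameter_reduction(T)
kept = [j for j in range(T.shape[1]) if j not in removable]
```

In the toy table, classifiers 0, 1, 2, and 4 together contribute the same correctness count to every instance, so the reduction keeps only classifier 3; a realistic pipeline would build `T` from held-out validation data via `soft_set_table` and combine only the kept classifiers.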
title A new soft set based pruning algorithm for ensemble method