Review of the research landscape of multi-criteria evaluation and benchmarking processes for many-objective optimisation methods: coherent taxonomy, challenges and recommended solution


Bibliographic Details
Main Authors: Mohammed, R. T., Yaakob, R., Zaidan, A. A., Sharef, N. M., Abdullah, R. H., Zaidan, B. B., Dawood, K. A.
Format: Article
Published: World Scientific 2020
Online Access: http://psasir.upm.edu.my/id/eprint/87482/
Description: Evaluation and benchmarking of many-objective optimization (MaOO) methods are complicated. The rapid development of new optimization algorithms for solving problems with many objectives has increased the need for performance indicators or metrics that evaluate solution quality and allow fair comparison of competing optimization algorithms. Further investigation is required to highlight the limitations of how criteria/metrics are determined and how consistent the procedures are with the evaluation and benchmarking processes of MaOO. A review is conducted in this study to map the research landscape of multi-criteria evaluation and benchmarking processes for MaOO into a coherent taxonomy. Contentious and challenging issues related to evaluation are then highlighted, and the performance of optimization algorithms for MaOO is benchmarked. The methodological aspects of the evaluation and selection of MaOO algorithms are presented as the recommended solution, organised into four distinct and successive phases. First, in the determination phase, the evaluation criteria of MaOO are collected, classified and grouped for testing experts’ consensus on the most suitable criteria. Second, the identification phase establishes a decision matrix via a crossover of the ‘evaluation criteria’ and ‘MaOO methods’, and the level of importance of each selected criterion and sub-criterion from phase one is computed to obtain its weight value using the best–worst method (BWM). Third, the development phase creates a decision matrix for MaOO selection on the basis of the integrated BWM and VIKOR method. Last, the validation phase validates the proposed solution.
Record ID: upm-87482
Institution: Universiti Putra Malaysia (UPM Institutional Repository)
Citation: Mohammed, R. T., Yaakob, R., Zaidan, A. A., Sharef, N. M., Abdullah, R. H., Zaidan, B. B. and Dawood, K. A. (2020) Review of the research landscape of multi-criteria evaluation and benchmarking processes for many-objective optimisation methods: coherent taxonomy, challenges and recommended solution. International Journal of Information Technology and Decision Making, 19 (6), pp. 1619-1693. ISSN 0219-6220.
Peer Reviewed: Yes
DOI: 10.1142/S0219622020300049
Publisher Link: https://www.worldscientific.com/doi/abs/10.1142/S0219622020300049
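The abstract above names two computational steps, weighting the evaluation criteria with the best–worst method (BWM) and ranking the MaOO algorithms with VIKOR, without giving formulas or data. The sketch below is a minimal, illustrative Python rendering of those two generic techniques, not the authors' implementation: the linear BWM model is solved as a small linear program, and VIKOR computes the group-utility (S), individual-regret (R) and compromise (Q) measures. All criterion names, comparison vectors and scores are invented toy values.

```python
# Illustrative sketch only: linear BWM for criteria weights + VIKOR for ranking.
# Toy data; not taken from the paper.
import numpy as np
from scipy.optimize import linprog


def bwm_weights(best_to_others, others_to_worst, best_idx, worst_idx):
    """Linear BWM: minimise xi subject to |w_B - a_Bj*w_j| <= xi and
    |w_j - a_jW*w_W| <= xi, with the weights summing to one."""
    n = len(best_to_others)
    c = np.zeros(n + 1)          # variables: w_0 .. w_{n-1}, xi
    c[-1] = 1.0                  # objective: minimise xi
    A_ub, b_ub = [], []
    for j in range(n):
        if j != best_idx:        # |w_B - a_Bj * w_j| <= xi  (two one-sided rows)
            r = np.zeros(n + 1)
            r[best_idx], r[j], r[-1] = 1.0, -best_to_others[j], -1.0
            s = -r.copy(); s[-1] = -1.0
            A_ub += [r, s]; b_ub += [0.0, 0.0]
        if j != worst_idx:       # |w_j - a_jW * w_W| <= xi
            r = np.zeros(n + 1)
            r[j], r[-1] = 1.0, -1.0
            r[worst_idx] -= others_to_worst[j]
            s = -r.copy(); s[-1] = -1.0
            A_ub += [r, s]; b_ub += [0.0, 0.0]
    A_eq = [[1.0] * n + [0.0]]   # weights sum to 1 (xi excluded)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[:n], res.x[-1]  # weights and consistency indicator xi*


def vikor(matrix, weights, benefit, v=0.5):
    """VIKOR: S = weighted group utility, R = individual regret, Q = compromise index."""
    X = np.asarray(matrix, dtype=float)
    f_best = np.where(benefit, X.max(axis=0), X.min(axis=0))
    f_worst = np.where(benefit, X.min(axis=0), X.max(axis=0))
    d = (f_best - X) / (f_best - f_worst)    # 0 at the best value, 1 at the worst
    S = (weights * d).sum(axis=1)
    R = (weights * d).max(axis=1)
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return S, R, Q                           # rank by ascending Q (smaller is better)


if __name__ == "__main__":
    # Hypothetical criteria: hypervolume (benefit), IGD (cost), runtime in seconds (cost).
    weights, xi_star = bwm_weights(best_to_others=[1, 2, 4], others_to_worst=[4, 2, 1],
                                   best_idx=0, worst_idx=2)
    scores = np.array([[0.82, 0.10, 30.0],   # candidate algorithm A
                       [0.78, 0.08, 12.0],   # candidate algorithm B
                       [0.90, 0.15, 55.0]])  # candidate algorithm C
    S, R, Q = vikor(scores, weights, benefit=np.array([True, False, False]))
    print("criteria weights:", np.round(weights, 3), " consistency xi*:", round(xi_star, 3))
    print("VIKOR Q (lower is better):", np.round(Q, 3))
```

In the linear BWM model, the optimal objective value xi* doubles as a consistency indicator (values near zero mean the pairwise comparisons are nearly consistent), and VIKOR ranks alternatives by ascending Q, so the algorithm with the smallest Q is the recommended compromise choice.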