Conditional max-preserving normalization: an innovative approach to combining diverse classification models


Bibliographic Details
Main Authors: Najafabadi, Amin Arab; Nejati, Faranak; Yap, Ng Keng; Md. Sultan, Abu Bakar; Ali, Mohamed Abdullahi; Ashani, Zahra Nazemi
Format: Article
Language: English
Published: Indonesian Society for Knowledge and Human Development 2024
Online Access: http://psasir.upm.edu.my/id/eprint/118444/
http://psasir.upm.edu.my/id/eprint/118444/1/118444.pdf
_version_ 1848867518222958592
author Najafabadi, Amin Arab
Nejati, Faranak
Yap, Ng Keng
Md. Sultan, Abu Bakar
Ali, Mohamed Abdullahi
Ashani, Zahra Nazemi
author_facet Najafabadi, Amin Arab
Nejati, Faranak
Yap, Ng Keng
Md. Sultan, Abu Bakar
Ali, Mohamed Abdullahi
Ashani, Zahra Nazemi
author_sort Najafabadi, Amin Arab
building UPM Institutional Repository
collection Online Access
description Ensemble learning is a widely recognized technique in Artificial Intelligence that boosts model performance by combining predictions from multiple classifiers. While traditional ensemble methods effectively combine classifiers within the same domain, they struggle to integrate models that handle different tasks. This study introduces Conditional Max-Preserving Normalization, a novel approach that extends the applicability of ensemble methods across diverse classification domains. Rather than altering deep learning architectures, the method preserves the most significant prediction while proportionally scaling the others to keep the combined output consistent. The study used the softmax function to emulate classification tasks, generating probability vectors for both Human-Car and Cat-Dog classifications. The proposed method identifies the highest confidence value in the combined vector, counts its occurrences, sums the remaining values, and computes a Scale Rate to normalize the vector. A comparative evaluation showed that Conditional Max-Preserving Normalization outperforms traditional ensemble methods in maintaining accuracy and reliability across diverse classification tasks. Formal verification with the Z3 solver confirmed the method's robustness, showing that the combined vector remains a valid probability distribution and retains the maximum value. Future research could focus on refining the method to eliminate the conditions applied during normalization, adapting it to binary classification, exploring its use in sequential classification tasks, and extending it to regression problems. This research lays the groundwork for more robust and adaptable ensemble learning models with potential applications in a range of real-world scenarios.
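The description above summarizes the normalization procedure in prose. As an illustrative sketch only, not the authors' published implementation, the following Python snippet reconstructs those steps under the assumption that the Scale Rate rescales the non-maximum entries so that the combined vector sums to one while its largest value is kept unchanged; the function and variable names are hypothetical.

    import numpy as np

    def softmax(logits):
        # Standard softmax; used here only to emulate classifier confidence
        # scores, as the abstract does for the Human-Car and Cat-Dog tasks.
        e = np.exp(logits - np.max(logits))
        return e / e.sum()

    def conditional_max_preserving_normalization(combined, eps=1e-12):
        # Illustrative reconstruction of the steps listed in the abstract:
        # 1) find the highest confidence value in the combined vector,
        # 2) count its occurrences,
        # 3) sum the remaining values,
        # 4) compute a Scale Rate and rescale the non-maximum entries so the
        #    result sums to 1 while the maximum value itself is preserved.
        # The exact conditional rules of the published method may differ.
        v = np.asarray(combined, dtype=float)
        max_val = v.max()                              # step 1
        is_max = np.isclose(v, max_val)
        k = int(is_max.sum())                          # step 2
        rest_sum = float(v[~is_max].sum())             # step 3
        remaining_mass = 1.0 - k * max_val
        if rest_sum > eps and remaining_mass > 0.0:    # condition: scaling is well defined
            scale_rate = remaining_mass / rest_sum     # step 4
            return np.where(is_max, max_val, v * scale_rate)
        return v / v.sum()  # fallback: plain sum-normalization

    # Emulated outputs of two independent two-class models; their concatenation
    # sums to 2 rather than 1, which is what the method corrects.
    human_car = softmax(np.array([2.0, 0.5]))   # roughly [0.82, 0.18]
    cat_dog = softmax(np.array([0.3, 1.1]))     # roughly [0.31, 0.69]
    combined = np.concatenate([human_car, cat_dog])
    normalized = conditional_max_preserving_normalization(combined)
    print(normalized, normalized.sum())  # sums to 1; the 0.82 entry is unchanged

The description also mentions formal verification with the Z3 solver. Again as a sketch under the same assumed scaling rule, the two stated properties (the combined vector remains a valid probability distribution and the maximum value is retained) can be checked for a four-element vector by asking Z3 for a counterexample:

    from z3 import Reals, Solver, Or

    # Symbolic 4-element combined vector: two valid 2-class probability vectors.
    a, b, c, d = Reals('a b c d')
    s = Solver()
    s.add(a >= 0, b >= 0, c >= 0, d >= 0, a + b == 1, c + d == 1)
    # Assume entry 'a' is the strict maximum and the rescaling is well defined.
    s.add(a > b, a > c, a > d, a < 1)
    rest = b + c + d
    scale_rate = (1 - a) / rest
    na, nb, nc, nd = a, b * scale_rate, c * scale_rate, d * scale_rate
    # Look for a counterexample to "sums to 1 and the maximum is retained".
    s.add(Or(na + nb + nc + nd != 1, nb > na, nc > na, nd > na))
    print(s.check())  # expected: unsat, i.e. no counterexample exists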
first_indexed 2025-11-15T14:37:46Z
format Article
id upm-118444
institution Universiti Putra Malaysia
institution_category Local University
language English
last_indexed 2025-11-15T14:37:46Z
publishDate 2024
publisher Indonesian Society for Knowledge and Human Development
recordtype eprints
repository_type Digital Repository
spelling upm-118444 2025-07-10T07:52:06Z
http://psasir.upm.edu.my/id/eprint/118444/
Conditional max-preserving normalization: an innovative approach to combining diverse classification models
Najafabadi, Amin Arab; Nejati, Faranak; Yap, Ng Keng; Md. Sultan, Abu Bakar; Ali, Mohamed Abdullahi; Ashani, Zahra Nazemi
Indonesian Society for Knowledge and Human Development, 2024-12-31. Article, PeerReviewed, text, en, cc_by_sa_4
http://psasir.upm.edu.my/id/eprint/118444/1/118444.pdf
Najafabadi, Amin Arab and Nejati, Faranak and Yap, Ng Keng and Md. Sultan, Abu Bakar and Ali, Mohamed Abdullahi and Ashani, Zahra Nazemi (2024) Conditional max-preserving normalization: an innovative approach to combining diverse classification models. International Journal on Advanced Science, Engineering and Information Technology, 14 (6). pp. 1976-1981. ISSN 2088-5334; eISSN 2460-6952
https://ijaseit.insightsociety.org/index.php/ijaseit/article/view/17344
10.18517/ijaseit.14.6.17344
spellingShingle Najafabadi, Amin Arab
Nejati, Faranak
Yap, Ng Keng
Md. Sultan, Abu Bakar
Ali, Mohamed Abdullahi
Ashani, Zahra Nazemi
Conditional max-preserving normalization: an innovative approach to combining diverse classification models
title Conditional max-preserving normalization: an innovative approach to combining diverse classification models
title_full Conditional max-preserving normalization: an innovative approach to combining diverse classification models
title_fullStr Conditional max-preserving normalization: an innovative approach to combining diverse classification models
title_full_unstemmed Conditional max-preserving normalization: an innovative approach to combining diverse classification models
title_short Conditional max-preserving normalization: an innovative approach to combining diverse classification models
title_sort conditional max-preserving normalization: an innovative approach to combining diverse classification models
url http://psasir.upm.edu.my/id/eprint/118444/
http://psasir.upm.edu.my/id/eprint/118444/1/118444.pdf