Using the symmetrical Tau criterion for feature selection decision tree and neural network learning

The data collected for various domain purposes usually contains some features irrelevant to the concept being learned. The presence of these features interferes with the learning mechanism and as a result the predicted models tend to be more complex and less accurate. It is important to employ an eff...


Bibliographic Details
Main Authors: Hadzic, Fedja, Dillon, Tharam S
Other Authors: Huan Liu
Format: Conference Paper
Published: ACM 2006
Subjects:
Online Access:http://hdl.handle.net/20.500.11937/31352
_version_ 1848753355692703744
author Hadzic, Fedja
Dillon, Tharam S
author2 Huan Liu
author_facet Huan Liu
Hadzic, Fedja
Dillon, Tharam S
author_sort Hadzic, Fedja
building Curtin Institutional Repository
collection Online Access
description The data collected for various domain purposes usually contains some features irrelevant to the concept being learned. The presence of these features interferes with the learning mechanism and as a result the predicted models tend to be more complex and less accurate. It is important to employ an effective feature selection strategy so that only the necessary and significant features will be used to learn the concept at hand. The Symmetrical Tau (τ) [13] is a statistical-heuristic measure for the capability of an attribute in predicting the class of another attribute, and it has successfully been used as a feature selection criterion during decision tree construction. In this paper we aim to demonstrate some other ways of effectively using the τ criterion to filter out the irrelevant features prior to learning (pre-pruning) and after the learning process (post-pruning). For the pre-pruning approach we perform two experiments, one where the irrelevant features are filtered out according to their τ value, and one where we calculate the τ criterion for Boolean combinations of features and use the highest τ-valued combination. In the post-pruning approach we use the τ criterion to prune a trained neural network and thereby obtain a more accurate and simple rule set. The experiments are performed on data characterized by continuous and categorical attributes and the effectiveness of the proposed techniques is demonstrated by comparing the derived knowledge models in terms of complexity and accuracy.
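The Symmetrical Tau described in the abstract can be sketched in code. The formula below is the symmetrised Goodman-Kruskal tau commonly attributed to Zhou and Dillon; since the cited reference [13] is not reproduced in this record, treat the exact expression as an assumption rather than the paper's definitive formulation. The function takes a contingency table of joint counts between an attribute's values and the class labels:

```python
def symmetrical_tau(table):
    """Symmetrical Tau association between two categorical variables.

    `table[i][j]` is the joint count of attribute value i with class j.
    Assumed formula (symmetrised Goodman-Kruskal tau); not taken verbatim
    from the paper's reference [13].
    """
    total = sum(sum(row) for row in table)
    # joint probabilities and marginals
    p = [[c / total for c in row] for row in table]
    row_m = [sum(r) for r in p]                               # P(i+)
    col_m = [sum(r[j] for r in p) for j in range(len(p[0]))]  # P(+j)

    # expected reduction in prediction error, in both directions
    num = sum(pij ** 2 / col_m[j]
              for r in p for j, pij in enumerate(r) if col_m[j] > 0)
    num += sum(pij ** 2 / row_m[i]
               for i, r in enumerate(p) for pij in r if row_m[i] > 0)
    num -= sum(m ** 2 for m in row_m) + sum(m ** 2 for m in col_m)
    den = 2 - sum(m ** 2 for m in row_m) - sum(m ** 2 for m in col_m)
    return num / den
```

Under this formulation, a feature that perfectly predicts the class (e.g. counts `[[10, 0], [0, 10]]`) scores τ = 1, while a statistically independent feature (e.g. `[[5, 5], [5, 5]]`) scores τ = 0, which is what makes ranking features by τ value usable as the pre-pruning filter the abstract describes.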
first_indexed 2025-11-14T08:23:12Z
format Conference Paper
id curtin-20.500.11937-31352
institution Curtin University Malaysia
institution_category Local University
last_indexed 2025-11-14T08:23:12Z
publishDate 2006
publisher ACM
recordtype eprints
repository_type Digital Repository
spelling curtin-20.500.11937-313522023-02-27T07:34:29Z Using the symmetrical Tau criterion for feature selection decision tree and neural network learning Hadzic, Fedja Dillon, Tharam S Huan Liu Robert Stine Leonardo Auslender feature selection network pruning rule simplification The data collected for various domain purposes usually contains some features irrelevant to the concept being learned. The presence of these features interferes with the learning mechanism and as a result the predicted models tend to be more complex and less accurate. It is important to employ an effective feature selection strategy so that only the necessary and significant features will be used to learn the concept at hand. The Symmetrical Tau (τ) [13] is a statistical-heuristic measure for the capability of an attribute in predicting the class of another attribute, and it has successfully been used as a feature selection criterion during decision tree construction. In this paper we aim to demonstrate some other ways of effectively using the τ criterion to filter out the irrelevant features prior to learning (pre-pruning) and after the learning process (post-pruning). For the pre-pruning approach we perform two experiments, one where the irrelevant features are filtered out according to their τ value, and one where we calculate the τ criterion for Boolean combinations of features and use the highest τ-valued combination. In the post-pruning approach we use the τ criterion to prune a trained neural network and thereby obtain a more accurate and simple rule set. The experiments are performed on data characterized by continuous and categorical attributes and the effectiveness of the proposed techniques is demonstrated by comparing the derived knowledge models in terms of complexity and accuracy. 2006 Conference Paper http://hdl.handle.net/20.500.11937/31352 ACM fulltext
spellingShingle feature selection
network pruning
rule simplification
Hadzic, Fedja
Dillon, Tharam S
Using the symmetrical Tau criterion for feature selection decision tree and neural network learning
title Using the symmetrical Tau criterion for feature selection decision tree and neural network learning
title_full Using the symmetrical Tau criterion for feature selection decision tree and neural network learning
title_fullStr Using the symmetrical Tau criterion for feature selection decision tree and neural network learning
title_full_unstemmed Using the symmetrical Tau criterion for feature selection decision tree and neural network learning
title_short Using the symmetrical Tau criterion for feature selection decision tree and neural network learning
title_sort using the symmetrical tau criterion for feature selection decision tree and neural network learning
topic feature selection
network pruning
rule simplification
url http://hdl.handle.net/20.500.11937/31352