Applicability of feed-forward and recurrent neural networks to Boolean function complexity modeling
In this paper, we present feed-forward neural network (FFNN) and recurrent neural network (RNN) models for predicting Boolean function complexity (BFC). To acquire training data for the neural networks (NNs), we conducted experiments on a large number of randomly generated single-output Boolean functions (BFs) and derived simulated graphs of the number of min-terms against BFC for different numbers of variables. For NN model (NNM) development, we examined three data-transformation techniques for pre-processing the NN training and validation data. The trained NNMs are used to estimate the complexity of Boolean logic expressions with a given number of variables and sum-of-products (SOP) terms. Both the FFNNs and RNNs were evaluated against the ISCAS benchmark results and predicted BFC with correlations of 0.811 and 0.629 with the benchmarks, respectively. (c) 2007 Elsevier Ltd. All rights reserved.
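As a reading aid, the sketch below illustrates the kind of regression setup the abstract describes: a small feed-forward network that maps the number of variables and the number of SOP terms of a Boolean expression to a complexity estimate. This is a minimal illustration only; the synthetic data, the scikit-learn `MLPRegressor`, the standardisation step, and the layer sizes are assumptions and do not reproduce the authors' experiments, data, or architecture.

```python
# Minimal sketch of an FFNN regressor mapping (number of variables,
# number of SOP terms) to a Boolean-function-complexity estimate.
# The training data below is synthetic and purely illustrative; the paper's
# models were trained on measured complexities of randomly generated
# single-output Boolean functions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features: [number of variables, number of SOP terms].
X = rng.integers(low=2, high=16, size=(500, 2)).astype(float)

# Placeholder target: a made-up monotone function standing in for the
# measured Boolean function complexity.
y = 0.5 * X[:, 0] * np.log2(X[:, 1] + 1) + rng.normal(scale=0.1, size=500)

# One possible pre-processing transform (the paper compares three
# data-transformation techniques; standardisation is just an example).
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Small feed-forward network; the layer sizes here are arbitrary.
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
model.fit(X_scaled, y)

# Estimate the complexity of an expression with 10 variables and 12 SOP terms.
query = scaler.transform([[10.0, 12.0]])
print("estimated complexity:", model.predict(query)[0])
```

In the paper, the analogous predictions from the trained FFNN and RNN models are compared against ISCAS benchmark complexities, giving the reported correlations of 0.811 and 0.629, respectively.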
| Main Authors: | BEG, A; CHANDANAPRASAD, P; BEG, A |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | PERGAMON-ELSEVIER SCIENCE LTD, 2008 |
| Subjects: | T Technology (General); QA75.5-76.95 Electronic computers. Computer science |
| Online Access: | http://shdl.mmu.edu.my/2670/ ; http://shdl.mmu.edu.my/2670/1/772.pdf |
| Record ID | mmu-2670 |
|---|---|
| Institution | Multimedia University |
| Repository | MMU Institutional Repository |
| Record Type | eprints (NonPeerReviewed text) |
| Citation | BEG, A and CHANDANAPRASAD, P and BEG, A (2008) Applicability of feed-forward and recurrent neural networks to Boolean function complexity modeling. Expert Systems with Applications, 34 (4). pp. 2436-2443. ISSN 0957-4174 |
| DOI | 10.1016/j.eswa.2007.04.010 (http://dx.doi.org/10.1016/j.eswa.2007.04.010) |