Applicability of feed-forward and recurrent neural networks to Boolean function complexity modeling

Authors:

Highlights:

Abstract:

In this paper, we present feed-forward neural network (FFNN) and recurrent neural network (RNN) models for predicting Boolean function complexity (BFC). To obtain training data for the neural networks (NNs), we ran experiments on a large number of randomly generated single-output Boolean functions (BFs) and produced simulated curves of the number of min-terms against BFC for different numbers of variables. For NN model (NNM) development, we examined three data transformation techniques for pre-processing the NN training and validation data. The trained NNMs are then used to estimate the complexity of Boolean logic expressions with a given number of variables and sum-of-products (SOP) terms. Both the FFNNs and the RNNs were evaluated against the ISCAS benchmark results; they predicted the BFC with correlations of 0.811 and 0.629 with the benchmarks, respectively.
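The abstract does not specify the network architecture, the exact complexity metric, or which data transformations were used, so the sketch below is only a minimal illustration of the FFNN setup it describes: a small feed-forward regressor trained on hypothetical (number of variables, number of min-terms) → complexity pairs, with min-max scaling standing in for one possible pre-processing transformation. All data values and hyperparameters here are placeholders, not the authors' settings.

```python
# Minimal sketch of an FFNN-based Boolean-function-complexity (BFC) predictor,
# assuming inputs are (number of variables, number of SOP min-terms) and the
# target is a scalar complexity value. In the paper, targets come from
# simulations of randomly generated single-output Boolean functions; the values
# below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import MinMaxScaler

# Hypothetical training data: each row is (number of variables, number of min-terms).
X = np.array([[4, 3], [4, 8], [6, 12], [6, 30], [8, 40], [8, 100]], dtype=float)
# Placeholder complexity values (e.g., node/gate counts) standing in for simulated BFC.
y = np.array([5.0, 11.0, 18.0, 35.0, 52.0, 90.0])

# One of several possible data transformations: min-max normalisation of the inputs.
scaler = MinMaxScaler()
X_scaled = scaler.fit_transform(X)

# Small feed-forward network mapping (variables, min-terms) -> estimated complexity.
ffnn = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                    max_iter=5000, random_state=0)
ffnn.fit(X_scaled, y)

# Predict the complexity of an unseen expression with 6 variables and 20 min-terms.
query = scaler.transform([[6, 20]])
print("Estimated BFC:", ffnn.predict(query)[0])
```

In the same spirit, the reported evaluation could be reproduced by computing the Pearson correlation between predicted and benchmark complexities over the ISCAS circuits, e.g. with numpy's corrcoef.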

Keywords: Machine learning, Feed-forward neural network, Recurrent neural network, Bias, Biological sequence analysis, Motif, Sub-cellular localization, Pattern recognition, Classifier design

Article history: Available online 18 April 2007.

Article link: https://doi.org/10.1016/j.eswa.2007.04.010