discriminant analysis [16], it can also be applied to the feature selection process in the ANN and the BBN, as well as in other machine learning classifiers. The basic concept of stepwise feature selection is the same across different classifiers, but the statistical criterion used to choose good features may vary. A common selection criterion in linear discriminant analysis is maximization of the Mahalanobis distance r, which provides a generalized measure of the distance between two pattern vectors x_1 and x_2, computed by accounting for the covariance matrix Σ of the distribution of vectors:

r^2 = (x_1 − x_2)^T Σ^(−1) (x_1 − x_2)
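As a minimal sketch of this criterion, the following NumPy code computes the Mahalanobis distance between two class means under a pooled covariance matrix. The synthetic two-class data, the pooled-covariance estimate, and all variable names are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Hypothetical two-class feature data: rows are samples, columns are features.
rng = np.random.default_rng(0)
class1 = rng.normal(loc=0.0, scale=1.0, size=(50, 3))
class2 = rng.normal(loc=1.0, scale=1.0, size=(50, 3))

# Mean pattern vectors x1, x2 and a pooled covariance matrix Sigma.
x1, x2 = class1.mean(axis=0), class2.mean(axis=0)
sigma = 0.5 * (np.cov(class1, rowvar=False) + np.cov(class2, rowvar=False))

# Mahalanobis distance r between the two class means:
# r^2 = (x1 - x2)^T Sigma^{-1} (x1 - x2)
diff = x1 - x2
r = float(np.sqrt(diff @ np.linalg.solve(sigma, diff)))
```

In a stepwise search, a candidate feature subset with a larger r would be preferred, since it separates the two class distributions more widely relative to their spread.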
Then, increasing the number of features from m to m + 1 requires evaluating only a small number of calculation cycles, (N − m), one for each feature not yet selected. In summary, the total number of different feature combinations tested by a progressive roundoff search of the entire feature space is

T = C(N, m) + Σ_{k=m}^{N−1} (N − k) = N! / [m!(N − m)!] + (N − m)(N − m + 1)/2

where the first term, T_p = C(N, m), counts the exhaustive search for the initial m seed features, and the second term counts the subsequent greedy growth steps.
For example, in the experiment discussed earlier [31], we found that if we first searched for five optimal features and fixed them as initial growth seeds, the progressive roundoff method obtained the same search results as shown in Fig. 6. Since N = 20 and m = 5, T_p = 15,504. After the five optimal features have been selected, a complete progressive roundoff search of this feature space requires only another 120 tests. The computation time of this progressive roundoff experiment is therefore only about 1.5% of that required by the exhaustive permutation method; in other words, the search would take less than 45 minutes on the same computer.
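The counting argument above can be checked directly. The sketch below (the function name is my own) computes the two terms of the test count for N = 20 and m = 5 and compares their sum with the 2^20 − 1 subsets an exhaustive permutation search would evaluate.

```python
from math import comb

def roundoff_test_count(N, m):
    """Feature combinations tested by a progressive roundoff search:
    an exhaustive search over all m-feature seed subsets, followed by
    greedy growth evaluating (N - k) candidates at each step k = m..N-1."""
    exhaustive = comb(N, m)                    # seed search: C(N, m)
    growth = sum(N - k for k in range(m, N))   # 15 + 14 + ... + 1 for N=20, m=5
    return exhaustive, growth

exhaustive, growth = roundoff_test_count(20, 5)
print(exhaustive)  # 15504
print(growth)      # 120

# Fraction of the exhaustive permutation search (all 2^N - 1 subsets):
print((exhaustive + growth) / (2**20 - 1))  # about 0.0149, i.e. ~1.5%
```

This reproduces the figures in the text: 15,504 seed-search tests, 120 growth tests, and a total of roughly 1.5% of the exhaustive search cost.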
