Naive Bayes does not select any important features. As you mentioned, the result of training a (Gaussian) Naive Bayes classifier is the mean and variance of each feature, computed separately for each class. A new sample is classified as "Yes" or "No" depending on which class's per-feature means and variances its feature values fit best.
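A minimal sketch of that idea, using a hypothetical two-feature dataset and a uniform class prior (all values and names below are made up for illustration):

```python
import math

# Toy training data: two numeric features per sample, labelled "Yes"/"No".
train = {
    "Yes": [[6.0, 180.0], [5.9, 190.0], [5.6, 170.0]],
    "No":  [[5.0, 100.0], [5.5, 150.0], [5.4, 130.0]],
}

def mean_var(values):
    """Sample mean and (unbiased) sample variance of a feature column."""
    m = sum(values) / len(values)
    v = sum((x - m) ** 2 for x in values) / (len(values) - 1)
    return m, v

# "Training" is nothing more than computing per-class, per-feature
# mean and variance.
params = {
    label: [mean_var(col) for col in zip(*rows)]
    for label, rows in train.items()
}

def gaussian_pdf(x, m, v):
    return math.exp(-((x - m) ** 2) / (2 * v)) / math.sqrt(2 * math.pi * v)

def classify(sample):
    # Pick the class under whose Gaussians the sample's feature
    # values are most likely (uniform prior assumed here).
    scores = {}
    for label, feats in params.items():
        score = 1.0
        for x, (m, v) in zip(sample, feats):
            score *= gaussian_pdf(x, m, v)
        scores[label] = score
    return max(scores, key=scores.get)

print(classify([5.8, 185.0]))  # matches the "Yes" statistics best
```

Note that every feature contributes to the score; no feature is ever singled out as "important", which is exactly the limitation described above.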
You can use other algorithms to find the most informative attributes. A decision tree classifier is a good choice here, for example J48 in WEKA (an open source implementation of the C4.5 decision tree algorithm). The first (root) node of the resulting decision tree tells you which feature has the most discriminative power.
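To show why the root node is the most informative attribute, here is a sketch of the split criterion on a hypothetical toy dataset. Note that C4.5/J48 actually uses the gain ratio; plain information gain is shown here for simplicity, and all attribute names and values are invented:

```python
import math
from collections import Counter

# Hypothetical toy dataset: two attributes plus a Yes/No class label.
rows = [
    {"outlook": "sunny",    "windy": "no",  "play": "No"},
    {"outlook": "sunny",    "windy": "yes", "play": "No"},
    {"outlook": "rainy",    "windy": "no",  "play": "Yes"},
    {"outlook": "rainy",    "windy": "yes", "play": "No"},
    {"outlook": "overcast", "windy": "no",  "play": "Yes"},
    {"outlook": "overcast", "windy": "yes", "play": "Yes"},
]

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def info_gain(rows, attr, target="play"):
    """Reduction in class entropy obtained by splitting on attr."""
    base = entropy([r[target] for r in rows])
    split = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == value]
        split += len(subset) / len(rows) * entropy(subset)
    return base - split

# The attribute with the highest gain becomes the root of the tree.
gains = {a: info_gain(rows, a) for a in ("outlook", "windy")}
root = max(gains, key=gains.get)
print(root, gains)
```

On this data `outlook` separates the classes far better than `windy`, so it ends up at the root, which is exactly the signal you would read off the top of a J48 tree.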
Even better (as Rushdy Shams stated in another answer): the Weka Explorer offers built-in options for finding the most useful attributes in a dataset. These options can be found on the Select attributes tab.
Sicco