Entropy in decision trees

Hi Everyone,

I built a decision tree model, but I couldn't find the entropy scores for each column of the tree.

How can I get the entropy score of every variable in the model? I know this information is calculated inside the decision tree algorithm, but I couldn't find a list of the variables ordered by importance together with their scores.

Can anybody help me?

There is an information gain node, but for entropy itself you'll have to calculate that manually AFAIK; I don't think the decision tree nodes expose the entropy computed for each split (see the sketch after the list below). If feature selection for another algorithm is your goal, then consider:

- the R integration; some decision tree functions in R report feature importance for a fitted tree;

- the random forest node, which provides you with a less biased assessment of variable importance than a single decision tree.
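
If you do want those scores per column, the calculation is small enough to script yourself. Below is a minimal sketch in Python; the DataFrame, the column names, and the "play" target are made-up placeholders, so substitute whatever table you export from your workflow:

```python
import numpy as np
import pandas as pd

def entropy(labels: pd.Series) -> float:
    """Shannon entropy of a label column, in bits."""
    probs = labels.value_counts(normalize=True)
    return float(-(probs * np.log2(probs)).sum())

def information_gain(df: pd.DataFrame, feature: str, target: str) -> float:
    """Entropy of the target minus the weighted entropy after splitting on `feature`."""
    weighted = sum(
        (len(group) / len(df)) * entropy(group[target])
        for _, group in df.groupby(feature)
    )
    return entropy(df[target]) - weighted

# Hypothetical example data -- replace with your own table.
df = pd.DataFrame({
    "outlook": ["sunny", "sunny", "overcast", "rain", "rain", "overcast"],
    "windy":   [False, True, False, False, True, True],
    "play":    ["no", "no", "yes", "yes", "no", "yes"],
})

scores = {col: information_gain(df, col, "play") for col in ["outlook", "windy"]}
for col, gain in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{col}: {gain:.3f}")
```

The information gain printed here is the same quantity an entropy-based tree evaluates when choosing a split, computed on the full table, so the sorted output gives you the variables ranked by their gain against the target.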
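If you end up scripting the feature-selection step anyway, tree ensembles also expose an impurity-based importance directly. Here is a sketch with scikit-learn on a built-in toy dataset; this may not be the exact statistic your random forest node reports, but it serves the same purpose:

```python
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Toy data just to keep the example self-contained; swap in your own X and y.
X, y = load_iris(return_X_y=True, as_frame=True)

forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Mean decrease in impurity per feature, sorted from most to least important.
importances = (
    pd.Series(forest.feature_importances_, index=X.columns)
      .sort_values(ascending=False)
)
print(importances)
```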