Backward feature selection

I am wondering whether backward feature selection eliminates a random feature in each iteration of the loop, or ranks all the features in each iteration and then eliminates the least important one in the next iteration.

Hi again, I also notice that in the feature selection loop, once a feature is eliminated it is not put back for the next iteration. In this case a feature can have low importance by itself while its interaction with another feature is important. Hoping that the elimination of features is not random in each iteration: does the algorithm test all features in each loop before eliminating the least important one?

Hello Zied,

the notion of iteration is a bit more complex in the case of the feature selection loop.
We have to distinguish between a feature selection iteration and a loop iteration.
A single feature selection iteration contains n loop iterations, where n is the current size of the feature set.
In each loop iteration within a feature selection iteration, one of the features is temporarily left out and the resulting model is scored.
At the end of the feature selection iteration, the feature whose removal resulted in the best score is permanently removed, and the next feature selection iteration starts with the now reduced feature set.
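The nested structure described above can be sketched in a few lines of Python. This is a minimal illustration, not the actual product implementation; `score` is a hypothetical callback (e.g. cross-validated model accuracy on the given feature subset, higher is better):

```python
def backward_elimination(features, score):
    """Greedy backward feature elimination.

    `score(subset)` is assumed to return a model quality measure
    for the given list of feature names; higher is better.
    Returns the history of (feature set, score) after each
    feature selection iteration.
    """
    current = list(features)
    history = [(tuple(current), score(current))]
    while len(current) > 1:
        # One feature selection iteration = n loop iterations:
        # try leaving out each feature in turn and score the rest.
        candidates = [[g for g in current if g != f] for f in current]
        # Permanently drop the feature whose removal scored best.
        current = max(candidates, key=score)
        history.append((tuple(current), score(current)))
    return history
```

Note that each candidate removal is only evaluated temporarily within the inner loop; only the single best-scoring removal is kept, which is also why an eliminated feature never returns in later iterations.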
