Replies: 1 comment
That's an interesting question. SFS and SFFS are completely metric-driven, i.e., they greedily select features to improve model performance. One could argue that this implicitly removes redundant features, since redundant features don't contribute to performance. However, there is no explicit evaluation or check for this in place. Off the top of my head, to incorporate mRMR more explicitly, one could include redundancy in the evaluation criterion. I.e., instead of scoring a candidate subset by model performance alone, use something like `score = alpha * performance - beta * redundancy`, where alpha and beta are hyperparameters. But this would require some customization of the implementation, which is already quite involved. Unfortunately, I currently don't have a good suggestion or solution for you.
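As a minimal sketch of the combined criterion described above, assuming scikit-learn: `mrmr_style_score` is a hypothetical helper (not part of any existing library), which measures relevance as cross-validated accuracy and redundancy as the mean absolute pairwise correlation among the selected features:

```python
# Hedged sketch of an mRMR-style evaluation criterion: not an actual
# implementation from any library, just one way to combine performance
# (relevance) and feature correlation (redundancy) as discussed above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def mrmr_style_score(X, y, feature_idx, estimator, alpha=1.0, beta=1.0, cv=5):
    """Return alpha * relevance - beta * redundancy for a feature subset.

    relevance:  mean cross-validated score of the model on the subset
    redundancy: mean absolute pairwise correlation among subset features
    alpha, beta: hyperparameters trading off the two terms
    """
    Xs = X[:, list(feature_idx)]
    relevance = cross_val_score(estimator, Xs, y, cv=cv).mean()
    if len(feature_idx) > 1:
        corr = np.abs(np.corrcoef(Xs, rowvar=False))
        n = corr.shape[0]
        # mean absolute off-diagonal correlation
        redundancy = (corr.sum() - n) / (n * (n - 1))
    else:
        redundancy = 0.0
    return alpha * relevance - beta * redundancy


# Toy example on synthetic data
X, y = make_classification(n_samples=200, n_features=8, n_informative=4,
                           random_state=0)
score = mrmr_style_score(X, y, (0, 1, 2),
                         LogisticRegression(max_iter=1000),
                         alpha=1.0, beta=0.5)
print(round(float(score), 3))
```

A wrapper such as SFFS could then maximize this score instead of raw model performance when adding or removing features, so that highly correlated features are penalized even if they don't hurt accuracy.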
I am using the SFFS selection algorithm to choose a feature subset. I know that wrapper-based methods select features that correlate most strongly with the classification variable. Suppose I am interested in eliminating redundant features during selection, i.e., Minimum Redundancy Maximum Relevance (mRMR) selection; what would you suggest?

Does SFFS deal with redundancy during elimination, since the "floating operation" involves removing a previously added feature from the selected subset? Can someone shed more light on this?