When using `RuleFitClassifier(tree_generator=GradientBoostingClassifier())` with a `GradientBoostingClassifier()` object that was fitted and optimized separately via the scikit-learn API, fitting the `RuleFitClassifier` raises the following error:
ValueError: n_estimators=1 must be larger or equal to estimators_.shape[0]=100 when warm_start==True
When inspecting what's inside the `RuleFitClassifier(tree_generator=GradientBoostingClassifier())` after fitting the model, the `GradientBoostingClassifier()` has been completely changed to parameters different from those optimized before fitting `RuleFitClassifier()`, i.e. `GradientBoostingClassifier(max_leaf_nodes=4, n_estimators=1, random_state=0, warm_start=True)`. I am not sure why these parameters (of the `GradientBoostingClassifier()`) are changed inside the `RuleFitClassifier()` object.
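For reference, here is a minimal sketch of how I run into this (the dataset and the booster's hyperparameters below are placeholders, not my actual tuned ones):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from imodels import RuleFitClassifier

# placeholder data; in my real pipeline the booster is tuned separately
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

gb = GradientBoostingClassifier(n_estimators=100, random_state=0)
gb.fit(X, y)  # fitted and optimized beforehand via the scikit-learn API

rf = RuleFitClassifier(tree_generator=gb)
try:
    rf.fit(X, y)
except ValueError as e:
    # ValueError: n_estimators=1 must be larger or equal to
    # estimators_.shape[0]=100 when warm_start==True
    print(e)

# inspecting the tree generator afterwards shows the changed parameters, e.g.
# GradientBoostingClassifier(max_leaf_nodes=4, n_estimators=1,
#                            random_state=0, warm_start=True)
print(rf.tree_generator)
```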
If `RuleFitClassifier(tree_generator=None)` is used instead, everything works well.
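For comparison, the default configuration runs fine (continuing from the same placeholder data):

```python
rf_default = RuleFitClassifier()  # tree_generator defaults to None
rf_default.fit(X, y)
print(rf_default.predict(X[:5]))
```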
As per the documentation:

> tree_generator : Optional: this object will be used as provided to generate the rules.
> This will override almost all the other properties above. Must be GradientBoostingRegressor(), GradientBoostingClassifier(), or RandomForestRegressor()
Which properties of `RuleFitClassifier()` are overridden when `tree_generator=GradientBoostingClassifier()`?
Why does this behavior occur?
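For what it's worth, and continuing the sketch above, this is roughly what the override appears to amount to, judging only from the parameters reported after fitting; the actual imodels internals may well differ:

```python
# illustration only: inferred from the reported parameters, not from the source
forced_params = {
    'max_leaf_nodes': 4,   # possibly tied to a tree-size setting in RuleFit
    'n_estimators': 1,     # trees then appear to be grown one at a time ...
    'warm_start': True,    # ... via warm starting
    'random_state': 0,
}
gb.set_params(**forced_params)
# On a booster already fitted with 100 estimators, n_estimators=1 combined with
# warm_start=True is exactly the combination that scikit-learn rejects with the
# ValueError above on the next fit() call.
```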
The closest solution I found is in Issue #34; however, the behavior is still not clear to me.
Any help would be highly appreciated.
Many thanks!