
fix: [automl] patch state (gd/cb_adf) being shared by multiple configs #3998

Merged: 19 commits from auml-fixstate into master on Jun 28, 2022

Conversation

@lalo (Collaborator) commented on Jun 27, 2022:

  • fix test to make sure that weights are equal even if the learning order changes

  • add a test that compares "-q ::" vs "automl" in a setting where the champ does not switch, so the weights have to be bit-for-bit identical (same floats, no precision error); see the sketch after this list

  • add extra state to model

  • reset state after config switch?

  • check what other feature state is being mutated on the model (reversed vs non-reversed)
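
To make the intent of the "-q ::" vs "automl" comparison concrete, here is a minimal sketch using the vowpalwabbit Python bindings. This is not the PR's actual test: the cb_adf example text, the `--automl 3` arguments, the assumption that the champion's weights can be read at the default weight offset, and the multiset comparison of non-zero weight values are all illustrative choices.

```python
# Sketch only: example data, automl flags, and the weight-comparison strategy
# are assumptions for illustration, not the PR's real test code.
from collections import Counter

from vowpalwabbit import Workspace

CB_ADF_EXAMPLES = [
    "shared |User user=Tom time=morning\n"
    "0:0.0:0.5 |Action article=sports\n"
    "|Action article=politics",
    "shared |User user=Anna time=afternoon\n"
    "|Action article=sports\n"
    "0:-1.0:0.5 |Action article=politics",
]


def train(args: str) -> Workspace:
    """Train a workspace on the same multi-line cb_adf examples."""
    vw = Workspace(args, quiet=True)
    for text in CB_ADF_EXAMPLES:
        multi_ex = vw.parse(text)  # parse one multi-line cb_adf example
        vw.learn(multi_ex)
        vw.finish_example(multi_ex)
    return vw


def nonzero_weights(vw: Workspace) -> Counter:
    """Collect non-zero weight values as a multiset.

    Comparing values rather than indices keeps the check insensitive to the
    order in which configs are learned, which only shifts where weights land
    in the vector. Only the default offset is read; assuming the champion's
    weights are reachable there is part of this sketch's simplification.
    """
    return Counter(
        vw.get_weight(i) for i in range(vw.num_weights()) if vw.get_weight(i) != 0.0
    )


baseline = train("--cb_explore_adf -q ::")
automl = train("--cb_explore_adf --automl 3")  # automl arguments are a guess

base_w = nonzero_weights(baseline)
automl_w = nonzero_weights(automl)

# When the champion never switches, its weights should be bit-identical to the
# -q :: baseline. The automl model also trains challenger configs, so we only
# require that every baseline weight value appears in the automl model.
assert all(automl_w[value] >= count for value, count in base_w.items())

baseline.finish()
automl.finish()
```

Comparing weight values as a multiset rather than by index is one way to address the first item above: a different learning order changes where a config's weights land, not the values the champion ends up with.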

@lalo changed the title from "Auml fixstate" to "fix: [automl] patch state (gd/cb_adf) being shared by multiple configs" on Jun 27, 2022
@lalo requested a review from @bassmang on June 27, 2022 19:09
@lalo (Collaborator, Author) commented on Jun 27, 2022:

This should be merged after #4000

@lalo enabled auto-merge (squash) on June 28, 2022 20:15
@lalo merged commit febb451 into master on Jun 28, 2022
@lalo deleted the auml-fixstate branch on July 1, 2022 19:01