It would be nice if one could conveniently save the hyperparameters used to train a model. In other words, it would be helpful if there were a function to create custom hyperparameter templates, so to speak.
Right now I am using either model.describe() or the hyperparameter tuning logs to extract the best hyperparameters and merge them with the fixed hyperparameters. It would be nice if there were a function that just returned a dictionary with all of the parameters, the same way they are saved for the hyperparameter templates.
You are right to note that the hyper-parameters used to train a model are not saved in the model (YDF clearly separates the learner and the model concepts).
The hyper-parameters are available via the learner's hyperparameters method.
In addition, if the model was trained with a tuner, the tested hyper-parameters (and their respective scores) are available in a UI (model.describe()) and programmatically (model.hyperparameter_optimizer_logs()).
For example, here is how to extract the dictionary of hyper-parameters used to train a model with a tuner:
logs = model.hyperparameter_optimizer_logs()
top_score = max(t.score for t in logs.trials)
selected_trial = [t for t in logs.trials if t.score == top_score][0]
print(selected_trial.params) # This is a dictionary
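Once you have that dictionary, merging it with the fixed hyper-parameters and saving the result is a few lines of plain Python. This is only a sketch: the parameter names and values below are hypothetical placeholders standing in for your own fixed settings and for the content of selected_trial.params.

```python
import json

# Hypothetical fixed hyper-parameters you passed to the learner yourself.
fixed_params = {"num_trees": 300, "growing_strategy": "BEST_FIRST_GLOBAL"}

# Hypothetical tuned values, e.g. the content of selected_trial.params.
tuned_params = {"max_depth": 6, "shrinkage": 0.05}

# Merge the two; tuned values take precedence if a key appears in both.
all_params = {**fixed_params, **tuned_params}

# Save the merged dictionary so it can be reused as a "template" later.
with open("hyperparameters.json", "w") as f:
    json.dump(all_params, f, indent=2)

# Reload it later, e.g. to pass to a learner as keyword arguments.
with open("hyperparameters.json") as f:
    restored = json.load(f)
print(restored == all_params)  # → True
```

The saved JSON file plays the role of the "custom hyperparameter template" requested above: it can be loaded and unpacked into a new learner with the ** operator.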
This is exactly what I have been looking for. It would be really nice if it were possible to get this dictionary more easily! I also got the parameters from the describe UI, but that was slow, and I didn't figure out a fast way to get the dict like you posted. One of the reasons is that I had no idea logs was a list of objects; I thought it was simply a string! (I guess I could have checked, but I didn't.) It would be nice if the documentation hinted at that.