
TensorBoardLogger should be able to add metric names in hparams #1111

Closed
tstumm opened this issue Mar 10, 2020 · 9 comments
Labels
feature Is an improvement or enhancement help wanted Open to be worked on won't fix This will not be worked on

Comments

@tstumm

tstumm commented Mar 10, 2020

🚀 Feature

TensorBoard allows investigating the effect of hyperparameters in the hparams tab. Unfortunately, the log_hyperparams function in TensorBoardLogger provides no way to indicate which of the logged values is actually a "metric" that can be used for such a comparison.

Motivation

I would like to use the built-in hparams module of TensorBoard to evaluate my training runs.

Pitch

PyTorch-Lightning should give me a way to define my model's metrics such that any logger can determine which metrics may be used for hyperparameter comparison, along with any other characteristics defined for them.

Additional context

The hparams method of a summary takes the following parameters:

def hparams(hparam_dict=None, metric_dict=None):

metric_dict is basically a dictionary mapping metric names to values; the function itself only registers the names and ignores the values.
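For context, here is a minimal sketch of how such a metric_dict reaches TensorBoard through the public SummaryWriter.add_hparams API, assuming torch with tensorboard support is installed. The hyperparameter names and values below are illustrative only:

```python
import tempfile
from torch.utils.tensorboard import SummaryWriter

# Illustrative log directory; a real run would use the trainer's log dir.
log_dir = tempfile.mkdtemp()

with SummaryWriter(log_dir) as writer:
    writer.add_hparams(
        # hparam_dict: the hyperparameters of this run
        hparam_dict={"lr": 1e-3, "batch_size": 32},
        # metric_dict: the metric names the hparams tab can compare runs by
        metric_dict={"hparam/val_loss": 0.42},
    )
```

Only the keys of metric_dict end up in the hparams experiment metadata, which is why the values are effectively ignored by the summary function itself.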

@tstumm tstumm added feature Is an improvement or enhancement help wanted Open to be worked on labels Mar 10, 2020
@github-actions
Contributor

Hi! Thanks for your contribution, great first issue!

@awaelchli
Contributor

awaelchli commented Mar 10, 2020

Since this is specific to TensorBoard, and other loggers handle hparams and metrics differently, it is better to use the SummaryWriter object directly. You can always do that with
self.logger.experiment.add_hparams(hparam_dict, metric_dict) within your LightningModule.

@tstumm
Author

tstumm commented Mar 11, 2020

I think if Lightning offers such a logger mechanism, it should offer an abstraction to enable this functionality. I'd be fine with having a register_metric function in TensorBoardLogger, but I don't want to rely on implementation details of the underlying logging mechanism.
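A register_metric abstraction along these lines could look roughly like the following. This is purely a hypothetical sketch of the proposed idea; neither MetricRegistry nor register_metric exists in Lightning:

```python
# Hypothetical sketch only: "register_metric" is the API proposed above,
# not a real TensorBoardLogger method.
class MetricRegistry:
    """Collects metric names so a logger backend can mark them as
    comparable metrics when hyperparameters are logged."""

    def __init__(self):
        self._metrics = {}

    def register_metric(self, name, initial_value=0.0):
        # Remember the metric name; the value is a placeholder, since
        # TensorBoard's hparams summary only uses the names.
        self._metrics[name] = initial_value

    def metric_dict(self):
        # What a TensorBoard-backed logger would pass as metric_dict.
        return dict(self._metrics)


registry = MetricRegistry()
registry.register_metric("val_loss")
registry.register_metric("val_acc")
print(registry.metric_dict())  # {'val_loss': 0.0, 'val_acc': 0.0}
```

The point of such an abstraction is that user code would declare metrics once, and each logger backend could translate the registry into whatever its hparams mechanism expects.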

@Borda
Member

Borda commented Mar 14, 2020

@tstumm that sounds good to me, would you mind sending a PR?
cc: @PyTorchLightning/core-contributors

@Borda
Member

Borda commented Apr 9, 2020

@tstumm through the logger you can directly access the underlying TensorBoard writer, so whatever is allowed there should also be possible here... Could you point to an example of this TensorBoard functionality/use case?

@awaelchli
Contributor

It was introduced here recently: #1630
Feel free to reopen if issues remain.

@elkotito
Contributor

elkotito commented May 4, 2020

@awaelchli Is there a plan to automatically log all metrics for the hparams tab in TensorBoard? I mean all metrics returned under the log key in methods like validation_step, using the newly merged TensorBoardLogger().log_hyperparams()?

@awaelchli
Contributor

I'm not up to date with the logger features atm. Will reopen to keep track of your suggestion and also because I just saw that there is still a bugfix in the works here: #1647

@awaelchli awaelchli reopened this May 4, 2020
@stale
Copy link

stale bot commented Jul 3, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the won't fix This will not be worked on label Jul 3, 2020
@stale stale bot closed this as completed Jul 12, 2020

4 participants