Logging data in between algorithm iterations #161

Open
YuhanLiin opened this issue on Aug 25, 2021 · 4 comments
Labels: infrastructure (General tasks affecting all implementations)

Comments

YuhanLiin (Collaborator) commented Aug 25, 2021

It'd be nice to have a way of logging information between each step when fitting an iterative algorithm, for debugging purposes. For algorithms that use argmin we can simply call `add_observer` on the `Executor`.
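For reference, that could look roughly like the sketch below. It assumes the argmin 0.4-era prelude (`ArgminSlogLogger`, `ObserverMode`), and `problem`, `solver`, and `init_param` are placeholders for an actual argmin problem/solver pair; this is an illustration, not anything linfa already does:

```rust
// Hedged sketch only: assumes argmin's 0.4-era prelude and a `problem`,
// `solver`, and `init_param` defined elsewhere.
use argmin::prelude::*;

let result = Executor::new(problem, solver, init_param)
    // Report iteration number, cost, best cost, etc. to the terminal
    // after every single iteration.
    .add_observer(ArgminSlogLogger::term(), ObserverMode::Always)
    .max_iters(50)
    .run()?;
println!("{}", result);
```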

bytesnake (Member) commented Aug 25, 2021

Perhaps this can be emulated by setting the maximum number of iterations to 1 (or to `n` if you only want to be informed every `n` iterations) and calling `fit_with` repeatedly. The estimated object should contain information about the current solution.
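A self-contained toy illustrating that workaround; the estimator here (`ToyParams`/`ToyModel` with a `fit_with` that resumes from the previous model) is entirely made up and only mirrors the shape of linfa's Fit/FitWith split:

```rust
/// Hypothetical incremental estimator: refines a running mean one pass at a time.
struct ToyParams {
    max_iters: usize,
}

#[derive(Debug, Clone)]
struct ToyModel {
    estimate: f64,
    cost: f64,
}

impl ToyParams {
    /// Run `max_iters` refinement steps, resuming from `prev` if present.
    fn fit_with(&self, prev: Option<ToyModel>, data: &[f64]) -> ToyModel {
        let mut model = prev.unwrap_or(ToyModel { estimate: 0.0, cost: f64::INFINITY });
        let target: f64 = data.iter().sum::<f64>() / data.len() as f64;
        for _ in 0..self.max_iters {
            model.estimate += 0.5 * (target - model.estimate); // damped update
            model.cost = (target - model.estimate).abs();
        }
        model
    }
}

fn main() {
    let data = [1.0, 2.0, 3.0, 4.0];
    // One iteration per `fit_with` call, so we can log in between calls.
    let params = ToyParams { max_iters: 1 };

    let mut model = None;
    for step in 0..10 {
        let fitted = params.fit_with(model.take(), &data);
        // This is the "logging between iterations" the workaround buys us.
        println!("step {}: estimate = {:.4}, cost = {:.4}", step, fitted.estimate, fitted.cost);
        model = Some(fitted);
    }
}
```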

YuhanLiin (Collaborator, Author) commented
This is only viable for algorithms where `Fit` and `FitWith` use the same algorithm, which isn't always the case. Also, some stats, such as the cost, aren't available even on the estimated objects. I think using something like the `log` crate might work here.
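Roughly what that could look like through the `log` facade; the fitting loop and cost values below are dummies, only the `log::debug!`/`log::warn!` calls are the point, and the consumer decides what happens to the records by installing a logger such as env_logger:

```rust
// Sketch of per-iteration logging via the `log` facade (dummy fitting loop).
fn fit_loop(max_iters: usize) {
    let mut cost;
    for iter in 0..max_iters {
        // ... one optimization step would go here ...
        cost = 1.0 / (iter as f64 + 1.0); // placeholder cost

        // Only emitted if the downstream binary installs a logger and enables
        // debug level for this module (e.g. RUST_LOG=debug with env_logger).
        log::debug!("iteration {}: cost = {}", iter, cost);

        if !cost.is_finite() {
            log::warn!("iteration {}: cost diverged to {}", iter, cost);
            break;
        }
    }
}

fn main() {
    // The application (or a test) chooses the logger implementation.
    env_logger::init();
    fit_loop(10);
}
```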

bytesnake (Member) commented
I think that having debug information in the estimated model is more useful than sprinkling prints throughout the code, but, as you said, sometimes this is not feasible.
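A minimal sketch of that approach, with hypothetical names: the fitted model carries its per-iteration history, so callers can inspect convergence after the fact instead of reading console output:

```rust
/// Hypothetical fitted model that records its own convergence history.
#[derive(Debug)]
struct FittedModel {
    weights: Vec<f64>,
    /// Cost recorded at the end of every fitting iteration.
    cost_history: Vec<f64>,
}

impl FittedModel {
    fn n_iterations(&self) -> usize {
        self.cost_history.len()
    }
}

fn main() {
    // Pretend this came back from a `fit` call.
    let model = FittedModel {
        weights: vec![0.1, 0.9],
        cost_history: vec![3.2, 1.4, 0.7, 0.69],
    };
    println!(
        "converged after {} iterations, final cost {:?}, weights {:?}",
        model.n_iterations(),
        model.cost_history.last(),
        model.weights
    );
}
```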

YuhanLiin (Collaborator, Author) commented
Putting debug info inside models (estimated or completed) is great for debugging purposes, but the Python ML libraries also print debug info between fitting iterations, which is incredibly useful when dealing with divergence or NaN problems. I think having logging in the code is fine as long as there's an easy way to turn it off or redirect it elsewhere, rather than mindlessly dumping everything on the console. We'd have to do some research to figure out which solution fits our needs.
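If the `log` facade is the route taken, "turn it off or redirect it" is already in the consumer's hands. For example, with env_logger something like the following would silence everything except debug records from the (assumed) linfa module path and keep them on stderr:

```rust
// Sketch: how an application could filter or redirect per-iteration records
// that go through the `log` facade. The "linfa" module filter is an assumption.
fn main() {
    env_logger::Builder::new()
        // Default: only warnings and errors from everything else.
        .filter_level(log::LevelFilter::Warn)
        // But keep the per-iteration debug output from linfa crates.
        .filter_module("linfa", log::LevelFilter::Debug)
        // Write to stderr (swap the target to redirect elsewhere).
        .target(env_logger::Target::Stderr)
        .init();

    // ... fitting code goes here ...
}
```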

YuhanLiin added the infrastructure (General tasks affecting all implementations) label on Oct 20, 2021
YuhanLiin mentioned this issue on Jun 15, 2022