No sanity checks on `destructure` and `loadparams!` #1408

Given too many parameters, or parameters of the wrong shapes, `destructure` and `loadparams!` silently have a go. I believe it would be safer to make these errors, or at least warnings. When there are too few parameters, it does seem to fail.
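A minimal sketch of the reported behaviour, assuming `destructure` and `loadparams!` as they existed at the time of this issue; the model and shapes are illustrative, and the exact failure mode may vary by Flux version:

```julia
using Flux

m = Dense(2, 3)                 # 3×2 weight plus length-3 bias: 9 parameters
v, re = Flux.destructure(m)     # v is a length-9 vector; re(v) rebuilds the layer

re(vcat(v, zeros(Float32, 5)))  # too many entries: per this report, rebuilds silently
# re(v[1:5])                    # too few entries: this one does fail

m2 = Dense(2, 3)
xs = [rand(Float32, 3, 2), rand(Float32, 3), rand(Float32, 4)]
Flux.loadparams!(m2, xs)        # extra array: per this report, silently ignored
```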
Comments
The PyTorch function is described for instance here (maybe not the best link?): https://pytorch.org/docs/master/generated/torch.nn.Module.html?highlight=load_state_dict#torch.nn.Module.load_state_dict. Perhaps we should follow that and make a keyword like `strict` available.
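A hypothetical sketch of what such a keyword could look like for `loadparams!`, mirroring PyTorch's `strict=True` default; the name `strict_loadparams!` and its behaviour are illustrative, not an existing Flux API:

```julia
using Flux

# Hypothetical helper mirroring PyTorch's `strict=True` default.
function strict_loadparams!(m, xs; strict::Bool = true)
  ps = collect(Flux.params(m))
  strict && length(ps) != length(xs) &&
    error("Expected $(length(ps)) parameter arrays, got $(length(xs))")
  for (p, x) in zip(ps, xs)
    if size(p) == size(x)
      copyto!(p, x)
    elseif strict
      error("Expected size $(size(p)), got $(size(x))")
    else
      @warn "Skipping parameter with size $(size(x)); expected $(size(p))"
    end
  end
  return m
end
```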
If we want to address the …
The obvious treelike functor-structure here is the model itself; perhaps it is what these functions should traverse. I know there's been discussion of re-designing Params, do you have a link to a summary / entry point on that?
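For example, one could walk that tree directly with Functors; a sketch, with the helper name `arrayleaves` being illustrative:

```julia
using Flux, Functors

# Collect every array leaf of the model tree, in traversal order.
function arrayleaves(m)
  xs = AbstractArray[]
  fmap(m) do x
    x isa AbstractArray && push!(xs, x)
    x
  end
  return xs
end

m = Chain(Dense(2, 3), Dense(3, 1))
for (i, p) in enumerate(arrayleaves(m))
  println(i, " => ", size(p))
end
```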
It would probably look something like https://github.com/FluxML/XLA.jl/blob/master/examples/conv.jl. In other words, Params may not be required at all.
Correct, that's why you have FluxML/Functors.jl#1 and Optimisers.jl, along with #1017 for training. This is the new API that we are moving towards.
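A sketch of that explicit style, assuming Optimisers.jl's `setup`/`update` and explicit Zygote gradients; exact names and versions may differ:

```julia
using Flux, Optimisers

model = Chain(Dense(2, 3, relu), Dense(3, 1))
state = Optimisers.setup(Optimisers.Adam(1f-3), model)

x = rand(Float32, 2, 16)
y = rand(Float32, 1, 16)

# Gradients with respect to the model itself; no implicit Params involved.
grads = gradient(m -> Flux.Losses.mse(m(x), y), model)[1]
state, model = Optimisers.update(state, model, grads)
```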
That's why I've updated Optimisers.jl recently to include most of the optimisers from Flux (modulo some Adam derivatives, but that is fairly easy).