
Add ModelMerger class #10

Closed · 8 tasks done
muammar opened this issue May 16, 2019 · 2 comments
Labels: enhancement (New feature or request)

muammar (Owner) commented May 16, 2019

  • Model merger.
  • Loss function: weighted sum vs. independent losses.
  • Printing of training evolution.
  • Save model (I think the code already available in ml4chem could be used for that).
  • Set a convergence stopping mechanism.
  • Parallelization.
  • Add compatibility with more of the available components.
  • Check the class for consistency (independent vs. dependent losses).

That class would help to merge models like this:

[diagram of merged models]
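
For reference, a rough sketch of what such a merger could look like, assuming a PyTorch-style container that simply chains the sub-models (illustrative only, not the actual ml4chem API):

```python
# Illustrative sketch only -- not the ml4chem implementation. A PyTorch-style
# container that chains a list of sub-models, so the output of one model
# becomes the input of the next.
import torch


class ModelMerger(torch.nn.Module):
    def __init__(self, models):
        super().__init__()
        self.models = torch.nn.ModuleList(models)

    def forward(self, x):
        for model in self.models:
            x = model(x)
        return x


# Example: chain an encoder-like network with a small regressor.
merged = ModelMerger([
    torch.nn.Sequential(torch.nn.Linear(10, 4), torch.nn.Tanh()),
    torch.nn.Linear(4, 1),
])
print(merged(torch.randn(8, 10)).shape)  # torch.Size([8, 1])
```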

muammar changed the title from "Add ModelMerger function" to "Add ModelMerger class" on May 16, 2019
muammar self-assigned this on Jun 20, 2019
muammar added a commit that referenced this issue Jun 20, 2019
- This is another step forward #10.
- Beautification of code formatting using black.
muammar (Owner, Author) commented Jun 20, 2019

This class is now working for cases where the loss functions of the models are independent (backward propagation also occurs independently), and for cases where they are summed (the weights of all models are updated according to the gradients of a combined loss, so the models depend on each other).

The parallelization has to be worked out because a new forward() method was created.
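
For illustration, a minimal sketch of the two modes with plain PyTorch and two chained models (hypothetical code, not the actual ml4chem implementation):

```python
# Two chained models: model_b consumes the output of model_a. The two modes
# below mirror the "independent" vs. "summed" loss handling described above.
import torch

model_a = torch.nn.Linear(10, 4)
model_b = torch.nn.Linear(4, 1)
opt_a = torch.optim.SGD(model_a.parameters(), lr=0.01)
opt_b = torch.optim.SGD(model_b.parameters(), lr=0.01)
criterion = torch.nn.MSELoss()

x = torch.randn(8, 10)
target_a = torch.randn(8, 4)
target_b = torch.randn(8, 1)


def step(mode):
    opt_a.zero_grad()
    opt_b.zero_grad()
    hidden = model_a(x)

    if mode == "independent":
        # Each model has its own loss and its own backward pass; detaching
        # the hidden output keeps model_b's gradients away from model_a.
        criterion(hidden, target_a).backward()
        criterion(model_b(hidden.detach()), target_b).backward()
    else:  # "sum"
        # One combined loss and a single backward pass: the gradients of
        # both models now depend on each other through the chained forward.
        loss = criterion(hidden, target_a) + criterion(model_b(hidden), target_b)
        loss.backward()

    opt_a.step()
    opt_b.step()


step("independent")
step("sum")
```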

muammar added a commit that referenced this issue Jun 20, 2019
- This is another step forward #10.
- Beautification of code formatting using black.
- A new ml4chem.metrics module has been added where all metrics will be
  gathered.
muammar (Owner, Author) commented Jul 11, 2019

The parallelization has been worked out thanks to the new forward() method. I have only tested it with MSE loss functions so far. I also noticed that the independent loss function case is now broken.
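
For context, a hypothetical sketch of a forward() that keeps the sub-model evaluations independent so they can be farmed out to an executor or scheduler (the thread pool and all names below are assumptions for illustration, not the ml4chem code):

```python
# Each sub-model is evaluated on its own input chunk, so the forward calls
# can be submitted concurrently; the MSE losses are then combined afterward.
from concurrent.futures import ThreadPoolExecutor

import torch


def forward(models, inputs):
    """Evaluate each model on its corresponding input tensor in parallel."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(model, x) for model, x in zip(models, inputs)]
        return [f.result() for f in futures]


models = [torch.nn.Linear(10, 1), torch.nn.Linear(5, 1)]
inputs = [torch.randn(8, 10), torch.randn(8, 5)]
targets = [torch.randn(8, 1), torch.randn(8, 1)]

criterion = torch.nn.MSELoss()
outputs = forward(models, inputs)
loss = sum(criterion(o, t) for o, t in zip(outputs, targets))
loss.backward()  # summed-loss case; all models receive gradients
```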

muammar added the enhancement (New feature or request) label on Jul 12, 2019
muammar closed this as completed in e69f25d on Oct 2, 2019