
New SciML/Optimization for fitting parameters of ODEs tutorial #740

Merged

Conversation

TorkelE
Member

@TorkelE TorkelE commented Dec 4, 2023

A new version of this one: #708

Adds a tutorial on how to use DiffEqParamEstim and Optimization for parameter fitting. Those packages are not yet properly adapted to symbolic indexing, so normal vectors have to be used. However, I still think we should go with that (I can add a note box at the beginning if we want to highlight this).

I still think this is better than hand-coding everything (there is easy-to-miss stuff, such as what happens when a solver fails), so I think it is useful (PEtab is probably still the best alternative). Also, hopefully, we can use this as a template workflow that we think should work without symbolic indexing, and something DiffEqParamEstim + Optimization can work towards supporting.
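For readers unfamiliar with the workflow, the rough shape of a DiffEqParamEstim + Optimization fit looks something like the following. This is a minimal, hedged sketch on a plain decay ODE, not the tutorial's actual code; all names and values are illustrative:

```julia
# Minimal sketch of the DiffEqParamEstim + Optimization workflow (illustrative only).
using OrdinaryDiffEq, DiffEqParamEstim, Optimization, OptimizationOptimJL

# Exponential decay du/dt = -p[1]*u with true rate 1.5.
decay(u, p, t) = -p[1] * u
prob = ODEProblem(decay, [10.0], (0.0, 5.0), [1.5])

# Synthetic "measurements" on a time grid.
ts = collect(range(0.0, 5.0; length = 25))
data = Array(solve(prob, Tsit5(); saveat = ts))

# Build an L2 loss objective. Note that parameters are handled as plain vectors
# here, since symbolic indexing is not yet supported by these packages.
obj = build_loss_objective(prob, Tsit5(), L2Loss(ts, data),
                           Optimization.AutoForwardDiff(); verbose = false)

# Optimize from a deliberately poor initial guess.
optprob = OptimizationProblem(obj, [0.5])
optsol = solve(optprob, BFGS())   # optsol.u[1] should approach 1.5
```

The same objective can be handed to other Optimization.jl solvers (e.g. ADAM from OptimizationOptimisers) without rebuilding the loss.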

@TorkelE TorkelE force-pushed the new_SciMLOptimization_paramfitting_tutorial branch from b35dbbc to 60372f6 Compare January 25, 2024 22:48
@TorkelE TorkelE changed the base branch from master to Catalyst_version_14 January 25, 2024 22:49
Member

@isaacsas isaacsas left a comment


What about fitting multiple data sets that should have the same parameters simultaneously? (i.e. if the experiment has been run multiple times) Do we show this anywhere?

docs/pages.jl (resolved)
docs/pages.jl (outdated; resolved)
Comment on lines +9 to +10
using OptimizationOptimisers # Required for the ADAM optimizer.
using SciMLSensitivity # Required for `Optimization.AutoZygote()` automatic differentiation option.
Member


Suggested change
using OptimizationOptimisers # Required for the ADAM optimizer.
using SciMLSensitivity # Required for `Optimization.AutoZygote()` automatic differentiation option.
using OptimizationOptimisers # for the ADAM optimizer
using SciMLSensitivity # for Optimization.AutoZygote() AD backend

Member Author


I prefer the old version, especially spelling out the automatic differentiation part (as I imagine most people are unfamiliar with this, and just "AD backend" on its own doesn't say much).

Member Author


Also, I generally try to make my comments full sentences; I am not sure whether there is a standard regarding this, though.

@TorkelE
Member Author

TorkelE commented Jan 30, 2024

What about fitting multiple data sets that should have the same parameters simultaneously? (i.e. if the experiment has been run multiple times) Do we show this anywhere?

I think I looked at demonstrating this as well, but it was just really messy, so I decided not to (it was a while ago, so I don't really remember the details). There is a section mentioning that this is possible, and linking to where DiffEqParamEstim describes it, though:

## Parameter fitting to multiple experiments
Say that we had measured our model for several different initial conditions and would like to fit our model to all these measurements simultaneously. This can be done by first creating a [corresponding `EnsembleProblem`](@ref advanced_simulations_ensemble_problems). How to create loss functions for these is then described in more detail [here](https://docs.sciml.ai/DiffEqParamEstim/stable/tutorials/ensemble/).

We also show how to do this for PEtab.
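As an aside, when the ensemble interface feels too heavyweight, a multi-experiment loss can also be hand-rolled. The sketch below is purely illustrative (it assumes an existing `ODEProblem` `oprob`, a time grid `data_ts`, and an `experiments` vector of `(u0, data)` pairs sharing one parameter set; none of these names come from the tutorial), and it also shows the guard against failed solves:

```julia
# Hedged sketch: one loss summed over several experiments that share parameters.
# `oprob`, `data_ts`, and `experiments` (a vector of (u0, data) pairs) are
# assumed to already exist; this is not the tutorial's actual code.
using OrdinaryDiffEq, SciMLBase

function multi_experiment_loss(p, _)
    total = 0.0
    for (u0, data) in experiments
        sol = solve(remake(oprob; u0 = u0, p = p), Tsit5();
                    saveat = data_ts, verbose = false)
        # Guard against failed solves (e.g. divergence at bad parameter guesses).
        SciMLBase.successful_retcode(sol) || return Inf
        total += sum(abs2, Array(sol) .- data)
    end
    return total
end
```

Wrapping this in an `OptimizationFunction`/`OptimizationProblem` then lets any Optimization.jl solver minimize it over the shared parameter vector.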

TorkelE and others added 25 commits January 30, 2024 15:27
Co-authored-by: Sam Isaacson <isaacsas@users.noreply.github.com>
TorkelE and others added 2 commits January 30, 2024 15:27
Co-authored-by: Sam Isaacson <isaacsas@users.noreply.github.com>
@TorkelE TorkelE force-pushed the new_SciMLOptimization_paramfitting_tutorial branch from 5f6ad74 to 0251301 Compare January 30, 2024 20:27
@TorkelE
Member Author

TorkelE commented Jan 30, 2024

Things updated and rebased on the latest v14 branch, thanks for the input!

@isaacsas
Member

Feel free to merge when you feel this is done and sufficiently revised.

@TorkelE
Member Author

TorkelE commented Jan 30, 2024

Thanks for the input :)

I will have a second read-through tonight just to be sure, and will merge afterwards.

@TorkelE TorkelE merged commit 89ea90c into Catalyst_version_14 Jan 31, 2024
1 of 2 checks passed
@TorkelE TorkelE deleted the new_SciMLOptimization_paramfitting_tutorial branch June 8, 2024 18:29