
[TASK]: Provide better feedback when a model may be problematic #6564

Open
mwdchang opened this issue Feb 11, 2025 · 0 comments
Labels: task (Development task)

Models extracted from literature inherit the mistakes present in the source material. Users spend an inordinate amount of time debugging these models when "we" could run some of these checks ahead of time.

We have some notion of a model semantic checker in place (in our engineering UI), but it is not leveraged in the canonical application itself because we didn't want to shower users with warnings/errors. This sentiment has changed in light of the observations above. We should find ways to detect and surface critical errors to modellers, so that at least they are aware of which direction they need to go.

In brief, our current checker performs a combination of structural and light semantic checks, e.g.:

  • rate laws and units are defined
  • declared distribution and actual values are not contradictory
  • duplication checks
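
A minimal sketch of what the structural checks above could look like. The dict-based model representation here (keys like `states`, `transitions`, `rate_law`) is purely illustrative, not the actual AMR/MIRA schema:

```python
def check_model(model):
    """Return warning strings for missing rate laws/units and
    duplicate transitions, given a hypothetical dict-based model."""
    warnings = []
    seen = set()
    for t in model.get("transitions", []):
        tid = t.get("id", "<unnamed>")
        # Structural check: every transition should declare a rate law
        if not t.get("rate_law"):
            warnings.append(f"transition {tid}: no rate law defined")
        # Duplication check: same inputs/outputs as an earlier transition
        key = (tuple(sorted(t.get("inputs", []))),
               tuple(sorted(t.get("outputs", []))))
        if key in seen:
            warnings.append(f"transition {tid}: duplicates another transition")
        seen.add(key)
    for s in model.get("states", []):
        # Structural check: every state variable should have units
        if not s.get("units"):
            warnings.append(f"state {s.get('id', '<unnamed>')}: no units defined")
    return warnings
```

Checks like these are cheap to run on every model edit, so they could be surfaced passively rather than as blocking errors.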

New requests from @liunelson:

  • Rate law simplification is a MIRA function. In the eval, users created a lot of summed rate laws (e.g. many conversions in a single transition) because it was easy to build models that way, but the final model needs to be "simplified" before stratification etc.
  • Dangling production/degradation transitions can be detected by simple counting; they can happen when the terms of a conversion don't match across equations due to a typo.
  • Non-conservative systems are allowed, but users should be made aware of them. The eval team encountered this problem.
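
The counting checks in the last two bullets could be sketched as follows. Again, the dict representation and field names are illustrative assumptions, not the real schema; "non-conservative" is approximated here as a transition whose input and output counts differ:

```python
def find_dangling_and_conservation(model):
    """Count pure-production (no inputs) and pure-degradation (no outputs)
    transitions, and flag transitions that change the total population."""
    production, degradation, nonconservative = [], [], []
    for t in model.get("transitions", []):
        tid = t.get("id", "<unnamed>")
        n_in = len(t.get("inputs", []))
        n_out = len(t.get("outputs", []))
        if n_in == 0:
            production.append(tid)     # something appears from nowhere
        if n_out == 0:
            degradation.append(tid)    # something vanishes
        if n_in != n_out:
            nonconservative.append(tid)  # total population changes
    return production, degradation, nonconservative
```

Production/degradation transitions are often intentional (births, deaths), so these should be surfaced as "are you sure?" notices rather than errors; a stray one next to a conversion is a good hint of a typo in the equations.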
@mwdchang mwdchang added the task Development task label Feb 11, 2025