
[FEAT](compare datasets): MAE table in Compare model errors #6091

Open
pascaleproulx opened this issue Jan 16, 2025 · 0 comments
Labels
feature New feature or request

Comments

@pascaleproulx
Contributor

pascaleproulx commented Jan 16, 2025

Add an option to the 'Compare datasets' drilldown called 'Compare model errors'.
A second selector will contain the datasets connected as input (outputs of Calibrate operators plus one ground-truth dataset). The user selects the one that corresponds to the ground truth; it serves as the baseline, the same way it does when comparing interventions.
The only new piece is the mapping UI. Its first column will be the ground-truth dataset. Note that the variables from that dataset will be used to select what to plot in the output settings.

The table will show the selected error metric. MAE is computed the same way as in Calibrate; the script to compute WIS will be provided by @liunelson. The 'Overall' column will be computed by averaging the error metric over all variables (not just the ones displayed).

The chart should contain all timepoints where the datasets overlap with the ground truth (i.e. where an error can be computed).
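The per-variable MAE, the 'Overall' average, and the restriction to overlapping timepoints described above could be sketched as follows. This is a minimal illustration, not the Calibrate implementation; the function name `mae_table` and the use of pandas DataFrames indexed by timepoint are assumptions for the example.

```python
import pandas as pd


def mae_table(ground_truth: pd.DataFrame, datasets: dict[str, pd.DataFrame]) -> pd.DataFrame:
    """Hypothetical sketch: per-variable MAE of each dataset against the
    ground truth, plus an 'Overall' column averaging over all variables.

    DataFrames are assumed to be indexed by timepoint with one column
    per variable.
    """
    rows = {}
    for name, df in datasets.items():
        # Only variables present in both the dataset and the ground truth
        # can be compared.
        shared = df.columns.intersection(ground_truth.columns)
        # Align on overlapping timepoints: errors are only computable where
        # both the dataset and the ground truth have values.
        truth_aligned, df_aligned = ground_truth[shared].align(df[shared], join="inner")
        mae = (df_aligned - truth_aligned).abs().mean()
        # 'Overall' averages the metric over ALL compared variables,
        # not just the ones displayed.
        mae["Overall"] = mae.mean()
        rows[name] = mae
    return pd.DataFrame(rows).T
```

For example, a dataset whose values differ from the ground truth by 1 in variable `S` at every shared timepoint and match exactly in `I` would get `S = 1.0`, `I = 0.0`, `Overall = 0.5`.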

(Image attached: mockup of the proposed comparison table)

@pascaleproulx pascaleproulx added the feature New feature or request label Jan 16, 2025
@shawnyama shawnyama changed the title [FEAT]: Compare model errors in 'Compare datasets' operator [FEAT](compare datasets): MAE table in Compare model errors Jan 31, 2025