
Provide benchmarks tests #103

Closed · redeboer opened this issue Jun 30, 2020 · 3 comments · Fixed by #368
Labels: 🖱️ DX Improvements to the Developer Experience


redeboer commented Jun 30, 2020

It would be nice to profile/monitor performance in a standardised way, so that we can see whether each PR brings improvements. The benchmarks should be similar in structure, probably making use of some shared façade functions (see the sketch below).

What we probably want as input:

  • Initial + final state
  • Allowed intermediate states (the fewer, the faster)
  • Formalism type (helicity/canonical)
  • Number of:
    • phase space events
    • intensity-based events ('data')
  • Stage to perform (e.g. only up to data generation)

The recipe file is generated with the expertsystem based on this input. Everything else (e.g. which amplitude generator to use) should be deduced from the recipe.

Some potential tools:
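For illustration, a façade-based setup with pytest-benchmark (just one possible choice) could look like the following sketch. The helper `run_pwa_benchmark`, its parameters, and the J/ψ → γπ⁰π⁰ test case are placeholders, not the actual TensorWaves or expertsystem API:

```python
# Hypothetical sketch, assuming the pytest-benchmark plugin. All helper names
# and the physics case are placeholders, not the TensorWaves/expertsystem API.
import pytest


def run_pwa_benchmark(
    initial_state: str,
    final_state: list,
    intermediate_states: list,  # fewer allowed states -> faster
    formalism: str,  # "helicity" or "canonical"
    n_phsp: int,  # number of phase space events
    n_data: int,  # number of intensity-based ('data') events
    stage: str,  # e.g. "data-generation" to stop before fitting
) -> None:
    """Placeholder façade: generate the recipe with the expertsystem from
    these inputs, deduce everything else from the recipe, and run the
    requested stages with TensorWaves."""


@pytest.mark.parametrize("formalism", ["helicity", "canonical"])
def test_generate_data(benchmark, formalism):
    # pytest-benchmark's `benchmark` fixture runs and times the callable
    benchmark(
        run_pwa_benchmark,
        initial_state="J/psi(1S)",
        final_state=["gamma", "pi0", "pi0"],
        intermediate_states=["f(0)(980)"],
        formalism=formalism,
        n_phsp=10_000,
        n_data=5_000,
        stage="data-generation",
    )
```

Each benchmark case then becomes a one-call test, and timings could be compared between runs (e.g. via pytest-benchmark's `--benchmark-compare` option) to spot regressions per PR.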

@redeboer redeboer added the ⚠️ High and 🖱️ DX Improvements to the Developer Experience labels Jun 30, 2020
@redeboer redeboer added this to the Finalize dev environment milestone Jun 30, 2020
@redeboer redeboer changed the title from "Provie benchmarks tests" to "Provide benchmarks tests" Jun 30, 2020
@redeboer redeboer self-assigned this Jun 30, 2020
@redeboer redeboer added the Epic Collection of issues label Jul 30, 2020
@redeboer redeboer removed this from the Finalize dev environment milestone Jul 30, 2020
@redeboer redeboer removed their assignment Aug 20, 2020
redeboer added a commit that referenced this issue Oct 15, 2020
* build!: support Python 3.6 again
* ci: add validation schemas to vscode settings
* ci: run workflow scripts in GitHub Actions (see #103)
* ci: decrease required patch coverage (see #135)
* refactor!: upgrade to expertsystem 0.5.0
@redeboer redeboer removed the Epic Collection of issues and 💡 Feature labels Jun 2, 2021
redeboer (Member Author) commented

Update: the HSF Data Analysis WG is considering defining benchmark PWA analyses for comparing different PWA fitter frameworks. Once those benchmarks are defined, they can be addressed in this issue.

redeboer (Member Author) commented

Also worth considering: host these benchmark tests in a separate repository, otherwise they slow down the CI of TensorWaves and could clutter the repo with a lot of additional testing code. Alternatively, run the tests only upon merging into the stable branch (see the workflow sketch below).
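A minimal sketch of that alternative, assuming a GitHub Actions workflow (which the repo already uses for its CI) that triggers only on pushes to a `stable` branch; the branch name, install command, and benchmark path are placeholders:

```yaml
# Hypothetical workflow: run the slow benchmarks only when merging into
# `stable`, keeping them out of the per-PR CI. All names below are examples.
name: Benchmarks
on:
  push:
    branches: [stable]
jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: "3.8"
      - run: pip install -e .[test]
      - run: pytest benchmarks --benchmark-only
```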


redeboer commented Dec 1, 2021
