Use PyTorch's dynamo benchmark skip-list. #6416

Merged
merged 2 commits into master on Jan 30, 2024

Conversation

ysiraichi
Collaborator

This PR starts using PyTorch's dynamo benchmark skip-list for skipping specific models, based on the experiment's configuration (a rough sketch of the idea follows the list below). Besides that, it also:

  • Refactors the is_compatible function
  • Refactors the lookup of a given file, previously done in the add_torchbench_dir function
  • Replaces the comment on each skipped benchmark with its actual error message
    • I'd say it's more informative, but I will revert it if anyone thinks otherwise
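
For context, a minimal sketch of the idea, not the PR's actual code: load a skip-list keyed by experiment setting and reject any model listed under a setting used by the current run. The YAML layout, file name, and function signatures below are assumptions for illustration only.

# Sketch only; the skip-list schema and names here are assumptions.
import yaml


def load_skip_list(path):
  # Parse the skip-list YAML into {setting: set of model names}.
  with open(path) as f:
    data = yaml.safe_load(f) or {}
  return {key: set(models or []) for key, models in data.get("skip", {}).items()}


def is_compatible(model_name, experiment_settings, skip):
  # A model is incompatible if any setting of the current experiment
  # (e.g. "multiprocess", an accelerator, a test mode) lists it.
  return not any(model_name in skip.get(s, set()) for s in experiment_settings)


# Hypothetical usage:
#   skip = load_skip_list("dynamo_skips.yaml")
#   is_compatible("hf_T5", ["cuda", "train"], skip)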

cc @miladm

@ysiraichi ysiraichi merged commit 492fe27 into master Jan 30, 2024
18 checks passed
if name in self.skip["multiprocess"]:
  # No support for multiprocess, yet. So, skip all benchmarks that
  # only work with it.
  return False
Collaborator

@miladm Jan 31, 2024

I wonder if we should include an "INFO" message when skipping models so it's clear in the logs - unless it gets reported elsewhere. wdyt @ysiraichi @zpcore?

Collaborator Author

The scripts already report when a model is skipped. However, the reason is not logged. Do you think we should log that, too?
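
For illustration, a rough sketch of what logging the skip reason could look like; the maybe_skip helper and skip_reasons mapping are hypothetical names, not code from this PR.

# Hypothetical helper; not part of the PR.
import logging

logger = logging.getLogger("benchmarks")


def maybe_skip(model_name, skip_reasons):
  # skip_reasons: model name -> error message taken from the skip-list.
  reason = skip_reasons.get(model_name)
  if reason is not None:
    logger.info("Skipping %s: %s", model_name, reason)
    return True
  return False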

its lists of models into sets of models.
"""

benchmarks_dir = self._find_near_file(("benchmarks",))
Collaborator

It turns out pytorch/benchmarks and xla/benchmarks share the same directory name, benchmarks. So when we search for the YAML file, the nearest match will be xla/benchmarks.

Collaborator Author

Ah, I see.
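
To illustrate the point, here is a sketch of how a nearest-file lookup of this kind typically behaves. It is not the repository's actual _find_near_file implementation; it just shows, under the assumption of an upward directory walk, why a search for "benchmarks" can resolve to xla/benchmarks before pytorch/benchmarks when xla/ is nested inside the PyTorch checkout.

# Illustrative reimplementation; the real helper may differ.
import os


def find_near_file(names, start="."):
  # Walk upward from `start`, returning the first existing path whose
  # basename matches one of `names`.
  current = os.path.abspath(start)
  while True:
    for name in names:
      candidate = os.path.join(current, name)
      if os.path.exists(candidate):
        return candidate
    parent = os.path.dirname(current)
    if parent == current:
      return None
    current = parent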
