
[benchmarks] Add no-skip argument. #6484

Merged: 1 commit, Feb 7, 2024
Conversation

ysiraichi (Collaborator)

This PR adds an argument for not skipping models.

cc @miladm
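As a rough illustration of what such a flag looks like, here is a minimal argparse sketch. The parser and argument names below are hypothetical stand-ins, not the benchmark runner's actual code:

```python
import argparse

# Hypothetical sketch of wiring up a --no-skip flag; the names here are
# illustrative only, not the real benchmark runner's API.
parser = argparse.ArgumentParser(description="benchmark runner (sketch)")
parser.add_argument(
    "--no-skip",
    action="store_true",
    help="Run every model, even ones normally skipped as incompatible.",
)

# argparse maps --no-skip to the attribute args.no_skip.
args = parser.parse_args(["--no-skip"])
print(args.no_skip)  # → True
```

A `store_true` flag defaults to `False`, so existing invocations that omit it keep the current skipping behavior.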

# 2. the model is not compatible with the experiment configuration
#
# Otherwise, we should go ahead and execute it.
if (not self._args.no_skip and not self.model_loader.is_compatible(
Collaborator


don't we need to check for no_skip in line 99? hoping to save some time not loading a model we don't intend to run.

ysiraichi (Collaborator Author)


I don't think we can do that. We still have to "load the model" for updating the environment variables of the child process (which will actually execute the benchmark). That said, this loading won't take many cycles, since we are calling with dummy=True: "instantiate the wrapper BenchmarkModel, but don't instantiate the actual torchbench model."
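The "dummy load" idea described above can be sketched as follows. This is an illustrative mock, assuming a wrapper class that can skip building the heavy underlying model; all names here are hypothetical, not the actual BenchmarkModel implementation:

```python
# Sketch of loading a model wrapper cheaply (dummy=True) so that child-process
# environment variables can still be computed, without instantiating the
# expensive underlying model. All names are hypothetical.
class BenchmarkModelSketch:
    def __init__(self, name, dummy=False):
        self.name = name
        # Cheap metadata needed by the child process is always available.
        self.env = {"BENCH_MODEL": name}
        # Only build the heavy model when dummy is False.
        self.model = None if dummy else self._load_real_model()

    def _load_real_model(self):
        # Stand-in for an expensive torchbench-style model build.
        return object()

wrapper = BenchmarkModelSketch("resnet50", dummy=True)
print(wrapper.env["BENCH_MODEL"], wrapper.model is None)  # → resnet50 True
```

This is why the dummy load "won't take many cycles": only lightweight metadata is materialized in the parent process, and the child process does the real work.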

@miladm miladm self-requested a review February 6, 2024 22:48
zpcore commented Feb 6, 2024

Thanks. By the way, I think we can remove --strict_compatible and the strict_deny_list. I tried the skip list from the YAML file you introduced without the --strict_compatible argument, and it works fine now!

@ysiraichi (Collaborator Author)

@zpcore Sure. I will open a new PR doing that.

@zpcore left a comment (Collaborator)


LGTM!

@ysiraichi ysiraichi merged commit e8a5f00 into master Feb 7, 2024
18 checks passed
amithrm pushed a commit to amithrm/xla that referenced this pull request Mar 1, 2024
bhavya01 pushed a commit that referenced this pull request Apr 22, 2024