
Patch to skip failing test_save_load_low_cpu_mem_usage tests #29043

Merged

Conversation

amyeroberts
Collaborator

What does this PR do?

A handful of tests started failing after #28948 was merged. The tests didn't fail on the PR or on the initial main commit, but they are failing now. It looks like the relevant tests may not have been fetched for the runners.

This PR skips the tests for now.
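For context, skips like this in the transformers test suite typically override the inherited common test method and decorate it with `@unittest.skip`. The snippet below is a minimal illustrative sketch, not the actual diff; the class name is a placeholder:

```python
import unittest


class ExampleModelTest(unittest.TestCase):
    # Placeholder for a model test class that would normally mix in ModelTesterMixin.
    @unittest.skip(reason="Currently failing after #28948; skipped until the root cause is fixed.")
    def test_save_load_low_cpu_mem_usage(self):
        # This body never runs while the skip decorator is in place.
        self.fail("should have been skipped")


if __name__ == "__main__":
    unittest.main()
```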

cc @ylacombe, as you might want to enable this feature for MusicGen.

cc @ArthurZucker, as this touches a few language models. Not sure it's worth digging in and fixing it for these, as they have low-ish usage.

cc @ydshieh for when you're back, in case there's anything else I should address here.


@amyeroberts amyeroberts merged commit 4156f51 into huggingface:main Feb 15, 2024
18 checks passed
@amyeroberts amyeroberts deleted the patch-low-cpu-mem-failing-tests branch February 15, 2024 17:26
@hackyon
Contributor

hackyon commented Feb 15, 2024

Thanks for catching these, and sorry I missed some of them (I didn't know how to run the tests across all models at the time).

I've also been working on fixing all the failing tests: #29024

Do you mind taking a quick look? I'm hoping to get that in so it doesn't affect other people's workflows. Thanks!
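(As a side note on running one of these common tests across all models locally: it can typically be targeted with pytest's `-k` filter. A minimal sketch, assuming pytest is installed and this is run from the repository root:)

```python
# Run one common test method across every model test file.
# Equivalent to `python -m pytest tests/models -k test_save_load_low_cpu_mem_usage` on the command line.
import pytest

pytest.main(["tests/models", "-k", "test_save_load_low_cpu_mem_usage"])
```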

hackyon added a commit to hackyon/transformers that referenced this pull request Feb 15, 2024
@amyeroberts
Collaborator Author

> Thanks for catching these, and sorry I missed some of them (I didn't know how to run the tests across all models at the time).

Oh, no need to apologise! You shouldn't need to run them manually; something is wrong on our end, as they should have been run automatically.

> Do you mind taking a quick look? I'm hoping to get that in so it doesn't affect other people's workflows. Thanks!

Looking now 🤗

amyeroberts added a commit to amyeroberts/transformers that referenced this pull request Feb 20, 2024
amyeroberts added a commit that referenced this pull request Feb 20, 2024
* Revert "Add tie_weights() to LM heads and set bias in set_output_embeddings() (#28948)"

This reverts commit 725f4ad.

* Revert "Patch to skip failing `test_save_load_low_cpu_mem_usage` tests (#29043)"

This reverts commit 4156f51.
itazap pushed a commit that referenced this pull request May 14, 2024
* Patch to skip currently failing tests

* Whoops - wrong place
itazap pushed a commit that referenced this pull request May 14, 2024
* Revert "Add tie_weights() to LM heads and set bias in set_output_embeddings() (#28948)"

This reverts commit 725f4ad.

* Revert "Patch to skip failing `test_save_load_low_cpu_mem_usage` tests (#29043)"

This reverts commit 4156f51.