Unpin hfh #6876
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
As I explain in the corresponding issue that I self-assigned (#6863), I was planning to remove the upper-bound pin huggingface-hub < 0.23.0 only once the huggingface/transformers#30618 fix is merged and released in a new transformers version; otherwise it breaks our CI. The fix has been merged but not released yet.
Also note that datasets-2.19.1 (the version currently installed in the dataset-viewer) does not include the pin on huggingface-hub:
https://github.com/huggingface/datasets/blob/2.19.1/setup.py#L138
Line 138 in bb2664c
    "huggingface-hub>=0.21.2",
If we urgently need some dev feature for the dataset-viewer, I would suggest pushing the feature (cherry-picked) to a dedicated branch with 2.19.1 as its starting point (without opening a PR), and installing datasets from that branch.
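As a rough sketch, installing datasets from such a branch could look like the following; the branch name is hypothetical and does not exist in the repo, and the same thing can be done with a plain pip install git+https://github.com/huggingface/datasets@<branch>:

```python
# Hypothetical sketch: install datasets from a dedicated branch cut from 2.19.1.
# The branch name "2.19.1-dataset-viewer" is an assumption for illustration.
import subprocess
import sys

subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "git+https://github.com/huggingface/datasets@2.19.1-dataset-viewer",
])
```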
transformers 4.40.2 was released yesterday, but I'm not sure if it contains the fix.
@lhoestq yes, I knew transformers 4.40.2 was released yesterday, but I had checked that it does not contain the fix: only 2 bug fixes. That is why our CI keeps failing in this PR. We will have to wait until the next minor version.
I have done so:
hfh 0.23.1 and transformers 4.41.0 are out, let's unpin, no?
I have re-run the CI first to check that it is green.
The CI is red.
The errors were coming from
Thanks for the investigation.
Do you know why a FutureWarning was transformed into an OSError? These are the raised errors:
FAILED tests/test_metric_common.py::LocalMetricTest::test_load_metric_bertscore - OSError: Can't load the configuration of 'roberta-large'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'roberta-large' is the correct path to a directory containing a config.json file
FAILED tests/test_fingerprint.py::TokenizersHashTest::test_hash_tokenizer - OSError: Can't load the configuration of 'bert-base-uncased'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'bert-base-uncased' is the correct path to a directory containing a config.json file
I opened an issue in transformers:
It's because the error from the FutureWarning happened when running the config loading in transformers, which catches the underlying exception and re-raises it as an OSError ("Can't load the configuration of ...").
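For context, a hedged sketch of that chain (not transformers' real code): the config-loading path catches whatever error is raised during the hub call, here an escalated FutureWarning, and re-raises it as the OSError seen in the test logs.

```python
# Hedged sketch, NOT transformers' real implementation: the config-loading path
# catches the underlying exception and re-raises it as OSError.
def load_config(model_name: str) -> dict:
    try:
        # Stand-in for the hub download call; pytest's filter has turned the
        # FutureWarning emitted there into a raised exception.
        raise FutureWarning("`resume_download` is deprecated")
    except Exception as exc:
        raise OSError(
            f"Can't load the configuration of '{model_name}'."
        ) from exc


try:
    load_config("roberta-large")
except OSError as err:
    print(err)  # Can't load the configuration of 'roberta-large'.
```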
The error is raised because of our pytest settings:
Lines 17 to 20 in b12a2c5
# Test fails if a FutureWarning is thrown by `huggingface_hub`
filterwarnings = [
    "error::FutureWarning:huggingface_hub*",
]
I find it contradictory to both have this setting (which transforms huggingface-hub FutureWarnings into errors) and at the same time ignore those warnings, as you implemented in 889a48d.
CC: @Wauplin, who introduced the pytest settings:
Opened huggingface/transformers#31007 to fix the FutureWarning in transformers. Sorry, I thought it was fixed by huggingface/transformers#30618, but clearly an oversight on my side. Regarding the pytest config, yes, I remember adding it, and in general I still think it's a good idea to have it. Will be more careful next time to update
This reverts commit 889a48d.
Alright, I disabled the errors on FutureWarning. Do you see anything else @albertvillanova, or can we merge?
Thanks. It looks good from my side.
[Benchmark bot report for PyArrow==8.0.0: benchmark_array_xd.json, benchmark_getitem_100B.json, benchmark_indices_mapping.json, benchmark_iterating.json, benchmark_map_filter.json — detailed results collapsed]
Needed to use those in dataset-viewer:
close #6863