Tips to upload large models/datasets #1565
Conversation
The documentation is not available anymore as the PR was closed or merged.
cc @Pierrci @coyotte508, can you check that the info is accurate? I took it from #995 (comment) but I'd prefer to double-check that it's not outdated (it's already 1y old 😄). cc @lhoestq, this might be of interest for datasets users as well.
Awesome!
Awesome! Once merged, it may be nice to link to this from the Datasets docs as well 😄
Cool stuff! 🔥
@@ -371,11 +376,89 @@ In addition to [`upload_file`] and [`upload_folder`], the following functions al

For more detailed information, take a look at the [`HfApi`] reference.

-## Push files with Git LFS
+## Tips and tricks for large uploads
Let's be careful not to break this URL.
I would not be too worried about it, to be honest. I've checked, and #upload-files-with-git-lfs is referenced nowhere in our internal docs (both hfh and hf_docs). That doesn't mean such a URL doesn't exist in the wild, but I would expect the probability to be quite low. And even if it's the case, users will be redirected to the correct page, even though it's not the correct section. Since Git LFS upload is mostly deprecated, it should be fine.
(btw it's quite easy to add backward compat for a particular URL anchor if needed)
Co-authored-by: Steven Liu <59462357+stevhliu@users.noreply.github.com>
Co-authored-by: Omar Sanseviero <osanseviero@gmail.com>
I have addressed all the comments.
@stevhliu Could you have a second look at the docs and check that everything looks good? Thanks in advance!
@@ -371,11 +377,95 @@ In addition to [`upload_file`] and [`upload_folder`], the following functions al

For more detailed information, take a look at the [`HfApi`] reference.

-## Push files with Git LFS
+## Tips and tricks for large uploads
Are we sure we don't want to put that content on its own doc page? (no strong opinion)
No strong opinion either. I thought the content was a bit light to get its own page
Actually, I was initially looking for this PR in hub-docs, so yeah, it could make sense haha, but it seems to me that some of the advice and limits are specific to huggingface_hub (or at least to uploads through the HTTP API), so we can keep it here, at least for now.
Let's wait for @Pierrci's quick review before merging please 🙏
Co-authored-by: Julien Chaumond <julien@huggingface.co>
…e/huggingface_hub into tutorial-upload-large-dataset
Nice, love the additional columns you added to the table! 🤗
I was finally able to take a look 😄
I don't think we should advertise hard limits: if you reach them, it means your repo is going to be very hard to manage (both for the user and for us); we really want to steer people as much as we can toward numbers lower than those. We also want to keep some leeway and be able to change those limits at our discretion to protect our infra if needed.
So I pushed suggestions that advertise recommendations instead. As with any recommendations, you can choose to ignore them if you're a player, but sometimes it's not a good idea, particularly if you go way above them :)
Co-authored-by: Pierric Cistac <Pierrci@users.noreply.github.com>
Thanks for the review @Pierrci! I completely understand your point about not committing too much to hard limits. I have applied your recommendations. Could you have a final look, and then we're good to merge? :)
Thanks @Wauplin! I did a Grammarly pass and pushed additional changes (the VSCode extension is really nice for that!); everything LGTM on my end!
Perfect! Thanks for the final pass, I'm finally merging this PR 😄
For example, json files can be merged into a single jsonl file, or large datasets can be exported as Parquet files.
- The number of files per folder cannot exceed 10k. A simple solution is to create a repository structure that uses subdirectories. For example, a repo with 1k folders from `000/` to `999/`, each containing at most 1000 files, is already enough.
- **File size**: When uploading large files (e.g. model weights), we strongly recommend splitting them **into chunks of around 5GB each**.
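To make the chunking tip concrete, here is a rough sketch (not part of the PR itself) of what splitting a large file before upload could look like. The file name `model.bin`, the repo id `username/my-large-model`, and the `split_into_chunks` helper are made-up placeholders; only `HfApi.upload_folder` is an actual `huggingface_hub` call.

```python
import os
from huggingface_hub import HfApi

CHUNK_SIZE = 5 * 1024**3    # target ~5GB per uploaded file, as recommended above
BUFFER_SIZE = 64 * 1024**2  # read the source file in 64MB pieces to keep memory usage low

def split_into_chunks(src_path: str, out_dir: str) -> None:
    """Split src_path into numbered chunk files: model.bin.000, model.bin.001, ..."""
    os.makedirs(out_dir, exist_ok=True)
    base = os.path.basename(src_path)
    with open(src_path, "rb") as src:
        index, written = 0, 0
        dst = open(os.path.join(out_dir, f"{base}.{index:03d}"), "wb")
        while True:
            buf = src.read(BUFFER_SIZE)
            if not buf:
                break
            if written + len(buf) > CHUNK_SIZE:
                # current chunk is full: start the next one
                dst.close()
                index, written = index + 1, 0
                dst = open(os.path.join(out_dir, f"{base}.{index:03d}"), "wb")
            dst.write(buf)
            written += len(buf)
        dst.close()

split_into_chunks("model.bin", "chunks/")

# Upload every chunk in a single call (one commit).
api = HfApi()
api.upload_folder(repo_id="username/my-large-model", folder_path="chunks/", repo_type="model")
```

Since the chunk suffixes are zero-padded, downloaders can reassemble the original file with something as simple as `cat model.bin.* > model.bin`.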
Late to the party, but I would have gone larger, e.g. 20GB or at least 10GB (CloudFront caches up to 30GB if I'm not mistaken).
@Pierrci @huggingface/moon-landing-back
Yes, but I don't really see the benefit; as mentioned just below in the doc, splitting into small chunks is better for uploading/downloading and retries, and I'm not sure there are many advantages to 30GB chunks?
@julien-c about https://discuss.huggingface.co/t/is-there-a-size-limit-for-dataset-hosting/14861/13?u=severo, we had that information in a previous version of this PR (see the table above: #1565 (comment)). Should we add it again?
I think we didn't want to do a table with "recommended limits" and "hard limits", so maybe just add a sentence like:
OK: #1624
We've talked several times about having a section in the docs with some tips for users wanting to upload a large amount of data. I tried to sum up everything, with two distinct aspects:
I also reorganized the Upload guide a bit in general. It's becoming quite a long page, but I still think it makes sense to have everything in one place.