
itemsize is torch>=2.1, use element_size() #1630

Merged (1 commit) on Apr 8, 2024

Conversation

@winglian (Contributor) commented Apr 8, 2024

PEFT lists its torch requirement as >= 1.13.0, but .itemsize is only available in torch 2.1+. This unexpectedly breaks users on torch 2.0.

https://github.com/huggingface/peft/blob/main/setup.py#L63
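
For illustration, a minimal sketch of the two calls in question (variable names here are illustrative, not the PEFT call site; the actual line appears in the traceback further down the thread):

```python
import torch

p = torch.nn.Parameter(torch.zeros(10, dtype=torch.bfloat16))

# Tensor method, available on every torch version PEFT supports (>= 1.13):
bytes_per_elem_old_safe = p.element_size()  # 2 for bfloat16

# dtype property, added in torch 2.1; raises AttributeError on torch 2.0:
bytes_per_elem_21_only = p.dtype.itemsize   # 2 on torch >= 2.1
```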

@winglian (Contributor, Author) commented Apr 8, 2024

/cc @younesbelkada @pacman100

@younesbelkada (Contributor) left a comment

LGTM, thanks @winglian - wdyt @pacman100 @BenjaminBossan?

@pacman100 (Contributor) left a comment

Thank you @winglian for the quick fix!

@winglian (Contributor, Author) commented Apr 8, 2024

You can verify by searching for itemsize in the 2.0 docs: https://pytorch.org/docs/2.0/search.html?q=itemsize&check_keywords=yes&area=default

vs. the 2.1 docs: https://pytorch.org/docs/2.1/search.html?q=itemsize&check_keywords=yes&area=default
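
The same check can also be done at runtime; a minimal sketch:

```python
import torch

# True on torch >= 2.1, False on 2.0.x: torch.dtype.itemsize was only added in 2.1.
print(torch.__version__, hasattr(torch.float32, "itemsize"))
```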

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan (Member) left a comment

Good catch, indeed this should work the same while being compatible with older PyTorch versions.

(It would be nice if the PyTorch docs indicated in which version a feature was added.)

@younesbelkada merged commit e07095a into huggingface:main Apr 8, 2024
14 checks passed
@BenjaminBossan (Member) commented

you can verify by searching for itemsize in the 2.0 docs

Yes, but what I mean is that when just browsing the docs, it's not possible to see whether something was added recently. You have to suspect that something is new and then look it up in older docs; it would be nice if that weren't necessary. But either way, it's not like we can change this ;)

@zankner commented Apr 8, 2024

Potentially relevant, running off of main and using torch=2.2.2+cu121 I now get the following error:

File "/usr/lib/python3/dist-packages/peft/peft_model.py", line 543, in get_nb_trainable_parameters
    num_bytes = param.quant_storage.element_size() if hasattr(param, "quant_storage") else 1
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'torch.dtype' object has no attribute 'element_size'
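
The failure reproduces with a bare dtype, no PEFT or bitsandbytes needed; presumably param.quant_storage is a torch.dtype here, and a dtype exposes itemsize (torch >= 2.1) but not the Tensor method element_size():

```python
import torch

dt = torch.uint8                          # stand-in for a quant_storage dtype
torch.zeros(3, dtype=dt).element_size()   # 1 -- element_size() is a Tensor method
dt.itemsize                               # 1 on torch >= 2.1 (AttributeError on 2.0)
# dt.element_size()                       # AttributeError: 'torch.dtype' object has no attribute 'element_size'
```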

@BenjaminBossan (Member) commented

Potentially relevant, running off of main and using torch=2.2.2+cu121 I now get the following error:

See also here: https://github.com/huggingface/peft/actions/runs/8609302545/job/23593013504

BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Apr 9, 2024
Should fix the error introduced by huggingface#1630.

AFAICT, element_size should be called on the parameter, not the dtype.
Unfortunately, I had issues getting older PyTorch versions to work with
bnb, so I haven't tested the initial issue.

To be safe, I also re-added the previous code path using itemsize,
although it might be unnecessary (we would have to check the PyTorch
code to verify when the different attributes/methods were added).
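
For illustration, a minimal sketch of the guarded lookup the commit message describes; this is an assumption about the shape of the follow-up fix, not the merged PEFT code, and _num_bytes is a hypothetical name:

```python
def _num_bytes(param):
    # bitsandbytes 4-bit params carry a quant_storage attribute, which is a torch.dtype.
    if hasattr(param, "quant_storage"):
        storage = param.quant_storage
        if hasattr(storage, "itemsize"):   # torch >= 2.1: the dtype knows its own size
            return storage.itemsize
        return param.element_size()        # older torch: ask the parameter (a Tensor) instead
    return 1                               # same default as the original code path
```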
BenjaminBossan added a commit that referenced this pull request Apr 9, 2024
DTennant pushed a commit to DTennant/peft that referenced this pull request Apr 16, 2024
@peterdonnelly1 commented

PEFT lists its torch requirement as >= 1.13.0, but .itemsize is only available in torch 2.1+. This unexpectedly breaks users on torch 2.0.

https://github.com/huggingface/peft/blob/main/setup.py#L63

Can confirm. I am using the docker image 'winglian/axolotl:main-20240415-py3.11-cu121-2.2.1' and hitting this precise issue.
I've just edited peft_model.py to use element_size() rather than itemsize; fingers crossed, and thanks for this.

Asides: (1) Despite the tag name, this image installs torch 2.0.1, not 2.2.1 (I used it because I couldn't get the latest version, by ostensible version number, to run on my setup, Ubuntu 22.04/RTX 4090). (2) On Docker Hub, the 'main-base' tag points to an image that's a year old. I don't know whether that's intended or just me not understanding Docker Hub.

DTennant pushed a commit to DTennant/peft that referenced this pull request Apr 19, 2024