Error when using xformers and doing loss backward #535
Comments
Thanks for reporting! Do you mind sharing what resolution you are using? Also, can you report the output of …
@danthe3rd
For convenience of reproduction, I made a more concise test case:
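The original snippet is not reproduced in this extract. As a stand-in, a minimal sketch of the kind of test case being described would look like the following; the shapes, dtype, and device are illustrative assumptions, not the reporter's exact values.

```python
import torch
import xformers.ops as xops

# Illustrative shapes only: (batch, seq_len, num_heads, head_dim)
q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16, requires_grad=True)
k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16, requires_grad=True)
v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16, requires_grad=True)

out = xops.memory_efficient_attention(q, k, v)  # forward pass succeeds
out.sum().backward()  # affected versions raised an error during the backward dispatch
```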
Thanks a lot - this is really useful! @artkorenev has been working on this and should have a fix coming soon.
Is there any solution now? I have the same problem when training a Stable Diffusion model. The forward pass works, but when I calculate the loss and call backward, it raises `NotImplementedError(f"No operator found for this attention: {inp}")`. The error comes from "xformers/ops/fmha/dispatch.py", line 68, in `_dispatch_bw`. The `inp` has shapes query: (64, 256, 1, 128), key: (64, 77, 1, 128), value: (64, 77, 1, 128). I guess the dimension of size 1 may cause this dispatch error?
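For reference, the single-head hypothesis above can be exercised in isolation with a sketch like the one below. The shapes are taken from the comment; the device and dtype are assumptions.

```python
import torch
import xformers.ops as xops

# Cross-attention shapes from the comment: query (64, 256, 1, 128),
# key/value (64, 77, 1, 128), i.e. a single attention head of dim 128.
q = torch.randn(64, 256, 1, 128, device="cuda", dtype=torch.float16, requires_grad=True)
k = torch.randn(64, 77, 1, 128, device="cuda", dtype=torch.float16, requires_grad=True)
v = torch.randn(64, 77, 1, 128, device="cuda", dtype=torch.float16, requires_grad=True)

out = xops.memory_efficient_attention(q, k, v)  # forward succeeds
out.mean().backward()  # reported to raise NotImplementedError in _dispatch_bw
```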
Closing this as it's resolved now. @leeruibin this is a different / unrelated issue. Can you open a new one with the entire stacktrace/log of the error? |
Just wondering, which release or dev version includes the fix?
Thanks for your reply, I have opened a new issue at this link.
Whoops, I forgot to circle back here. It has been fixed in 3ea7307.
🐛 Bug
Associated issue: huggingface/diffusers#1314
I get an error when I enable xformers for the UNet and try to do a backward pass:
Command
To Reproduce
Steps to reproduce the behavior:
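The original commands and script are not included in this extract, so the following is only an illustrative sketch of the pattern being reported: enable xformers attention on a Stable Diffusion UNet, run a forward pass, and backpropagate through a dummy training step. The model id, tensor shapes, and the method used to enable xformers are assumptions and may differ from the reporter's setup.

```python
import torch
from diffusers import UNet2DConditionModel

# Illustrative checkpoint; the reporter's actual model is not known from this issue.
unet = UNet2DConditionModel.from_pretrained(
    "CompVis/stable-diffusion-v1-4", subfolder="unet", torch_dtype=torch.float16
).to("cuda")

# Recent diffusers releases expose this helper; older 0.7.x-era code used
# unet.set_use_memory_efficient_attention_xformers(True) instead.
unet.enable_xformers_memory_efficient_attention()

# Dummy training step with SD v1-style shapes: latents, timestep, CLIP text embeddings.
latents = torch.randn(1, 4, 64, 64, device="cuda", dtype=torch.float16)
timesteps = torch.tensor([10], device="cuda")
encoder_hidden_states = torch.randn(1, 77, 768, device="cuda", dtype=torch.float16)

noise_pred = unet(latents, timesteps, encoder_hidden_states).sample
loss = noise_pred.float().mean()
loss.backward()  # the backward pass is where the reported error occurred
```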
Expected behavior
The backward pass should complete without raising an error.
Environment
System Info
diffusers version: 0.7.2
Platform: Windows-10-10.0.19041-SP0
Python version: 3.7.7
PyTorch version (GPU?): 1.12.0+cu113 (True)
Huggingface_hub version: 0.10.1
Transformers version: 4.24.0
Using GPU in script?: Yes
Using distributed or parallel set-up in script?: No
xformers version: efdca02