LayerNorm error: TypeError: missing a required argument: 'num_warps' #215
Comments
Are you on the correct branch/commit? I'm not able to reproduce it myself. However, I'm using triton 3.0.
@S1ro1 I'm using the main branch, i.e., 0.2.1.dev20240905032819. I just cloned the latest code from the main branch and ran it again, and the error still occurs. I'm using triton 2.3.1 since torch 2.3.1 requires it.

In [1]: import torch
   ...:
   ...: from liger_kernel.ops.layer_norm import LigerLayerNormFunction
   ...: from liger_kernel.transformers.functional import liger_layer_norm
   ...: from liger_kernel.transformers.layer_norm import LigerLayerNorm

In [2]: x = torch.randn(1, 128, 1024, dtype=torch.bfloat16, device='cuda:0', requires_grad=True)

In [3]: liger_ln = LigerLayerNorm(1024, eps=1e-6).to(torch.bfloat16).cuda()

In [6]: liger_ln(x)

git log -q:
You are right: if I upgrade triton to 3.0.0, the error disappears. But I need triton 2.3.1.
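Since the failure reproduces on triton 2.3.1 but not 3.0.0, a test or script could gate on the Triton version until the incompatibility is resolved. A minimal sketch (this helper is hypothetical, not part of liger-kernel, and the version cutoff is taken from this thread rather than any official compatibility matrix):

```python
# Hypothetical helper: gate on the Triton version, since this thread
# reports the num_warps TypeError on 2.3.1 but not on 3.0.0.
def triton_at_least(version_str, minimum=(3, 0)):
    """Return True if an 'X.Y.Z' version string meets a (major, minor) minimum."""
    parts = tuple(int(p) for p in version_str.split(".")[:2])
    return parts >= minimum

# Versions reported in this issue:
print(triton_at_least("2.3.1"))  # → False
print(triton_at_least("3.0.0"))  # → True
```

In practice the version string would come from `triton.__version__`, and a failing configuration could be skipped or warned about rather than crashing mid-call.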
OK, installing triton 2.3.1 broke all of my installs, so I can't really help you with this unless I figure it out.
Good fix!
🐛 Describe the bug
Running test/transformers/test_layer_norm.py yields TypeError: missing a required argument: 'num_warps'.
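For context on the error text itself: "missing a required argument: ..." is the message Python's inspect module raises when a call is bound against a signature that still expects a parameter, which is consistent with a kernel launcher whose argument handling changed between Triton versions. A minimal sketch of how this TypeError arises (the function below is hypothetical, not Liger or Triton code):

```python
import inspect

# Hypothetical kernel-like function whose signature requires num_warps,
# standing in for a launcher that binds call arguments via inspect.
def fake_kernel(x, num_warps):
    return x

sig = inspect.signature(fake_kernel)
try:
    sig.bind(1)  # num_warps not supplied at the call site
    err = None
except TypeError as e:
    err = str(e)

print(err)  # → missing a required argument: 'num_warps'
```

This only illustrates where the message comes from; the actual fix is in how the kernel call supplies (or no longer supplies) `num_warps` for the Triton version in use.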
Reproduce
No response
Versions
Environment Report:
Operating System: Linux-5.10.0-1.0.0.28-x86_64-with-glibc2.27
Python version: 3.9.19
PyTorch version: 2.3.1+cu121
CUDA version: 12.1
Triton version: 2.3.1
Transformers version: 4.45.0.dev0