
LayerNorm error: TypeError: missing a required argument: 'num_warps' #215

Closed
wizyoung opened this issue Sep 5, 2024 · 6 comments · Fixed by #219
Labels
bug (Something isn't working) · triton (Triton related issues)

Comments

@wizyoung
Contributor

wizyoung commented Sep 5, 2024

🐛 Describe the bug

Running test/transformers/test_layer_norm.py yields TypeError: missing a required argument: 'num_warps'.

Reproduce

No response

Versions

Environment Report:

Operating System: Linux-5.10.0-1.0.0.28-x86_64-with-glibc2.27
Python version: 3.9.19
PyTorch version: 2.3.1+cu121
CUDA version: 12.1
Triton version: 2.3.1
Transformers version: 4.45.0.dev0

@S1ro1
Contributor

S1ro1 commented Sep 5, 2024

Are you on the correct branch/commit? I'm not able to reproduce it myself. However, I'm using Triton 3.0.

@wizyoung
Contributor Author

wizyoung commented Sep 5, 2024

@S1ro1 I'm on the main branch, i.e., 0.2.1.dev20240905032819. I just cloned the latest main branch, ran it again, and the error still occurs. I'm using Triton 2.3.1 because torch 2.3.1 requires it.
To reproduce:

import torch

from liger_kernel.ops.layer_norm import LigerLayerNormFunction
from liger_kernel.transformers.functional import liger_layer_norm
from liger_kernel.transformers.layer_norm import LigerLayerNorm

x = torch.randn(1, 128, 1024, dtype=torch.bfloat16, device='cuda:0', requires_grad=True)
liger_ln = LigerLayerNorm(1024, eps=1e-6).to(torch.bfloat16).cuda()
liger_ln(x)

(screenshot: traceback ending in TypeError: missing a required argument: 'num_warps')

git log -q:

commit a307eac8b05582ee110bfd02e6dc6819a029af1c (HEAD -> main, origin/main, origin/HEAD)
Author: S1ro <54212263+S1ro1@users.noreply.github.com>
Date:   Thu Sep 5 05:27:59 2024 +0200
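
For context, num_warps is a keyword that Triton kernels accept at launch time (alongside num_stages), and the TypeError above indicates that the layer-norm launch path under Triton 2.3.1 ends up requiring it. Below is a minimal, standalone sketch of a Triton kernel launched with an explicit num_warps; it is illustrative only, not the Liger kernel, and the kernel name and block size are made up:

import torch
import triton
import triton.language as tl

@triton.jit
def _scale_kernel(x_ptr, y_ptr, n_elements, scale, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the tensor.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements
    x = tl.load(x_ptr + offsets, mask=mask)
    tl.store(y_ptr + offsets, x * scale, mask=mask)

x = torch.randn(1, 128, 1024, dtype=torch.bfloat16, device="cuda")
y = torch.empty_like(x)
n = x.numel()
grid = (triton.cdiv(n, 1024),)
# num_warps is passed as a launch keyword; both Triton 2.x and 3.x accept it here.
_scale_kernel[grid](x, y, n, 2.0, BLOCK_SIZE=1024, num_warps=4)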

@wizyoung
Contributor Author

wizyoung commented Sep 5, 2024

You are right: if I upgrade Triton to 3.0.0 the error disappears. But I need Triton 2.3.1:

Installing collected packages: triton
  Attempting uninstall: triton
    Found existing installation: triton 2.3.1
    Uninstalling triton-2.3.1:
      Successfully uninstalled triton-2.3.1
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
lmdeploy 0.5.3 requires triton<=2.3.1,>=2.1.0; sys_platform == "linux", but you have triton 3.0.0 which is incompatible.
torch 2.3.1 requires triton==2.3.1; platform_system == "Linux" and platform_machine == "x86_64" and python_version < "3.12", but you have triton 3.0.0 which is incompatible.
torchtext 0.14.1 requires torch==1.13.1, but you have torch 2.3.1 which is incompatible.

@S1ro1
Contributor

S1ro1 commented Sep 5, 2024

OK, installing Triton 2.3.1 broke all of my installs, so I can't really help you with this unless I figure that out.

@yundai424 added the bug (Something isn't working) and triton (Triton related issues) labels Sep 5, 2024
@Tcc0403
Collaborator

Tcc0403 commented Sep 6, 2024

@wizyoung There's a quick fix for triton 2.3.1. Check #219
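
(For anyone who can't open the PR: a compatibility fix of this kind generally comes down to supplying explicitly whatever launch arguments Triton 2.x requires but 3.x fills in differently. The helper below is only a guess at the pattern; the function name and warp heuristic are invented here, and this is not the actual patch in #219.)

def pick_num_warps(block_size: int) -> int:
    # Common heuristic: use more warps for larger blocks.
    if block_size >= 8192:
        return 16
    if block_size >= 2048:
        return 8
    return 4

def launch(kernel, grid, *args, block_size: int, **kwargs):
    # Passing num_warps unconditionally keeps the launch portable: Triton 2.3.x
    # requires it on the failing path reported above, and Triton 3.x accepts it too.
    return kernel[grid](*args, num_warps=pick_num_warps(block_size), **kwargs)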

@wizyoung
Contributor Author

wizyoung commented Sep 6, 2024

good fix!
