
[Pallas] Add a bfloat16 flash attention test case #6810

Merged: 1 commit merged into r2.3 on Mar 25, 2024

Conversation

alanwaketan (Collaborator) commented:

Summary:
Add a bfloat16 flash attention test case.

Test Plan:
python test/test_pallas.py

Follow-up fixes included in this change:
- Fix linters
- Skip TPU v2
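
For reference, below is a minimal sketch of what a bfloat16 flash attention test of this kind could look like. It assumes the `flash_attention` wrapper exported from `torch_xla.experimental.custom_kernel` and the `torch_xla.runtime` helpers; the test class name, tensor shapes, and tolerances are illustrative assumptions, not the actual contents of test/test_pallas.py.

```python
import unittest

import torch
import torch_xla.core.xla_model as xm
from torch_xla import runtime as xr
# Pallas flash attention wrapper; the module path follows
# torch_xla.experimental.custom_kernel and may differ across releases.
from torch_xla.experimental.custom_kernel import flash_attention


class FlashAttentionBF16Test(unittest.TestCase):

  @unittest.skipIf(xr.device_type() != 'TPU',
                   "The Pallas flash attention kernel requires a TPU "
                   "(the real test additionally skips TPU v2).")
  def test_flash_attention_wrapper_bf16(self):
    device = xm.xla_device()
    # Hypothetical (batch, heads, seq_len, head_dim) shapes for illustration.
    q = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)
    k = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)
    v = torch.randn(3, 2, 128, 4, dtype=torch.bfloat16).to(device)

    # Output of the Pallas kernel.
    o = flash_attention(q, k, v)

    # Plain-PyTorch reference attention for comparison.
    expected = torch.nn.functional.softmax(
        q @ k.transpose(-2, -1), dim=-1) @ v

    # bfloat16 has few mantissa bits, so use a looser tolerance than float32.
    self.assertTrue(
        torch.allclose(o.cpu(), expected.cpu(), atol=1e-1, rtol=1e-1))


if __name__ == '__main__':
  unittest.main()
```

The looser `allclose` tolerance is the main bfloat16-specific detail: comparing the kernel output against a float reference at float32 tolerances would fail purely due to the reduced precision of the input dtype.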
@alanwaketan alanwaketan changed the base branch from master to r2.3 March 22, 2024 20:26
@alanwaketan alanwaketan requested a review from lsy323 March 22, 2024 20:36
@lsy323 lsy323 merged commit 3522be1 into r2.3 Mar 25, 2024
1 check passed
alanwaketan (Collaborator, Author) commented:

Thanks Jack and Siyuan!
