
Re-organize SLL ops, pt 4 #3644

Open · wants to merge 1 commit into main from export-D68924000
Conversation

@q10 (Contributor) commented Jan 30, 2025

Summary:

- Re-organize `jagged_flash_attention_basic`, `jagged_softmax`, and `jagged2_softmax`

Differential Revision: D68924000
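For context on what these ops compute, here is a minimal, illustrative PyTorch sketch of a jagged softmax: softmax applied independently over each variable-length segment of a flattened jagged tensor, with segment boundaries given by an offsets tensor. This is a reference-style sketch for readers, not the FBGEMM/SLL implementation; the function name and signature are hypothetical.

```python
import torch

def jagged_softmax_reference(values: torch.Tensor, offsets: torch.Tensor) -> torch.Tensor:
    """Illustrative reference (not the FBGEMM kernel).

    values:  [total_rows, D] flattened jagged tensor
    offsets: [B + 1] row offsets delimiting each batch element's segment
    """
    out = torch.empty_like(values)
    for b in range(offsets.numel() - 1):
        start, end = int(offsets[b]), int(offsets[b + 1])
        if end > start:
            # Softmax over the rows belonging to batch element b, per feature column.
            out[start:end] = torch.softmax(values[start:end], dim=0)
    return out

# Example: a batch of 2 sequences with lengths 3 and 2, feature dim 4.
values = torch.randn(5, 4)
offsets = torch.tensor([0, 3, 5])
print(jagged_softmax_reference(values, offsets).shape)  # torch.Size([5, 4])
```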

@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D68924000

@netlify bot commented Jan 30, 2025

Deploy Preview for pytorch-fbgemm-docs ready!

- Latest commit: f9d2c52
- Latest deploy log: https://app.netlify.com/sites/pytorch-fbgemm-docs/deploys/679eb5865b3ce00008534b0f
- Deploy Preview: https://deploy-preview-3644--pytorch-fbgemm-docs.netlify.app


q10 added a commit to q10/FBGEMM that referenced this pull request Jan 31, 2025
Summary:
X-link: facebookresearch/FBGEMM#720

Pull Request resolved: pytorch#3644

- Re-organize `jagged_flash_attention_basic`, `jagged_softmax`, and `jagged2_softmax`

Differential Revision: D68924000
@q10 q10 force-pushed the export-D68924000 branch from 88670e1 to 0da5ccf on January 31, 2025 01:51

@q10 q10 force-pushed the export-D68924000 branch from 0da5ccf to 360fb5d on January 31, 2025 01:59
q10 added a commit to q10/FBGEMM that referenced this pull request Jan 31, 2025

q10 added a commit to q10/FBGEMM that referenced this pull request Jan 31, 2025
@q10 q10 force-pushed the export-D68924000 branch from 360fb5d to f601628 on January 31, 2025 18:28

q10 added a commit to q10/FBGEMM that referenced this pull request Jan 31, 2025
@q10 q10 force-pushed the export-D68924000 branch from f601628 to 28126a0 on January 31, 2025 18:50

q10 added a commit to q10/FBGEMM that referenced this pull request Feb 1, 2025
@q10 q10 force-pushed the export-D68924000 branch from 28126a0 to fa930b2 on February 1, 2025 09:13

q10 added a commit to q10/FBGEMM that referenced this pull request Feb 1, 2025
@q10 q10 force-pushed the export-D68924000 branch from fa930b2 to f152ea1 on February 1, 2025 09:20

@q10 q10 force-pushed the export-D68924000 branch from f152ea1 to f9d2c52 on February 2, 2025 00:00