Moe doc fixes #10077
Conversation
Completed the review again of the moe.rst file. Please add all copyedits for grammar, punctuation, and NeMo Framework naming conventions. Also, make the requested heading changes for Overview, Use MoE, and Configure MoE-specific Loss Function.
@@ -4,7 +4,7 @@ Mixture of Experts
Overview
Delete this heading here. We will move it to another location.
@@ -52,16 +52,18 @@ Other options include:
1. ``moe_input_jitter_eps`` adds noise to the input tensor by applying jitter with a specified epsilon value.

2. ``moe_token_dropping`` enables selectively dropping and padding tokens for each expert to achieve
-   a specified capacity.
+   a specified capacity, similar to GShard, Switch-Transformer, and DeepSpeed-MoE. Briefly, if the number
fix run-on sentence.
moe_token_dropping enables selectively dropping and padding tokens for each expert to achieve a specified capacity. Similar to GShard, Switch-Transformer, and DeepSpeed-MoE. Briefly, if the number of tokens routed to an expert exceeds its capacity, then the exceeding tokens are dropped.
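The capacity-based token-dropping behavior described above can be illustrated with a short sketch. This is not NeMo's actual implementation; the helper function and config dict below are hypothetical, using only the option names (``moe_input_jitter_eps``, ``moe_token_dropping``) that appear in the diff:

```python
# Hypothetical NeMo-style MoE config fragment for illustration only.
# Only moe_input_jitter_eps and moe_token_dropping come from the diff;
# the other keys and the surrounding structure are assumptions.
moe_config = {
    "moe_input_jitter_eps": 0.01,  # jitter noise epsilon applied to the input tensor
    "moe_token_dropping": True,    # drop tokens that exceed an expert's capacity
}


def tokens_kept(tokens_routed: int, capacity: int, token_dropping: bool) -> int:
    """Sketch of the rule: if token dropping is enabled and the number of tokens
    routed to an expert exceeds its capacity, the excess tokens are dropped."""
    if token_dropping:
        return min(tokens_routed, capacity)
    return tokens_routed


# 10 tokens routed to an expert with capacity 8: 2 tokens are dropped.
print(tokens_kept(10, 8, moe_config["moe_token_dropping"]))  # 8
```

With ``moe_token_dropping`` disabled, all routed tokens are kept regardless of capacity.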
Approved changes.
* Moe doc fixes Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com> * JG fixes Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com> --------- Signed-off-by: Alexandros Koumparoulis <akoumparouli@nvidia.com> Signed-off-by: adityavavre <aditya.vavre@gmail.com>
What does this PR do?
Add a one line overview of what this PR aims to accomplish.
Collection: [Note which collection this PR will affect]
Changelog
Usage
# Add a code snippet demonstrating how to use this
GitHub Actions CI
The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.
The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove and re-add the label.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".
Before your PR is "Ready for review"
Pre checks:
PR Type:
If you haven't finished some of the above items, you can still open a "Draft" PR.
Who can review?
Anyone in the community is free to review the PR once the checks have passed.
The contributor guidelines list specific people who can review PRs to various areas.
Additional Information