add auto configurator to NeMo #9688
Conversation
nemo/collections/llm/tools/auto_configurator/core/training_config.py
General comment: NeMo uses Google-style docstrings, so please use that format.
    # Get generated configs
    configs = runner.generate_configs()
Why do we need to initialize the objects here? That could be removed, since the runner can return a run.Partial of the train function. Then we can just build that and call it.
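The suggested pattern might look roughly like this. This is a hedged sketch that approximates NeMo-Run's run.Partial with functools.partial; the train function and the make_train_partial helper are hypothetical stand-ins, not the actual NeMo API:

```python
# Sketch of the reviewer's suggestion: the runner returns a partial of
# the train function instead of instantiating objects itself, and the
# caller builds and invokes it. `run.Partial` is approximated here with
# functools.partial; all names below are illustrative.
from functools import partial


def train(model_name: str, num_layers: int, lr: float) -> dict:
    """Stand-in train function; returns its resolved config."""
    return {"model": model_name, "num_layers": num_layers, "lr": lr}


def make_train_partial(model_name: str) -> partial:
    # The runner only *describes* the run; nothing is built or run yet.
    return partial(train, model_name, num_layers=12, lr=3e-4)


train_fn = make_train_partial("gpt3-126m")
# The caller can still override arguments before executing.
result = train_fn(num_layers=10)
```

Deferring construction this way keeps the runner declarative: callers decide when (and with which overrides) the train function actually executes.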
    from nemo.collections.llm.utils import Config

    class Basic:
Is there still a need for this abstraction layer? If we have a Partial[train] that we pass around to change, we don't need this class, since everything is overridable (e.g. train.model.config.num_layers = 10).
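A minimal sketch of why a wrapper class may be unnecessary, assuming the config is a plain mutable object tree; ModelConfig, Model, and TrainSpec are illustrative names, not NeMo classes:

```python
# With a mutable config tree attached to the train spec, callers can
# override any field directly, which is the reviewer's argument against
# a separate Basic abstraction layer. All classes here are stand-ins.
from dataclasses import dataclass, field


@dataclass
class ModelConfig:
    num_layers: int = 12
    hidden_size: int = 768


@dataclass
class Model:
    config: ModelConfig = field(default_factory=ModelConfig)


@dataclass
class TrainSpec:
    model: Model = field(default_factory=Model)


train = TrainSpec()
# Any field is reachable and overridable without wrapper methods:
train.model.config.num_layers = 10
```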
        return configs

    def generate_grid_search_configs(
Could this function take in Partial[train] instead?
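In that spirit, a grid search over candidate settings could be sketched as follows; the function signature and the example grid are assumptions for illustration, not the auto-configurator's actual API:

```python
# Illustrative grid search: produce one candidate config per
# combination of grid values, each derived from a shared base spec.
# The parameter names (tensor_parallel, micro_batch_size) are assumed
# examples of what an auto-configurator might sweep over.
from copy import deepcopy
from itertools import product


def generate_grid_search_configs(base: dict, grid: dict) -> list:
    """Return one config per combination of grid values."""
    keys = list(grid)
    configs = []
    for values in product(*(grid[k] for k in keys)):
        cfg = deepcopy(base)
        cfg.update(dict(zip(keys, values)))
        configs.append(cfg)
    return configs


base = {"model": "gpt3-126m", "lr": 3e-4}
grid = {"tensor_parallel": [1, 2], "micro_batch_size": [1, 2, 4]}
configs = generate_grid_search_configs(base, grid)
```

If the base were a Partial[train] rather than a dict, the same loop would apply overrides to copies of the partial instead of updating dicts.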
Commits included in this push:

* Fix when optimizers are setup for PEFT
* Init DDP inside PEFT
* Some fixes, loss seems to become nan with peft for some reason
* Loss goes down on fp32
* Simplifying FNMixin
* Fix bug with new checkpoint-io
* Fix failing test: test_peft_on_train_epoch_start_with_adapter
* Apply isort and black reformatting (repeated after each change)
* Parametrize FPS group
* Change default to False
* Add logic to new ckptIO
* Turn on parallel save by default
* Change default parallel_save to False (#9633)
* Unwrap ckpt_io for model opt (async save) (#9622) (#9634)
* add reset_lr documentation
* fix style
* add image
* fix typo
* fix plot
* change plot size
* move image
* add reset_lr to intro page
Force-pushed from e2f6d7b to 7ddb11a
A PR of this size is not acceptable. Can we split this up into smaller chunks? (Or is a rebase simply missing?)
CodeQL found more than 20 potential problems in the proposed changes. Check the Files changed tab for more details.
This is already a big PR, which in this particular case I feel is OK since it's a port of functionality that exists in the launcher now. But it seems it's not rebased correctly and therefore contains many unrelated changes. Can this be fixed @dimapihtar?
@marcromeyn @ko3n1g New PR:
What does this PR do ?
Add a one line overview of what this PR aims to accomplish.
Collection: [Note which collection this PR will affect]
Changelog
Usage
# Add a code snippet demonstrating how to use this
GitHub Actions CI
The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.
The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove and add the label again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".
Before your PR is "Ready for review"
Pre checks:
PR Type:
If you haven't finished some of the above items, you can still open a "Draft" PR.
Who can review?
Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines contain specific people who can review PRs to various areas.
Additional Information