
add auto configurator to NeMo #9688

Closed
wants to merge 157 commits into r2.0.0rc1 from dpykhtar/nemo_autoconf

Conversation

dimapihtar
Collaborator

What does this PR do?

Add a one-line overview of what this PR aims to accomplish.

Collection: [Note which collection this PR will affect]

Changelog

  • Add specific line-by-line info of high-level changes in this PR.

Usage

  • You can potentially add a usage example below
# Add a code snippet demonstrating how to use this 
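
A hedged sketch of possible usage, assembled only from details quoted in the review comments below (the generate_configs() call and the Config import); the runner class name, its arguments, and the return shape are assumptions, not this PR's final interface:

# Hypothetical sketch: every name below except generate_configs() and the
# Config import (both visible in the review comments further down) is an
# assumption about this PR's interface, not the merged API.
from nemo.collections.llm.utils import Config

runner = AutoConfigurator(   # assumed name of the runner added by this PR
    model="gpt3_2b",         # assumed: model preset to search configs for
    num_nodes=8,             # assumed: cluster size available for the search
)

# Get generated candidate configs (method name quoted in the review below).
configs = runner.generate_configs()
for name, config in configs.items():  # assumed dict-like return
    print(name, config)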

GitHub Actions CI

The Jenkins CI system has been replaced by GitHub Actions self-hosted runners.

The GitHub Actions CI will run automatically when the "Run CICD" label is added to the PR.
To re-run CI, remove the label and add it again.
To run CI on an untrusted fork, a NeMo user with write access must first click "Approve and run".

Before your PR is "Ready for review"

Pre-checks:

  • Make sure you have read and followed the Contributor guidelines
  • Did you write any new necessary tests?
  • Did you add or update any necessary documentation?
  • Does the PR affect components that are optional to install? (e.g., Numba, Pynini, Apex)
    • Reviewer: Does the PR have correct import guards for all optional libraries?

PR Type:

  • New Feature
  • Bugfix
  • Documentation

If you haven't finished some of the above items, you can still open a "Draft" PR.

Who can review?

Anyone in the community is free to review the PR once the checks have passed.
The Contributor guidelines list specific people who can review PRs in various areas.

Additional Information

  • Related to # (issue)

@github-actions github-actions bot added the NLP label Jul 11, 2024
@dimapihtar dimapihtar changed the base branch from main to r2.0.0rc1 July 11, 2024 12:00
@github-actions github-actions bot removed the NLP label Jul 11, 2024
@github-actions github-actions bot added the NLP label Jul 26, 2024
@github-actions github-actions bot removed the NLP label Jul 31, 2024
@dimapihtar dimapihtar marked this pull request as ready for review July 31, 2024 14:38
@marcromeyn
Collaborator

General comment: NeMo uses Google-style docstrings, so please use that format.

)

# Get generated configs
configs = runner.generate_configs()
Collaborator

Why do we need to initialize the objects here? That could be removed, since the runner can return a run.Partial of the train function. Then we can just build that and call it.
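
A minimal sketch of the suggested flow, assuming nemo_run's fiddle-based run.Partial; train() and its parameters are placeholders, not NeMo's actual train entry point:

# Minimal sketch -- assumes nemo_run's fiddle-based run.Partial;
# train() and its parameters are placeholders, not NeMo's real API.
import fiddle as fdl
import nemo_run as run

def train(num_layers: int, lr: float) -> None:
    ...  # stand-in for the real train function

# Instead of initializing objects, the runner returns a lazy Partial.
train_partial = run.Partial(train, num_layers=12, lr=1e-4)

# Nothing is instantiated yet, so overrides stay cheap.
train_partial.num_layers = 24

# Build the underlying functools.partial, then call it.
fdl.build(train_partial)()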

from nemo.collections.llm.utils import Config


class Basic:
Collaborator

@marcromeyn marcromeyn Aug 20, 2024


Is there still a need for this abstraction layer? If we have a Partial[train] that we pass around to change, we don't need this class, since everything is overridable (e.g. train.model.config.num_layers = 10).
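
A sketch of that point, with Model/ModelConfig as illustrative stand-ins rather than NeMo classes, showing why a Basic-style wrapper becomes unnecessary:

# Sketch only -- Model and ModelConfig are illustrative stand-ins.
from dataclasses import dataclass, field

import nemo_run as run

@dataclass
class ModelConfig:
    num_layers: int = 32

@dataclass
class Model:
    config: ModelConfig = field(default_factory=ModelConfig)

def train(model: Model) -> None:
    ...  # stand-in for the real train entry point

# With a Partial[train] in hand, any field is reachable by attribute path:
train_cfg = run.Partial(train, model=run.Config(Model, config=run.Config(ModelConfig)))
train_cfg.model.config.num_layers = 10  # the exact override from the comment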

return configs


def generate_grid_search_configs(
Collaborator

Could this function take in Partial[train] instead?
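
A hypothetical shape for that suggestion; the grid dict and the deepcopy-per-candidate strategy are illustrative, not this PR's implementation:

# Hypothetical shape of the reviewer's suggestion -- not this PR's code.
import copy
import itertools
from typing import Dict, Iterator, List

import nemo_run as run

def generate_grid_search_configs(
    base: run.Partial,
    grid: Dict[str, List],
) -> Iterator[run.Partial]:
    """Yield one overridden copy of base per point in the parameter grid."""
    keys = list(grid)
    for values in itertools.product(*grid.values()):
        candidate = copy.deepcopy(base)
        for key, value in zip(keys, values):
            setattr(candidate, key, value)  # e.g. num_layers, lr
        yield candidate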

marcromeyn and others added 13 commits August 27, 2024 05:48
* Fix when optimizers are setup for PEFT

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

* Init DDP inside PEFT

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

* Some fixes, loss seems to become nan with peft for some reason

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

* Loss goes down on fp32

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

* Simplifying FNMixin

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

* Fix bug with new checkpoint-io

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

* Fix failing test: test_peft_on_train_epoch_start_with_adapter

* Apply isort and black reformatting

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>

---------

Signed-off-by: marcromeyn <marcromeyn@users.noreply.github.com>
Co-authored-by: marcromeyn <marcromeyn@users.noreply.github.com>
Co-authored-by: Chen Cui <chcui@nvidia.com>
Signed-off-by: huvunvidia <86480512+huvunvidia@users.noreply.github.com>
Co-authored-by: huvunvidia <86480512+huvunvidia@users.noreply.github.com>
* Parametrize FPS group

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>

* Apply isort and black reformatting

Signed-off-by: mikolajblaz <mikolajblaz@users.noreply.github.com>

* Change default to False

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>

* Add logic to new ckptIO

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>

* Turn on parallel save by default

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>

---------

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>
Signed-off-by: mikolajblaz <mikolajblaz@users.noreply.github.com>
Co-authored-by: Dmytro Pykhtar <37850217+dimapihtar@users.noreply.github.com>
Signed-off-by: dimapihtar <dpihtar@gmail.com>
Signed-off-by: dimapihtar <dimapihtar@users.noreply.github.com>
* Change default parallel_save to False (#9633)

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>

* Unwrap ckpt_io for model opt (async save) (#9622) (#9634)

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>

* add reset_lr documentation

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix style

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix style

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix style

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* add image

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix typo

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix plot

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix plot

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* change plot size

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* fix style

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* move image

Signed-off-by: dimapihtar <dpihtar@gmail.com>

* add reset_lr to intro page

Signed-off-by: dimapihtar <dpihtar@gmail.com>

---------

Signed-off-by: Mikołaj Błaż <mblaz@nvidia.com>
Signed-off-by: dimapihtar <dpihtar@gmail.com>
Co-authored-by: mikolajblaz <mikolajblaz@users.noreply.github.com>
dimapihtar and others added 9 commits August 27, 2024 05:50
Signed-off-by: dimapihtar <dpihtar@gmail.com>
Signed-off-by: dimapihtar <dimapihtar@users.noreply.github.com>
@github-actions github-actions bot removed the common label Aug 28, 2024
Collaborator

@ko3n1g ko3n1g left a comment


A PR of this size is not acceptable. Can we split this up into smaller chunks? (Or is a rebase simply missing?)


@github-advanced-security github-advanced-security bot left a comment


CodeQL found more than 20 potential problems in the proposed changes. Check the Files changed tab for more details.

@marcromeyn
Collaborator

This is already a big PR, which I feel is OK in this particular case since it's a port of functionality that currently exists in the launcher. But it seems it's not rebased correctly and therefore contains many unrelated changes. Can this be fixed, @dimapihtar?

@dimapihtar
Collaborator Author

> This is already a big PR, which I feel is OK in this particular case since it's a port of functionality that currently exists in the launcher. But it seems it's not rebased correctly and therefore contains many unrelated changes. Can this be fixed, @dimapihtar?

@marcromeyn @ko3n1g
No worries about that. This was my attempt to rebase the PR onto the main branch, but it still has a lot of conflicts, so I've already created a new one based on main. I'm keeping this PR open so as not to miss @marcromeyn's comments. I'll close this PR a bit later.

New PR:
#10270

@dimapihtar dimapihtar closed this Sep 10, 2024
@dimapihtar dimapihtar deleted the dpykhtar/nemo_autoconf branch September 10, 2024 07:55