Allow parallelLevel of 1 to start torchrun nproc-per-node #2608
Conversation
Codecov Report
@@           Coverage Diff           @@
##           master    #2608   +/-   ##
=======================================
  Coverage   72.39%   72.39%
=======================================
  Files          85       85
  Lines        3956     3956
  Branches       58       58
=======================================
  Hits         2864     2864
  Misses       1088     1088
  Partials        4        4
Thanks @lxning, LGTM, I could successfully test it.
LGTM, see comments. Would be good to make sure that this scenario is covered by one of our unit tests.
logger.warn("Invalid parallelLevel:{}, set as 1", parallelLevel);
this.parallelLevel = 1;
if (parallelLevel < 0) {
    logger.warn("Invalid parallelLevel:{}, set as 0", parallelLevel);
If you set the level to 1 and then to -1, the level stays at 1 and is not set to 0 as the warning indicates.
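To illustrate the reviewer's point: the warning promises "set as 0", so the setter should actually reset the stored field on negative input rather than leave its previous value. A minimal sketch of a setter whose behavior matches its warning is below; the class name, logging call, and field default are illustrative, not TorchServe's actual code.

```java
public class ParallelLevelSketch {
    private int parallelLevel = 1;

    public void setParallelLevel(int parallelLevel) {
        if (parallelLevel < 0) {
            // Log AND reset, so the stored value matches what the warning claims.
            System.err.printf("Invalid parallelLevel:%d, set as 0%n", parallelLevel);
            this.parallelLevel = 0;
            return;
        }
        this.parallelLevel = parallelLevel;
    }

    public int getParallelLevel() {
        return parallelLevel;
    }
}
```

With this version, setting the level to 1 and then to -1 leaves the field at 0, consistent with the logged message.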
@@ -153,7 +153,7 @@ public void startWorker(int port, String deviceIds)
     argl.add(configManager.getMetricsConfigPath());

     try {
-        latch = new CountDownLatch(model.getParallelLevel());
+        latch = new CountDownLatch(model.getParallelLevel() > 0 ? model.getParallelLevel() : 1);
nit: you could replace this and the following occurrences with the more concise max(1, model.getParallelLevel())
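The nit above can be sketched as a small helper that clamps the parallel level to at least 1, replacing the repeated ternary. This is an illustrative sketch; the helper name `latchCount` is hypothetical, not part of TorchServe.

```java
public class CountDownLatchSketch {
    // Clamp the parallel level so a CountDownLatch is always created
    // with at least one count, as the reviewer suggests.
    static int latchCount(int parallelLevel) {
        return Math.max(1, parallelLevel);
    }
}
```

A call site would then read `latch = new CountDownLatch(latchCount(model.getParallelLevel()));`, which is equivalent to the ternary in the diff but states the intent once.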
Description
Please read our CONTRIBUTING.md prior to creating your first pull request.
Please include a summary of the feature or issue being fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
This PR supports the TP Llama example, which requires torchrun nproc-per-node=1.
Fixes #(issue)
Type of change
Please delete options that are not relevant.
Feature/Issue validation/testing
Please describe the Unit or Integration tests that you ran to verify your changes and relevant result summary. Provide instructions so it can be reproduced.
Please also list any relevant details for your test configuration.
The test is demonstrated in the example.
Checklist: