Add replace_with_parallel_cross_entropy flag #9579
Conversation
Thanks for your contribution!
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@ Coverage Diff @@
##           develop    #9579      +/-   ##
===========================================
- Coverage    52.77%   52.76%   -0.01%
===========================================
  Files          709      710       +1
  Lines       111172   111235      +63
===========================================
+ Hits         58672    58695      +23
- Misses       52500    52540      +40

View full report in Codecov by Sentry.
@@ -74,6 +74,7 @@ python -u -m paddle.distributed.launch \
    --do_eval \
    --device "gpu" \
    --data_impl "mmap" \
    --replace_with_parallel_cross_entropy 0 \
Would it be better to place replace_with_parallel_cross_entropy in the tensor_parallel_config configuration?
Done. Moved replace_with_parallel_cross_entropy to tensor_parallel_config for configuration.
scripts/distribute/ci_case_auto.sh (Outdated)
@@ -839,6 +840,103 @@ function llama_pir_auto_fuse_ffn_attention_qkv_MP2() {
    echo "=========== $FUNCNAME run end ==========="
}

function llama_pir_auto_replace_with_parallel_cross_entropy_MP2() {
Written this way, this adds two new tests. It would be better to add the parallel cross entropy parameter check to an existing test, so that only one test is added.
Done
LGTM
PR types
Others
PR changes
Others
Description
Adds a switch for the replace_with_parallel_cross_entropy pass, so the pass can be toggled from the launch script.

Enable the pass:
--tensor_parallel_config "replace_with_parallel_cross_entropy"

Disable the pass:
--tensor_parallel_config ""
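The toggle above can be sketched in a launch script as follows. This is a hedged illustration, not a script from this PR: the ENABLE_PARALLEL_CE variable and TP_CONFIG name are hypothetical helpers, and only the --tensor_parallel_config values come from the PR description.

```shell
# Hypothetical toggle for the replace_with_parallel_cross_entropy pass.
# Set ENABLE_PARALLEL_CE=0 to disable the pass (empty tensor_parallel_config).
ENABLE_PARALLEL_CE=1

if [ "$ENABLE_PARALLEL_CE" -eq 1 ]; then
  TP_CONFIG="replace_with_parallel_cross_entropy"
else
  TP_CONFIG=""
fi

# The flag would then be passed to the launch command, e.g.:
#   python -u -m paddle.distributed.launch ... \
#       --tensor_parallel_config "$TP_CONFIG"
echo "--tensor_parallel_config \"$TP_CONFIG\""
```

The actual training arguments surrounding the flag are elided here; only the tensor_parallel_config value changes between the two modes.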