
Add replace_with_parallel_cross_entropy flag #9579

Merged

Conversation

@waliwali777 waliwali777 commented Dec 6, 2024

PR types

Others

PR changes

Others

Description

Adds a switch for the replace_with_parallel_cross_entropy pass, so it can be toggled from the launch script.
Enable the pass: --tensor_parallel_config "replace_with_parallel_cross_entropy"
Disable the pass: --tensor_parallel_config ""
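
As a sketch, toggling the pass from a launch command might look like the following. The script name run_pretrain.py and the launcher invocation are illustrative assumptions; only the --tensor_parallel_config values come from this PR.

```shell
# Hypothetical sketch: script name and other arguments are placeholders.

# Enable the replace_with_parallel_cross_entropy pass:
python -u -m paddle.distributed.launch run_pretrain.py \
    --tensor_parallel_config "replace_with_parallel_cross_entropy"

# Disable the pass (an empty config leaves it off):
python -u -m paddle.distributed.launch run_pretrain.py \
    --tensor_parallel_config ""
```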


paddle-bot bot commented Dec 6, 2024

Thanks for your contribution!


codecov bot commented Dec 6, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 52.76%. Comparing base (5e1f01f) to head (55eca3e).
Report is 8 commits behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #9579      +/-   ##
===========================================
- Coverage    52.77%   52.76%   -0.01%     
===========================================
  Files          709      710       +1     
  Lines       111172   111235      +63     
===========================================
+ Hits         58672    58695      +23     
- Misses       52500    52540      +40     


@@ -74,6 +74,7 @@ python -u -m paddle.distributed.launch \
--do_eval \
--device "gpu" \
--data_impl "mmap" \
--replace_with_parallel_cross_entropy 0 \
Contributor

Would it be better to place replace_with_parallel_cross_entropy in the tensor_parallel_config configuration?

Contributor Author
Done.
Moved replace_with_parallel_cross_entropy into tensor_parallel_config for configuration.

@waliwali777 waliwali777 force-pushed the add_replace_cross_entropy_flag branch 9 times, most recently from 0db0cef to e38c9d8 Compare December 12, 2024 08:48
@@ -839,6 +840,103 @@ function llama_pir_auto_fuse_ffn_attention_qkv_MP2() {
echo "=========== $FUNCNAME run end ==========="
}

function llama_pir_auto_replace_with_parallel_cross_entropy_MP2() {
Contributor

Written this way, this adds two tests. It would be better to add the parallel cross entropy parameter check to an existing test, so that only one test is added.

Contributor Author

Done

@waliwali777 waliwali777 force-pushed the add_replace_cross_entropy_flag branch from 252a2d0 to d16239c Compare December 16, 2024 09:20
@waliwali777 waliwali777 force-pushed the add_replace_cross_entropy_flag branch from 5a1ee7b to 55eca3e Compare December 16, 2024 13:26
Contributor

@liym27 liym27 left a comment

LGTM

@wawltor wawltor merged commit 4451c05 into PaddlePaddle:develop Dec 17, 2024
9 of 12 checks passed
3 participants