
bugfix on step4 #4228

Merged: 2 commits merged into PaddlePaddle:develop on Jan 4, 2023
Conversation

SylarTiaNII (Contributor)
Bugfix on step4 of the BERT migration: change diff_threshold from the default value to 0.01.
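For context, a minimal sketch of where such a threshold typically shows up in the step4 (loss alignment) check of PaddlePaddle's migration guides, assuming the check uses reprod_log's ReprodDiffHelper; the file paths and call site below are illustrative, not the exact code touched by this PR:

```python
# Illustrative sketch (not the exact code changed in this PR): compare the
# loss values dumped by the reference (PyTorch) run and the Paddle run in
# step4 of the BERT migration, assuming reprod_log's ReprodDiffHelper is used.
from reprod_log import ReprodDiffHelper

helper = ReprodDiffHelper()
torch_info = helper.load_info("./result/loss_ref.npy")      # hypothetical path
paddle_info = helper.load_info("./result/loss_paddle.npy")  # hypothetical path

helper.compare_info(torch_info, paddle_info)
# The PR loosens the tolerance for this step: pass diff_threshold=0.01
# explicitly instead of relying on the default.
helper.report(
    diff_method="mean",
    diff_threshold=0.01,
    path="./result/log/loss_diff.log",
)
```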

@paddle-bot (bot) commented Dec 23, 2022

Thanks for your contribution!

@CLAassistant commented Dec 23, 2022

CLA assistant check
All committers have signed the CLA.

@codecov (bot) commented Jan 3, 2023

Codecov Report

Merging #4228 (97dd36e) into develop (f01d8eb) will decrease coverage by 0.01%.
The diff coverage is n/a.

@@             Coverage Diff             @@
##           develop    #4228      +/-   ##
===========================================
- Coverage    36.30%   36.30%   -0.01%     
===========================================
  Files          419      419              
  Lines        59222    59222              
===========================================
- Hits         21499    21498       -1     
- Misses       37723    37724       +1     
| Impacted Files                             | Coverage Δ                 |
|--------------------------------------------|----------------------------|
| paddlenlp/transformers/roberta/modeling.py | 89.85% <0.00%> (-0.37%) ⬇️ |


@ZHUI (Collaborator) left a comment
LGTM

@ZHUI ZHUI merged commit 255360f into PaddlePaddle:develop Jan 4, 2023
3 participants