Hi, I have done pretraining on a Chinese dataset (50 GB) and run downstream finetuning on the Chinese CLUE benchmark. The default hyperparameters are the same as bert-base:
learning_rate: 3e-5,
epochs: 3 or 5
The finetuning results on the benchmark are worse than the official Chinese bert-base released by Google.
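For reference, BERT-style finetuning at a peak learning rate of 3e-5 typically uses linear warmup followed by linear decay. A minimal sketch of that schedule (the 10% warmup proportion and the example step counts are assumptions, not values from this thread):

```python
def linear_warmup_decay(step, num_train_steps, peak_lr=3e-5, warmup_proportion=0.1):
    """Return the learning rate at a given global step.

    BERT-style schedule: linear warmup to peak_lr over the first
    warmup_proportion of training, then linear decay back to 0.
    (warmup_proportion=0.1 is BERT's default; adjust as needed.)
    """
    warmup_steps = int(num_train_steps * warmup_proportion)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    # linear decay from peak_lr down to 0 over the remaining steps
    return peak_lr * (num_train_steps - step) / (num_train_steps - warmup_steps)

# Hypothetical example: 10,000 examples, batch size 32, 3 epochs
num_train_steps = (10_000 // 32) * 3
lr_at_midpoint = linear_warmup_decay(num_train_steps // 2, num_train_steps)
```

If the pretraining setup differs from BERT's (e.g. different batch size or fewer pretraining steps), the same finetuning schedule may need retuning, which could partly explain the benchmark gap.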
---Original---
From: "zhu143xin"<notifications@github.com>
Date: Thu, Jan 28, 2021 20:04 PM
To: "joongbo/tta"<tta@noreply.github.com>;
Cc: "yyht"<htxu91@gmail.com>;"Author"<author@noreply.github.com>;
Subject: Re: [joongbo/tta] I have done some experiments on Chinese using bert-base config, the results are not promising (#6)
Hi, I want to use TTA for some work on Chinese spelling error correction. Have you done any experiments on this?