Hi,

Sorry to open this issue here — the issue tracker is not available in the speechtokenizer_trainer repo.

Thanks a lot for sharing such useful and meaningful work. I've carefully read and experimented with your training code for weeks, but some issues are still bothering me.

In my experiments, the code works well when `--do_distillation` is turned off. But when distillation is enabled, the gradients of SpeechTokenizer become NaN after the first backward step. I wonder if you have successfully reproduced SpeechTokenizer, or whether there may still be a problem in `loss_distillation`. Thanks again for sharing.
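For debugging a NaN that appears right after the first backward step, one common first step is to identify exactly which parameters receive a NaN/Inf gradient. Below is a minimal, framework-agnostic sketch: the helper `find_nan_grads` and the parameter names are hypothetical, and in a real PyTorch run you would build the gradient dict from `model.named_parameters()` after `loss.backward()` (or use `torch.autograd.set_detect_anomaly(True)` to localize the offending op).

```python
import math

def find_nan_grads(grads):
    """Return names of parameters whose gradient contains NaN or Inf.

    `grads` maps parameter name -> flat list of gradient values.
    (Hypothetical helper for illustration; in PyTorch, populate it
    from p.grad for each (name, p) in model.named_parameters().)
    """
    bad = []
    for name, g in grads.items():
        if any(math.isnan(v) or math.isinf(v) for v in g):
            bad.append(name)
    return bad

# Toy example: a distillation-style loss (e.g. a cosine term divided
# by a zero norm) can poison one branch of the gradient with NaN.
grads = {
    "encoder.weight": [0.1, float("nan"), -0.2],  # poisoned branch
    "quantizer.codebook": [0.05, 0.01],           # healthy branch
}
print(find_nan_grads(grads))  # ['encoder.weight']
```

If only the distillation branch shows up, a common culprit is a normalization by a norm that can be zero (e.g. cosine similarity against an all-zero teacher embedding); clamping the denominator with a small epsilon usually fixes it.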