During LoRA fine-tuning, only the first loss has a value; every subsequent loss is 0.0 #568
Replies: 8 comments 7 replies
-
Same issue here. Did you solve it, bro?
-
Same issue here. Did you solve it, bro?
-
Same issue here. Did you solve it, bro?
-
Same issue here. Did you solve it, bro?
-
I tried two different environments: one shows this problem and the other doesn't. Could it be caused by the environment?
-
peft==0.6.0 solved it for me; you can try that as a reference.
-
Note the author actually recommends peft 0.7.1, as of 2024/3/5.
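Based on the replies above, the zero-loss problem appears tied to the installed peft version. A minimal sketch of pinning and checking it, assuming a pip-based environment (which exact version works for you, 0.6.0 or 0.7.1, follows the comments above and is not verified here):

```shell
# Pin peft to the version the repo author recommends (0.7.1 per the
# comment above); some users report peft==0.6.0 also resolves it.
pip install "peft==0.7.1"

# Confirm the installed version before re-running LoRA fine-tuning.
python -c "import peft; print(peft.__version__)"
```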
-
You are right. I noticed that the day before yesterday. OK.
-
As the title says: how can this be solved?