chore: Add hashsum for stablelm models #7018
Conversation
You have to add the models in …
Related: #7024
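For context, "adding the models" here refers to registering the new tokenizer in the update script's model list so its hash gets generated. A minimal sketch of what such an entry looks like, assuming the field names (`name`, `tokt`, `repo`) and the StableLM repo URL used by the existing entries; these are not verified against the current script:

```python
# Sketch: registering a new tokenizer model for hash generation.
# Field names and the repo URL are assumptions based on existing entries.
from enum import IntEnum, auto

class TOKENIZER_TYPE(IntEnum):
    SPM = auto()
    BPE = auto()
    WPM = auto()

models = [
    # ... existing entries elided ...
    {"name": "stablelm2", "tokt": TOKENIZER_TYPE.BPE,
     "repo": "https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b"},
]
```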
@ggerganov I can do that, but I would recommend against it. It will couple the scripts, increase complexity, and is completely unnecessary for producing the output. A supporting rationale for why it's needed would increase my confidence in this approach. I'll read it again (for the nth time), but I'm only becoming more confident in my stance.
The function …
@slaren Thanks, I'll need to review it again with fresh eyes. I must've missed it.
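For readers following along: the hashsum being discussed is, as I understand it, a digest of how a tokenizer encodes a fixed probe string, so that the conversion script can recognize the pre-tokenization scheme later. A minimal sketch of that idea, assuming `transformers` is installed and the tokenizer has been downloaded locally; the probe text, path, and function name below are placeholders, not the script's actual ones:

```python
# Minimal sketch: identify a tokenizer's pre-tokenization behaviour by hashing
# the token IDs it produces for a fixed probe string.
from hashlib import sha256
from transformers import AutoTokenizer

CHK_TEXT = "Hello World\n\t 3.14 …"  # placeholder probe text

def tokenizer_checksum(tokenizer_dir: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(tokenizer_dir)
    token_ids = tokenizer.encode(CHK_TEXT)
    # Hash the string form of the ID list so any difference in splitting shows up.
    return sha256(str(token_ids).encode()).hexdigest()

if __name__ == "__main__":
    print(tokenizer_checksum("models/tokenizers/stablelm2"))
```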
Should I add the Mistral/Mixtral BPE or SPM? I noticed that I can't convert the original models anymore with either script, but I can convert the safetensors. Also, there are so many Qwen models. 😅 It looks like there's a repo dedicated to just the tokenizer, though: https://huggingface.co/Qwen/Qwen-tokenizer/tree/main
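If the dedicated tokenizer repo is enough, only the tokenizer artifacts need to be fetched rather than a full model. A sketch using `huggingface_hub`; the file names listed are an assumption about what that repo contains and should be checked against its file listing:

```python
# Sketch: download only the tokenizer files from the dedicated Qwen tokenizer repo.
from huggingface_hub import hf_hub_download

REPO_ID = "Qwen/Qwen-tokenizer"
FILES = ["tokenizer_config.json", "qwen.tiktoken"]  # assumed file names

for fname in FILES:
    path = hf_hub_download(repo_id=REPO_ID, filename=fname,
                           local_dir="models/tokenizers/qwen")
    print("downloaded", path)
```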
I really don't like this approach. This is gonna get really bad, really fast. I see the intent here to automate the …
Don't merge this yet. I have some ideas.
Force-pushed from d9eaa44 to 858f6b7
ref #6920 (comment)