It doesn't support the new models "o1-mini" and "o1-preview" #120
Comments
Hi, thanks in advance! Here's my example code:

```typescript
const countTokens = (messages: any[], model: TiktokenModel): number => {
  // ...
};

const messages = [ /* ... */ ];
const model: TiktokenModel = "gpt-4o-mini";
```
Hello! Will keep monitoring openai#337 to see if there are any changes w.r.t. the underlying token map.
@tmlxrd Just counting role and content is not necessarily enough. You also need to include the tokens that are used to separate the messages: see dqbd/tiktokenizer
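The per-message overhead mentioned above can be sketched as follows. This is a minimal illustration of the accounting scheme OpenAI's cookbook describes for chat models (a fixed overhead per message plus a few tokens priming the reply); the whitespace-splitting `encode` below is a stand-in for a real BPE encoder such as the one js-tiktoken provides, and the overhead constants are the cookbook values for gpt-4-style models, not guaranteed for o1 models.

```typescript
type ChatMessage = { role: string; content: string; name?: string };

// Stand-in encoder: splits on whitespace. A real implementation would use
// a BPE encoding (e.g. cl100k_base or o200k_base) from a tiktoken library.
const encode = (text: string): number[] =>
  text.split(/\s+/).filter(Boolean).map((_, i) => i);

const countChatTokens = (messages: ChatMessage[]): number => {
  const tokensPerMessage = 3; // assumed fixed overhead per message (cookbook value)
  const tokensPerName = 1;    // extra token when a `name` field is present
  let total = 0;
  for (const m of messages) {
    total += tokensPerMessage;
    total += encode(m.role).length;
    total += encode(m.content).length;
    if (m.name !== undefined) total += tokensPerName;
  }
  total += 3; // every reply is primed with a few assistant tokens
  return total;
};
```

For example, with `[{ role: "user", content: "hello world" }, { role: "assistant", content: "hi" }]` this counts 3+1+2 for the first message, 3+1+1 for the second, plus 3 for reply priming. The key point for the discrepancy discussed above is the fixed per-message and per-reply overhead, not the content tokens themselves.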
Thank you for your answer! I got 1708 input tokens for the big text, but OpenAI's response reported 1717. It's a small difference, but I don't understand where it comes from, so I added two roles. UPD: Thank you for the link. It works better now, but there are still discrepancies with the counts OpenAI reports.
Do 'o1-mini' and 'o1-preview' still use the cl100k_base vocabulary?
Hi. Unfortunately, I don't know. Please share the answer if you find that information.
Got clarification with the latest |
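As a stopgap while a tiktoken build doesn't recognize a model name, one option is a prefix-based fallback map. The sketch below mirrors the upstream tiktoken model map, which assigns o1 models the o200k_base encoding (the same one gpt-4o uses); verify the entries against the tiktoken version you actually ship, since this mapping is an assumption about that library's data, not part of this repo.

```typescript
// Prefix-to-encoding fallback. Order matters: more specific prefixes
// ("gpt-4o") must come before less specific ones ("gpt-4").
const PREFIX_TO_ENCODING: Array<[string, string]> = [
  ["o1-", "o200k_base"],
  ["gpt-4o", "o200k_base"],
  ["gpt-4", "cl100k_base"],
  ["gpt-3.5-turbo", "cl100k_base"],
];

const encodingForModelName = (model: string): string => {
  for (const [prefix, encoding] of PREFIX_TO_ENCODING) {
    if (model.startsWith(prefix)) return encoding;
  }
  throw new Error(`Unknown model: ${model}`);
};
```

The returned encoding name can then be passed to your tiktoken library's get-encoding function to obtain an encoder for the unrecognized model.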
Hi openai devs,
how can I count tokens for o1-preview and o1-mini?
Thanks in advance!