Replies: 1 comment
-
@L4zyy thanks for spotting this bug. It seems to have been added accidentally by a recent PR; I will fix it in this PR: #399.
-
At https://github.com/camel-ai/camel/blob/master/camel/types/enums.py#L41
The model_type used to get the tiktoken encoding is fixed to "gpt-3.5-turbo" unless we are using the stub model. I wonder if this is done on purpose? Is it because the result is the same for these model types (which is <Encoding 'cl100k_base'>)?