
Issue: 400 Unsupported parameter: 'max_tokens' using model 'o1-preview' #1003

Open
Prem95 opened this issue Sep 13, 2024 · 2 comments
Prem95 commented Sep 13, 2024

Issue you'd like to raise.

I changed the model in the Playground to o1-preview (I have Tier 5 access) but received this error instead:

Error: 400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
    at APIError.generate (file:///node_modules/openai/error.mjs:41:20)
    at OpenAI.makeStatusError (file:///node_modules/openai/core.mjs:268:25)
    at OpenAI.makeRequest (file:///node_modules/openai/core.mjs:311:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async file:///node_modules/@langchain/openai/dist/chat_models.js:881:29
    at async RetryOperation._fn (/node_modules/p-retry/index.js:50:12)

Suggestion:

Add support for OpenAI's new o1-preview model, which requires 'max_completion_tokens' in place of 'max_tokens'.
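
For reference, a minimal sketch of what the API expects for o1-preview, calling the openai Node SDK directly (the prompt and token limit are placeholders, and a recent SDK version with the new parameter in its types is assumed):

    // Minimal sketch: assumes the official `openai` Node SDK and an
    // OPENAI_API_KEY in the environment.
    // o1-preview rejects `max_tokens`; the request has to send
    // `max_completion_tokens` instead, which is what the integration
    // would need to do under the hood.
    import OpenAI from "openai";

    const client = new OpenAI();

    async function main() {
      const response = await client.chat.completions.create({
        model: "o1-preview",
        messages: [{ role: "user", content: "Say hello." }], // placeholder prompt
        // Sending `max_tokens: 256` here reproduces the 400 above.
        max_completion_tokens: 256,
      });
      console.log(response.choices[0].message.content);
    }

    main();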

@dennisofficial

For me, I'm just not getting any traces from o1-preview.

@hinthornw
Collaborator

@dennisofficial do you have code to share for that?
