[Request] Allow custom openai endpoint #1424
Due to changes in PrivateGPT, OpenAI replacements no longer work because we cannot define custom OpenAI endpoints.

LocalAI is an example: https://github.com/mudler/LocalAI/tree/master/examples/privateGPT

It is a drop-in replacement for OpenAI's ChatGPT with compatible REST APIs. Being able to define custom OpenAI endpoints in the settings files would enable us to use LocalAI.
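For illustration only, here is a minimal sketch of what a custom endpoint enables, using the official `openai` Python client (v1+) pointed at a LocalAI-style server. The base URL, API key, and model name are placeholders rather than values from this issue; PrivateGPT itself would still need a setting exposing the base URL, which is what this request asks for.

```python
# Sketch: the official openai client (v1+) talking to a local OpenAI-compatible
# server such as LocalAI. Only the base URL changes; the rest of the API is the same.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # placeholder: your LocalAI endpoint
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder: whatever model name the server exposes
    messages=[{"role": "user", "content": "Hello from a custom endpoint"}],
)
print(response.choices[0].message.content)
```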
Comments
Any updates?
This is pretty easy to implement with OpenAILike. I tested it briefly, first with LocalAI and then with vLLM. I also need this feature, so I'll get around to opening a PR if no one else picks it up.
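As a rough sketch of the OpenAILike approach mentioned above (not the eventual PrivateGPT implementation): llama-index's `OpenAILike` wrapper accepts a custom `api_base`, so it can talk to LocalAI or vLLM. The import path varies between llama-index releases, and the endpoint URL and model name below are placeholder assumptions.

```python
# Sketch: llama-index's OpenAILike pointed at a local OpenAI-compatible server
# (LocalAI, vLLM, etc.). Values below are placeholders, not PrivateGPT defaults.
from llama_index.llms.openai_like import OpenAILike  # path differs in older llama-index releases

llm = OpenAILike(
    api_base="http://localhost:8000/v1",  # placeholder: your local endpoint
    api_key="not-needed",                 # usually ignored by local servers
    model="mistral-7b-instruct",          # placeholder: model served locally
    is_chat_model=True,                   # route requests through the chat completions API
)

print(llm.complete("Say hello."))
```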
mchill added a commit to mchill/privateGPT that referenced this issue on Dec 22, 2023:
This mode behaves the same as the openai mode, except that it allows setting custom models not supported by OpenAI. It can be used with any tool that serves models from an OpenAI compatible API. Implements zylon-ai#1424
imartinez pushed a commit that referenced this issue on Dec 26, 2023:
This mode behaves the same as the openai mode, except that it allows setting custom models not supported by OpenAI. It can be used with any tool that serves models from an OpenAI compatible API. Implements #1424
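To make the commit description concrete, here is a purely hypothetical sketch (not PrivateGPT's actual code) of how an "openailike" mode could construct the LLM from settings: it reuses the OpenAI-compatible client but takes the base URL and an arbitrary model name from configuration. The settings keys and helper name are assumptions for illustration.

```python
# Hypothetical sketch: selecting an "openailike" mode from a settings dict.
# The settings layout and build_llm helper are illustrative, not PrivateGPT's API.
from llama_index.llms.openai_like import OpenAILike

def build_llm(settings: dict) -> OpenAILike:
    mode = settings["llm"]["mode"]
    if mode == "openailike":
        cfg = settings["openai"]  # hypothetical settings section
        return OpenAILike(
            api_base=cfg["api_base"],                 # any OpenAI-compatible endpoint
            api_key=cfg.get("api_key", "not-needed"),
            model=cfg["model"],                       # model not limited to OpenAI's catalog
            is_chat_model=True,
        )
    raise ValueError(f"unsupported llm mode: {mode}")
```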
Stale issue
simonbermudez pushed a commit to simonbermudez/saimon that referenced this issue on Feb 24, 2024:
This mode behaves the same as the openai mode, except that it allows setting custom models not supported by OpenAI. It can be used with any tool that serves models from an OpenAI compatible API. Implements zylon-ai#1424