Allow alternative OpenAI-compatible endpoints #494
Conversation
[AI-32] Add option for custom OpenAI-compatible endpoint
Great work! I have a few questions:
Hey @Niek! Thank you for your feedback, appreciate it. To answer your questions:
So, I'll implement that call to `/models`.
[AI-32] Check /models endpoint before saving API URL
@Niek I have added the check for the `/models` endpoint. Looking forward to your feedback regarding CORS.
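For illustration, the `/models` probe discussed here could look roughly like this. The function names, and the assumption that the stored base URL excludes the `/v1` prefix, are sketches, not code from the actual PR:

```typescript
// Normalize a user-supplied base URL and derive the /models endpoint from it.
// Assumes the stored base URL does NOT already include the /v1 prefix.
function buildModelsUrl(baseUrl: string): string {
  const trimmed = baseUrl.replace(/\/+$/, ''); // drop trailing slashes
  return `${trimmed}/v1/models`;
}

// Probe the endpoint before persisting it. In the browser, a CORS rejection
// surfaces as a thrown fetch error, which is why CORS was a concern here.
async function isOpenAICompatible(baseUrl: string, apiKey: string): Promise<boolean> {
  try {
    const res = await fetch(buildModelsUrl(baseUrl), {
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    if (!res.ok) return false;
    const body = await res.json();
    // OpenAI's /models response wraps the model list in a `data` array.
    return Array.isArray(body.data);
  } catch {
    return false; // network error or CORS rejection
  }
}
```

The UI would only save the custom URL when the probe succeeds, so a typo or an incompatible server is caught before any chat request is sent.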
Thanks a lot, merged now @betschki! I tried to get it working with Anthropic, but no luck. The changes needed are:
So for now, this is only available for 100% OpenAI-compatible endpoints like llama.cpp.
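For context, a sketch of the kind of request-level differences involved. The header and path values follow Anthropic's public API documentation; the helper names are hypothetical and not part of this PR:

```typescript
// Anthropic's API is not a drop-in replacement for OpenAI's:
// different auth header, a required versioning header, a different
// endpoint path, and a different response shape.

type Provider = 'openai' | 'anthropic';

function chatHeaders(provider: Provider, apiKey: string): Record<string, string> {
  if (provider === 'anthropic') {
    return {
      'x-api-key': apiKey,               // Anthropic does not use a Bearer token
      'anthropic-version': '2023-06-01', // required versioning header
      'content-type': 'application/json',
    };
  }
  return {
    Authorization: `Bearer ${apiKey}`,
    'content-type': 'application/json',
  };
}

function chatPath(provider: Provider): string {
  // Anthropic's Messages API also returns a different response shape,
  // so swapping the path and headers alone is not enough.
  return provider === 'anthropic' ? '/v1/messages' : '/v1/chat/completions';
}
```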
@all-contributors please add @betschki for code, ideas
Sounds like an important enough change to implement though, because Anthropic is better in many use cases than OpenAI, for example for tech & coding.
Agreed, it would be nice if the code could be restructured a bit so all API requests are done through one central place.
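One possible shape for that restructuring, with all names illustrative: route every call through a single helper so a new provider only has to override the base URL and headers.

```typescript
// A minimal central request layer; each provider supplies its own config.
interface ApiConfig {
  baseUrl: string;
  headers: Record<string, string>;
}

// Pure helper: build the request so it can be unit-tested without a network.
function buildRequest(
  cfg: ApiConfig,
  path: string,
  body?: unknown
): { url: string; init: { method: string; headers: Record<string, string>; body?: string } } {
  return {
    url: `${cfg.baseUrl}${path}`,
    init: {
      method: body === undefined ? 'GET' : 'POST',
      headers: cfg.headers,
      body: body === undefined ? undefined : JSON.stringify(body),
    },
  };
}

// All API traffic funnels through this one function.
async function apiRequest<T>(cfg: ApiConfig, path: string, body?: unknown): Promise<T> {
  const { url, init } = buildRequest(cfg, path, body);
  const res = await fetch(url, init);
  if (!res.ok) throw new Error(`API request failed with status ${res.status}`);
  return res.json() as Promise<T>;
}
```

With this shape, adding a provider like Anthropic becomes a matter of supplying a different `ApiConfig` plus a response adapter, instead of touching every call site.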
Context
This PR introduces functionality that allows users to specify a custom OpenAI-compatible API endpoint in addition to the default OpenAI API. The main goal is to enable the use of alternative endpoints such as self-hosted or third-party services. At stadt.werk, we use this feature to interact with our own self-hosted OpenAI-compatible server.
The enhancement extends the flexibility of the existing interface by letting users switch between OpenAI’s official endpoint (default behaviour) and their own custom endpoint through the UI. Additionally, the PR refines model fetching, which is essential for any non-OpenAI endpoints.
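As an illustration of the endpoint switch described above (the setting and constant names are assumptions, not the PR's actual code):

```typescript
const DEFAULT_API_BASE = 'https://api.openai.com';

interface EndpointSettings {
  useCustomEndpoint: boolean;
  customEndpoint: string; // e.g. a self-hosted OpenAI-compatible server
}

// Resolve the effective API base: the user's custom endpoint when enabled
// and non-empty, otherwise OpenAI's official endpoint (default behaviour).
function resolveApiBase(s: EndpointSettings): string {
  if (s.useCustomEndpoint && s.customEndpoint.trim() !== '') {
    return s.customEndpoint.replace(/\/+$/, ''); // normalize trailing slashes
  }
  return DEFAULT_API_BASE;
}
```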
Compatibility
We'd love to see this enhancement merged into chatgpt-web, since we believe this can be a great feature that has also been mentioned a few times (here and here, for example). Looking forward to feedback.