Server support stream #346
Conversation
(Two resolved review threads on server/src/main/kotlin/com/xebia/functional/xef/server/http/routes/Routes.kt, one of them on outdated code.)
We could test this change live by creating our own `xef-client`, which uses the OpenAI models we already have in xef but points at our server URL instead. Sending a streaming request like the SpaceCraft example, which streams the response, would show whether this integration is wired the same way as OpenAI's. Let's discuss it tomorrow to see how we can approach it and whether we can already write some tests for it.
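Until that `xef-client` exists, a raw Ktor client could exercise the streamed endpoint by hand. This is only a minimal sketch, assuming the server listens on `localhost:8081` and that the endpoint mirrors OpenAI's `chat/completions` request shape; the URL, port, model name, and token are all placeholders:

```kotlin
import io.ktor.client.*
import io.ktor.client.request.*
import io.ktor.client.statement.*
import io.ktor.http.*
import io.ktor.utils.io.*
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val client = HttpClient()
    // Placeholder URL: point this at wherever the xef server is running.
    client.preparePost("http://localhost:8081/chat/completions") {
        contentType(ContentType.Application.Json)
        header(HttpHeaders.Authorization, "Bearer <your-openai-token>")
        // stream = true is the option this PR wires through to OpenAI.
        setBody(
            """{"model":"gpt-3.5-turbo","stream":true,
               "messages":[{"role":"user","content":"Describe a spacecraft"}]}"""
        )
    }.execute { response ->
        // Read the body incrementally; with SSE-style streaming each
        // chunk arrives as a "data: ..." line.
        val channel = response.bodyAsChannel()
        while (!channel.isClosedForRead) {
            val line = channel.readUTF8Line() ?: break
            if (line.startsWith("data:")) println(line)
        }
    }
    client.close()
}
```

If the chunks printed here match what OpenAI returns for the same request, the server-side wiring behaves like the direct integration.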
Just a minor suggestion. Please let me know what you think of it.
(Three more resolved review threads on server/src/main/kotlin/com/xebia/functional/xef/server/http/routes/Routes.kt.)
@raulraja the latest changes added to this PR make it possible to change the host in order to use the |
Thank you @Montagon @javipacheco!
This PR modifies the `chat/completions` endpoint to support the stream option. The endpoint now uses a Ktor client to forward the request directly to OpenAI and returns the output in exactly the format it receives.
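For reference, the proxying idea is roughly the following. This is a minimal sketch, not the actual diff from this PR; the route path, the `chatCompletions` function name, and the header forwarding are assumptions:

```kotlin
import io.ktor.client.*
import io.ktor.client.request.*
import io.ktor.client.statement.*
import io.ktor.http.*
import io.ktor.server.application.*
import io.ktor.server.request.*
import io.ktor.server.response.*
import io.ktor.server.routing.*
import io.ktor.utils.io.*

fun Route.chatCompletions(client: HttpClient) {
    post("/chat/completions") {
        // Take the incoming request body and auth header as-is.
        val body = call.receiveText()
        val auth = call.request.headers[HttpHeaders.Authorization]
        // Forward the request to OpenAI with a Ktor client and pipe the
        // response bytes straight back to the caller, unmodified.
        client.preparePost("https://api.openai.com/v1/chat/completions") {
            if (auth != null) header(HttpHeaders.Authorization, auth)
            contentType(ContentType.Application.Json)
            setBody(body)
        }.execute { upstream ->
            call.respondBytesWriter(
                contentType = upstream.contentType(),
                status = upstream.status
            ) {
                upstream.bodyAsChannel().copyTo(this)
            }
        }
    }
}
```

Copying the upstream byte channel directly into the response writer is what lets both streamed (`"stream": true`) and non-streamed requests come back in the same format OpenAI produced them.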