Support for streaming messages #52
Comments
This might already be possible in some way by a plugin, by calling the method shown in jupyter-chat/packages/jupyter-chat/src/model.ts, lines 76 to 85 (at 5cf3955).
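As a rough sketch of that idea, a plugin could approximate streaming today by re-sending the same message with a growing body, so the chat updates it on every chunk. The `IChatMessage` shape and the `messageAdded` name below are assumptions for illustration only; the real entry point is the method in model.ts referenced above.

```ts
// Hypothetical sketch only: the IChatMessage shape and the messageAdded
// signature are assumptions, not the actual jupyter-chat API.
interface IChatMessage {
  id: string;
  sender: string;
  body: string;
  time: number;
}

interface IChatModelLike {
  // Assumed to insert the message, or replace it when the id already exists.
  messageAdded(message: IChatMessage): void;
}

async function streamReply(
  model: IChatModelLike,
  chunks: AsyncIterable<string>
): Promise<void> {
  const id = crypto.randomUUID();
  let body = '';
  for await (const chunk of chunks) {
    body += chunk;
    // Each call replaces the previous version of the message, which
    // re-renders its whole content in the chat.
    model.messageAdded({ id, sender: 'ai', body, time: Date.now() / 1000 });
  }
}
```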
Yes, this is possible, but each update will go through the full message update path again. To avoid this, we could find a way to tell the message component to append the new content to the DOM.
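As a hedged sketch of that approach (the component name and the `chunkStream` prop are hypothetical; the real jupyter-chat message component is more involved), the idea is to append only the new text to the already-rendered body:

```tsx
import * as React from 'react';

// Hypothetical component: streamed chunks are appended as text nodes,
// leaving the already-rendered part of the message untouched.
export function StreamingMessageBody(props: {
  initialBody: string;
  chunkStream: AsyncIterable<string>;
}): JSX.Element {
  const ref = React.useRef<HTMLDivElement>(null);

  React.useEffect(() => {
    let cancelled = false;
    void (async () => {
      for await (const chunk of props.chunkStream) {
        if (cancelled || !ref.current) {
          return;
        }
        // Append only the new text instead of re-rendering the whole body.
        ref.current.appendChild(document.createTextNode(chunk));
      }
    })();
    return () => {
      cancelled = true;
    };
  }, [props.chunkStream]);

  return <div ref={ref}>{props.initialBody}</div>;
}
```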
Linking to jupyterlab/jupyter-ai#228 as related.
Problem
Currently messages are added to the chat in one go. See this screencast as an example:
jupyterlab-codestral-demo-lite.webm
However, most AI chat UIs stream responses to the user, for example:
mistral-ai-chat.webm
Proposed Solution
Add optional support for streaming messages, which could be enabled via the settings editor or by extensions.
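A minimal sketch of the settings side, assuming a hypothetical `streamMessages` boolean exposed by the extension's settings schema (only the `ISettingRegistry` API itself is existing JupyterLab API; the plugin id and setting key are made up):

```ts
import { ISettingRegistry } from '@jupyterlab/settingregistry';

// Hypothetical plugin id and setting key, for illustration only.
const PLUGIN_ID = 'jupyter-chat-extension:plugin';

export async function isStreamingEnabled(
  registry: ISettingRegistry
): Promise<boolean> {
  const settings = await registry.load(PLUGIN_ID);
  // The settings editor would expose this key as a simple checkbox.
  const value = settings.get('streamMessages').composite as boolean | undefined;
  return value ?? false;
}
```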
Additional context
This would be useful for AI extensions using jupyter-chat for interacting with some online AI providers, such as https://github.com/jtpio/jupyterlab-codestral