Add chat memory #30
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this issue will be closed.
This issue was closed because it has been stalled for 7 days with no activity.
Currently only the last chat message containing the user question is fed to the chat model, even though the complete chat history is sent to the server. We should use LangChain's Memory component to include the previous messages in the model invocation, with a limit on how many messages are kept.
Tasks
- Add BufferMemory with the last 5 messages to the existing chain in the /chat endpoint
- Extract the 5-message window value into constants.ts
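The tasks above can be sketched with a minimal, self-contained TypeScript illustration of the intended windowing behavior (keeping only the last 5 messages, as LangChain's BufferWindowMemory with `k: 5` would). The names `MESSAGE_WINDOW_SIZE`, `ChatMessage`, and `windowMessages` are hypothetical, not taken from the repository:

```typescript
// MESSAGE_WINDOW_SIZE is the value the task proposes extracting into constants.ts.
const MESSAGE_WINDOW_SIZE = 5;

// Minimal chat message shape for illustration; the real endpoint's type may differ.
interface ChatMessage {
  role: "user" | "assistant";
  content: string;
}

// Keep only the most recent `windowSize` messages for the model invocation,
// mirroring what a windowed memory component would hand to the chain.
function windowMessages(
  history: ChatMessage[],
  windowSize: number = MESSAGE_WINDOW_SIZE
): ChatMessage[] {
  return history.slice(-windowSize);
}

// Example: an 8-message history is trimmed to the last 5 messages.
const history: ChatMessage[] = Array.from({ length: 8 }, (_, i) => ({
  role: i % 2 === 0 ? "user" : "assistant",
  content: `message ${i + 1}`,
}));

const windowed = windowMessages(history);
console.log(windowed.length); // 5
console.log(windowed[0].content); // "message 4"
```

In the actual chain, the same effect would come from configuring the memory component with the window size rather than trimming manually; the sketch only shows the trimming semantics.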
Superseded by #106 that implements full chat history
The text was updated successfully, but these errors were encountered: