Multiple knowledge themes #366
Comments
I don't think it's currently possible to use dataset X, Y, or Z based on which user is talking. As for training, you can write your own trainers and implement your own logic for removing, updating, and so on; a sketch of that approach follows.
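As a rough illustration of that custom-trainer route, here is a minimal sketch. It assumes the newer ChatterBot `Trainer` and storage-adapter interface; the exact `create`/`filter`/`remove` signatures and `tags` support vary between ChatterBot versions, so treat the details as assumptions rather than a definitive implementation.

```python
# A sketch only: assumes a ChatterBot 1.x-style Trainer/storage interface.
# The create()/filter()/remove() signatures and tag support differ between
# ChatterBot versions, so check them against your installed release.
from chatterbot import ChatBot
from chatterbot.trainers import Trainer


class TaggedListTrainer(Trainer):
    """Train from a list of statements, tagging each with a source label."""

    def train(self, conversation, source_tag):
        previous_text = None
        for text in conversation:
            # Store the statement together with a tag naming its source.
            self.chatbot.storage.create(
                text=text,
                in_response_to=previous_text,
                tags=[source_tag],
            )
            previous_text = text


def forget_source(chatbot, source_tag):
    """Remove every statement learned from the given source."""
    for statement in chatbot.storage.filter(tags=[source_tag]):
        chatbot.storage.remove(statement.text)


bot = ChatBot('Example')
trainer = TaggedListTrainer(bot)
trainer.train(['Hi there.', 'Hello!'], source_tag='book-n')
forget_source(bot, 'book-n')  # forget book N without touching the rest
```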
Adding information is currently supported through the training process, however not to the extent that you describe. I'm planning to (very soon) introduce a […]

For having the bot maintain different conversations, this functionality will be added as a part of #313. My plan is to add the ability for a chat bot to track different chat […]
Thanks, gunthercox. Yes, I believe that Conversation object should be just what I need.
I'm going to close this ticket off, as the functionality mentioned is being tracked in #513.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
From what I understand, all of the chatbot's conversations are currently added to a single database.
But what about 'injecting' and 'removing' certain knowledge?
For example, a chatbot is trained from live conversations with users, but at some point I want to inject dialogues from a certain book into it, to allow it to use answers from that book along with those learned from actual people.
But then at some point I may want to make it forget only that book's dialogues, not the whole database.
Or maybe I want it to talk to one user using only answers from book N, and to another user using answers from book X, while also using conversation training Y for both of them.
Is this currently possible? If not, it may be an interesting feature to think about.
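As a rough sketch of the idea only, here is a minimal, self-contained Python illustration; the `ThemedKnowledgeBase` class and all of its methods are hypothetical, not ChatterBot's API. Every learned pair is tagged with its source, so a source can be injected, forgotten, or enabled per user.

```python
from collections import defaultdict

# Hypothetical sketch of the requested feature -- not ChatterBot's API.
# Each learned response is tagged with the source ("theme") it came from,
# so whole themes can be injected, forgotten, or enabled per user.


class ThemedKnowledgeBase:
    def __init__(self):
        # Maps a prompt to a list of (response, source) pairs.
        self.responses = defaultdict(list)

    def inject(self, pairs, source):
        """Add (prompt, response) pairs learned from one source."""
        for prompt, response in pairs:
            self.responses[prompt].append((response, source))

    def forget(self, source):
        """Drop only the knowledge that came from the given source."""
        for prompt in list(self.responses):
            self.responses[prompt] = [
                (r, s) for r, s in self.responses[prompt] if s != source
            ]

    def answer(self, prompt, allowed_sources):
        """Answer using only the sources enabled for this user."""
        for response, source in self.responses.get(prompt, []):
            if source in allowed_sources:
                return response
        return None


kb = ThemedKnowledgeBase()
kb.inject([('hello', 'Hi, friend!')], source='live-chat')      # training Y
kb.inject([('hello', 'Well met, traveller.')], source='book-n')
kb.inject([('hello', 'Greetings, comrade.')], source='book-x')

# One user is scoped to book N plus the live training, another to book X.
print(kb.answer('hello', {'book-n', 'live-chat'}))  # -> 'Hi, friend!'
print(kb.answer('hello', {'book-x'}))               # -> 'Greetings, comrade.'
kb.forget('book-n')                                 # forget only book N
```

In ChatterBot terms, the closest analogue would presumably be statement tags plus a tag-based filter during response selection, but that is speculation about a feature the project had not built at the time of this thread.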