
[Feature Request] Change to query the AI service directly and return a response instead of using ai_talk_bot_process_request #18

Open
ynott opened this issue Jan 29, 2025 · 0 comments
Labels
enhancement New feature or request

Comments


ynott commented Jan 29, 2025

How to use GitHub

  • Please use the 👍 reaction to show that you are interested in the same feature.
  • Please don't comment if you have no relevant information to add. It's just extra noise for everyone subscribed to this issue.
  • Subscribe to receive notifications on status change and new comments.

Feature request

Which Nextcloud Version are you currently using:

  • Nextcloud 30.0.4
  • OpenAI and LocalAI integration 3.4.0
  • Talk 20.1.3
  • Assistant Talk Bot 3.0.1

The Assistant's batch process is slow to reply.

Even though the current Assistant Talk Bot runs as a daemon in a Docker container, the container only registers a batch-processing task of type ai_talk_bot_process_request.
The task is stored in Nextcloud's oc_taskprocessing_tasks table, and no reply is sent until cron executes it.
Only then is the message sent to the LLM model by Nextcloud's batch process and the reply posted to /message?reply_to=XXX&token=TOKEN.

(Image: current flow through the batch-processing queue)

Proposal: faster replies by responding directly, without going through the batch process.

If the Docker container (ghcr.io/nextcloud/talk_bot_ai) were modified to call the AI service directly and reply to the message itself, there would be no need to wait for the cron task to run.

(Image: proposed direct-reply flow)

I believe there is room for improvement in this AI Talk Bot.
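For illustration, a direct-reply flow could look roughly like the sketch below. It assumes an OpenAI-compatible chat endpoint (the AI_URL and AI_MODEL environment variables are hypothetical) and signs the reply with the HMAC-SHA256 scheme the Talk bot API uses (hex digest over the random value plus the message text, keyed with the bot's shared secret). Endpoint paths, variable names, and helper functions here are assumptions for the sketch, not the bot's actual code.

```python
import hashlib
import hmac
import json
import os
import secrets
from urllib import request as urlrequest


def sign_talk_bot_request(shared_secret: str, random_value: str, message: str) -> str:
    """HMAC-SHA256 signature in the form the Talk bot API expects:
    hex digest over random value + message, keyed with the shared secret."""
    mac = hmac.new(shared_secret.encode(), random_value.encode(), hashlib.sha256)
    mac.update(message.encode())
    return mac.hexdigest()


def _post_json(url: str, payload: dict, headers: dict) -> dict:
    """Minimal JSON POST helper (stdlib only)."""
    data = json.dumps(payload).encode()
    req = urlrequest.Request(
        url, data=data, headers={"Content-Type": "application/json", **headers}
    )
    with urlrequest.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read().decode())


def reply_directly(nc_url: str, token: str, reply_to: int, prompt: str) -> None:
    # 1. Query the AI service directly instead of enqueuing an
    #    ai_talk_bot_process_request task for cron to pick up later.
    #    AI_URL is a hypothetical OpenAI-compatible endpoint (e.g. LocalAI).
    ai = _post_json(
        os.environ["AI_URL"] + "/v1/chat/completions",
        {"model": os.environ.get("AI_MODEL", "gpt-3.5-turbo"),
         "messages": [{"role": "user", "content": prompt}]},
        {},
    )
    answer = ai["choices"][0]["message"]["content"]

    # 2. Post the answer back to the conversation immediately via the
    #    Talk bot API, signing the request with the bot's shared secret.
    rand = secrets.token_hex(32)
    _post_json(
        f"{nc_url}/ocs/v2.php/apps/spreed/api/v1/bot/{token}/message",
        {"message": answer, "replyTo": reply_to},
        {"OCS-APIRequest": "true",
         "X-Nextcloud-Talk-Bot-Random": rand,
         "X-Nextcloud-Talk-Bot-Signature": sign_talk_bot_request(
             os.environ["BOT_SECRET"], rand, answer)},
    )
```

With such a flow, the reply latency would be bounded by the AI service's response time rather than by the cron interval of the TaskProcessing queue.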
