Keep running llamacpp in background #10096
Unanswered
Fulgurance asked this question in Q&A
Replies: 1 comment 1 reply
-
Fulgurance:
Hi everyone, I have now run a few tests with llama-cli. I would like to know whether it is possible to keep the chat mode running in the background of the system and, when needed, send a request to the already-running model instead of restarting the model every time, since starting the model takes some time.
-
You can run it in conversation mode, for example:
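A minimal sketch of a conversation-mode invocation; the model path below is a placeholder, not from the thread:

```bash
# Load the model once and keep chatting with it interactively;
# -cnv enables llama-cli's conversation mode, so the model stays
# loaded between turns instead of being restarted per prompt.
# models/model.gguf is a placeholder path for a local GGUF model.
llama-cli -m models/model.gguf -cnv
```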
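For truly keeping the model resident in the background and sending it requests from other programs, one approach not mentioned in the extract is llama-server, which loads the model once and then serves HTTP requests against it. A sketch, assuming the same placeholder model path and the default port 8080:

```bash
# Start the server once in the background; the model stays loaded in memory.
# The model path and port are placeholders/assumptions.
llama-server -m models/model.gguf --port 8080 &

# Later, send a request to the running model without reloading it.
curl http://localhost:8080/completion \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello, how are you?", "n_predict": 64}'
```

llama-server also exposes an OpenAI-compatible /v1/chat/completions endpoint, which may be more convenient if an existing client library is already in use.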