Add nextjs ollama llm UI frontend for Ollama #313146
Conversation
Could you make sure the CI is passing?
Force-pushed from 570b290 to 3845bee.
Ah thanks. Some whitespace issues are fixed now.
Could you pass it through the new formatter?
@ofborg eval
Force-pushed from 3845bee to 4f9e80e.
@drupol Done. I didn't know there was a new formatting style other than plain nixpkgs-fmt.
LGTM
How about adding an entry in the release notes?
Force-pushed from 4f9e80e to 6002b9f.
@@ -125,7 +125,9 @@ Use `services.pipewire.extraConfig` or `services.pipewire.configPackages` for Pi…

 - [rspamd-trainer](https://gitlab.com/onlime/rspamd-trainer), script triggered by a helper which reads mails from a specific mail inbox and feeds them into rspamd for spam/ham training.

-- [ollama](https://ollama.ai), server for running large language models locally.
+- [ollama](https://ollama.ai), backend server for running large language models locally.
This line should not be modified.
Reverted
Good idea. I added a release note for the 24.05 release.
Force-pushed from 6002b9f to 68a7ade.
Force-pushed from 68a7ade to ff6d95a.
Force-pushed from 75671e9 to 66dc059.
Resolved conflicts. Do you have merge rights, or would I need to search for someone?
Can you add a simple NixOS test that checks that the service is launched and the relevant port is open?
There are plenty of examples in the nixos directory.
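For reference, a minimal test along those lines might look as follows. This is only a sketch: the option path `services.nextjs-ollama-llm-ui`, the unit name, and the default port 3000 discussed further down are assumptions taken from this PR.

```nix
# Sketch of a NixOS VM test: boot one machine with the module enabled,
# wait for the unit, and check that the port answers an HTTP request.
import ./make-test-python.nix ({ pkgs, ... }: {
  name = "nextjs-ollama-llm-ui";

  nodes.machine = { pkgs, ... }: {
    services.nextjs-ollama-llm-ui.enable = true;
    environment.systemPackages = [ pkgs.curl ];
  };

  testScript = ''
    machine.wait_for_unit("nextjs-ollama-llm-ui.service")
    machine.wait_for_open_port(3000)
    machine.succeed("curl --fail http://127.0.0.1:3000")
  '';
})
```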
Force-pushed from 66dc059 to 0ae79d6.
Done. Although I can't see what's wrong with that failing manual job. Edit: I think removing a lib.mdDoc call should do the trick.
Force-pushed from 0ae79d6 to 9981bc9.
Force-pushed from 9981bc9 to 226d552.
NixOS already has good support for the Ollama backend service. Now we can benefit from having a convenient web frontend for it as well.
Force-pushed from 226d552 to 8a05b4f.
I have to thank you! In a few days I got to see and learn a lot about nixpkgs that I had been planning to look at for a long time.
default = 3000;
example = 3000;
By default this conflicts with a lot of services:
https://github.com/search?q=repo%3ANixOS%2Fnixpkgs+default+%3D+3000%3B&type=code
For example, in an onion service setup where several services all default to port 3000, you end up doing manual port management. It would be better if Nix handled this, for instance by assigning a unique port per service, to make the setup more functional.
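As things stand, avoiding the clash means picking ports by hand in the user's configuration. A sketch of that workaround, assuming the `port` option shown above:

```nix
{
  services.nextjs-ollama-llm-ui = {
    enable = true;
    # Move the frontend off the contested default so it can coexist with
    # another service that also defaults to port 3000 (Grafana, for example).
    port = 3001;
  };
}
```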
We would need something like systemd's "DynamicUser" feature that just picks a new, unique one, but for ports. For NixOS modules I don't care about the port if my service sits behind an nginx reverse proxy. Do you know about such a feature?
We would need something like systemd's "DynamicUser" feature that just picks a new, unique one, but for ports. [...] Do you know about such a feature? -- @malteneuss (#313146 (comment))
Systemd doesn't seem to have this feature. Should we file a feature request for it?
For NixOS modules I don't care about the port if my service sits behind an nginx reverse proxy. -- @malteneuss (#313146 (comment))
It's a minor annoyance in both scenarios 🤔
True. Although now that I think about it, I don't believe it would work if systemd picked a random port, because to configure e.g. nginx as a reverse proxy you need to determine the port to forward to at Nix build time (so beforehand). So I would keep the default port the same as the upstream project for now (for nextjs-ollama-llm-ui that's port 3000).
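A fixed default also keeps the reverse-proxy wiring declarative. A sketch of that setup, where the virtual host name and the loopback bind address are placeholders/assumptions:

```nix
{ ... }:
{
  # Frontend on its upstream default port 3000.
  services.nextjs-ollama-llm-ui.enable = true;

  services.nginx = {
    enable = true;
    virtualHosts."chat.example.org" = {
      # The upstream port must be known at evaluation time, hence a fixed default.
      locations."/".proxyPass = "http://127.0.0.1:3000";
    };
  };
}
```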
There was some work being done in NixOS/rfcs#151, but it got closed due to a lack of interest.
Thanks for the info. Unfortunately, I don't have enough time to move this forward.
Just wanted to mention that you can use a dev version with a convenient ollama response streaming fix here: #315184 (until the next release version arrives).
let
  cfg = config.services.nextjs-ollama-llm-ui;
  # We have to override the URL to an Ollama service here, because it gets baked into the web app.
  nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = "https://ollama.lambdablob.com"; };
Shouldn't this be using the user's chosen `ollamaUrl`? I can't get it working unless I override the package manually.
Yep, this slipped through. It will be fixed with the next release, #315184.
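For context, the fix amounts to threading the user's setting through the override instead of the hard-coded URL. Roughly, as a sketch that assumes the module exposes the option as `cfg.ollamaUrl`:

```nix
{ config, ... }:
let
  cfg = config.services.nextjs-ollama-llm-ui;
  # Bake the user's configured Ollama URL into the web app instead of a fixed one.
  nextjs-ollama-llm-ui = cfg.package.override { ollamaUrl = cfg.ollamaUrl; };
in
{
  # ...rest of the module unchanged, serving `nextjs-ollama-llm-ui`.
}
```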
Description of changes
NixOS already has good support for the Ollama backend service. Now we can benefit from having a convenient web frontend for it as well. This is a much simpler (stateless) service with fewer dependencies than e.g. https://github.com/open-webui/open-webui, as requested in #309567.
Things done
Add https://github.com/jakobhoeg/nextjs-ollama-llm-ui as a Nix package and a corresponding NixOS module.
- Is sandboxing enabled in nix.conf? (See Nix manual): sandbox = relaxed or sandbox = true
- Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage.
- Tested basic functionality of all binary files (usually in ./result/bin/).

You can try this module out by adding the following parts to your existing flake.nix file:
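(The author's exact snippet is not reproduced in this excerpt; the sketch below assumes the module's `port` option, the existing `services.ollama` module, and a host name of your choosing.)

```nix
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { nixpkgs, ... }: {
    nixosConfigurations.myhost = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        ./configuration.nix
        {
          # Ollama backend; listens on 127.0.0.1:11434 by default.
          services.ollama.enable = true;

          # Web frontend from this PR, served on the port visited below.
          services.nextjs-ollama-llm-ui = {
            enable = true;
            port = 11435;
          };
        }
      ];
    };
  };
}
```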
and visiting "127.0.0.1:11435" in your browser.