Add Ollama sysext #85
Conversation
Awesome idea!
Generally support the idea; I just want to hear the answer to the two comments.
I've reworked this a bit to include the environment variables by default and to use the tarball-based distribution instead of just downloading the binary as before.
Force-pushed from 7f81383 to d59f881.
LGTM. Thanks! Note that ollama.service won't start automatically when the sysext is loaded. If that's not an issue, we can go ahead. :)
Hmm, that seems like it probably should happen on boot; I'll take a look at changing that.
When a service has to be started when the sysext is loaded, we usually add a drop-in, as in sysext-bakery/create_wasmcloud_sysext.sh, lines 92 to 94 at 82e4914 (see the sketch below).
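For reference, a minimal sketch of that pattern, assuming the Ollama script mirrors the wasmCloud one; the drop-in file name and path here are illustrative, not copied from the actual script:

```bash
# Add an Upholds= drop-in on multi-user.target inside the sysext image, so
# systemd starts ollama.service whenever the extension is merged and the
# target is reached. SYSEXTNAME is the staging directory used by the script.
mkdir -p "${SYSEXTNAME}/usr/lib/systemd/system/multi-user.target.d"
cat > "${SYSEXTNAME}/usr/lib/systemd/system/multi-user.target.d/10-ollama-service.conf" <<'EOF'
[Unit]
Upholds=ollama.service
EOF
```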
@tormath1 thanks for the pointer. I thought I had originally included that already, but it looks like I'd overlooked it. Should be good to go now 🙂
Thanks! Feel free to squash everything into a single commit (`feat: add ollama sysext`).
Signed-off-by: Joonas Bergius <joonas@cosmonic.com>
Add Ollama sysext
This creates a sysext for running Ollama on Flatcar.
How to use
Create a Flatcar instance with a configuration that looks roughly like the sketch below:
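(The exact configuration from the PR isn't reproduced here; this Butane snippet is a sketch of the usual sysext-bakery provisioning pattern, with the image name and release URL as placeholder assumptions.)

```yaml
variant: flatcar
version: 1.0.0
storage:
  files:
    # Download the Ollama sysext image to a persistent location
    # (placeholder file name and URL; use the real bakery release asset).
    - path: /opt/extensions/ollama-x86-64.raw
      contents:
        source: https://github.com/flatcar/sysext-bakery/releases/download/latest/ollama-x86-64.raw
  links:
    # Symlink into /etc/extensions so systemd-sysext merges it at boot.
    - target: /opt/extensions/ollama-x86-64.raw
      path: /etc/extensions/ollama.raw
      hard: false
```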
Testing done
I've deployed a DigitalOcean droplet with the above configuration and verified that the expected version of Ollama was available and could be used for inference (following steps from the Ollama vision models blog post):
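For illustration, a sketch of what that verification could look like; the commands below are assumptions in the spirit of the blog post, not a record of the exact steps run:

```bash
# Check that the sysext is merged and the expected Ollama binary is present.
systemd-sysext status
ollama --version

# Run a vision model roughly as in the Ollama blog post
# (model name and prompt are illustrative).
ollama run llama3.2-vision "Describe the image /tmp/example.png"
```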
- [ ] Changelog entries added in the respective changelog/ directory (user-facing change, bug fix, security fix, update)
- [ ] Inspected CI output for image differences: /boot and /usr size, packages, list files for any missing binaries, kernel modules, config files, etc.