Browser compatible? #13
Hey @radames, thanks for opening the issue. I just released a new version of the JavaScript library last night. If you get a chance to try again, let me know if it works. Your code should be updated to this:

```javascript
import { Ollama } from 'https://esm.run/ollama';

const ollama = new Ollama();

async function generate() {
  try {
    const stream = await ollama.generate({ model: "mistral", prompt: "What is a llama in 5 words?", stream: true });
    for await (const out of stream) {
      console.log(out);
    }
  } catch (e) {
    console.error(e);
  }
}

generate();
```
Thanks @BruceMacD, it still doesn't work, and the newer version introduced another issue. My guess is #3 might help the bundlers figure it out, wdyt? https://cdn.jsdelivr.net/npm/ollama@0.4.0/+esm
Seems like 2ef7df8 added a dependency on a Node module. @radames, you will need to ignore or polyfill the module to get it working in the browser: rollup/rollup#897

@BruceMacD, I find having to patch Node packages to get things working in the browser a poor experience; a better way is to allow passing a module through the constructor, like how I had fetch working (#2).
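For anyone hitting this with Rollup: "ignoring" a Node built-in usually means marking it external or stubbing it out. A minimal sketch of the `external` route (the entry and output paths are placeholders, not this repo's layout, and it assumes the unwanted import is a built-in like `fs` that is never reached on the browser code path):

```javascript
// rollup.config.js — a sketch, not this project's actual config.
// Assumption: the Node built-in pulled in by 2ef7df8 is never reached
// in the browser, so leaving it unresolved in the bundle is safe.
export default {
  input: "src/index.js",                 // placeholder entry point
  output: { file: "dist/browser.js", format: "es" },
  external: ["fs", "path"],              // stop Rollup from resolving built-ins
};
```

The alternative discussed in rollup/rollup#897 is a small plugin that resolves the built-in to an empty module; either way, injecting the dependency through the constructor (as in #2) sidesteps the problem entirely.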
Hello, I wrote a small JS client that works in both Node and the browser. I've abstracted the fs into a dynamic import that picks Node streams or browser streams. Perhaps this can be useful: https://github.com/dditlev/ollama-js-client/blob/main/src/fetch_jsonstream.ts

```typescript
if (isNode()) {
  // Handle Node.js streams
  const { Readable } = await import("stream");
  if (response.body instanceof Readable) {
    return this.processNodeStream(response.body, decoder, callback);
  }
} else if (isBrowserReadableStream(response.body)) {
  // Handle browser streams
  const reader = response.body.getReader();
  return this.processStreamReader(reader, decoder, callback);
} else {
  callback({ error: "Unrecognized stream type" });
}
```

I would like to contribute, and perhaps direct people from my repo to this repo, so there is only one good JS Ollama lib?
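The snippet above assumes two environment-detection helpers, `isNode()` and `isBrowserReadableStream()`. A minimal sketch of how they might look (the names come from the snippet; the bodies below are an assumption, not the actual ollama-js-client source):

```javascript
// Sketch: environment-detection helpers assumed by the snippet above.
// Assumption: "Node" means a runtime exposing process.versions.node.
function isNode() {
  return typeof process !== "undefined" &&
         process.versions != null &&
         process.versions.node != null;
}

// Assumption: a "browser" stream is anything implementing the WHATWG
// ReadableStream interface (also available globally in Node 18+).
function isBrowserReadableStream(body) {
  return typeof ReadableStream !== "undefined" &&
         body instanceof ReadableStream;
}
```

Detecting the capability (a WHATWG stream) rather than the platform keeps the browser branch working in runtimes like Deno as well.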
I found a problem in lines 103 to 104 in a24c2de.
Great point, @nshcr. They might need to use a polyfill here: https://www.npmjs.com/package/@azure/core-asynciterator-polyfill and maybe use https://github.com/egoist/tsup to bundle the TS code?
I'm afraid that this code is the problem:

```javascript
const reader = itr.getReader()
while (true) {
  const { done, value: chunk } = await reader.read()
  buffer += decoder.decode(chunk)
  // ...
  if (done) {
    break
  }
}
```
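One common workaround for engines where `ReadableStream` lacks `Symbol.asyncIterator` (so `for await` fails but `getReader()` still works, e.g. Safari) is to wrap the reader in an async generator. A sketch, not part of the library:

```javascript
// Sketch: adapt a ReadableStream to `for await` via getReader(), for
// engines whose streams lack Symbol.asyncIterator (e.g. Safari).
async function* iterateStream(stream) {
  const reader = stream.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return;        // check `done` before touching the chunk
      yield value;
    }
  } finally {
    reader.releaseLock();      // release the lock on early exit too
  }
}
```

Note the loop checks `done` before using the chunk, whereas the quoted code decodes first; `TextDecoder.prototype.decode` with no input happens to return an empty string, so the quoted code still works, but checking first is clearer.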
How to get links?
Hi, thanks for building this client. I was testing it in a browser app, but got an error for the streaming generation.

Here is a reproduction using the la… Run:

```shell
OLLAMA_ORIGINS=* ollama run mistral
```

then open https://jsfiddle.net/pnazk0cs/17/