
Browser compatible? #13

Closed
radames opened this issue Jan 22, 2024 · 8 comments · Fixed by #47
radames commented Jan 22, 2024

Hi, thanks for building this client. I was testing it in a browser app but got an error during streaming generation.

Here is a reproduction using the la

run

 OLLAMA_ORIGINS=* ollama run mistral

then

https://jsfiddle.net/pnazk0cs/17/

<script type="module">
import {  Ollama } from 'https://esm.run/ollama';

const ollama = new Ollama();

async function generate() {
  try {
    const stream = await ollama.generate("mistral", "What is a llama in 5 words?", {
      stream: true,
    })
    for await (const out of stream) {
      console.log(out);
    }
  } catch (e) {
    console.error(e);
  }
}
generate();
</script>
_display/?editor_console=true:72 TypeError: t[Symbol.iterator] is not a function
    at n (utils.js:13:89)
    at utils.js:79:38
    at Generator.next (<anonymous>)
    at c (utils.js:23:42)
    at utils.js:22:121
    at new Promise (<anonymous>)
    at i.<computed>.r.<computed> [as next] (utils.js:22:63)
    at f.<anonymous> (index.js:77:104)
    at Generator.next (<anonymous>)
    at l (index.js:23:42)
@BruceMacD BruceMacD added enhancement New feature or request and removed enhancement New feature or request labels Jan 22, 2024
BruceMacD (Collaborator) commented

Hey @radames, thanks for opening the issue. I just released the new version of the JavaScript library last night. If you get a chance to try again, let me know if it works. Your code should be updated to this:

import { Ollama } from 'https://esm.run/ollama';

const ollama = new Ollama();

async function generate() {
  try {
    const stream = await ollama.generate({model: "mistral", prompt: "What is a llama in 5 words?", stream: true})
    for await (const out of stream) {
      console.log(out);
    }
  } catch (e) {
    console.error(e);
  }

}

generate();

radames (Author) commented Jan 24, 2024

Thanks @BruceMacD, it still doesn't work.

Also, the newer version introduced another issue. My guess is that #3 might help the bundlers figure it out, wdyt?

https://cdn.jsdelivr.net/npm/ollama@0.4.0/+esm

/**
 * Failed to bundle using Rollup v2.79.1: the file imports a not supported node.js built-in module "fs".
 * If you believe this to be an issue with jsDelivr, and not with the package itself, please open an issue at https://github.com/jsdelivr/jsdelivr
 */

 throw new Error('Failed to bundle using Rollup v2.79.1: the file imports a not supported node.js built-in module "fs". If you believe this to be an issue with jsDelivr, and not with the package itself, please open an issue at https://github.com/jsdelivr/jsdelivr');

Compare to
https://cdn.jsdelivr.net/npm/ollama@0.3.0/+esm
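As an aside, a common way libraries avoid this class of bundling failure (a sketch under assumption, not the actual ollama-js configuration; the file paths are hypothetical) is a conditional "browser" entry in the package.json exports map, so CDN bundlers like jsDelivr's Rollup resolve an fs-free build:

```json
{
  "exports": {
    ".": {
      "browser": "./dist/browser.mjs",
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```

With this, a bundler resolving for the browser condition never sees the Node-only entry point that imports fs.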

saul-jb (Collaborator) commented Jan 24, 2024

Seems like 2ef7df8 added a dependency on fs, which is not available in the browser.

@radames You will need to ignore or polyfill the module to get it working in the browser: rollup/rollup#897

@BruceMacD Having to patch Node packages to get them working in the browser is a poor experience; a better way is to allow passing a module through the constructor, like how I had fetch working (#2).
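A minimal sketch of that constructor-injection idea (the class name, the fetch/fs options, and encodeImage are all hypothetical, not the actual ollama-js API): environment-specific modules are passed in by the caller, so the library itself never imports fs at module scope.

```javascript
// Hypothetical sketch of dependency injection through the constructor.
// Nothing here is the real ollama-js API; names are illustrative only.
class OllamaClient {
  constructor({ fetch: fetchImpl, fs: fsImpl } = {}) {
    // Fall back to the global fetch; fs stays null in the browser.
    this.fetch = fetchImpl ?? globalThis.fetch;
    this.fs = fsImpl ?? null;
  }

  async encodeImage(pathOrBytes) {
    // Only touch fs when the caller actually provided it (i.e. in Node).
    if (typeof pathOrBytes === "string") {
      if (!this.fs) throw new Error("fs not provided; pass raw bytes instead");
      const data = await this.fs.promises.readFile(pathOrBytes);
      return data.toString("base64");
    }
    // Buffer is used for brevity here, so this branch is Node-only too;
    // a browser build would base64-encode the bytes another way.
    return Buffer.from(pathOrBytes).toString("base64");
  }
}
```

In Node you would pass something like `fs: await import("node:fs")`; in the browser you simply omit it and work with raw bytes.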

dditlev commented Jan 24, 2024

Hello, I wrote a small JS client that works in both Node and the browser. I've abstracted fs behind a dynamic import that picks Node streams or browser streams. Perhaps this can be useful: https://github.com/dditlev/ollama-js-client/blob/main/src/fetch_jsonstream.ts

if (isNode()) {
  // Handle Node.js streams
  const { Readable } = await import("stream");
  if (response.body instanceof Readable) {
    return this.processNodeStream(response.body, decoder, callback);
  }
} else if (isBrowserReadableStream(response.body)) {
  // Handle browser streams
  const reader = response.body.getReader();
  return this.processStreamReader(reader, decoder, callback);
} else {
  callback({ error: "Unrecognized stream type" });
}
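For reference, the two helpers this snippet relies on could look like the following. This is a guess at their shape, not code copied from ollama-js-client.

```javascript
// Illustrative implementations of the environment checks used above.
function isNode() {
  // process.versions.node is only defined in a Node.js runtime.
  return typeof process !== "undefined" && !!process.versions?.node;
}

function isBrowserReadableStream(body) {
  // A WHATWG ReadableStream exposes getReader(); Node's stream.Readable does not.
  return typeof ReadableStream !== "undefined" && body instanceof ReadableStream;
}
```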

I would like to contribute, and perhaps direct people from my repo to this repo, so there is only one good JS Ollama lib?

nshcr (Contributor) commented Jan 25, 2024

ollama-js/src/utils.ts

Lines 103 to 104 in a24c2de

// TS is a bit strange here, ReadableStreams are AsyncIterable but TS doesn't see it.
for await (const chunk of itr as unknown as AsyncIterable<Uint8Array>) {

I found a problem in this line.
According to the MDN documentation[1], only Firefox and Node.js/Deno currently support async iteration (the AsyncIterator protocol) on ReadableStream.
This is why there is a TS error and why it doesn't work in Chromium-based browsers.

Footnotes

  1. https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream
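That compatibility gap can be feature-detected at runtime. This small check (illustrative, not from the library) reports whether `for await` works directly on a ReadableStream; per the MDN note above, Node.js and Firefox report true while older Chromium reports false.

```javascript
// Detect whether this runtime lets you `for await` a ReadableStream directly.
function supportsStreamAsyncIteration() {
  return typeof ReadableStream !== "undefined" &&
    typeof ReadableStream.prototype[Symbol.asyncIterator] === "function";
}
```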

radames (Author) commented Jan 27, 2024

Great point, @nshcr. They might need to use a polyfill here, https://www.npmjs.com/package/@azure/core-asynciterator-polyfill, and maybe use https://github.com/egoist/tsup to bundle the TS code?

nshcr (Contributor) commented Jan 27, 2024

> Great point, @nshcr. They might need to use a polyfill here https://www.npmjs.com/package/@azure/core-asynciterator-polyfill and maybe use https://github.com/egoist/tsup to bundle the TS code?

I'm afraid @azure/core-asynciterator-polyfill is just a polyfill for AsyncIterator in general, rather than for async iteration on ReadableStream.
Based on this issue, I suggest using the standard reader API on ReadableStream instead of for await.
Here is a piece of rewritten code that should work in most places and behaves roughly the same as before.

const reader = itr.getReader()
while (true) {
  const { done, value: chunk } = await reader.read()

  buffer += decoder.decode(chunk)

  //...

  if (done) {
    break
  }
}
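One way to package that loop while keeping the `for await` ergonomics for callers (a sketch, not necessarily the fix that landed): wrap the reader in an async generator, which works everywhere async generators themselves are supported, without relying on ReadableStream being async-iterable.

```javascript
// Wrap a WHATWG ReadableStream's reader loop in an async generator that
// yields decoded text chunks. Sketch only; not the ollama-js implementation.
async function* streamText(stream) {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      // stream: true buffers multi-byte sequences split across chunks.
      yield decoder.decode(value, { stream: true });
    }
    // Flush any buffered trailing bytes.
    const tail = decoder.decode();
    if (tail) yield tail;
  } finally {
    reader.releaseLock();
  }
}
```

Callers then keep writing `for await (const text of streamText(response.body)) { ... }` in every engine.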

@naveenraj008

how to get links
