From 053bfd6b3d85136ffefcfe8eb1732206fe3be9fc Mon Sep 17 00:00:00 2001
From: Bruce MacDonald
Date: Tue, 23 Jan 2024 14:38:18 -0500
Subject: [PATCH 1/2] Update README.md

---
 README.md | 42 ++++++++++++++++++++++++++++++------------
 1 file changed, 30 insertions(+), 12 deletions(-)

diff --git a/README.md b/README.md
index aefe999..26eb1c9 100644
--- a/README.md
+++ b/README.md
@@ -10,8 +10,6 @@ npm i ollama
 
 ## Usage
 
-A global default client is provided for convenience and can be used for both single and streaming responses.
-
 ```javascript
 import ollama from 'ollama'
 
@@ -22,6 +20,9 @@ const response = await ollama.chat({
 console.log(response.message.content)
 ```
 
+## Streaming responses
+Response streaming can be enabled by setting `stream: true`. This changes the function call to return an `AsyncGenerator`, where each part is an object in the stream.
+
 ```javascript
 import ollama from 'ollama'
 
@@ -32,19 +33,19 @@ for await (const part of response) {
 }
 ```
 
-## API
-
-The API aims to mirror the [HTTP API for Ollama](https://github.com/jmorganca/ollama/blob/main/docs/api.md).
-
-### Ollama
-
+## Create
 ```javascript
-new Ollama(config)
+import ollama from 'ollama'
+
+const modelfile = `
+FROM llama2
+SYSTEM "You are mario from super mario bros."
+`
+await ollama.create({ model: 'example', modelfile: modelfile })
 ```
 
-- `config` `` (Optional) Configuration object for Ollama.
-  - `host` `` (Optional) The Ollama host address. Default: `"http://127.0.0.1:11434"`.
-  - `fetch` `` (Optional) The fetch library used to make requests to the Ollama host.
+## API
+The Ollama JavaScript library's API is designed around the [Ollama REST API](https://github.com/jmorganca/ollama/blob/main/docs/api.md).
 
 ### chat
 
@@ -178,6 +179,23 @@ ollama.embeddings(request)
   - `options` ``: (Optional) Options to configure the runtime.
 - Returns: ``
 
+## Custom client
+
+A custom client can be created with the following fields:
+
+- `host` ``: (Optional) The Ollama host address. Default: `"http://127.0.0.1:11434"`.
+- `fetch` ``: (Optional) The fetch library used to make requests to the Ollama host.
+
+```javascript
+import { Ollama } from '../src/index'
+
+const ollama = new Ollama({ host: 'http://localhost:11434' })
+const response = await ollama.chat({
+  model: 'llama2',
+  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
+})
+```
+
 ## Building
 
 To build the project files run:

From 6305023e59d482a8504b21116d7158bfe1f57ac9 Mon Sep 17 00:00:00 2001
From: Bruce MacDonald
Date: Tue, 23 Jan 2024 14:53:07 -0500
Subject: [PATCH 2/2] Update README.md

Co-authored-by: Jeffrey Morgan
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 26eb1c9..14467ec 100644
--- a/README.md
+++ b/README.md
@@ -187,7 +187,7 @@ A custom client can be created with the following fields:
 - `fetch` ``: (Optional) The fetch library used to make requests to the Ollama host.
 
 ```javascript
-import { Ollama } from '../src/index'
+import { Ollama } from 'ollama'
 
 const ollama = new Ollama({ host: 'http://localhost:11434' })
 const response = await ollama.chat({