Ollama Swift Client

A Swift client library for interacting with the Ollama API.

Requirements

A running Ollama server, reachable locally (by default at http://localhost:11434) or at a custom host.

Installation

Swift Package Manager

Add the following to your Package.swift file:

.package(url: "https://github.com/mattt/ollama-swift.git", from: "1.0.0")
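
Then add Ollama as a dependency of your target. The product name "Ollama" below matches the module imported in the examples; "MyApp" is a placeholder target name:

.target(
    name: "MyApp",
    dependencies: [
        .product(name: "Ollama", package: "ollama-swift")
    ]
)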

Usage

Note

The tests and example code for this library use the llama3.2 model. To run them yourself, download the model first:

ollama pull llama3.2

Initializing the client

import Ollama

// Use the default client (http://localhost:11434)
let client = Client.default

// Or create a custom client
let customClient = Client(host: URL(string: "http://your-ollama-host:11434")!, userAgent: "MyApp/1.0")

Generating text

Generate text using a specified model:

do {
    let response = try await client.generate(
        model: "llama3.2",
        prompt: "Tell me a joke about Swift programming.",
        options: [
            "temperature": 0.7,
            "max_tokens": 100
        ]
    )
    print(response.response)
} catch {
    print("Error: \(error)")
}

Chatting with a model

Generate a chat completion:

do {
    let response = try await client.chat(
        model: "llama3.2",
        messages: [
            .system("You are a helpful assistant."),
            .user("In which city is Apple Inc. located?")
        ]
    )
    print(response.message.content)
} catch {
    print("Error: \(error)")
}
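
To carry on a multi-turn conversation, include the model's previous reply in the message history before sending the next user message. This is a sketch that assumes an .assistant message case exists alongside the .system and .user cases shown above:

do {
    // First exchange, as in the example above.
    let first = try await client.chat(
        model: "llama3.2",
        messages: [
            .system("You are a helpful assistant."),
            .user("In which city is Apple Inc. located?")
        ]
    )

    // Follow-up question, with the assistant's reply appended to the history.
    let followUp = try await client.chat(
        model: "llama3.2",
        messages: [
            .system("You are a helpful assistant."),
            .user("In which city is Apple Inc. located?"),
            .assistant(first.message.content),
            .user("What is the company best known for?")
        ]
    )
    print(followUp.message.content)
} catch {
    print("Error: \(error)")
}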

Generating embeddings

Generate embeddings for a given text:

do {
    let embeddings = try await client.createEmbeddings(
        model: "llama3.2",
        input: "Here is an article about llamas..."
    )
    print("Embeddings: \(embeddings)")
} catch {
    print("Error: \(error)")
}
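
Embeddings are typically compared with cosine similarity. The helper below is plain Swift and assumes each embedding can be obtained as an array of Double values; adapt the conversion to whatever type createEmbeddings returns in your version of the library:

import Foundation

// Cosine similarity between two equal-length vectors:
// 1.0 means identical direction, 0.0 means orthogonal (unrelated).
func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
    precondition(a.count == b.count, "Vectors must have the same length")
    let dot = zip(a, b).reduce(0.0) { $0 + $1.0 * $1.1 }
    let magnitudeA = sqrt(a.reduce(0.0) { $0 + $1 * $1 })
    let magnitudeB = sqrt(b.reduce(0.0) { $0 + $1 * $1 })
    guard magnitudeA > 0, magnitudeB > 0 else { return 0 }
    return dot / (magnitudeA * magnitudeB)
}

// Example: compare two embedding vectors obtained from createEmbeddings.
// let score = cosineSimilarity(vectorA, vectorB)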

Managing models

Listing models

List available models:

do {
    let models = try await client.listModels()
    for model in models {
        print("Model: \(model.name), Modified: \(model.modifiedAt)")
    }
} catch {
    print("Error: \(error)")
}
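
For example, to show the most recently modified models first (a sketch that assumes modifiedAt is a Date or another Comparable type):

do {
    let models = try await client.listModels()
    let mostRecent = models.sorted { $0.modifiedAt > $1.modifiedAt }
    for model in mostRecent.prefix(5) {
        print("Model: \(model.name), Modified: \(model.modifiedAt)")
    }
} catch {
    print("Error: \(error)")
}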

Retrieving model information

Get detailed information about a specific model:

do {
    let modelInfo = try await client.showModel("llama3.2")
    print("Modelfile: \(modelInfo.modelfile)")
    print("Parameters: \(modelInfo.parameters)")
    print("Template: \(modelInfo.template)")
} catch {
    print("Error: \(error)")
}

Pulling a model

Download a model from the Ollama library:

do {
    let success = try await client.pullModel("llama3.2")
    if success {
        print("Model successfully pulled")
    } else {
        print("Failed to pull model")
    }
} catch {
    print("Error: \(error)")
}
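
Combining listModels with pullModel lets you download a model only when it isn't already available locally. This sketch assumes model names are plain strings that may include a tag such as ":latest":

do {
    let installed = try await client.listModels()
    let hasModel = installed.contains { $0.name.hasPrefix("llama3.2") }
    if hasModel {
        print("Model already available")
    } else {
        let success = try await client.pullModel("llama3.2")
        print(success ? "Model successfully pulled" : "Failed to pull model")
    }
} catch {
    print("Error: \(error)")
}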

Pushing a model

Push a local model to the Ollama library:

do {
    let success = try await client.pushModel("mynamespace/mymodel:latest")
    if success {
        print("Model successfully pushed")
    } else {
        print("Failed to push model")
    }
} catch {
    print("Error: \(error)")
}
