
Commit

docs: Add docker instructions
mudler committed Nov 28, 2023
1 parent b782136 commit 19e41d1
Showing 2 changed files with 40 additions and 6 deletions.
12 changes: 10 additions & 2 deletions README.md
@@ -109,9 +109,17 @@ Hot topics:

Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section in our documentation.

### 💡 Example: Use Luna-AI Llama model
### Community

See the [documentation](https://localai.io/basics/getting_started)
WebUI
- https://github.com/Jirubizu/localai-admin
- https://github.com/go-skynet/LocalAI-frontend

Model galleries
- https://github.com/go-skynet/model-gallery

Other:
- Helm chart https://github.com/go-skynet/helm-charts

### 🔗 Resources

34 changes: 30 additions & 4 deletions docs/content/getting_started/_index.en.md
@@ -6,13 +6,36 @@ weight = 1
url = '/basics/getting_started/'
+++

`LocalAI` is available as a container image and binary. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).
`LocalAI` is available as a container image and binary. It can be used with Docker, Podman, Kubernetes, and any other container engine. You can check out all the available images with their corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).

See also our [How to]({{%relref "howtos" %}}) section for end-to-end guided examples curated by the community.

### How to get started
For an always up-to-date, step-by-step guide to setting up LocalAI, please see our [How to]({{%relref "howtos" %}}) page.

### Fast Setup
The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})). The following example uses `docker compose`:
The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})).

{{< tabs >}}
{{% tab name="Docker" %}}

```bash
# Prepare a `models` directory for the model files
mkdir models
# copy your models to it
cp your-model.bin models/
# run the LocalAI container
docker run -p 8080:8080 -v $PWD/models:/models -ti --rm quay.io/go-skynet/local-ai:latest --models-path /models --context-size 700 --threads 4
# Try the endpoint with curl
curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
"model": "your-model.bin",
"prompt": "A long time ago in a galaxy far, far away",
"temperature": 0.7
}'
```
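The same request can be issued from Python against LocalAI's OpenAI-compatible endpoint. This is a minimal sketch using only the standard library; the endpoint URL and model name mirror the curl example above and are assumptions about your local setup:

```python
import json
import urllib.request

# Endpoint and model name mirror the curl example above; both are
# assumptions about your local setup, so adjust them as needed.
API_URL = "http://localhost:8080/v1/completions"


def completion_payload(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """Build the JSON request body expected by the completions endpoint."""
    return {"model": model, "prompt": prompt, "temperature": temperature}


def complete(model: str, prompt: str, temperature: float = 0.7) -> dict:
    """POST a completion request to a running LocalAI instance and parse the reply."""
    body = json.dumps(completion_payload(model, prompt, temperature)).encode("utf-8")
    request = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

With the container from the previous step running, `complete("your-model.bin", "A long time ago in a galaxy far, far away")` should return the same JSON document the curl call prints.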

{{% /tab %}}
{{% tab name="Docker compose" %}}



@@ -44,6 +67,9 @@

```bash
curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
"temperature": 0.7
}'
```
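For reference, a compose file along these lines matches the `docker run` invocation shown in the Docker tab. This is a sketch under the assumption that the service layout follows the upstream example; prefer the `docker-compose.yaml` shipped in the LocalAI repository:

```yaml
version: "3.6"

services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    ports:
      - 8080:8080
    volumes:
      - ./models:/models
    # Same flags as the `docker run` example in the Docker tab
    command: ["--models-path", "/models", "--context-size", "700", "--threads", "4"]
```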
{{% /tab %}}

{{< /tabs >}}

### Example: Use luna-ai-llama2 model with `docker compose`

