docs: Add docker instructions, add community projects section in README #1359

Merged 1 commit on Nov 28, 2023
12 changes: 10 additions & 2 deletions README.md

Check out the [Getting started](https://localai.io/basics/getting_started/index.html) section in our documentation.

### Community

WebUI:
- https://github.com/Jirubizu/localai-admin
- https://github.com/go-skynet/LocalAI-frontend

Model galleries:
- https://github.com/go-skynet/model-gallery

Other:
- Helm chart https://github.com/go-skynet/helm-charts

### 🔗 Resources

34 changes: 30 additions & 4 deletions docs/content/getting_started/_index.en.md
weight = 1
url = '/basics/getting_started/'
+++

`LocalAI` is available as a container image and binary. It can be used with Docker, Podman, Kubernetes, or any other container engine. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/local-ai?tab=tags&tag=latest).

See also our [How to]({{%relref "howtos" %}}) section for end-to-end guided examples curated by the community.

### How to get started
For an always up-to-date, step-by-step guide to setting up LocalAI, please see our [How to]({{%relref "howtos" %}}) page.

### Fast Setup
The easiest way to run LocalAI is by using [`docker compose`](https://docs.docker.com/compose/install/) or with [Docker](https://docs.docker.com/engine/install/) (to build locally, see the [build section]({{%relref "build" %}})).

{{< tabs >}}
{{% tab name="Docker" %}}

```bash
# Prepare the models in the `models` directory
mkdir models
# copy your models to it
cp your-model.bin models/
# run the LocalAI container
docker run -p 8080:8080 -v $PWD/models:/models -ti --rm quay.io/go-skynet/local-ai:latest --models-path /models --context-size 700 --threads 4
# Try the endpoint with curl
curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
"model": "your-model.bin",
"prompt": "A long time ago in a galaxy far, far away",
"temperature": 0.7
}'
```
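Since the request body above is plain JSON, it can be sanity-checked before being sent. A minimal sketch (`your-model.bin` is a placeholder for whatever file you copied into `models/`):

```shell
# Write the completion request body to a file; "your-model.bin" is a
# placeholder for the model file you placed in the models directory
cat > request.json <<'EOF'
{
  "model": "your-model.bin",
  "prompt": "A long time ago in a galaxy far, far away",
  "temperature": 0.7
}
EOF

# Validate the JSON before POSTing it
python3 -m json.tool request.json

# Then send it to the running container:
# curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d @request.json
```

Keeping the payload in a file makes it easy to tweak parameters such as `temperature` without re-typing the whole command.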

{{% /tab %}}
{{% tab name="Docker compose" %}}


```bash
# Try the endpoint with curl
curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
     "model": "your-model.bin",
     "prompt": "A long time ago in a galaxy far, far away",
     "temperature": 0.7
}'
```
{{% /tab %}}

{{< /tabs >}}
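
If you prefer to pin the container settings in a file, a compose definition equivalent to the `docker run` invocation above might look like the following. This is an illustrative sketch; the LocalAI repository ships its own, more complete compose file, which should be treated as authoritative.

```shell
# Write a minimal docker-compose.yaml mirroring the `docker run` flags above.
# Sketch only; check the compose file shipped in the LocalAI repository.
cat > docker-compose.yaml <<'EOF'
version: "3.6"
services:
  api:
    image: quay.io/go-skynet/local-ai:latest
    ports:
      - "8080:8080"
    volumes:
      - ./models:/models
    command: ["--models-path", "/models", "--context-size", "700", "--threads", "4"]
EOF

# Start the service (requires Docker with the compose plugin):
# docker compose up -d
```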

### Example: Use luna-ai-llama2 model with `docker compose`
