feat: v0.0.3
- basic eject functionality
- ollama commands
av committed Jul 29, 2024
1 parent aab6fee commit 9d0a266
Showing 4 changed files with 82 additions and 7 deletions.
34 changes: 32 additions & 2 deletions README.md
@@ -89,13 +89,13 @@ graph LR
class SearXNG optional
```

This project is a script around a pre-configured Docker Compose setup that connects various LLM-related projects together. It simplifies the initial configuration and can serve as a base for your own customized setup.
This project is a CLI and a pre-configured Docker Compose setup that connects various LLM-related projects together. It simplifies the initial configuration and can serve as a base for your own customized setup.

- Services are pre-configured to work together
- Reused local cache - huggingface, ollama, etc.
- All configuration in one place
- Access required CLIs via Docker without installing them

- Eject from Harbor at any time

## Harbor CLI Reference

@@ -193,6 +193,36 @@ harbor hf --help
harbor hf scan-cache
```

### `harbor ollama <command>`

Runs the Ollama CLI in the container against the Harbor configuration.

```bash
# All Ollama commands are available
harbor ollama --version

# Show currently cached models
harbor ollama list

# See help for more commands
harbor ollama --help
```

### `harbor eject`

Renders Harbor's Docker Compose configuration into a standalone config that can be moved and used elsewhere. Accepts the same options as `harbor up`.

```bash
# Eject with default services
harbor eject

# Eject with additional services
harbor eject searxng

# Typically, you'll want to save the output to a file
harbor eject searxng llamacpp > docker-compose.harbor.yml
```
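Since the ejected config is rendered standalone, it can be run with plain Docker Compose from any directory, with no Harbor checkout present. A minimal sketch (the filename matches the redirect example above; the `docker compose` invocations are left as comments because they need a running Docker daemon):

```shell
# Filename from the redirect example above
compose_file="docker-compose.harbor.yml"

# With Docker available, the ejected file works on its own:
#   docker compose -f "$compose_file" up -d
#   docker compose -f "$compose_file" down

echo "standalone config: $compose_file"
```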

## Services Overview

| Service | Handle / Local URL | Description |
2 changes: 1 addition & 1 deletion compose.llamacpp.yml
@@ -8,7 +8,7 @@ services:
- 33831:8080
command: >
--server
--model ${HARBOR_LLAMACPP_MODEL}
--model $(./scripts/hf.sh https://huggingface.co/bartowski/Qwen2-7B-Instruct-GGUF/blob/main/Qwen2-7B-Instruct-IQ2_S.gguf)
--port 8080
--host 0.0.0.0
networks:
48 changes: 46 additions & 2 deletions harbor.sh
@@ -88,14 +88,17 @@ show_help() {
echo " logs - View the logs of the containers"
echo " help - Show this help message"
echo
echo "Setup Manageent Commands:"
echo " hf - Run the Hugging Face CLI"
echo "Setup Management Commands:"
echo " hf - Run the Harbor's Hugging Face CLI"
echo " ollama - Run the Harbor's Ollama CLI. Ollama service should be running"
echo " smi - Show NVIDIA GPU information"
echo
echo "CLI Commands:"
echo " open - Open a service in the default browser"
echo " ln - Create a symbolic link to the CLI"
echo " defaults - Show the default services"
echo " version - Show the CLI version"
echo " eject - Eject the Compose configuration, accepts same options as 'up'"
echo
echo "Options:"
echo " Additional options to pass to the compose_with_options function"
@@ -185,6 +188,35 @@ open_service() {
echo "Opened $url in your default browser."
}

smi() {
if command -v nvidia-smi &> /dev/null; then
nvidia-smi
else
echo "nvidia-smi not found."
fi
}
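The `command -v` guard used by `smi` above is the POSIX-portable way to probe for an optional binary (quieter and more reliable than `which`). A standalone sketch of the same pattern, using a deliberately nonexistent tool name:

```shell
# `command -v` prints the resolved path and exits non-zero when the
# executable is absent; redirecting both streams keeps the check silent.
tool="surely-not-installed-anywhere"

if command -v "$tool" >/dev/null 2>&1; then
  echo "$tool found"
else
  echo "$tool not found."
fi
# prints: surely-not-installed-anywhere not found.
```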

eject() {
$(compose_with_options "$@") config
}

run_in_service() {
local service_name="$1"
shift
local command_to_run="$@"

if docker compose ps --services --filter "status=running" | grep -q "^${service_name}$"; then
echo "Service ${service_name} is running. Executing command..."
docker compose exec ${service_name} ${command_to_run}
else
echo "Harbor ${service_name} is not running. Please start it with 'harbor up ${service_name}' first."
fi
}
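The running-service check in `run_in_service` depends on grep's `^`/`$` anchors to match the service name exactly, line by line. A self-contained illustration with made-up service names (not services Harbor actually ships):

```shell
# Simulated `docker compose ps --services` output (hypothetical names)
services="webui
ollama-webui
ollama"

# Anchored: -q matches only the exact line "ollama"
if printf '%s\n' "$services" | grep -q "^ollama$"; then
  echo "exact match: ollama"
fi

# Unanchored: a substring match would also count "ollama-webui"
printf '%s\n' "$services" | grep -c "ollama"
# prints: 2
```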

exec_ollama() {
run_in_service ollama ollama "$@"
}

cd $harbor_home

# Main script logic
@@ -236,6 +268,18 @@ case "$1" in
shift
show_version
;;
smi)
shift
smi
;;
eject)
shift
eject "$@"
;;
ollama)
shift
exec_ollama "$@"
;;
*)
echo "Unknown command: $1"
show_help
5 changes: 3 additions & 2 deletions open-webui/config.json
@@ -25,8 +25,9 @@
"http://llamacpp:8080/v1"
],
"api_keys": [
""
]
"123"
],
"enabled": true
},
"image_generation": {
"engine": "comfyui",
