Dockerize Nexus (#5)
* initial dockerfile

* docker compose for sui network

* add ollama

* add faucet

* use local images

* add publish packages

* add events

* finish docker compose

* fix bug

* update tags

* cleanup

* cleanup

* un ignore docker .env

* use tag testnet-v1.28.3

* run github action on specific paths only

* fix python formatting and compose service startup

* windows compatibility

* add nexusctl

* nexusctl

* nexusctl

* fixes for windows

* update just

* fixes for just on macos

* fix just for linux

* use python shell in just

* change quotes to single

* check if brew is installed

* update readme

* update readme

* reformat nexusctl

* change image name

* update docker readme

* use base config for validator compose

* Update docker/README.md

Co-authored-by: Christos KK Loverdos <loverdos@gmail.com>

* Update docker/nexus/Dockerfile

Co-authored-by: Christos KK Loverdos <loverdos@gmail.com>

* Update docker/nexus/Dockerfile

Co-authored-by: Christos KK Loverdos <loverdos@gmail.com>

* use EOF notation

* fix ignored build args

* make setting up venv in build a separate script

* update github action to trigger on PR

* use set dockerfile syntax

---------

Co-authored-by: Christos KK Loverdos <loverdos@gmail.com>
tuky191 and loverdos authored Nov 6, 2024
1 parent 1949203 commit 75b199c
Showing 42 changed files with 1,704 additions and 453 deletions.
2 changes: 2 additions & 0 deletions .dockerignore
@@ -0,0 +1,2 @@
./offchain/events/build
./offchain/tools/build
2 changes: 2 additions & 0 deletions .gitattributes
@@ -0,0 +1,2 @@
# Force LF line endings for all files
* text=auto eol=lf
8 changes: 6 additions & 2 deletions .github/workflows/python.yml
@@ -1,8 +1,12 @@
# Github workflow to check python code

name: Python
on: [push]

on:
  push:
    paths:
      - "examples/**"
      - "nexus_sdk/**"
      - "offchain/**"
jobs:
  # https://black.readthedocs.io/en/stable/integrations/github_actions.html
  formatting-check:
7 changes: 6 additions & 1 deletion .github/workflows/talus-agentic-framework.yml
@@ -1,8 +1,13 @@
# Github workflow to build and test the Talus Agentic Framework project

name: Talus Agentic Framework
on: [push]
on:
  pull_request:

  push:
    paths:
      - "onchain/**"
      - "e2e_tests/**"
env:
  # defines what Sui version to install from Sui's GitHub release page
  # https://github.com/MystenLabs/sui/releases
3 changes: 3 additions & 0 deletions .gitignore
@@ -106,6 +106,7 @@ celerybeat.pid

# Environments
.env
!docker/.env
.venv
env/
venv/
@@ -162,3 +163,5 @@ nohup.out
.idea
*.iml
.vscode
./docker/sui/genesis/files/
./docker/sui/genesis/files/.venv
12 changes: 12 additions & 0 deletions docker/.env
@@ -0,0 +1,12 @@
SUI_TAG=testnet-v1.28.3
LLAMA_MODEL_VERSION=llama3.2:1b
LLAMA_MODEL_VERSION_TAG=llama3.2-1b
RPC_URL=http://fullnode1:9000
WS_URL=ws://fullnode1:9000
MODEL_URL=http://ollama:11434
FAUCET_URL=http://faucet:5003/gas
TOOL_URL=http://tools:8080/tool/use
LLM_ASSISTANT_URL=http://tools:8080/predict
OLLAMA_DEVICE_DRIVER=nvidia
OLLAMA_DEVICE_COUNT=all
OLLAMA_DEVICE_CAPABILITIES=gpu
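
The values above are consumed at runtime by the Nexus services; scripts such as bootstrap_model.py and start_events.py (shown further down) read them via os.getenv and fall back to localhost defaults when a variable is unset. A minimal sketch of that pattern, using the variable names from this file:

```python
# Sketch of how the services read docker/.env values; the fallbacks mirror the
# defaults used in the bootstrap and event scripts below.
import os

rpc_url = os.getenv("RPC_URL", "http://localhost:9000")
ws_url = os.getenv("WS_URL", "ws://localhost:9000")
faucet_url = os.getenv("FAUCET_URL", "http://localhost:5003/gas")
model_url = os.getenv("MODEL_URL", "http://localhost:11434")
tool_url = os.getenv("TOOL_URL", "http://0.0.0.0:8080/tool/use")

print(f"RPC: {rpc_url}  faucet: {faucet_url}  tools: {tool_url}")
```
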
33 changes: 33 additions & 0 deletions docker/README.md
@@ -0,0 +1,33 @@
# README

## Infrastructure description

The local infra consists of the following services:

- sui
  - 4 validators
  - faucet
  - fullnode
- nexus
  - events
  - tools
  - examples
- ollama (only on Windows and Linux)

There are also a few startup services:

- sui
  - build-suitools
    - builds a Sui image with the tag specified in .env
  - build-genesis
    - runs generate.py to generate a new Sui genesis.blob and validator.yaml
  - publish-package
    - builds and publishes the Nexus smart contracts from the ./onchain directory
  - bootstrap-model
    - bootstraps a Llama model on the Sui blockchain by creating a node and the model using nexus_sdk, then saves their details for future use.

## Troubleshooting

If you encounter trouble building the `build-genesis` image, try switching the Docker context to `default`:

`docker context use default`
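
As an additional troubleshooting aid, a quick reachability check of the service endpoints can narrow down which piece is failing. A minimal sketch, not part of the repository: it uses the same in-network URLs as docker/.env, so run it from a container attached to the compose network (or substitute your host port mappings).

```python
# Hypothetical reachability probe for the local stack (assumed helper, not in the repo).
import urllib.error
import urllib.request

endpoints = {
    "fullnode RPC": "http://fullnode1:9000",
    "faucet": "http://faucet:5003/gas",
    "tools": "http://tools:8080/tool/use",
    "ollama": "http://ollama:11434",
}

for name, url in endpoints.items():
    try:
        urllib.request.urlopen(url, timeout=5)
        print(f"{name}: reachable")
    except urllib.error.HTTPError as exc:
        # The server answered (even with a 4xx/5xx), so it is up.
        print(f"{name}: reachable (HTTP {exc.code})")
    except Exception as exc:
        print(f"{name}: unreachable ({exc})")
```
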
31 changes: 31 additions & 0 deletions docker/containers.just
@@ -0,0 +1,31 @@
set shell := ["python3", "-c"]

[private]
default:
    @__import__('os').system("just -l containers")

[private]
[no-cd]
check:
    @import os, sys; from subprocess import call; result = call("docker ps | grep -q 'examples'", shell=True); \
    print("Docker environment is already running.") if result == 0 else (print("Docker environment is not running. Starting environment...") or os.system("just containers start"))

# Builds the Docker containers using Docker Compose
[no-cd]
build:
    @print("Building Docker containers..."); __import__('os').system("python3 ./docker/nexusctl.py create")

# Starts the Docker containers using Docker Compose
[no-cd]
start:
    @print("Starting Docker containers..."); __import__('os').system("python3 ./docker/nexusctl.py start")

# Stops the Docker containers using Docker Compose
[no-cd]
stop:
    @print("Stopping Docker containers..."); __import__('os').system("python3 ./docker/nexusctl.py stop")

# Deletes all Docker volumes related to the project using Docker Compose
[no-cd]
clean:
    @print("Deleting Docker volumes..."); __import__('os').system("python3 ./docker/nexusctl.py delete")
3 changes: 3 additions & 0 deletions docker/docker-compose-nollama.yaml
@@ -0,0 +1,3 @@
include:
- ./sui/compose.yaml
- ./nexus/compose.yaml
4 changes: 4 additions & 0 deletions docker/docker-compose.yaml
@@ -0,0 +1,4 @@
include:
- ./sui/compose.yaml
- ./ollama/compose.yaml
- ./nexus/compose.yaml
29 changes: 29 additions & 0 deletions docker/nexus/Dockerfile
@@ -0,0 +1,29 @@
#syntax=docker/dockerfile:1

FROM python:3.10-slim AS builder

ARG INSTALL_RUST=false

ENV INSTALL_RUST=${INSTALL_RUST}

WORKDIR /app

RUN ls -lta

COPY . .

COPY --from=nexus bin/setup_venv.sh /usr/local/bin/setup_venv.sh

RUN chmod +x /usr/local/bin/setup_venv.sh

RUN /usr/local/bin/setup_venv.sh

FROM python:3.10-slim AS runtime

WORKDIR /app

COPY --from=builder /app /app

EXPOSE 8080

CMD ["bash", "-c", "source .venv/bin/activate && uvicorn src.nexus_tools.server.main:app --host 0.0.0.0 --port 8080"]
82 changes: 82 additions & 0 deletions docker/nexus/bin/bootstrap_model.py
@@ -0,0 +1,82 @@
# Import necessary modules
import json
from pathlib import Path
from nexus_sdk import get_sui_client_with_airdrop, create_node, create_model
import os

shared_dir = Path(os.getenv("SHARED_DIR", "."))
package_id_file = Path(shared_dir) / "package_id.json"
keystore_path = Path(shared_dir) / "sui.keystore"

rpc_url = os.getenv("RPC_URL", "http://localhost:9000")
ws_url = os.getenv("WS_URL", "ws://localhost:9000")
faucet_url = os.getenv("FAUCET_URL", "http://localhost:5003/gas")


# Decoupled function to create node and model and save details to a file.
def create_and_save_node_and_model(client, package_id):
    node_id = create_example_node(client, package_id)
    llama_id, llama_owner_cap_id = create_llama_model(client, package_id, node_id)

    # Save the node details to a JSON file
    shared_dir = Path(os.getenv("SHARED_DIR", "."))
    shared_dir.mkdir(parents=True, exist_ok=True)
    node_details = {
        "node_id": node_id,
        "llama_id": llama_id,
        "llama_owner_cap_id": llama_owner_cap_id,
    }
    with open(shared_dir / "node_details.json", "w") as f:
        json.dump(node_details, f, indent=4)

    return node_id, llama_id, llama_owner_cap_id


# Creates a new node owned object.
def create_example_node(client, package_id):
    node_id = create_node(client, package_id, "LocalNode", "CPU", 16)
    if not node_id:
        raise Exception("Failed to create node")
    return node_id


# Creates llama model representation on chain.
# Returns the model ID and the model owner capability ID.
def create_llama_model(client, package_id, node_id):
    model_id, model_owner_cap_id = create_model(
        client=client,
        package_id=package_id,
        node_id=node_id,
        name="llama3.2:1b",
        model_hash=b"llama3.2_1b_hash",
        url=os.getenv("MODEL_URL", "http://localhost:11434"),
        token_price=1000,
        capacity=1000000,
        num_params=1000000000,
        description="llama3.2 1b",
        max_context_length=8192,
        is_fine_tuned=False,
        family="Llama3.2",
        vendor="Meta",
        is_open_source=True,
        datasets=["test"],
    )
    if not model_id:
        raise Exception("Failed to create model")
    return model_id, model_owner_cap_id


if __name__ == "__main__":

    client = get_sui_client_with_airdrop(
        rpc_url=rpc_url,
        ws_url=ws_url,
        faucet_url=faucet_url,
        keystore_path=keystore_path,
    )
    with open(package_id_file, "r") as f:
        package_id_list = json.load(f)
        package_id = package_id_list[0]

    create_and_save_node_and_model(client, package_id)
    print("environment prepared successfully")
29 changes: 29 additions & 0 deletions docker/nexus/bin/setup_venv.sh
@@ -0,0 +1,29 @@
#!/bin/sh

if [ "$INSTALL_RUST" = "true" ]; then
    apt-get update
    apt-get install -y --no-install-recommends curl build-essential
    curl https://sh.rustup.rs -sSf | sh -s -- -y
    . $HOME/.cargo/env
    rustup update
    rustup default stable
fi

pip install uv
uv venv -p "$PYTHON_VERSION"
export OSTYPE=${OSTYPE:-linux-gnu}
. .venv/bin/activate

if [ "$INSTALL_RUST" = "true" ]; then
    . $HOME/.cargo/env
fi

if [ -f "pyproject.toml" ]; then
    uv pip install .
else
    for dir in */; do
        if [ -f "$dir/pyproject.toml" ]; then
            uv pip install "$dir"
        fi
    done
fi
83 changes: 83 additions & 0 deletions docker/nexus/bin/start_events.py
@@ -0,0 +1,83 @@
import os
import json
import subprocess
from pathlib import Path

# Set paths
shared_dir = Path(os.getenv("SHARED_DIR", "."))
keystore_path = Path(shared_dir) / "sui.keystore"

# Extract details from JSON files
package_id_path = Path(shared_dir) / "package_id.json"
node_details_path = Path(shared_dir) / "node_details.json"


rpc_url = os.getenv("RPC_URL", "http://localhost:9000")
ws_url = os.getenv("WS_URL", "ws://localhost:9000")
tool_url = os.getenv("TOOL_URL", "http://0.0.0.0:8080/tool/use")

# Load package ID
try:
    with open(package_id_path, "r") as f:
        package_id = json.load(f)[0]
except (FileNotFoundError, IndexError, json.JSONDecodeError) as e:
    print(f"Error: Unable to load package ID from {package_id_path}. Details: {e}")
    exit(1)

# Load node details
try:
    with open(node_details_path, "r") as f:
        node_details = json.load(f)
    model_owner_cap_id = node_details.get("llama_owner_cap_id")
except (FileNotFoundError, json.JSONDecodeError) as e:
    print(f"Error: Unable to load node details from {node_details_path}. Details: {e}")
    exit(1)

if not model_owner_cap_id:
    print("Error: Model owner capability ID is missing.")
    exit(1)

# Load SUI private key from keystore JSON
try:
    with open(keystore_path, "r") as f:
        keys = json.load(f)
        if not keys:
            raise ValueError(
                "Sui keystore file is empty. Please check your Sui configuration."
            )
        private_key = keys[0]  # Assuming the first key is used
except (FileNotFoundError, json.JSONDecodeError, ValueError) as e:
    print(f"Error: Unable to load SUI private key from {keystore_path}. Details: {e}")
    exit(1)

# Set environment variables
os.environ["PACKAGE_ID"] = package_id
os.environ["SUI_PRIVATE_KEY"] = private_key
os.environ["MODEL_OWNER_CAP_ID"] = model_owner_cap_id

# Command to run the Python script
command = [
    "python",
    "events/src/nexus_events/sui_event.py",
    "--packageid",
    package_id,
    "--privkey",
    private_key,
    "--modelownercapid",
    model_owner_cap_id,
    "--rpc",
    rpc_url,
    "--ws",
    ws_url,
    "--toolurl",
    tool_url,  # New argument for tool URL
]

print(f"Running command: {' '.join(command)}")

# Execute the command
try:
    subprocess.run(command, check=True)
except subprocess.CalledProcessError as e:
    print(f"Error: Failed to execute command. Details: {e}")
    exit(1)