
[Standalone/offline] - wren-ai-service #1249

Open
D31000 opened this issue Jan 29, 2025 · 2 comments
Labels
bug Something isn't working

Comments


D31000 commented Jan 29, 2025

Describe the bug
The wren-ai-service container tries to pull the embedder model in an offline environment, even though the Ollama embedder is hosted on another server.

To Reproduce
Steps to reproduce the behavior:

  1. Run docker-compose up
  2. Check the logs of the wren-ai-service container

Expected behavior
The wren-ai-service container runs successfully ;)

Desktop (please complete the following information):

  • OS: Red Hat

Wren AI Information

  • wren-ai-service : 0.14.3
  • wren-ui : 0.19.1 / dev-20250102-v1
  • wren-engine-ibis : 0.13.1
  • wren-engine : 0.13.1

Additional context
I'm in a totally offline environment (PoC).

One server runs Windows Server 2019 for Ollama.
ollama --version : 0.5.4
With models:
nomic-embed-text (I don't remember exactly, but I pulled this one in November 2024)
llama3.2

Another server runs Red Hat with Docker for WrenAI.

I already tested the Ollama embedding endpoint from the Red Hat Docker server, and the response looks OK:
curl http://179.111.28.236:11434/api/embeddings -d '{ "model": "nomic-embed-text", "prompt": "The sky is blue because of Rayleigh scattering" }'

Same for llama3.2:
curl http://179.111.28.236:11434/api/generate -d '{ "model": "llama3.2", "prompt": "Why is the sky blue?", "stream": false }'

I'm not using the launcher because I saw it pulls some files from GitHub, and in an offline environment I can't do that.
I pulled the images on another computer and transferred them by USB into my environment (docker save / docker load).

I checked the logs from this post: #1127 (where the wren-ai-service container looks fine)

and saw a different type of config for the embedder:

type: embedder
provider: ollama_embedder
models:
  - model: nomic-embed-text
    dimension: 768
api_base: http://192.168.6.190:11434
timeout: 6200

In your config example:

type: embedder
provider: ollama_embedder
models:
  - model: nomic-embed-text  # put your ollama embedder model name here
url: http://host.docker.internal:11434  # change this to your ollama host, url should be <ollama_url>
timeout: 120

In my tests, if I don't set dimension I get an error, but that may be a separate problem...
With api_base I get yet another error :)
I mixed both to get further in the execution of this container,
but maybe that's not the right way.
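For reference, this is the merged config I ended up with. It is a sketch only: whether ollama_embedder accepts url or api_base, and whether dimension is required, depends on the wren-ai-service version, so check these fields against the config.example.yaml for your release.

```yaml
type: embedder
provider: ollama_embedder
models:
  - model: nomic-embed-text
    dimension: 768              # nomic-embed-text outputs 768-dim vectors
url: http://192.168.6.190:11434  # remote Ollama host (not host.docker.internal,
                                 # since Ollama runs on a separate server)
timeout: 6200
```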

I saw here (#512) that attempting the download is the normal behavior, but in an offline environment this is not possible for me. Is there a way to skip this download?

Relevant log output

.env.log
config.yaml.log
docker-compose.yaml.log

wrenai-ibis-server.log
wrenai-wren-ai-service.log
wrenai-wren-engine.log
wrenai-wren-ui.log

Thanks for your help!

D31000 added the bug (Something isn't working) label on Jan 29, 2025

D31000 commented Jan 31, 2025

Hello,

I managed to get a little further.
I updated wren-ai-service from 0.14.3 to 0.15.5 and now I have a new error.

I0131 10:59:43.691 10 wren-ai-service:24] Using Engine: wren_ui
I0131 10:59:43.699 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:44.023 10 wren-ai-service:135] Loading Helpers for DB Schema Indexing Pipeline: src.pipelines.indexing.utils
I0131 10:59:44.025 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:44.323 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:44.628 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
W0131 10:59:44.937 10 wren-ai-service:129] SQL pairs file not found: pairs.json
I0131 10:59:44.939 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:45.244 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:45.564 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:46.202 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:46.508 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
I0131 10:59:46.822 10 wren-ai-service:367] Using Qdrant Document Store with Embedding Model Dimension: 512
ERROR:    Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 693, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
    async with original_context(app) as maybe_original_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/src/__main__.py", line 32, in lifespan
    app.state.service_container = create_service_container(pipe_components, settings)
                                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/src/globals.py", line 112, in create_service_container
    **pipe_components["sql_generation_reasoning"],
      ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^
KeyError: 'sql_generation_reasoning'

ERROR:    Application startup failed. Exiting.
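As an illustration of what the traceback shows: globals.py unpacks pipe_components["sql_generation_reasoning"], so if the pipelines built from config.yaml do not include a pipe with that name, the lookup raises KeyError before the app finishes starting. The sketch below uses hypothetical component values; only the key name and the failing line come from the log above.

```python
# Minimal reproduction of the startup failure mechanism (hypothetical data).
def create_service_container(pipe_components: dict) -> dict:
    # Mirrors the failing line in /src/globals.py:
    #     **pipe_components["sql_generation_reasoning"]
    return {**pipe_components["sql_generation_reasoning"]}

# Components built from an older config.yaml that predates this pipeline:
old_config_components = {"sql_generation": {"llm": "litellm_llm.llama3.2"}}

try:
    create_service_container(old_config_components)
except KeyError as exc:
    print(f"KeyError: {exc}")  # same KeyError as in the service log
```

In other words, the config file shipped for 0.14.3 is missing pipeline entries that 0.15.5 expects.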

The Qdrant log looks good.

Please find the full logs:

wrenai-wren-ai-service.log
qdrant.log

Thanks for your help!

cyyeh (Member) commented Jan 31, 2025

@D31000 With new versions of Wren AI there may be new pipelines, so you need to take the latest pipeline definitions from config.example.yaml for the version of Wren AI you are running.

For example, the latest version of Wren AI: https://github.com/Canner/WrenAI/blob/0.15.2/docker/config.example.yaml
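To illustrate the kind of entry that is missing: the KeyError names sql_generation_reasoning, so the pipeline section of config.yaml needs a pipe with that name. The field layout below follows the config.example.yaml pattern, but the exact fields (and the model id shown) are assumptions; copy the real entry from the config.example.yaml matching your version.

```yaml
type: pipeline
pipes:
  # ...existing pipes from your current config.yaml...
  - name: sql_generation_reasoning
    llm: litellm_llm.llama3.2   # hypothetical model id; use your configured LLM
```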
