
[Bug] Translation example : Connection to UI failed #1382

Open
6 tasks
ShankarRIntel opened this issue Jan 11, 2025 · 4 comments
Assignees
Labels
bug Something isn't working

Comments

@ShankarRIntel

Priority

Undecided

OS type

Ubuntu

Hardware type

Xeon-GNR

Installation method

  • Pull docker images from hub.docker.com
  • Build docker images from source

Deploy method

  • Docker compose
  • Docker
  • Kubernetes
  • Helm

Running nodes

Single Node

What's the version?

1.1

Description

I was not able to bring up the servers; errors are being thrown when testing and connecting to the UI.
translation_errors.log

Reproduce steps

As in the documentation

Raw log

No response

ShankarRIntel added the bug label on Jan 11, 2025
@xiguiw
Collaborator

xiguiw commented Jan 13, 2025

@ShankarRIntel

From the log, the llm_endpoint is not set correctly.

GenAIExamples has been under refactoring recently, so the documentation and environment variables may not be up to date.

Please let me know how you set the environment variables.

translation-gaudi-backend-server  |   File "/usr/local/lib/python3.11/site-packages/prometheus_fastapi_instrumentator/middleware.py", line 172, in __call__
llm-tgi-gaudi-server              |     return AsyncOpenAI(api_key=OPENAI_API_KEY, base_url=llm_endpoint + "/v1", timeout=600, default_headers=headers)
llm-tgi-gaudi-server              |                                                         ~~~~~~~~~~~~~^~~~~~~
llm-tgi-gaudi-server              | TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
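
As a quick sanity check, you can confirm whether the endpoint variable actually reached the container (this is just a sketch: the container name is taken from your log, and exactly which variable the service reads depends on the compose file in use):

docker exec llm-tgi-gaudi-server env | grep -i endpoint
echo $TGI_LLM_ENDPOINT   # on the host, before running docker compose up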

@ShankarRIntel
Author

ShankarRIntel commented Jan 13, 2025 via email

@xiguiw
Collaborator

xiguiw commented Jan 14, 2025

@ShankarRIntel

I tried the wrong example (ChatQnA) a few hours ago.

After trying Translation, I could not reproduce your issue.
I ran Translation on Gaudi successfully.

Please share the steps you followed so that I can reproduce this issue.

> Can I please know how I can set it? Is it in the documentation?

Here are the steps to set the environment variables:

https://github.com/opea-project/GenAIExamples/tree/main/Translation/docker_compose/intel/hpu/gaudi#setup-environment-variables

Set up other environment variables:

cd ../../../
source set_env.sh

Please check TGI_LLM_ENDPOINT in your environment as follows:

echo $TGI_LLM_ENDPOINT 
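
For reference, the whole sequence looks roughly like this (a sketch: the host_ip and HUGGINGFACEHUB_API_TOKEN exports are assumptions based on how other OPEA examples are configured, so follow the linked README for the authoritative list):

# starting from Translation/docker_compose/intel/hpu/gaudi
export host_ip=$(hostname -I | awk '{print $1}')   # assumption: set_env.sh derives the endpoint from host_ip
export HUGGINGFACEHUB_API_TOKEN=<your token>       # assumption: needed to pull the model
cd ../../../
source set_env.sh
echo $TGI_LLM_ENDPOINT   # should print a non-empty URL; if it is empty, the backend fails as in your log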

@xiguiw
Collaborator

xiguiw commented Jan 14, 2025

@ShankarRIntel

I unset TGI_LLM_ENDPOINT and did not get an error log like yours.
If I run source set_env.sh, I get the result successfully on the console.

Please:

  1. Update your GenAIExamples and GenAIComps code to the latest.
  2. Follow the README to build the Docker images and try again.

If you still run into the issue, please list every step you did, so that I can reproduce it on my side.
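
Something like the following should cover both steps (a sketch; adjust the paths to wherever you cloned the repositories):

# 1. update both repos to the latest code
cd GenAIExamples && git pull && cd ..
cd GenAIComps && git pull && cd ..

# 2. rebuild the images per the README, then restart the Translation stack
cd GenAIExamples/Translation/docker_compose/intel/hpu/gaudi
docker compose down
docker compose up -d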
