
🐛 Bug Report: exception thrown with LlamaIndex new workflow API #2394

Open
1 task done
nirga opened this issue Dec 12, 2024 · 1 comment
Labels
bug Something isn't working

Comments

nirga (Member) commented Dec 12, 2024

Which component is this bug for?

Llamaindex Instrumentation

📜 Description

When using the new Workflow API, an exception is thrown:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.11/site-packages/opentelemetry/context/__init__.py", line 152, in detach
    _RUNTIME_CONTEXT.detach(token)
  File "/app/.venv/lib/python3.11/site-packages/opentelemetry/context/contextvars_context.py", line 50, in detach
    self._current_context.reset(token)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: <Token var=<ContextVar name='current_context' default={} at 0x7bd2c6f07470> at 0x7bd1b27cbf40> was created in a different Context
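For context, the underlying error class can be reproduced with the standard library alone: `ContextVar.reset` raises exactly this `ValueError` when the token being reset was created in a different `contextvars.Context` than the one doing the reset. This is a hedged sketch of the mechanism (not the actual instrumentation code path); the variable name and values are illustrative only. The likely analogue here is an OTel context attached in one async execution context of the workflow and detached in another.

```python
import contextvars

# Illustrative stand-in for OTel's internal context variable.
var = contextvars.ContextVar("current_context", default={})

def set_and_return_token():
    # The token is created inside the copied Context...
    return var.set({"span": "workflow-step"})

token = contextvars.copy_context().run(set_and_return_token)

# ...but reset is attempted from the original Context, mirroring a detach
# that runs in a different async context than the matching attach.
try:
    var.reset(token)
except ValueError as exc:
    # ValueError: <Token ...> was created in a different Context
    print(type(exc).__name__)
```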

👟 Reproduction steps

👍 Expected behavior

👎 Actual Behavior with Screenshots

🤖 Python Version

No response

📃 Provide any additional context for the Bug.

No response

👀 Have you spent some time to check if this bug has been raised before?

  • I checked and didn't find a similar issue

Are you willing to submit PR?

None

@dosubot dosubot bot added the bug Something isn't working label Dec 12, 2024
FelipeSantos-Ascensus commented Dec 12, 2024

I am not sure if this will help, but I was able to generate trace data using this:

from openinference.instrumentation.llama_index import LlamaIndexInstrumentor
from arize.otel import register

# Set up OTel via the convenience function
tracer_provider = register(
    space_id="your-space-id",          # in-app space settings page
    api_key="your-api-key",            # in-app space settings page
    project_name="your-project-name",  # name this whatever you like
)

# Finish automatic instrumentation
LlamaIndexInstrumentor().instrument(tracer_provider=tracer_provider)

It also supports LangChain and other frameworks:

https://docs.arize.com/arize/llm-tracing/how-to-tracing-manual/set-up-tracing
