[BUG] Phoenix throwing protobuf error upon import #2695
Hey @brycecf! Great to see you back here. Based on the error message you're encountering, there appears to be a compatibility issue with the protobuf version in your environment. The error suggests either regenerating the code with protoc >= 3.19.0 or downgrading the protobuf package to 3.20.x or lower. Given the versions of the packages you're using, it's possible that there's a conflict between these packages or a missing dependency. Here are some steps you can take to resolve this issue:
If the issue remains unresolved after following these steps, it might be helpful to consult the documentation or support forums for the specific packages involved for more targeted advice. The exact protobuf version in use is not specified in your message. I hope this helps! If you have any other questions or need further clarification, feel free to ask.
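As a first diagnostic (not part of the original comment; the package list below is an assumption drawn from the versions mentioned elsewhere in this thread), you can confirm which versions are actually installed:

```python
# Print the installed versions of the packages discussed in this thread so a
# protobuf constraint conflict can be spotted quickly.
from importlib.metadata import PackageNotFoundError, version

for pkg in (
    "protobuf",
    "opentelemetry-proto",
    "arize-phoenix",
    "llama-index-core",
    "openinference-instrumentation-llama-index",
):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```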
Thanks for the report, @brycecf! We'll dive in over the next few days.
I also have the same problem here. I am forced to set the value shown at line 42 in f697811. This issue is related to protocolbuffers/protobuf#10051. Let me know when there is some news or if I can provide more details!
Is protobuf v5+ installed? Can it be downgraded to v4? The call stack trace above indicates which package is causing the problem.
- Downgrade to protobuf <5.0
- Downgrade to protobuf <4.0
Thanks, @MartinoMensio! I'm glad you got it to work. Incidentally, I'm not really sure what the root cause is.
Since this has been shown to be a third-party issue, I'm closing this ticket for now. The workaround indicated above is to install protobuf <4.
After further investigation, I managed to get it working with protobuf v4 as well. The main problem was with the version of one of the opentelemetry packages.
And now I am able to run with protobuf v4 without the exception above.
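For anyone verifying a similar fix, the failure typically surfaces when the generated modules shipped by opentelemetry-proto are imported. A minimal check along these lines (an illustrative sketch, not code from this thread) confirms whether the installed combination is compatible:

```python
# Importing a generated *_pb2 module from opentelemetry-proto raises a
# TypeError such as "Descriptors cannot be created directly" when the
# installed protobuf major version is newer than the one the module was
# generated against.
try:
    from opentelemetry.proto.trace.v1 import trace_pb2  # noqa: F401
    print("opentelemetry-proto generated code imports cleanly")
except TypeError as exc:
    print(f"protobuf / generated-code mismatch: {exc}")
```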
Thanks, @MartinoMensio, for the detailed investigation.
The source code of opentelemetry-proto 1.11.1 shows that it only requires a minimum protobuf version, without an upper bound.
Is this fixed for llama-index-callbacks-arize-phoenix?
@dhirajsuvarna sorry to hear you are facing issues. llama-index-callbacks-arize-phoenix is just a wrapper around https://pypi.org/project/openinference-instrumentation-llama-index/, so trying some of the above steps around the OTel dependencies would be my first line of investigation. We keep our dependencies pretty relaxed so that they don't conflict with any pre-existing dependencies.
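For reference, the one-click integration that this wrapper enables usually amounts to something like the sketch below (an assumption based on the LlamaIndex 0.10 global-handler API, not code quoted from this thread):

```python
import phoenix as px
from llama_index.core import set_global_handler

# Start the local Phoenix app, then route LlamaIndex traces to it through the
# arize_phoenix global handler provided by llama-index-callbacks-arize-phoenix.
px.launch_app()
set_global_handler("arize_phoenix")
```

In this setup the protobuf error is triggered by the very first line, which is why sorting out the protobuf/OTel versions matters before anything else runs.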
Well, I made the following changes and got it working:
These changes got everything working.
Does storing these pinned dependencies in a requirements.txt or poetry lock not work? I don't think llama-index-callbacks-arize-phoenix actually has these deps, but I'll check the llama-index repo on Monday.
@mikeldking - Well, I am using poetry, so theoretically it would be possible to pin the versions there. In my opinion, let's spend some time finding the root cause before we go down that route.
@dhirajsuvarna in general I think it's better for you to pin the OTel versions directly in your requirements rather than rely on the version range specified by Phoenix. Phoenix cannot have hard dependencies on specific versions of OpenTelemetry since it needs to support various instrumentations and client environments. I would recommend a setup like this to specify your dependencies; it will give you the deterministic environment you're looking for: https://github.com/Arize-ai/openinference/blob/85d6c0a3e148fcc50018873e06a9dbd38f5d5b06/python/examples/llama-index/backend/pyproject.toml
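To make that concrete, a pinned requirements.txt could look something like the sketch below. The package names come from this thread and the protobuf<4 bound is the workaround mentioned above; all other pins are left to whatever your own environment resolves to:

```
# Illustrative requirements.txt sketch (not an official recommendation).
arize-phoenix
llama-index-core
llama-index-callbacks-arize-phoenix
openinference-instrumentation-llama-index
protobuf<4
# For a fully deterministic setup, also pin the opentelemetry-* packages your
# environment resolves to, e.g. by copying the exact versions from pip freeze.
```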
Looks like this issue will be fixed very soon: open-telemetry/opentelemetry-python#3931 (comment)
Describe the bug
Running the one-click Arize Phoenix integration with LlamaIndex results in the following protobuf error:
To Reproduce
Steps to reproduce the behavior:
import phoenix as px
Expected behavior
Import should be successful.
Environment (please complete the following information):
- macOS Sonoma 14.3.1
- jupyterlab=4.1.5
- Safari 17.3.1
- arize-phoenix=3.17.1
- llama-index-core=0.10.22
- llama-index-callbacks-arize-phoenix=0.1.4
- openinference-instrumentation-llama-index=1.2.1
Additional context