
feat(langchain): improve callbacks #1426

Merged 13 commits into traceloop:main on Jul 4, 2024

Conversation

tibor-reiss (Contributor) commented Jul 1, 2024

Builds on top of #1317, which should be merged first:

  • Added (a)stream and (a)transform to wrapped function list.

  • Removed "instance" logic.

  • I have added tests that cover my changes.

  • If adding a new instrumentation or changing an existing one, I've added screenshots from some observability platform showing the change.

  • PR name follows conventional commits format: feat(instrumentation): ... or fix(instrumentation): ....

  • (If applicable) I have updated the documentation accordingly.
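The first change — adding (a)stream and (a)transform to the wrapped function list — can be sketched roughly as below. This is an illustration only: `SYNC_FUNCTIONS`, `FakeRunnable`, and `_wrap` are invented names for the sketch, not the actual openllmetry implementation.

```python
import functools

# Illustrative method lists; the instrumentation patches methods like these
# on LangChain Runnable classes (async variants would be wrapped similarly).
SYNC_FUNCTIONS = ["invoke", "stream", "transform"]
ASYNC_FUNCTIONS = ["ainvoke", "astream", "atransform"]

class FakeRunnable:
    """Stand-in for a LangChain Runnable, used only for this sketch."""
    def invoke(self, value, config=None):
        return value * 2

    def stream(self, value, config=None):
        yield value
        yield value + 1

    def transform(self, chunks, config=None):
        for chunk in chunks:
            yield chunk * 10

calls = []

def _wrap(method):
    """Record the call (where a span would be started), then delegate."""
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        calls.append(method.__name__)
        return method(self, *args, **kwargs)
    return wrapper

for name in SYNC_FUNCTIONS:
    setattr(FakeRunnable, name, _wrap(getattr(FakeRunnable, name)))

r = FakeRunnable()
assert r.invoke(3) == 6
assert list(r.stream(1)) == [1, 2]
assert list(r.transform(iter([2, 3]))) == [20, 30]
assert calls == ["invoke", "stream", "transform"]
```

Note that for generator methods like stream/transform the wrapper fires when the generator is created, which is why streaming methods need explicit entries in the wrapped list rather than relying on invoke alone.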

@tibor-reiss tibor-reiss changed the title Langchain callbacks streaming feat(langchain): improve callbacks Jul 1, 2024
nirga (Member) commented Jul 3, 2024

@tibor-reiss it's not that the OpenAI span is not generated - it's generated on a separate trace, which is a problem. It should be on the same trace as the rest of the LangChain spans:

[Screenshot 2024-07-03 at 21:39:36]
[Screenshot 2024-07-03 at 21:39:46]

tibor-reiss (Contributor, Author) commented Jul 3, 2024

> @tibor-reiss it's not that the OpenAI span is not generated - it's generated on a separate trace, which is a problem. It should be on the same trace as the rest of the LangChain spans:

Hi @nirga, could you please share your code which resulted in this?

I checked several places and noticed that this already seems to be broken in main, e.g. in test_lcel / test_streaming (I renamed this test to test_invoke in this PR).

The issue seems to be that BaseChatModel.generate is called, which is still in the non-callback world. It has a different signature (no config in 2nd place), so adding it to SYNC_FUNCTIONS is not enough - it probably needs separate logic to use its 3rd argument, callbacks. I'll look into it asap...
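A hedged sketch of what that separate logic could look like. Everything here is an assumption for illustration: `inject_callbacks`, `TracingHandler`, and the toy `generate`/`invoke` signatures are invented, not the actual LangChain or openllmetry API. The idea is that a wrapper inspects the target's signature and injects the tracing handler either via a direct `callbacks` parameter (the `BaseChatModel.generate` style) or inside `config["callbacks"]` (the Runnable style).

```python
import inspect

class TracingHandler:
    """Stand-in for the instrumentation's callback handler."""

def inject_callbacks(func, args, kwargs, handler):
    """Place the handler wherever the wrapped function expects callbacks."""
    params = inspect.signature(func).parameters
    if "callbacks" in params:
        # generate(messages, stop=None, callbacks=None) style:
        # callbacks is a direct parameter, not nested in config.
        callbacks = list(kwargs.get("callbacks") or [])
        callbacks.append(handler)
        kwargs["callbacks"] = callbacks
    elif "config" in params:
        # invoke(input, config=None) style: callbacks live in the config dict.
        config = dict(kwargs.get("config") or {})
        cbs = list(config.get("callbacks") or [])
        cbs.append(handler)
        config["callbacks"] = cbs
        kwargs["config"] = config
    return args, kwargs

# Toy targets with the two signature shapes discussed above.
def generate(messages, stop=None, callbacks=None):
    return callbacks

def invoke(value, config=None):
    return config["callbacks"]

h = TracingHandler()
_, kw = inject_callbacks(generate, (["hi"],), {}, h)
assert kw["callbacks"] == [h]
_, kw = inject_callbacks(invoke, ("x",), {}, h)
assert kw["config"]["callbacks"] == [h]
```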

nirga (Member) commented Jul 3, 2024

Hmm @tibor-reiss, this might be new since we introduced the callbacks, because it used to work with the previous mechanism

tibor-reiss (Contributor, Author) commented:

> Hmm @tibor-reiss, this might be new since we introduced the callbacks, because it used to work with the previous mechanism

@nirga I thought the same, so I was checking the code from before the callbacks (https://github.com/tibor-reiss/openllmetry/blob/aaa303b9ffe6926d1830c51f911918d570ce2b1c/packages/opentelemetry-instrumentation-langchain/opentelemetry/instrumentation/langchain/__init__.py): the test test_simple_lcel already has this problem, i.e. openai_chat is not part of the workflow trace.

nirga (Member) left a comment


@tibor-reiss I think I'm going to merge this anyway since it's an improvement over what we currently have. Long term, we'll need to figure out how to combine the "previous" way with this one, since it seems that with the callback way of instrumenting we're losing the context across the trace (as I've shown in the example above).
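The lost-context problem being described can be illustrated with a minimal parent-lookup model. This sketch uses contextvars as a stand-in for OpenTelemetry's context mechanism; `Span`, `start_span`, and `current_span` are invented for the illustration. The point: a child span only joins the trace if it is started while the workflow span is active in the same context, which is what breaks when a callback fires outside that context.

```python
import contextvars

# The "active span" slot, analogous to OpenTelemetry's current context.
current_span = contextvars.ContextVar("current_span", default=None)

class Span:
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent

def start_span(name):
    """Child spans inherit whatever span is active in the current context."""
    return Span(name, parent=current_span.get())

# Inside the workflow context: the OpenAI span joins the same trace.
token = current_span.set(start_span("langchain.workflow"))
child = start_span("openai.chat")
assert child.parent.name == "langchain.workflow"
current_span.reset(token)

# After the workflow context is gone (e.g. a callback fired elsewhere):
# the span has no parent and shows up as a new root trace.
orphan = start_span("openai.chat")
assert orphan.parent is None
```

This is why the two screenshots above show the OpenAI span on a separate trace: the callback-based instrumentation starts the span in a context where the LangChain workflow span is no longer (or never was) active.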

@nirga nirga merged commit fd80834 into traceloop:main Jul 4, 2024
8 checks passed
@tibor-reiss tibor-reiss deleted the langchain-callbacks-streaming branch July 4, 2024 19:19