
feat(llmobs): llmobs-specific context manager #10767

Open · wants to merge 3 commits into main from yunkim/llmobs-context
Conversation

@Yun-Kim (Contributor) commented Sep 23, 2024

Summary

Public facing changes:

  • Adds LLMObs.current_span() to the API; it returns the currently active LLMObs-generated span (from either an integration or the SDK). See the usage sketch after this list.
  • Adds LLMObs.current_trace_context() to the API; it returns the current LLMObs context, which can represent either a local span or a distributed span context.
  • Any LLMObs method that accepts an optional span argument now defaults to the currently active LLMObs span rather than the currently active APM span.
  • Adds multithreading support for LLMObs. Previously, multithreaded apps produced broken traces.
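
A hedged usage sketch of the new API, assuming LLMObs has already been enabled (e.g. via LLMObs.enable()); the workflow/annotate calls are existing SDK helpers used here only for illustration:

```python
from ddtrace.llmobs import LLMObs

with LLMObs.workflow(name="my_workflow"):
    # New in this PR: returns the active LLMObs span, skipping any purely
    # APM spans that may be active at this point in the trace.
    span = LLMObs.current_span()

    # Also new: returns the active LLMObs context, which may represent a
    # local span or a context extracted from distributed headers.
    ctx = LLMObs.current_trace_context()

    # Methods with an optional `span` argument now default to the active
    # LLMObs span rather than the active APM span.
    LLMObs.annotate(input_data="question", output_data="answer")
```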

Private changes:

  • LLMObs has its own context provider that keeps track of the active LLMObs span (generated by both LLMObs._start_span() and the LLM integrations).
  • HTTPPropagator now adds the LLMObs parent ID as a field on the request headers directly, rather than through the context object.

Previous behavior

LLMObs spans are built on top of APM spans, but LLMObs parenting only considers other LLMObs spans. Consider a trace containing a mixture of APM-specific and LLMObs spans:

Span A (LLMObs span) --> Span B (APM-specific span) --> Span C (LLMObs span)

LLMObs only cares about the LLMObs spans, so span C's LLMObs parent is span A (the root span), even though its APM parent is span B. Combined with distributed tracing and multithreading, this makes it difficult to determine the "correct" (i.e. LLMObs) parenting tree for traces submitted to LLM Observability.
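
The same structure expressed in code, as a hedged sketch (assuming LLMObs is enabled and using the public workflow/task helpers; span names are illustrative):

```python
from ddtrace import tracer
from ddtrace.llmobs import LLMObs

with LLMObs.workflow(name="span_a"):        # Span A: LLMObs span, trace root
    with tracer.trace("span_b"):            # Span B: plain APM span
        with LLMObs.task(name="span_c"):    # Span C: LLMObs span
            # APM parenting:    span_a -> span_b -> span_c
            # LLMObs parenting: span_a -> span_c (span_b is not an LLMObs span)
            pass
```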

Problems with previous approach

Previously we worked around this by traversing the span's local parent tree to find the nearest LLM-type ancestor, on both span start and span finish, for the non-distributed case; for the distributed case, we attached the parent ID to the span context's meta field so it would be propagated in the distributed request headers. However, attaching things to the span context meta was not suitable long-term, for a couple of reasons:

  1. Context objects are not thread-safe: in a multithreaded case with n > 1 child threads creating their own spans, the parent ID stored in the context object could be overwritten during thread execution, incorrectly propagating parent IDs (see the sketch after this list).
  2. Context objects store trace-specific information and are not designed for our use case, where we skip over spans here and there in the trace. This also led to edge cases that were handled with ugly workaround code. Example: any meta field set on the context object gets propagated as a span tag on all subsequent spans in the trace at span start time, except for the spans in the first service of a trace, which get it propagated at span finish time. Fixing this meant overriding these span tags on span start and adding more checks on span finish.
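
To make the thread-safety issue concrete, a hedged sketch (the fan-out shape and span names are illustrative, not from this PR): each worker thread creates its own LLMObs span, so a single parent ID stashed in a shared context meta field can be clobbered by whichever thread writes last.

```python
import concurrent.futures
from ddtrace.llmobs import LLMObs

def worker(i):
    # Each thread's span must parent to the LLMObs span that is active in
    # *this* thread's execution context, not to whatever another thread last
    # wrote into a shared context meta field.
    with LLMObs.task(name=f"task_{i}"):
        ...

with LLMObs.workflow(name="fan_out"):
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        list(pool.map(worker, range(4)))
```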

Current approach

Instead of depending on a Context object that doesn't quite fit our use case and bending it until it does, we simply keep track of our own active LLMObs span/context:

  • LLMObsContextProvider keeps track of the current active LLMObs span via active() and activate() (see the simplified sketch after this list).
  • Instead of traversing a span's local ancestor tree to solve for a span's LLMObs parent ID, we just use LLMObsContextProvider._activate_llmobs_span() and set the LLMObs parent ID as a tag at span start time (called by LLMObs._start_span(), BaseLLMIntegration.trace(submit_to_llmobs=True), and the bedrock integration).
  • LLMObs.inject_distributed_headers() now uses the LLMObsContextProvider to inject the active LLMObs span's ID into request headers.
  • LLMObs.activate_distributed_headers() now uses the LLMObsContextProvider to activate the extracted LLMObs context and continue the trace in the distributed case.
  • trace_utils.activate_distributed_headers() now includes automatic LLMObs context activation if LLMObs is enabled. I've config-gated this so that LLMObs is only imported for LLMObs users (the same applies to HTTPPropagator.inject()).
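
A simplified sketch of what the provider does. The real LLMObsContextProvider lives in ddtrace/llmobs/_context.py and builds on the tracer's context-provider machinery, so the details below (contextvars storage, class and variable names) are assumptions for illustration only:

```python
import contextvars

# Per-execution-context storage for the active LLMObs span or distributed context.
_llmobs_active = contextvars.ContextVar("llmobs_active_span", default=None)

class SketchLLMObsContextProvider:
    """Tracks the active LLMObs span (or distributed context) per thread/task."""

    def activate(self, span_or_ctx):
        # Called when an LLMObs span starts (LLMObs._start_span(), integrations)
        # or when a distributed LLMObs context is extracted from headers.
        _llmobs_active.set(span_or_ctx)

    def active(self):
        # contextvars provides thread- and task-local storage, which is what
        # makes the multithreaded case work without a shared meta field.
        return _llmobs_active.get()
```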

By keeping track of our own active LLMObs spans, spans submitted to LLM Observability get an independent set of span and parent IDs, even though span and trace IDs are still shared with APM spans for now. This is the first step toward decoupling from tracer internals.
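
For context, a hedged end-to-end sketch of the distributed case using the public SDK methods mentioned above (the service layout, URL, and span names are illustrative):

```python
import requests
from ddtrace.llmobs import LLMObs

# Upstream service: inject the active LLMObs span's IDs into outgoing headers.
with LLMObs.workflow(name="upstream_work"):
    headers = {}
    LLMObs.inject_distributed_headers(headers)
    requests.post("http://downstream.example/run", headers=headers)

# Downstream service (inside its request handler): activate the extracted
# LLMObs context so new LLMObs spans parent to the upstream LLMObs span.
def handle_request(request_headers):
    LLMObs.activate_distributed_headers(request_headers)
    with LLMObs.task(name="downstream_work"):
        ...
```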

TODO: add span/trace ID field for APM correlation

Next steps

We can go further by generating LLMObs-specific span/trace IDs which are separate from APM. This will solve some edge cases with traces involving mixed APM/LLMObs spans.

Checklist

  • PR author has checked that all the criteria below are met
  • The PR description includes an overview of the change
  • The PR description articulates the motivation for the change
  • The change includes tests OR the PR description describes a testing strategy
  • The PR description notes risks associated with the change, if any
  • Newly-added code is easy to change
  • The change follows the library release note guidelines
  • The change includes or references documentation updates if necessary
  • Backport labels are set (if applicable)

Reviewer Checklist

  • Reviewer has checked that all the criteria below are met
  • Title is accurate
  • All changes are related to the pull request's stated goal
  • Avoids breaking API changes
  • Testing strategy adequately addresses listed risks
  • Newly-added code is easy to change
  • Release note makes sense to a user of the library
  • If necessary, author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
  • Backport labels are set in a manner that is consistent with the release branch maintenance policy

github-actions bot commented Sep 23, 2024

CODEOWNERS have been resolved as:

ddtrace/llmobs/_context.py                                              @DataDog/ml-observability
ddtrace/contrib/internal/futures/threading.py                           @DataDog/apm-core-python @DataDog/apm-idm-python
ddtrace/contrib/trace_utils.py                                          @DataDog/apm-core-python @DataDog/apm-idm-python
ddtrace/llmobs/_constants.py                                            @DataDog/ml-observability
ddtrace/llmobs/_integrations/base.py                                    @DataDog/ml-observability
ddtrace/llmobs/_integrations/bedrock.py                                 @DataDog/ml-observability
ddtrace/llmobs/_llmobs.py                                               @DataDog/ml-observability
ddtrace/llmobs/_trace_processor.py                                      @DataDog/ml-observability
ddtrace/llmobs/_utils.py                                                @DataDog/ml-observability
ddtrace/propagation/http.py                                             @DataDog/apm-sdk-api-python
tests/llmobs/test_llmobs_service.py                                     @DataDog/ml-observability
tests/llmobs/test_llmobs_trace_processor.py                             @DataDog/ml-observability
tests/llmobs/test_propagation.py                                        @DataDog/ml-observability
tests/tracer/test_propagation.py                                        @DataDog/apm-sdk-api-python


pr-commenter bot commented Sep 23, 2024

Benchmarks

Benchmark execution time: 2024-09-23 22:44:27

Comparing candidate commit 191a0d0 in PR branch yunkim/llmobs-context with baseline commit 33daba9 in branch main.

Found 0 performance improvements and 0 performance regressions! Performance is the same for 356 metrics, 48 unstable metrics.

@Yun-Kim Yun-Kim marked this pull request as ready for review September 23, 2024 22:13
@Yun-Kim Yun-Kim requested review from a team as code owners September 23, 2024 22:13