Instrumentation cookbook (#583)
* Update cookbooks

* Fix

* Fix

* Add

* Update index

* resolve comments

* casing

---------

Co-authored-by: Jack Gerrits <jack@jackgerrits.com>
Co-authored-by: Jack Gerrits <jackgerrits@users.noreply.github.com>
3 people authored Sep 23, 2024
1 parent a72ebae commit 0172c71
Showing 4 changed files with 59 additions and 2 deletions.
@@ -0,0 +1,35 @@
# Instrumenting your code locally

AutoGen supports instrumenting your code using [OpenTelemetry](https://opentelemetry.io). This allows you to collect traces and logs from your code and send them to a backend of your choice.

While debugging, you can use a local backend such as [Aspire](https://aspiredashboard.com/) or [Jaeger](https://www.jaegertracing.io/). In this guide we will use Aspire as an example.

## Setting up Aspire

Follow the instructions [here](https://learn.microsoft.com/en-us/dotnet/aspire/fundamentals/dashboard/overview?tabs=bash#standalone-mode) to set up Aspire in standalone mode. This will require Docker to be installed on your machine.
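
For reference, the standalone dashboard runs as a single Docker container. A command along the following lines worked at the time of writing; the image tag and port mappings may have changed, so check the linked documentation for the current values:

```bash
docker run --rm -it -p 18888:18888 -p 4317:18889 --name aspire-dashboard \
    mcr.microsoft.com/dotnet/aspire-dashboard:8.0.0
```

The dashboard UI is then served at `http://localhost:18888` (it prints a login token to the container logs), and the OTLP endpoint listens on `http://localhost:4317`.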

## Instrumenting your code

Once the dashboard is running, the next step is to send traces and logs to it. Follow the steps in the [Telemetry Guide](../guides/telemetry.md) to set up the OpenTelemetry SDK and exporter.

With the Aspire dashboard running and your code instrumented, you should see traces and logs appear in the dashboard as your code runs.
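
As a concrete illustration, a minimal setup along the lines of the Telemetry Guide might look like the sketch below. It assumes the Aspire dashboard's OTLP endpoint is reachable at `localhost:4317` and uses an illustrative service name; it also requires the `opentelemetry-sdk` and `opentelemetry-exporter-otlp-proto-grpc` packages.

```python
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Export spans over OTLP/gRPC to the locally running Aspire dashboard.
exporter = OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True)

# The service name is illustrative; it is how your app shows up in the dashboard.
tracer_provider = TracerProvider(resource=Resource.create({"service.name": "my-autogen-app"}))
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(tracer_provider)
```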

## Observing LLM calls using OpenAI

If you are using the OpenAI package, you can observe LLM calls by setting up OpenTelemetry instrumentation for that library. This example uses [opentelemetry-instrumentation-openai](https://pypi.org/project/opentelemetry-instrumentation-openai/).

Install the package:
```bash
pip install opentelemetry-instrumentation-openai
```

Enable the instrumentation:
```python
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
```

Running your code will now send traces, including the LLM calls, to your telemetry backend (Aspire in our case).
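
For example, a minimal script exercising the instrumentation might look like the following sketch. It assumes the tracer provider from the earlier setup is already configured, that `OPENAI_API_KEY` is set in your environment, and that the model name is only illustrative:

```python
from openai import OpenAI
from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Enable instrumentation before making any calls so they are captured as spans.
OpenAIInstrumentor().instrument()

# The client reads OPENAI_API_KEY from the environment; the model name is illustrative.
client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```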

![OpenAI telemetry logs](../../images/open-ai-telemetry-example.png)
@@ -8,6 +8,10 @@ These are the components that are currently instrumented:
## Instrumenting your application
To instrument your application, you will need an SDK and an exporter. You may already have these if your application is already instrumented with OpenTelemetry.

### Clean instrumentation

If you do not have OpenTelemetry set up in your application, you can follow these steps to instrument it.

```bash
pip install opentelemetry-sdk
```
@@ -41,8 +45,25 @@ def configure_oltp_tracing(endpoint: str = None) -> trace.TracerProvider:

Now you can pass the tracer provider when creating your runtime:
```python
# for single threaded runtime
single_threaded_runtime = SingleThreadedAgentRuntime(tracer_provider=tracer_provider)
# or for worker runtime
worker_runtime = WorkerAgentRuntime(tracer_provider=tracer_provider)
```

And that's it! Your application is now instrumented with OpenTelemetry, and you can view your telemetry data in your backend.

### Existing instrumentation

If you already have OpenTelemetry set up in your application, you can pass its tracer provider to the runtime when creating it:
```python
from opentelemetry import trace

# Get the tracer provider from your application
tracer_provider = trace.get_tracer_provider()

# for single threaded runtime
single_threaded_runtime = SingleThreadedAgentRuntime(tracer_provider=tracer_provider)
# or for worker runtime
worker_runtime = WorkerAgentRuntime(tracer_provider=tracer_provider)
```
@@ -94,5 +94,6 @@ cookbook/openai-assistant-agent
cookbook/langgraph-agent
cookbook/llamaindex-agent
cookbook/local-llms-ollama-litellm
cookbook/instrumenting
```