
I would like to configure Open Telemetry Console Exporter to write using Json Format #5036

Closed
willsbctm opened this issue Nov 9, 2023 · 12 comments
Labels: help wanted, question

Comments


willsbctm commented Nov 9, 2023

Question

I would like to configure the OpenTelemetry Console Exporter to write its output in JSON format. Is there a way?

Details

App: .NET 7.0 web API
Packages:

  • OpenTelemetry.Exporter.Console Version 1.6.0
  • OpenTelemetry.Extensions.Hosting Version 1.6.0
  • OpenTelemetry.Instrumentation.AspNetCore Version 1.6.0-beta.2
  • OpenTelemetry.Instrumentation.Http Version 1.6.0-beta.2
  • OpenTelemetry.Instrumentation.Runtime Version 1.5.1

Setup

// Logging setup
builder.Logging.ClearProviders();
builder.Logging.AddOpenTelemetry(logging =>
{
    logging.IncludeScopes = true;
    var resourceBuilder = ResourceBuilder
       .CreateDefault()
       .AddService(serviceName);

    logging.SetResourceBuilder(resourceBuilder).AddConsoleExporter(config =>
    {

    });
}).AddConsoleFormatter<CustomFormatter, ConsoleFormatterOptions>();

builder.Services.AddOpenTelemetry()
    // Tracing setup
    .WithTracing(b =>
    {
        b.AddSource(serviceName)
            .ConfigureResource(resource => resource
                .AddService(serviceName: serviceName,
                    serviceVersion: serviceVersion))
            .AddAspNetCoreInstrumentation()
            .AddHttpClientInstrumentation()
            .AddConsoleExporter();
    })
    // Metrics setup
    .WithMetrics(builder => builder
        .AddAspNetCoreInstrumentation()
        .AddRuntimeInstrumentation()
        .AddHttpClientInstrumentation()
        .AddConsoleExporter());

Output:

Activity.TraceId:            4957f2d7c3d7449acbd8a2eee5c7726c
Activity.SpanId:             b507258da46529d6
Activity.TraceFlags:         Recorded
Activity.ActivitySourceName: Microsoft.AspNetCore
Activity.DisplayName:        SampleRoute
Activity.Kind:               Server
Activity.StartTime:          2023-11-09T20:44:17.1290533Z
Activity.Duration:           00:00:00.0463293
Activity.Tags:
    net.host.name: localhost
    net.host.port: 7096
    http.method: GET
    http.scheme: https
    http.target: /SampleRoute
    http.url: https://localhost:7096/SampleRoute
    http.flavor: 2.0
    http.user_agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/119.0.0.0 Safari/537.36 Edg/119.0.0.0
    http.route: SampleRoute
    http.status_code: 200
Resource associated with Activity:
    service.name: MyService
    service.version: 1.0.0
    service.instance.id: eb0de8e3-42e4-4576-823f-71fff55a4b28
    telemetry.sdk.name: opentelemetry
    telemetry.sdk.language: dotnet
    telemetry.sdk.version: 1.6.0

Desired output:

{
    "Activity": {
        "TraceId": "4957f2d7c3d7449acbd8a2eee5c7726c",
        "SpanId": "b507258da46529d6",
        "...": "..."
    }
}
willsbctm added the "question" label Nov 9, 2023
willsbctm changed the title from "Use Console Exporter to write using JSon Formatter" to "I would like to configure Open Telemetry Exporter to write using Json Format" on Nov 9, 2023
willsbctm changed the title from "I would like to configure Open Telemetry Exporter to write using Json Format" to "I would like to configure Open Telemetry Console Exporter to write using Json Format" on Nov 9, 2023
utpilla (Contributor) commented Nov 9, 2023

@willsbctm Currently, ConsoleExporter does not support JSON format for the output. Would you like to contribute to the repo by updating the ConsoleExporter to do that? We have had similar asks before, but we have a few requirements on our end when it comes to updating the ConsoleExporter. Check this discussion: #4548 (comment)

If you aren't interested in doing that, you can always build your own exporter to print the data in any format you like: https://github.com/open-telemetry/opentelemetry-dotnet/tree/main/docs/trace/extending-the-sdk#exporter
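
For reference, a minimal sketch of what such a custom exporter could look like for traces, along the lines of the extending-the-sdk doc. The class name JsonActivityConsoleExporter and the projected fields are illustrative assumptions, not part of the SDK:

using System;
using System.Diagnostics;
using System.Text.Json;
using OpenTelemetry;

// Hypothetical example: a custom trace exporter that prints one JSON object per Activity.
public class JsonActivityConsoleExporter : BaseExporter<Activity>
{
    public override ExportResult Export(in Batch<Activity> batch)
    {
        // Keep the exporter's own Console writes from being captured as telemetry.
        using var scope = SuppressInstrumentationScope.Begin();

        foreach (var activity in batch)
        {
            // Project only the fields needed for the output; serializing Activity
            // directly would drag in parent/context object graphs.
            Console.WriteLine(JsonSerializer.Serialize(new
            {
                TraceId = activity.TraceId.ToString(),
                SpanId = activity.SpanId.ToString(),
                Name = activity.DisplayName,
                Kind = activity.Kind.ToString(),
                StartTimeUtc = activity.StartTimeUtc,
                Duration = activity.Duration.ToString(),
                Tags = activity.TagObjects,
            }));
        }

        return ExportResult.Success;
    }
}

It would be wired into the tracing setup via a processor, e.g. .WithTracing(b => b.AddProcessor(new SimpleActivityExportProcessor(new JsonActivityConsoleExporter()))).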

utpilla added the "help wanted" label Nov 15, 2023
lindeberg commented

@utpilla I tried building an exporter to print JSON:

using System.Text.Json;
using OpenTelemetry;
using OpenTelemetry.Logs;

public class JsonConsoleExporter : BaseExporter<LogRecord>
{
    public override ExportResult Export(in Batch<LogRecord> batch)
    {
        // Avoid generating telemetry for the exporter's own Console writes.
        using var scope = SuppressInstrumentationScope.Begin();

        foreach (var record in batch)
        {
            Console.WriteLine(JsonSerializer.Serialize(record));
        }

        return ExportResult.Success;
    }
}

I noticed that Severity and a few other properties are not being serialized, because in their current experimental state they are internal. Is there any way to opt in to the experimental properties? The same question was asked here.
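
(For anyone following along: a custom log exporter like this can be plugged into the OpenTelemetry logging options through a processor. The wiring below is a sketch based on the SDK's processor API, reusing the JsonConsoleExporter class above:)

builder.Logging.AddOpenTelemetry(logging =>
{
    // Wrap the custom exporter in a processor; the simple processor exports
    // each LogRecord as soon as it is emitted.
    logging.AddProcessor(new SimpleLogRecordExportProcessor(new JsonConsoleExporter()));
});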


lindeberg commented Dec 7, 2023

Let me explain why a console JSON exporter is important. In cloud-native environments, scraping logs from stdout (console) is the industry standard. When log records are printed in JSON format and scraped, backends like Loki can parse the data. I'm working in a large organization and we are wondering what to do, and whether OTel is all about OTLP even though stdout scraping is so common.

cijothomas (Member) commented

Please check the readme of Console exporter: https://github.com/open-telemetry/opentelemetry-dotnet/tree/main/src/OpenTelemetry.Exporter.Console#console-exporter-for-opentelemetry-net

This is not designed for use in any production system. It is not optimized for the performance needs of production workloads, and there is no guarantee that the output format will remain the same.

You may want to consider using the OTLP exporter and having the Collector output to console (or better, have the Collector send to Loki directly!) for better performance and compatibility.
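
For illustration, switching the logging setup from the Console exporter to the OTLP exporter is a small change. A sketch, assuming the OpenTelemetry.Exporter.OpenTelemetryProtocol package is referenced; the endpoint below is a placeholder:

using OpenTelemetry.Exporter;
using OpenTelemetry.Logs;

builder.Logging.AddOpenTelemetry(logging =>
{
    logging.AddOtlpExporter(otlp =>
    {
        // Placeholder endpoint; point this at your Collector or agent.
        otlp.Endpoint = new Uri("http://localhost:4317");
        otlp.Protocol = OtlpExportProtocol.Grpc;
    });
});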

lindeberg commented

@cijothomas Okay, thanks. Could you guide me further?

We are not yet able to migrate to OTel Collector, and are currently using Grafana Agent. It collects logs from the files written by the container runtime for all pods on the k8s hosts. It doesn't directly read from stdout. Scraping these files is very reliable because if an app crashes, the log files created by the container runtime will still exist for the Grafana Agent to retrieve. This way, developers don't need to implement or worry about buffering, retry mechanisms, endpoints, etc. In case Grafana Agent or Loki are down, they will read logs from the files when they come back up. How is this considered in the OTLP push model?

lindeberg commented

@davidfowl We've seen your tweets favoring OTLP over stdout, would love to hear your thoughts too. :D

cijothomas (Member) commented Dec 7, 2023

> @cijothomas Okay, thanks. Could you guide me further?
>
> We are not yet able to migrate to OTel Collector, and are currently using Grafana Agent. It collects logs from the files written by the container runtime for all pods on the k8s hosts. It doesn't directly read from stdout. Scraping these files is very reliable because if an app crashes, the log files created by the container runtime will still exist for the Grafana Agent to retrieve. This way, developers don't need to implement or worry about buffering, retry mechanisms, endpoints, etc. In case Grafana Agent or Loki are down, they will read logs from the files when they come back up. How is this considered in the OTLP push model?

Isn't Grafana Agent capable of accepting OTLP directly? If you are worried about losing telemetry while the agent is temporarily down, then that is a problem to be solved in the OTLP Exporter itself by doing retries etc. Tracking issue: #4791

Using the ConsoleExporter from this repo will kill your app's throughput, as it uses the "Simple" processor, which pushes every telemetry item to the console as soon as it arrives and takes a lock while doing so, so throughput will be affected!
https://github.com/open-telemetry/opentelemetry-dotnet/blob/main/src/OpenTelemetry.Exporter.Console/ConsoleExporterHelperExtensions.cs#L68

(It is of course possible to improve the ConsoleExporter's performance, similar to how ConsoleLoggerProvider from .NET itself works, but as far as I can tell there are no such plans in this repo.)

You may be better off leveraging OTLP -> Agent, accepting the temporary data loss, and waiting for (or contributing to) #4791 to be resolved to improve reliability.
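
(If someone still wants the custom stdout-JSON route despite this, the per-record locking can at least be avoided by wrapping the exporter in a batch processor rather than a simple one; a sketch reusing the hypothetical JsonConsoleExporter from the earlier comment:)

builder.Logging.AddOpenTelemetry(logging =>
{
    // The batch processor buffers LogRecords and exports them on a background
    // thread, instead of taking a lock for every single record.
    logging.AddProcessor(new BatchLogRecordExportProcessor(new JsonConsoleExporter()));
});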

pyohannes (Contributor) commented

> Isn't Grafana Agent capable of accepting OTLP directly?

Yes, it is: https://grafana.com/docs/agent/latest/flow/getting-started/collect-opentelemetry-data/

Nevertheless, I get @lindeberg's point about wanting to log to stdout. It's the most common practice for Kubernetes applications, and it allows you to implement and configure your service/container without tying it to a logging standard or framework:

> The easiest and most adopted logging method for containerized applications is writing to standard output and standard error streams.

For these reasons, and for a seamless integration of OTel .NET into the K8s ecosystem, it would be very nice to have performant stdout logging.

> In case Grafana Agent or Loki are down, they will read logs from the files when they come back up.

That's true. However, if you rely on stdout logging and any of your nodes go down, you'll also lose data, as the log files will be lost. There are reliability drawbacks to both strategies.

danielcweber commented

@lindeberg would this PR work for your use case?

willsbctm (Author) commented

Thanks for the replies.

CodeBlanch (Member) commented

Tagging @matt-hensley who expressed some interest on the SIG today for picking up this effort.

cijothomas (Member) commented

> Tagging @matt-hensley who expressed some interest on the SIG today for picking up this effort.

An official stdout exporter is likely to become part of the spec: open-telemetry/opentelemetry-specification#4183
