
[loadbalancing-exporter] enabling exporterhelper persistent queue breaks loadbalancer exporting #16826

Closed
stalloneclone opened this issue Dec 8, 2022 · 12 comments · Fixed by #36094
Labels: bug, exporter/loadbalancing, never stale

Component(s)

exporter/loadbalancing

What happened?

Description

Enabling persistent queuing on the otlp exporter configured inside the loadbalancing exporter results in "Sender failed" warnings from the batch processor and no logs being exported. When the sending_queue.storage configuration option is removed, the loadbalancing exporter works.

It appears that when persistent queuing is enabled, the loadbalancing exporter only registers an exporter for the first endpoint in the hostnames list. The storage extension output in the log below points the same way: only a single compaction directory, /var/lib/otelcol/exporter_otlp__logs, is created, suggesting that all per-endpoint sub-exporters end up sharing one persistent queue.
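
For reference, the failing and working setups differ only in the storage key under sending_queue. A minimal excerpt of the full configuration given below:

exporters:
  loadbalancing:
    protocol:
      otlp:
        sending_queue:
          enabled: true
          storage: file_storage # removing this line makes the loadbalancing exporter work again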

Steps to Reproduce

1. Use the test config provided with the loadbalancing exporter.

2. Add and configure the file_storage extension, and enable it in the service section (see the excerpt after this list).

3. Enable persistent queuing via the sending_queue.storage config option.
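
Step 2 corresponds to the following additions (an excerpt of the full configuration further below); step 3 is the storage: file_storage line shown in the excerpt above:

extensions:
  file_storage:
    directory: /var/lib/otelcol

service:
  extensions: [file_storage]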

Expected Result

Logs are exported via the loadbalancing exporter to the backend receivers.

Actual Result

This configuration results in the errors shown in the log output below and a failure to export any logs.

Collector version

v0.64.0

Environment information

Environment

OS: Debian 11

The collector was installed via the .deb package from the project releases.

OpenTelemetry Collector configuration

receivers:
  otlp/loadbalancer:
    protocols:
      grpc:
        endpoint: localhost:4317
  otlp/backend-1:
    protocols:
      grpc:
        endpoint: localhost:55690
  otlp/backend-2:
    protocols:
      grpc:
        endpoint: localhost:55700
  otlp/backend-3:
    protocols:
      grpc:
        endpoint: localhost:55710
  otlp/backend-4:
    protocols:
      grpc:
        endpoint: localhost:55720
  filelog:
    include:
      - /var/log/syslog
    start_at: end
    include_file_path: true
    include_file_name: true
    storage: file_storage

processors:
  batch:
    send_batch_size: 10000
    timeout: 200ms

extensions:
  file_storage:
    directory: /var/lib/otelcol
    timeout: 1s
    compaction:
      on_start: true
      on_rebound: true
      directory: /tmp
      max_transaction_size: 65536

exporters:
  logging:
  loadbalancing:
    protocol:
      otlp:
        timeout: 1s
        tls:
          insecure: true
        retry_on_failure:
          enabled: true
          initial_interval: 5s
          max_interval: 30s
          max_elapsed_time: 5m
        sending_queue:
          enabled: true
          storage: file_storage
          num_consumers: 10
          queue_size: 5000
    resolver:
      static:
        hostnames:
        - localhost:55690
        - localhost:55700
        - localhost:55710
        - localhost:55720

service:
  extensions: [file_storage]
  pipelines:
    logs/loadbalancer:
      receivers:
        - filelog
      processors: [batch]
      exporters:
        - loadbalancing
    logs/backend-1:
      receivers:
        - otlp/backend-1
      processors: []
      exporters:
        - logging
    logs/backend-2:
      receivers:
        - otlp/backend-2
      processors: []
      exporters:
        - logging
    logs/backend-3:
      receivers:
        - otlp/backend-3
      processors: []
      exporters:
        - logging
    logs/backend-4:
      receivers:
        - otlp/backend-4
      processors: []
      exporters:
        - logging

Log output

Dec 08 21:39:10 otel-agent systemd[1]: Started OpenTelemetry Collector Contrib.

Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.713Z        info        service/telemetry.go:110        Setting up own telemetry...
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.714Z        info        service/telemetry.go:140        Serving Prometheus metrics        {"address": ":8888", "level": "basic"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.714Z        info        components/components.go:30        In development component. May change in the future.        {"kind": "exporter", "data_type": "logs", "name": "logging", "stability": "in development"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        service/service.go:89        Starting otelcol-contrib...        {"Version": "0.64.0", "NumCPU": 1}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        extensions/extensions.go:41        Starting extensions...
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        extensions/extensions.go:44        Extension is starting...        {"kind": "extension", "name": "file_storage"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        extensions/extensions.go:48        Extension started.        {"kind": "extension", "name": "file_storage"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        pipelines/pipelines.go:74        Starting exporters...
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        pipelines/pipelines.go:78        Exporter is starting...        {"kind": "exporter", "data_type": "logs", "name": "logging"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        pipelines/pipelines.go:82        Exporter started.        {"kind": "exporter", "data_type": "logs", "name": "logging"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.715Z        info        pipelines/pipelines.go:78        Exporter is starting...        {"kind": "exporter", "data_type": "logs", "name": "loadbalancing"}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.717Z        info        filestorage/client.go:244        finished compaction        {"kind": "extension", "name": "file_storage", "directory": "/var/lib/otelcol/exporter_otlp__logs", "elapsed": 0.00013827}
Dec 08 21:39:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:10.720Z        warn        zapgrpc/zapgrpc.go:191        [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:82        Exporter started.        {"kind": "exporter", "data_type": "logs", "name": "loadbalancing"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:86        Starting processors...
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:90        Processor is starting...        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:94        Processor started.        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:98        Starting receivers...
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:102        Receiver is starting...        {"kind": "receiver", "name": "otlp/backend-4", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        otlpreceiver@v0.64.0/otlp.go:71        Starting GRPC server        {"kind": "receiver", "name": "otlp/backend-4", "pipeline": "logs", "endpoint": "localhost:55720"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:106        Receiver started.        {"kind": "receiver", "name": "otlp/backend-4", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        pipelines/pipelines.go:102        Receiver is starting...        {"kind": "receiver", "name": "otlp/backend-3", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.582Z        info        otlpreceiver@v0.64.0/otlp.go:71        Starting GRPC server        {"kind": "receiver", "name": "otlp/backend-3", "pipeline": "logs", "endpoint": "localhost:55710"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        pipelines/pipelines.go:106        Receiver started.        {"kind": "receiver", "name": "otlp/backend-3", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        pipelines/pipelines.go:102        Receiver is starting...        {"kind": "receiver", "name": "otlp/backend-1", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        otlpreceiver@v0.64.0/otlp.go:71        Starting GRPC server        {"kind": "receiver", "name": "otlp/backend-1", "pipeline": "logs", "endpoint": "localhost:55690"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        pipelines/pipelines.go:106        Receiver started.        {"kind": "receiver", "name": "otlp/backend-1", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        pipelines/pipelines.go:102        Receiver is starting...        {"kind": "receiver", "name": "otlp/backend-2", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        otlpreceiver@v0.64.0/otlp.go:71        Starting GRPC server        {"kind": "receiver", "name": "otlp/backend-2", "pipeline": "logs", "endpoint": "localhost:55700"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        pipelines/pipelines.go:106        Receiver started.        {"kind": "receiver", "name": "otlp/backend-2", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        pipelines/pipelines.go:102        Receiver is starting...        {"kind": "receiver", "name": "filelog", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.583Z        info        adapter/receiver.go:55        Starting stanza receiver        {"kind": "receiver", "name": "filelog", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.586Z        info        filestorage/client.go:244        finished compaction        {"kind": "extension", "name": "file_storage", "directory": "/var/lib/otelcol/receiver_filelog_", "elapsed": 0.0002708}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.587Z        info        fileconsumer/file.go:312        Resuming from previously known offset(s). 'start_at' setting is not applicable.        {"kind": "receiver", "name": "filelog", "pipeline": "logs", "component": "fileconsumer"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.587Z        info        pipelines/pipelines.go:106        Receiver started.        {"kind": "receiver", "name": "filelog", "pipeline": "logs"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.587Z        info        service/service.go:106        Everything is ready. Begin running and processing data.
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.788Z        info        fileconsumer/file.go:159        Started watching file        {"kind": "receiver", "name": "filelog", "pipeline": "logs", "component": "fileconsumer", "path": "/var/log/syslog"}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.989Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.997Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 100}
Dec 08 21:39:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:13.999Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 12}
Dec 08 21:39:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:14.194Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\"; couldn't find the exporter for the endpoint \"localhost:55710\"", "errorCauses": [{"error": "couldn't find the exporter for the endpoint \"localhost:55720\""}, {"error": "couldn't find the exporter for the endpoint \"localhost:55710\""}]}
Dec 08 21:39:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:14.394Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:39:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:14.596Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:39:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:14.797Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:39:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:14.999Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:39:15 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:15.199Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:39:15 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:15.400Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:39:15 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:15.602Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:39:15 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:15.802Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:39:16 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:16.004Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:39:16 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:16.205Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:39:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:39.084Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:39:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:39.287Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:39:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:39:39.488Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:09.183Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:09.386Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:09.587Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:09.788Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:09.989Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:10.192Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:10.392Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:10.592Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:10.793Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:10.995Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:11.197Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:11.398Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:11.598Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:11.800Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:12.001Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:12.201Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:12.402Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:12.603Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:12.804Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:13.006Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:13.205Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:13.407Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:13.608Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:13.809Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:14.009Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:14.211Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:14.413Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:14 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:14.613Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:39.107Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:39.308Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:39.509Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:39.710Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:39.912Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:40.113Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:40.314Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:40.515Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:40.716Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:40.917Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:41.118Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:41.320Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:41.521Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:41.724Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:41.923Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:42.124Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:42.325Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:42.526Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:42.727Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:42.929Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:43.129Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:43.331Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:43.532Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:43.734Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:43.936Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:44 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:44.136Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:44 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:44.337Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:44 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:44.538Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:44 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:44.739Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:44 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:44.940Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:45 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:45.142Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:45 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:45.342Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:45 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:45.542Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:45 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:45.744Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:45 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:45.945Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:46 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:46.146Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:46 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:46.347Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:46 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:46.547Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:46 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:46.748Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:46 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:46.951Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:47 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:47.150Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:47 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:47.351Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:47 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:47.553Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:47 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:47.752Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:47 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:47.953Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:48 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:48.154Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:48 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:48.354Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:48 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:48.556Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:48 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:48.756Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:40:48 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:48.959Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:49 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:49.160Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:49 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:49.360Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:49 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:49.560Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:49 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:49.762Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:49 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:49.962Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:40:50 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:50.162Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:50 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:50.364Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:40:50 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:50.565Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:40:50 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:40:50.765Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:09.050Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:09.251Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:09.454Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:09.656Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:09.856Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:10.057Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:10.259Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:10.459Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:10.660Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:10.861Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:11.061Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:11.262Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:11.464Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:11.664Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:11.865Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:12.067Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:12.269Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:12.470Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:12.672Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:12.872Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:39.165Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:39.365Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:39.566Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:39.767Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:39 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:39.967Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:40.168Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:40.368Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:40.568Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:40.770Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:40 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:40.971Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:41.172Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:41.374Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:41.575Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:41.777Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:41 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:41.977Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:42.178Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:42.380Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:41:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:42.581Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:42.782Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:42 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:42.983Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:41:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:43.185Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:43.387Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:43.588Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:41:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:43.790Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:41:43 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:41:43.990Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:42:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:09.099Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:42:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:09.300Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:42:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:09.500Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:42:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:09.701Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:09 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:09.902Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:10.103Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:42:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:10.303Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:42:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:10.504Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:10.705Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55720\""}
Dec 08 21:42:10 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:10.907Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:42:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:11.108Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:11.310Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:11.510Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:42:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:11.711Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:11 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:11.913Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:42:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:12.113Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55700\""}
Dec 08 21:42:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:12.315Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:42:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:12.516Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:42:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:12.716Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:42:12 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:12.917Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:42:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:13.118Z        info        LogsExporter        {"kind": "exporter", "data_type": "logs", "name": "logging", "#logs": 1}
Dec 08 21:42:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:13.318Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}
Dec 08 21:42:13 otel-agent otelcol-contrib[2229801]: 2022-12-08T21:42:13.520Z        warn        batchprocessor@v0.64.0/batch_processor.go:178        Sender failed        {"kind": "processor", "name": "batch", "pipeline": "logs/loadbalancer", "error": "couldn't find the exporter for the endpoint \"localhost:55710\""}

Additional context

No response

@stalloneclone stalloneclone added bug Something isn't working needs triage New item requiring triage labels Dec 8, 2022
@github-actions
Contributor

github-actions bot commented Dec 8, 2022

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@jpkrohling
Member

That's an interesting combination. Do you know whether the persistent queue feature supports multiple exporters using the same file storage?
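For reference, the collector config format does support multiple named instances of the storage extension, so distinct exporters can each point at their own storage. A minimal sketch of that syntax (illustrative only; the `loadbalancing` exporter has no per-endpoint storage setting, so this does not by itself fix the issue, and service pipelines are omitted):

```yaml
extensions:
  # two independent storage instances, one per exporter
  file_storage/queue_a:
    directory: /var/lib/otelcol/queue_a
  file_storage/queue_b:
    directory: /var/lib/otelcol/queue_b

exporters:
  otlp/a:
    endpoint: backend-a:4317
    sending_queue:
      enabled: true
      storage: file_storage/queue_a
  otlp/b:
    endpoint: backend-b:4317
    sending_queue:
      enabled: true
      storage: file_storage/queue_b

service:
  extensions: [file_storage/queue_a, file_storage/queue_b]
```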

@rpasche

rpasche commented Mar 8, 2023

I have the exact same issue running otel-contrib 0.70.0. Removing the sending_queue configuration makes the exporter work again.
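Until this is fixed, the workaround implied above is to keep the queue in memory by dropping only the `storage` reference (or the whole `sending_queue` block), for example:

```yaml
exporters:
  loadbalancing:
    protocol:
      otlp:
        sending_queue:
          enabled: true
          num_consumers: 10
          queue_size: 5000
          # storage: file_storage
          # omitting the storage line above makes the exporter fall
          # back to the in-memory queue, which works as expected
```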

@atoulme atoulme added exporter/loadbalancing and removed needs triage New item requiring triage labels Mar 10, 2023
@github-actions
Contributor

Pinging code owners for exporter/loadbalancing: @jpkrohling. See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions
Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.


@github-actions github-actions bot added the Stale label Oct 30, 2023
@jpkrohling jpkrohling removed the Stale label Nov 1, 2023

@mrsimo

mrsimo commented Apr 3, 2024

Just wanted to mention that I found out about this when trying to add a persistent queue to our setup. The errors are a bit misleading, too. Perhaps it would be possible to error out earlier when the storage setting is passed, to avoid some headaches while support for storage is implemented (or not).

Thanks a lot 😄

@jpkrohling jpkrohling self-assigned this Apr 30, 2024
@MXfive

MXfive commented Sep 19, 2024

Does anyone have any pointers as to where the root of this issue lies? Keen to dive in and develop a fix.

@bisharaah

UP! Can you look into this, please?

@an-mmx
Contributor

an-mmx commented Sep 26, 2024

Is there any chance of getting this to work?

shivanthzen pushed a commit to shivanthzen/opentelemetry-collector-contrib that referenced this issue Dec 5, 2024
…e and timeout settings (open-telemetry#36094)

#### Description
##### Problem statement
The `loadbalancing` exporter is actually a wrapper that creates and
manages a set of actual `otlp` exporters.
Those `otlp` exporters technically share the same configuration parameters
defined at the `loadbalancing` exporter level, including the
`sending_queue` configuration. The only difference is the `endpoint`
parameter, which is substituted by the `loadbalancing` exporter itself.
This means that the `sending_queue`, `retry_on_failure` and `timeout`
settings can be defined only on the `otlp` sub-exporters, while the
top-level `loadbalancing` exporter is missing all of those settings.
This configuration approach produces several issues that users have
already reported:
* The persistent queue cannot be used in the `loadbalancing` exporter (see
open-telemetry#16826). This happens because the `otlp` sub-exporters share
the same configuration, including the queue configuration, i.e. they all
use the same `storage` instance at the same time, which is not
possible at the moment
* Data is lost even when the `sending_queue` configuration is used (see
open-telemetry#35378). This happens because the queue is defined at the
level of the `otlp` sub-exporters, and if such an exporter cannot flush
data from its queue (for example, because the endpoint is no longer
available) there is no option other than to discard the data, i.e. there
is no higher-level queue and persistent storage to which the data could
be returned in case of a permanent failure

There may be other tracked issues related to the current configuration
approach. A reconstructed illustration of the problematic layout follows.
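Reconstructed from the description above (a hedged sketch, not an excerpt from the PR), the old layout only accepted these settings under `protocol.otlp`, so every generated per-endpoint exporter inherits the identical queue block:

```yaml
exporters:
  loadbalancing:
    protocol:
      otlp:
        # inherited verbatim by every generated per-endpoint exporter;
        # with `storage` set, all of them try to open the same instance
        sending_queue:
          enabled: true
          storage: file_storage
```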

##### Proposed solution
The easiest way to solve the issues above is to use the standard approach
for queue, retry and timeout configuration via `exporterhelper`.
This brings queue, retry and timeout functionality to the top level of
the `loadbalancing` exporter, instead of the `otlp` sub-exporters.
With respect to the issues mentioned above, it brings:
* A single persistent queue used by all `otlp` sub-exporters (not
directly, of course)
* The queue is no longer discarded/destroyed when any (or all) of the
endpoints become unreachable; the top-level queue keeps the data until
new endpoints become available
* Scale-up and scale-down events for the next layer of OpenTelemetry
Collectors in K8s environments become more predictable and no longer
involve data loss (a potential fix for open-telemetry#33959). There is
still a real chance of inconsistency where some data is sent to an
incorrect endpoint, but that is already a better state than we have
right now; a sketch of the resulting configuration follows this list
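As a hedged sketch (the field names follow the standard `exporterhelper` configuration and are not copied from the PR's README), the settings move up to the `loadbalancing` exporter itself:

```yaml
exporters:
  loadbalancing:
    # top-level exporterhelper settings: one queue for all sub-exporters
    timeout: 5s
    retry_on_failure:
      enabled: true
    sending_queue:
      enabled: true
      storage: file_storage    # a single persistent queue, shared indirectly
    protocol:
      otlp:
        tls:
          insecure: true
    resolver:
      static:
        hostnames:
          - backend-1:4317
          - backend-2:4317
```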

##### Noticeable changes
* The top-level `loadbalancing` exporter now uses `exporterhelper` with
all of the functionality it supports
* `sending_queue` is automatically disabled on the `otlp` sub-exporters
when it is already present on the top-level `loadbalancing` exporter.
This change prevents data loss on the `otlp` exporters, because the
queue there does not provide the expected result. It also prevents
potential misconfiguration on the user's side and, as a result,
irrelevant reported issues
* The `exporter` attribute on metrics generated by the `otlp`
sub-exporters now includes the endpoint, for better visibility and to
segregate them from the top-level `loadbalancing` exporter: it was
`"exporter": "loadbalancing"`, now it is `"exporter": "loadbalancing/127.0.0.1:4317"`
* Logs generated by the `otlp` sub-exporters now include an additional
`endpoint` attribute carrying the endpoint value, for the same reasons
as for the metrics

#### Link to tracking issue
Fixes open-telemetry#35378
Fixes open-telemetry#16826

#### Testing
The proposed changes were heavily tested in a large K8s environment with
a set of different scale-up/scale-down events using a persistent queue
configuration: no data loss was detected, and everything works as expected

#### Documentation
`README.md` was updated to reflect the newly available configuration
parameters. The sample `config.yaml` was updated as well
ZenoCC-Peng pushed a commit to ZenoCC-Peng/opentelemetry-collector-contrib that referenced this issue Dec 6, 2024
…e and timeout settings (open-telemetry#36094)