
[receiver/otlp] Responded with HTTP Status Code 500 instead of 429 if soft memory limit is reached #9636

Closed
marcinsiennicki95 opened this issue Feb 26, 2024 · 2 comments · Fixed by #9357
Labels: bug (Something isn't working)

Describe the bug

I am configuring an OpenTelemetry Collector to gather logs through the filelog receiver and forward them to a second OpenTelemetry Collector via the otlphttp exporter. In this setup, the first collector acts as a proxy in front of the second collector.

When a log file was moved into place and processed by the first collector, both collectors reached their soft memory limits (the same happens when they reach the hard limit). At that point, the second collector started rejecting data and pushing it back to its receiver. As a result, the first collector encountered export errors for both logs and metrics.
I expected the first collector to receive a recoverable error, such as 429 Too Many Requests, so that once the second collector resumes normal operation the pending requests can be resent instead of being dropped.
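The distinction matters because the exporter classifies responses by status code: the OTLP/HTTP convention treats 429 (and transient 5xx codes such as 502/503/504) as retryable, while a plain 500 is treated as a permanent error and the data is dropped. A minimal sketch of that classification (hypothetical helper, not the collector's actual code):

```python
# Sketch of OTLP/HTTP retry classification (hypothetical helper,
# not the collector's actual implementation).
RETRYABLE_STATUS = {429, 502, 503, 504}  # throttling / transient server errors

def is_retryable(status_code: int) -> bool:
    """Return True if an OTLP/HTTP export failure should be retried."""
    return status_code in RETRYABLE_STATUS

# A memory-limited backend answering 500 causes the client to drop data,
# whereas 429 would let it back off and resend once memory recovers.
```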

Steps to reproduce

  1. First Collector Started
  2. Second Collector Started
  3. Move the ERRORLOG file (~1 million log records) to the watched destination
  4. Turn off Second Collector
  5. Turn off First Collector

Communication between two OpenTelemetry Collector instances, each with a memory_limiter processor

First OpenTelemetry Collector configuration

receivers:
  filelog:
    include: [ C:/ERRORLOG ]
    encoding: utf-16le
    include_file_path: true
    max_log_size: 1MiB
    storage: file_storage
    poll_interval: 10s
    retry_on_failure:
        enabled: true
  
  hostmetrics:
    collection_interval: 2s
    scrapers:
      cpu:
      memory:

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4320
    tls:
      insecure: true

extensions:
  file_storage:
    directory: C:/collector
    fsync: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [filelog]
      processors: [batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [hostmetrics]
      processors: [batch]
      exporters: [logging, otlphttp]
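The second collector's configuration is not included in the report. A minimal sketch of what it might look like, assuming (from the logs above) an otlp receiver listening on 127.0.0.1:4320 over HTTP and the same memory_limiter settings, placed first in each pipeline:

```yaml
receivers:
  otlp:
    protocols:
      http:
        endpoint: 127.0.0.1:4320

processors:
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250
  batch:

exporters:
  logging:

service:
  pipelines:
    logs:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [logging]
    metrics:
      receivers: [otlp]
      processors: [memory_limiter, batch]
      exporters: [logging]
```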

Log output for first Collector

PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T12:12:06.088Z        info    service@v0.93.0/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T12:12:07.229Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
"resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:25.077Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:25.078Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.5346304s"}
2024-02-12T12:12:27.150Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:12:27.322Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.706840488s"}
2024-02-12T12:12:27.475Z        info    otelcol@v0.93.0/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-12T12:12:27.475Z        info    service@v0.93.0/service.go:179  Starting shutdown...
2024-02-12T12:12:27.479Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:12:27.496Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:12:27.517Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:12:27.523Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-12T12:12:27.523Z        info    service@v0.93.0/service.go:193  Shutdown complete.
PS C:\Users\kCuraCloudAdmin> C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\otelcol-contrib.exe --config C:\\collector\\otelcol-contrib_0.93.0_windows_amd64\\config.yaml
2024-02-12T12:13:12.508Z        info    service@v0.93.0/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:13:12.508Z        info    service@v0.93.0/telemetry.go:146        Serving metrics {"address": ":8888", "level": "Basic"}
2024-02-12T12:13:12.512Z        info    exporter@v0.93.0/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:13:12.513Z        info    exporter@v0.93.0/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:13:12.514Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:13:12.514Z        info    service@v0.93.0/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:13:12.515Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T12:13:12.515Z        info    extensions/extensions.go:37     Extension is starting...        {"kind": "extension", "name": "file_storage"}
2024-02-12T12:13:12.515Z        info    extensions/extensions.go:52     Extension started.      {"kind": "extension", "name": "file_storage"}
2024-02-12T12:13:12.516Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:13:12.547Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:13:12.567Z        info    adapter/receiver.go:45  Starting stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:13:12.582Z        warn    fileconsumer/file.go:51 finding files: no files match the configured criteria   {"kind": "receiver", "name": "filelog/continues", "data_type": "logs", "component": "fileconsumer"}
2024-02-12T12:13:12.594Z        info    service@v0.93.0/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T12:13:13.757Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:13.758Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.935162309s"}
2024-02-12T12:13:15.631Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:15.636Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.263222629s"}
2024-02-12T12:13:17.739Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:19.633Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:31.773Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:32.702Z        info    fileconsumer/file.go:268        Started watching file   {"kind": "receiver", "name": "filelog", "data_type": "logs", "component": "fileconsumer", "path": "C:\\ERRORLOG"}
2024-02-12T12:13:32.708Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:32.712Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.759Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.018Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 155}
2024-02-12T12:13:36.087Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.708Z        error   scraperhelper/scrapercontroller.go:200  Error scraping metrics  {"kind": "receiver", "name": "hostmetrics", "data_type": "metrics", "error": "context deadline exceeded", "scraper": "cpu"}
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).scrapeMetricsAndReport
        go.opentelemetry.io/collector/receiver@v0.93.0/scraperhelper/scrapercontroller.go:200
go.opentelemetry.io/collector/receiver/scraperhelper.(*controller).startScraping.func1
        go.opentelemetry.io/collector/receiver@v0.93.0/scraperhelper/scrapercontroller.go:176
2024-02-12T12:13:36.714Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T12:13:36.839Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.922Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 130}
2024-02-12T12:13:36.923Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:37.532Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 210}
2024-02-12T12:13:38.519Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "logs", "cur_mem_mib": 101}
2024-02-12T12:13:39.664Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:41.769Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:42.239Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:43.294Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.308Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.695Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:44.224Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:44.297Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:44.305Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.313Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.331Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:45.606Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1000}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:45.767Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:52.831Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 1, "log records": 1}
2024-02-12T12:13:52.832Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "logs", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/logs responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 1}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createLogsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:13:53.720Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:01.684Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:03.768Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:27.767Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:27.768Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
2024-02-12T12:14:33.598Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:33.600Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:35.669Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:35.670Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:37.743Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:49.606Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:51.686Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:51.688Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:14:53.763Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:53.764Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "2.528889573s"}
2024-02-12T12:14:55.635Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:55.637Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "4.469254533s"}
2024-02-12T12:14:56.312Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.781748409s"}
2024-02-12T12:14:57.723Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:57.724Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.922314888s"}
2024-02-12T12:14:59.590Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:59.591Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.101666595s"}
2024-02-12T12:15:00.106Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.260620386s"}
2024-02-12T12:15:00.124Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "10.609948295s"}
2024-02-12T12:15:01.673Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:01.673Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.600585342s"}
2024-02-12T12:15:03.661Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.893131851s"}
2024-02-12T12:15:03.758Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:03.761Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "5.631966261s"}
2024-02-12T12:15:04.714Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.862112453s"}
2024-02-12T12:15:05.622Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:05.622Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.964815392s"}
2024-02-12T12:15:07.381Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "12.618459834s"}
2024-02-12T12:15:07.558Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "11.67278761s"}
2024-02-12T12:15:07.698Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:07.699Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "4.90095434s"}
2024-02-12T12:15:08.280Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.703637679s"}
2024-02-12T12:15:09.409Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "7.030750379s"}
2024-02-12T12:15:09.771Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:15:09.773Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.588990315s"}
2024-02-12T12:15:10.748Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "16.49570526s"}
2024-02-12T12:15:10.864Z        info    otelcol@v0.93.0/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-12T12:15:10.864Z        info    service@v0.93.0/service.go:179  Starting shutdown...
2024-02-12T12:15:10.869Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog/continues", "data_type": "logs"}
2024-02-12T12:15:10.884Z        info    adapter/receiver.go:140 Stopping stanza receiver        {"kind": "receiver", "name": "filelog", "data_type": "logs"}
2024-02-12T12:15:10.897Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.898Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "interrupted due to shutdown: failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43
2024-02-12T12:15:10.905Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-12T12:15:10.906Z        info    service@v0.93.0/service.go:193  Shutdown complete.

Second OpenTelemetry Collector Configuration

receivers:  
  otlp:
    protocols:
      http:
        endpoint: "127.0.0.1:4320"

exporters:
  logging:

  otlphttp:
    endpoint: http://127.0.0.1:4318
    tls:
      insecure: true

processors:
  batch:
    send_batch_size: 1000
  memory_limiter:
    check_interval: 1s
    limit_mib: 400
    spike_limit_mib: 250

service:
  telemetry:
    metrics:
      address: "0.0.0.0:9090" # Use a port that you know is free
  pipelines:
    logs:
      receivers: [otlp]
      processors: [memory_limiter,batch]
      exporters: [logging, otlphttp]
    metrics:
      receivers: [otlp]
      processors: [memory_limiter,batch]
      exporters: [logging, otlphttp]
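For reference, with this memory_limiter configuration the soft limit works out to limit_mib - spike_limit_mib = 400 - 250 = 150 MiB, which matches the cur_mem_mib value at which the second collector starts refusing data in the log output below. A minimal, dependency-free sketch of that arithmetic (the struct and method names are illustrative, not the processor's actual code; the formula follows the memory_limiter processor documentation):

```go
package main

import "fmt"

// memLimiterConfig mirrors the memory_limiter settings from the config above.
type memLimiterConfig struct {
	limitMiB      uint64 // hard limit (limit_mib)
	spikeLimitMiB uint64 // headroom subtracted to obtain the soft limit (spike_limit_mib)
}

// softLimitMiB computes the threshold above which the processor starts
// refusing data, per the memory_limiter documentation.
func (c memLimiterConfig) softLimitMiB() uint64 {
	return c.limitMiB - c.spikeLimitMiB
}

func main() {
	cfg := memLimiterConfig{limitMiB: 400, spikeLimitMiB: 250}
	fmt.Println(cfg.softLimitMiB()) // prints 150
}
```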

Log output for second Collector

PS C:\Users\kCuraCloudAdmin> C:\\collector\\second\\otelcol-contrib.exe --config C:\\collector\\second\\config.yaml
2024-02-12T12:13:15.717Z        info    service@v0.93.0/telemetry.go:76 Setting up own telemetry...
2024-02-12T12:13:15.718Z        info    service@v0.93.0/telemetry.go:146        Serving metrics {"address": "0.0.0.0:9090", "level": "Basic"}
2024-02-12T12:13:15.723Z        info    exporter@v0.93.0/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "metrics", "name": "logging"}
2024-02-12T12:13:15.730Z        info    memorylimiter/memorylimiter.go:77       Memory limiter configured       {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "limit_mib": 400, "spike_limit_mib": 250, "check_interval": 1}
2024-02-12T12:13:15.730Z        info    exporter@v0.93.0/exporter.go:275        Deprecated component. Will be removed in future releases.       {"kind": "exporter", "data_type": "logs", "name": "logging"}
2024-02-12T12:13:15.730Z        info    service@v0.93.0/service.go:139  Starting otelcol-contrib...     {"Version": "0.93.0", "NumCPU": 2}
2024-02-12T12:13:15.730Z        info    extensions/extensions.go:34     Starting extensions...
2024-02-12T12:13:15.734Z        info    otlpreceiver@v0.93.0/otlp.go:152        Starting HTTP server    {"kind": "receiver", "name": "otlp", "data_type": "metrics", "endpoint": "127.0.0.1:4320"}
2024-02-12T12:13:15.739Z        info    service@v0.93.0/service.go:165  Everything is ready. Begin running and processing data.
2024-02-12T12:13:17.806Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 4, "metrics": 4, "data points": 20}
2024-02-12T12:13:19.698Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:21.796Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:23.056Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:23.680Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:25.786Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:27.677Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:29.754Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:31.838Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:32.720Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:32.732Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.759Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:33.887Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:34.838Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 11, "log records": 1010}
2024-02-12T12:13:34.916Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.044Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.091Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:36.922Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 1, "metrics": 1, "data points": 2}
2024-02-12T12:13:37.077Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:37.098Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:38.733Z        info    memorylimiter/memorylimiter.go:222      Memory usage is above soft limit. Forcing a GC. {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 164}
2024-02-12T12:13:39.139Z        info    memorylimiter/memorylimiter.go:192      Memory usage after GC.  {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 107}
2024-02-12T12:13:39.806Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:13:41.734Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 150}
2024-02-12T12:13:42.732Z        info    memorylimiter/memorylimiter.go:215      Memory usage back within limits. Resuming normal operation.     {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 132}
2024-02-12T12:13:42.742Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:42.750Z        info    LogsExporter    {"kind": "exporter", "data_type": "logs", "name": "logging", "resource logs": 10, "log records": 1000}
2024-02-12T12:13:43.734Z        warn    memorylimiter/memorylimiter.go:229      Memory usage is above soft limit. Refusing data.        {"kind": "processor", "name": "memory_limiter", "pipeline": "metrics", "cur_mem_mib": 228}
2024-02-12T12:13:43.908Z        info    MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "logging", "resource metrics": 2, "metrics": 2, "data points": 10}
2024-02-12T12:14:52.290Z        info    otelcol@v0.93.0/collector.go:258        Received signal from OS {"signal": "interrupt"}
2024-02-12T12:14:52.290Z        info    service@v0.93.0/service.go:179  Starting shutdown...
2024-02-12T12:14:52.305Z        info    extensions/extensions.go:59     Stopping extensions...
2024-02-12T12:14:52.305Z        info    service@v0.93.0/service.go:193  Shutdown complete.

Results

Test Scenario

  1. First Collector Started
  2. Second Collector Started
  3. Move Errorlog (1mln logs) file to destination
  4. Turn off Second Collector
  5. Turn off First Collector

Initially, the first collector was unable to send metrics to the second collector and encountered a retryable error:

2024-02-12T12:13:13.758Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "3.935162309s"}

After the second collector became available, metrics were successfully transmitted. When the log file was moved and processed by the first collector, both collectors reached their soft limits (the same happens when they reach the hard limit). At this point, the second collector started rejecting data, returning the error to the sender. As a result, the first OpenTelemetry Collector encountered errors for both logs and metrics.

2024-02-12T12:14:33.600Z        error   exporterhelper/common.go:95     Exporting failed. Dropping data.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "not retryable error: Permanent error: error exporting items, request to http://127.0.0.1:4320/v1/metrics responded with HTTP Status Code 500, Message=data refused due to high memory usage, Details=[]", "dropped_items": 10}
go.opentelemetry.io/collector/exporter/otlphttpexporter.createMetricsExporter.WithQueue.func5.1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/common.go:95
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/queue_sender.go:117
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/bounded_memory_queue.go:57
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
        go.opentelemetry.io/collector/exporter@v0.93.0/exporterhelper/internal/consumers.go:43

I expected the first collector to receive a recoverable error, such as 429 Too Many Requests, so that once the second collector resumed normal operation the pending requests could be resent instead of being dropped.
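The distinction matters because the otlphttp exporter treats most HTTP failures as permanent and only retries a small set of status codes. A dependency-free sketch of that decision, assuming the retryable set named by the OTLP/HTTP specification (429 and 503); the function name isRetryableStatus is illustrative, not the exporter's actual API:

```go
package main

import "fmt"

// isRetryableStatus reports whether an OTLP/HTTP client should retry a
// failed export. Per the OTLP/HTTP specification, 429 (Too Many Requests)
// and 503 (Service Unavailable) are retryable; other failures, including
// 500, are treated as permanent and the data is dropped.
func isRetryableStatus(code int) bool {
	switch code {
	case 429, 503:
		return true
	default:
		return false
	}
}

func main() {
	// 500 ("data refused due to high memory usage" above): data is dropped.
	fmt.Println(isRetryableStatus(500)) // false
	// 429 would instead be queued and retried.
	fmt.Println(isRetryableStatus(429)) // true
}
```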

Finally, after the second collector was shut down, the first collector reported a connection error and kept retrying:

2024-02-12T12:15:09.773Z        info    exporterhelper/retry_sender.go:118      Exporting failed. Will retry the request after interval.        {"kind": "exporter", "data_type": "metrics", "name": "otlphttp", "error": "failed to make an HTTP request: Post \"http://127.0.0.1:4320/v1/metrics\": dial tcp 127.0.0.1:4320: connectex: No connection could be made because the target machine actively refused it.", "interval": "6.588990315s"}
@marcinsiennicki95 marcinsiennicki95 added the bug Something isn't working label Feb 26, 2024
@marcinsiennicki95 marcinsiennicki95 changed the title [exporter/otlphttp] Responded with HTTP Status Code 500 instead of 429 if soft memory limit is reached [receiver/otlp] Responded with HTTP Status Code 500 instead of 429 if soft memory limit is reached Feb 26, 2024
@marcinsiennicki95
Author

I found the OpenTelemetry specification describing how HTTP error codes should be handled. It looks like the current behavior is not implemented correctly, and the specified error codes are not followed.
If I understand correctly, when setting up a chain like this: Collector 1 (client) with the OTLP exporter (or fluent-bit with an OTLP output), and Collector 2 (server) with the OTLP receiver, the receiver on Collector 2 should return HTTP status codes as defined in the specification.

@TylerHelmuth
Member

@marcinsiennicki95 you are correct. I believe this issue is related to #9357. If you use gRPC, then you should be getting retryable errors.
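For context on why gRPC behaves better here: under OTLP/gRPC, UNAVAILABLE is retryable, and RESOURCE_EXHAUSTED is retryable when the server attaches a RetryInfo detail signalling the condition is temporary, so a memory-limiter rejection surfaced that way is retried rather than dropped. A dependency-free sketch of that rule (gRPC code values hardcoded from google.golang.org/grpc/codes to stay self-contained; this illustrates the specification's mapping, not the collector's implementation):

```go
package main

import "fmt"

// gRPC status code values, hardcoded from google.golang.org/grpc/codes so
// the sketch compiles without external dependencies.
const (
	codeResourceExhausted = 8
	codeInternal          = 13
	codeUnavailable       = 14
)

// isRetryableGRPC sketches the OTLP/gRPC retry rule: UNAVAILABLE is always
// retryable, RESOURCE_EXHAUSTED only when a RetryInfo detail is present,
// and other codes (e.g. INTERNAL) are permanent.
func isRetryableGRPC(code int, hasRetryInfo bool) bool {
	switch code {
	case codeUnavailable:
		return true
	case codeResourceExhausted:
		return hasRetryInfo
	default:
		return false
	}
}

func main() {
	fmt.Println(isRetryableGRPC(codeResourceExhausted, true)) // true
	fmt.Println(isRetryableGRPC(codeInternal, false))         // false
}
```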

This issue was closed.