
Histogram Maximum and Minimum values lost from receiving to exporting. #16714

Closed
Narcolapser opened this issue Dec 5, 2022 · 3 comments · Fixed by #19635
Assignees
Labels
bug Something isn't working exporter/influxdb

Comments


Narcolapser commented Dec 5, 2022

Component(s)

exporter/influxdb

What happened?

Description

I am attempting to leverage the optional maximum and minimum values that are part of the histogram data model. I can export the values from my program, which is producing other metrics successfully, but by the time the metrics reach the InfluxDB server, the maximum and minimum have been stripped from the data.

Steps to Reproduce

Add maximum and minimum recording to a histogram and export via OTLP to the OTEL Collector. In my case it was done like so:

    const attributes = { description: `Histogram of ${name} events.`, maximum: true, minimum: true, average: true };
    const histogram = meter.createHistogram(name, attributes);
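For context on where min and max come from: in the OpenTelemetry data model they are maintained by the histogram aggregation as values are recorded, alongside the bucket counts. The following plain-JavaScript sketch illustrates that bookkeeping; it is not the SDK's actual implementation, and the class name is hypothetical.

```javascript
// Illustrative only: how a histogram aggregation can track min/max
// alongside bucket counts and sum. NOT the OpenTelemetry SDK's code.
class HistogramPoint {
  constructor(explicitBounds) {
    this.explicitBounds = explicitBounds;
    // One bucket per bound, plus a final +Inf overflow bucket.
    this.bucketCounts = new Array(explicitBounds.length + 1).fill(0);
    this.count = 0;
    this.sum = 0;
    this.min = undefined;
    this.max = undefined;
  }

  record(value) {
    this.count += 1;
    this.sum += value;
    this.min = this.min === undefined ? value : Math.min(this.min, value);
    this.max = this.max === undefined ? value : Math.max(this.max, value);
    // First bound the value does not exceed; -1 means the +Inf bucket.
    let i = this.explicitBounds.findIndex((b) => value <= b);
    if (i === -1) i = this.explicitBounds.length;
    this.bucketCounts[i] += 1;
  }
}

const p = new HistogramPoint([0, 5, 10, 25]);
[0.888, 8.82, 2.1].forEach((v) => p.record(v));
console.log(p.min, p.max, p.count); // 0.888 8.82 3
```

The point is that min and max ride along on the data point itself, so any exporter that drops them is losing information the SDK already computed.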

This produced a packet that contained the following snippet:

{"attributes":[{"key":"status_code","value":{"stringValue":"200"}},{"key":"caller","value":{"stringValue":"unknown"}},{"key":"method","value":{"stringValue":"GET"}},{"key":"routeName","value":{"stringValue":"/v1/status"}}],"bucketCounts":[0,17,1,0,0,0,0,0,0,0,0],"explicitBounds":[0,5,10,25,50,75,100,250,500,1000],"count":18,"sum":31.246087999999997,"min":0.8881749999999999,"max":8.820832,"startTimeUnixNano":1670269851940438000,"timeUnixNano":1670269869661102600}]}}

"min" and "max" are present in the values being submitted, and likewise in the logs coming out of my otel-collector below. However, by the time the results were exported, min and max were absent:

route,caller=unknown,environment=local,method=GET,otel.library.name=high-quality-engineering,project=high-quality-engineering,routeName=/v1/status,service_team=delta,status_code=200,tech_service=salt 0=0,50=0,sum=1607.1812599999964,25=0,75=0,250=0,500=0,count=2407,5=2406,100=0,10=1,1000=0 1670263155013154000

Expected Result

route,caller=unknown,environment=local,method=GET,otel.library.name=high-quality-engineering,project=high-quality-engineering,routeName=/v1/status,service_team=delta,status_code=200,tech_service=salt 0=0,50=0,sum=1607.1812599999964,25=0,75=0,250=0,500=0,count=2407,5=2406,100=0,10=1,1000=0,maximum=12345,minimum=1234 1670263155013154000

Actual Result

route,caller=unknown,environment=local,method=GET,otel.library.name=high-quality-engineering,project=high-quality-engineering,routeName=/v1/status,service_team=delta,status_code=200,tech_service=salt 0=0,50=0,sum=1607.1812599999964,25=0,75=0,250=0,500=0,count=2407,5=2406,100=0,10=1,1000=0 1670263155013154000
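In other words, the expected behavior is simply that min and max are carried through as two extra fields in the InfluxDB line-protocol record. A hypothetical sketch of that serialization step (field names and layout are assumed here; the actual exporter is written in Go and its conventions may differ):

```javascript
// Illustrative only: building an InfluxDB line-protocol record from a
// histogram data point, carrying min/max through as fields when present.
// The timestamp is passed as a string because nanosecond epochs exceed
// JavaScript's safe integer range.
function toLineProtocol(measurement, tags, point, timeUnixNano) {
  const tagStr = Object.entries(tags)
    .map(([k, v]) => `${k}=${v}`)
    .join(',');
  const fields = [];
  // One field per explicit bound, mirroring the exporter's bucket fields.
  point.explicitBounds.forEach((bound, i) =>
    fields.push(`${bound}=${point.bucketCounts[i]}`)
  );
  fields.push(`count=${point.count}`, `sum=${point.sum}`);
  if (point.min !== undefined) fields.push(`min=${point.min}`);
  if (point.max !== undefined) fields.push(`max=${point.max}`);
  return `${measurement},${tagStr} ${fields.join(',')} ${timeUnixNano}`;
}

const line = toLineProtocol(
  'route',
  { method: 'GET', status_code: 200 },
  { explicitBounds: [0, 5], bucketCounts: [0, 2, 1], count: 3, sum: 11.8, min: 0.9, max: 8.8 },
  '1670263155013154000'
);
console.log(line);
// route,method=GET,status_code=200 0=0,5=2,count=3,sum=11.8,min=0.9,max=8.8 1670263155013154000
```

(Real line protocol also requires escaping of spaces and commas in tag values, omitted here for brevity.)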

Collector version

0.64.1

Environment information

Environment

Running in Docker, with the below docker-compose file:

version: "2"
services:
  # Collector
  otel-collector:
    image: otel/opentelemetry-collector-contrib
    command: ["--config=/etc/otel-collector-config.yaml", "${OTELCOL_ARGS}"]
    volumes:
      - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
    ports:
      - "4318:4318" # OTLP http receiver

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      http:

processors:
  attributes/insert:
    actions:
      - key: "environment"
        value: "local"
        action: insert

exporters:
  logging:
    loglevel: debug

  influxdb:
    endpoint: http://localhost:8086/api/v2/write
    timeout: 500ms
    bucket: metrics

    sending_queue:
      enabled: true
      num_consumers: 3
      queue_size: 5000

    retry_on_failure:
      enabled: true
      initial_interval: 1s
      max_interval: 3s
      max_elapsed_time: 10s

service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [attributes/insert]
      exporters: [logging, influxdb]

Log output

otel-collector_1  | Metric #17
otel-collector_1  | Descriptor:
otel-collector_1  |      -> Name: route
otel-collector_1  |      -> Description: Histogram of route events.
otel-collector_1  |      -> Unit:
otel-collector_1  |      -> DataType: Histogram
otel-collector_1  |      -> AggregationTemporality: Cumulative
otel-collector_1  | HistogramDataPoints #0
otel-collector_1  | Data point attributes:
otel-collector_1  |      -> status_code: Str(200)
otel-collector_1  |      -> caller: Str(unknown)
otel-collector_1  |      -> method: Str(GET)
otel-collector_1  |      -> routeName: Str(/v1/status)
otel-collector_1  |      -> environment: Str(local)
otel-collector_1  | StartTimestamp: 2022-12-05 17:18:10.4089518 +0000 UTC
otel-collector_1  | Timestamp: 2022-12-05 19:48:52.1910003 +0000 UTC
otel-collector_1  | Count: 8817
otel-collector_1  | Sum: 5337.239690
otel-collector_1  | Min: 0.410130
otel-collector_1  | Max: 9.803715
otel-collector_1  | ExplicitBounds #0: 0.000000
otel-collector_1  | ExplicitBounds #1: 5.000000
otel-collector_1  | ExplicitBounds #2: 10.000000
otel-collector_1  | ExplicitBounds #3: 25.000000
otel-collector_1  | ExplicitBounds #4: 50.000000
otel-collector_1  | ExplicitBounds #5: 75.000000
otel-collector_1  | ExplicitBounds #6: 100.000000
otel-collector_1  | ExplicitBounds #7: 250.000000
otel-collector_1  | ExplicitBounds #8: 500.000000
otel-collector_1  | ExplicitBounds #9: 1000.000000
otel-collector_1  | Buckets #0, Count: 0
otel-collector_1  | Buckets #1, Count: 8816
otel-collector_1  | Buckets #2, Count: 1
otel-collector_1  | Buckets #3, Count: 0
otel-collector_1  | Buckets #4, Count: 0
otel-collector_1  | Buckets #5, Count: 0
otel-collector_1  | Buckets #6, Count: 0
otel-collector_1  | Buckets #7, Count: 0
otel-collector_1  | Buckets #8, Count: 0
otel-collector_1  | Buckets #9, Count: 0
otel-collector_1  | Buckets #10, Count: 0

Additional context

No response

@Narcolapser Narcolapser added bug Something isn't working needs triage New item requiring triage labels Dec 5, 2022

github-actions bot commented Dec 6, 2022

Pinging code owners for exporter/influxdb: @jacobmarble. See Adding Labels via Comments if you do not have permissions to add labels yourself.


github-actions bot commented Feb 6, 2023

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@jacobmarble
Contributor

I'm working on this.
