fix(gcp_stackdriver_metrics): Fixes GcpSeries serialization format without breaking config. #16394
Conversation
✅ Deploy Preview for vrl-playground canceled.
✅ Deploy Preview for vector-project canceled.
Regression Test Results

Run ID: fe39bd29-7140-4674-bab7-bf2fddd0eda1

Explanation: A regression test is an integrated performance test for Vector. The table below, if present, lists those experiments that have experienced a statistically significant change in their mean optimization goal performance between baseline and comparison SHAs with 90.00% confidence, or have been detected as newly erratic.

Fine details of change detection per experiment.
I'd like to have a test for this to make sure we are creating the expected JSON. Would it be possible to extract the creation of `gcp::GcpSeries` into a separate function, and then add a test that asserts it converts to the correct JSON?
@StephenWakely, I've added a test that would detect any future regressions of this sort (e.g. where we accidentally change the serialized output while making another change). Is this what you were looking for?
Perfect!
Regression Detector Results

Run ID: be533103-f6dd-4200-8662-2c3085440eef

Explanation: A regression test is an integrated performance test for Vector. The table below, if present, lists those experiments that have experienced a statistically significant change in mean optimization goal performance between baseline and comparison SHAs with 90.00% confidence, or have been detected as newly erratic. Negative values mean that baseline is faster, positive that comparison is. Results that do not exhibit more than a ±5.00% change in their mean optimization goal are discarded. An experiment is erratic if its coefficient of variation is greater than 0.1. The abbreviated table will be omitted if no interesting change is observed.

No interesting changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%.

Fine details of change detection per experiment.
CI is failing because of a formatting check.
Whoops! Sorry, I forgot that! I have pushed a commit which fixes the formatting. Thanks!
Regression Detector Results

Run ID: 2651b6e0-ae13-42a1-9404-80e9f5154dfd

Explanation: A regression test is an integrated performance test for Vector. The table below, if present, lists those experiments that have experienced a statistically significant change in mean optimization goal performance between baseline and comparison SHAs with 90.00% confidence, or have been detected as newly erratic. Negative values mean that baseline is faster, positive that comparison is. Results that do not exhibit more than a ±5.00% change in their mean optimization goal are discarded. An experiment is erratic if its coefficient of variation is greater than 0.1. The abbreviated table will be omitted if no interesting change is observed.

Changes in experiment optimization goals with confidence ≥ 90.00% and |Δ mean %| ≥ 5.00%:

Fine details of change detection per experiment.
@jasonahills many thanks!