
[PROF-10124] Lower value used to clamp very high allocation profiling weights #3793

Merged: ivoanjo merged 1 commit into master on Jul 24, 2024

Conversation

@ivoanjo (Member) commented on Jul 18, 2024

What does this PR do?

This PR lowers the profiler's internal constant MAX_ALLOC_WEIGHT from 65535 to 10000.

This value is used when clamping very high allocation profiling weights.

In other words: each allocation sample taken by the profiler has a weight assigned to it, and this weight is clamped (i.e. limited) to the maximum value. The remaining weight is assigned to a separate "skipped samples" placeholder (see #3792).
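
As a rough illustration of the mechanism described above (this is not the actual dd-trace-rb implementation; the function and struct names are hypothetical, and only the MAX_ALLOC_WEIGHT value comes from this PR):

```c
#include <stdint.h>

// Value lowered from 65535 to 10000 by this PR.
#define MAX_ALLOC_WEIGHT 10000

typedef struct {
  uint64_t sampled_weight;  // weight attributed to the sampled allocation
  uint64_t skipped_weight;  // remainder attributed to the "skipped samples" placeholder (#3792)
} clamped_weight;

// Clamp a raw allocation weight: anything above MAX_ALLOC_WEIGHT is split off
// into the skipped-samples bucket instead of inflating a single sample.
static clamped_weight clamp_allocation_weight(uint64_t raw_weight) {
  clamped_weight result;
  if (raw_weight > MAX_ALLOC_WEIGHT) {
    result.sampled_weight = MAX_ALLOC_WEIGHT;
    result.skipped_weight = raw_weight - MAX_ALLOC_WEIGHT;
  } else {
    result.sampled_weight = raw_weight;
    result.skipped_weight = 0;
  }
  return result;
}
```

Lowering MAX_ALLOC_WEIGHT shifts more of a very large raw weight into the skipped-samples placeholder, in exchange for less per-sample bias.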

Motivation:

Very large weights on individual samples can produce biased results; by lowering the maximum weight, we reduce the worst-case bias that any single sample can introduce.

Additional Notes:

I've gathered data from a number of apps while choosing this value; see https://docs.google.com/document/d/1lWLB714wlLBBq6T4xZyAc4a5wtWhSmr4-hgiPKeErlA/edit for details (Datadog-only link, sorry!).

How to test the change?

The tests for #3792 also cover this change (although the changes are otherwise independent).

@ivoanjo ivoanjo requested a review from a team as a code owner July 18, 2024 10:43
@github-actions github-actions bot added the profiling Involves Datadog profiling label Jul 18, 2024
@pr-commenter bot commented on Jul 18, 2024

Benchmarks

Benchmark execution time: 2024-07-18 10:52:23

Comparing candidate commit 4cd8e7b in PR branch ivoanjo/prof-10124-lower-max-alloc-weight with baseline commit f6ec4ac in branch master.

Found 0 performance improvements and 0 performance regressions! Performance is the same for 10 metrics, 2 unstable metrics.

@codecov-commenter

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.91%. Comparing base (f6ec4ac) to head (4cd8e7b).

Additional details and impacted files
@@           Coverage Diff           @@
##           master    #3793   +/-   ##
=======================================
  Coverage   97.91%   97.91%           
=======================================
  Files        1246     1246           
  Lines       75037    75037           
  Branches     3629     3629           
=======================================
+ Hits        73473    73474    +1     
+ Misses       1564     1563    -1     


@ivoanjo ivoanjo merged commit af8ee95 into master Jul 24, 2024
171 checks passed
@ivoanjo ivoanjo deleted the ivoanjo/prof-10124-lower-max-alloc-weight branch July 24, 2024 08:48
@github-actions github-actions bot added this to the 2.3.0 milestone Jul 24, 2024
@TonyCTHsu TonyCTHsu mentioned this pull request Aug 22, 2024