
Add discardResponseMessage option for gRPC client #3820

Merged

Conversation

lzakharov
Contributor

@lzakharov lzakharov commented Jul 1, 2024

What?

Adds a new gRPC client option, discardResponseMessage, which is the gRPC counterpart of the HTTP discardResponseBodies option.
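
A sketch of how the option might be used from a k6 test script. The proto file, service name, address, and payload below are hypothetical placeholders, not part of this PR:

```javascript
import grpc from 'k6/net/grpc';
import { check } from 'k6';

const client = new grpc.Client();
// Hypothetical proto definition; substitute your own service.
client.load(['definitions'], 'route_guide.proto');

export default () => {
  client.connect('localhost:10000', { plaintext: true });

  // With discardResponseMessage, the client skips converting the response
  // message into a JS object; only the status and metadata remain usable.
  const response = client.invoke(
    'main.RouteGuide/GetFeature',
    { latitude: 410248224, longitude: -747127767 },
    { discardResponseMessage: true }
  );

  check(response, { 'status is OK': (r) => r.status === grpc.StatusOK });
  client.close();
};
```

Since the message is discarded, checks should rely on the response status rather than on response.message.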

Why?

Parsing large responses can result in high CPU and memory usage.

Checklist

  • I have performed a self-review of my code.
  • I have added tests for my changes.
  • I have run the linter locally (make lint) and all checks pass.
  • I have run the tests locally (make tests) and all tests pass.
  • I have commented on my code, particularly in hard-to-understand areas.

Related PR(s)/Issue(s)

#2497

@CLAassistant

CLAassistant commented Jul 1, 2024

CLA assistant check
All committers have signed the CLA.

@lzakharov lzakharov marked this pull request as ready for review July 1, 2024 17:07
@lzakharov lzakharov requested a review from a team as a code owner July 1, 2024 17:07
@lzakharov lzakharov requested review from mstoykov and codebien and removed request for a team July 1, 2024 17:07
Collaborator

@codebien codebien left a comment


Hi @lzakharov,
sorry for the late review. Thanks for your contribution 🙇

Did you run the profiler to make sure it reduces the memory as expected?

After looking into the gRPC library a bit, I assume we are not actually discarding the response body the way we do for HTTP.
With the suggested code we are mostly just skipping the JSON conversion, so it might not help much with memory usage; the impact is likely bigger on CPU.

@lzakharov
Contributor Author

lzakharov commented Jul 10, 2024

Hi, thanks for your review comment!

By reducing memory consumption, I meant avoiding the allocations needed to convert the parsed message into a JSON object. I don't yet know how to fully skip a message on the gRPC library side; that needs more research. My main focus was CPU consumption, since that is the problem we encountered in our tests.

@lzakharov
Contributor Author

lzakharov commented Jul 13, 2024

After some research, I discovered that emptypb.Empty can be used as the response message type to reduce memory consumption. The response data still has to be read into a bytes buffer here, and I haven't found a way to optimize that. Even so, the current solution shows a noticeable improvement in performance.

To compare performance, I modified the gRPC server from the testutils package to include an RPC method that returns a list of features for a specified rectangle (with the same logic as in ListFeatures).

Without the discardResponseMessage option:

k6 ➜  /usr/bin/time -l -h ./k6 run --vus 100 --iterations 10000 examples/grpc_invoke.js
...
        3.54s real              10.84s user             1.44s sys
           190300160  maximum resident set size
                   0  average shared memory size
                   0  average unshared data size
                   0  average unshared stack size
               10594  page reclaims
                1562  page faults
                   0  swaps
                   0  block input operations
                   0  block output operations
               49785  messages sent
               28878  messages received
                4900  signals received
                1661  voluntary context switches
               83331  involuntary context switches
         90855390651  instructions retired
         35639318415  cycles elapsed
           164775808  peak memory footprint

With the discardResponseMessage option:

k6 ➜  /usr/bin/time -l -h ./k6 run --vus 100 --iterations 10000 examples/grpc_invoke_discard.js
...
        3.89s real              4.71s user              1.89s sys
           137052160  maximum resident set size
                   0  average shared memory size
                   0  average unshared data size
                   0  average unshared stack size
                8839  page reclaims
                   2  page faults
                   0  swaps
                   0  block input operations
                   0  block output operations
               51883  messages sent
               29678  messages received
                1760  signals received
                 516  voluntary context switches
              177455  involuntary context switches
         31404551635  instructions retired
         19073852897  cycles elapsed
           111707968  peak memory footprint

@lzakharov lzakharov requested a review from codebien July 15, 2024 18:05
Collaborator

@codebien codebien left a comment


Hey @lzakharov,
the results are quite interesting! Thanks for taking care of it.

If we can find an alternative to the current global variable, we are fine with merging your proposal. 👍

It would also be helpful if you could do something similar for gRPC streaming. You can split that into a second pull request.

lib/netext/grpcext/conn.go (outdated review thread, resolved)
@lzakharov
Contributor Author

It would also be helpful if you could do something similar for gRPC streaming. You can split that into a second pull request.

I'll try to add it in the next PR.

@lzakharov lzakharov requested a review from codebien July 25, 2024 10:20
Collaborator

@codebien codebien left a comment


LGTM

@codebien codebien added this to the v0.54.0 milestone Jul 30, 2024
Collaborator

@mstoykov mstoykov left a comment


LGTM!

Thanks for the contribution @lzakharov 🙇

@codebien codebien merged commit eaa5419 into grafana:master Aug 21, 2024
23 checks passed