gax: Excessive byte[] allocations with large read stream requests #3621
Labels
priority: p3
Desirable enhancement or fix. May not be included in next release.
type: bug
Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
Originally opened as googleapis/java-bigtable#2480, then I noticed it wasn't the right repo, so I'm moving it here for a fix.
Hi, recently we've been using the BigtableDataClient to send large requests through the `readRows` streaming method. With some large requests our CPUs spike to 100%, and profiling showed it was mostly the GC running. An allocation profile showed the following:

(Screenshot: allocation profile showing heavy byte[] allocation; original image link has expired.)

The allocations come from `com.google.api.gax.rpc.StateCheckingResponseObserver#onResponse`. The JVM commonly compiles string concatenation into a `StringBuilder`, so a large share of the allocations come from this expression:

`getClass() + " received a response after being closed."`

even though the exception is never actually thrown on this path. Eagerly building the error message on every response is the problem; it should only be built when the exception is actually going to be thrown.
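To illustrate the difference, here is a minimal sketch (not the actual gax source; class and method names are made up for this demo). A counter stands in for the allocation cost, showing that the eager pattern builds the message on every call while the lazy pattern builds it only on the failure path:

```java
// Hypothetical sketch of eager vs. lazy error-message construction.
public class LazyMessageDemo {
    // Stands in for the allocation cost of the string concatenation.
    static int messagesBuilt = 0;

    static String buildMessage(Class<?> cls) {
        messagesBuilt++;
        return cls + " received a response after being closed.";
    }

    // Eager pattern: the message is concatenated before the check,
    // so it is allocated on every response even when nothing is thrown.
    static void onResponseEager(boolean isClosed, Object response) {
        String msg = buildMessage(LazyMessageDemo.class);
        if (isClosed) throw new IllegalStateException(msg);
        // ... deliver response downstream ...
    }

    // Lazy pattern: concatenate only on the failure path, which is
    // never taken for a healthy stream.
    static void onResponseLazy(boolean isClosed, Object response) {
        if (isClosed) {
            throw new IllegalStateException(buildMessage(LazyMessageDemo.class));
        }
        // ... deliver response downstream ...
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1_000; i++) onResponseEager(false, "row");
        int eager = messagesBuilt;
        messagesBuilt = 0;
        for (int i = 0; i < 1_000; i++) onResponseLazy(false, "row");
        int lazy = messagesBuilt;
        System.out.println("eager=" + eager + " lazy=" + lazy); // eager=1000 lazy=0
    }
}
```

For a stream delivering millions of rows, the lazy version removes one string allocation per response on the happy path; Guava's `Preconditions.checkState` also offers template-style overloads that defer message formatting for the same reason.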
Environment details
Steps to reproduce
Code example
Stack trace
See the screenshot above
External references such as API reference guides
N/A
Any additional information below
N/A