Memory Leak with Java 21 #9805
Comments
Hi @anuragagarwal561994, do "before" and "after" refer to before and after Java 21, or before and after the OTel Java agent?
@trask this is before and after the OTel agent on Java 21 when using virtual threads. Without virtual threads, things look like the second graph.
@anuragagarwal561994 It is unlikely that we will be able to help you based on just the information you have provided. If you are looking for a speedy resolution, please provide a sample app that reproduces the issue, or use the tools available to you (heap dumps, JFR, whatever) to figure out which objects are being leaked or what is causing the direct byte buffers to accumulate.
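(For reference, a heap dump can be captured with `jcmd <pid> GC.heap_dump <file>` or programmatically. Below is a minimal sketch using the HotSpot diagnostic MXBean; the class name and output path are illustrative, not part of the original report.)

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

// Minimal sketch: trigger a heap dump from inside the JVM so the accumulating
// objects (and whatever retains the direct byte buffers) can be inspected in
// a tool such as Eclipse MAT. The output path is illustrative.
public class HeapDumper {
    public static void main(String[] args) throws Exception {
        HotSpotDiagnosticMXBean diagnostics = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // "true" dumps only live objects, which forces a full GC first
        diagnostics.dumpHeap("/tmp/app-heap.hprof", true);
    }
}
```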
@laurit we are trying to figure it out ourselves and will be able to provide more information soon. I just wanted to check whether the agent has been benchmarked with Java 21 and virtual-thread scenarios; if not, we can try building that use case. For now we are disabling all instrumentations and enabling them one by one to identify what may be leaking.
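(For reference, individual instrumentations can be toggled with the agent's suppression properties. A hedged sketch of this bisection approach is shown below; the jar names and the specific instrumentation names are examples only.)

```
# Hedged sketch: disable all instrumentations by default, then re-enable them
# in halves to bisect. Names follow the agent's
# -Dotel.instrumentation.<name>.enabled convention; the list is illustrative.
java -javaagent:opentelemetry-javaagent.jar \
     -Dotel.instrumentation.common.default-enabled=false \
     -Dotel.instrumentation.tomcat.enabled=true \
     -Dotel.instrumentation.grpc.enabled=true \
     -Dotel.instrumentation.executors.enabled=true \
     -jar app.jar
```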
@trask @laurit we have found the instrumentation causing the issue, and it actually makes sense.
These are the instrumentations being applied in our application; we binary-searched by disabling instrumentations and found the culprit.
We are getting repeated error messages like this when we enable debug logs for OTel. This instrumentation is not virtual-thread compatible and is hence creating issues. Let me know if more details are needed, but I think this can be a good starting point for reproducing the issue.
Can you try with the latest version (1.31.0), which includes #9616?
@anuragagarwal561994 knowing that the issue has something to do with executor instrumentation is a start, but it really isn't all that useful. I still think that examining the heap dump could give you more information. The log line about grpc that you pasted might not be related to this issue at all.
Yes @laurit, I actually checked but forgot to update here. We did a binary search over the instrumentations in use and identified that the executors instrumentation is the one causing the issue. We are now checking whether everything looks fine with the latest version; it will take 1-2 days to give further updates.
Describe the bug
We recently migrated to Java 21 and tried the virtual thread executor provided by Tomcat v10.
We fine-tuned the application to get things working, but saw ever-increasing old generation space with a 3.2 MB/s promotion rate, while in general our application has a promotion rate of roughly 200-500 KB/s.
We tried disabling multiple components such as logging and further fine-tuning, but could not identify any cause; only after removing the OTel agent did everything start working fine.
I have not been able to identify the root cause of the issue, but I could observe two things:
Steps to reproduce
Our application is serving around 2K qps with 2 GB/s allocation rate.
Java 21
Tomcat v10 with the virtual thread executor (a setup sketch follows below)
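For context, here is a minimal sketch of the kind of setup under which the leak was observed, assuming embedded Tomcat; the real application may wire the executor differently (e.g. via server.xml or a framework customizer), so treat this as illustrative only.

```java
import java.util.concurrent.Executors;
import org.apache.catalina.startup.Tomcat;

// Minimal sketch: embedded Tomcat handling requests on virtual threads by
// replacing the connector's executor with a virtual-thread-per-task executor.
public class VirtualThreadTomcat {
    public static void main(String[] args) throws Exception {
        Tomcat tomcat = new Tomcat();
        tomcat.setPort(8080);
        // Run request processing on virtual threads (Java 21)
        tomcat.getConnector().getProtocolHandler()
              .setExecutor(Executors.newVirtualThreadPerTaskExecutor());
        tomcat.start();
        tomcat.getServer().await();
    }
}
```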
Expected behavior
Memory should not increase unboundedly
Actual behavior
Direct buffer size is increasing unboundedly (see the monitoring sketch below)
Old Gen space increasing unboundedly
Promotion rate increased
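As a way to confirm the direct-buffer growth from inside the app, here is a minimal sketch that reads the JVM's buffer pool MXBeans; the class name is illustrative, and the same numbers are also exposed via JMX under java.nio:type=BufferPool.

```java
import java.lang.management.BufferPoolMXBean;
import java.lang.management.ManagementFactory;

// Minimal sketch: print buffer pool usage so unbounded growth of direct
// buffers can be tracked over time.
public class BufferPoolStats {
    public static void main(String[] args) {
        for (BufferPoolMXBean pool :
                ManagementFactory.getPlatformMXBeans(BufferPoolMXBean.class)) {
            System.out.printf("%s: count=%d, used=%d bytes, capacity=%d bytes%n",
                    pool.getName(), pool.getCount(),
                    pool.getMemoryUsed(), pool.getTotalCapacity());
        }
    }
}
```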
Javaagent or library instrumentation version
1.28
Environment
openjdk version "21" 2023-09-19 LTS
OpenJDK Runtime Environment Temurin-21+35 (build 21+35-LTS)
OpenJDK 64-Bit Server VM Temurin-21+35 (build 21+35-LTS, mixed mode, sharing)
Linux 5.15.120+ SMP Sat Sep 30 10:17:59 UTC 2023 x86_64 Linux
Additional context
Key libraries we use include:
kafka
apache http async client
grpc
tomcat