kindling-agent was OOM-killed frequently #352
Comments
Could you provide the log file? It contains some self-monitoring metrics that help with troubleshooting.
The kindling-agent's log?
We need the stdout content; the self-monitoring logs are only written to stdout.
How many applications are running on this node, and what order of magnitude is the request volume?
There are currently 124 pods on the node.
We'd like to look at the self-monitoring metrics. They are written to stdout, not agent.log; that is, use kubectl logs.
Logs were provided at the top of the issue; there were just too many, so part of them was removed: kubectl logs -n kindling kindling-agent-xgr2k | grep -v times_tota | more
😂 That removed exactly the part we need. Could you redirect the output to a file and send that?
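(A minimal way to capture the full stdout, reusing the pod name from the comment above; pod names will differ per cluster:)

```bash
# Capture the agent pod's complete stdout, including the self-monitoring
# metric lines that the earlier `grep -v` filtered out
kubectl logs -n kindling kindling-agent-xgr2k > kindling-agent-stdout.log 2>&1
```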
I found the kindling pods restarted due to readiness probe failures.
It won't restart if only the readiness probe fails. It restarts because the kernel is not supported. You should compile your own probe for your kernel version.
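As a sketch of the usual prerequisite for building a probe against the running kernel (the exact build steps are in the kindling repository; apt shown here, assuming a Debian/Ubuntu node):

```bash
# Identify the node's kernel version
uname -r
# Install kernel headers matching the running kernel, which the probe
# build requires (use kernel-devel via yum/dnf on RHEL/CentOS nodes)
sudo apt-get install -y linux-headers-$(uname -r)
```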
I have the same problem; the kindling-agent is always killed by OOM. Could there be a memory leak?
Thanks for the feedback. We are still working on this issue. You can reset the environment variable
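(The variable's name was not captured here. As a generic sketch, an environment variable on the agent DaemonSet can be changed with kubectl set env; SOME_VARIABLE below is a placeholder, not the actual setting, and the DaemonSet name is inferred from the pod name above:)

```bash
# SOME_VARIABLE is a placeholder; substitute the variable the maintainers named
kubectl set env daemonset/kindling-agent -n kindling SOME_VARIABLE=<value>
```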
Describe the bug
A clear and concise description of what the bug is.
kindling-agent is OOM-killed frequently; the memory limit was raised to 5Gi and it still gets OOM-killed.
This problem occurs mainly on physical (bare-metal) machines.
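(Raising the limit was presumably done along these lines; a sketch, with the DaemonSet name inferred from the pod name earlier in the thread:)

```bash
# Raise the agent's memory limit to the 5Gi described above
kubectl set resources daemonset/kindling-agent -n kindling --limits=memory=5Gi
```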
How to reproduce?
Steps to reproduce the behavior.
What did you expect to see?
A clear and concise description of what you expected to happen.
What did you see instead?
A clear and concise description of what you saw instead.
Screenshots
If applicable, add screenshots to help explain your problem.
What config did you use?
Config: (e.g. the yaml config file)
Logs
Please attach the logs by running the following command:
Environment (please complete the following information)
Additional context
Add any other context about the problem here, like application protocol.