
memory leak when processing h2 protocol requests #2924

Open
maxbear1988 opened this issue Jul 26, 2022 · 5 comments
Labels
C-bug Category: bug. Something is wrong. This is bad!

Comments

@maxbear1988

When hyper processes h2 protocol requests with large bodies, memory keeps increasing. I dumped two jeprof.out.x.x.m2.heap profiles and compared them, and found that _$LT$h2..codec..framed_read..FramedRead$LT$T$GT$$u20$as$u20$futures_core..stream..Stream$GT$::poll_next::hd05c05a51fa46d93 (demangled: <h2::codec::framed_read::FramedRead<T> as futures_core::stream::Stream>::poll_next) allocates most of the memory, and it is never freed.

0.0 0.0% 100.0% 1361.2 89.8% 0x000056341f367654 bytes::bytes_mut::BytesMut::reserve_inner::hfde02843ef3655ca + 132 in section .text
0.0 0.0% 100.0% 1436.7 94.7% 0x000056341e389a87 _$LT$h2..codec..framed_read..FramedRead$LT$T$GT$$u20$as$u20$futures_core..stream..Stream$GT$::poll_next::hd05c05a51fa46d93 + 1687 in section
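
A minimal sketch of a server along these lines, for illustration only (assuming hyper 0.14 with the full feature set and an HTTP/2-only listener; the reporter's actual service is not shown in the thread):

```rust
// Hypothetical reproduction sketch, not the reporter's code: an HTTP/2-only
// hyper server that drains each request body and replies with a small response.
use hyper::service::{make_service_fn, service_fn};
use hyper::{Body, Request, Response, Server};
use std::convert::Infallible;

async fn echo_len(req: Request<Body>) -> Result<Response<Body>, Infallible> {
    // Drain the request body so h2's receive buffers are exercised.
    let body = hyper::body::to_bytes(req.into_body()).await.unwrap_or_default();
    Ok(Response::new(Body::from(format!("received {} bytes", body.len()))))
}

#[tokio::main]
async fn main() {
    let make_svc = make_service_fn(|_conn| async {
        Ok::<_, Infallible>(service_fn(echo_len))
    });

    let addr = ([0, 0, 0, 0], 3000).into();
    // http2_only(true) forces the h2 code path that the heap profiles point at.
    let server = Server::bind(&addr).http2_only(true).serve(make_svc);

    if let Err(e) = server.await {
        eprintln!("server error: {e}");
    }
}
```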

maxbear1988 added the C-bug label on Jul 26, 2022
@lidong14

@seanmonstar

@seanmonstar
Member

Is there any more info that I could use to try to understand this? Code that consistently triggers it?

@maxbear1988
Author

This bytes bug will cause OOM: tokio-rs/bytes#559
@seanmonstar

@maxbear1988
Author

@seanmonstar I keep sending h2 requests with a 6k body from the same client, and memory keeps growing. I used jemalloc to dump a jeprof profile and found that most of the memory is allocated by slab: the buffer slab length in Recv keeps growing and never shrinks.
Memory allocation path:
(gdb) info symbol 0x000055dd9ef1972a
tokio::runtime::task::harness::Harness$LT$T$C$S$GT$::poll::h3052a101c45448e8 + 106 in section .text of target:
(gdb) info symbol 0x000055dd9f9f6167
_$LT$hyper..server..server..new_svc..NewSvcTask$LT$I$C$N$C$S$C$E$C$W$GT$$u20$as$u20$core..future..future..Future$GT$::poll::h67e0aa0b56e6ccce + 311 in section .text of target:
(gdb) info symbol 0x000055dd9f7498c8
_$LT$hyper..server..conn..upgrades..UpgradeableConnection$LT$I$C$S$C$E$GT$$u20$as$u20$core..future..future..Future$GT$::poll::h5970e45e5ae63534 + 136 in section .text of target:
(gdb) info symbol 0x000055dd9edbc94c
_$LT$hyper..proto..h2..server..Server$LT$T$C$S$C$B$C$E$GT$$u20$as$u20$core..future..future..Future$GT$::poll::h3bb6508da049998b + 1244 in section .text of target:
(gdb) info symbol 0x000055dd9f0df006
h2::server::Connection$LT$T$C$B$GT$::poll_accept::h86c2773a2703c70a + 38 in section .text of target:
(gdb) info symbol 0x000055dd9f06122b
h2::proto::connection::Connection$LT$T$C$P$C$B$GT$::poll::hd2b8a11d4cc3b8fc + 7435 in section .text of target:
(gdb) info symbol 0x000055dd9f044516
h2::proto::connection::DynConnection$LT$B$GT$::recv_frame::h35c6093d0834538d + 5558 in section .text of target:
(gdb) info symbol 0x000055dd9f3ff5f3
h2::proto::streams::streams::DynStreams$LT$B$GT$::recv_data::h281cfb64377cb138 + 435 in section .text of target:
(gdb) info symbol 0x000055dd9efaa464
h2::proto::streams::counts::Counts::transition::h5c345d9937d3cc6e + 116 in section .text of target:
(gdb) info symbol 0x000055dd9ff64413
h2::proto::streams::recv::Recv::recv_data::h2c5a5da8673f66ac + 5331 in section .text of target:
(gdb) info symbol 0x000055dd9ffabf6a
_ZN4slab13Slab$LT$T$GT$6insert17h12028a4fbb703ab9E.llvm.11221374846190048674 + 106 in section .text of target:
(gdb) info symbol 0x000055dd9ff7f10e
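
A minimal sketch of the kind of client described above, for illustration only (assuming hyper 0.14; the reporter's actual client, endpoint, and payload are not shown in the thread):

```rust
// Hypothetical client sketch, not the reporter's code: a single HTTP/2 client
// reused for many POST requests, each carrying a ~6 KB body.
use hyper::{Body, Client, Method, Request};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // http2_only(true) keeps every request on the same h2 connection,
    // matching "the same client" in the description above.
    let client = Client::builder().http2_only(true).build_http::<Body>();
    let payload = vec![b'x'; 6 * 1024];

    loop {
        let req = Request::builder()
            .method(Method::POST)
            .uri("http://127.0.0.1:3000/") // assumed local test endpoint
            .body(Body::from(payload.clone()))?;
        let resp = client.request(req).await?;
        // Drain the response body so the connection can be reused.
        hyper::body::to_bytes(resp.into_body()).await?;
    }
}
```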

@gitmalong

@maxbear1988 Are you still facing the issue? I am currently facing a memory leak in my Axum / hyper app, so I am investigating in all directions.
