fix server's request header memory leak #289
Conversation
Signed-off-by: reed-lau <geoliuwei@gmail.com>
Yes, it may be released somewhere else, but the current state is that no one releases it, which causes a memory leak on the server side; the amount is one sizeof(rmw_request_id_t) per request. We could also add an interface to reproduce the memory leak, something like …
Please provide the exact patch you are using.
diff --git a/demo_nodes_py/demo_nodes_py/services/add_two_ints_client.py b/demo_nodes_py/demo_nodes_py/services/add_two_ints_client.py
index 3535846..c708ffc 100644
--- a/demo_nodes_py/demo_nodes_py/services/add_two_ints_client.py
+++ b/demo_nodes_py/demo_nodes_py/services/add_two_ints_client.py
@@ -25,15 +25,16 @@ def main(args=None):
     cli = node.create_client(AddTwoInts, 'add_two_ints')
     while not cli.wait_for_service(timeout_sec=1.0):
         print('service not available, waiting again...')
-    req = AddTwoInts.Request()
-    req.a = 2
-    req.b = 3
-    future = cli.call_async(req)
-    rclpy.spin_until_future_complete(node, future)
-    if future.result() is not None:
-        node.get_logger().info('Result of add_two_ints: %d' % future.result().sum)
-    else:
-        node.get_logger().error('Exception while calling service: %r' % future.exception())
+    while True:
+        req = AddTwoInts.Request()
+        req.a = 2
+        req.b = 3
+        future = cli.call_async(req)
+        rclpy.spin_until_future_complete(node, future)
+        if future.result() is not None:
+            node.get_logger().info('Result of add_two_ints: %d' % future.result().sum)
+        else:
+            node.get_logger().error('Exception while calling service: %r' % future.exception())
     node.destroy_node()
     rclpy.shutdown()
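The leak accumulates in the server process, so the patched client above needs a running AddTwoInts server to exercise it. A minimal sketch roughly equivalent to the demo's add_two_ints_server, assuming the AddTwoInts service type from example_interfaces as used by the demo packages:

# Minimal rclpy AddTwoInts server (sketch): the counterpart of the looping
# client patch above. Per this PR's description, before the fix every request
# handled here left one rmw_request_id_t header allocated in the C extension.
import rclpy
from example_interfaces.srv import AddTwoInts  # assumption: same type as the demos use


def handle_add_two_ints(request, response):
    # The request header taken under the hood to identify this request is
    # what the patch frees once the response has been sent.
    response.sum = request.a + request.b
    return response


def main(args=None):
    rclpy.init(args=args)
    node = rclpy.create_node('add_two_ints_server')
    node.create_service(AddTwoInts, 'add_two_ints', handle_add_two_ints)
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()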
I can confirm the increasing memory usage over time (I would recommend using …).
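One way to watch the server process's resident memory while the patched client loops is a small polling script; this is only an illustration, assuming the third-party psutil package is installed, and any process monitor would do just as well:

# Print the resident set size (RSS) of a process once per second.
# Usage (hypothetical): python3 watch_rss.py <pid-of-the-service-server>
import sys
import time

import psutil  # assumption: installed separately, e.g. via pip


def main():
    proc = psutil.Process(int(sys.argv[1]))
    while True:
        rss_mib = proc.memory_info().rss / (1024 * 1024)
        print('RSS: %.2f MiB' % rss_mib, flush=True)
        time.sleep(1.0)


if __name__ == '__main__':
    main()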
#302 is similar, and it would be good to address that one first to make sure the simple cases don't show increasing memory usage.
As suspected, the fix for #302 seems to also address the memory increase reported in this ticket. Please try the patch from ros2/rcl#418 to check that it addresses the problem for you too. Since it does so for me, and since this patch was not mergeable, I will go ahead and close the ticket for now. Please feel free to continue commenting with your findings.
In ROS 2's server-client mode: the client calls send_request to the server, then the server calls take_request and send_response back to the client; after the client calls take_response, the RPC is finished.

On the server side, take_request creates a header (of type rmw_request_id_t; in fact PyMem_Malloc is called) which is used to identify the request. Conceptually, the header should be destroyed after send_response. The official implementation forgets to PyMem_Free it, which causes the memory leak.

Signed-off-by: reed-lau <geoliuwei@gmail.com>