Darknet-ros process died on Jetson TX1 #124
Small update related to the issue - https://answers.ros.org/question/220502/image-subscriber-lag-despite-queue-1/
How did you solve this issue?
I am still trying to figure out a way to solve the issue. Since the implementation uses an image_transport subscriber in C++, there is no buff_size argument in roscpp like there is in rospy.
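The lag described in the linked ROS Answers post comes from messages backing up behind a callback that is slower than the publisher: with a deep queue, the callback keeps processing stale frames. A minimal, ROS-free Python sketch (all names are illustrative, not from darknet_ros) of why a depth-1, drop-oldest buffer keeps the consumer working on the latest frame:

```python
from collections import deque

# A depth-1 buffer: appending when full silently drops the oldest item,
# mimicking a subscriber with queue_size=1 that only ever sees the newest frame.
latest = deque(maxlen=1)

# The producer publishes frames faster than the consumer can drain them.
for frame_id in range(10):
    latest.append(frame_id)

# Only the most recent frame remains, so processing latency stays bounded.
assert list(latest) == [9]
```

In rospy the analogous fix pairs `queue_size=1` with a large `buff_size`; in roscpp the queue depth is the argument to the subscribe call, which is why the buffer-size knob discussed in that post has no direct equivalent here.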
Hi @mbjelonic, I tried to solve the issue using AlexeyAB's darknet fork and a different multithreading implementation, and it is resolved for me. I have pushed the code here - https://github.com/pushkalkatara/darknet_ros.
Besides the "malloc" and "free" problems, I also ran into this issue, and I am quite confused by it. I am running this package on a TX2 with an Intel RealSense D415 camera.
I am running YOLOv3 on a Jetson TX1, but after loading the model into memory and processing some frames from the image topic, the process dies.
I noticed in tegrastats that memory usage increases constantly. I also added a 16 GB swap file, but the swap fills up over time and the process dies.
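A quick way to confirm the monotonic growth seen in tegrastats from inside the process is to sample its peak resident set between frames. A minimal Python sketch (the per-frame "leak" is a stand-in for whatever the node allocates and never releases; all names are illustrative):

```python
import resource

def peak_rss_kb():
    # On Linux, ru_maxrss reports the process's peak resident set in kilobytes.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

samples = []
leaked = []  # stands in for per-frame allocations that are never freed
for frame in range(5):
    leaked.append(bytearray(1024 * 1024))  # ~1 MiB retained per "frame"
    samples.append(peak_rss_kb())

# Peak RSS never decreases; a steady rise across frames points to a per-frame leak.
assert samples == sorted(samples)
```

If the samples climb by a roughly constant amount per frame, the leak is tied to the image callback path rather than to model loading.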
Some possible causes I thought of:
The issue is somehow related to ROS, since the same model with the YOLOv3 config file ran successfully without ROS.