Memory Leak? #334
Hi there,

I'm using the Docker image ghcr.io/xonshiz/comic-dl:latest, and when I use it to grab more than one chapter at a time, memory usage keeps increasing until the host machine either runs out of memory and crashes or the container exits. For now I've set a 2000 MB limit on the container using the --memory flag, but that limit is reached after about half an hour.

Here's the docker command:

docker run -d --name comic-dl --memory=2000m -v /c/Users/Username/ComicDL:/directory:rw -w /directory ghcr.io/xonshiz/comic-dl:latest_linux_amd64 comic_dl -dd /directory --auto

and here's the config.json I'm using:

I had to add the "cookie": "None" entry myself, or it would refuse to run at all.
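For illustration only, a config.json along the lines described here might be put together as in the sketch below; apart from the "cookie" entry mentioned above, every key and value is a hypothetical placeholder, not comic-dl's documented schema or the reporter's actual file:

```python
# Hypothetical sketch of the kind of config.json being described. Apart from
# the "cookie": "None" workaround from the report, every key and value here
# is a placeholder guess, not comic-dl's documented schema.
import json

config = {
    "cookie": "None",  # the entry the reporter had to add by hand
    "download_directory": "/directory",  # hypothetical: download target inside the container
    "comics": {
        # hypothetical: series URLs that --auto mode would iterate over
        "https://fanfox.net/manga/example_title/": {}
    },
}

with open("config.json", "w", encoding="utf-8") as f:
    json.dump(config, f, indent=2)
```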
Thanks for the help; even with the memory leak I'm slowly but steadily getting things done.

EDIT: Added a screenshot.
Comments

That is an interesting find. I'm not quite sure where exactly it might be doing this, but do you see this issue with just "fanfox", or is it happening with other websites too?

Oh, and that's while downloading the actual images only, not while scraping the page.
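One generic way to narrow down where the growth happens is Python's built-in tracemalloc, comparing heap snapshots between chapters. This is only a sketch: download_chapter is a stand-in for whatever comic-dl actually does per chapter, not a function from the project.

```python
# Generic leak-hunting sketch using the standard-library tracemalloc module.
# download_chapter() is a stand-in, not comic-dl code.
import tracemalloc

def download_chapter(pages: int) -> list[bytes]:
    # Placeholder for the real per-chapter download work.
    return [bytes(64 * 1024) for _ in range(pages)]

tracemalloc.start()
baseline = tracemalloc.take_snapshot()

retained = []  # simulates state accidentally kept alive across chapters
for chapter in range(5):
    retained.append(download_chapter(100))

    snapshot = tracemalloc.take_snapshot()
    print(f"--- after chapter {chapter} ---")
    # Allocation sites whose size keeps climbing run after run are leak suspects.
    for stat in snapshot.compare_to(baseline, "lineno")[:5]:
        print(stat)
```

Run against the real script, the same snapshot-per-chapter pattern would point at the file and line where memory accumulates.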
I tried it with Mangadex as well; same problem. I also updated to the 2023.01.08.2 image (I was using 2023.01.08 when I wrote the issue). I don't know if this will help, but I recorded it when I tested it with Mangadex.

Thanks for sharing this. I'll try this out on my machine this weekend. 👍🏽

Did you have a chance to take a look at this yet?

No, I've been really packed with professional and personal life, and it's been super difficult to find time for this. But I'm planning on rewriting this entire script soon, so I'll try to find some time 🙏🏽