Excessive memory usage (version 1.3.0, nginx 1.22.1) #108

Open
dup2 opened this issue Jul 12, 2023 · 5 comments



dup2 commented Jul 12, 2023

We use nginx 1.22.1 with module version 1.3.0 and see excessive memory usage.

We create zip files from local files only, based on a manifest without CRC checksums.
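
For reference, assuming this is the mod_zip manifest format, a manifest without CRC checksums looks roughly like the sketch below (the CRC column is set to "-" so no checksum is supplied; the /files/ location and file names are placeholders, not our real data):

- 1048576 /files/report.pdf report.pdf
- 524288 /files/image-01.png image-01.png
- 2097152 /files/video.mp4 video.mp4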

For a small number of files (e.g. 3 files, around 11 GB in total) there is no problem; memory usage stays almost constant.

For a larger number of files (e.g. 200 files, around 2 GB in total) the memory usage spikes to several hundred MB, and sometimes even a few GB. The memory usage seems to be related to the number of files.

We are aware of #67 but this does not seem to fix the issue.

nginx information

nginx version: nginx/1.22.1
built by gcc 11.2.0 (Ubuntu 11.2.0-19ubuntu1) 
built with OpenSSL 3.0.2 15 Mar 2022
TLS SNI support enabled

This is running on Ubuntu 22.04


dup2 commented Jul 27, 2023

Any feedback on this? Or can someone explain to me how to debug this?


dup2 commented Jul 28, 2023

More tests reveal a more accurate picture (using top to check the memory of the nginx worker, looking at VIRT and RES):

File information | Total size | VIRT | RES | Comment
--- | --- | --- | --- | ---
75 x 3.8 MB | 285 MB | 565 MB | 288 MB | RES is about the total size
75 x 7.5 MB | 562 MB | 846 MB | 571 MB | RES is about the total size
75 x 75 MB | 5.6 GB | 318 MB .. 508 MB .. 641 MB .. 812 MB | 65 MB .. 233 MB .. 365 MB .. 537 MB | Initial .. after 15% .. after 25% .. after 35%
5 x 750 MB | 3.7 GB | 299 MB .. 321 MB .. 343 MB | 26 MB .. 49 MB .. 71 MB | 1st file .. 2nd file .. 3rd file
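
For context, the VIRT/RES numbers above can be sampled from a shell while a download is in progress with something like the sketch below (the pgrep pattern and the one-second interval are assumptions, not part of the original measurement):

# pick one nginx worker and print its VSZ/RSS once per second until it exits
pid=$(pgrep -f 'nginx: worker process' | head -n 1)
while kill -0 "$pid" 2>/dev/null; do ps -o pid,vsz,rss,comm -p "$pid"; sleep 1; done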

So it seems that for smaller files the RSS usage matches the total size, while for medium and large files around 20 MB of RSS is used per file (memory usage increases as streaming progresses).

This means that for a few large files it works with reasonable memory usage, but for a large number of files it does not scale, because the memory usage grows per file.


nahuel commented Jul 30, 2023

What was the behavior before the patch made in #67? Is it worse after it?


dup2 commented Jul 30, 2023

We see no difference in our tests between 1.2 and 1.3. Since the issue seems to depend on the number of files, we suspect the subrequests are using up memory - for example, that the result of each subrequest is stored in a memory buffer that appears to be capped at around 20 MB.

The mentioned #67 seems to affect only use cases with a huge number of files.

@herclogon

We have the same issue. We created 10,000 files of 1 MB each:

for i in {1..10000}; do dd if=/dev/random of=$i.dat bs=1M count=1; echo $i; done

256 MB of RAM is not enough to download the archive; memory usage keeps increasing until the worker is OOM-killed.
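
To turn those generated files into a reproducible request, a manifest covering all of them can be emitted with a sketch like the one below (assuming the files are served under a /files/ location and that "-" is accepted in place of an unknown CRC-32; manifest.txt is just a placeholder name):

# hypothetical helper: one manifest line per generated file, with "-" instead of a CRC-32
for i in {1..10000}; do
  printf -- '- %s /files/%s.dat %s.dat\n' "$(stat -c%s "$i.dat")" "$i" "$i"
done > manifest.txt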
