[Feature Request]: Compressing large folders, for example, compressing the entire D: drive #223
Comments
Hi!
Bit7z cannot automatically separate the files into subsets for you.
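You can, however, split the list yourself and append one batch at a time to the same archive. A minimal sketch of the idea, assuming bit7z v4's API (`BitFileCompressor`, `UpdateMode::Append`); the batch size and how you collect the paths are up to you:

```cpp
#include <bit7z/bitfilecompressor.hpp>

#include <algorithm>
#include <cstddef>
#include <string>
#include <vector>

using namespace bit7z;

// Compress `files` into `archive` in fixed-size batches, appending each
// batch to the archive created by the first call.
void compressInBatches( const std::vector< std::string >& files,
                        const std::string& archive,
                        std::size_t batchSize = 1000 ) {
    Bit7zLibrary lib{ "7z.dll" };
    BitFileCompressor compressor{ lib, BitFormat::SevenZip };
    compressor.setUpdateMode( UpdateMode::Append ); // add to the archive, don't recreate it

    for ( std::size_t i = 0; i < files.size(); i += batchSize ) {
        const auto first = files.begin() + static_cast< std::ptrdiff_t >( i );
        const auto last  = files.begin()
                           + static_cast< std::ptrdiff_t >( std::min( i + batchSize, files.size() ) );
        compressor.compress( std::vector< std::string >( first, last ), archive );
    }
}
```

Each batch still pays the cost of reopening and rewriting the archive, so fewer, larger batches will generally be faster.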
Thank you for your response; I will give it a try.
When I split a large folder into individual files for processing, I find the speed to be very slow. Is this normal, and what can I do to improve the compression speed? My code is like this:
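(The original snippet wasn't preserved in the thread; the approach described, one compression call per file, would look roughly like the following, with `file_list` and the archive name being illustrative:)

```cpp
#include <bit7z/bitfilecompressor.hpp>

#include <string>
#include <vector>

using namespace bit7z;

void compressOneByOne( const std::vector< std::string >& file_list,
                       const std::string& archive ) {
    Bit7zLibrary lib{ "7z.dll" };
    BitFileCompressor compressor{ lib, BitFormat::SevenZip };
    compressor.setUpdateMode( UpdateMode::Append );

    for ( const auto& file : file_list ) {
        // One compression call per file: the archive is reopened,
        // re-read, and rewritten on every iteration.
        compressor.compressFile( file, archive );
    }
}
```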
A compressed package of about 15 GB takes about 5 minutes when file_list contains just the directory itself (about 15 GB), i.e., a single compression call, while it might take 50 minutes using the per-file method above.
A few notes first. As for the performance of the code: yes, the slowness is normal, especially for such a large archive, and because you're doing a lot of single-file append compressions.
In my first reply, I suggested the "update" approach because it was the only way to achieve what you originally described in this issue. Ideally, only one compression should be performed. In the issue, you mentioned that your program terminated because of an error during the compression process. This could be due to an underlying problem with the files being compressed, so I would check that before taking the "update" approach.
When compressing, if there is an exception, an error is thrown and the compression stops. Is there a way to report the error without stopping the compression? I would like the compression to continue and skip the erroneous files.
When compressing a particularly large file (about 200 GB), a stack overflow may occur, causing an abnormal exit. Are there any good ways to handle this?
Unfortunately, no. But I think it could be a useful feature, so I'll definitely add it to the library. Here is what you can do for now:
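(The concrete suggestion wasn't preserved here; one interim approach in that spirit is to compress in small units and catch `BitException` yourself, logging and skipping whatever fails:)

```cpp
#include <bit7z/bitexception.hpp>
#include <bit7z/bitfilecompressor.hpp>

#include <iostream>
#include <string>
#include <vector>

using namespace bit7z;

// A failure now only skips the offending file instead of aborting everything.
void compressSkippingFailures( const std::vector< std::string >& files,
                               const std::string& archive ) {
    Bit7zLibrary lib{ "7z.dll" };
    BitFileCompressor compressor{ lib, BitFormat::SevenZip };
    compressor.setUpdateMode( UpdateMode::Append );

    for ( const auto& file : files ) {
        try {
            compressor.compressFile( file, archive );
        } catch ( const BitException& e ) {
            // Report the failure and keep going with the next file.
            std::cerr << "Skipping \"" << file << "\": " << e.what() << '\n';
        }
    }
}
```

As discussed above, per-file calls are slow, so this trades speed for robustness until the library can skip failing files natively.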
Um, that's weird, I'll try to replicate the problem.
The 7-Zip GUI is OK. I can use try/catch(...) to catch the stack overflow exception, with GetLastError() returning 5, and this is basically reproducible in my tests. I hope to avoid the stack overflow exceptions.
Thanks for the further details. I've already tried a similar configuration: MSVC 2022, x86, same flags, 7-Zip 23.01, and a single ~110 GB file to be compressed as a zip file.
Test code:
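(A sketch of the kind of test described; the exact snippet isn't shown in the thread, and the input path is illustrative:)

```cpp
#include <bit7z/bitexception.hpp>
#include <bit7z/bitfilecompressor.hpp>

#include <iostream>

using namespace bit7z;

int main() {
    try {
        Bit7zLibrary lib{ "7z.dll" };
        BitFileCompressor compressor{ lib, BitFormat::Zip };
        // Single ~110 GB input file, compressed to a zip archive.
        compressor.compressFile( "huge_file.bin", "huge_file.zip" );
    } catch ( const BitException& e ) {
        // BitException derives from std::system_error, so it carries an error code.
        std::cerr << "Compression failed: " << e.what()
                  << " (code " << e.code().value() << ")\n";
        return 1;
    }
    return 0;
}
```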
This code can catch error 3.
It seems that the issue is caused by the recursive call in bit7z's directory-indexing code.
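Replacing the recursion with an explicit, heap-allocated stack avoids unbounded call-stack growth on deeply nested trees. A sketch of the idea (not bit7z's actual code):

```cpp
#include <filesystem>
#include <stack>
#include <vector>

namespace fs = std::filesystem;

// Iterative directory indexing: the explicit stack lives on the heap,
// so deeply nested trees can no longer overflow the call stack.
// (Symlink cycles are not handled in this sketch.)
std::vector< fs::path > indexDirectory( const fs::path& root ) {
    std::vector< fs::path > files;
    std::stack< fs::path > pending;
    pending.push( root );

    while ( !pending.empty() ) {
        fs::path dir = std::move( pending.top() );
        pending.pop();
        for ( const auto& entry : fs::directory_iterator{ dir } ) {
            if ( entry.is_directory() ) {
                pending.push( entry.path() ); // iterate instead of recursing
            } else {
                files.push_back( entry.path() );
            }
        }
    }
    return files;
}
```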
Ah, I see!
Fixes stack overflow issues when indexing deeply nested directory structures (close issue #223)
Hi!

I tried it.

Interesting.

Yes, it works.

Perfect, thanks! 🙏
Feature description
When compressing a very large folder, calling the compress function can take a long time, and it exits when an error occurs. Is it possible to divide a large folder into n smaller subsets and run the compress function on each individually?