Memory usage for large datasets #276
We're experiencing the same issue. Perhaps RubyZip could be replaced with https://bitbucket.org/winebarrel/zip-ruby/wiki/Home ?
RubyZip isn't the issue, I think; axlsx is.
Tried to fix it at https://github.com/jurriaan/axlsx/tree/fix_enormous_string. Please report if this helps (it should reduce usage to 1.5 GB or so).
@guilleva @michaeldauria Did this change work for you? It should be possible to reduce the memory usage even further, but it's a beginning :)
Hi jurriaan, thanks for looking into this. I haven't been able to test it, I'm really sorry. I will do my best to do it today and I will let you know. Thanks again.
Just pushed some new commits, the example now uses ~700 MB on my system.
I will try this out today and report back.
It is possible it won't help in your situation, though. The fact is that at the moment axlsx allocates far too many objects and causes the heap to grow very big. I'm trying to reduce the number of allocated objects.
This did not end up helping too much; memory seemed more stable, but we still ran out anyway.
We have a similar issue. We're building sheets with hundreds of thousands of rows and as many as 100 tabs. axlsx uses 5.3 GB to build a 111 MB xlsx file. I would be happy to test some changes on this, but your branch is no longer available. Any further updates on this?
@jeremywadsack I've merged most of the changes into the master branch, but it's still using a lot of memory. I hope to have some time to look at this issue again in the coming weeks.
Hi. Any updates on this?
Our Delayed Job workers aren't processing jobs (out of memory), probably because of this issue. Any updates on the status? Cheers!
+1 on this. We are having the same issue with huge datasets :/
We just ended up spinning up more hardware to deal with this for the time being. :/
Old thread. We're also having problems: a 2.1 MB file needs 650 MB of RAM. Still waiting for a solution.
Yeah. We would still love a solution. We run a 24 GB server instance so we …
Just had another build crash because a 400 MB Excel file took 22 GB of memory to build. (Incidentally, Excel used 1.5 GB to open this file.) In the Python world we use …
+1 for a solution/update for this.
+1 for a solution.
First of all, thank you for this wonderful gem. I'm also having this problem when trying to generate 30k rows of text. As an alternative (and as opposed to increasing memory), does axlsx have the capability to append data to an existing spreadsheet, or to join two of them, without exhausting memory? I'd at least be able to write multiple files and join them together at the end.
Maybe there could be an option of using a temp file instead of filling a StringIO object; once it is finished, it can just place everything in the zip file. Thoughts?
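To make the temp-file idea concrete, here is a minimal stdlib-only sketch (the class and method names are hypothetical, not part of axlsx): each row's XML is appended to a Tempfile as soon as it is built, so Ruby's memory stays flat no matter how many rows are written, and the finished file can later be added to the zip archive.

```ruby
require "tempfile"

# Hypothetical sketch: stream each row's XML straight to a temp file
# instead of accumulating the whole sheet in a StringIO.
class StreamingSheet
  def initialize
    @io = Tempfile.new(["sheet", ".xml"])
    @io.write("<sheetData>")
  end

  # Rows are serialized and discarded immediately; only the file grows.
  def add_row(cells)
    xml = cells.map { |c| "<c t=\"inlineStr\"><is><t>#{c}</t></is></c>" }.join
    @io.write("<row>#{xml}</row>")
  end

  # Close the XML and return the path, ready to be zipped into the .xlsx.
  def finish
    @io.write("</sheetData>")
    @io.flush
    @io.path
  end
end

sheet = StreamingSheet.new
100_000.times { |i| sheet.add_row(["row #{i}", i]) }
path = sheet.finish # peak Ruby memory stays small; the data lives on disk
```

The trade-off is the same one discussed below: rows must be written in order, since nothing is kept around to edit afterwards.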
The XLSX file isn't a single file; it's a collection of files that are zipped together, which I think makes this more complicated. The Python library xlsxwriter has a feature like this:
This does introduce limits in how you can use it:
Perhaps something like this could be implemented in axlsx, though?
I haven't taken a look at it just yet, but I'm assuming it's just like a Word doc: a bunch of XML files zipped together. With Word docs you can indeed extract certain contents from the word/document.xml file, and I've done this to pull templates and insert data into other Word docs (same process). In the meantime, I'm going to try working on my own solution. Haven't seen much feedback from guilleva here, so I'm not really expecting a fix anytime soon.
But the file that has the actual rows, can it still be written sequentially, no? Looking inside an xlsx file I can see that the …
+1, this is a huge issue for me. I'd be happy to hear about possible workarounds.
+1, big issue for me too.
Would it help if I'd add a bounty via Bountysource? @randym what do you think?
#352 is a similar request to append rows to an existing file.
I've been struggling with the same issues: when generating > 100k rows, our servers started to use swap and became very slow. Solved it by making my own Excel library, https://github.com/paxa/fast_excel, which works 3-4 times faster and consumes 5-20 MB of RAM regardless of data size.
The problem with these memory issues we face using axlsx via the to_spreadsheet gem is that the memory won't be freed after xlsx creation. This means we have to manually restart our worker dyno for now, and also limit the maximum number of rows per Excel sheet. @Paxa Nice work. Depending on the answer of the maintainer of the to_spreadsheet gem (which allows a Rails app to render Excel files using the existing slim/haml/erb/etc views), I'll see if fast_excel could be used as an alternative to axlsx.
For now we limit the number of rows per xlsx file to mitigate the memory problems using the to_spreadsheet gem.
According to my basic memory-profiling skills, the main issue is here: https://github.com/randym/axlsx/blob/master/lib/axlsx/workbook/worksheet/row.rb#L156 This is never released, expanding to up to 3 GB in my test case.
@barttenbrinke can you share your code to reproduce and measure this problem?
Has anyone fixed the issue already?
@simi https://github.com/SamSaffron/memory_profiler @comictvn This will not be an easy fix; seeing that row.rb has not been touched in over a year, and the fact that this ticket is from 2014, I would not get your hopes up.
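Even without the memory_profiler gem, Ruby's built-in GC counters make this kind of allocation blow-up visible; a stdlib-only sketch (the simulated workload below is illustrative, not axlsx's actual code):

```ruby
# Count how many Ruby objects a block allocates, using only GC.stat.
# A rough stand-in for memory_profiler's allocated-objects report.
def allocated_objects
  before = GC.stat(:total_allocated_objects)
  yield
  GC.stat(:total_allocated_objects) - before
end

# Simulate a cell-heavy build: every "cell" becomes a handful of small
# objects (strings, an array, a hash), the pattern that makes the heap grow.
count = allocated_objects do
  10_000.times.map { |i| { value: "cell #{i}", style: [i, i.to_s] } }
end

puts "allocated #{count} objects for 10k cells"
```

Wrapping a single `add_row` loop this way shows how many objects survive per row, which is the number any fix here would need to drive down.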
Well, yes, @jurriaan said this in 2014: "the fact is that atm axlsx allocates way too much objects and causes the heap to grow very big." I think there are two approaches here, neither of which is likely to be "easy": shrink the in-memory representation of the workbook, or stream rows to disk as they are written.
The problem with the first approach is that we might generate a large Excel file with 20+ tabs, each of them with five million rows (I know Excel doesn't support that many rows, but the file format does). I think you'd need to reduce the size of the …
The problem with the second approach is that it means the developer would have to write rows in sequential order. That's a change to the interface, so it would need to be "opt-in".
👍 for solution 2.
+1, this is a huge issue for me. I'd be happy to hear about possible workarounds.
We've been very happy with …
@randym any update on this?
Hi, I'm facing a memory issue when trying to generate an xlsx file with >600 columns and ~10k rows. The memory usage grows to approximately 3 GB, and it's causing the dyno to be restarted by Heroku.
I'm wondering if there is a known way to generate it without making such heavy use of memory.
The way I'm doing it can be represented by this script: