
Out of memory issue on 100k data points #31

Open
HymnsForDisco opened this issue Jul 13, 2021 · 3 comments

Comments

@HymnsForDisco

runtime: VirtualAlloc of 80000000000 bytes failed with errno=1455
fatal error: out of memory

Crashing here (calling gonum code):

qr.QTo(q)

Is there any way to handle this, or are there plans to support larger data sets?

The actual size of the data set is only about 2 MB, so the 80 GB virtual alloc suggests there are serious scaling issues in the current implementation. I see the crash occurs in gonum code, but I'm wondering whether there's a way to make this library usable with large datasets.
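For context on where the 80 GB figure comes from: gonum's QR.QTo materializes the full n×n orthogonal factor Q as float64 values, so with n ≈ 100,000 data points the allocation is n² × 8 bytes, which matches the failed VirtualAlloc exactly. A minimal arithmetic sketch (the constant names are illustrative):

```go
package main

import "fmt"

func main() {
	const (
		n             = 100_000 // rows in the design matrix (data points)
		bytesPerFloat = 8       // size of one float64 element
	)
	// QTo allocates the full n×n orthogonal factor Q, so memory
	// grows quadratically with the number of data points.
	qBytes := int64(n) * int64(n) * bytesPerFloat
	fmt.Println(qBytes) // 80000000000 — the exact figure in the error message
}
```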

@mish15
Member

mish15 commented Jul 13, 2021

Hmm, never hit that before. I suspect it's the matrix calc. Have you verified the data is OK on smaller sets of the same data?

@HymnsForDisco
Author

Have you verified the data is ok on smaller sets of the same data?

With 10k the program runs. 20k was slow, and 30k froze my computer for a good while but did actually succeed.

@mish15
Member

mish15 commented Jul 13, 2021

Yeah, it's possibly just a limitation of this approach: as that matrix gets larger, the computation blows out. The matrix calcs happen outside of this repo, so unfortunately it can't easily be solved here.
