Large Bin File Error #28
Hi, do you mind opening a pull request? I'd suggest a more proper fix as: …
I ran into this as well. Note that this will still only let you go as big as 16 GB worth of vectors, and you lose the memory mapping from calling … .

@dirkgr FYI, a side effect of your efficiency fixes causes the max number of doubles to be … .
I'm going to look into a fix for this.
Thanks for looking at it. Let me know if you want me to contribute in some way. The limit is the number of doubles you can put into a … .
I don't know if Java can't, but the API for … .
Seems like this would benefit from using nd4j; if nothing else, you could use their DoubleBuffer, which supports longs for the length. If there is interest, I could maybe try it out and submit a pull request. Not sure how you feel about adding that dependency.
DoubleBuffer vectors = ByteBuffer.allocateDirect(vocabSize * layerSize * 8).asDoubleBuffer();
This line was throwing an error: the int multiplication vocabSize * layerSize * 8 overflows once the product exceeds Integer.MAX_VALUE, so a negative number was passed into the method.
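The wraparound can be reproduced directly. The sizes below are made-up values chosen only so that the product exceeds Integer.MAX_VALUE; they are not from the project:

```java
public class OverflowDemo {
    public static void main(String[] args) {
        // Hypothetical model dimensions large enough to overflow int arithmetic
        int vocabSize = 10_000_000;
        int layerSize = 300;

        // int multiplication wraps around, producing a negative byte count
        int overflowed = vocabSize * layerSize * 8;
        System.out.println(overflowed); // prints a negative number

        // Promoting to long before multiplying gives the true byte count
        long actual = (long) vocabSize * layerSize * 8;
        System.out.println(actual); // 24000000000
    }
}
```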
As a dirty fix I changed it to the following:
DoubleBuffer vectors = DoubleBuffer.allocate(1000000000);
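A less arbitrary alternative is to compute the size in long and fail loudly when it doesn't fit, since ByteBuffer.allocateDirect only accepts an int capacity. This is a minimal sketch, not the fix the maintainers suggested; the method name and error message are illustrative:

```java
import java.nio.ByteBuffer;
import java.nio.DoubleBuffer;

public class SafeAllocate {
    /**
     * Allocates a direct DoubleBuffer holding vocabSize * layerSize doubles.
     * The byte count is computed in long arithmetic so it cannot silently
     * wrap around; if it exceeds what a single ByteBuffer can hold
     * (Integer.MAX_VALUE bytes, about 2 GB), an exception is thrown
     * instead of passing a negative size to allocateDirect.
     */
    static DoubleBuffer allocateVectors(int vocabSize, int layerSize) {
        long bytes = (long) vocabSize * layerSize * Double.BYTES;
        if (bytes > Integer.MAX_VALUE) {
            throw new IllegalArgumentException(
                "Vector table needs " + bytes + " bytes, which exceeds the "
                + "2 GB limit of a single ByteBuffer");
        }
        return ByteBuffer.allocateDirect((int) bytes).asDoubleBuffer();
    }
}
```

Even with the check, a single buffer still caps out around 256 million doubles; going beyond that would mean splitting the vectors across several buffers or memory-mapping the file in chunks, which is presumably where the nd4j suggestion above comes in.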