Dataset maximum dimension sizes #283
Conversation
Thanks a lot for looking at jhdf and opening a PR. I think this change is along the right lines. I did originally have dimensions and max size as … So, to get your PR merged:
With this implemented, can you open the dataset you want? Or are the actual dimensions also too large? Could you attach an example file?
I updated the Dataset API to long[] getMaxSize and added a test case. Changing int[] getDimensions() to long[] leads to many required changes... I tried it and converted it at some points, but I was not sure if this is always a good solution?
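For clarity, here is a rough usage sketch of what the extended API might look like from a caller's point of view. The method names int[] getDimensions() and long[] getMaxSize() come from this conversation; the package names, file name, and dataset path are only assumptions for illustration.

```java
import io.jhdf.HdfFile;
import io.jhdf.api.Dataset;

import java.nio.file.Paths;
import java.util.Arrays;

public class MaxSizeExample {
    public static void main(String[] args) {
        // "example.hdf5" and the dataset path are made up for this sketch.
        try (HdfFile hdfFile = new HdfFile(Paths.get("example.hdf5"))) {
            Dataset dataset = hdfFile.getDatasetByPath("/data/my_dataset");

            // Existing API: current extents, still int-based.
            int[] dims = dataset.getDimensions();

            // Proposed extension: maximum extents as long, so values above
            // Integer.MAX_VALUE (e.g. very large or unlimited dimensions)
            // are preserved.
            long[] maxSize = dataset.getMaxSize();

            System.out.println("dimensions: " + Arrays.toString(dims));
            System.out.println("max size  : " + Arrays.toString(maxSize));
        }
    }
}
```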
Yes, I think this sounds great. You're right, changing getDimensions will be a little harder; probably good for another change. Will take a better look soon and merge this. Thanks!
/AzurePipelines run
Azure Pipelines successfully started running 1 pipeline(s).
Kudos, SonarCloud Quality Gate passed!
Thanks a lot, I will work towards a release soon.
Thank you for the quick reaction and this great project!!
Hi,
I did some experiments and realized that I get an exception if a dataset's maximum dimension size is larger than the maximum int value.
Some of my files have much larger dimension sizes...
As far as I can see from the stack trace, a long value is converted to an int.
For this reason I propose to change the max dimension variable to long.
What do you think about this?
In this PR the API is still the same as before, just one extension if somebody wants the value as long.
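To make the overflow concrete, below is a minimal, self-contained sketch (plain Java, no jhdf dependency) of what happens when a 64-bit maximum dimension size is forced into an int. The concrete value and the use of Math.toIntExact are assumptions for illustration, not taken from the jhdf source.

```java
public class MaxSizeOverflowDemo {
    public static void main(String[] args) {
        // A maximum extent larger than Integer.MAX_VALUE (2_147_483_647),
        // e.g. a dataset that may grow to ~4 billion elements.
        long maxDimensionSize = 4_000_000_000L;

        // A plain narrowing cast silently wraps around to a negative value.
        int narrowed = (int) maxDimensionSize;
        System.out.println("as long: " + maxDimensionSize); // 4000000000
        System.out.println("as int : " + narrowed);         // -294967296

        // A checked conversion fails loudly instead of silently.
        try {
            int strict = Math.toIntExact(maxDimensionSize);
            System.out.println("converted: " + strict);
        } catch (ArithmeticException e) {
            System.out.println("cannot represent " + maxDimensionSize + " as int: " + e);
        }
    }
}
```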