
AppCrash #125

Open · JakubChecinski opened this issue Jun 12, 2018 · 7 comments

@JakubChecinski commented Jun 12, 2018

Not 100% sure what caused it; maybe choosing not to normalize 3D vectors in a widget.

The error message is attached as an image: https://imgur.com/a/kHeqYxi

Edit: OK, I think it was actually the 3D_CUBIC window, which could explain the out-of-memory error.

@LemurPwned (Owner)

The crash is most likely due to the Text Selection bug; that also solves #124.

@JakubChecinski (Author)

I still get this error for 3D Cubic, even with normalization

The error message is almost identical but now also contains info about a runtime warning: https://imgur.com/j7olFzu

@LemurPwned (Owner) commented Jun 13, 2018

Can you provide more info about the dataset and settings? Does it happen each time?

Runtime warnings are due to dividing by zero or by a number very close to zero. These divisions produce np.nan values, which are later used to skip drawing empty objects.
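
Roughly, the pattern is the one below (a minimal sketch, not the actual widget code; the array contents are made up):

```python
import numpy as np

# Dividing by a (near-)zero norm emits a RuntimeWarning and yields np.nan;
# the nan rows can then be masked out so no empty object is drawn.
vectors = np.array([[1.0, 0.0, 0.0],
                    [0.0, 0.0, 0.0]])          # second vector has zero norm
norms = np.linalg.norm(vectors, axis=1, keepdims=True)
unit = vectors / norms                          # RuntimeWarning: invalid value
drawable = unit[~np.isnan(unit).any(axis=1)]    # skip the nan row when drawing
```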

LemurPwned self-assigned this Jun 13, 2018
LemurPwned added this to the Bugs milestone Jun 13, 2018
@JakubChecinski (Author)

I checked different datasets and found an interesting thing: this error occurs only when loading the full Examples->0520nm directory. If I load a single file or the Examples->0200nm directory, the 3D Cubic widget works fine. Maybe my laptop really lacks available memory?

Settings: default 3D Cubic settings; I do not touch anything.

Yes, for the 0520nm directory it seems to happen every time.

@LemurPwned (Owner)

Does it happen for Arrow structures?
This might be due to large memory usage: 0520nm is a huge dataset, and each cube requires 72 vertices. Similarly, arrows also need plenty of vertices: 64 for each structure.
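
For a rough sense of scale, a back-of-the-envelope estimate (the grid size and float width here are assumptions, not measured values):

```python
# Back-of-the-envelope vertex memory, using the counts quoted above
# (72 vertices per cube, 64 per arrow); the cell count is hypothetical.
n_cells = 100 * 100 * 100               # e.g. a 100x100x100 grid per frame
bytes_per_vertex = 3 * 4                # x, y, z stored as 4-byte floats
cube_mb = n_cells * 72 * bytes_per_vertex / 1e6
arrow_mb = n_cells * 64 * bytes_per_vertex / 1e6
print(f"cubes: ~{cube_mb:.0f} MB/frame, arrows: ~{arrow_mb:.0f} MB/frame")
```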

@JakubChecinski (Author)

I tried to check for Arrow, but got bug #133 instead.

I also think it's memory, but can we provide a wrapper around it? I.e., instead of crashing, the app would refuse to load a dataset that is too large and display a warning. Or is that technically too difficult?
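
Something along these lines is what I mean (just a sketch; `loader`, the cap, and the error type are all made up):

```python
import os

MAX_DATASET_BYTES = 2 * 1024**3   # hypothetical cap; would need tuning

def load_dataset_guarded(directory, loader):
    # Sketch of the proposed wrapper: check on-disk size before loading
    # and refuse with a warning instead of letting the app crash.
    paths = [os.path.join(directory, f) for f in os.listdir(directory)]
    total = sum(os.path.getsize(p) for p in paths if os.path.isfile(p))
    if total > MAX_DATASET_BYTES:
        raise MemoryError(f"Dataset is {total / 1024**3:.1f} GB, "
                          f"over the {MAX_DATASET_BYTES / 1024**3:.0f} GB limit")
    return loader(directory)
```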

@LemurPwned (Owner)

(Same goes for #133.)
A wrapper is a straightforward solution. I might think about loading the data progressively into RAM as the animation progresses; however, there are two options:

  • the animation would pause/slow down while picking up extra data
  • the data is loaded on the go, but in a separate thread (more difficult; I am not sure whether simple mutexing will be enough, see the sketch after this list).
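
The threaded option could look roughly like this; a bounded `queue.Queue` does the locking internally, so explicit mutexing might not be needed (all names here are hypothetical):

```python
import queue
import threading

frame_queue = queue.Queue(maxsize=8)        # bounded buffer between threads

def loader_worker(file_paths, load_frame):
    # Background thread: keeps the queue topped up while frames are drawn.
    for path in file_paths:
        frame_queue.put(load_frame(path))   # blocks while the buffer is full
    frame_queue.put(None)                   # sentinel: no more frames

def animate(file_paths, load_frame, draw_frame):
    threading.Thread(target=loader_worker,
                     args=(file_paths, load_frame), daemon=True).start()
    while (frame := frame_queue.get()) is not None:
        draw_frame(frame)                   # stalls only when loading lags behind
```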

Anyway, what happens when larger averaging or decimation is selected?
The general problem here is that there is a time cap set for data loading, hence the TimeoutError.
For example, if there were 100 files, they would all be loaded in parallel, and each has a time cap of 20 seconds by default to be processed. I can obviously increase the limit, but it seems to me that it won't fix the issue, since 20 seconds is plenty for a file to be loaded and processed.
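
The current behaviour is roughly this pattern (a sketch only; `process_file` and the executor choice stand in for the actual loader):

```python
from concurrent.futures import ProcessPoolExecutor

FILE_TIMEOUT_S = 20   # per-file processing cap (the default mentioned above)

def load_all(file_paths, process_file):
    # All files are submitted in parallel; each result must arrive within
    # FILE_TIMEOUT_S, otherwise concurrent.futures raises TimeoutError.
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(process_file, p) for p in file_paths]
        return [f.result(timeout=FILE_TIMEOUT_S) for f in futures]
```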

Weirdly enough, it seems like a purely Windows issue. I have noticed that somehow processing takes much longer here.

LemurPwned mentioned this issue Jul 12, 2018