SubDyn stack overflow for structures with large number of members, with Windows/VisualStudio #843
Comments
Hi @Ry8icks,
Nah, here's the correct info:
And yes, I'm using Visual Studio with the latest Intel Visual Fortran Compiler.
We can introduce these changes in our solution; I agree that these limits should be increased. 512 MB seems a bit excessive, though — is that the minimum you could use? I'm a bit rusty on this, but I know some of the old SubDyn code uses a lot of assumed-shape arrays, which I believe use the stack. I wonder if that could be the main issue. Do you know the specific lines that lead to the stack overflow? Could you maybe share your SubDyn model here? @bjonkman @rafmudaf @andrew-platt, do you have any preferences regarding the allowed stack size limit?
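For context, here is a minimal sketch (hypothetical routine and variable names, not actual SubDyn code) of the kind of pattern that can create large stack temporaries with assumed-shape arrays:

```fortran
! Hypothetical illustration, not actual SubDyn code.
subroutine BuildLoads(K, x, F)
   real(8), intent(in)  :: K(:,:)   ! assumed-shape stiffness matrix
   real(8), intent(in)  :: x(:)
   real(8), intent(out) :: F(:)
   ! The expression below may require a compiler-generated temporary
   ! to hold the result of matmul(K, x); with default ifort settings
   ! that temporary lives on the stack, which can overflow when K is
   ! large (e.g. a jacket with hundreds of members).
   F = matmul(K, x) + 0.5d0 * x
end subroutine BuildLoads
```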
@ebranlard, I think increasing the stack size is a short-term solution. I don't think it will be sufficient if someone were to run that model in the FAST S-Function for Simulink, though, so it would probably be better to avoid using the stack for large variables if possible. I got rid of some of the calls that use a lot of stack space a while ago--just enough to get the model we were using to run. I'm sure there is more to be done, though. Any equation with
@bjonkman, thanks for your input. I must admit I have only lazily replaced matmul calls with LAPACK calls and introduced intermediate variables, depending on which matrices were big, and only whenever I ran into stack issues (mostly for win32). I should do another pass through the code to replace those. @Ry8icks, if you can share a dummy model, I could use it to fix these calls (the largest model I've used was the OC4 jacket with a large NDIV and number of modes). It might take a couple of months before I get to it. I hope you are fine with changing the stack size for now.
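As a sketch of that matmul-to-LAPACK replacement (illustrative only; in the actual OpenFAST code this would go through the NWTC LAPACK wrapper routines rather than a bare BLAS call):

```fortran
! Illustrative only: replacing an intrinsic matmul with a BLAS call
! so the result is written directly into a preallocated heap array
! instead of going through a compiler-generated stack temporary.
real(8), allocatable :: A(:,:), B(:,:), C(:,:)
integer :: m, n, k
! ... allocate A(m,k), B(k,n), C(m,n) on the heap ...

! Before: the matmul result may be built in a stack temporary
! C = matmul(A, B)

! After: DGEMM computes C = 1.0*A*B + 0.0*C in place, no temporary
call DGEMM('N', 'N', m, n, k, 1.0d0, A, m, B, k, 0.0d0, C, m)
```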
I'm not able to recreate the problem. I set the stack size to 8 MB and it still worked. It's been a bit of trial and error, i.e., updates of the input files and updates of the code, and now I've lost track of it all. In either case, the following files work on my version of OpenFAST. I've made some corrections to the code which I've already reported, e.g. increased the number of members connected to a joint to 10. It should be noted that some of the parameters as given in the files are a bit arbitrary. Pile_stiff_matrix.txt
We have been having issues with stack overflows in FAST.Farm when large wind grids were passed to AWAE. This was eventually tracked down to line 1078 in AWAE.f90, which reads: `m%u_IfW_Low%PositionXYZ = p%Grid_low`. The `p%Grid_low` array is of unknown size at compile time but can be extremely large (3x160000 or more). This copy involves a temporary array that would normally be placed on the stack, which can overflow for some models. With `/heap-arrays:1000` set, any operation resulting in a temporary array whose size is unknown at compile time will use the heap for the temporary, as will any array known at compile time to be larger than 1000 kB. Testing shows that this fixes issue OpenFAST#2053, and it will likely also solve OpenFAST#843 and OpenFAST#2241. See https://www.intel.com/content/www/us/en/docs/fortran-compiler/developer-guide-reference/2024-2/heap-arrays.html for reference.
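For reference, the flag looks like this on the compiler command line (a sketch; the exact spelling and threshold are per the Intel documentation linked above, and where it goes depends on your build system):

```
! Windows (ifort): temporaries and automatic arrays of unknown size,
! or known to be >= 1000 kB, are allocated on the heap
/heap-arrays:1000

! Linux/macOS spelling of the same option
-heap-arrays 1000
```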
Bug description
"Stack overflow" is issued when running with a jacket. In my case the jacket have 383 members.
To Reproduce
Run with a substructure with a large number of joints/members.
Expected behavior
It should work...
However, there is also an easy fix: just increase the stack size for the linker. I've increased it to 512 MB (from, I think, 9999999).
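If you're building in Visual Studio, the stack reserve is set on the linker (Project Properties → Linker → System → Stack Reserve Size). On the command line the equivalent MSVC linker option takes the size in bytes; 512 MB is shown here as an example value, not a recommendation:

```
/STACK:536870912
```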