petsc check error #19467
-
Dear all, we have been trying multiple times to install MOOSE on lemhi. It looks like an MPI link issue, but OpenMPI is loaded ("openmpi/4.1.1-gcc-11.2.0-n2sm"). We would really appreciate some guidance! Cheers
Replies: 2 comments 1 reply
-
@fdkong does OpenMPI on lemhi work fine for this?
-
I have been running into errors with gcc-11.2.0. You might want to try an older (more stable) variant. I created a new stack because of this; it is available with:

```
$> module load use.moose civet_env
$> module list

Currently Loaded Modules:
  1) use.moose                      6) cmake/3.21.3-gcc-9.4.0-d22d
  2) flex/2.6.4-gcc-11.2.0-wvjp     7) patchelf/0.13-oneapi-2021.4.0-uxhq
  3) bison/3.7.6-gcc-11.2.0-aysl    8) miniforge
  4) gcc/9.4.0-gcc-8.4.1-57pg       9) civet_env
  5) mvapich2/2.3.6-gcc-9.4.0-n6hz
```

The above is what our Civet clients load when we need to build PETSc. Or, if you would rather not build PETSc, you're welcome to use the default modules already in place:

```
$> module load use.moose PETSc
$> module list

Currently Loaded Modules:
  1) use.moose                      4) gcc/9.4.0-gcc-8.4.1-57pg
  2) flex/2.6.4-gcc-11.2.0-wvjp     5) mvapich2/2.3.6-gcc-9.4.0-n6hz
  3) bison/3.7.6-gcc-11.2.0-aysl    6) PETSc/3.15.1-gcc-9.4.0
```

Either way, both methods should make clear which version of GCC is being used.
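Since the original report looked like an MPI link mismatch, a quick sanity check after loading one of these stacks is to confirm which MPI wrapper and which underlying GCC will actually be picked up. This is only a hedged diagnostic sketch; the expected versions in the comments assume the mvapich2/gcc-9.4.0 stack above, and exact paths will differ on your cluster:

```shell
# Load the prebuilt PETSc stack described above (assumes lemhi's modules).
module load use.moose PETSc

# Which MPI compiler wrapper is first on PATH? It should live under the
# mvapich2/2.3.6 install, not an openmpi/4.1.1 tree.
which mpicc

# MVAPICH2 is MPICH-derived, so `mpicc -show` prints the underlying
# compiler invocation, including link flags and library paths.
mpicc -show

# The backing compiler should be gcc 9.4.0, not 11.2.0.
gcc --version
```

If `which mpicc` still points at an OpenMPI install, the earlier `openmpi/4.1.1-gcc-11.2.0` module is likely still loaded and shadowing the stack; `module purge` before `module load` is a reasonable first step.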