
HDF5 1.12.1: Unable to compile the parallel version using CMake #833

Closed

jwsblokland opened this issue Jul 14, 2021 · 15 comments

@jwsblokland (Contributor) commented Jul 14, 2021

Hello,

The other day I tried to compile the parallel version of the released HDF5 library 1.12.1 using CMake on Linux. Unfortunately, I got a strange CMake error from FindMPI.cmake saying it could not find MPI_C. Here is the CMake command I used:

  CC=gcc FC=gfortran CXX=g++ cmake -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON ../source

and the resulting CMake error is

-- Testing maximum decimal precision for C - 21;33;
-- maximum decimal precision for C var - 33
-- Found Perl: /usr/bin/perl (found version "5.16.3")
-- ....All Warnings are enabled
-- Could NOT find MPI_C (missing: MPI_C_LIB_NAMES MPI_C_HEADER_DIR MPI_C_WORKS)
CMake Error at /glb/apps/hpc/EasyBuild/software/rhel/7/CMake/3.18.4/share/cmake-3.18/Modules/FindPackageHandleStandardArgs.cmake:165 (message):
Could NOT find MPI (missing: MPI_C_FOUND)

Reason given by package: MPI component 'CXX' was requested, but language CXX is not enabled. MPI component 'Fortran' was requested, but language Fortran is not enabled.

Call Stack (most recent call first):
/glb/apps/hpc/EasyBuild/software/rhel/7/CMake/3.18.4/share/cmake-3.18/Modules/FindPackageHandleStandardArgs.cmake:458 (_FPHSA_FAILURE_MESSAGE)
/glb/apps/hpc/EasyBuild/software/rhel/7/CMake/3.18.4/share/cmake-3.18/Modules/FindMPI.cmake:1721 (find_package_handle_standard_args)
CMakeLists.txt:712 (find_package)

-- Configuring incomplete, errors occurred!
See also "/glb/data/SeismicApps3/nljbl6/HDF5.custom/1.12.1_patch/build-intel-2019a-libaec/CMakeFiles/CMakeOutput.log".
See also "/glb/data/SeismicApps3/nljbl6/HDF5.custom/1.12.1_patch/build-intel-2019a-libaec/CMakeFiles/CMakeError.log".

This command worked fine for version 1.12.0 using CMake version 3.18.4. I also tried using CMake version 3.14.0, which resulted in the same error.

I did some additional investigation and noticed that you have introduced a compile+run test using a macro in ConfigureChecks.cmake (lines 252-359). In this macro the function try_run() is used, and when I commented out this call the CMake step ran fine and was able to find the required MPI libraries. At this moment it is unclear to me whether the try_run() function is used incorrectly or whether it is a bug in CMake itself.
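
For reference, the general shape of such a compile-and-run configure check is roughly the following sketch. The macro, file, and variable names here are illustrative only, not the actual HDF5 code:

  # Illustrative sketch of a compile+run configure check (not the HDF5 macro).
  # try_run() compiles a small test program and executes it; the compile and
  # run results are reported back through the first two variables.
  macro (SAMPLE_RUN_CHECK FUNCTION_NAME RETURN_VAR)
    try_run (RUN_RESULT COMPILE_RESULT
             ${CMAKE_BINARY_DIR}
             ${CMAKE_SOURCE_DIR}/config/test_${FUNCTION_NAME}.c
             COMPILE_OUTPUT_VARIABLE COMPILE_OUTPUT
    )
    if (COMPILE_RESULT AND RUN_RESULT EQUAL 0)
      # Report success through the variable named by the caller.
      set (${RETURN_VAR} 1 CACHE INTERNAL "Have C function ${FUNCTION_NAME}")
    endif ()
  endmacro ()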

jwsblokland changed the title from "HDF 1.12.1: Unable to compile the parallel version using CMake" to "HDF5 1.12.1: Unable to compile the parallel version using CMake" on Jul 14, 2021
@byrnHDF (Contributor) commented Jul 14, 2021

The C++ libraries are no longer enabled by default and therefore CMake could not reconcile your CXX=g++ option when looking for MPI compilers.

@jwsblokland (Contributor, Author)

Ok. I have tried the following CMake command:

CC=gcc cmake -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON  -DHDF5_BUILD_CPP_LIB=OFF -DHDF5_BUILD_FORTRAN=OFF -DHDF5_ENABLE_THREADSAFE=OFF ../source

Unfortunately, I still got the same CMake Error.

@byrnHDF (Contributor) commented Jul 14, 2021

I haven't tried just using CC=gcc. I either leave it unset and use the CMake-detected compiler, or set CC=mpicc, the MPI compiler wrapper.

@byrnHDF (Contributor) commented Jul 14, 2021

Also, when I use CC=mpicc, I add the following CMake command-line define: MPI_C_COMPILER=mpicc

@jwsblokland (Contributor, Author) commented Jul 14, 2021

Yes, that works. This means that everything will be built using the MPI compiler, even targets which are supposed to be compiled with only gcc and without any MPI flags. I do not know if there are such targets in your build; if not, then using mpicc is a nice workaround for the problem. Furthermore, it may be worthwhile to mention this somewhere in the release notes, because it differs from building the parallel version of HDF5 1.12.0.

@byrnHDF (Contributor) commented Jul 14, 2021

Could you try just the define MPI_C_COMPILER=mpicc?

@jwsblokland (Contributor, Author) commented Jul 15, 2021

I have tried all of the following options:

CC=gcc cmake -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON  -DHDF5_BUILD_CPP_LIB=OFF -DHDF5_BUILD_FORTRAN=OFF -DHDF5_ENABLE_THREADSAFE=OFF ../source

CC=gcc MPI_C_COMPILER=mpicc cmake -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON  -DHDF5_BUILD_CPP_LIB=OFF -DHDF5_BUILD_FORTRAN=OFF -DHDF5_ENABLE_THREADSAFE=OFF ../source

CC=gcc cmake -G "Unix Makefiles" -DMPI_C_COMPILER=mpicc -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON  -DHDF5_BUILD_CPP_LIB=OFF -DHDF5_BUILD_FORTRAN=OFF -DHDF5_ENABLE_THREADSAFE=OFF ../source

export MPI_C_COMPILER=mpicc
CC=gcc cmake -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON  -DHDF5_BUILD_CPP_LIB=OFF -DHDF5_BUILD_FORTRAN=OFF -DHDF5_ENABLE_THREADSAFE=OFF ../source

CC=mpicc cmake -G "Unix Makefiles" -DCMAKE_BUILD_TYPE=Release -DCMAKE_INSTALL_PREFIX=./phdf5 -DHDF5_ENABLE_PARALLEL=ON  -DHDF5_BUILD_CPP_LIB=OFF -DHDF5_BUILD_FORTRAN=OFF -DHDF5_ENABLE_THREADSAFE=OFF ../source

Only the last one works for me.

@byrnHDF (Contributor) commented Jul 15, 2021

Thanks! Very useful information. Interesting that the key seems to be CC=mpicc.
The new try_run that was moved into ConfigureChecks is the first "run" check and must set some CMake internal variable.

@jwsblokland (Contributor, Author)

I have found the problem. Line 286 of the file ConfigureChecks.cmake,

set (${RUN_RESULT_VAR} 1 CACHE INTERNAL "Have C function ${FUNCTION_NAME}")

stores the value 1 in the variable whose name is the value of RUN_RESULT_VAR. This looks like a mistake to me. I think what you want to do is the following:

set (RUN_RESULT_VAR 1 CACHE INTERNAL "Have C function ${FUNCTION_NAME}")

Using this modification, building the parallel version of the library with CC=gcc works and the problem is solved. Looking at the code, maybe your intention was to store the value 1 in the variable named by RETURN_VAR. In that case the line becomes:

set (${RETURN_VAR} 1 CACHE INTERNAL "Have C function ${FUNCTION_NAME}")

Again, I tested it and this also works. If you want, I can create a pull request for this bug fix.
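
For context on the CMake semantics involved, a minimal illustration with hypothetical variable names (not HDF5 code): set(FOO 1) assigns to a variable literally named FOO, whereas set(${FOO} 1) assigns to the variable whose name is the current value of FOO.

  # Minimal illustration of the difference (hypothetical names):
  set (RESULT_NAME "HAVE_SOMETHING")   # RESULT_NAME holds the string "HAVE_SOMETHING"
  set (${RESULT_NAME} 1)               # sets a variable named HAVE_SOMETHING to 1
  set (RESULT_NAME 1)                  # overwrites RESULT_NAME itself with 1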

@byrnHDF (Contributor) commented Jul 19, 2021

Yes, I believe you have found the problem; this error was hidden until now in the Fortran configure.
Interestingly, the result is ignored in the call. I think the intent was to return the result in the RETURN_VAR, so your last solution is correct.
Looks like the source has changed a bit, so I will submit a PR to develop.

Allen

@byrnHDF (Contributor) commented Jul 19, 2021

See PR #843

@jwsblokland (Contributor, Author)

Thanks. One question: how will this bug fix be distributed? I mean, are you going to update the 1.12.1 release, create a 1.12.2 release, or use some other method?

@byrnHDF (Contributor) commented Jul 19, 2021

We will include it in releases going forward. Given that CC=mpicc works correctly, I don't see a need for a patch. The end result is that the MPI compiler will be found and used for all compiles.

@jwsblokland (Contributor, Author)

Allen, I think we can close this issue, or is there a special reason to keep it open?

@byrnHDF (Contributor) commented Aug 20, 2021

@jwsblokland - Just waiting for you to verify.
