
[BUG] error in mpi4py setup command: 'python_requires' must be a string containing valid version specifiers; Invalid specifier: '!=3.4.*'' #3007

Closed
LemonBoy68 opened this issue May 25, 2023 · 5 comments


LemonBoy68 commented May 25, 2023

Hi all,
I am having problems installing h5py against my self-compiled HDF5:

Versions:
Python 3.9.16
mpi4py 3.1.4
certifi 2023.5.7
charset-normalizer 3.1.0
Cython 0.29.34
idna 3.4
mpi4py 3.1.4
numpy 1.24.3
packaging 23.1
Pint 0.21
pip 23.1.2
platformdirs 3.5.1
pooch 1.7.0
PySocks 1.7.1
requests 2.31.0
scipy 1.10.1
setuptools 67.7.2
typing_extensions 4.6.0

Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting h5py
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/69/f4/3172bb63d3c57e24aec42bb93fcf1da4102752701ab5ad10b3ded00d0c5b/h5py-3.8.0.tar.gz (400 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... error
error: subprocess-exited-with-error

× pip subprocess to install backend dependencies did not run successfully.
│ exit code: 1
╰─> [23 lines of output]
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting mpi4py==3.1.0
Using cached https://pypi.tuna.tsinghua.edu.cn/packages/1a/a3/3a33c19379cb08cc1be16d1928a899e5b18151c7e8994d315a66017bc46f/mpi4py-3.1.0.tar.gz (2.4 MB)
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'error'
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [1 lines of output]
    error in mpi4py setup command: 'python_requires' must be a string containing valid version specifiers; Invalid specifier: '!=3.4.*''
    [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.

error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
[end of output]
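
For what it is worth, the specifier reported in the message above ends with a stray quote character ("!=3.4.*'"), and recent setuptools releases (including the 67.7.2 listed above) vendor a stricter version-specifier parser that rejects it. A minimal illustration, not taken from this thread, assuming the standalone packaging library (version 22 or newer) is installed:

    # Illustration only: reproduce the validation failure with the "packaging"
    # library, the same parser newer setuptools relies on.
    # "!=3.4.*" by itself is a valid specifier; the trailing quote is not.
    python -c "from packaging.specifiers import SpecifierSet; SpecifierSet(\"!=3.4.*'\")"
    # -> raises packaging.specifiers.InvalidSpecifier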

@derobins (Member) commented:

Is this an HDF5 problem?

derobins added the UNCONFIRMED label (new issues are unconfirmed until a maintainer can duplicate them) on May 25, 2023.
@ajelenak (Contributor) commented:

No, this is not an HDF5 problem. It is a known issue with a known solution. This issue can be closed here because it is a duplicate of h5py/h5py#2267.
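
One commonly suggested way around this class of failure is to pre-install the h5py build requirements yourself and skip pip's isolated build environment, so the pinned mpi4py==3.1.0 source package is never fetched. A rough sketch only, not necessarily the exact fix from h5py/h5py#2267; it assumes a custom parallel HDF5 build, and the prefix path below is a placeholder:

    # Pre-install the build requirements with a current mpi4py ...
    pip install "mpi4py>=3.1.4" "Cython>=0.29" numpy pkgconfig
    # ... point h5py at the parallel HDF5 build (placeholder path) ...
    export HDF5_MPI=ON
    export HDF5_DIR=/path/to/parallel-hdf5
    # ... and build h5py from source without pip's isolated build environment.
    pip install --no-binary=h5py --no-build-isolation h5py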

@LemonBoy68 (Author) commented:

OK, thanks.

@LemonBoy68 (Author) commented:

> No, this is not an HDF5 problem. It is a known issue with a known solution. This issue can be closed here because it is a duplicate of h5py/h5py#2267.

See below for all messages. The rank 0 message is:
Traceback (most recent call last):
File "inputA.py", line 1360, in
dtStep,model_time,step,MeshOutput = update(MeshOutput)
File "inputA.py", line 1332, in update
MeshOutput = SaveData(MeshOutput)
File "inputA.py", line 841, in SaveData
s1Hnd=swarm.save(outputPath+"swarm"+ str(step).zfill(4)+".h5")
File "/share/home/Tninghui/anaconda3/envs/uwgo_oneapi/lib/python3.8/site-packages/underworld/swarm/_swarm.py", line 290, in save
self.particleCoordinates.save(filename, collective, units=units, **kwargs)
File "/share/home/Tninghui/anaconda3/envs/uwgo_oneapi/lib/python3.8/site-packages/underworld/swarm/_swarmvariable.py", line 499, in save
h5f.attrs[str(kwarg)] = str(val)
File "/share/home/Tninghui/anaconda3/envs/uwgo_oneapi/lib/python3.8/site-packages/underworld/utils/_io.py", line 105, in exit
self.h5f.close()
File "/share/home/Tninghui/anaconda3/envs/uwgo_oneapi/lib/python3.8/site-packages/h5py/_hl/files.py", line 586, in close
self.id._close_open_objects(h5f.OBJ_LOCAL | h5f.OBJ_FILE)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5f.pyx", line 360, in h5py.h5f.FileID._close_open_objects
RuntimeError: Can't decrement id ref count (Invalid argument, error stack:
MPI_FILE_SET_SIZE(75): Inconsistent arguments to collective routine )
Abort(1) on node 1 (rank 1 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 1
Abort(1) on node 48 (rank 48 in comm 496): application called MPI_Abort(comm=0x84000003, 1) - process 48
Abort(1) on node 96 (rank 96 in comm 496): application called MPI_Abort(comm=0x84000003, 1) - process 96
Abort(1) on node 9 (rank 9 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 9
Abort(1) on node 49 (rank 49 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 49
Abort(1) on node 142 (rank 142 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 142
Abort(1) on node 144 (rank 144 in comm 496): application called MPI_Abort(comm=0x84000003, 1) - process 144
Abort(1) on node 20 (rank 20 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 20
Abort(1) on node 68 (rank 68 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 68
Abort(1) on node 97 (rank 97 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 97
Abort(1) on node 168 (rank 168 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 168
Abort(1) on node 33 (rank 33 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 33
Abort(1) on node 86 (rank 86 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 86
Abort(1) on node 98 (rank 98 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 98
Abort(1) on node 172 (rank 172 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 172
Abort(1) on node 47 (rank 47 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 47
Abort(1) on node 93 (rank 93 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 93
Abort(1) on node 112 (rank 112 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 112
Abort(1) on node 174 (rank 174 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 174
Abort(1) on node 2 (rank 2 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 2
Abort(1) on node 94 (rank 94 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 94
Abort(1) on node 113 (rank 113 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 113
Abort(1) on node 189 (rank 189 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 189
Abort(1) on node 6 (rank 6 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 6
Abort(1) on node 57 (rank 57 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 57
Abort(1) on node 114 (rank 114 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 114
Abort(1) on node 158 (rank 158 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 158
Abort(1) on node 7 (rank 7 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 7
Abort(1) on node 58 (rank 58 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 58
Abort(1) on node 115 (rank 115 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 115
Abort(1) on node 160 (rank 160 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 160
Abort(1) on node 8 (rank 8 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 8
Abort(1) on node 72 (rank 72 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 72
Abort(1) on node 116 (rank 116 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 116
Abort(1) on node 161 (rank 161 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 161
Abort(1) on node 10 (rank 10 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 10
Abort(1) on node 74 (rank 74 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 74
Abort(1) on node 117 (rank 117 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 117
Abort(1) on node 162 (rank 162 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 162

@ajelenak (Contributor) commented:

The errors you posted above have nothing to do with building h5py against a custom MPI-enabled libhdf5, so this issue is no longer the appropriate place for them. I suggest posting your problem on the HDF Forum to see if someone from the HDF community can help you.
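
As a side note, before chasing the MPI_FILE_SET_SIZE error further it can help to confirm that the h5py in the failing environment was actually built against an MPI-enabled HDF5. A quick check, not specific to this thread:

    # True means this h5py build was compiled with MPI (parallel HDF5) support.
    python -c "import h5py; print(h5py.get_config().mpi)"
    # Version of the libhdf5 that h5py was built and linked against.
    python -c "import h5py; print(h5py.version.hdf5_version)"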
