PyMPDATA-MPI constitutes a PyMPDATA + numba-mpi coupler enabling numerical solutions of transport equations with the MPDATA numerical scheme in a hybrid parallelisation model with both multi-threading and MPI distributed-memory communication. PyMPDATA-MPI adapts to the API of PyMPDATA, offering domain decomposition logic.
In a minimal setup, PyMPDATA-MPI can be used to solve the following transport equation:
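$$ \partial_t \psi + \nabla \cdot (u\, \psi) = 0 $$

where $\psi$ is the advected scalar field and $u$ is the advector (velocity) field (the homogeneous case; MPDATA more generally also handles a variable coefficient $G$, equal to unity in this minimal setup).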
In spherical geometry, the inner dimension uses the MPIPolar boundary condition class, while the outer dimension uses MPIPeriodic.
Note that the spherical animations below depict simulations without MPDATA corrective iterations, i.e. only the plain first-order upwind scheme is used (FIX ME).
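For reference, the upwind (donor-cell) pass that these animations rely on can be written in one dimension as (standard MPDATA notation, with $C_{i\pm 1/2}$ the Courant numbers at cell edges):

$$ \psi_i^{n+1} = \psi_i^{n} - \left[ F\!\left(\psi_i^{n}, \psi_{i+1}^{n}, C_{i+1/2}\right) - F\!\left(\psi_{i-1}^{n}, \psi_{i}^{n}, C_{i-1/2}\right) \right], \qquad F(\psi_L, \psi_R, C) = \max(C, 0)\,\psi_L + \min(C, 0)\,\psi_R $$

MPDATA's corrective iterations reuse the same flux function $F$ with antidiffusive pseudo-velocities to reduce the implicit diffusion of this first-order step.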
In the Cartesian example below (based on a test case from Arabas et al. 2014), a constant advector field is used.
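For orientation, here is a minimal single-node PyMPDATA sketch of such a constant-advector 2D setup (grid size, Courant numbers and the Gaussian initial condition are illustrative assumptions, not the parameters of the test case above); in distributed runs, PyMPDATA-MPI's MPIPeriodic/MPIPolar boundary conditions take the place of Periodic in the MPI-decomposed dimension.

```python
# minimal single-node PyMPDATA sketch (no MPI); all values are illustrative
import numpy as np
from PyMPDATA import Options, ScalarField, Solver, Stepper, VectorField
from PyMPDATA.boundary_conditions import Periodic

options = Options(n_iters=2)  # n_iters=1 would reduce MPDATA to plain upwind
nx, ny = 24, 24
halo = options.n_halo

# Gaussian blob as the advected scalar field (illustrative initial condition)
x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
advectee = ScalarField(
    data=np.exp(-((x - nx / 2) ** 2 + (y - ny / 2) ** 2) / 18),
    halo=halo,
    boundary_conditions=(Periodic(), Periodic()),
)

# constant advector field expressed as Courant numbers on the staggered grid
advector = VectorField(
    data=(np.full((nx + 1, ny), 0.5), np.full((nx, ny + 1), 0.25)),
    halo=halo,
    boundary_conditions=(Periodic(), Periodic()),
)

solver = Solver(
    stepper=Stepper(options=options, grid=(nx, ny)),
    advectee=advectee,
    advector=advector,
)
solver.advance(n_steps=100)
state = solver.advectee.get()  # numpy view of the advected field after 100 steps
```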
```mermaid
flowchart BT

    H5PY ---> HDF{{HDF5}}

    subgraph pythonic-dependencies [Python]
      TESTS --> H[pytest-mpi]
      subgraph PyMPDATA-MPI ["PyMPDATA-MPI"]
        TESTS["PyMPDATA-MPI[tests]"] --> CASES(simulation scenarios)
        A1["PyMPDATA-MPI[examples]"] --> CASES
        CASES --> D[PyMPDATA-MPI]
      end
      A1 ---> C[py-modelrunner]
      CASES ---> H5PY[h5py]
      D --> E[numba-mpi]
      H --> X[pytest]
      E --> N
      F --> N[Numba]
      D --> F[PyMPDATA]
    end

    H ---> MPI
    C ---> slurm{{slurm}}
    N --> OMPI{{OpenMP}}
    N --> L{{LLVM}}
    E ---> MPI{{MPI}}
    HDF --> MPI
    slurm --> MPI

    style D fill:#7ae7ff,stroke-width:2px,color:#2B2B2B
    click H "https://pypi.org/p/pytest-mpi"
    click X "https://pypi.org/p/pytest"
    click F "https://pypi.org/p/PyMPDATA"
    click N "https://pypi.org/p/numba"
    click C "https://pypi.org/p/py-modelrunner"
    click H5PY "https://pypi.org/p/h5py"
    click E "https://pypi.org/p/numba-mpi"
    click A1 "https://pypi.org/p/PyMPDATA-MPI"
    click D "https://pypi.org/p/PyMPDATA-MPI"
    click TESTS "https://pypi.org/p/PyMPDATA-MPI"
```
Rectangular boxes indicate pip-installable Python packages (click a box to go to its pypi.org page).
PyMPDATA-MPI started as an MSc project of Kacper Derlatka (@Delcior) mentored by @slayoo.
Development of PyMPDATA-MPI has been supported by Poland's National Science Centre (grant no. 2020/39/D/ST10/01220).
We acknowledge Poland's high-performance computing infrastructure PLGrid (HPC Centers: ACK Cyfronet AGH) for providing computer facilities and support within computational grant no. PLG/2023/016369.
copyright: Jagiellonian University & AGH University of Krakow
licence: GPL v3
- MPI support for PyMPDATA implemented externally (i.e., not incurring any overhead or additional dependencies for PyMPDATA users)
- MPI calls within Numba njitted code (hence not using mpi4py, but rather numba-mpi; see the sketch after this list)
- hybrid domain-decomposition parallelism: threading (internal in PyMPDATA, in the inner dimension) + MPI (either inner or outer dimension)
- example simulation scenarios featuring HDF5/MPI-IO output storage (using h5py; see the HDF5 sketch after this list)
- py-modelrunner simulation orchestration
- portability across Linux & macOS (no Windows support as of now due to challenges in getting HDF5/MPI-IO to work there)
- Continuous Integration (CI) with different OSes and different MPI implementations (leveraging mpi4py's setup-mpi GitHub Action)
- full test coverage including CI builds asserting identical results from multi-node and single-node computations (with the help of pytest-mpi)
- ships as a pip-installable package, aimed to be a dependency of domain-specific packages
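As an illustration of the first two points above, here is a minimal sketch (not taken from the PyMPDATA-MPI codebase; the helper name distributed_sum is made up) of performing an MPI reduction entirely inside Numba-njitted code using numba-mpi:

```python
# illustrative sketch: an MPI allreduce performed inside JIT-compiled code via numba-mpi
# (run e.g. with: mpiexec -n 2 python this_script.py)
import numba
import numba_mpi as mpi
import numpy as np


@numba.njit()
def distributed_sum(local_values):
    """sums per-rank partial results across all MPI workers without leaving njitted code"""
    partial = np.empty(1, dtype=np.float64)
    partial[0] = local_values.sum()
    total = np.empty_like(partial)
    mpi.allreduce(partial, total, mpi.Operator.SUM)
    return total[0]


# each rank contributes its rank number; every rank obtains 0 + 1 + ... + (size - 1)
print("rank", mpi.rank(), "of", mpi.size(), "->", distributed_sum(np.asarray([float(mpi.rank())])))
```

And a sketch of the kind of HDF5/MPI-IO output storage the simulation scenarios feature, using h5py's standard mpio driver (file name and dataset layout are illustrative; parallel h5py uses mpi4py's communicator, independently of the numba-mpi calls above):

```python
# illustrative sketch: collective HDF5/MPI-IO output with h5py's "mpio" driver
# (requires h5py built against parallel HDF5; run under mpiexec)
from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
with h5py.File("output.h5", "w", driver="mpio", comm=comm) as file:
    # dataset creation is collective; afterwards each rank writes its own row
    dset = file.create_dataset("psi", (comm.Get_size(), 16), dtype="f8")
    dset[comm.Get_rank(), :] = np.full(16, comm.Get_rank(), dtype="f8")
```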