
Port MPI to Taal #292

Merged: 32 commits from taal_mpi into dev on Nov 12, 2020
Conversation

@ranocha ranocha (Member) commented Nov 4, 2020

As discussed, I've tried to use a minimally invasive approach to port MPI to Taal. In most cases, I dispatch on the mesh type. In the future, I would like to create an MPI array type for u etc. that handles gather operations and similar collectives, to enable error-based time step control in DiffEq. However, that's left for future work and can be done once we've settled on our approach to MPI.
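
For illustration, here is a minimal sketch of what dispatching on the mesh type can look like; all names below (SerialMesh, ParallelMesh, max_error) are placeholders chosen for this example, not the actual Trixi.jl types or API.

```julia
# Minimal sketch of the dispatch-on-mesh-type approach; the types and the
# function name are illustrative placeholders, not Trixi.jl internals.
using MPI

abstract type AbstractMesh end
struct SerialMesh   <: AbstractMesh end
struct ParallelMesh <: AbstractMesh end   # solution data distributed across MPI ranks

# Serial method: a plain local reduction over the solution array.
max_error(u, mesh::SerialMesh) = maximum(abs, u)

# Parallel method: reduce the rank-local result across all ranks, the kind of
# collective operation needed e.g. for error-based step size control.
# (Requires MPI.Init() to have been called before use.)
function max_error(u, mesh::ParallelMesh)
    local_max = maximum(abs, u)
    return MPI.Allreduce(local_max, MPI.MAX, MPI.COMM_WORLD)
end
```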

Closes #287

@ranocha ranocha changed the base branch from master to dev November 4, 2020 16:27
@ranocha ranocha marked this pull request as draft November 4, 2020 16:28
codecov bot commented Nov 4, 2020

Codecov Report

Merging #292 (4c716ea) into dev (2a9d8c7) will increase coverage by 0.13%.
The diff coverage is 89.23%.


@@            Coverage Diff             @@
##              dev     #292      +/-   ##
==========================================
+ Coverage   88.44%   88.57%   +0.13%     
==========================================
  Files          87       90       +3     
  Lines       13460    13765     +305     
==========================================
+ Hits        11905    12193     +288     
- Misses       1555     1572      +17     
| Flag      | Coverage Δ                   |
|-----------|------------------------------|
| unittests | 88.57% <89.23%> (+0.13%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files                     | Coverage Δ                   |
|------------------------------------|------------------------------|
| src/io/io.jl                       | 86.07% <ø> (-2.05%) ⬇️      |
| src/io/parallel.jl                 | 84.69% <ø> (-1.26%) ⬇️      |
| src/mesh/parallel.jl               | 96.10% <0.00%> (ø)           |
| src/semidiscretization.jl          | 76.33% <ø> (ø)               |
| src/solvers/dg/2d/dg.jl            | 91.90% <ø> (ø)               |
| src/solvers/dg/2d/parallel.jl      | 93.77% <60.00%> (ø)          |
| src/callbacks/stepsize_dg2d.jl     | 86.20% <75.00%> (-4.27%) ⬇️ |
| src/mesh/mesh.jl                   | 80.70% <76.47%> (+0.32%) ⬆️ |
| src/callbacks/analysis.jl          | 83.49% <78.68%> (+0.32%) ⬆️ |
| src/callbacks/save_restart_dg.jl   | 85.10% <86.44%> (+2.60%) ⬆️ |
| ... and 25 more                    |                              |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 2a9d8c7...4c716ea

@ranocha ranocha requested a review from sloede November 5, 2020 10:34
@ranocha ranocha changed the title from "WIP: Port MPI to Taal" to "Port MPI to Taal" Nov 5, 2020
@ranocha ranocha marked this pull request as ready for review November 5, 2020 10:34
@ranocha ranocha added the "parallelization" (Related to MPI, threading, tasks etc.) and "taal" labels Nov 5, 2020
@ranocha ranocha linked an issue (Migrate MPI to Taal) Nov 5, 2020 that may be closed by this pull request
src/semidiscretization.jl (review comment, outdated, resolved)
@ranocha ranocha mentioned this pull request Nov 11, 2020
@sloede sloede (Member) left a comment

So, first round of reviews done. Great job in porting all that MPI stuff to Taal! There are still a few issues to iron out, but nothing I see is unsolvable.

If it helps, we should consider setting up a call to talk about some of the larger issues instead of a long back-and-forth via comments. Just let me know...

README.md (review comment, outdated, resolved)
src/callbacks/analysis.jl (review comment, outdated, resolved)
src/callbacks/analysis.jl (review comment, outdated, resolved)
src/callbacks/analysis.jl (review comment, resolved)
src/callbacks/analysis.jl (review comment, resolved)
src/callbacks/stepsize_dg2d.jl (review comment, resolved)
src/mesh/mesh.jl (review comment, outdated, resolved)
src/callbacks/save_solution_dg.jl (review comment, outdated, resolved)
src/solvers/dg/dg_2d_parallel.jl (review comment, resolved)
src/solvers/dg/dg_2d.jl (review comment, resolved)
ranocha and others added 3 commits November 12, 2020 10:10
Co-authored-by: Michael Schlottke-Lakemper <sloede@users.noreply.github.com>
ranocha and others added 4 commits November 12, 2020 10:32
Co-authored-by: Michael Schlottke-Lakemper <michael@sloede.com>
Co-authored-by: Michael Schlottke-Lakemper <michael@sloede.com>
src/callbacks/analysis.jl (review comment, outdated, resolved)
@ranocha ranocha requested a review from sloede November 12, 2020 10:59
@sloede sloede (Member) left a comment

LGTM and can be merged when all tests pass. Great work!

By the way, have you ever seen one of the MPI tests fail? IIRC, I've never experienced that before, so I cannot say for certain that the overall testing would fail in that case.

src/mesh/mesh.jl (review comment, outdated, resolved)
@sloede sloede (Member) commented Nov 12, 2020

> By the way, have you ever seen one of the MPI tests fail? IIRC, I've never experienced that before, so I cannot say for certain that the overall testing would fail in that case.

Or put differently: Should the tests currently pass, we have a problem with MPI testing (caused by me, since I set it up in the first place), since currently a file is tested that does not even exist:

@testset "elixir_hyp_diff_llf.jl" begin
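
An illustrative sketch (not Trixi's actual test harness; the example path and launch flags below are assumptions) of how such a silent pass can happen when the elixir is run in an external mpiexec process and the exit status is never asserted:

```julia
# Illustrative sketch only, not Trixi's MPI test setup; the example path and
# mpiexec flags are assumptions for demonstration purposes.
using Test
using MPI

@testset "elixir_hyp_diff_llf.jl" begin
    elixir = joinpath(@__DIR__, "..", "examples", "2d", "elixir_hyp_diff_llf.jl")
    MPI.mpiexec() do exec
        cmd = `$exec -n 2 $(Base.julia_cmd()) --project -e 'include(ARGS[1])' $elixir`
        # If this @test is omitted, the testset contains no assertion at all and
        # is reported as passing even when the elixir file does not exist.
        @test success(run(ignorestatus(cmd)))
    end
end
```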

ranocha and others added 2 commits November 12, 2020 12:23
Co-authored-by: Michael Schlottke-Lakemper <michael@sloede.com>
@sloede sloede (Member) commented Nov 12, 2020

As expected. The MPI tests should have failed but they don't.

@ranocha ranocha (Member, Author) commented Nov 12, 2020

> As expected. The MPI tests should have failed but they don't.

Why should they have failed? We get

Test Summary: | Pass  Test Summary: | Pass  Total
Total
Examples 2D   |   82     82
Examples 2D   |   82     82
799.767451 seconds (1.93 M allocations: 96.366 MiB, 0.00% gc time)
Test Summary:  | Pass  Total
Trixi.jl tests |    1      1
800.634312 seconds (3.64 M allocations: 183.784 MiB, 0.02% gc time)
    Testing Trixi tests passed 

in https://github.com/trixi-framework/Trixi.jl/pull/292/checks?check_run_id=1390251395.

@ranocha ranocha merged commit c3261d0 into dev Nov 12, 2020
@ranocha ranocha deleted the taal_mpi branch November 12, 2020 12:49
@sloede sloede (Member) commented Nov 12, 2020

Ah, never mind. Didn't see that you fixed the elixir names.

Labels: parallelization (Related to MPI, threading, tasks etc.), taal
Successfully merging this pull request may close these issues: Migrate MPI to Taal
2 participants