Add PML Support for multi-J Algorithm #2603

Merged: 5 commits into ECP-WarpX:development on Dec 13, 2021

Conversation

@EZoni (Member) commented on Nov 29, 2021

Add evolution and damping of electromagnetic fields in the PML for the multi-J algorithm. This should allow us to start testing the multi-J algorithm with mesh refinement.
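
For intuition, a minimal sketch of the kind of damping a PML applies after each field push of the multi-J sub-stepping loop (hypothetical names, not WarpX's actual implementation):

```cpp
#include <cmath>
#include <vector>

// Exponentially damp one field component inside the PML layer.
// `sigma` is a hypothetical per-cell damping coefficient and
// `dt_sub` the sub-step used by the multi-J loop.
void damp_field_in_pml (std::vector<double>& field,
                        const std::vector<double>& sigma,
                        double dt_sub)
{
    for (std::size_t i = 0; i < field.size(); ++i) {
        field[i] *= std::exp(-sigma[i] * dt_sub);  // decay toward zero in the layer
    }
}
```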

To-do:

I added a CI test multi_J_2d_psatd_pml, similar to multi_J_2d_psatd: the only differences are pml boundary conditions in z (instead of damped) and no time averaging (time averaging is not implemented in the PML). Please find below a comparison of the results of the two tests.

Test multi_J_2d_psatd (damped boundary conditions in z): [image]

Test multi_J_2d_psatd_pml (pml boundary conditions in z, no time averaging): [image]
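
For reference, the difference between the two tests boils down to a few input parameters, sketched below (parameter names follow WarpX input conventions, but treat this as an illustration rather than the verbatim test inputs):

```
# multi_J_2d_psatd: damped field boundary conditions in z
boundary.field_lo = periodic damped
boundary.field_hi = periodic damped

# multi_J_2d_psatd_pml: pml boundary conditions in z, no time averaging
boundary.field_lo = periodic pml
boundary.field_hi = periodic pml
psatd.do_time_averaging = 0
```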

@EZoni added the labels component: spectral (Spectral solvers (PSATD, IGF)) and component: boundary (PML, embedded boundaries, et al.) on Nov 29, 2021
@EZoni mentioned this pull request on Dec 7, 2021
@EZoni changed the title from [WIP] Add PML Support for multi-J Algorithm to Add PML Support for multi-J Algorithm on Dec 8, 2021
@EZoni requested a review from RemiLehe on Dec 8, 2021
@RemiLehe self-assigned this on Dec 8, 2021
@RemiLehe merged commit efcf0d4 into ECP-WarpX:development on Dec 13, 2021
roelof-groenewald added a commit to ModernElectron/WarpX that referenced this pull request Dec 14, 2021
* C++17, CMake 3.17+ (ECP-WarpX#2300)

* C++17, CMake 3.17+

Update C++ requirements to compile with C++17 or newer.

* Superbuild: C++17 in AMReX/PICSAR/openPMD-api

* Summit: `cuda/11.0.3` -> `cuda/11.3.1`

When compiling AMReX in C++17 on Summit, the `cuda/11.0.3` module
(`nvcc 11.0.2211`) dies with:
```
... Base/AMReX_TinyProfiler.cpp
nvcc error   : 'cicc' died due to signal 11 (Invalid memory reference)
nvcc error   : 'cicc' core dumped
```

Although this usually is a memory issue, it also appears in `-j1`
compiles.

* Replace AMREX_SPACEDIM: Evolve & FieldSolver (ECP-WarpX#2642)

* AMREX_SPACEDIM : Boundary Conditions

* AMREX_SPACEDIM : Parallelization

* Fix compilation

* AMREX_SPACEDIM : Initialization

* Fix Typo

* space

* AMREX_SPACEDIM : Particles

* AMREX_SPACEDIM : Evolve and FieldSolver

* C++17: structured bindings to replace "std::tie(x,y,z) = f()" (ECP-WarpX#2644)

* use structured bindings

* std::ignore equivalent in structured bindings

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
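
As a quick illustration of the change (generic example, not a WarpX excerpt):

```cpp
#include <tuple>

std::tuple<int, double, char> f () { return {1, 2.0, 'c'}; }

int main ()
{
    // C++11 style: declare the variables first, then assign via std::tie
    int i; double d; char c;
    std::tie(i, d, c) = f();

    // C++17 style: structured bindings declare and initialize in one step
    auto [i2, d2, c2] = f();

    // std::ignore has no direct counterpart in structured bindings; the
    // equivalent is simply a binding name that goes unused.
    auto [i3, unused_d, unused_c] = f();
}
```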

* Perlmutter: December Update (ECP-WarpX#2645)

Update the Perlmutter instructions for the major update from
December 8th, 2021.

* 1D tests for plasma acceleration (ECP-WarpX#2593)

* modify requirements.txt and add input file for 1D Python pwfa

* add 1D Python plasma acceleration test to CI

* picmi version

* USE_PSATD=OFF for 1D

* Update Examples/Physics_applications/plasma_acceleration/PICMI_inputs_plasma_acceleration_1d.py

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Update Regression/WarpX-tests.ini

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Cartesian1D class in pywarpx/picmi.py

* requirements.txt: update picmistandard

* update picmi version

* requirements.txt: revert unintended changes

* 1D Laser Acceleration Test

* Update Examples/Physics_applications/laser_acceleration/inputs_1d

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Update Examples/Physics_applications/plasma_acceleration/PICMI_inputs_plasma_acceleration_1d.py

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* add data_list to PICMI laser_acceleration test

* increase max steps and fix a bug in the pywarpx/picmi.py Cartesian1D moving window direction

* add data_list to Python laser acceleration test

* picmistandard update

Co-authored-by: Prabhat Kumar <prabhatkumar@kraken.dhcp.lbl.gov>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* CMake 3.22+: Policy CMP0127 (ECP-WarpX#2648)

Fix a warning with CMake 3.22+.

We use the simple syntax in cmake_dependent_option, so we stay compatible with the extended condition syntax introduced in CMake 3.22+: https://cmake.org/cmake/help/v3.22/policy/CMP0127.html
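
A minimal sketch of the simple syntax in question (hypothetical option names):

```cmake
include(CMakeDependentOption)

# Simple syntax: <depends> is a semicolon-separated list of variable
# names, each optionally prefixed with NOT. CMP0127 (CMake 3.22+) adds
# an extended full-condition syntax; plain variable references like
# these parse identically under both behaviors.
cmake_dependent_option(MYPROJ_FEATURE_X
    "Enable feature X" ON
    "MYPROJ_HAS_DEP;NOT MYPROJ_MINIMAL_BUILD" OFF)
```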

* run_test.sh: Own virtual env (ECP-WarpX#2653)

Isolate builds locally, so we no longer overwrite a developer's
setup. This also avoids a couple of tricky problems that can occur
when mixing those envs.

Originally part of ECP-WarpX#2556
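
Schematically, the test script now sets up its own environment along these lines (illustrative, not the verbatim run_test.sh):

```
python3 -m venv .venv              # isolated per-run environment
source .venv/bin/activate
python3 -m pip install -r requirements.txt
```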

* GNUmake: Fix Python Install (force) (ECP-WarpX#2655)

Local developers and cached CI installs never installed `pywarpx`
if an old version existed. The `--force` must be with us.

* Add: Regression/requirements.txt

Forgotten in ECP-WarpX#2653

* Azure: `set -eu -o pipefail`

Lol, that's not the default.
We previously had `script` where it was the default.

Introduced in ECP-WarpX#2615

* GNUmake & `WarpX-test.ini`: `python` -> `python3`

Consistent with all other calls to Python in tests.

* Fix missing checksums1d (ECP-WarpX#2657)

* Docs: Fix missing Checksum Ref

* Checksum: LaserAcceleration_1d

* Checksum: Python_PlasmaAcceleration_1d

* Regression/requirements.txt: openpmd-api

Follow-up to 8f93e01

* Azure: pre-install `setuptools` upgrade

Might fix:
```
 - installing setuptools_scm using the system package manager to ensure consistency
 - migrating from the deprecated setup_requires mechanism to pep517/518
   and using a pyproject.toml to declare build dependencies
   which are reliably pre-installed before running the build tools

  warnings.warn(
TEST FAILED: /home/vsts/.local/lib/python3.8/site-packages/ does NOT support .pth files

You are attempting to install a package to a directory that is not
on PYTHONPATH and which Python does not read ".pth" files from.  The
installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:

    /home/vsts/.local/lib/python3.8/site-packages/

and your PYTHONPATH environment variable currently contains:

    ''

Here are some of your options for correcting the problem:

* You can choose a different installation directory, i.e., one that is
  on PYTHONPATH or supports .pth files

* You can add the installation directory to the PYTHONPATH environment
  variable.  (It must then also be on PYTHONPATH whenever you run
  Python and want to use the package(s) you are installing.)

* You can set up the installation directory to support ".pth" files by
  using one of the approaches described here:

  https://setuptools.readthedocs.io/en/latest/easy_install.html#custom-installation-locations

Please make the appropriate changes for your system and try again.
```

* GNUmake `installwarpx`: `mv` -> `cp`

No reason to rebuild. Make will detect dependency when needed.

* Python GNUmake: Remove Prefix Hacks

FREEEEDOM. venv power.

* Azure: Ensure latest venv installed

* Python/setup.py: picmistandard==0.0.18

Forgotten in ECP-WarpX#2593

* Fix: analysis_default_regression.py

Mismatched checksum file due to crude hard-coding.

* PWFA 1D: Fix output name

Hard-coded, undocumented convention: it turns out this must be the name
of the test that we define in the ini file. Logical, isn't it? Not.

Follow-up to ECP-WarpX#2593

* Docs: `python3 -m pip` & Virtual Env (ECP-WarpX#2656)

* Docs: `python3 -m pip`

Use `python3 -m pip`:
- works independent of PATH
- always uses the right Python
- is the recommended way to use `pip`
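
Concretely (illustrative commands):

```
python3 -m pip install --upgrade pip
python3 -m pip install --upgrade -r requirements.txt
```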

* Dependencies: Python incl. venv

Backported from ECP-WarpX#2556.

Follow-up to ECP-WarpX#2653

* CMake: 3.18+ (ECP-WarpX#2651)

With the C++17 switch, we required CMake 3.17+, since that version introduced the `cuda_std_17` target compile feature.

It turns out that one of the many CUDA improvements in CMake 3.18+ is a proper fix of that feature, so we bump our CMake requirement. Since CMake is easy to install, it is easier to require a clean newer version than to work around a broken old one.

Spotted first by Phil on AWS instances, thx!

* fix check for absolute library install path (ECP-WarpX#2646)

Co-authored-by: Hannes T <s9105947@users.noreply.github.com>

* use if constexpr to replace template specialization (ECP-WarpX#2660)

* fix for setting the boundary condition potentials in 1D ES simulations (ECP-WarpX#2649)

* `use_default_v_<galilean,comoving>` Only w/ Boosted Frame (ECP-WarpX#2654)

* ICC CI: Unbound Vars (`setvars.sh`) (ECP-WarpX#2663)

Ignore:
```
/opt/intel/oneapi/compiler/latest/env/vars.sh: line 236: OCL_ICD_FILENAMES: unbound variable
```

* QED openPMD Tests: Specify H5 Backend (ECP-WarpX#2661)

We default to ADIOS `.bp` if available; thus, specify the assumed HDF5 backend explicitly.

* C++17: if constexpr for templates in ShapeFactors (ECP-WarpX#2659)

* use if constexpr to replace template specialization

* Remove Interface Annotations

* Replace static_assert with amrex::Abort

* Add includes & authors

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
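
An illustrative reduction of the pattern (standard linear and quadratic shape factors shown; a sketch, not WarpX's exact code, which covers more orders and runs on device):

```cpp
#include <stdexcept>

// One function template instead of one specialization per deposition order.
template <int depos_order>
void compute_shape_factor (double* sx, double x)
{
    if constexpr (depos_order == 1) {          // linear (CIC) weights
        sx[0] = 1.0 - x;
        sx[1] = x;
    } else if constexpr (depos_order == 2) {   // quadratic (TSC) weights
        sx[0] = 0.5 * (0.5 - x) * (0.5 - x);
        sx[1] = 0.75 - x * x;
        sx[2] = 0.5 * (0.5 + x) * (0.5 + x);
    } else {
        // As in the commit above, a runtime abort replaces static_assert
        // (amrex::Abort in WarpX); a plain exception stands in here.
        throw std::runtime_error("unsupported deposition order");
    }
}
```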

* ABLASTR Library (ECP-WarpX#2263)

* [Draft] ABLASTR Library

- CMake object library
- include FFTW wrappers to start with

* Add: MPIInitHelpers

* Enable ABLASTR-only builds

* Add alias WarpX::ablastr

* ABLASTR: openPMD forwarding

* make_third_party_includes_system: Avoid Collision

* WarpX: depend on `ablastr`

* Definitions: WarpX -> ablastr

* CMake: Reduce build objects for ABLASTR

Skip all object files that we do not use in builds.

* CMake: app/shared links all object targets

Our `PRIVATE` source/objects are not PUBLICly propagated themselves.

* Docs: Fix Warning Logger Typo (ECP-WarpX#2667)

* Python: Add 3.10, Relax upper bound (ECP-WarpX#2664)

There are no breaking changes in Python 3.10 that affect us.

Given the version compatibility of Python and its ABI stability,
there is no need at the moment to impose an upper limit. Thus,
it is now relaxed in general.

* Fixing the initialization of the EB data in ghost cells (ECP-WarpX#2635)

* Using ng_FieldSolver ghost cells in the EB data

* Removed an unused variable

* Fixed makeEBFabFactory also in WarpXRegrid.cpp

* Fixed end of line whitespace

* Undoing ECP-WarpX#2607

* Add PML Support for multi-J Algorithm (ECP-WarpX#2603)

* Add PML Support for multi-J Algorithm

* Add CI Test

* Fix the scope of profiler for SYCL (ECP-WarpX#2668)

In main.cpp, the destructor of the profiler was called after
amrex::Finalize.  This caused an error in SYCL due to a device
synchronization call in the dtor, because the SYCL queues in amrex had been
deleted.  In this commit, we limit the scope of the profiler so that its
destructor is called before the queues are deleted.  Note that it was never
an issue for CUDA/HIP, because the device synchronization calls in those
backends do not need any amrex objects.
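
A minimal sketch of the scoping change (schematic, not the verbatim diff):

```cpp
#include <AMReX.H>
#include <AMReX_BLProfiler.H>

int main (int argc, char* argv[])
{
    amrex::Initialize(argc, argv);
    {
        BL_PROFILE("main()");   // profiler lives only in this inner scope
        // ... run the simulation ...
    }   // profiler dtor runs here: the device sync happens while the
        // SYCL queues still exist
    amrex::Finalize();          // queues are deleted during Finalize
}
```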

* Add high energy asymptotic fit for Proton-Boron total cross section (ECP-WarpX#2408)

* Add high energy asymptotic fit for Proton Boron total cross section

* Write keV and MeV instead of kev and mev

* Add @return doxystrings

* Add anisotropic mesh refinement example (ECP-WarpX#2650)

* Add anisotropic mesh refinement example

* Update benchmark

* AMReX/PICSAR: Weekly Update (ECP-WarpX#2666)

* AMReX: Weekly Update

* Reset: PEC_particle, RepellingParticles, subcyclingMR

New AMReX grid layout routines split grids until they truly reach the
number of MPI ranks, if the blocking factor allows. This changes some of
our particle orders slightly.

* Add load balancing test (ECP-WarpX#2561)

* Added embedded_circle test

* Add embedded_circle test files

* Removed diag files

* removed PICMI input file

* Update to use default regression analysis

* Added line breaks for spacing

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Added description

* Fixed benchmark file

* Added load balancing to test

* Commented out load_balancing portion of test.
This will be added back in once load balancing is fixed.

* Add load balancing to embedded_boundary test

* Updated checksum

* added analysis.py file in order to relax tolerance on test

* Ensure that timers are used to update load balancing algorithm

* Updated test name retrieval

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Roelof <roelof.groenewald@modernelectron.com>
Co-authored-by: Roelof Groenewald <40245517+roelof-groenewald@users.noreply.github.com>

* Adding EB multifabs to the Python interface (ECP-WarpX#2647)

* Adding edge_lengths and face_areas to the Python interface

* Added wrappers for the two new arrays of data

* Adding a CI test

* Fixed test name

* Added customRunCmd

* Added mpi in test

* Refactor DepositCharge so it can be called from ImpactX. (ECP-WarpX#2652)

* Refactor DepositCharge so it can be called from ImpactX.

* change thread_num

* Fix namespace

* remove all static WarpX:: members and methods from DepositChargeDoIt.

* fix unused

* Don't access ref_ratio unless lev != depos_lev

* more unused

* remove function to its own file / namespace

* don't need a CMakeLists.txt for this

* lower case namespace, rename file

* Refactor: Profiler Wrapper

Explicit control for synchronization instead of global state.

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* ABLASTR: Fix Doxygen in `DepositCharge`

* update version number and changelog

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Prabhat Kumar <89051199+prkkumar@users.noreply.github.com>
Co-authored-by: Luca Fedeli <luca.fedeli@cea.fr>
Co-authored-by: Prabhat Kumar <prabhatkumar@kraken.dhcp.lbl.gov>
Co-authored-by: s9105947 <80697868+s9105947@users.noreply.github.com>
Co-authored-by: Hannes T <s9105947@users.noreply.github.com>
Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>
Co-authored-by: Phil Miller <phil.miller@intensecomputing.com>
Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>
Co-authored-by: Weiqun Zhang <WeiqunZhang@lbl.gov>
Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: Remi Lehe <remi.lehe@normalesup.org>
Co-authored-by: Kevin Z. Zhu <86268612+KZhu-ME@users.noreply.github.com>
Co-authored-by: Andrew Myers <atmyers@lbl.gov>
lgiacome pushed a commit to lgiacome/WarpX that referenced this pull request Dec 16, 2021
* Add PML Support for multi-J Algorithm

* Add CI Test
@EZoni deleted the pml_multiJ branch on June 7, 2023