📚 Docs: update tutorials, how-tos, and citations
bastonero committed Feb 16, 2025
1 parent e5199d4 commit dee667b
Showing 10 changed files with 129 additions and 180 deletions.
2 changes: 1 addition & 1 deletion docs/source/1_computing_hubbard.ipynb
@@ -183,7 +183,7 @@
"from aiida_quantumespresso.workflows.pw.base import PwBaseWorkChain\n",
"from aiida_quantumespresso.common.types import ElectronicType\n",
"kpoints = KpointsData()\n",
"kpoints.set_kpoints_mesh([2,2,2])\n",
"kpoints.set_kpoints_mesh([1,1,1])\n",
"\n",
"builder = PwBaseWorkChain.get_builder_from_protocol(\n",
" code=data.pw_code, # modify here if you downloaded the notebook\n",
13 changes: 7 additions & 6 deletions docs/source/2_parallel_hubbard.ipynb
@@ -63,7 +63,7 @@
"from aiida_quantumespresso.workflows.pw.base import PwBaseWorkChain\n",
"from aiida_quantumespresso.common.types import ElectronicType\n",
"kpoints = KpointsData()\n",
"kpoints.set_kpoints_mesh([2,2,2])\n",
"kpoints.set_kpoints_mesh([1,1,1])\n",
"\n",
"builder = PwBaseWorkChain.get_builder_from_protocol(\n",
" code=data.pw_code, # modify here if you downloaded the notebook\n",
@@ -108,7 +108,7 @@
" \"parallelize_atoms\":True, \n",
" \"parallelize_qpoints\":False, \n",
" \"hp\":{\"hubbard_structure\":data.structure},\n",
" \"qpoints_distance\": 1000, # to get few q points\n",
" \"qpoints_distance\": 100.0, # to get few q points\n",
" }\n",
")\n",
"\n",
@@ -139,11 +139,11 @@
"metadata": {},
"source": [
"The following just happened:\n",
"- A grid of q points is generated automatically using the distance (between points) in $\\AA$ we gave in input (of 1000 $\\AA$ to have very sparse - it is just a tutorial!).\n",
"- A grid of q points is generated automatically using the distance (between points) in $\mathrm{\AA}^{-1}$ we gave as input (100 $\mathrm{\AA}^{-1}$, to get a very sparse grid - it is just a tutorial!).\n",
"- The `HpParallelizeAtomsWorkChain` is called.\n",
"- This work chain calls first a `HpBaseWorkChain` to get the independent atoms to perturb.\n",
"- **Three** `HpBaseWorkChain` are submitted __simultaneously__, one for cobalt, and two for the two oxygen sites.\n",
"- The response matrices ($\\chi^{(0)}$,$\\chi$) of each atom are collected to post-process them and compute the final U/V values using $$V_{IJ} = (\\chi^{(0) -1} -\\chi^{-1})_{IJ}$$\n",
"- The response matrices ($\\chi^{(0)}$,$\\chi$) of each atom are collected to post-process them and compute the final U/V values using $V_{IJ} = (\\chi^{(0) -1} -\\chi^{-1})_{IJ}$\n",
"\n",
"As for the `HpBaseWorkChain`, we also have here the `hubbard_structure` output namespace, containing the same results as the serial execution:"
]
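The collection step above boils down to a single matrix identity, $V_{IJ} = (\chi^{(0)\,-1} - \chi^{-1})_{IJ}$. As a plain-Python sketch (toy 2x2 response matrices with made-up numbers, not actual LiCoO2 results):

```python
# Sketch of the linear-response formula V_IJ = (chi0^-1 - chi^-1)_IJ,
# with a hand-rolled 2x2 matrix inverse (illustrative numbers only).

def inv2(m):
    """Invert a 2x2 matrix given as [[a, b], [c, d]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def hubbard_v(chi0, chi):
    """V = chi0^-1 - chi^-1, element-wise difference of the inverses."""
    i0, i1 = inv2(chi0), inv2(chi)
    return [[i0[r][c] - i1[r][c] for c in range(2)] for r in range(2)]

# Made-up response matrices (negative definite, as physical responses are)
chi0 = [[-0.50, 0.00], [0.00, -0.50]]  # bare (non-interacting) response
chi = [[-0.25, 0.00], [0.00, -0.25]]   # self-consistent response

V = hubbard_v(chi0, chi)
print(V[0][0])  # 2.0 (eV) with these toy numbers
```

The real work chain does the same inversion and difference over the full perturbed-site response matrices, one row per independent atom.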
@@ -193,7 +193,8 @@
" \"parallelize_qpoints\":True, \n",
" \"hp\":{\"hubbard_structure\":data.structure},\n",
" \"qpoints_distance\": 1000, # to get few q points\n",
" }\n",
" \"max_concurrent_base_workchains\": 2, # useful to not overload HPC or local computer\n",
" }\n",
")\n",
"\n",
"results, hp_node = run_get_node(builder)"
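The `max_concurrent_base_workchains` input caps how many `HpBaseWorkChain` run at the same time. A toy scheduler (pure Python, no AiiDA; the job names are illustrative) shows the intended behaviour:

```python
# Toy scheduler illustrating what a cap on concurrent work chains does:
# jobs are launched in order, but never more than `max_concurrent` at once.

def run_throttled(jobs, max_concurrent):
    """Simulate launching jobs with a concurrency cap; return peak concurrency and launch order."""
    pending, running = list(jobs), []
    peak, order = 0, []
    while pending or running:
        # launch as many jobs as the cap allows
        while pending and len(running) < max_concurrent:
            job = pending.pop(0)
            running.append(job)
            order.append(job)
        peak = max(peak, len(running))
        running.pop(0)  # pretend the oldest running job finishes

    return peak, order

peak, order = run_throttled(['Co', 'O1', 'O2'], max_concurrent=2)
print(peak)   # 2: never more than two jobs in flight
print(order)  # ['Co', 'O1', 'O2']: submission order is preserved
```

With the cap set to 2 in the builder above, the third atomic perturbation only starts once one of the first two finishes, which keeps a small HPC allocation (or a laptop) from being overloaded.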
@@ -214,7 +215,7 @@
"metadata": {},
"source": [
"The following just happened:\n",
"- A grid of q points was generated automatically using the distance (between points) in $\\AA$ we gave in input (of 1000 $\\AA$ to have very sparse - it is just a tutorial!).\n",
"- A grid of q points was generated automatically using the distance (between points) in $\mathrm{\AA}^{-1}$ we gave as input (1000 $\mathrm{\AA}^{-1}$, to get a very sparse grid - it is just a tutorial!).\n",
"- The `HpParallelizeAtomsWorkChain` is called.\n",
"- This work chain calls first a `HpBaseWorkChain` to get the independent atoms to perturb.\n",
"- For each independent atom (three in total) an `HpParallelizeQpointsWorkChain` is submitted __simultaneously__: one for cobalt, and two for the two oxygen sites.\n",
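The distance-based grid can be sketched in a few lines: take one point every `qpoints_distance` (in reciprocal angstrom) along each reciprocal lattice vector. This mirrors the idea behind the distance-to-mesh conversion in aiida-quantumespresso, not its exact implementation; the 4 angstrom cubic cell is an arbitrary example:

```python
import math

def qpoints_mesh_from_distance(cell, distance):
    """Sketch: one q point every `distance` (1/Angstrom) along each reciprocal vector.

    `cell` holds three lattice vectors in Angstrom; the reciprocal vectors
    b_i = 2*pi (a_j x a_k) / V include the conventional 2*pi factor.
    """
    a1, a2, a3 = cell

    def cross(u, v):
        return [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]

    volume = sum(a1[i] * cross(a2, a3)[i] for i in range(3))
    mesh = []
    for u, v in ((a2, a3), (a3, a1), (a1, a2)):
        b = [2 * math.pi * c / volume for c in cross(u, v)]
        norm = math.sqrt(sum(x * x for x in b))
        mesh.append(max(1, math.ceil(norm / distance)))
    return mesh

cubic = [[4.0, 0, 0], [0, 4.0, 0], [0, 0, 4.0]]  # 4 Angstrom cubic cell
print(qpoints_mesh_from_distance(cubic, 100.0))  # [1, 1, 1]: huge distance -> Gamma only
print(qpoints_mesh_from_distance(cubic, 0.5))    # [4, 4, 4]: |b| = 2*pi/4 ~ 1.57 1/Angstrom
```

This makes clear why the tutorials pass such a large `qpoints_distance`: any value exceeding the reciprocal vector lengths collapses the mesh to a single Gamma point.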
188 changes: 34 additions & 154 deletions docs/source/3_self_consistent.ipynb

Large diffs are not rendered by default.

23 changes: 23 additions & 0 deletions docs/source/citeus.md
@@ -0,0 +1,23 @@
(citeus)=

# Cite

If you use this plugin for your research, please cite the following works:

> Lorenzo Bastonero, Cristiano Malica, Eric Macke, Marnik Bercx, Sebastiaan Huber, Iurii Timrov, and Nicola Marzari, *Hubbard from first-principles made easy from automated and reproducible workflows* (2025)

> Sebastiaan P. Huber _et al._, [*AiiDA 1.0, a scalable computational infrastructure for automated reproducible workflows and data provenance*](https://doi.org/10.1038/s41597-020-00638-4), Scientific Data **7**, 300 (2020)

> Martin Uhrin, Sebastiaan P. Huber, Jusong Yu, Nicola Marzari, and Giovanni Pizzi, [*Workflows in AiiDA: Engineering a high-throughput, event-based engine for robust and modular computational workflows*](https://doi.org/10.1016/j.commatsci.2020.110086), Computational Materials Science **187**, 110086 (2021)

Please also cite the relevant _Quantum ESPRESSO_ and _HP_ references:

> Iurii Timrov, Nicola Marzari, and Matteo Cococcioni, [*HP – A code for the calculation of Hubbard parameters using density-functional perturbation theory*](https://www.sciencedirect.com/science/article/pii/S0010465522001746), Computer Physics Communications **279**, 108455 (2022)

> Paolo Giannozzi _et al._, [*Advanced capabilities for materials modelling with Quantum ESPRESSO*](https://iopscience.iop.org/article/10.1088/1361-648X/aa8f79), J. Phys.: Condens. Matter **29**, 465901 (2017)

> Paolo Giannozzi _et al._, [*QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials*](https://iopscience.iop.org/article/10.1088/0953-8984/21/39/395502), J. Phys.: Condens. Matter **21**, 395502 (2009)

For the GPU-enabled version of _Quantum ESPRESSO_:

> Paolo Giannozzi _et al._, [*Quantum ESPRESSO toward the exascale*](https://pubs.aip.org/aip/jcp/article/152/15/154105/1058748/Quantum-ESPRESSO-toward-the-exascale), J. Chem. Phys. **152**, 154105 (2020)
45 changes: 45 additions & 0 deletions docs/source/howto/analyze.md
@@ -0,0 +1,45 @@
(howto-analyze)=

# How to analyze the results

When a `SelfConsistentHubbardWorkChain` is completed, there are quite a few possible analyses to perform.

## How to inspect the final Hubbard parameters

A _complete_ `SelfConsistentHubbardWorkChain` will produce a {{ hubbard_structure }} containing the parsed Hubbard parameters.
The parameters are stored under the `hubbard` namespace:

```shell
In [1]: node = load_node(HP_CALCULATION_IDENTIFIER)

In [2]: node.outputs.hubbard_structure.hubbard
Out[2]:
Hubbard(parameters=(HubbardParameters([...]), ...), projectors='ortho-atomic', formulation='dudarev')
```

This corresponds to a `pydantic` class, so you can access the stored values (`parameters`, `projectors`, `formulation`) simply via:
```shell
In [3]: node.outputs.hubbard_structure.hubbard.parameters
Out[3]: [HubbardParameters(atom_index=0, atom_manifold='3d', neighbour_index=0, neighbour_manifold='3d', translation=(0, 0, 0), value=5.11, hubbard_type='Ueff'), ...]
```

To access a specific value:
```shell
In [4]: node.outputs.hubbard_structure.hubbard.parameters[0].value
Out[4]: 5.11
```

To visualize them as a Quantum ESPRESSO HUBBARD card:

```shell
In [5]: from aiida_quantumespresso.utils.hubbard import HubbardUtils

In [6]: hubbard_card = HubbardUtils(node.outputs.hubbard_structure.hubbard).get_hubbard_card()

In [7]: print(hubbard_card)
HUBBARD ortho-atomic
V Co-3d Co-3d 1 1 5.11
V Co-3d O-2p 1 2 1.65
...
```
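For illustration, here is a minimal, hypothetical formatter showing how such card lines pair species-manifold labels with 1-based site indices and values (the real logic lives in `HubbardUtils`; the numbers reuse the toy values printed above):

```python
# Minimal, hypothetical sketch of how HUBBARD card lines are assembled from
# parsed parameters (the actual formatting is done by HubbardUtils).

def card_line(kind, atom, atom_manifold, neighbour, neighbour_manifold, i, j, value):
    """Return one line of a Dudarev-formulation HUBBARD card (1-based site indices)."""
    return f'{kind} {atom}-{atom_manifold} {neighbour}-{neighbour_manifold} {i} {j} {value:.2f}'

lines = ['HUBBARD ortho-atomic']
# (atom, manifold, neighbour, manifold, site i, site j, value) - toy entries
for atom, am, nb, nm, i, j, v in [('Co', '3d', 'Co', '3d', 1, 1, 5.11),
                                  ('Co', '3d', 'O', '2p', 1, 2, 1.65)]:
    lines.append(card_line('V', atom, am, nb, nm, i, j, v))

print('\n'.join(lines))
# HUBBARD ortho-atomic
# V Co-3d Co-3d 1 1 5.11
# V Co-3d O-2p 1 2 1.65
```

A `V` line with identical atom and neighbour indices is the on-site (U-like) term; distinct indices give the intersite V coupling.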
6 changes: 3 additions & 3 deletions docs/source/howto/calculations/hp.md
@@ -112,13 +112,13 @@ builder = load_code('hp').get_builder()
builder.parent_scf = parent_scf
```

## How to run a calculation without symlinking
## How to run a calculation with symlinking

Specify `PARENT_FOLDER_SYMLINK: True` in the `settings` input:

```python
builder = load_code('hp').get_builder()
builder.settings = Dict({'PARENT_FOLDER_SYMLINK': False})
builder.settings = Dict({'PARENT_FOLDER_SYMLINK': True})
```

If this setting is specified, the plugin will symlink the parent SCF folder instead of copying it.
@@ -146,7 +146,7 @@ To visualize them as Quantum ESPRESSO HUBBARD card:
```python
In [3]: from aiida_quantumespresso.utils.hubbard import HubbardUtils

In [4]: hubbard_card = HubbardUtils(node.outputs.hubbard_structure.hubbard).get_hubbard_card
In [4]: hubbard_card = HubbardUtils(node.outputs.hubbard_structure.hubbard).get_hubbard_card()

In [5]: print(hubbard_card)
Out[5]:
1 change: 1 addition & 0 deletions docs/source/howto/index.md
@@ -10,6 +10,7 @@ At the very least, make sure you have followed and understand the tutorial on [r
```{toctree}
:maxdepth: 2
analyze
understand
calculations/index
workflows/index
2 changes: 1 addition & 1 deletion docs/source/local_module/temp_profile.py
@@ -180,7 +180,7 @@ def create_licoo_hubbard_structure():
return hubbard_structure


def load_sssp_pseudos(version='1.2', functional='PBEsol', protocol='efficiency'):
def load_sssp_pseudos(version='1.3', functional='PBEsol', protocol='efficiency'):
"""Load the SSSP pseudopotentials."""
config = SsspConfiguration(version, functional, protocol)
label = SsspFamily.format_configuration_label(config)
2 changes: 2 additions & 0 deletions docs/source/topics/index.md
@@ -1,3 +1,5 @@
(topics)=

# Topic guides

```{toctree}
27 changes: 12 additions & 15 deletions src/aiida_hubbard/workflows/hubbard.py
@@ -75,21 +75,18 @@ class SelfConsistentHubbardWorkChain(WorkChain, ProtocolMixin):
The procedure in each step of the convergence cycle is slightly different depending on the electronic and
magnetic properties of the system. Each cycle will roughly consist of three steps:
* Relaxing the structure at the current Hubbard values (optional).
* One or two SCF calculations depending whether the system is metallic or insulating, respectively.
* A self-consistent calculation of the Hubbard parameters, restarted from the last SCF run.
The possible options for the set of SCF calculations that have to be run in the second step look are:
* Metals:
- SCF with smearing.
* Insulators
- SCF with smearing.
- SCF with fixed occupations; if magnetic, total magnetization and number of bands
are fixed to the values found from the previous SCF calculation.
* Relaxing the structure at the current Hubbard values (optional).
* One or two DFT calculations, depending on whether the system is metallic or insulating, respectively.
* A DFPT calculation of the Hubbard parameters, perturbing the ground-state of the last DFT run.
The possible options for the set of DFT SCF calculations that have to be run in the second step are:
* Metals:
- SCF with smearing.
* Insulators
- SCF with smearing.
- SCF with fixed occupations; if magnetic, total magnetization and number of bands
are fixed to the values found from the previous SCF calculation.
When convergence is achieved a node will be returned containing the final converged
:class:`~aiida_quantumespresso.data.hubbard_structure.HubbardStructureData`.
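The convergence cycle described in the docstring can be condensed to a fixed-point iteration. Below, `compute_u` is a stand-in for the whole relax/SCF/`hp.x` sequence (here a toy contraction toward 5.0 eV so the loop terminates quickly); the work chain itself does the same bookkeeping over full sets of Hubbard parameters:

```python
# Skeleton of the self-consistency cycle: recompute U from the latest
# structure and ground state until it stops changing within a tolerance.

def self_consistent_u(u_start, compute_u, tol=0.01, max_iter=20):
    """Iterate U_{n+1} = compute_u(U_n) until |U_{n+1} - U_n| < tol."""
    u = u_start
    for iteration in range(1, max_iter + 1):
        u_new = compute_u(u)  # stand-in for relax -> SCF(s) -> hp.x
        if abs(u_new - u) < tol:
            return u_new, iteration
        u = u_new
    raise RuntimeError('Hubbard U did not converge')

# Toy stand-in: each cycle moves U halfway toward a fixed point of 5.0 eV
u_final, n_cycles = self_consistent_u(0.0, lambda u: u + 0.5 * (5.0 - u))
print(round(u_final, 2))  # 4.99
```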
