updated loader for padded disjoint and added tutorial for jax models.
PatReis committed Dec 20, 2023
1 parent 6544fcd commit f53abb7
Showing 6 changed files with 566 additions and 33 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -299,7 +299,7 @@ You can find a [table](training/results/README.md) of common benchmark datasets

Some known issues to be aware of when using or building new models or layers with `kgcnn`:
* Loading jagged or nested tensors into models is not working for the PyTorch backend.
- * Dataloader for Jax is not yet implemented.
+ * Dataloader for Jax is implemented but does not yet provide all functionality, such as output padding or dictionary inputs.
* ForceModel does not support all tensor types and does not have a Scaler layer yet.
* BatchNormalization layer does not support padding yet.

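The output padding mentioned for the Jax dataloader can be illustrated with a small, self-contained sketch. This is plain numpy with a hypothetical helper name, not kgcnn's actual loader: graphs with different node counts are padded to a common length, and a boolean mask marks which entries are real nodes.

```python
import numpy as np

def batch_padded(node_lists, max_nodes=None):
    """Pad per-graph node features to a common node count and return a mask.

    Illustrative sketch only; kgcnn's real loaders handle many more input types.
    """
    if max_nodes is None:
        max_nodes = max(len(n) for n in node_lists)
    feat_dim = node_lists[0].shape[1]
    batch = np.zeros((len(node_lists), max_nodes, feat_dim))
    mask = np.zeros((len(node_lists), max_nodes), dtype=bool)
    for i, nodes in enumerate(node_lists):
        batch[i, :len(nodes)] = nodes  # copy real nodes, leave padding at zero
        mask[i, :len(nodes)] = True    # True marks real (non-padded) nodes
    return batch, mask

# Two toy graphs with 2 and 4 nodes, each with 3 node features.
graphs = [np.ones((2, 3)), np.ones((4, 3))]
padded, mask = batch_padded(graphs)
print(padded.shape)  # (2, 4, 3)
print(mask.sum())    # 6 real nodes in total
```

Layers that are unaware of the mask (such as the BatchNormalization case noted above) would also aggregate over the zero padding, which is why padding support has to be added per layer.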
80 changes: 62 additions & 18 deletions docs/source/literature.ipynb
@@ -21,18 +21,33 @@
"id": "f9c829a4",
"metadata": {},
"source": [
"* **[GCN](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/GCN)**: [Semi-Supervised Classification with Graph Convolutional Networks](https://arxiv.org/abs/1609.02907) by Kipf et al. (2016)\n",
"* **[Schnet](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/Schnet)**: [SchNet – A deep learning architecture for molecules and materials ](https://aip.scitation.org/doi/10.1063/1.5019779) by Schütt et al. (2017)\n",
"* **[GAT](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/GAT)**: [Graph Attention Networks](https://arxiv.org/abs/1710.10903) by Veličković et al. (2018)\n",
"* **[GraphSAGE](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/GraphSAGE)**: [Inductive Representation Learning on Large Graphs](http://arxiv.org/abs/1706.02216) by Hamilton et al. (2017)\n",
"* **[GNNExplainer](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/GNNExplain)**: [GNNExplainer: Generating Explanations for Graph Neural Networks](https://arxiv.org/abs/1903.03894) by Ying et al. (2019)\n",
"* **[AttentiveFP](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/AttentiveFP)**: [Pushing the Boundaries of Molecular Representation for Drug Discovery with the Graph Attention Mechanism](https://pubs.acs.org/doi/10.1021/acs.jmedchem.9b00959) by Xiong et al. (2019)\n",
"* **[GATv2](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/GATv2)**: [How Attentive are Graph Attention Networks?](https://arxiv.org/abs/2105.14491) by Brody et al. (2021)\n",
"* **[GIN](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/GIN)**: [How Powerful are Graph Neural Networks?](https://arxiv.org/abs/1810.00826) by Xu et al. (2019)\n",
"* **[PAiNN](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/PAiNN)**: [Equivariant message passing for the prediction of tensorial properties and molecular spectra](https://arxiv.org/pdf/2102.03150.pdf) by Schütt et al. (2020)\n",
"* **[DMPNN](https://github.com/aimat-lab/gcnn_keras/blob/master/kgcnn/literature/DMPNN)**: [Analyzing Learned Molecular Representations for Property Prediction](https://pubs.acs.org/doi/abs/10.1021/acs.jcim.9b00237) by Yang et al. (2019)\n",
"* **[AttentiveFP](kgcnn/literature/AttentiveFP)**: [Pushing the Boundaries of Molecular Representation for Drug Discovery with the Graph Attention Mechanism](https://pubs.acs.org/doi/10.1021/acs.jmedchem.9b00959) by Xiong et al. (2019)\n",
"* **[CGCNN](kgcnn/literature/CGCNN)**: [Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties](https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.120.145301) by Xie et al. (2018)\n",
"* **[CMPNN](kgcnn/literature/CMPNN)**: [Communicative Representation Learning on Attributed Molecular Graphs](https://www.ijcai.org/proceedings/2020/0392.pdf) by Song et al. (2020)\n",
"* **[DGIN](kgcnn/literature/DGIN)**: [Improved Lipophilicity and Aqueous Solubility Prediction with Composite Graph Neural Networks ](https://pubmed.ncbi.nlm.nih.gov/34684766/) by Wieder et al. (2021)\n",
"* **[DimeNetPP](kgcnn/literature/DimeNetPP)**: [Fast and Uncertainty-Aware Directional Message Passing for Non-Equilibrium Molecules](https://arxiv.org/abs/2011.14115) by Klicpera et al. (2020)\n",
"* **[DMPNN](kgcnn/literature/DMPNN)**: [Analyzing Learned Molecular Representations for Property Prediction](https://pubs.acs.org/doi/abs/10.1021/acs.jcim.9b00237) by Yang et al. (2019)\n",
"* **[EGNN](kgcnn/literature/EGNN)**: [E(n) Equivariant Graph Neural Networks](https://arxiv.org/abs/2102.09844) by Satorras et al. (2021)\n",
"* **[GAT](kgcnn/literature/GAT)**: [Graph Attention Networks](https://arxiv.org/abs/1710.10903) by Veličković et al. (2018)\n",
"* **[GATv2](kgcnn/literature/GATv2)**: [How Attentive are Graph Attention Networks?](https://arxiv.org/abs/2105.14491) by Brody et al. (2021)\n",
"* **[GCN](kgcnn/literature/GCN)**: [Semi-Supervised Classification with Graph Convolutional Networks](https://arxiv.org/abs/1609.02907) by Kipf et al. (2016)\n",
"* **[GIN](kgcnn/literature/GIN)**: [How Powerful are Graph Neural Networks?](https://arxiv.org/abs/1810.00826) by Xu et al. (2019)\n",
"* **[GNNExplainer](kgcnn/literature/GNNExplain)**: [GNNExplainer: Generating Explanations for Graph Neural Networks](https://arxiv.org/abs/1903.03894) by Ying et al. (2019)\n",
"* **[GNNFilm](kgcnn/literature/GNNFilm)**: [GNN-FiLM: Graph Neural Networks with Feature-wise Linear Modulation](https://arxiv.org/abs/1906.12192) by Marc Brockschmidt (2020)\n",
"* **[GraphSAGE](kgcnn/literature/GraphSAGE)**: [Inductive Representation Learning on Large Graphs](http://arxiv.org/abs/1706.02216) by Hamilton et al. (2017)\n",
"* **[HamNet](kgcnn/literature/HamNet)**: [HamNet: Conformation-Guided Molecular Representation with Hamiltonian Neural Networks](https://arxiv.org/abs/2105.03688) by Li et al. (2021)\n",
"* **[HDNNP2nd](kgcnn/literature/HDNNP2nd)**: [Atom-centered symmetry functions for constructing high-dimensional neural network potentials](https://aip.scitation.org/doi/abs/10.1063/1.3553717) by Jörg Behler (2011)\n",
"* **[INorp](kgcnn/literature/INorp)**: [Interaction Networks for Learning about Objects,Relations and Physics](https://arxiv.org/abs/1612.00222) by Battaglia et al. (2016)\n",
"* **[MAT](kgcnn/literature/MAT)**: [Molecule Attention Transformer](https://arxiv.org/abs/2002.08264) by Maziarka et al. (2020)\n",
"* **[MEGAN](kgcnn/literature/MEGAN)**: [MEGAN: Multi-explanation Graph Attention Network](https://link.springer.com/chapter/10.1007/978-3-031-44067-0_18) by Teufel et al. (2023)\n",
"* **[Megnet](kgcnn/literature/Megnet)**: [Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals](https://doi.org/10.1021/acs.chemmater.9b01294) by Chen et al. (2019)\n",
"* **[MoGAT](kgcnn/literature/MoGAT)**: [Multi-order graph attention network for water solubility prediction and interpretation](https://www.nature.com/articles/s41598-022-25701-5) by Lee et al. (2023)\n",
"* **[MXMNet](kgcnn/literature/MXMNet)**: [Molecular Mechanics-Driven Graph Neural Network with Multiplex Graph for Molecular Structures](https://arxiv.org/abs/2011.07457) by Zhang et al. (2020)\n",
"* **[NMPN](kgcnn/literature/NMPN)**: [Neural Message Passing for Quantum Chemistry](http://arxiv.org/abs/1704.01212) by Gilmer et al. (2017)\n",
"* **[CGCNN](kgcnn/literature/CGCNN)**: [Crystal Graph Convolutional Neural Networks for an Accurate and Interpretable Prediction of Material Properties](https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.120.145301) by Xie et al. (2018)"
"* **[PAiNN](kgcnn/literature/PAiNN)**: [Equivariant message passing for the prediction of tensorial properties and molecular spectra](https://arxiv.org/pdf/2102.03150.pdf) by Schütt et al. (2020)\n",
"* **[RGCN](kgcnn/literature/RGCN)**: [Modeling Relational Data with Graph Convolutional Networks](https://arxiv.org/abs/1703.06103) by Schlichtkrull et al. (2017)\n",
"* **[rGIN](kgcnn/literature/rGIN)**: [Random Features Strengthen Graph Neural Networks](https://arxiv.org/abs/2002.03155) by Sato et al. (2020)\n",
"* **[Schnet](kgcnn/literature/Schnet)**: [SchNet – A deep learning architecture for molecules and materials ](https://aip.scitation.org/doi/10.1063/1.5019779) by Schütt et al. (2017)\n"
]
},
{
@@ -336,14 +351,26 @@
"\n",
"| model | kgcnn | epochs | MAE [log mol/L] | RMSE [log mol/L] |\n",
"|:------------|:--------|---------:|:-----------------------|:-----------------------|\n",
"| AttentiveFP | 4.0.0 | 200 | **0.4351 ± 0.0110** | **0.6080 ± 0.0207** |\n",
"| AttentiveFP | 4.0.0 | 200 | 0.4351 ± 0.0110 | 0.6080 ± 0.0207 |\n",
"| DGIN | 4.0.0 | 300 | 0.4434 ± 0.0252 | 0.6225 ± 0.0420 |\n",
"| DMPNN | 4.0.0 | 300 | 0.4401 ± 0.0165 | 0.6203 ± 0.0292 |\n",
"| EGNN | 4.0.0 | 800 | 0.4507 ± 0.0152 | 0.6563 ± 0.0370 |\n",
"| GAT | 4.0.0 | 500 | 0.4818 ± 0.0240 | 0.6919 ± 0.0694 |\n",
"| GATv2 | 4.0.0 | 500 | 0.4598 ± 0.0234 | 0.6650 ± 0.0409 |\n",
"| GCN | 4.0.0 | 800 | 0.4613 ± 0.0205 | 0.6534 ± 0.0513 |\n",
"| GIN | 4.0.0 | 300 | 0.5369 ± 0.0334 | 0.7954 ± 0.0861 |\n",
"| GNNFilm | 4.0.0 | 800 | 0.4854 ± 0.0368 | 0.6724 ± 0.0436 |\n",
"| GraphSAGE | 4.0.0 | 500 | 0.4874 ± 0.0228 | 0.6982 ± 0.0608 |\n",
"| HDNNP2nd | 4.0.0 | 500 | 0.7857 ± 0.0986 | 1.0467 ± 0.1367 |\n",
"| INorp | 4.0.0 | 500 | 0.5055 ± 0.0436 | 0.7297 ± 0.0786 |\n",
"| MAT | 4.0.0 | 400 | 0.5064 ± 0.0299 | 0.7194 ± 0.0630 |\n",
"| MEGAN | 4.0.0 | 400 | **0.4281 ± 0.0201** | **0.6062 ± 0.0252** |\n",
"| Megnet | 4.0.0 | 800 | 0.5679 ± 0.0310 | 0.8196 ± 0.0480 |\n",
"| MXMNet | 4.0.0 | 900 | 0.6486 ± 0.0633 | 1.0123 ± 0.2059 |\n",
"| NMPN | 4.0.0 | 800 | 0.5046 ± 0.0266 | 0.7193 ± 0.0607 |\n",
"| PAiNN | 4.0.0 | 250 | 0.4857 ± 0.0598 | 0.6650 ± 0.0674 |\n",
"| RGCN | 4.0.0 | 800 | 0.4703 ± 0.0251 | 0.6529 ± 0.0318 |\n",
"| rGIN | 4.0.0 | 300 | 0.5196 ± 0.0351 | 0.7142 ± 0.0263 |\n",
"| Schnet | 4.0.0 | 800 | 0.4777 ± 0.0294 | 0.6977 ± 0.0538 |\n",
"\n",
"#### FreeSolvDataset\n",
@@ -352,12 +379,27 @@
"\n",
"| model | kgcnn | epochs | MAE [log mol/L] | RMSE [log mol/L] |\n",
"|:----------|:--------|---------:|:-----------------------|:-----------------------|\n",
"| DMPNN | 4.0.0 | 300 | **0.5487 ± 0.0754** | **0.9206 ± 0.1889** |\n",
"| CMPNN | 4.0.0 | 600 | 0.5202 ± 0.0504 | 0.9339 ± 0.1286 |\n",
"| DGIN | 4.0.0 | 300 | 0.5489 ± 0.0374 | 0.9448 ± 0.0787 |\n",
"| DimeNetPP | 4.0.0 | 872 | 0.6167 ± 0.0719 | 1.0302 ± 0.1717 |\n",
"| DMPNN | 4.0.0 | 300 | 0.5487 ± 0.0754 | **0.9206 ± 0.1889** |\n",
"| EGNN | 4.0.0 | 800 | 0.5386 ± 0.0548 | 1.0363 ± 0.1237 |\n",
"| GAT | 4.0.0 | 500 | 0.6051 ± 0.0861 | 1.0326 ± 0.1819 |\n",
"| GATv2 | 4.0.0 | 500 | 0.6151 ± 0.0247 | 1.0535 ± 0.0817 |\n",
"| GCN | 4.0.0 | 800 | 0.6400 ± 0.0834 | 1.0876 ± 0.1393 |\n",
"| GIN | 4.0.0 | 300 | 0.8100 ± 0.1016 | 1.2695 ± 0.1192 |\n",
"| GNNFilm | 4.0.0 | 800 | 0.6562 ± 0.0552 | 1.1597 ± 0.1245 |\n",
"| GraphSAGE | 4.0.0 | 500 | 0.5894 ± 0.0675 | 1.0009 ± 0.1491 |\n",
"| HamNet | 4.0.0 | 400 | 0.6619 ± 0.0428 | 1.1410 ± 0.1120 |\n",
"| HDNNP2nd | 4.0.0 | 500 | 1.0201 ± 0.1559 | 1.6351 ± 0.3419 |\n",
"| INorp | 4.0.0 | 500 | 0.6612 ± 0.0188 | 1.1155 ± 0.1061 |\n",
"| MAT | 4.0.0 | 400 | 0.8115 ± 0.0649 | 1.3099 ± 0.1235 |\n",
"| MEGAN | 4.0.0 | 400 | 0.6303 ± 0.0550 | 1.0429 ± 0.1031 |\n",
"| Megnet | 4.0.0 | 800 | 0.8878 ± 0.0528 | 1.4134 ± 0.1200 |\n",
"| MoGAT | 4.0.0 | 200 | 0.7097 ± 0.0374 | 1.0911 ± 0.1334 |\n",
"| MXMNet | 4.0.0 | 900 | 1.1386 ± 0.1979 | 3.0487 ± 2.1757 |\n",
"| RGCN | 4.0.0 | 800 | **0.5128 ± 0.0810** | 0.9228 ± 0.1887 |\n",
"| rGIN | 4.0.0 | 300 | 0.8503 ± 0.0613 | 1.3285 ± 0.0976 |\n",
"| Schnet | 4.0.0 | 800 | 0.6070 ± 0.0285 | 1.0603 ± 0.0549 |\n",
"\n",
"#### ISO17Dataset\n",
@@ -435,10 +477,12 @@
"\n",
"Materials Project dataset from Matbench with 636 crystal structures and their corresponding Exfoliation energy (meV/atom). We use a random 5-fold cross-validation. \n",
"\n",
"| model | kgcnn | epochs | MAE [meV/atom] | RMSE [meV/atom] |\n",
"|:--------------------------|:--------|---------:|:-------------------------|:--------------------------|\n",
"| PAiNN.make_crystal_model | 4.0.0 | 800 | 49.3889 ± 11.5376 | 121.7087 ± 30.0472 |\n",
"| Schnet.make_crystal_model | 4.0.0 | 800 | **45.2412 ± 11.6395** | **115.6890 ± 39.0929** |\n",
"| model | kgcnn | epochs | MAE [meV/atom] | RMSE [meV/atom] |\n",
"|:-----------------------------|:--------|---------:|:-------------------------|:--------------------------|\n",
"| CGCNN.make_crystal_model | 4.0.0 | 1000 | 57.6974 ± 18.0803 | 140.6167 ± 44.8418 |\n",
"| DimeNetPP.make_crystal_model | 4.0.0 | 780 | 50.2880 ± 11.4199 | 126.0600 ± 38.3769 |\n",
"| PAiNN.make_crystal_model | 4.0.0 | 800 | 49.3889 ± 11.5376 | 121.7087 ± 30.0472 |\n",
"| Schnet.make_crystal_model | 4.0.0 | 800 | **45.2412 ± 11.6395** | **115.6890 ± 39.0929** |\n",
"\n",
"#### MatProjectLogGVRHDataset\n",
"\n",
@@ -593,7 +637,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
-    "version": "3.10.13"
+    "version": "3.10.5"
}
},
"nbformat": 4,
2 changes: 1 addition & 1 deletion kgcnn/data/base.py
@@ -331,7 +331,7 @@ def rename_property_on_graphs(self, old_property_name: str, new_property_name: s

def tf_disjoint_data_generator(self, inputs, outputs, **kwargs):
assert isinstance(inputs, list), "Dictionary input is not yet implemented"
-     module_logger.info("Dataloader is experimental and not fully tested or stable.")
+     module_logger.info("Dataloader is experimental and does not yet support all input and output features.")
return tf_disjoint_list_generator(self, inputs=inputs, outputs=outputs, **kwargs)
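Conceptually, a disjoint data generator merges all graphs of a batch into one large graph: node features are concatenated, each graph's edge indices are shifted by the number of nodes that came before it, and a batch-assignment vector records which graph each node belongs to. A minimal numpy sketch of that idea follows; the names are hypothetical and this is not the actual `tf_disjoint_list_generator`.

```python
import numpy as np

def disjoint_batch(node_lists, edge_index_lists):
    """Merge graphs into one disjoint graph.

    Concatenates node features, offsets each graph's edge indices by the
    number of nodes seen so far, and returns a batch-assignment vector
    that pooling layers can use to aggregate per graph.
    """
    nodes = np.concatenate(node_lists, axis=0)
    # Cumulative node counts give the index offset for each graph's edges.
    offsets = np.cumsum([0] + [len(n) for n in node_lists[:-1]])
    edges = np.concatenate(
        [idx + off for idx, off in zip(edge_index_lists, offsets)], axis=0)
    batch_id = np.concatenate(
        [np.full(len(n), i) for i, n in enumerate(node_lists)])
    return nodes, edges, batch_id

# Two toy graphs: 2 nodes with one edge, 3 nodes with two edges.
n1, e1 = np.ones((2, 4)), np.array([[0, 1]])
n2, e2 = np.ones((3, 4)), np.array([[0, 1], [1, 2]])
nodes, edges, batch_id = disjoint_batch([n1, n2], [e1, e2])
print(nodes.shape)        # (5, 4)
print(edges.tolist())     # [[0, 1], [2, 3], [3, 4]]
print(batch_id.tolist())  # [0, 0, 1, 1, 1]
```

Because no padding is involved, the disjoint representation wastes no memory on small graphs; the trade-off is that downstream layers must understand the batch-assignment vector.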


Expand Down