Reference implementation of Graph-learned orbital embeddings (Globe) and the Molecular orbital network (Moon) from "Generalizing Neural Wave Functions" by Nicholas Gao and Stephan Günnemann, published at ICML 2023.
If you're looking for our implementation of PESNet, check out https://github.com/n-gao/pesnet.
- Create a new conda environment:
  ```sh
  conda create -n globe python=3.11  # python>=3.10
  conda activate globe
  ```
- Install JAX. On our cluster, we use:
  ```sh
  conda install cudatoolkit=11.7 cudatoolkit-dev=11.7 cudnn=8.8 -c conda-forge
  pip install --upgrade "jax[cuda11_local]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
  conda env config vars set LD_LIBRARY_PATH=$CONDA_PREFIX/lib/  # this requires reactivating the conda env
  ```
- Install `globe`:
  ```sh
  pip install -e .
  ```
There are two simple ways of training neural wave functions:
- Using the CLI to start a single experiment.
- Using `seml` to start job arrays on a SLURM cluster.
The CLI is a simple way to start a single experiment. You can provide additional configuration files or overwrite parameters. For instance, to train a model on the N2 PES:
```sh
python train_many.py with configs/systems/n2.yaml
```
If you now want to increase the number of determinants, simply overwrite the parameter:
```sh
python train_many.py with configs/systems/n2.yaml globe.determinants=32
```
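Overrides like `globe.determinants=32` are dotted paths into the nested experiment configuration, following the convention of the sacred CLI that `train_many.py` builds on. As a rough illustration only (a hypothetical helper, not part of this repo — sacred does the real parsing), a dotted override maps onto a nested dict like this:

```python
def apply_override(config: dict, assignment: str) -> dict:
    """Apply a single 'a.b.c=value' style override to a nested config dict.

    Hypothetical helper for illustration; the actual parsing is done by sacred.
    """
    path, _, raw = assignment.partition("=")
    keys = path.split(".")
    node = config
    # Walk down the nested dict, creating intermediate levels as needed.
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    # Best-effort value parsing: ints stay ints, everything else stays a string.
    try:
        value = int(raw)
    except ValueError:
        value = raw
    node[keys[-1]] = value
    return config


config = {"globe": {"determinants": 16}}
apply_override(config, "globe.determinants=32")
print(config["globe"]["determinants"])  # → 32
```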
To schedule multiple jobs, we recommend using `seml`. `seml` takes a configuration file with defined parameter spaces and schedules a separate SLURM job for each experiment. For instance, to train on the H4, H6, and H10 systems from the paper, simply run:
```sh
seml globe_hydrogen add configs/seml/train_hydrogen.yaml
```
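A `seml` configuration pairs a parameter space with SLURM resources. The sketch below shows the general shape of such a file; all field values and the `system_config` parameter name are illustrative assumptions, not the contents of the actual `configs/seml/train_hydrogen.yaml` shipped with the repo:

```yaml
# Illustrative sketch of a seml config; see configs/seml/train_hydrogen.yaml
# for the real file used in the paper.
seml:
  executable: train_many.py
  name: globe_hydrogen
  output_dir: logs
  project_root_dir: .

slurm:
  experiments_per_job: 1
  sbatch_options:
    gres: gpu:1        # one GPU per experiment (assumption)
    time: 1-00:00      # illustrative time limit

grid:
  system_config:       # hypothetical parameter name for the system file
    type: choice
    options:
      - configs/systems/h4.yaml
      - configs/systems/h6.yaml
      - configs/systems/h10.yaml
```

`seml` expands the `grid` section into one experiment per combination, so the file above would schedule three SLURM jobs, one per hydrogen chain.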
Please cite our paper if you use our method or code in your own works:
```bibtex
@inproceedings{gao_globe_2023,
    title = {Generalizing Neural Wave Functions},
    author = {Gao, Nicholas and G{\"u}nnemann, Stephan},
    booktitle = {International Conference on Machine Learning (ICML)},
    year = {2023}
}
```
The logo is generated by Bing Image Creator.