Molecular Dynamics on Amazon EC2 hpc7g instances

Blog Post: Best practices for running molecular dynamics simulations on AWS Graviton3E

This repository provides examples for running Molecular Dynamics applications on AWS using AWS ParallelCluster and Hpc7g instances, powered by Graviton 3E processors. Instructions are currently provided for GROMACS and LAMMPS. The scripts under the /codes folder install these scientific applications with optimized compiler options.

ParallelCluster

This guide assumes that you are going to deploy a scalable HPC cluster on AWS using AWS ParallelCluster. The reference architecture for this project is shown below.

(Reference architecture diagram: ParallelCluster)

For detailed procedures on how to deploy AWS ParallelCluster in your AWS account, follow the official User Guide (v3). You can use either the CLI or the ParallelCluster UI to deploy the cluster. Either way, you can use the configuration file template. The template assumes that you have already created a VPC and subnets to deploy your cluster into. If you are not familiar with this process, follow these steps in the official User Guide.

Once you have created your VPC and subnets, use this configuration file to deploy the cluster. Before you execute the command below, make sure to replace the subnet and SSH key information.

pcluster create-cluster -n gv-cluster -c 0-md-cluster.yaml
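
The exact placeholder keys in the template may differ, but as a minimal sketch (assuming the config exposes SubnetId and KeyName fields and that your key pair is named my-ssh-key), the edits and post-creation checks look roughly like this:

# Hypothetical placeholder values -- open 0-md-cluster.yaml and edit the actual
# SubnetId and KeyName entries to match your VPC and key pair.
sed -i 's/subnet-REPLACE_ME/subnet-0123456789abcdef0/g' 0-md-cluster.yaml
sed -i 's/KeyName: REPLACE_ME/KeyName: my-ssh-key/' 0-md-cluster.yaml

# Cluster creation is asynchronous: poll until clusterStatus is CREATE_COMPLETE,
# then log in to the head node for the remaining steps in this guide.
pcluster describe-cluster -n gv-cluster
pcluster ssh -n gv-cluster -i ~/.ssh/my-ssh-key.pem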

Note

All of the steps in this repo were performed and confirmed working on ParallelCluster version 3.6.0.

Compilers & Libraries

We recommend the following compilers and libraries (a quick verification sketch follows the list):

  • Operating System: Amazon Linux 2
  • Compiler: Arm Compiler for Linux (ACfL) version 23.04 or later
  • Library: Arm Performance Libraries (ArmPL) version 23.04 or later, included in ACfL
  • MPI: Open MPI version 4.1.5 or later (latest official stable release)
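
Once the installation steps in the following sections are complete, a quick sanity check like the one below (a sketch; exact version strings will vary) confirms that the stack on the head node matches these recommendations.

grep PRETTY_NAME /etc/os-release   # expect Amazon Linux 2
armclang --version                 # ACfL 23.04 or later
mpirun --version                   # Open MPI 4.1.5 or later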

Arm Compiler and Arm Performance Libraries

To install ACfL, use this installation script on the head node of your newly created cluster. It installs ACfL and ArmPL under /shared/tools. You will see the following message if the installation is successful.

Unpacking...
Installing...The installed packages contain modulefiles under /shared/arm/modulefiles
You can add these to your environment by running:
                $ module use /shared/arm/modulefiles
Alternatively:  $ export MODULEPATH=$MODULEPATH:/shared/arm/modulefiles
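
The module names below are assumptions (they depend on the ACfL release the script installs); "module avail" shows what was actually installed. Loading the modules puts armclang, armclang++, armflang, and ArmPL on your paths:

module use /shared/arm/modulefiles
module avail                           # list the installed ACfL/ArmPL modulefiles
module load acfl/23.04 armpl/23.04     # assumed names -- adjust to the output above
armclang --version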

Open MPI

ParallelCluster comes with Open MPI pre-installed by default, but that version is compiled with gcc. To use Open MPI with the Arm compilers, we need to compile it with the newly installed ACfL. We recommend Open MPI version 4.1.5. Use this script to install Open MPI 4.1.5 with ACfL on the head node. The script installs Open MPI under the /shared/tools/openmpi-4.1.5-arml/ directory.

Tip

To use the ACfL compiled Open MPI you will need to specify PATH=/shared/tools/openmpi-4.1.5-arml/bin:$PATH and LD_LIBRARY_PATH=/shared/tools/openmpi-4.1.5-arml/lib:$LD_LIBRARY_PATH.
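
Concretely, the exports and a quick check that the MPI wrappers now resolve to the ACfL build might look like this (a sketch; mpicc reports the back-end compiler, which should be the Arm compiler rather than gcc if the build used ACfL):

export PATH=/shared/tools/openmpi-4.1.5-arml/bin:$PATH
export LD_LIBRARY_PATH=/shared/tools/openmpi-4.1.5-arml/lib:$LD_LIBRARY_PATH
which mpicc          # should point into /shared/tools/openmpi-4.1.5-arml/bin
mpicc --version      # prints the back-end compiler's version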

GROMACS

GROMACS is a free and open-source software suite for high-performance molecular dynamics and output analysis. If you follow the instructions below, you will be able to download and install the application in your ParallelCluster environment.

Prerequisites

To install GROMACS you will need CMake. Install it on the head node by executing this script. The script downloads CMake under ~/software and installs it to /shared/tools/cmake-3.26.4-arm64.
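
To make sure the GROMACS build picks up this CMake rather than any system copy, prepend it to your PATH (assuming the standard <prefix>/bin layout of a CMake install):

export PATH=/shared/tools/cmake-3.26.4-arm64/bin:$PATH
cmake --version     # expect 3.26.4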

Download Source Code

Execute the download script on the head node. This will download the GROMACS source code tarball under ~/software and extract it.

Compilation

To build GROMACS with optimized SVE settings, execute this script on the head node. The software will be installed under /shared/gromacs2022.5-armcl-sve.
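
The build script under /codes is authoritative; as a rough sketch (directory names and options here are assumptions, not a copy of the script), an SVE-enabled GROMACS build with ACfL looks something like this:

cd ~/software/gromacs-2022.5
mkdir -p build && cd build
cmake .. \
  -DCMAKE_C_COMPILER=armclang \
  -DCMAKE_CXX_COMPILER=armclang++ \
  -DGMX_SIMD=ARM_SVE \
  -DGMX_BUILD_OWN_FFTW=ON \
  -DCMAKE_INSTALL_PREFIX=/shared/gromacs2022.5-armcl-sve
make -j "$(nproc)" && make install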

Job Submission

If you have used this configuration file to deploy your ParallelCluster environment, the cluster is set up with the Slurm workload manager. Submit the first GROMACS job using this example Slurm job submission script: place the script under your /home directory and execute the following Slurm command. This will download the test case and execute the job on one instance. The test case used here is Test Case A - GluCl Ion Channel from the Unified European Application Benchmark Suite (UEABS).

sbatch 3-gromacs-acfl-sve.sh
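
While the job runs, you can watch the queue with squeue; when it finishes, the performance summary that GROMACS appends to its log is a quick way to gauge throughput (the log file name depends on the job script; md.log is the mdrun default):

squeue -u "$USER"               # check that the job is pending or running
grep "Performance:" md.log      # ns/day and hour/ns reported at the end of the run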

LAMMPS

LAMMPS is a classical molecular dynamics simulator used for particle-based modelling of materials. If you follow the instructions below, you will be able to download and install the application in your ParallelCluster environment.

Download Source Code

Execute the 1-download-lammps.sh script on the head node. This will download the LAMMPS source code under ~/software.

Compilation

To build LAMMPS with the optimized SVE settings, execute the following scripts on the head node. The software will be compiled, and the executables will be copied under /shared/tools/lammps/armpl-sve. Perform these steps only after both the compilers and Open MPI have been installed.
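
The repository's scripts are authoritative; as a rough sketch (package selection, flags, and paths are assumptions), an ACfL build targeting Graviton3E's Neoverse V1 cores with SVE might look like this:

cd ~/software/lammps
mkdir -p build-armpl-sve && cd build-armpl-sve
cmake ../cmake \
  -DCMAKE_CXX_COMPILER=armclang++ \
  -DCMAKE_CXX_FLAGS="-O3 -mcpu=neoverse-v1" \
  -DBUILD_MPI=on
make -j "$(nproc)"
mkdir -p /shared/tools/lammps/armpl-sve && cp lmp /shared/tools/lammps/armpl-sve/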

Job Submission

If you have used this configuration file to deploy your ParallelCluster environment, the cluster is set up with the Slurm workload manager. Submit the first LAMMPS job using one of the following Slurm job submission scripts. The test case used here is Lennard-Jones.

ARM-compiled LAMMPS: 3a-lammps-acfl-sve.sh

curl -LO https://raw.githubusercontent.com/aws-samples/aws-graviton-md-example/main/codes/LAMMPS/3a-lammps-acfl-sve.sh
sbatch 3a-lammps-acfl-sve.sh

GCC-compiled LAMMPS: 3b-lammps-gcc.sh

curl -LO https://raw.githubusercontent.com/aws-samples/aws-graviton-md-example/main/codes/LAMMPS/3b-lammps-gcc.sh
sbatch 3b-lammps-gcc.sh
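
Assuming the job scripts write LAMMPS output to the default slurm-<jobid>.out files, the wall-clock line that LAMMPS prints at the end of each run gives a simple comparison between the ACfL/SVE and GCC builds:

grep "Total wall time" slurm-*.out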

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.
