Updating documentation for slurm
fongcj committed Oct 11, 2024
1 parent b3de5dd commit 4b1c27e
Showing 1 changed file with 11 additions and 10 deletions.
21 changes: 11 additions & 10 deletions docs/reference/user-guide/slurm.md
@@ -55,22 +55,23 @@ Below is a simple example of a SLURM job script.

```bash
#!/bin/bash
-#SBATCH --job-name=my_analysis # Job name
-#SBATCH --output=my_analysis_%j.out # Output file (%j is replaced by job ID)
-#SBATCH --error=my_analysis_%j.err # Error file
-#SBATCH --ntasks=1 # Run a single task (1 CPU core)
-#SBATCH --mem=8G # Memory request
-#SBATCH --time=02:00:00 # Time limit (2 hours)
-#SBATCH --partition=short # Partition to submit the job to
-
-# Your executable or command goes here
-srun python my_script.py --input data/input_file.csv --output results/output_file.csv
+#SBATCH --job-name=train_RoBERTa_infer # Job name
+#SBATCH --output=/gpfs/mindphidata/cdm_repos/github/progression-predict/slurm/logs/log.infer.%j.out # Output file
+#SBATCH --error=/gpfs/mindphidata/cdm_repos/github/progression-predict/slurm/logs/log.infer.%j.err # Error file
+#SBATCH --ntasks=1 # Run on a single CPU
+#SBATCH --mem=10G # Memory request
+#SBATCH --gpus=1 # Number of GPUs
+
+# Run the executable with the provided arguments (you may need to adapt this if different arguments are required)
+srun ./run_infer_mlflow.sh $SLURM_ARRAY_TASK_ID
```

In this script:
- The `#SBATCH` directives configure the job's resources.
- The `srun` command launches the program, which in this case runs the `run_infer_mlflow.sh` shell script (see the submission sketch below).
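
Because the updated example passes `$SLURM_ARRAY_TASK_ID` to the script, it is typically submitted as an array job. Below is a minimal sketch of submitting and monitoring such a job, assuming the script above is saved as `infer_job.sh` (a placeholder filename) and that an array range of `0-9` fits your inputs:

```bash
# Submit the script as a single job (placeholder filename "infer_job.sh")
sbatch infer_job.sh

# Submit it as an array job so that $SLURM_ARRAY_TASK_ID is set for each task;
# the range 0-9 is only an example, match it to the number of inputs
sbatch --array=0-9 infer_job.sh

# List your queued and running jobs
squeue -u $USER

# Review accounting details after a job finishes (replace <jobid>)
sacct -j <jobid> --format=JobID,JobName,State,Elapsed,MaxRSS
```

For array jobs, the `%A` (array job ID) and `%a` (array task ID) patterns can be used in the `--output` and `--error` filenames so that each task writes its own log.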



---

## Running Array Jobs
