SLURM chained jobs

Hi all,

I was curious to learn how to do chained SLURM jobs.

The bash script chain is the following. Each bash script should be executed once the prior one is complete without errors. If one step fails, the whole chain should stop.

  1. sbatch
  2. sbatch --array=1-12
  3. sbatch --array=1-100
  4. sbatch
  5. sbatch
  6. sbatch
  7. sbatch --array=1-12
  8. sbatch --array=1-100

Each bash script takes in an array index and passes sets of parameters to specific python scripts.
e.g. sbatch --array=1-100 reads in array indices 1-100 and configures parameters for scenarios, which are then forwarded to a python script.
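As a sketch of that step (scenarios.txt and run_scenario.py are placeholder names, not my actual files):

```shell
#!/bin/bash
#SBATCH --array=1-100

# Inside SLURM, each array task sees its index in SLURM_ARRAY_TASK_ID;
# default to 1 so the sketch also runs standalone.
TASK_ID=${SLURM_ARRAY_TASK_ID:-1}

# One scenario per line; a demo file is created here so the sketch
# is self-contained (placeholder contents).
[ -f scenarios.txt ] || printf 'alpha 0.1\nbeta 0.2\n' > scenarios.txt

# Pick the parameter line matching this task's index.
PARAMS=$(sed -n "${TASK_ID}p" scenarios.txt)
echo "task ${TASK_ID} -> ${PARAMS}"
# in the real script this line would be: python run_scenario.py $PARAMS
```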

I haven’t been able to set this up correctly yet.

Do you have any advice on this?



Are you looking for something like --dependency=afterok? Please see our doc for more information about that.
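As a minimal sketch of what that looks like (script names are placeholders; with stock SLURM, --parsable makes sbatch print only the job ID so it can be captured into a variable):

```shell
#!/bin/bash
# First step: no dependency; capture its job ID.
jid1=$(sbatch --parsable step1.sh)

# afterok: the next step starts only if $jid1 exits with code 0.
# If $jid1 fails, this job never starts, which stops the chain.
jid2=$(sbatch --parsable --dependency=afterok:$jid1 --array=1-12 step2.sh)

# For an array job, afterok waits until ALL array tasks have succeeded.
jid3=$(sbatch --parsable --dependency=afterok:$jid2 --array=1-100 step3.sh)
```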


Ah yes, that’s it.


I was going to do something like this:
(source: Building pipelines using slurm dependencies)

#! /bin/bash
# script names (job1.sh etc.) are placeholders
# note: with stock SLURM, sbatch prints "Submitted batch job <id>";
# add --parsable so only the job ID lands in the jidN variables

#first job - no dependencies
jid1=$(sbatch --mem=12g --cpus-per-task=4 job1.sh)

#multiple jobs can depend on a single job
jid2=$(sbatch --dependency=afterany:$jid1 --mem=20g job2.sh)
jid3=$(sbatch --dependency=afterany:$jid1 --mem=20g job3.sh)

#a single job can depend on multiple jobs
jid4=$(sbatch --dependency=afterany:$jid2:$jid3 job4.sh)

#swarm can use dependencies
jid5=$(swarm --dependency=afterany:$jid4 -t 4 -g 4 -f job5.swarm)

#a single job can depend on an array job
#it will start executing when all array tasks have finished
jid6=$(sbatch --dependency=afterany:$jid5 job6.sh)

#a single job can depend on all jobs by the same user with the same name
jid7=$(sbatch --dependency=afterany:$jid6 --job-name=dtest job7.sh)
jid8=$(sbatch --dependency=afterany:$jid6 --job-name=dtest job8.sh)
sbatch --dependency=singleton --job-name=dtest job9.sh

#show dependencies in squeue output:
squeue -u $USER -o "%.8A %.4C %.10m %.20E"
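One caveat if you swap afterany for afterok so the chain stops on failure: when a parent job fails, its dependents stay queued with reason DependencyNeverSatisfied. A sketch of the sbatch option that cleans that up (script name is a placeholder, and $jid1 stands for a previously captured job ID):

```shell
# --kill-on-invalid-dep=yes cancels this job automatically if its
# dependency can never be satisfied (e.g. the parent job failed),
# instead of leaving it pending in the queue forever.
jid2=$(sbatch --parsable --kill-on-invalid-dep=yes \
              --dependency=afterok:$jid1 step2.sh)
```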