Using modules in a Docker container when submitting via `singularity exec`?

Hi HPC Team,

So far, I have run a Python script via a bash submit file in which I loaded a conda environment. Here is the relevant snippet:

module purge && module load GCC/11.3.0 OpenMPI/4.1.4 geopsy/3.4.2 Anaconda3/2022.05
source activate <conda-env>
python -B

In this script, the geopsy module is needed.

Now I want to run this in a Docker container (executed via Singularity), so I'm doing

module purge && module load GCC/11.3.0 OpenMPI/4.1.4 geopsy/3.4.2 Anaconda3/2022.05
srun singularity exec --nv $HOME/docker/migrate_v002.sif python

and the script doesn’t run through with the following error:

FileNotFoundError: [Errno 2] No such file or directory: '/opt/ebsofts/geopsy/3.4.2-foss-2022a/bin/gpdc'

My suspicion is that the geopsy module (which provides the gpdc binary) isn't visible inside the container, even though I put the module load commands before srun singularity exec. How can gpdc be made available inside the container? 🙂

The path to the full submit file is here:


Best regards,

Hi @Imahn.Shekhzadeh1

/opt/ebsofts is not mounted in your container by default; you need to bind it.
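
For illustration, a sketch of what the bind could look like (adapting the srun command from the first post; the target script name is assumed, since the original command is truncated):

```shell
# Bind the EasyBuild software tree into the container so the
# module-provided binaries (e.g. gpdc) are visible at the same path
srun singularity exec \
  --bind /opt/ebsofts:/opt/ebsofts \
  --nv $HOME/docker/migrate_v002.sif \
  python your_script.py
```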

Hi @Adrien.Albert,

If I may come back to this question: while binding /opt/ebsofts into the container was indeed the solution to the FileNotFoundError, it turns out I'm still not able to run geopsy inside the container (even though I can with a conda environment on the cluster). I can tell that geopsy does not actually run, since the tool takes an input file and produces an output file, but the output files are empty (when running in the conda environment, they are not).

In my scripts, I use the path to geopsy as follows:

path_to_gpdc = "/opt/ebsofts/geopsy/3.4.2-foss-2022a/bin/gpdc"
proc = sp.Popen(
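
As a side note, since the snippet is truncated: a minimal sketch (assuming `sp` is the `subprocess` module; `run_tool` is a hypothetical helper, not from the original script) of how to capture the tool's stderr and exit code, so a silently failing gpdc shows up as an error rather than an empty output file:

```python
import subprocess as sp

def run_tool(exe_path, args, input_text=""):
    """Run an external tool and surface its stderr instead of failing silently."""
    proc = sp.run(
        [exe_path, *args],
        input=input_text,
        capture_output=True,  # collect stdout and stderr
        text=True,            # work with str instead of bytes
    )
    if proc.returncode != 0:
        # An empty output file often means the tool died; show why.
        raise RuntimeError(
            f"{exe_path} exited with code {proc.returncode}: {proc.stderr.strip()}"
        )
    return proc.stdout
```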

I ran the singularity container on yggdrasil as follows:

srun singularity exec --bind /opt/ebsofts:/opt/ebsofts --env PATH="/opt/gpdc/bin:$PATH" --nv $HOME/docker/migrate_v002.sif python -m ant.fwd_model.generate_data [...]
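
One way to narrow this down could be to check, from a Python session inside the container, whether the binary is actually visible and executable at the expected path (`check_tool` is a hypothetical helper, not part of the original script):

```python
import os
import shutil

def check_tool(path, name):
    """Report whether an external binary is visible and runnable."""
    return {
        "exists": os.path.exists(path),             # is the file bind-mounted in?
        "executable": os.access(path, os.X_OK),     # do we have execute permission?
        "on_path": shutil.which(name) is not None,  # does PATH resolve the name?
    }

# Path taken from the error message earlier in this thread:
print(check_tool("/opt/ebsofts/geopsy/3.4.2-foss-2022a/bin/gpdc", "gpdc"))
```

If `exists` is True but the output files are still empty, the problem is more likely a missing shared library or environment inside the container than the bind itself.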

Any hints on what might be going wrong?

Thanks in advance.