Maximum cores usable per job?

Dear HPC
I have been trying to submit jobs with 40 cores, but I could not submit jobs with more than 30 cores. I looked at the webpage info but could not find such a limitation announced, as is the case for memory, for example. Any idea if this is "normal", or should anybody be able to use more than 30 cores for a single job? I monitored the partition (spart) and, e.g., 40 cores were available.

Many thanks in advance for your help!

Dear @Raphael.Guex, please show us your sbatch script.

Dear @Yann.Sagon

Here it is, thank you for your help!

#!/bin/sh
#SBATCH --ntasks=1
#SBATCH --time=2-0:0:0
#SBATCH --partition=public-cpu
#SBATCH --cpus-per-task 30
module load MATLAB/2022a
echo "Running ${BASE_MFILE_NAME}.m on $(hostname)"
srun matlab -nodesktop -nosplash -nodisplay -r ${BASE_MFILE_NAME}

I therefore tried changing this line to 40 or 50 cores:
#SBATCH --cpus-per-task 30

I also tried with our lab's private partition: unsuccessful too.

Hi @Raphael.Guex

What you are trying to do is request a single compute node with 40 or more CPUs. We don't have such a compute node in the public-cpu partition. You should try the shared-cpu partition instead. That partition is limited to jobs with a maximum duration of 12h00, but you may request up to 128 CPUs. You can check the compute node characteristics here or use pestat.
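As a sketch, the job header adapted to shared-cpu could look like this (the 12-hour cap and the 128-CPU limit are the values mentioned above; adjust the CPU count to your needs):

```shell
#!/bin/sh
#SBATCH --ntasks=1
#SBATCH --time=0-12:00:00          # shared-cpu jobs are capped at 12h00
#SBATCH --partition=shared-cpu
#SBATCH --cpus-per-task=40         # up to 128 CPUs possible on shared-cpu
```

The rest of the script (module load, srun line) stays the same.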

In any case, make sure that MATLAB is actually taking advantage of having such a high number of CPUs available, for example by checking the job's efficiency with seff after it completes.
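As a sketch (assuming your MATLAB code relies on built-in multithreading rather than an explicit parpool), you can pin MATLAB's thread count to the Slurm allocation; SLURM_CPUS_PER_TASK is set by Slurm inside the job:

```shell
# Sketch: make MATLAB use exactly the CPUs Slurm allocated to this task.
srun matlab -nodesktop -nosplash -nodisplay \
    -r "maxNumCompThreads(str2double(getenv('SLURM_CPUS_PER_TASK'))); ${BASE_MFILE_NAME}"
```

After the job finishes, `seff <jobid>` reports the achieved CPU efficiency, which tells you whether the extra cores were actually used.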